Leaving computers on 24/7 - effect?

I stated that it ends up as heat because it is true. I don't believe I stated that we are using those other devices.

I sat down and thought about this for a bit.

Let's assume we have a computer and a heater with the same power output, say 300 watts. Now take a room of a certain heat capacity, which will be our system. We will assume this system is not perfectly insulated and leaks a certain amount of heat, let's say 50 watts. The heater and the computer will each be placed in a separate system, but with the same parameters.

So, we calculate the amount of heat that remains in the system after one second. For the computer: (300W - 50W) * 1s = 250J.
And for the heater: (300W - 50W) * 1s = 250J.

In both cases, the system ends up with the same amount of energy.

According to thermodynamics, our two systems are equivalent. Both of the proposed rooms contain an extra 250J of energy after 1 second of operation.
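
A minimal sketch of that bookkeeping, using the same illustrative figures (300W in, 50W leaking out):

```python
# Minimal sketch of the bookkeeping above: energy retained by the room
# after `seconds` for any device dissipating `power_w`, with a constant
# `leak_w` escaping to the outside. The 300W / 50W figures are just the
# illustrative numbers from this post.
def energy_retained_joules(power_w, leak_w, seconds):
    return (power_w - leak_w) * seconds

print(energy_retained_joules(300, 50, 1))  # computer: 250 J
print(energy_retained_joules(300, 50, 1))  # heater:   250 J (identical by this accounting)
```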
 
Except 300W of heat doesn't leave the computer: it lingers, or it doesn't even end up as heat after 1s... Again, you're not looking at useful work. You said earlier (correctly) that a computer would take longer to heat a room than an equally rated (i.e. same power consumption) electric heater -- that is to say, the useful power output of a computer is lower than the useful power output of an electric heater. The reason it takes longer to heat is because there's no or very little forced convection from a computer, whereas a large part of an electric heater's job is indeed to force air around the room, facilitating convection.
 
From the viewpoint of thermodynamics, movement within the system does not matter, as the system ends up with the same net energy.

EDIT -- did you not say your heating is a radiator? Radiators have no active fans.
 
Except 300W of heat doesn't leave the computer: it lingers, or it doesn't even end up as heat after 1s... Again, you're not looking at useful work. You said earlier (correctly) that a computer would take longer to heat a room than an equally rated (i.e. same power consumption) electric heater -- that is to say, the useful power output of a computer is lower than the useful power output of an electric heater. The reason it takes longer to heat is because there's no or very little forced convection from a computer, whereas a large part of an electric heater's job is indeed to force air around the room, facilitating convection.

The original point was that if you spend $200 on heating your home uniformly throughout, operating a computer for $10 will reduce your heating bill to $190. (And if you're running A/C, operating a computer for $10 will increase your A/C bill by more than $10.) This is still correct.

It doesn't matter if heat is lingering; in the long run it dissipates throughout the house. The only relevance of lingering is that heat stuck on an outside wall will dissipate a larger amount out of the house.

While there are some assumptions necessary for this (uniform temperature regardless of room in home, and regardless of time of day), they're not going to have much of an effect on long-run cost.

Also, FWIW, fans in electric heaters have less to do with spreading heat around a room than they do with preventing the heater from melting down from excess heat, as they tend to do if the fan stops working.
 
@GB: I was talking about fan heaters; I thought I made that pretty clear... I mentioned radiators once, in response to mdwh and gas heating.

Your first sentence is just completely missing the point...

The original point was that if you spend $200 on heating your home uniformly throughout, operating a computer for $10 will reduce your heating bill to $190. (And if you're running A/C, operating a computer for $10 will increase your A/C bill by more than $10.) This is still correct.

It doesn't matter if heat is lingering; in the long run it dissipates throughout the house. The only relevance of lingering is that heat stuck on an outside wall will dissipate a larger amount out of the house.
Actually, the amount of heat that merely lingers, and therefore the rate at which a computer heats a house, is very important, because if it heats your house more slowly than heat leaks out of the house then it doesn't reduce your bill by $10, but by a much lower amount. It will, of course, still reduce your bill, but given how little of the heat lost through windows, doors, etc. a computer actually replaces, it really isn't significant.

Also, FWIW, fans in electric heaters have less to do with spreading heat around a room than they do with preventing the heater from melting down from excess heat, as they tend to do if the fan stops working.
Well, FWIW, computers have less to do with spreading heat around than with playing games.
 
I can vouch for computers being able to heat up a room. Living in Australia, heat is not such a big problem, but leaving my computer on 24/7, idle for the greater part of the day and night, keeps my room warmer than any other room in the house. Of course, my room isn't that big and the windows and door are closed. After long gaming sessions it feels even warmer. For the Australian winter it is quite useful.

In a country/area that can actually get cold I would suggest other forms of heating. :)

As for the efficiency of using a computer as a heater, I don't really want to get involved, but it can be argued that a computer doesn't need to be used purely as a heater when not otherwise in use. Downloads can be left running, camera systems can use the machine to record data, TV shows can be recorded, etc., all while providing some (albeit minor) warmth. Also, the more intensive the task, the greater the heat.
 
Actually, the amount of heat that merely lingers, and therefore the rate at which a computer heats a house, is very important, because if it heats your house more slowly than heat leaks out of the house then it doesn't reduce your bill by $10, but by a much lower amount. It will, of course, still reduce your bill, but given how little of the heat lost through windows, doors, etc. a computer actually replaces, it really isn't significant.

No, the rate is not relevant at all. In the long run, there's no difference in heating effectiveness between heat output from a computer and heat output from a primary form of heating.
 
No, the rate is not relevant at all. In the long run, there's no difference in heating effectiveness between heat output from a computer and heat output from a primary form of heating.
Assuming you leave your heating on all day... Again, you might as well leave your toaster on all day.
 
No, my claim is not contingent on that assumption.

A house needs X BTUs to heat to a certain level.

A computer produces Y BTUs. Therefore your heating system only needs to produce X-Y BTUs.
No, because if you turn your heating off during the day and night and on during the evening and morning, it stops being a "long run" calculation and starts being a "short run" calculation -- in which the rate at which a computer heats a house is highly significant.

Seriously, Zelig, what you're saying simply amounts to an argument for leaving everything on all the time. Don't turn off your TV when you're not watching -- it helps with the heating! Don't turn off your kettle, it helps with the heating! Don't turn off your oven, it helps with the heating! It's absurd; I'm trying to explain why it's absurd, but no-one seems to want to know. You all just seem to want to conclude that leaving your computer on all the time doesn't waste money or energy...
 
No, because if you turn your heating off during the day and night and on during the evening and morning, it stops being a "long run" calculation and starts being a "short run" calculation -- in which the rate at which a computer heats a house is highly significant.

This is pretty much a worst-case scenario, but it just means (essentially) that the energy used by a computer while the heating is turned off is wasted, which is obvious in the first place. Heat dissipates (relatively) quickly from a computer anyway. Within 30 minutes of turning my computer off, it's completely cooled down - in other words, the heat has spread to the rest of the house.
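
(If it helps to picture it, that cool-down is just exponential decay of the heat stored in the case. A rough sketch, with a made-up time constant rather than a measured one:)

```python
import math

# Rough sketch: fraction of the heat stored in a computer's case that is
# still inside it t minutes after switching off, assuming simple
# exponential (Newtonian) cooling. The 8-minute time constant is a
# made-up illustrative value, not a measurement.
TIME_CONSTANT_MIN = 8.0

def fraction_still_inside(t_minutes):
    return math.exp(-t_minutes / TIME_CONSTANT_MIN)

for t in (0, 10, 20, 30):
    print(f"{t:2d} min after shutdown: {fraction_still_inside(t):.1%} still in the case")
# By ~30 minutes nearly all of the stored heat has ended up in the room air.
```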

Seriously, Zelig, what you're saying simply amounts to an argument for leaving everything on all the time. Don't turn off your TV when you're not watching -- it helps with the heating! Don't turn off your kettle, it helps with the heating! Don't turn off your oven, it helps with the heating! It's absurd; I'm trying to explain why it's absurd, but no-one seems to want to know. You all just seem to want to conclude that leaving your computer on all the time doesn't waste money or energy...

That's not absurd at all... however, again, it's relevant when you're heating the house.

If you're heating a house, leaving appliances on subtracts X watts' worth of heating that you would otherwise have to supply via electricity.

If you're not heating or using A/C, using an appliance for X watts costs exactly X watts. (And heats your house up)

If you're using A/C, using an appliance for X watts costs >2*X watts.

So over the course of a year, if you use A/C a similar amount as you heat your home, it will more or less balance out.
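
A rough sketch of that bookkeeping (the 2x A/C penalty, and one-for-one offsetting against electric resistance heating, are the premises stated above, not measured values):

```python
# Net cost of running an appliance, expressed in "watts of electricity
# you effectively pay for". The factor of 2 for A/C is the assumption
# stated above, not a measured value; the heating case assumes electric
# resistance heating, so the offset is one-for-one.
AC_PENALTY = 2.0

def effective_cost_watts(appliance_watts, mode):
    if mode == "heating":   # appliance heat displaces heat you'd have paid for anyway
        return 0.0
    if mode == "none":      # you pay for the electricity; the heat is incidental
        return appliance_watts
    if mode == "cooling":   # the A/C has to pump the extra heat back outside
        return appliance_watts * AC_PENALTY
    raise ValueError(f"unknown mode: {mode}")

# A 300W computer, with the year split evenly between heating and cooling:
average = 0.5 * effective_cost_watts(300, "heating") + 0.5 * effective_cost_watts(300, "cooling")
print(average)  # 300.0 -- i.e. it "more or less balances out" against the no-HVAC case
```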

FWIW, I live in Canada, and only heat my house for about 10 weeks of the year; the rest of the time, appliances and people keep it warm enough.
 
Except it doesn't subtract X, because that heat isn't replacing any heat lost from windows and doors during the time that heating is being used, since the heat released from X doesn't circulate fast enough. That's why the rate is important.

Maybe another example might be helpful... Let's simplify this to 1 room, with an electric heater that's very fast to warm up a room, and one that's very slow to warm up a room, both drawing the same amount of power. Let's put some numbers (or rather letters) to that: Heater 1 raises the temp of a perfectly insulated room by T at a rate of r'; heater 2 raises the temp by T at a rate of r''. Our simple room, though, isn't perfectly insulated, so heat escapes such that the temp falls by T in time a. Heater 1 is "very fast", while heater 2 is "very slow", so a < r'' < r'. The outside temperature is such that T happens to be the exact temperature difference between a "cold" room (i.e. without any heaters on) and a "warm" room (i.e. at a comfortable temperature).

Now, in order to raise the temperature from "cold" to "warm", heater 1 needs to be on for t' = T/(r'-a); heater 2 needs to be on for t'' = T/(r''-a). Since r' > r'', t' < t''. And since both draw the same amount of power, P, the amount of energy used by each E ~ t ~ 1/(r-a) => E' < E''.

So the fast heater, since it has to be on for less time, ends up costing less in energy than the slow heater of the same rating. That's why the rate at which heat gets pushed around the room is important.


[Note: in reality, Newton's law of cooling says that a depends on the temperature difference between outside and inside, and therefore changes as the inside temp changes. We'll ignore this for now, but I might come back to it later, because I really like doing this sort of thing...]
 
Except it doesn't subtract X, because that heat isn't replacing any heat lost from windows and doors during the time that heating is being used, since the heat released from X doesn't circulate fast enough. That's why the rate is important.

Replacing lost heat isn't relevant in this sense. A house heating system isn't turned on all the time; it kicks on and off to keep the temperature more or less constant. Again, heat from a computer is completely dissipated from the computer after 30 minutes. This heat raises the house temperature, and causes the home heating system to not kick in when it otherwise would have.

Maybe another example might be helpful... Let's simplify this to 1 room, with an electric heater that's very fast to warm up a room, and one that's very slow to warm up a room, both drawing the same amount of power. Let's put some numbers (or rather letters) to that: Heater 1 raises the temp of a perfectly insulated room by T at a rate of r'; heater 2 raises the temp by T at a rate of r''. Our simple room, though, isn't perfectly insulated, so heat escapes such that the temp falls by T in time a. Heater 1 is "very fast", while heater 2 is "very slow", so a < r'' < r'. The outside temperature is such that T happens to be the exact temperature difference between a "cold" room (i.e. without any heaters on) and a "warm" room (i.e. at a comfortable temperature).

Now, in order to raise the temperature from "cold" to "warm", heater 1 needs to be on for t' = T/(r'-a); heater 2 needs to be on for t'' = T/(r''-a). Since r' > r'', t' < t''. And since both draw the same amount of power, P, the amount of energy used by each E ~ t ~ 1/(r-a) => E' < E''.

So the fast heater, since it has to be on for less time, ends up costing less in energy than the slow heater of the same rating. That's why the rate at which heat gets pushed around the room is important.

a < r'' < r' doesn't make sense; they're not all in the same units. I'll just assume r'' < r' for any a.

I'll skip the rest of the math, because r'-a doesn't make sense either.

What you're neglecting is that heater 2 continues to dissipate heat for longer than heater 1. Let's jump to the end of your scenario. If the heater kicks on when the temperature reaches T' (lower than T, obviously), heater 1 will have to kick in sooner than heater 2, as the temperature will fall more rapidly to T'. The difference in time between heater 1 and heater 2 starting again will be approximately equal to the difference in time between the heaters turning off, leading to overall equivalent energy consumption.
 
Sorry, I meant "at a rate of a" instead of "in time a". I changed it halfway through (actually I changed it back and forth a couple of times...) and missed that one.

Anyway, what you're essentially saying is that the slow heater will produce temperatures that on the upside are too hot -- wasted heat -- and on the downside too cold -- insufficient heat. That doesn't sound very efficient to me!
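
(For what it's worth, with the units corrected so that a, r' and r'' are all rates, the earlier time-to-heat comparison can be sketched numerically. The rates below are made up purely for illustration, and this is the same deliberately simplified model with a constant loss rate a:)

```python
# Sketch of the time-to-heat comparison, with a, r_fast and r_slow all
# read as rates (degrees per hour). All numbers are made up for illustration.
P_KW = 2.0      # power drawn by either heater, kW
T = 10.0        # temperature rise needed to go from "cold" to "warm", degrees
a = 1.0         # rate at which the room loses temperature, degrees/hour
r_fast = 6.0    # rate at which the fast heater raises the room temperature
r_slow = 2.0    # rate for the slow heater

def hours_and_kwh(r):
    hours = T / (r - a)          # net warming rate is r - a
    return hours, P_KW * hours   # energy used while warming up

for name, r in (("fast", r_fast), ("slow", r_slow)):
    h, kwh = hours_and_kwh(r)
    print(f"{name} heater: {h:.1f} h to warm the room, using {kwh:.1f} kWh")
# fast heater: 2.0 h, 4.0 kWh; slow heater: 10.0 h, 20.0 kWh
```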
 
Anyway, what you're essentially saying is that the slow heater will produce temperatures that on the upside are too hot -- wasted heat -- and on the downside too cold -- insufficient heat. That doesn't sound very efficient to me!

It's not wasted on the upside, since only the undissipated heat is going to be above T. (Assuming the rate of dissipation is lower than the rate of heat loss - I'm not sure how reasonable this assumption is for the abstract heater/room example, but the rate of dissipation of a computer is obviously lower than the rate of dissipation of heat from a house.)

And there's no insufficient heat on the downside; either heater kicks in at temperature T'.
 
The slow heater overshoots on the upside and undershoots on the downside. Say the mean temp is T'' (the thermostat's upper bound); the thermostat switches the heater off. But the temp at the heater is much, much higher than the mean temp, so the mean temp continues to rise (since we're assuming there are no fans or anything on heater 2, the temp at the heater is always higher than the average temp when the heater is or has just been on). If heat dissipates around the room faster than a -- a fair stipulation, given that the heater has no fans and we already said that r'' > a -- the mean temp will continue to rise beyond T''. Similarly, on the way down, it takes longer between the heater kicking in and the mean temp rising again for heater 2 than for heater 1, so the temp continues to fall further for heater 2 than for heater 1 (since r' > r'') (this was basically what I illustrated in the previous post).

It might be worth reductio ad absurdum-ing this. Imagine a heater that heats a room instantly, vs. a heater that takes 10 years to heat the room. The first would always keep the room between T' and T''. And the second heater...?
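
(If anyone wants to poke at this numerically, here's a rough sketch of the two-node picture being argued about: a heater element with its own thermal mass, coupled to the room air, which in turn loses heat to the outside, with a thermostat switching between T' and T''. Every coefficient is invented for illustration, and how far the slow heater over- and undershoots depends entirely on what you plug in:)

```python
# Rough two-node sketch: heater element (with its own thermal mass) coupled
# to room air, which loses heat to the outside; a thermostat on the room air
# switches the element on and off with hysteresis. All coefficients are
# invented for illustration only.

def simulate(coupling_w_per_k, hours=12.0, dt=1.0):
    t_out = 0.0            # outside temperature, deg C
    t_room = 10.0          # room air temperature
    t_elem = 10.0          # heater element temperature
    c_room = 200_000.0     # room heat capacity, J/K
    c_elem = 5_000.0       # heater element heat capacity, J/K
    loss_w_per_k = 50.0    # room-to-outside loss coefficient, W/K
    power_w = 2_000.0      # element power when switched on, W
    t_low, t_high = 19.0, 21.0
    on = True
    temps = []
    steps = int(hours * 3600 / dt)
    for step in range(steps):
        q_elem_to_room = coupling_w_per_k * (t_elem - t_room)
        q_room_to_out = loss_w_per_k * (t_room - t_out)
        t_elem += ((power_w if on else 0.0) - q_elem_to_room) * dt / c_elem
        t_room += (q_elem_to_room - q_room_to_out) * dt / c_room
        if on and t_room >= t_high:        # thermostat hysteresis
            on = False
        elif not on and t_room <= t_low:
            on = True
        if step > steps // 2:              # ignore the initial warm-up
            temps.append(t_room)
    return min(temps), max(temps)

for label, k in (("fast heater (strong coupling)", 400.0),
                 ("slow heater (weak coupling)", 60.0)):
    lo, hi = simulate(k)
    print(f"{label}: room swings between {lo:.1f} and {hi:.1f} deg C")
```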
 
Or work done in moving something from point A to point B, such as a fan. ... How?? If I move an object from point A to point B, most of that energy is simply used up in moving the object. Hardly any of it will be released as heat (mostly from my body, but a tiny amount in friction too). Fans move air from point A to point B.
Energy cannot be created or destroyed - first law of thermodynamics. The energy "used" has to be converted to something else. For a fan, that's heat and sound.

See http://en.wikipedia.org/wiki/Conservation_of_energy .

Surely it matters in figuring out whether leaving a computer on all night is a "good idea" or not? If it doesn't actually help in heating your room, then what's the point in mentioning heating at all?
If it doesn't heat your room, then any cost from leaving it on is also trivial. As I say, you can't have it both ways. Of course, if you're under the impression that energy can be "used up", this is the source of your misconception. It can't.

How does visible light end up as heat? Air doesn't absorb visible light at all, and our bodies reflect most of it.
But where does it go after that? If light continually reflected, you could turn a light on in a closed room, then switch it off, and the room would still be lit.

Either you spend energy on a fan to pump air around the room, or you spend energy heating a larger surface area.
You don't "spend energy" on a fan. Are you suggesting that a whole stack of computers would be cheaper at heating the room, because the "heat would move around better like a radiator"?

Yes, exactly - as I said, they're either far too wasteful to be left on when you're not using it, or it's completely useless to leave it on, in which case saving $10 is a pretty compelling reason to turn it off if you're not using it.
My point is that people are overestimating the cost, by assuming that the energy in a computer is actually "used up" somehow. This is in violation of a fundamental physical law.

Your later posts:

You might as well say that calling your friends on your mobile phone or listening to an iPod is a viable addition to heating. But nobody would say that. Ask yourself why.
It's only not viable on the grounds that it uses very little energy in the first place. As I say, my small heater runs at 3000W. What does your phone run at?

My living room used to have 3x 100w lightbulbs... You gonna argue that a computer is about as effective as a lightbulb at heating a room? Cos I'd agree....
That was exactly the same point people were making earlier. Although with lights, it's more likely for the light to escape out of windows, and thus be "lost" from the house. And I also made the point about lights usually being on the ceiling.

Except 300W of heat doesn't leave the computer: it lingers
It can't linger indefinitely - the heat will spread out. So you're arguing that a computer might not heat your room as quickly when you first turn it on, but it will still heat the room. It's still unclear to me that a computer is "slow" at heating - the reason heaters seem quicker is because they're vastly more powerful. How would a 300W computer compare to a dedicated 300W electric heater?

Don't turn off your TV when you're not watching -- it helps with the heating!
No one's saying that, as I already stated. The point being made is that the cost of leaving things on is not equal to the cost of the electricity, if you'd otherwise be heating the room.
 
Although with lights, it's more likely for the light to escape out of windows, and thus be "lost" from the house.
4% of it is light. The rest is heat.
And I also made the point about lights usually being on the ceiling.
Which is marginally less useful than heat inside a computer or toaster.
How would a 300W computer compare to a dedicated 300W electric heater?
That's what these later discussions are trying to establish.
No one's saying that
Except that your argument in favour of computers is identical to an argument in favour of toasters or ovens or TVs or any other device. It all ends up as heat anyway, so the cost can be deducted from your heating bill. That's nonsense, because it doesn't end up usefully contributing to heat over relevant timescales (another way of saying that heat stuck inside a microwave is useless, and by the time it actually gets anywhere useful, you've already turned off your heating and gone to bed/work/school).

No-one's saying it explicitly because it's stupid, but replace "computer" in these discussions with "toaster" and you have logically identical arguments, albeit with a slightly different lexicon.
 
the mean temp will continue to rise beyond T''.

No, because:

It's not wasted on the upside, since only the undissipated heat is going to be above T. (Assuming the rate of dissipation is lower than the rate of heat loss - I'm not sure how reasonable this assumption is for the abstract heater/room example, but the rate of dissipation of a computer is obviously lower than the rate of dissipation of heat from a house.)

It might be worth reductio ad absurdum-ing this. Imagine a heater that heats a room instantly, vs. a heater that takes 10 years to heat the room. The first would always keep the room between T' and T''. And the second heater...?

Except this violates some of the inequalities in rates, which I think are very reasonable assumptions. The fact that the appliance's dissipation rate is less than the rate of heat loss from the house, and its heating effectiveness is less than that of the primary heating device, is somewhat important.

No-one's saying it explicitly because it's stupid, but replace "computer" in these discussions with "toaster" and you have logically identical arguments, albeit with a slightly different lexicon.

Yeah, you can heat a few rooms just fine using an oven with the door open. And a toaster is pretty much perfectly efficient as a heating device; my toaster has totally cooled off 5 minutes after use, i.e. the heat has totally dissipated into the air.
 