Or work done in moving something from point A to point B, such as a fan. ... How?? If I move an object from point A to point B, most of that energy is simply used up in moving the object. Hardly any of it will be released as heat (mostly from my body, but a tiny amount in friction too). Fans move air from point A to point B.
Energy cannot be created or destroyed - the first law of thermodynamics. The energy "used" has to be converted to something else. For a fan, that's heat and sound (and even the sound is eventually absorbed as heat).
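The bookkeeping here is simple: in steady state, essentially every watt of electrical power a device draws ends up as heat in the room. A quick sketch (the fan's wattage is an illustrative assumption, not a measurement):

```python
# Conservation of energy: electrical power in = heat out (steady state).
fan_power_w = 50  # assumed draw of a hypothetical desk fan, in watts

# The fan does work on the air, but that kinetic energy is dissipated
# back into heat by air friction almost immediately, and the sound it
# makes is absorbed as heat too. So in steady state:
heat_output_w = fan_power_w  # every watt in becomes a watt of heat

joules_per_hour = heat_output_w * 3600
print(f"A {fan_power_w} W fan delivers {joules_per_hour / 1000:.0f} kJ of heat per hour")
```

The same accounting applies to a computer, a phone, or a lightbulb: whatever isn't carried out of the room (e.g. as light through a window) becomes heat inside it.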
See http://en.wikipedia.org/wiki/Conservation_of_energy.
Surely it matters in figuring out whether leaving a computer on all night is a "good idea" or not? If it doesn't actually help in heating your room, then what's the point in mentioning heating at all?
If it doesn't heat your room, then any cost from leaving it on is also trivial. As I say, you can't have it both ways. Of course, if you're under the impression that energy can be "used up", this is the source of your misconception. It can't.
How does visible light end up as heat? Air doesn't absorb visible light at all, and our bodies reflect most of it.
But where does it go after that? If light continually reflected, you could turn a light on in a closed room, then switch it off, and the room would still be lit.
Either you spend energy on a fan to pump air around the room, or you spend energy heating a larger surface area.
You don't "spend energy" on a fan. Are you suggesting that a whole stack of computers would be cheaper at heating the room, because the "heat would move around better like a radiator"?
Yes, exactly - as I said, they're either far too wasteful to be left on when you're not using them, or it's completely useless to leave them on, in which case saving $10 is a pretty compelling reason to turn them off when you're not using them.
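For scale, here's a rough overnight running cost. Both the computer's average draw and the electricity rate are assumed figures; plug in your own:

```python
# Rough overnight running cost for a computer left on.
power_w = 300        # assumed average draw of the computer
hours = 8            # left on overnight
rate_per_kwh = 0.15  # assumed electricity price in $/kWh

energy_kwh = power_w / 1000 * hours
cost = energy_kwh * rate_per_kwh
print(f"{energy_kwh:.1f} kWh per night, about ${cost:.2f}")
print(f"Over 30 nights: about ${cost * 30:.2f}")
```

Whether that counts as "wasted" depends on the point being argued above: if the room needed heating anyway, the electric heater runs correspondingly less.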
My point is that people are overestimating the cost by assuming that the energy in a computer is somehow "used up". That would violate a fundamental physical law.
Your later posts:
You might as well say that calling your friends on your mobile phone or listening to an iPod is a viable addition to heating. But nobody would say that. Ask yourself why.
It's only not viable on the grounds that it uses very little energy in the first place. As I say, my small heater runs at 3000W. What does your phone run at?
My living room used to have 3x 100w lightbulbs... You gonna argue that a computer is about as effective as a lightbulb at heating a room? Cos I'd agree....
That was exactly the same point people were making earlier. Although with lights, it's more likely for the light to escape out of the windows, so that energy will be "lost" from the house. And I also made the point about lights usually being on the ceiling.
Except 300w of heat doesn't leave the computer, it lingers
It can't linger indefinitely - the heat will spread out. So you're arguing that a computer might not heat your room as quickly when you first turn it on, but it will still heat the room. It's still unclear to me that a computer is "slow" at heating - the reason heaters seem quicker is because they're vastly more powerful. How would a 300W computer compare to a dedicated 300W electric heater?
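That comparison can be put in rough numbers. Assuming an illustrative room and ignoring heat loss through the walls (both big simplifications), the physics of a 300W computer and a 3000W heater is identical; only the rate differs by a factor of ten:

```python
# Time to raise the air temperature of a room by 1 degree C at a given
# power, ignoring all losses. Room dimensions are an assumed example.
room_volume_m3 = 4 * 5 * 2.5  # assumed 4 m x 5 m room, 2.5 m ceiling
air_density = 1.2             # kg/m^3, near room temperature
air_specific_heat = 1005      # J/(kg*K) for air

air_mass_kg = room_volume_m3 * air_density
energy_per_degree_j = air_mass_kg * air_specific_heat  # ~60 kJ per degree

for power_w in (300, 3000):
    seconds = energy_per_degree_j / power_w
    print(f"{power_w} W: {seconds:.0f} s per degree C (no losses)")
```

In reality, losses through walls and windows mean a low-power source levels off at a smaller temperature rise, but it never contributes nothing: the heater just has to do less work.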
Don't turn off your TV when you're not watching -- it helps with the heating!
No one's saying that, as I already stated. The point being made is that the cost of leaving things on is not equal to the cost of the electricity, if you'd otherwise be heating the room.