Leaving computers on 24/7 - effect?

Yeah, you can heat a few rooms just fine using an oven with the door open.
But nowhere near as efficiently as a dedicated electric heater, because the oven doesn't actually push air into the other room. Instead, it has to operate at a higher temperature, in order to create the temperature gradient down which heat conducts. That's vastly less efficient than having a heater with a fan. See my first example in reply to GB.
 
But nowhere near as efficiently as a dedicated electric heater, because the oven doesn't actually push air into the other room. Instead, it has to operate at a higher temperature, in order to create the temperature gradient down which heat conducts. That's vastly less efficient than having a heater with a fan. See my first example in reply to GB.

The only way this is less efficient is if you're losing more heat to the outdoors because of a higher temperature difference between the hot air around the oven and the cold wall directly beside the oven.

If the oven is situated in the middle of a circular room, it's going to be the same efficiency as an electric heater and fan in the middle of the room. (Actually, probably more efficient per mean room temperature, as the fan/heater would place more hot air at the walls of the room.)
 
No, because you're ignoring the fact that it doesn't take a particularly big fan to enable a reduction in temperature at the heat source, whilst maintaining the same mean temp. E.g. a 1 kW heater might heat a room adequately, but if you strapped on a 100 W fan you might only need a 500 W heater to provide the same mean temp. This is because it doesn't rely on a large temperature gradient to conduct through the rest of the room, but instead on pushing the hot air to parts of the room where cold air is, meaning you need a shallower temp gradient to achieve the same mean temp.

And even if what you're saying is correct, which it isn't, if I've reduced your argument to "an oven in the middle of a circular room", then I'd say my job here is done...
 
4% of it is light. The rest is heat. Which is marginally less useful than heat inside a computer or toaster.
If you have a computer which keeps all the heat inside it, then you're going to have overheating problems...

That's what these later discussions are trying to establish.
This is already established by overwhelming scientific consensus.

Do you at least concede that the energy ends up as heat, and are now switching to the argument that the heat takes too long to spread out? Or are you still maintaining that the energy in turning fans etc is "used up"?

I see what you're saying about the heat dissipation rate, but I don't see any evidence that computers are bad at this. They're specifically designed to get rid of heat as quickly as possible (due to overheating issues). As I've said, a radiator only relies on air convection, and that would work for a computer too.

The point about pushing the heat all the way round the house doesn't work either - no one is suggesting using a single bank of computers in one room to heat a whole house. Rather that the energy contribution would reduce the heating required. Even if it's just in that one room where you need lower heating, it's still reducing the heating required. Whether you have just one computer, or whether you have a computer in every room, they're still just as efficient as a radiator.

Except that your argument in favour of computers is identical to an argument in favour of toasters or ovens or TVs or any other device. It all ends up as heat anyway, therefore the cost can be deducted from your heating bill.
Yes - allowing for the points that this only applies when you'd want the heating on, and also acknowledging the difference between gas and electric, if you have gas.

No-one's saying it explicitly because it's stupid, but replace "computer" in these discussions with "toaster" and you have logically identical arguments, albeit with a slightly different lexicon.
You think that the basic laws of physics don't apply to toasters?

My kitchen doesn't have its own radiator, and I know it soon heats up when the oven is used (even though presumably ovens are designed to try to retain heat, for obvious reasons). The only reason a toaster is less noticeable *is because it doesn't use as much energy in the first place*! You can't have it both ways, and claim that something is a colossal waste of energy, whilst also saying it doesn't produce much heat energy. *All* energy ends up as heat. Zelig makes a good point - if the heat was really being well retained by toasters and computers, they should still be hot to touch long after you've switched them off.
 
If you have a computer which keeps all the heat inside it, then you're going to have overheating problems...
I already said that a computer's fan moves air from inside the case to outside the case.

Do you at least concede that the energy ends up as heat, and are now switching to the argument that the heat takes too long to spread out? Or are you still maintaining that the energy in turning fans etc is "used up"?
What I've been saying is that the amount of useful work done by a heater+fan combination is not simply the total amount of heat pumped out, but also in pushing the air around the room to facilitate convection. The heater+fan clearly does more useful work than just the heating element by itself, and not because the fan also produces heat, but because it facilitates convection. I've also looked at it from the other way: a heater that just heats one single molecule and doesn't let that molecule go anywhere is not doing any useful work, even though it outputs a great deal of heat. And, most recently, I've looked at it from a third way: if you need a 1,000 W heating element to create a sufficient temperature gradient to heat a room (i.e. replace heat lost through windows/doors at equilibrium, which in a previous example escaped at a rate of a), but you only need a 500 W heating element and a 100 W fan to heat the same room, then the fan has done more useful work than the extra 500 W pumped out by the first heater.

It may be difficult for you to see, and that may be because I haven't expressed myself very clearly, but what you think is "switching arguments" is just presenting the same concept in several different ways.

The point about pushing the heat all the way round the house doesn't work either - no one is suggesting using a single bank of computers in one room to heat a whole house. Rather that the energy contribution would reduce the heating required. Even if it's just in that one room where you need lower heating, it's still reducing the heating required. Whether you have just one computer, or whether you have a computer in every room, they're still just as efficient as a radiator.
Heh, now who's switching arguments. I'm not saying it doesn't reduce the heating required, because that would amount to saying that it doesn't produce any heat at all. What I'm saying is that they're not as efficient as electric fan heaters (or, put another way, that an electric fan heater drawing the same amount of power would reduce heating required more). If you operate under the naive belief that any heat out is useful heat, or the rather baffling belief that there's no utility in forced convection, then of course you will find that all electrical devices are as efficient as each other. But efficiency is about useful output, not just heat that lingers around the ceiling or at the back of a fridge or around a toaster.

I haven't considered radiators at all in this, but they are usually placed under windows, so that they don't have to operate at as high a temperature as, say, an oven in the middle of a room. I'd contend that, for this reason, an electric radiator placed under a window would be more efficient than a fridge or a computer. The discussion to date, however, hasn't involved radiators, and I haven't given them more thought than that. Maybe a radiator under a window is more efficient than an electric heater - that wouldn't surprise me at all.

EDIT: Just thought about it and yes, a radiator the width of the window and under it would be MUCH more efficient than any other option, because it would only have to operate at a little above room temperature in order to maintain room temp. Still need to work out what that depends on though.
 
I have a related question.

Sometimes my computer goes into some kind of sleep mode (even though I have sleep mode turned off), and when I try to press a key or move the mouse to get it out, it often gets stuck and won't return to normal. What is causing that?

I hate sleep modes for that reason. I tried to turn it off on my new computer, but apparently I did not do so correctly. I normally have my monitor shut off fairly fast (I think it's set around 3 or 5 mins), but do not like sleep modes.

Because sleep mode doesn't work that well, I often turn the computer off if I'm not going to use it within an hour.
 
I already said that a computer's fan moves air from inside the case to outside the case.
Which then heats up the room as well as any radiator.

What I've been saying is that the amount of useful work done by a heater+fan combination is not simply the total amount of heat pumped out, but also in pushing the air around the room to facilitate convection. The heater+fan clearly does more useful work than just the heating element by itself, and not because the fan also produces heat, but because it facilitates convection.
Do you at least concede that a computer is just as useful as a radiator of equal power - even if you're arguing that a fan powered heater of equal power is better?

I've also looked at it from the other way: a heater that just heats one single molecule and doesn't let that molecule go anywhere is not doing any useful work
How does it not let the molecule go anywhere? Heat rises. The heat from a computer (or from a radiator) spreads through the room by convection.

And anyhow, a computer does have a fan chucking out heat. Which is the same reason why electric heaters have fans, as Zelig pointed out: "Also, FWIW, fans in electric heaters have less to do with spreading heat around a room than they do with preventing the heater from melting down from excess heat, as they tend to do if the fan stops working." To which you replied:

Well, FWIW, computers have less to do with spreading heat around than with playing games.
So you agree that electric fan heaters are no better than a computer, as neither's fans are to do with spreading heat around?

Heh, now who's switching arguments.
I'm not sure I've ever argued only using your computer to heat the house.

I'm not saying it doesn't reduce the heating required, because that would amount to saying that it doesn't produce any heat at all. What I'm saying is that they're not as efficient as electric fan heaters
Which is wrong, by the basic laws of physics. Possibly you mean something other than efficiency, but I'm still not sure what you mean. A fan heater is good if you've just come in from the cold and you need something to heat you up really quick - but that's irrelevant anyway, when it comes to the normal way in which people heat their homes. I want my whole house to be warm, and any heat a computer adds to that is added with 100% efficiency. My computer isn't on when I've just come into the house - but it's on during the hours when I've also got radiators on, maintaining a warm temperature throughout the house.

But efficiency is about useful output, not just heat that lingers around the ceiling or at the back of a fridge or around a toaster.
You have your computer stuck on a ceiling? If so, I guess that explains why your computer isn't as good as heating the room :) Most people have them on floor level.

I haven't considered radiators at all in this, but they are usually placed under windows, so that they don't have to operate at as high a temperature as, say, an oven in the middle of a room. I'd contend that, for this reason, an electric radiator placed under a window would be more efficient than a fridge or a computer.
My computer is next to the window, and my radiators aren't - so my computer is more efficient then?

Though I don't see how a window affects anything?

The discussion to date, however, hasn't involved radiators
Of course it has, as that's how most people heat their homes. Even for electric heaters, it's only the smaller ones that tend to have fans, not all of them do. Is all of the heating in your home fan powered?

My original post was talking about reducing the heating bills, and that heating is normally made up of things like radiators. Not that it matters - a fan powered electric heater might be a bit quicker to heat a small space up if you point it at you, but it's no more efficient, or indeed quicker, at heating up a room or a house in general (and as I've said, the only reason it's warmer is because it's far more powerful - if you had a hypothetical 3,000 W computer, you'd get warm pretty quickly if you stood behind the fan anyway; and the room would heat up just as quickly, whether it was fan powered or not).

EDIT: Just thought about it and yes, a radiator the width of the window and under it would be MUCH more efficient than any other option, because it would only have to operate at a little above room temperature in order to maintain room temp.
How?
 
Though I don't see how a window affects anything?
If your radiator is under the window, it doesn't have to create as large a temperature gradient in order to replace heat lost at the window as a radiator placed at the other end of the room, say. The reason is just that the cold air from the window falls and the hot air from the radiator rises; if the radiator is on the opposite wall, the hot air will just sit at the top of the room, like a lightbulb, before being pushed to the window and cooled, meaning that you'd have to have a much hotter radiator in order to heat the room. I'm much less confident about the actual temp required for a radiator under a window, though; I can't remember what reason I had for saying "just over room temp" specifically, since I suppose it depends on the strength of the convection current created by the window. The point, though, is that the heat from the radiator rising counteracts the cold air at the window falling, meaning you get a "virtuous" convection cycle, rather than the vicious one that happens with a radiator at the other end of the room. I was probably overzealous saying it was "MUCH" more efficient than other options; it might not be, since a fan might be better than the "natural" current created by the window at reducing the radiator temp required.

Of course, as you said, if you get an electric fan heater and blast it directly at you, you'll need much less power still to keep you warm enough...

In response to the rest of your post, you're just conflating "100% efficient at converting electricity to heat" with "100% efficiency at heating a room", or "reduces heating bills by the same amount as any other electrical appliance or dedicated heater". Maybe that's not what you're arguing, but it's what Zelig and Genocidic Bunny have been arguing, which may be why we're struggling to agree. In some ways, Zelig's is at least more consistent; if you're saying that the rate at which a heater heats a room doesn't matter, why should you care whether the heat lingers behind the fridge or on the ceiling? It all adds up, doesn't it? If the rate does matter, then surely an electric heater with dedicated fan is better? Surely if the position of the heat matters, then a fan, which positions the heat in whatever direction/area of the room you want, would provide some amount of useful work?
 
If your radiator is under the window
Even if this does make a difference, as I say it's irrelevant to the computer versus radiator discussion, since you can have radiators that aren't under windows, and computers that are.

Of course, as you said, if you get an electric fan heater and blast it directly at you, you'll need much less power still to keep you warm enough...
Yes, if you spend all your time sat in one location of your house, it's cheaper to just blast hot air at you rather than using a non-fan radiator or computer. But it's still true that a computer is no less efficient than a non-fan radiator.

And for the vast majority of the population who like to heat our homes, rather than confining ourselves to a small spot with a fan heater focused on us, this argument isn't relevant (not to mention you could always turn the computer round if you were that worried, so the fan is blasting hot air at you...)

How do you heat your home? Do you seriously just have a fan heater that you direct at wherever you're sitting?

if you're saying that the rate at which a heater heats a room doesn't matter, why should you care whether the heat lingers behind the fridge or on the ceiling? It all adds up, doesn't it? If the rate does matter, then surely an electric heater with dedicated fan is better? Surely if the position of the heat matters, then a fan, which positions the heat in whatever direction/area of the room you want, would provide some amount of useful work?
The position matters only in the sense that most people don't live on their ceilings. And since heat rises, a heater stuck at the ceiling might be a problem, because some heat will immediately be lost. The point is it doesn't matter if having a heater on the ceiling is less useful - if it's true, then it's irrelevant because computers aren't stuck on the ceiling. If it's not true, then the argument is irrelevant anyway.

A fan is no help at all here - you don't need to direct the heat downwards, if your computer/radiator starts off on the ground. As I say above, most people don't decide to confine themselves to a small region of space in their homes - and even if they did, you could simply put the computer there anyway, so there's no need to "position the heat" with a fan.
 
Your reasoning has changed so much since the start, I'm not even sure what you're saying anymore.
 
What I've been saying is that the amount of useful work done by a heater+fan combination is not simply the total amount of heat pumped out, but also in pushing the air around the room to facilitate convection. The heater+fan clearly does more useful work than just the heating element by itself, and not because the fan also produces heat, but because it facilitates convection.

Wrong, natural convection easily makes fans unnecessary for the task of heating a whole room to equilibrium. Heaters with fans are useful for directing hot air to some specific area in situations where equilibrium is not the desired end (or where you really are in a hurry to heat a part of the room quicker than the rest).

And you shouldn't be using the term "useful work", because that has a specific meaning while discussing thermodynamics which is not what you may believe it is.

I've also looked at it from the other way: a heater that just heats one single molecule and doesn't let that molecule go anywhere is not doing any useful work, even though it outputs a great deal of heat.

Actually, you cannot heat a single molecule. I understand that it is a thought experiment, but heating a single molecule means accelerating it, and in order to keep it from going anywhere you're going to have collisions with whatever would be keeping it from escaping, transferring heat out of that system in the process. There is no such thing as perfect isolation.

And, most recently, I've looked at it from a third way: if you need a 1,000 W heating element to create a sufficient temperature gradient to heat a room (i.e. replace heat lost through windows/doors at equilibrium, which in a previous example escaped at a rate of a), but you only need a 500 W heating element and a 100 W fan to heat the same room, then the fan has done more useful work than the extra 500 W pumped out by the first heater.

And this one is really bad, because it shows you do not understand the concept of system borders. If you have a room heated to a certain temperature, which loses a certain amount of energy across its border, that does not change unless either the border or the temperature gradient across it changes. If it's losing energy at the rate of 1000 W, it requires 1000 W to remain at that temperature, regardless of the source.
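
For what it's worth, the steady-state point can be written down in a few lines. Here's a minimal sketch in Python; the U-value, envelope area and temperatures are assumed numbers purely for illustration, not anyone's actual room.

```python
# Toy steady-state heat balance for a room (all numbers are assumed
# illustration values, not measurements).
# Heat loss through the envelope: q_loss = U * A * (T_inside - T_outside)

U = 1.5           # assumed overall heat-loss coefficient, W per m^2 per K
A = 60.0          # assumed envelope area, m^2
T_inside = 21.0   # target room temperature, deg C
T_outside = 5.0   # outdoor temperature, deg C

q_loss = U * A * (T_inside - T_outside)   # watts leaving the room at steady state
print(f"Heat loss at steady state: {q_loss:.0f} W")

# Any mix of sources adding up to q_loss holds the room at temperature:
sources = {"radiator": q_loss - 300, "computer": 250.0, "lightbulb": 50.0}
print(f"Total supplied: {sum(sources.values()):.0f} W")
```

The total power required is fixed by the envelope and the temperature difference; where each watt ends up inside the room is the separate question being argued above.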

If you operate under the naive belief that any heat out is useful heat, or the rather baffling belief that there's no utility in forced convection, then of course you will find that all electrical devices are as efficient as each other. But efficiency is about useful output, not just heat that lingers around the ceiling or at the back of a fridge or around a toaster

It's not a naive belief, it's reality. It's the first law of thermodynamics. Energy (unlike money :lol:) doesn't get made up or destroyed out of nowhere.
And, continuing with the comparison to money, in thermodynamics (unlike economics) "efficiency" is not a magic word which can be used to justify that creation or destruction of energy out of nowhere. It shouldn't be abused in economics either, but that's not a science so they get away with anything.
 
Doesn't the monitor sustain damage from prolonged use?
 
Doesn't the monitor sustain damage from prolonged use?

Older CRTs can suffer from burn-in. Unless you have an older CRT I wouldn't worry about it. But it's a good idea to turn off the monitor when you're not at your computer -- it's become a habit for me.
 
Wrong, natural convection easily makes fans unnecessary for the task of heating a whole room to equilibrium. Heaters with fans are useful for directing hot air to some specific area in situations where equilibrium is not the desired end (or where you really are in a hurry to heat a part of the room quicker than the rest).
As I said, the natural convection current in the room may or may not mean that a fan is unnecessary. See above for how the placement of the radiator is important in exploiting the natural convection currents created by the radiator and window, and conversely how bad radiator placement creates a vicious circle where a room never gets heated properly.

As for the bit in brackets, that's kinda the point - you don't want to heat the whole room, you just want to heat the bits where you're actually working/sitting/eating/etc. You don't want to heat the ceiling, nor behind the fridge, so you don't count the heat generated by a lightbulb as "useful".

And you shouldn't be using the term "useful work", because that has a specific meaning while discussing thermodynamics which is not what you may believe it is.
I know it has a specific meaning in thermodynamics, but I can't think of a better way of describing, say, "heat that just lingers in places you don't want to heat" using common English words. The key word is "useful", which is why I bolded it, rather than "work".

Actually, you cannot heat a single molecule. I understand that it is a thought experiment, but heating a single molecule means accelerating it, and in order to keep it from going anywhere you're going to have collisions with whatever would be keeping it from escaping, transferring heat out of that system in the process. There is no such thing as perfect isolation.
Yeah, there's no such thing as perfect isolation, but as you said, it was a thought experiment. It's a reductio ad absurdum of heat that lingers at the ceiling or inside a toaster. The point is, heat that doesn't actually get to the place where you want it (e.g. person-level in a room) isn't useful.

And this one is really bad, because it shows you do not understand the concept of system borders. If you have a room heated to a certain temperature, which loses a certain amount of energy across its border, that does not change unless either the border or the temperature gradient across it changes. If it's losing energy at the rate of 1000 W, it requires 1000 W to remain at that temperature, regardless of the source.
Yes, you're right, I shouldn't have mentioned equilibrium at all. What I meant by that was just that the person-level temperature was at the required level to begin with, so ignoring how long it takes to heat the room up, not that the whole room was actually in thermodynamic equilibrium. In fact, what I'm saying kinda rests on the room not necessarily being in thermodynamic equilibrium during the time in which the heating is actually on, so using the word "equilibrium" was even more misleading...

Anyway, my point was that the room might lose heat at a rate of 800 W, say, but the extra 200 W produced by the heater was simply heating, say, the ceiling (or the inside of a toaster/behind the fridge, if we're sticking with electrical appliance analogies) to higher and higher temperatures. Instead of heating bits of the room that we don't want to heat, the fan pushes air to the bits that we do want to heat. That way, the heat gets to the furthest reaches of the room before rising to the ceiling. Indeed, as I explained in the last post, it's possible that the convection current created by a window at the opposite end of the room to the radiator can actually mean that hot air goes from radiator to ceiling, then from ceiling to the window where it's cooled, and never actually gets to person-level at all.

Obviously I have no idea about the actual numbers involved, which is why I've changed them above. But surely if heat just lingers at the ceiling it's not useful...

It's not a naive belief, it's reality. It's the first law of thermodynamics. Energy (unlike money :lol:) doesn't get made up or destroyed out of nowhere.
And, continuing with the comparison to money, in thermodynamics (unlike economics) "efficiency" is not a magic word which can be used to justify that creation or destruction of energy out of nowhere. It shouldn't be abused in economics either, but that's not a science so they get away with anything.
"Efficiency" just means "useful output" / "useful input". It doesn't matter whether the context is engineering or economics, it's all the same. I studied neither engineering nor economics, but the principle is the same: if some heat produced isn't actually doing something useful, it can't go on the numerator of that equation.
 
As for the bit in brackets, that's kinda the point - you don't want to heat the whole room, you just want to heat the bits where you're actually working/sitting/eating/etc. You don't want to heat the ceiling, nor behind the fridge, so you don't count the heat generated by a lightbulb as "useful".

The initial discussion was about computers and these are usually at your desk or somewhere else where you frequently are and where you want it warm. And because of all the fans the transfer of heat out of a computer case is on a timescale of minutes, which is quite fast for temperature changes. So for the short term a computer is pretty efficient at generating heat where you want it.

And in the long term it doesn't matter: If you want to keep a certain temperature at a certain point, it is thermodynamically inevitable that the whole room is heated (unless your insulation is so bad that you heat the city instead of your room). For that case it doesn't matter much how, where and how fast the heat is generated; what really counts are the efficiencies. Electric equipment is very efficient at converting energy to heat. It's only the generation of electricity that is so inefficient compared to other heating sources.

But even 100% efficiency is not ideal, as with heat it is possible to be more than 100% efficient. (Actually, because of this, "efficiency" is technically the wrong word when converting energy to heat.)
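
To put rough numbers on the point about generation, here's a sketch; the efficiencies below are typical ballpark assumptions, not measured figures.

```python
# Rough primary-energy comparison for delivering 1 kWh of heat into a room.
# All efficiencies below are ballpark assumptions for illustration.

heat_needed_kwh = 1.0

plant_efficiency = 0.38    # assumed thermal power station efficiency
grid_efficiency = 0.93     # assumed transmission/distribution efficiency
boiler_efficiency = 0.90   # assumed modern gas boiler efficiency

fuel_electric_kwh = heat_needed_kwh / (plant_efficiency * grid_efficiency)
fuel_gas_kwh = heat_needed_kwh / boiler_efficiency

print(f"Fuel burned for 1 kWh of electric resistance heat: {fuel_electric_kwh:.2f} kWh")
print(f"Fuel burned for 1 kWh of gas boiler heat:          {fuel_gas_kwh:.2f} kWh")
```

This is the same caveat made earlier in the thread about the difference between gas and electric heating.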
 
The initial discussion was about computers and these are usually at your desk or somewhere else where you frequently are and where you want it warm. And because of all the fans the transfer of heat out of a computer case is on a timescale of minutes, which is quite fast for temperature changes. So for the short term a computer is pretty efficient at generating heat where you want it.

And in the long term it doesn't matter: If you want to keep a certain temperature at a certain point, it is thermodynamically inevitable that the whole room is heated (unless your insulation is so bad that you heat the city instead of your room). For that case it doesn't matter much how, where and how fast the heat is generated; what really counts are the efficiencies. Electric equipment is very efficient at converting energy to heat. It's only the generation of electricity that is so inefficient compared to other heating sources.

But even 100% efficiency is not ideal, as with heat it is possible to be more than 100% efficient. (Actually, because of this, "efficiency" is technically the wrong word when converting energy to heat.)

Wait wait, how is it possible to be more than 100% efficient? That means you're creating energy. (Big no-no!)
 
Wait wait, how is it possible to be more than 100% efficient? That means you're creating energy. (Big no-no!)

You're not creating energy, but rather shuffling it around. With a heat pump you use a small amount of energy to transfer heat from outside to inside. The total energy stays the same, but now it is where you want it.
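
A small worked example of that (the COP of 3 is a typical figure for an air-source heat pump in mild weather, assumed here for illustration):

```python
# Heat pump performance expressed as a coefficient of performance (COP):
# COP = heat delivered indoors / electrical work supplied.

electrical_input_w = 1000.0   # assumed electrical power drawn by the heat pump
cop = 3.0                     # assumed COP (typical for air-source in mild weather)

heat_delivered_w = electrical_input_w * cop
heat_moved_from_outdoors_w = heat_delivered_w - electrical_input_w

print(f"Heat delivered indoors:       {heat_delivered_w:.0f} W")            # 3000 W
print(f"Of which moved from outdoors: {heat_moved_from_outdoors_w:.0f} W")  # 2000 W
# Energy balance: 1000 W electricity + 2000 W outdoor heat = 3000 W indoors.
```

Nothing is created; the "more than 100%" only appears because the heat being moved in from outdoors doesn't show up as an input on the electricity meter.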
 