How much energy does a computer waste overnight?
So I'm just at the office right now closing up and I realized that a lot of my coworkers have the habit of leaving their computers on overnight. Considering the environmental and financial costs, which we should all be well aware of by now, I'm pretty surprised that some people are still so wasteful.
Anyone wanna take a stab at how much could be saved yearly by turning the computers off for 12 hours every night? There are 12 computers in the office with 17" LCD monitors (Specs). About half of them go into sleep mode while the other half stay fully on. The computers are running dual Intel Pentium processors @ 2.0 GHz with 2.0 GB of RAM. I'm going to present the cost and energy savings to my manager and see what she does, so if you're good at physics, now's your time to shine :smile:
Comments
OK, so 300 watts through 12 hours overnight, as opposed to 100 watts, means an excess of 200 watts per machine. Since energy consumption is measured in watt-hours, you multiply by 12 for the number of hours and then by 12 again for the number of computers.
That's 28800 excess watt-hours, or 28.8 kWh in the more familiar language of your Hydro bill.
Assume industrial customers pay 7 cents per kWh, and you're looking at two bucks a night, or 60 bucks a month.
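If you want to play with the numbers before showing your manager, here's a minimal sketch of that same arithmetic. The 200 W excess, 12 machines, 12 hours, and 7 cents/kWh are this thread's assumptions, not measured values:

```python
# Rough overnight-cost estimate using the figures assumed in this thread.
def overnight_cost(excess_watts=200, hours=12, machines=12, rate_per_kwh=0.07):
    """Return (kWh per night, dollars per night) for the whole office."""
    kwh = excess_watts * hours * machines / 1000  # watt-hours -> kWh
    return kwh, kwh * rate_per_kwh

kwh, nightly = overnight_cost()
print(f"{kwh:.1f} kWh/night, ${nightly:.2f}/night, ${nightly * 30:.2f}/month")
# -> 28.8 kWh/night, $2.02/night, $60.48/month
```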
For comparison purposes my apartment has anywhere from three to five computers fired up at any given time and my Hydro bill is like $45 a month. That does not include hot water heating, which is buried in the rent.
Edit: Individually it doesn't look like much, but scale that up by a million computers instead of just one, and you start to get the idea.
I got into this same argument with someone who wanted to run a server in their house.
http://michaelbluejay.com/electricity/howmuch.html
Edit: Honestly though, a computer plus LCD monitor at idle (which is the state they'll be in at night anyway, and besides, the monitor would be off) should really only use 100W max, assuming it's just your average office-type computer and not some beastly gaming setup with several massive GPUs :P.
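For what it's worth, plugging that 100 W guess into the same sketch from above (same assumed 12 machines, 12 hours, 7 cents/kWh) still gives a non-trivial figure:

```python
# Re-run with the 100 W idle guess from this comment -- still an assumption.
kwh, nightly = overnight_cost(excess_watts=100)
print(f"{kwh:.1f} kWh/night, ${nightly * 30:.2f}/month")
# -> 14.4 kWh/night, $30.24/month
```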