Power Usage Estimates

Does anyone know how much it costs to run a PC with an XXX watt power supply for X amount of time, i.e. roughly what it works out to per hour or per month? For example, if you had a home server with a 500 W power supply and ran it 24/7 at half load, how much would it add to your power bill?

I'm curious whether keeping a home server on 24/7 is worth it compared to only turning it on when I'm home. I thought someone said keeping a 360 or PS3 on 24/7 only costs a few bucks a month.

LZ?

http://www.eia.doe.gov/cneaf/electricity/epm/table5_6_a.html

So ballpark

$0.17/kWh * 24 hours * 30 days * 0.5 kW * 50% load = $30.60/month.
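
If anyone wants to plug in their own numbers, here's a quick Python sketch of that same math. The 500 W supply, 50% average load, and $0.17/kWh are just the example figures above, not measurements.

```python
# Rough monthly cost of running a box 24/7 -- plug in your own numbers.
def monthly_cost(psu_watts, load_fraction, rate_per_kwh, hours=24 * 30):
    """Estimated cost for a 30-day month at a constant average load."""
    kw_drawn = (psu_watts / 1000.0) * load_fraction
    return kw_drawn * hours * rate_per_kwh

print(f"${monthly_cost(500, 0.50, 0.17):.2f}/month")  # -> $30.60/month
```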

man i should turn off my computer

Wow, I didn't realize it was that much. I thought someone said it was only a few bucks for a typical home PC.

idk about that 50% load when idle, though.

I don't know what kind of load a PC actually draws off the PSU, though. Maybe it's normally 50 W? 300 W?

I’ve always thought about putting an ammeter on my power supply to find out, but I’ve never been quite that bored.

I have a 1kW power supply and run it all day.

Screw you, Al Gore. :lol:

Word. 10% load = $6.12/mo. :shrug: That actually seems more reasonable, since my electric bill for the whole house, including an electric oven, fridge, PC, and dehumidifier running 24/7 (except the oven, duh), is only like $90 IIRC.

Ya. I think to draw full load, you need the CPU, drives, and fans all running flat out 24/7, which really only happens in a server environment.

So a 500 W device running all day = 12 kilowatt-hours = $0.17 x 12 = $2.04/day?
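
Quick check of that per-day math, assuming the box really does pull a constant 500 W around the clock (which, as noted below, it probably doesn't):

```python
# Per-day cost if the machine draws a constant 500 W, at the $0.17/kWh example rate
kwh_per_day = 0.500 * 24                  # 500 W for 24 hours = 12 kWh
print(f"${kwh_per_day * 0.17:.2f}/day")   # -> $2.04/day
```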

So many variables. (Components, how much processing it's doing, whether the HDDs are spinning, external devices.)

Roughly, a server I was running averaged an extra $33 or so a month. However, that was a pretty inefficient setup by today's standards.

I have 2 desktops at home that are always on but spin the HDs down and drop into low-power "sleep" or whatever it is… turning them off during the day didn't yield much of a change in my power bill.

Need that Kill A Watt device to know for sure. You plug it in between the device and the outlet and it shows you exactly how much power it uses.

Old Blue Eyes has one.

Get a Watts Up meter or a Kill A Watt and measure what your computer actually draws.

I'll assume you're running a Core 2 Duo around 3 GHz and an older video card, say an 8800 GT. With C1E on and everything at stock, it would draw about 75-90 watts at idle with a fairly efficient PSU. Most power supplies are at their most efficient around half load, and efficiency data at different loads is out there for any quality PSU, which yours likely is not.

Even if you're running, say, an i7 at 4 GHz and two 4870 X2s, it only takes about 400 watts off the wall at idle. That's a whole lot, yes, but nobody here has anywhere near that kind of rig.

Where the real power consumption comes in is the video cards. The CPU takes less power than you'd assume. A single 4870 takes about 80 watts at IDLE. It's doing nothing and drawing as much as your lights. If you load the GPU, a 4870 can go over spec at around 200 W. Once you start doing radical overclocks on a quad-core processor and turn all the power-saving features off to chase a higher overclock, then your CPU starts taking serious power. But out of the box, a quad takes 40 watts or less at idle. Plus your average PSU is only 75-80% efficient at lower loads, so the draw at the wall is higher than the DC load on the rails.
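
To make that concrete, here's a back-of-the-envelope sketch that adds up rough idle figures like the ones above and accounts for PSU efficiency. The component numbers are estimates and guesses for illustration, not measurements; a Kill A Watt will give you the real answer for your rig.

```python
# Back-of-the-envelope idle draw at the wall, using rough idle estimates.
# All figures below are assumptions for illustration, not measurements.
idle_watts_dc = {
    "quad-core CPU at stock, power saving on": 40,  # "40 watts or less at idle"
    "HD 4870 video card": 80,                       # "~80 watts at idle"
    "motherboard, RAM, drives, fans (guess)": 40,   # rough guess
}

psu_efficiency_low_load = 0.78  # average PSU is ~75-80% efficient at light load

dc_load = sum(idle_watts_dc.values())
wall_watts = dc_load / psu_efficiency_low_load

print(f"DC load:      {dc_load} W")
print(f"Draw at wall: {wall_watts:.0f} W")  # ~205 W
print(f"Cost 24/7:    ${wall_watts / 1000 * 24 * 30 * 0.17:.2f}/month at $0.17/kWh")
```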

lol, I remember when the 8800 GTX came out, people figured out the cost of running them in SLI @ 8 hours a day (since each one consumes 300-350 W under load).

If you’re wondering about wattage amounts, just go to newegg.com to figure it out.
Generally yeah… 20-40 watts at idle, 65-120 W under full load (mine is 90 W).

My electric bill just got raised to $239 a month on budget plan. FML.

LOL. My budget plan down in VA is $36/mo :slight_smile: that sucks big time.