Question:

How much power does a computer use when it's on standby?



I leave my computer on 24 hours a day. I think it's better for the computer to leave it on, but I'm now trying to conserve electricity because my bill was too high, and I want to know if my computer even uses enough electricity to make any difference. How many kWh does my computer use when it's on standby? (Standby for my computer just means the screen turns off.)


4 ANSWERS


  1. simple answer - LOTS!

    i live in a solar powered house. I have a computer, TV and radio / CD player. This is all I need - and I could probably lose the TV!

    One of the principles underlying living with solar power is that you turn ALL appliances off at the WALL....leaving anything on standby will chew power because it's not fully off. Off the grid, battery capacity is rated in amp-hours (Ah), so what you watch isn't just the wattage printed on the device but how many amps it draws over time: the fewer amps it pulls, the less of your battery it eats in a day. (On mains power the same idea is usually expressed in watts, since watts = volts × amps.)

    If you're on the mains you'll find that switching off at the wall will save you about $20 to $40 per year, which also keeps around 40 kg of carbon out of the atmosphere...and this is for EVERY ELECTRONIC device in your house!

    turning off at the wall? it's a "no-brainer" really!

    Love and Light,

    Jarrah


  2. I have a Dell Inspiron 600m notebook.  Since these have power adapters rated at 65 watts, this is the most power that they can use.

    Where I live, electric energy is billed at about $1.00 (U.S.) per 6.5 kilowatt-hours.

    That means that you pay a dollar to run a 6500 watt appliance for an hour (or $0.50 to run it for a half-hour).  So the cost depends on the amount of time that it is used.

    The cost also depends on the power taken by the appliance.  So it would cost me $1.00 to run a 13000 watt device for a half-hour.

    After doing the math (65 W × 24 h = 1.56 kWh per day, at $1.00 per 6.5 kWh), I find that my 65 watt notebook computer could cost me about 24 cents per day to run, or roughly $7 per month.

    The truth is that it costs less than that, because the computer cannot be drawing full power while it is idle in standby mode.

    I think that $7 a month is "noticeable". Remember that towers may take more power than notebooks: a notebook must be specially designed to conserve power so that it can run on a battery.

  3. It's better for your computer to be off than on. I know this because I work with computers every single day.

  4. If the screen is off, then the only power draw left is from the main power supply unit. In most cases, you can read how many watts it can supply under full load on the label on the back of it (where the cable leading to the outlet is connected). For standard mid-range computers this would be about 350 W to 500 W. However, I hear that some ultra-high-end gaming systems are already going over 1 kW, which you would only reach when running four graphics cards in parallel, etc.

    That wattage is a maximum, so when your computer is idle it probably needs quite a bit less. You can reduce it further by putting it into standby (which only needs enough power to keep from losing the data in memory) or hibernation (not all computers support it; it stores the memory contents on the hard disk and turns off completely, then restores its state when you turn it back on).
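    To put those modes side by side, here is a sketch of what each one costs per year if left that way around the clock. The wattages and the $0.15/kWh rate are illustrative assumptions, not measurements; a plug-in power meter will give you real numbers for your machine:

    ```python
    # Sketch: yearly cost of leaving a desktop idling vs. standby vs. off.
    # All wattages and the rate below are assumed example values.

    RATE_PER_KWH = 0.15    # assumed electricity price, $/kWh

    def yearly_cost(watts, hours_per_day=24):
        """Cost of drawing `watts` continuously for `hours_per_day`, daily."""
        kwh_per_year = watts * hours_per_day * 365 / 1000
        return kwh_per_year * RATE_PER_KWH

    modes = [("idle desktop", 100),
             ("standby (suspend-to-RAM)", 5),
             ("off at the wall", 0)]
    for label, watts in modes:
        print(f"{label:26s} {watts:4d} W -> ${yearly_cost(watts):7.2f}/year")
    ```

    The gap between idling and standby is where nearly all of the savings live, which is why the "turn it off at the wall" advice above matters.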

Question Stats

This question has 4 answers.
