Question:

If a light bulb is 100 W and an A/C is 5000 W, does that mean the A/C eats 50 times more energy than the light bulb?



Is it directly proportional? If I run my A/C for 1 hour, does it equal 100 light bulbs lit for an hour, or 1 bulb for 100 hours?



6 ANSWERS


  1. Well, they are "rated" for that much use, so you have to have a plug/circuit that can handle that much load. The light bulb is a constant load (100 W all the time) while the AC unit may be a varying load (on/off, high temp, low temp, fan high, fan low). However, if you put the AC on high fan with the coldest temperature setting, you will probably hit the maximum of around 5 kW. If you are on low fan and moderate cooling, you won't be close to the 5 kW limit.

    But you HAVE to have a circuit capable of it. Many AC units require a 20 A circuit, while most household wall sockets are only wired for 15 A. So check your AC unit's requirements and make sure there is a wall socket nearby that matches what the unit needs.

    A fun check is to go to your local hardware store; many now sell cheap "watt meter" or "power meter" devices, for as little as $10 or $20 US as I remember. You plug the meter into the wall and then plug your device into the meter. It measures the wall voltage and the current flowing to your device, then displays watts, current (amps), and voltage. Some are nice enough to total the consumption over a length of time. Your local power company may also sell one or know where to get one. Of course, much fancier and more capable meters are available and cost a bit more.

    The meters can help you track the "load" mentioned in the other answers, be it 90% or 10% loading, for example.
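
    A rough sketch of that circuit arithmetic, assuming a nominal 120 V outlet and the common rule of thumb that a continuous load should stay under about 80% of the breaker rating (both are assumptions, so substitute your own local figures):

        # Does a given wattage fit on a household circuit?
        # Assumes a nominal 120 V supply and ignores power factor,
        # which is a simplification for a motor load like an AC.

        def amps_drawn(watts, volts=120.0):
            """Current implied by a wattage at a given voltage (P = V x I)."""
            return watts / volts

        def fits_circuit(watts, breaker_amps, volts=120.0, margin=0.8):
            """Continuous loads are commonly kept to ~80% of the breaker rating."""
            return amps_drawn(watts, volts) <= breaker_amps * margin

        print(amps_drawn(100))         # ~0.8 A for the 100 W bulb
        print(amps_drawn(5000))        # ~41.7 A, far beyond a 15 A or 20 A outlet
        print(fits_circuit(100, 15))   # True
        print(fits_circuit(5000, 20))  # False, a true 5 kW draw would need a higher-voltage or dedicated circuit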


  2. You are correct, but there is one issue.

    The AC, like your fridge, cycles on and off, controlled by a thermostat. It does not run all the time.

    If the room is hot and you turn the AC on, it will run continuously for  perhaps an hour, and your calculations are correct. But when the room temperature reaches the point set by the thermostat, the compressor shuts off, and the power consumption of the AC goes down a lot.

    You can usually tell the difference by the sound. The fan continues running, but that is less than 1/10 of the power; most of the power is used by the compressor.

    So you have to factor in the duty cycle of the AC. It could be 50%, it could be 90%, it could be 10%; it all depends on the thermostat setting, the outside temperature, the number of windows, etc.
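
    To put numbers on that duty cycle, here is a small sketch. The 5000 W compressor-on figure and the 300 W fan-only figure are illustrative assumptions, not measurements of any particular unit:

        # Energy used over a period when the compressor only runs part of the time.

        def ac_energy_kwh(hours, duty_cycle, compressor_watts=5000, fan_watts=300):
            """kWh consumed given the fraction of time the compressor is running."""
            on_hours = hours * duty_cycle
            off_hours = hours * (1 - duty_cycle)
            return (on_hours * compressor_watts + off_hours * fan_watts) / 1000

        for duty in (0.1, 0.5, 0.9):
            print(f"{duty:.0%} duty cycle for 1 hour: {ac_energy_kwh(1, duty):.2f} kWh")
        # 10% duty cycle for 1 hour: 0.77 kWh
        # 50% duty cycle for 1 hour: 2.65 kWh
        # 90% duty cycle for 1 hour: 4.53 kWh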

  3. Simply yes.

    However, a 100 W light bulb is a constant load; in one hour it will consume 0.1 kWh. The A/C's 5000 W is most likely its rated output (cooling) capacity, not what it consumes, and there are many nuances beyond that, such as loading and cycling. Most likely it will have a nameplate rating that tells you what amperage it draws.

    watt = voltage x current (amps)

    kW = kilowatt = 1000 watts

    kWh = kilowatt-hour = 1000 watts for 1 hour
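
    Those relations are enough to turn a nameplate into an energy figure. A quick sketch, using a made-up nameplate of 230 V and 22 A purely as an example:

        # P = V x I, and kWh = (watts / 1000) x hours.

        def watts_from_nameplate(volts, amps):
            return volts * amps

        def kwh(watts, hours):
            return watts / 1000 * hours

        nameplate_watts = watts_from_nameplate(230, 22)   # 5060 W
        print(nameplate_watts)
        print(kwh(nameplate_watts, 1))   # ~5.06 kWh if it ran flat out for an hour
        print(kwh(100, 1))               # 0.1 kWh for the 100 W bulb, as above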

  4. Basically yes, but the light bulb is a nominal 100 W any time it is on, while the air conditioner has a thermostat and its power draw varies; the 5000 W is a maximum (not counting start-up surge).

    Even if the AC is "on" for an hour, it might not be using 5000 W the whole time; the compressor cycles on and off. When the compressor is off and just the fan is running, the power draw will be less.
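
    Folding that cycling back into the original comparison, a rough sketch (the duty cycles and the 300 W fan-only draw are assumptions for illustration):

        # How many hours of a 100 W bulb match one hour of AC use,
        # once the compressor's on/off cycling is taken into account.

        def equivalent_bulb_hours(ac_hours, duty_cycle, max_watts=5000,
                                  fan_watts=300, bulb_watts=100):
            avg_watts = duty_cycle * max_watts + (1 - duty_cycle) * fan_watts
            return ac_hours * avg_watts / bulb_watts

        print(equivalent_bulb_hours(1, 1.0))  # 50.0, the worst case: compressor never stops
        print(equivalent_bulb_hours(1, 0.5))  # 26.5 with the compressor running half the time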

  5. Yes... if you run both for equal amounts of time, the AC uses 50 times the energy of the light bulb.

    If you run the AC for 1 hour, it is like running the light bulb for 50 hours.


  6. Yes, you are right: there is direct proportionality.

    1 W = 1 joule of energy consumed in 1 second.

    There is a proportionality factor of 50 (5000/100), so the AC running for 1 hour equals 50 light bulbs running for 1 hour.

    1 kWh is generally used as the unit of energy: 1 kW of power consumed for 1 hour. Energy = power x time.
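
    The same relations in a short sketch, showing the 50x ratio and the joule/kWh equivalents:

        # energy = power x time; 1 W = 1 J/s; 1 kWh = 1000 W for 1 hour = 3,600,000 J.

        def energy_joules(watts, seconds):
            return watts * seconds

        def energy_kwh(watts, hours):
            return watts / 1000 * hours

        bulb = energy_kwh(100, 1)   # 0.1 kWh
        ac = energy_kwh(5000, 1)    # 5.0 kWh, assuming a constant 5000 W draw

        print(ac / bulb)                  # 50.0, the proportionality factor
        print(energy_joules(100, 3600))   # 360,000 J for the bulb over an hour
        print(energy_joules(5000, 3600))  # 18,000,000 J for the AC over an hour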

