Question:

Do I use the input voltage/amps or the output voltage/amps when calculating how many watts I am using?



I am trying to figure out how many watts my PS2/mini TV are using, but the adapters are labelled input 100V-200V 1.2A, and that works out to a LOT of watts. Do I use the output instead, which varies from 8-10V and 4-5A? Thanks to anyone who can help.


3 ANSWERS


  1. I'd say use the input figures, because the input is the output plus the internal losses, so with the 1.2 A rating you can work out how many watts you are drawing.


  2. The problem with adapters is that they give only the maximum loaded ratings. The actual power used depends on the load, so check the PS2/mini TV themselves for what they need. It should be on a plate on them, or in the literature that came with them.

  3. Let me get this straight. The adapter says 100V-200V, 1.2A,

    and it says 8-10V and 4-5A.

    The number of watts you use depends on what the unit is doing. In some modes it will use more power than others.

    100V-200V 1.2A -- I think this is actually 100V-240V 1.2A, as 200V is not useful, whereas 240V means it will work in Europe. Most appliances are spec'd for 100-240V for that reason.

    The 1.2A figure will vary depending on the voltage you plug it into. At 240V it will draw a lot less current than at 100V. So I would use 100 * 1.2, or 120 watts, as the MAX input power.

    For DC power out, I'd take 9V and 5 amps, or 45 watts, as the power output. But even 10V and 5A is only 50W.

    I have no idea why they are so different; it could be a game of specsmanship, or a very inefficient supply.

    Bottom line, use 120 watts.
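
    A quick sketch of the P = V * I arithmetic from this answer, written as a small Python snippet. The voltage and current figures are the ones printed on the adapter in the question, the helper name is just for illustration, and the actual draw still depends on what the PS2/TV is doing at the time:

        # Power in watts from voltage (V) and current (A): P = V * I
        def watts(volts, amps):
            return volts * amps

        # Worst-case input power: lowest mains voltage at the full rated current.
        max_input_w = watts(100, 1.2)    # 120 W

        # DC output power, taking points from the 8-10 V / 4-5 A range on the label.
        output_w_mid = watts(9, 5)       # 45 W
        output_w_max = watts(10, 5)      # 50 W

        print("Max rated input:", max_input_w, "W")           # 120.0 W
        print("DC output (9 V, 5 A):", output_w_mid, "W")     # 45 W
        print("DC output (10 V, 5 A):", output_w_max, "W")    # 50 W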

