Question:

Do I need a resistor if the Source voltage and diode forward voltage are the same?

The LED draws 20 mA. Also, what would happen if I connected more LEDs in parallel?

3 ANSWERS


  1. An LED, or light-emitting diode, behaves like any other diode when biased in the forward conducting mode: its current-versus-voltage curve rises steeply above the "knee" of the forward-bias plot. This means you must either power the LED from a constant-current source at its rated forward current (20 mA in your example) or limit the current from a constant-voltage source with a resistor in series with the LED. The latter approach is most common for LEDs used as indicator lamps, while sophisticated switch-mode constant-current supplies are typical for illumination; integrated circuits designed specifically for driving high-power LED luminaires are also available.
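
    To put numbers on the series-resistor approach, here is a minimal Python sketch. The 5 V supply and 2 V forward drop are assumed example values; only the 20 mA rating comes from the question:

        # Sizing a series resistor for an LED fed from a constant-voltage
        # supply. Supply and forward voltages are assumed example values;
        # substitute your own.
        V_SUPPLY = 5.0     # supply voltage, volts (assumed)
        V_FORWARD = 2.0    # LED forward drop, volts (typical red LED, assumed)
        I_FORWARD = 0.020  # rated forward current, amps (20 mA, per the question)

        # The resistor must drop the excess voltage at the rated current.
        r = (V_SUPPLY - V_FORWARD) / I_FORWARD
        p = (V_SUPPLY - V_FORWARD) * I_FORWARD  # power burned in the resistor

        print(f"Series resistor: {r:.0f} ohm, dissipating {p * 1000:.0f} mW")
        # -> Series resistor: 150 ohm, dissipating 60 mW

    In practice you would round to the nearest standard value (150 ohm happens to be one) and pick a resistor rated well above 60 mW.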

    Multiple LEDs are wired in series so that each LED carries the same current. Because the voltage-versus-current relationship is temperature-sensitive, and the exact shape of the curve varies from one part to another, it is not recommended that LEDs ever be wired in parallel. Parallel-wired LEDs give uneven light output at best; at worst, some LEDs will not illuminate at all while others draw more than their rated current.


  2. If the source voltage equals the forward voltage of the diode, then you don't need a series resistor. Provided the voltage source has enough current capacity, you could add as many LEDs in parallel as you need.

    With each LED drawing 20 mA, that works out to 50 LEDs per amp of supply capacity.

  3. You have a sticky situation there.

    If the voltage is EXACTLY what you need, then you have no need for a current-limiting resistor. But...

    If your voltage source is not-so-well regulated, you will have problems. Here is the deal:

    Take a garden-variety red LED. It is supposed to have a 2 V drop and will usually pass about 20 mA at 2 V. But if the source voltage rose by only 0.3 V, the current would shoot up to more than three times the rated value and blow the LED right up.
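
    To see why, model the LED near its operating point as a knee voltage in series with a small dynamic resistance, a common first-order approximation. The knee and resistance below are assumed values, chosen so the model passes 20 mA at 2 V:

        # Piecewise-linear LED model: no current below the knee, then a
        # steep rise set by the dynamic resistance. Values are assumed.
        V_KNEE = 1.9  # effective knee voltage, volts (assumed)
        R_DYN = 5.0   # dynamic resistance near 20 mA, ohms (assumed)

        def led_current(v):
            """Current through the modeled LED at source voltage v."""
            return max(0.0, (v - V_KNEE) / R_DYN)

        for v in (2.0, 2.1, 2.2, 2.3):
            print(f"{v:.1f} V -> {led_current(v) * 1000:.0f} mA")
        # 2.0 V -> 20 mA (rated)
        # 2.3 V -> 80 mA (four times rated, from only +0.3 V)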

    As an engineer, I would suggest using an incandescent bulb in that application. Or, if you must use an LED, you could rig up an elaborate scheme (or a small DC-DC converter chip) to boost the voltage, so you could put a fixed resistor in series with the LED as a current limiter.

    You did not say anything about the stiffness of your source. If your voltage source is rather "spongy", you will get some natural current regulation from the supply's ESR (equivalent series resistance). But if it is a stiff, rigid voltage, expect problems powering your LED.
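
    Extending the same assumed model with a source resistance shows the effect: a spongy supply's ESR adds to the LED's dynamic resistance and soaks up the overvoltage, while a stiff supply does not:

        # Same piecewise-linear LED as above, now fed from a source with
        # internal resistance (ESR). All values remain assumed examples.
        V_OPEN = 2.3  # open-circuit source voltage, volts (0.3 V overshoot)
        V_KNEE = 1.9
        R_DYN = 5.0

        for esr in (0.1, 5.0, 15.0):  # stiff, moderate, spongy source
            i = (V_OPEN - V_KNEE) / (esr + R_DYN)
            print(f"ESR {esr:>4.1f} ohm -> {i * 1000:.0f} mA")
        # 0.1 ohm  -> 78 mA (a stiff source cooks the LED)
        # 15.0 ohm -> 20 mA (a spongy source limits it naturally)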

    Unless all the diodes are from the same manufacturing batch, NEVER CONNECT LEDs IN PARALLEL. Nothing good will come of it. Even a small manufacturing variation between two LEDs will cause one to take nearly all the current and the other almost none.
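
    The same assumed model shows how badly parallel LEDs share current. A 100 mV knee mismatch, a plausible part-to-part spread picked purely for illustration, pushes one LED over its rating while the other barely glows:

        # Two nominally identical LEDs in parallel across a fixed 2.0 V
        # source, differing only by an assumed 100 mV knee mismatch.
        V_SOURCE = 2.0
        R_DYN = 5.0  # dynamic resistance, ohms (assumed, as above)
        knees = {"LED A": 1.88, "LED B": 1.98}  # knee voltages, volts

        for name, v_knee in knees.items():
            i = max(0.0, (V_SOURCE - v_knee) / R_DYN)
            print(f"{name}: {i * 1000:.0f} mA")
        # LED A: 24 mA (over its 20 mA rating)
        # LED B:  4 mA (barely lit)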
