TropTrea
An LED is a diode, and a diode does not have a linear relationship between voltage and current: the effective resistance is large at low voltages, and at a certain forward voltage (FV) it becomes very low.
As an example: Cree's XM-L LED.
At a voltage of 2.8 V, the current is 400 mA. At 3 V, the current is 1100 mA and at 3.2 V, the current is 2100 mA. At a difference of 0.4 V, the current increases by almost 1700 mA or more than 5 times.
Since it is the current that burns out our chips, it's much wiser (and safer) to use a source that provides a constant current and lets the voltage across the load vary.
I'm not saying it is not possible with a constant-voltage source, but it requires a lot more knowledge and fine-tuning, especially if you are close to the chip's FV.
Sincerely Lasse
Yes, for our application with high-output diodes this is especially true. It is even more important since we are working in a low voltage range. However, there are some commercial applications where voltage regulation is the much cheaper route, even considering the higher potential of burning out LEDs. In these applications Zener diodes are usually used to regulate the voltage well below the threshold of the LEDs.
As you mentioned earlier, just expanding on the XM-L LED:
Voltage   Current   Wattage
2.8 V     0.4 A     1.12 W
3.0 V     1.1 A     3.3 W
3.2 V     2.1 A     6.7 W
We can then roughly project that:
3.4 V     4.1 A     14 W
3.6 V     8.1 A     29 W
The safe maximum for this LED in long-term use is only 5 W. So with a small change of just 0.2 V you more than double the wattage, well beyond the safe range.
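If you want to check that arithmetic yourself, here is a minimal Python sketch of the table above. The 3.4 V and 3.6 V rows are my rough projections rather than datasheet values, and the 5 W figure is the long-term safe limit mentioned above.

# Minimal sketch of the table above. The first three rows are the
# measured XM-L points; the 3.4 V and 3.6 V rows are rough projections,
# not datasheet values.
data = {          # forward voltage (V) -> current (A)
    2.8: 0.4,
    3.0: 1.1,
    3.2: 2.1,
    3.4: 4.1,     # projected
    3.6: 8.1,     # projected
}

SAFE_WATTS = 5.0  # long-term safe limit quoted above

for volts, amps in data.items():
    watts = volts * amps
    status = "over the safe limit" if watts > SAFE_WATTS else "ok"
    print(f"{volts:.1f} V  {amps:4.1f} A  {watts:5.2f} W  ({status})")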
Now, as said, there are applications where this is not as much of an issue; an example is some signage use. Here you may have 30 LEDs wired in series with a pulsing power supply generating a 33% duty cycle at 85 V. This puts 2.83 V across each LED only 1/3 of the time, which allows cooling to occur the other 2/3 of the time.
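Here is a quick Python sketch of those series-string numbers. Note that the ~0.4 A current at 2.8 V is borrowed from the XM-L table above purely for illustration; real signage LEDs will have their own V-I curve.

# Quick check of the series-string signage numbers.
supply_volts = 85.0
leds_in_series = 30
duty_cycle = 1 / 3                      # power applied 33% of the time

volts_per_led = supply_volts / leds_in_series
current_per_led = 0.4                   # amps at ~2.8 V (assumed, from the table above)

peak_watts = volts_per_led * current_per_led
avg_watts = peak_watts * duty_cycle     # averaged over the 33% duty cycle

print(f"Per-LED voltage:       {volts_per_led:.2f} V")   # ~2.83 V
print(f"Peak per-LED power:    {peak_watts:.2f} W")
print(f"Average per-LED power: {avg_watts:.2f} W")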
The biggest destroyer of LEDs is actually heat. The more power an LED is consuming, the more heat is generated, and that heat slowly builds up. If you look at efficiency ratings on LEDs, they become less efficient at higher currents, which shows that the ratio of heat to light increases as the current goes up.
If someone wanted to do an experiment on LEDs at higher currents and their life spans, it could easily be done. They would find that at some magic point the life spans start decreasing drastically, to less than 50%, from a change in voltage of only a few hundredths of a volt. But unless you want to simply blow up a hundred LEDs, it is best to just take the manufacturer's word and observe their limits.