oreo57
Well-known member
Ok I got it. Thanks for the time and the breakdown. One more final question: I won't need resistors because the driver will deliver the current exactly?
Yes, in the examples the "driver" IS the resistor, because you are using a constant-voltage source and the resistor only needs to be one value.
When you use constant-current drivers it frees you from having to supply the exact voltage to run the LEDs... the driver adjusts the voltage internally to keep the amps constant.
Amps = Volts / Resistance (Ohm's law: I = V/R).
http://www.instructables.com/id/Choosing-The-Resistor-To-Use-With-LEDs/?ALLSTEPS
http://www.digikey.com/Web Export/Supplier Content/Allegro_620/PDF/Allegro_an295031.pdf?redirected=1

LEDs are current-driven devices that require current limiting when driven from a voltage source. In most applications, it is desirable to drive LEDs with a constant-current source. The current source is used to regulate the current through the LED regardless of power supply (voltage) variations or changes in forward voltage drops, VF, between LEDs.
http://en.wikipedia.org/wiki/Current_source
http://en.wikipedia.org/wiki/LED_circuit
What many people miss
The LED used will have a voltage drop, specified at the intended operating current. Ohm's law and Kirchhoff's circuit laws are used to calculate the resistor that is used to attain the correct current. The resistor value is computed by subtracting the LED voltage drop from the supply voltage, and then dividing by the desired LED operating current. If the supply voltage is equal to the LED's voltage drop, no resistor is needed.
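To make that resistor math concrete, here is a rough sketch in Python. The numbers (12 V supply, 3.1 V drop per LED at 700 mA, three LEDs in series) are made-up example values, not from this thread:

```python
# Rough sketch: sizing a series resistor for LEDs run from a constant-voltage supply.
# All component values below are assumed example numbers.

def led_resistor(supply_v, led_vf, led_current_a, leds_in_series=1):
    """Return (resistance_ohms, resistor_power_w) for a simple series resistor."""
    total_drop = led_vf * leds_in_series
    headroom = supply_v - total_drop          # voltage the resistor must absorb
    if headroom <= 0:
        raise ValueError("Supply voltage must exceed the total LED voltage drop")
    resistance = headroom / led_current_a     # Ohm's law: R = V / I
    power = headroom * led_current_a          # P = V * I dissipated in the resistor
    return resistance, power

r, p = led_resistor(supply_v=12.0, led_vf=3.1, led_current_a=0.7, leds_in_series=3)
print(f"Resistor: {r:.2f} ohms, dissipating {p:.2f} W")
```

With those example numbers the resistor has to drop 2.7 V at 0.7 A, which is nearly 2 W wasted as heat... part of why constant-current drivers are preferred at higher currents.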
So practically speaking, you NEED to know the voltage drop AT the current (mA) you want to run the LEDs at, in order to correctly determine the power supply voltage you need...
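For example (made-up datasheet numbers): if VF is about 3.0 V at 350 mA but about 3.4 V at 1000 mA, a string of six of those LEDs needs roughly 18 V at the low current but about 20.4 V at the high current, before any headroom for the driver is added.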