GordonThree wrote:
3s 3p led array
Vf(typ) 3.3vdc per chip, x 3 in series = 9.9 volts. Let's round to 10
Power: P = V x I ... So 10 = 10 x I, so I = 1 amp
So let's say 13v supply voltage means a resistor needs to drop 3 volts.
Ohm's law: V = I x R ... So 3 = 1 x R, thus R = 3 ohms (3.3 ohms nearest easy value)
Power: P = I²R ... So P = 1 x 3.3, thus P = 3.3 watts
I'm a bit rusty... What did I miss?
Of course you can under drive the module to reduce loss but also output.
Technically, your math is right, but your ASSUMPTIONS ARE WRONG, which throws the whole result off.
You're assuming 3.3V per diode; that is not correct.
White LEDs can be had with forward voltages ranging from 3.1V up to 4.1V, depending on the materials used to make the diode junctions.
The 10W COB that the OP has is most likely similar to the ones I have, which are spec'd at 11.1V across the series string; that puts each diode junction at 3.7V.
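Redoing GordonThree's math with the 11.1V string instead of 9.9V, here's a quick sketch (Python; the 13V supply and 11.1V/10W figures are my assumed numbers, so check your own COB's datasheet):

```python
def dropper_resistor(v_supply, v_string, p_led):
    """Series resistor value and its dissipation for a given LED string."""
    i = p_led / v_string          # string current at full rating
    v_drop = v_supply - v_string  # voltage the resistor must drop
    r = v_drop / i                # Ohm's law: R = V / I
    p_r = v_drop * i              # power burned in the resistor
    return i, r, p_r

i, r, p_r = dropper_resistor(13.0, 11.1, 10.0)
print(f"I = {i:.2f} A, R = {r:.2f} ohm, P = {p_r:.2f} W")
```

With those numbers the string draws about 0.9A, the resistor only has to drop 1.9V, and it dissipates under 2W, which is where my "couple of watts" below comes from.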
Now, since I have messed with 10W, 20W, 30W and 50W versions, I can't say for sure at this time the exact max current for each; I would have to dig around to find the manufacturer's spec sheet and my notes.
I can tell you this: the 10W versions ARE 11.1V for full brightness and current draw. The 20W, 30W and 50W versions require 34V as a minimum, which to operate from 12V would need a boost power supply.
I did retrofit a couple of my TT's outdoor porch fixtures with the spare 10W versions, since I didn't need a boost regulator.
Unfortunately, it IS a very dreary and wet evening tonight and the rain is forecast through Sat, so I am not about to get a ladder out and pop the lens off to verify the resistor I used.
My point to you is that a couple of watts of heat wasted by a resistor is not in any way, shape, or form "substantial" compared to the wattage consumed by the COB.
Practically speaking, the difference IS NOT SIGNIFICANT, since the increase in loss is very small compared to the overall power used by the entire device.
People tend to forget that a switching regulator does indeed WASTE energy in the form of heat; it just wastes a SLIGHTLY SMALLER AMOUNT.
Typical switching supplies have efficiencies of 80%-85%; some may reach 90%-92%, but not very many do. That means a switching regulator will waste 15%-20% of the power that goes into it as heat.
If the power wasted by the resistor were EQUAL TO OR MORE THAN what the COB consumes, then I would consider that SIGNIFICANT.
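To put rough numbers on that, here's a back-of-the-envelope comparison assuming a 10W COB with an 11.1V string at about 0.9A from a 13V supply, and a typical 85% efficient switcher (all of these are assumed figures, not measurements):

```python
# Assumed numbers: 10 W COB, 11.1 V string, ~0.90 A, 13 V supply.
p_led = 10.0
i = 0.90
v_supply, v_string = 13.0, 11.1

# Resistor dropper: everything above the string voltage is burned as heat.
p_resistor = (v_supply - v_string) * i   # watts lost in the resistor

# Switching regulator at an assumed 85% efficiency:
eff = 0.85
p_switcher = p_led / eff - p_led         # watts lost in the converter

print(f"resistor loss ~= {p_resistor:.1f} W")
print(f"switcher loss ~= {p_switcher:.1f} W")
```

With only ~2V of headroom to drop, the resistor and the switcher waste nearly the same couple of watts; the switcher only pulls ahead when the voltage difference is large.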
And YES, I ERR on the safe side by UNDER-DRIVING THE LED. There is a point where the current increases rapidly yet the light gain is extremely small; past that point, pushing it to max yields very little extra light and severely reduces the operating life.
That is intentional, and vastly increases the life of the LED (or any device for that matter).
This also allows for the fudge factor when things don’t go as planned.
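As a rough illustration of that fudge factor: if you used the 3.3-ohm resistor from the original post with the 11.1V string, a first-order estimate (ignoring that the forward voltage sags a bit at lower current, so the real current would run somewhat higher) puts you well under the rated current:

```python
# First-order estimate with assumed values; Vf actually drops a little
# at lower current, so treat this as a conservative lower bound.
v_supply, v_string, r = 13.0, 11.1, 3.3
i_rated = 0.90                      # ~full-brightness current for a 10 W / 11.1 V COB
i = (v_supply - v_string) / r       # current with the 3.3-ohm dropper
print(f"under-driven current ~= {i:.2f} A ({i / i_rated:.0%} of rated)")
```

That's roughly 60% of full drive: noticeably dimmer, but much cooler and much longer-lived, with margin left over when the supply rides up above 13V.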