
Optimizing LED Performance

Over the last several years, improved efficacy of modern LEDs has allowed engineers to use fewer chips in their lighting fixtures to achieve the same light output as older designs. The key benefits are reduced cost and simpler design. The enhanced robustness of today’s chips has allowed engineers to take the approach a stage further by increasing the forward current to further boost luminosity of individual LEDs and cut their number even further in a given design (see the TechZone article “Maximizing LED Luminosity to Drive Down System Cost”).

However, although LEDs shine brighter at higher drive currents, the increased luminosity comes at the cost of reduced efficacy (the chips exhibit highest efficacy at relatively low currents, see the TechZone article “Identifying the Causes of LED Efficiency Droop”). That increases power consumption and negates some of the gains of reduced components cost over the lifetime of the lighting fixture. The increased power consumption also causes the chips to run hotter, shortening their lifespan.

This article explains the trade-off between cost, design simplicity, power consumption, and longevity for a given LED application running at high-drive current.  

Finding the limit

In the early days of LEDs, when the fledgling technology delivered less light and was much more fragile than today, engineers were advised to limit the drive current to LEDs in order to maximize efficacy and prolong the life of the devices.

However, two decades of development (Figure 1) have seen LED efficacy soar and tolerance to the high junction temperatures that result from increased drive current improve significantly. Many engineers question why low drive currents are still necessary when such currents limit a chip’s output to only a fraction of its capability.

Figure 1: Today’s high-power LEDs allow fewer chips to be used for a given output of a lighting fixture. (Courtesy of OSRAM)

In addition, a growing body of evidence from industry-standard tests such as LM-79, LM-80, and TM-21 shows that modern LEDs can operate for tens of thousands of hours at temperatures that would have quickly killed the previous generation’s devices.

Manufacturers have encouraged the use of higher drive currents by offering chips that can be pushed much harder than early 350 mA devices. Cree’s XLamp MK-R, for example, is a 120 lm/W, 1000 lm (at 1.4 A/6 V) LED that can be driven at up to 2.5 A. Similarly, Philips Lumileds’ LUXEON M is a 134 lm/W, 1000 lm (at 2.8 A/2.8 V) chip that can be driven at up to 4.8 A.

However, as robust as modern LEDs are, there is a limit to how far this strategy can be taken. While increasing the drive current improves luminosity, it also increases power consumption disproportionately (forward voltage rises along with current) and pushes up temperature. In a lighting application powered from the mains supply, a degree of increased power consumption might be acceptable when traded off against the other benefits of reducing the LED count, but the engineer still needs to make an informed decision about the fixture’s operating point – particularly as low power consumption is touted as a key advantage of LEDs compared with other light sources.

Also, while modern thermal-management techniques and products allow solid-state lighting to tolerate much higher junction temperatures, the use of the technology in confined spaces, for example, puts an upper safety limit on how hot the fixture can get.

Calculating power consumption

How does a design engineer know the optimum operating point for the LEDs in his or her application? Much depends on that application of course, but in truth there is no definitive answer. The key is to determine the critical design parameters. For example, end-product purchase cost might dictate fewer LEDs at the expense of greater energy consumption over the life of the product. Or restricting chromaticity drift (which increases with temperature – see the TechZone article “Thermal Effects on White LED Chromaticity”) might be important in a high-quality light fixture, demanding more LEDs each driven at a lower current to keep the temperature down.

However, a good starting point is to check how forward current influences power consumption and efficacy.  

Forward current versus efficacy is rarely detailed in an LED datasheet. Tabulated efficacy data is also often difficult to track down. However, calculating this key metric is relatively straightforward using forward current (IF) vs. forward voltage (VF) and forward current vs. relative luminosity data. Another useful figure is typical lumen output at a given IF. This data is typically available from the manufacturers’ datasheets.

LED luminosity (ΦV), power consumption, and efficacy (η) can then be calculated versus IF. As previously noted, peak efficacy for LEDs occurs at a relatively low IF and tails off steadily as the current approaches the maximum rating[1].
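As a sketch of this calculation, the snippet below takes a handful of illustrative (IF, VF, relative luminosity) operating points of the kind that can be read off datasheet curves and derives luminosity, power consumption, and efficacy at each one. The sample values are assumptions chosen to resemble a typical high-power white LED, not exact figures for any specific device:

```python
# Sketch: deriving efficacy vs. forward current from datasheet curve data.
# The (IF, VF, relative luminosity) samples below are illustrative values
# of the sort read off IF-vs-VF and IF-vs-relative-luminosity curves.

REF_FLUX_LM = 147.0  # typical flux (lm) at the 350 mA reference point

samples = [
    # IF (A), VF (V), relative luminosity (1.00 at 350 mA)
    (0.10, 2.75, 0.33),
    (0.35, 2.95, 1.00),
    (0.50, 3.05, 1.35),
    (0.80, 3.18, 1.90),
    (1.00, 3.25, 2.25),
]

for i_f, v_f, rel_lum in samples:
    flux = rel_lum * REF_FLUX_LM   # luminosity PhiV, lm
    power = i_f * v_f              # electrical power, W
    efficacy = flux / power        # lm/W
    print(f"IF={i_f * 1000:4.0f} mA  flux={flux:5.1f} lm  "
          f"P={power:4.2f} W  efficacy={efficacy:5.1f} lm/W")
```

Running the sketch shows the pattern described above: efficacy peaks at low forward current and falls steadily as the current approaches the maximum rating, even though absolute luminosity keeps climbing.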

Figures 2 and 3 show the forward current (IF) vs. forward voltage (VF) and forward current vs. relative luminosity for an OSRAM OSLON SSL 150 white LED, a popular device for lighting applications. According to the manufacturer’s datasheet, this LED produces 147 lm at an efficacy of 142 lm/W when operating at an IF of 350 mA and VF of 2.95 V. However, the LED is capable of being driven at up to 1 A, so let’s consider two design options for an 1100 lm lighting fixture. (For ease of calculation it has been assumed that the LED driver is 100 percent efficient.)

Figure 2: Forward voltage against forward current for an OSLON SSL 150 white LED.

Figure 3: Forward current against relative luminosity for OSLON SSL 150 white LED.

In the first hypothetical example, the designer has chosen to drive the LEDs at 350 mA (hence––as shown in Figure 2––at a forward voltage of 2.95 V). The manufacturer’s datasheet reveals that each LED produces 147 lm under these operating conditions. The power consumption is 1.03 W so the efficacy is 142 lm/W. To obtain the required output from the fixture the designer will need 1100 lm/147 lm = 8 LEDs. The total power consumption of the fixture will therefore be 8.24 W.

A fellow designer has decided to take an alternative route and opts to push up the drive current so she can reduce the number of LEDs in the fixture, simplifying the design and cutting the bill of materials. This time the designer has chosen a drive current of 800 mA (well within the capability of the LED). Figure 2 shows that the forward voltage will creep up to 3.18 V. Figure 3 shows that at this drive current the LED’s relative luminosity has climbed to 1.9, so each device now produces 1.9 x 147 lm, or around 279 lm. However, the power consumption of each device has now increased to 2.54 W and the efficacy has dropped to 110 lm/W. To achieve the specified output of 1100 lm, this alternative design of lighting fixture will need four LEDs so the designer has achieved some of the design objectives. However, total power consumption has now climbed to 10.2 W (an increase of more than 20 percent on the original design).
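The arithmetic behind both designs can be checked with a short calculation. The sketch below reproduces the figures from the text (147 lm reference flux at 350 mA/2.95 V, 100 percent driver efficiency assumed); the `fixture` helper is an illustrative name, not part of any vendor tool:

```python
# Comparing the two hypothetical 1100 lm fixture designs discussed above.
import math

TARGET_LM = 1100.0
REF_FLUX_LM = 147.0  # per-LED flux at the 350 mA / 2.95 V datasheet point

def fixture(i_f, v_f, rel_lum):
    """Return (LED count, efficacy lm/W, total power W) for one design."""
    flux = rel_lum * REF_FLUX_LM        # lm per LED
    power = i_f * v_f                   # W per LED
    n = math.ceil(TARGET_LM / flux)     # LEDs needed to hit the target
    return n, flux / power, n * power

low_drive = fixture(0.350, 2.95, 1.0)   # design 1: 350 mA
high_drive = fixture(0.800, 3.18, 1.9)  # design 2: 800 mA

print(low_drive)    # 8 LEDs, ~142 lm/W, ~8.3 W total
print(high_drive)   # 4 LEDs, ~110 lm/W, ~10.2 W total
```

(The first design works out to about 8.3 W here; the 8.24 W quoted above comes from rounding the per-LED power to 1.03 W before multiplying by eight.)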

Beware of the downsides

Increasing the drive current to boost LED luminosity is a good way to decrease the number of LEDs in a lighting design with a specified output. Fewer LEDs reduce complexity and simplify the product, but a designer should be aware of the downsides. First, LEDs are less efficient at higher drive currents, increasing power consumption for fixtures (of a given output) with fewer devices. Second, higher drive currents increase LED junction temperatures, shortening device life. Third, both higher drive currents and higher temperatures have been shown to cause chromaticity drift away from the intended color point for a design. Such drift is hard to control and can cause identical designs to cast different light, which could lead to consumer disappointment.

For more information about the parts discussed in this article, use the links provided to access product pages on the Hotenda website.


  1. "Optimal operating point of an LED," Donald Schelle, Analog Applications Journal, Texas Instruments, Q1 2015.