Forum for Illumination Research, Engineering, and Science (FIRES)
Category: Measurements and Calculations

An Intuitive Metric for Lumen Maintenance

Editorial Disclaimer
The views expressed in articles published on FIRES do not necessarily reflect those of IES or represent endorsement by the IES.

By Eric Bretschneider

For better or for worse, the lighting industry commonly associates the lifetime of LEDs and LED-based lighting products with L70 – the time required for the lumen maintenance of an LED-based device to reach 70% of the initial luminous flux. Admittedly, the failure of other components, particularly those that provide power to the LEDs, is more likely to determine the overall lifetime of an LED-based component or luminaire. However, only lumen maintenance is considered here.

That being said, I firmly believe that simple, intuitive metrics tend to get used more often simply because they are in fact intuitive. Unfortunately, all we have from TM-21 and TM-28 is L70. While L70 is intuitive and easy to understand, the way it is reported under those standards means we are often stuck comparing products with the exact same reported L70.

Both TM-21 and TM-28 model luminous flux behavior using:

\Phi = B\,e^{-\alpha t},       Eq (1)

where:
Φ = luminous flux
B = initial constant
α = decay rate constant [hr⁻¹]
t = time [hr]
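As a quick sanity check, Eq (1) is easy to evaluate in a few lines of Python (the α value here is the illustrative one used later in this article, not from any particular LM-80 report):

```python
import math

def luminous_flux(t, B=1.0, alpha=1.189e-5):
    """Normalized luminous flux after t hours, per the exponential
    decay model of Eq (1): phi = B * exp(-alpha * t)."""
    return B * math.exp(-alpha * t)

# Lumen maintenance at a few representative test times
for t in (0, 6_000, 10_000):
    print(f"{t:>6} h: {luminous_flux(t):.4f}")
```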

From Eq (1), L70 can be calculated:

L_{70} = \frac{\ln(B/0.7)}{\alpha}       Eq (2)
Because of uncertainties in the parameters, the maximum extrapolation limit for L70 is 6 times the total test duration of the dataset used to calculate the parameters in Eq (1). Given that most datasets range from 6,000 to 10,000 hours, this in turn limits reported L70 values to the maximum extrapolation limit, or 36,000 to 60,000 hours.
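The same arithmetic, with the 6× cap applied, can be sketched as below (a simplified rendering of the TM-21 reporting rule; the α value is illustrative):

```python
import math

def l70_hours(B, alpha):
    """Projected L70 in hours from Eq (2)."""
    return math.log(B / 0.7) / alpha

def reported_l70(B, alpha, test_hours):
    """Cap the projection at 6x the test duration (TM-21's
    maximum extrapolation limit, simplified here)."""
    return min(l70_hours(B, alpha), 6 * test_hours)

alpha = 5.236e-6                        # slowly decaying example
print(round(l70_hours(1.0, alpha)))     # projected value, ~68,000 hours
print(reported_l70(1.0, alpha, 8000))   # reported value, capped at 6 x 8,000 hours
```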

LED-based products may be evaluated by comparing L70 values, but when products have the same reported L70 (e.g., both capped at L70 > 36,000 hours), the only thing left is comparing α values.

Logically, a smaller value of α represents a product with a slower lumen depreciation rate, but how exactly do we compare different values of α? As a first step, we can calculate lumen maintenance for different time intervals, create data tables, and then compare the entries. That makes sense, but it seems like a lot of work, and it definitely doesn’t sound like it will allow a simple, intuitive comparison. Perhaps a bit of mathematical analysis can help us figure out a better approach.

Instead of calculating a table, let’s plot the data and see if it gives us any hints. To make things easy, we will assume B = 1.0. This isn’t a terrible assumption, since in most cases 0.98 ≤ B ≤ 1.02. Let’s take a look at α = 1.189 × 10⁻⁵ hr⁻¹. To most readers, I expect this value has no obvious meaning, but we’ll come back to that later.

Plotting the data gives us the figure shown below.

[Figure: plot of the modeled lumen maintenance data over the initial test period]

On cursory inspection, this looks an awful lot like a straight line. In fact, a least-squares fit of the data would give a line with a correlation coefficient of r² = 0.9999 and would match the data points to within 0.02%. That’s what I call a good fit!

For those who are mathematically inclined, this shouldn’t be a surprise. The Taylor series expansion for an exponential function is given as:

e^{-\alpha t} = 1 - \alpha t + \frac{(\alpha t)^{2}}{2!} - \frac{(\alpha t)^{3}}{3!} + \cdots

When αt << 1, the higher-order terms (αt)ⁿ/n! are negligible for n ≥ 2, leaving the linear approximation 1 − αt. At first glance, all we need to do is keep extending the line to L70 or our extrapolation limit, whichever comes first. Trying this, it looks like we reach L70 at about 26,250 hours (see next figure). At least we don’t have to worry about the extrapolation limit.
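The near-linearity is easy to reproduce. The sketch below fits a straight line to exponential data sampled every 1,000 hours out to 10,000 hours (my own sampling assumption; the figure may use a different grid, so the crossing time differs slightly from the value quoted above) and extrapolates the line to 70%:

```python
import math

alpha = 1.189e-5
ts = [1000 * k for k in range(11)]     # 0 .. 10,000 hours
ys = [math.exp(-alpha * t) for t in ts]

# Ordinary least-squares fit of the line y = a + b*t
n = len(ts)
tbar = sum(ts) / n
ybar = sum(ys) / n
b = sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys)) / \
    sum((t - tbar) ** 2 for t in ts)
a = ybar - b * tbar

t70_linear = (a - 0.7) / -b            # where the fitted line crosses 0.70
print(round(t70_linear))               # lands in the 26,000-27,000 hour range
t70_true = math.log(1 / 0.7) / alpha   # Eq (2) with B = 1
print(round(t70_true))                 # ~30,000 hours
```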

[Figure: straight-line fit extended out to L70 ≈ 26,250 hours]

At this point, you may be wondering why the working groups for TM-21 and TM-28 didn’t just use a linear equation. Extending our data gives the answer. In the plot below, the filled diamonds represent the initial data we used to calculate the straight line, and the open diamonds extend that data all the way out to L70. Doing this, we find that the “true” value of L70 for this example is 30,000 hours. Wasn’t that completely obvious from the fact that α = 1.189 × 10⁻⁵ hr⁻¹?

[Figure: filled diamonds show the initial data; open diamonds extend the exponential model out to L70 = 30,000 hours]

So, while a linear fit of the initial data might seem a reasonable approach, it represents an error in L70 of 12.5%. Close, but I’m not sure people would want to “discount” the performance of their LED products by that much.

Looking at all the data points out to L70, the curve still looks like a great approximation of a straight line. In fact, a least-squares fit would have a correlation coefficient of r² = 0.9979 and match the lumen maintenance to within 0.8%. As the value of α decreases, L70 increases and the curve becomes an even better approximation of a straight line. If only we could sort out how to convert α into the slope of a straight line.

This is actually pretty easy if you’re not afraid of a little math. We know that at t = 0 hours lumen maintenance = 1.0. We also know by definition at L70 lumen maintenance = 0.70. From TM-21 and TM-28 we also know how to calculate L70 from Eq (2) above. That means we should have everything we need to calculate the average slope. Let’s call this new metric δ: the average decay rate. We can calculate δ using:

\delta = \frac{0.3}{L_{70}}\times 10^{5} = \frac{0.3\,\alpha}{\ln(1/0.7)}\times 10^{5} \approx 84{,}110\,\alpha ,

α = decay rate constant from TM-21 or TM-28 [hr⁻¹]
δ = average decay rate [%/1,000 hrs, %/kh]
(the factor of 10⁵ converts a per-hour fraction into percent per 1,000 hours; B is taken as 1.0 here)

It turns out that all we have to do is multiply the decay rate constant (α) by a single number to convert it to an average decay rate. Best of all, the new number is simple and easy to understand. With units of %/kh, estimating the change in luminous flux over, say, 10,000 hours or 20,000 hours now becomes a math problem that most of us can do in our heads.
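In code, the conversion is a single multiplication. The factor below is 0.3/ln(1/0.7) scaled into percent per 1,000 hours; a quick sketch using the article's example α:

```python
import math

# Converts alpha [1/hr] into an average decay rate in %/1,000 hrs (%/kh)
PCT_PER_KH = 0.3 / math.log(1 / 0.7) * 1e5   # ~84,110

alpha = 1.189e-5
delta = alpha * PCT_PER_KH
print(f"delta = {delta:.2f} %/kh")

# Mental-math style estimates of total lumen depreciation
print(f"~{delta * 10:.0f}% after 10,000 h")
print(f"~{delta * 20:.0f}% after 20,000 h")
```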

Keep in mind that the difference between this average linear model and the exponential model of Eq (1) stays on the order of 1% all the way out to L70 or the extrapolation limit, whichever comes first. I would argue that this is remarkably accurate for such a simple model and should be more than adequate for quick comparisons.

In practice, the initial constant (B) does have an effect, but in most cases 0.98 < B < 1.02, which would change the value of the average decay rate by roughly 5% or less. That amounts to a worst-case error in estimating lumen maintenance of about 1% over a time span of 20,000 hours. I wouldn’t expect this to be a significant issue for quick comparisons. If product performance is that close, then you may just want to flip a coin.
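The sensitivity to B is easy to check directly. Keeping Eq (2)'s dependence on B, the sketch below (my own quick check, not from the standards) compares δ at the edges of the typical 0.98–1.02 range against B = 1:

```python
import math

def delta_pct_per_kh(B, alpha):
    """Average decay rate in %/kh, computed via L70 = ln(B/0.7)/alpha."""
    l70 = math.log(B / 0.7) / alpha
    return 0.3 / l70 * 1e5

alpha = 1.189e-5
base = delta_pct_per_kh(1.00, alpha)
for B in (0.98, 1.00, 1.02):
    d = delta_pct_per_kh(B, alpha)
    print(f"B = {B:.2f}: delta = {d:.3f} %/kh ({(d / base - 1) * 100:+.1f}% vs B = 1)")
```

Consistent with the ballpark above: the shift stays within roughly ±5–6%, small enough to ignore for quick comparisons.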

When using the average decay rate, it is still important to follow the rules for extrapolation limits. The intent of this paper is to give the industry an option for a simple, intuitive metric that will allow rapid comparisons of different products.

Would you rather compare two products with α1 = 5.236 × 10⁻⁶ hr⁻¹ and α2 = 7.120 × 10⁻⁶ hr⁻¹ (LM-80 duration for both = 8,000 hours), both of which have L70 > 48,000 hours, or two products with δ1 = 0.44%/kh and δ2 = 0.60%/kh? In 10,000 hours the difference in lumen maintenance between the products will be about 1.6%, and in 20,000 hours the difference will be about 3.2%.
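The δ comparison in that example works out as follows (same α values as above; the gaps are straight-line estimates, which is all a quick comparison needs):

```python
import math

PCT_PER_KH = 0.3 / math.log(1 / 0.7) * 1e5   # alpha [1/hr] -> delta [%/kh]

alpha1, alpha2 = 5.236e-6, 7.120e-6
d1, d2 = alpha1 * PCT_PER_KH, alpha2 * PCT_PER_KH
print(f"delta1 = {d1:.2f} %/kh, delta2 = {d2:.2f} %/kh")

for kh in (10, 20):
    # Linear estimate: difference in average decay rate times elapsed kilohours
    print(f"estimated maintenance gap at {kh},000 h: {(d2 - d1) * kh:.1f}%")
```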

Going back to our initial example, α = 1.189 × 10⁻⁵ hr⁻¹, we find that δ ≈ 1.0%/kh. In hindsight, it may now be much more obvious how and why I selected this value for the example.

1 Comment
W. Wade Johnson
2 years ago

With most customers expecting 10 years of carefree service, and many industrial applications operating 24/7, do you see the margin for error as significant at 80,000 hours?
