I am performing eye safety calculations on a device and have hit a roadblock. For the calculations I have been following the IEC 62471 standard, "Photobiological safety of lamps and lamp systems".
I am using three IR LED arrays: one has 34 IR LEDs and the other two have 3 IR LEDs each. All three arrays use the SFH 4243-Z IR LED from OSRAM.
OSRAM provides an eye safety document for calculating whether your setup is potentially hazardous to the eye; however, their examples for the burn hazard calculation have me confused.
Below I have posted the example they provide that is most similar to my device, but I will summarise it and ask my question first, because the snippet may be confusing on its own.
EXAMPLE SUMMARY:
- They have 20 IR LEDs, which operate for > 1000 seconds
- Each LED has a radiant intensity of 160 mW/sr
- In total, the array produces a radiant intensity of 3.2 W/sr (20 × 160 mW/sr)
- When they perform the retinal burn hazard calculation, they use only 160 mW/sr and not 3.2 W/sr (I have highlighted this in red in the snippet below)
- Because they use only 160 mW/sr, L_IR works out to 58.7 mW/mm²/sr, which falls within the exposure limit of 545.5 mW/mm²/sr
- Therefore, this would pass and be NO RISK
- However, if this value is multiplied by 20 for the entire LED array (3.2 W/sr), you get 1174 mW/mm²/sr, which would FAIL the limit (see the sketch after this list)
- The calculation is done this way in all of their examples
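
To make the two readings concrete, here is a minimal Python sketch of the arithmetic above. The intensity-to-radiance conversion is collapsed into a single constant `k` that I back-solved from OSRAM's own 58.7 mW/mm²/sr result, not taken from their Equation 10, so the apparent source area is an assumption on my part. (The 545.5 mW/mm²/sr limit looks like IEC 62471's weak-visual-stimulus limit L_IR = 6000/α W·m⁻²·sr⁻¹ evaluated at α ≈ 11 mrad, though I have not verified that against the note.)

```python
# Minimal sketch of the OSRAM example arithmetic, using only the numbers
# quoted above. The step that turns radiant intensity (mW/sr) into
# radiance (mW/mm^2/sr) is folded into one constant k, back-solved from
# OSRAM's own result -- it is NOT the apparent-source-area calculation
# from their Equation 10, so treat it as illustrative only.

I_PER_LED = 160.0      # radiant intensity of one LED, mW/sr (given)
N_LEDS = 20            # LEDs in the example array (given)
L_IR_PER_LED = 58.7    # radiance OSRAM derives from 160 mW/sr (given)
LIMIT = 545.5          # exposure limit, mW/mm^2/sr (given)

k = L_IR_PER_LED / I_PER_LED          # ~0.367 mm^-2, assumed constant

L_single = k * I_PER_LED              # 58.7  -> passes (< 545.5)
L_array = k * I_PER_LED * N_LEDS      # 1174  -> would fail (> 545.5)

print(f"single LED  : {L_single:7.1f} mW/mm^2/sr  pass={L_single <= LIMIT}")
print(f"20-LED array: {L_array:7.1f} mW/mm^2/sr  pass={L_array <= LIMIT}")
```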
My question is therefore: why is the radiant intensity of only one LED used for the burn hazard calculation, and not that of the entire LED array? Is there a scientific reason for this?
Here is the example:
Thank you
(For additional information, Equation 10 has been appended below.)
For a description of my IR LED arrays in relation to the eye, here is a simplified picture I have drawn of my setup. Each LED has a typical radiant intensity of 11 mW/sr.
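
For reference, here is the same per-LED vs whole-array comparison applied to my own arrays, reusing the back-solved constant from the sketch above. That constant is an assumption carried over from OSRAM's example LED; the SFH 4243-Z almost certainly has a different apparent source area, so the absolute radiance values are only indicative.

```python
# Same per-LED vs whole-array check applied to my setup. K_ASSUMED is
# back-solved from OSRAM's example (58.7 mW/mm^2/sr per 160 mW/sr) and is
# almost certainly not correct for the SFH 4243-Z, whose apparent source
# area I have not measured -- the point is only the single-LED vs
# whole-array question.

K_ASSUMED = 58.7 / 160.0   # mm^-2, intensity-to-radiance factor (assumption)
LIMIT = 545.5              # exposure limit from the example, mW/mm^2/sr
I_PER_LED = 11.0           # typical radiant intensity per LED, mW/sr

for n_leds in (34, 3, 3):  # my three arrays
    L_single = K_ASSUMED * I_PER_LED
    L_array = n_leds * L_single
    print(f"{n_leds:2d}-LED array: per-LED {L_single:.2f}, "
          f"whole-array {L_array:.1f} mW/mm^2/sr (limit {LIMIT})")
```

With this assumed constant, even the 34-LED array would stay under the limit when read as a whole array, but that conclusion rests entirely on the assumed source area, which is why I would like to understand which reading the standard intends.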