Per the datasheet:
The TCRT5000 and TCRT5000L are reflective sensors which include an
infrared emitter and phototransistor in a leaded package which blocks
visible light.
The datasheet says that the typical forward voltage of the emitter is 1.25 V. With a 5 V supply, the current through the LED is therefore...
$$I_{D} = \frac{5\,\text{V} - 1.25\,\text{V}}{R_8 + R_1}$$
Given that R1 is the fixed 150 Ω resistor and R8 is a preset that can be set anywhere between 0 Ω and 10 kΩ, the LED current can range from about 369 µA up to 25 mA.
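Writing out the two extremes:

$$I_{D(min)} = \frac{5\,\text{V} - 1.25\,\text{V}}{150\,\Omega + 10\,\text{k}\Omega} \approx 369\,\mu\text{A} \qquad I_{D(max)} = \frac{5\,\text{V} - 1.25\,\text{V}}{150\,\Omega} = 25\,\text{mA}$$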
I have read elsewhere that reducing R1+R8 can increase the IR light intensity from the bulb, which can increase the system's sensitivity, as that would mean more reflected light falling on the phototransistor, which would mean more current from the phototransistor, which would mean R2 would need to be reduced too.
That's correct. The current output of the transistor is proportional to the amount of light falling on the detector. The amount of light falling on the detector is proportional to the amount of light emitted by the LED and to the reflectivity of the target material the light bounces off, and inversely proportional to the square of the distance to the target.
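As a rough sketch of that (treating the target as a simple diffuse reflector), you could write

$$I_C \propto E_e \propto \frac{\rho \, P_e}{d^2}$$

where $E_e$ is the irradiance at the detector, $P_e$ is the radiant power emitted by the LED, $\rho$ is the reflectivity of the target, and $d$ is the distance to it. Those symbols aren't from the datasheet, they're just shorthand for the sentence above, and the expression ignores the emitter's radiation pattern and the sensor geometry.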
But I don't know the theoretical basis for all this (the math etc.) so I don't really get it. What is the job of R2? Why is R2 10K? How are R1, R2 and R8 calculated? Someone kindly explain.
The phototransistor allows current to pass from its collector to emitter in proportion to the light falling on the detector. If the current passes through R2 then the voltage across R2 is...
$$V_{R2} = R_2 \cdot I_C$$
So the output voltage is...
$$V_{out} = 5\,\text{V} - V_{R2} = 5\,\text{V} - R_2 \cdot I_C$$
The purpose of R2 is to convert the current output of the transistor into a voltage. For an output feeding a low-frequency digital input pin on some other device, 10 kΩ is a fairly standard choice for a pull-up resistor.
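As a quick sanity check with made-up numbers: if the reflected light happened to produce $I_C = 100\,\mu\text{A}$ of collector current, then

$$V_{out} = 5\,\text{V} - 10\,\text{k}\Omega \cdot 100\,\mu\text{A} = 5\,\text{V} - 1\,\text{V} = 4\,\text{V}$$

so more light pulls the output lower, and with no light at all the output sits near the 5 V supply.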
Firstly, according to the datasheet, the maximum forward current is 60 mA. Why would the designers restrict the current to just 25 mA?
Using a part right at its maximum rating is not likely to result in good long-term reliability.
Why not use a smaller resistance than 150 Ω and closer to the limiting
62.5 Ω so there is more brightness and more sensitivity in the system?
It's possible the sensor was sensitive enough at 25 mA; in that case, there was no need to burn the extra power.
Second question: the datasheet says the maximum Ic can be 100 mA, which would mean Vr2 = 10 kΩ × 100 mA = 1000 V. What does that mean? That's a scary voltage.
You won't get 1000 V or 100 mA with this setup. The transistor can't create extra voltage or current; it just allows current to pass through it.
The maximum voltage you can get across R2 is the supply voltage minus the saturation voltage of the transistor (about 0.4 V).
Therefore, the maximum current you can get through R2 is (5 V − V_CE(sat))/R2, which is about 460 µA in this case.
The only way to get close to 100 mA would be to lower R2 to something like 46 Ω, and even then you probably couldn't get that much, since it would take a very large amount of light.
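Putting numbers to that, with $V_{CE(sat)} \approx 0.4\,\text{V}$ as above:

$$I_{C(max)} = \frac{5\,\text{V} - 0.4\,\text{V}}{10\,\text{k}\Omega} \approx 460\,\mu\text{A} \qquad R_{2} = \frac{5\,\text{V} - 0.4\,\text{V}}{100\,\text{mA}} = 46\,\Omega$$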
Fourthly, wouldn't it be better to set the R1 and R8 values so as to allow the maximum possible luminescence at the IR lamp, and then control the signal quality by setting a preset at R8? I noticed in this system that the human pulse was only visible at approximately R1+R8 = 300 Ω; if R1+R8 were, say, 70 Ω, would I have been able to see the pulse again by setting R2 to a lower value through a 10 kΩ preset?
If you did an experiment that showed R1+R8 needed to be in the range of a few hundred ohms to get a reading, then it's possible a 10 kΩ pot wasn't optimal. On the other hand, there could be other experimental conditions (different sensor distances, skin color, placement on the body) where it worked better with R1+R8 at several kilo-ohms. It's hard to assess that without more data. Use whatever works best under the conditions in which you plan to use the device.
Fifth question: I understand now from your answer what the role of R2 is, but suppose it were replaced by a preset and the preset were accidentally set to zero; would that in any manner be harmful to the phototransistor?
The original circuit was designed with R2 = 10 kΩ. That means the original designers intended for the detector to be fully on with only 460 µA flowing in the transistor.
I can't say for sure, but it's likely that under normal conditions the transistor isn't going to conduct more than a few tens of mA even with a lot of light hitting the detector. With a 5 V supply, that's likely to result in a few tenths of a watt of heat at most, which probably won't burn out the transistor.
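As a rough worst-case sketch (the 30 mA figure is just an assumed upper bound on the photocurrent, not a measured value): with R2 at zero the full supply sits across the transistor, so

$$P \approx V_{CE} \cdot I_C \approx 5\,\text{V} \times 30\,\text{mA} = 150\,\text{mW}$$

which is only a few tenths of a watt, in line with the estimate above.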