I'm a newbie trying to measure μV-range voltages using cheap op-amps, combined with some software calibration or error correction, for education and practice.
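By "software calibration" I mean something like the sketch below: short the amplifier's inputs, average the output to estimate the offset, then subtract that offset from later readings. Everything in it (read_adc, the gain of 100, the simulated 2 mV offset and noise levels) is hypothetical, just to show the idea:

```python
import random

GAIN = 100.0      # assumed closed-loop gain of the amplifier stage (illustrative)
TRUE_VIO = 2e-3   # simulated input offset voltage, 2 mV (made up for this demo)

def read_adc(v_in: float = 0.0) -> float:
    """Hypothetical stand-in for an ADC reading of the amplifier output.

    Simulates output = GAIN * (v_in + Vio) plus some noise; a real
    ADC driver would go here instead.
    """
    noise = random.gauss(0.0, 1e-4)  # ~100 μV RMS output noise, also made up
    return GAIN * (v_in + TRUE_VIO) + noise

def calibrate_offset(samples: int = 1000) -> float:
    """With the amplifier inputs shorted (v_in = 0), the average output is
    approximately GAIN * Vio; averaging many samples reduces the noise."""
    return sum(read_adc(0.0) for _ in range(samples)) / samples

def measure(v_in: float, offset: float) -> float:
    """Subtract the calibrated offset and divide by the gain to get an
    input-referred voltage estimate."""
    return (read_adc(v_in) - offset) / GAIN

offset = calibrate_offset()
print(f"estimated input-referred signal: {measure(5e-6, offset) * 1e6:.2f} μV")
```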
At first I searched some op-amp datasheets for parameters specified in "mV" or "V", since I assumed the parameter I'm after would be expressed in volts, but the only one I could relate to was Vio. I suspect that this parameter, which I've been calling "sensitivity", is Vio (input offset voltage), because it seems to be the minimum differential voltage that can make the output non-zero and hence be amplified.
According to this document from TI, it is defined as:
The input offset voltage is defined as the voltage that must be applied between the two input terminals of the op amp to obtain zero volts at the output.
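If I understand that correctly, it is equivalent to modeling an otherwise ideal op-amp with a small error voltage source in series with one input (my paraphrase of the usual textbook model, not from the TI document):

$$V_{out} = A_{OL}\left(V_+ - V_- + V_{IO}\right)$$

so the output is zero only when the applied differential voltage exactly cancels $V_{IO}$.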
So, is this unknown parameter the Vio?
If the answer is yes: the LM358 datasheet specifies a maximum Vio of 7 mV. I take this to mean that, for example, inputs of 1 V and 1.007 V (a 7 mV difference) should produce an amplified, non-zero output. But does the output change when we change the 1.007000 V input to 1.007001 V, i.e., by 1 μV? Also, in some op-amps the offset voltage is "nullable"; does nulling it affect the sensitivity?
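To put numbers on the 1 μV question, assuming a hypothetical closed-loop gain of 100 (the gain is not from any datasheet, just for illustration): the 1 μV input step would ideally move the output by

$$\Delta V_{out} = G \cdot \Delta V_{in} = 100 \times 1\ \mu\text{V} = 100\ \mu\text{V},$$

while the worst-case offset alone could contribute up to $G \cdot V_{IO} = 100 \times 7\ \text{mV} = 0.7\ \text{V}$ at the output, which is why I'm unsure whether such a small step is even detectable.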
If the answer is no: which parameter indicates the minimum detectable voltage, so we can tell whether an op-amp is usable in the μV or even nV range?
Thank you for your time and knowledge.