Based on this post, let me assume I have an input source at 94 dB SPL (i.e., 1 Pa) that a microphone is picking up.
The microphone has a sensitivity of -46 dBV/Pa, which corresponds to 0.005012 V RMS/Pa.
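To sanity-check that conversion, here is the sensitivity arithmetic in a short Python snippet (just the standard dBV-to-volts formula, nothing mic-specific):

```python
import math

# Convert mic sensitivity from dBV/Pa to V RMS per Pa:
# V = 10^(dBV / 20), with 1 V as the dBV reference
sensitivity_dbv = -46.0
sensitivity_v = 10 ** (sensitivity_dbv / 20)
print(round(sensitivity_v, 6))  # 0.005012
```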
Let us assume the ADC has unity preamp gain and no additional gain. Let us also assume the ideal case where there is no degradation of the signal due to noise before the ADC.
Now, I'm guessing the signal measured at the ADC would be the same as the input, which is 0.005012 V RMS.
20 × log10(0.005012 / 0.005012) = 0 dB
so the level is (-46) + 0 = -46 dBV, and the dB SPL will be -46 + 94 = 48 dB SPL.
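My arithmetic above, sketched in Python so the steps are explicit (the 0.005012 V figure is the sensitivity converted to volts; this reproduces my calculation, not necessarily the correct one):

```python
import math

sensitivity_dbv = -46.0                       # mic sensitivity in dBV/Pa
sensitivity_v = 10 ** (sensitivity_dbv / 20)  # ~0.005012 V RMS per Pa
signal_v = sensitivity_v                      # ideal chain: ADC sees the mic output unchanged

# Signal level relative to the sensitivity reference voltage
rel_db = 20 * math.log10(signal_v / sensitivity_v)  # = 0 dB

# My back-conversion to dB SPL: sensitivity + relative level + 94 dB reference
spl = sensitivity_dbv + rel_db + 94           # -46 + 0 + 94
print(spl)  # 48.0
```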
Why is the output not 94 dB SPL, which is what I expected, given that we are inputting a 1 Pa sound?