
I am using a current-output DAC connected to a voltage reference and a transimpedance amplifier.

I know the noise characteristics of the voltage reference and the amplifier. The DAC simply lists "13 nV/rtHz output voltage noise density" in its datasheet.

How do I combine these three values to estimate the output voltage noise density of the entire system? I know the DAC effectively converts the voltage noise of the reference into current noise, which is then gained up by the transimpedance amplifier, but how to actually carry out this calculation is a mystery to me.
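
To make the question concrete, here is the kind of back-of-the-envelope calculation I am hoping to learn how to do properly. This is only a sketch: the reference and amplifier noise numbers are placeholders, and the assumptions about how each source is scaled (DAC gain, noise gain of the I/V stage) and that the sources are uncorrelated (so they add in root-sum-square) are my guesses, not something taken from a datasheet.

```python
import math

# Placeholder noise densities in nV/sqrt(Hz). Only the 13 nV/sqrt(Hz) DAC figure
# is from the LTC2757 datasheet; the reference and amplifier numbers below are
# made-up stand-ins for my actual parts.
e_ref = 10.0   # voltage reference output noise density (assumed)
e_dac = 13.0   # DAC "output voltage noise density" from the datasheet
e_amp = 5.0    # I/V amplifier input voltage noise density (assumed)

# My guesses, which may be wrong:
#  - the reference noise reaches the output scaled by the (code-dependent) DAC gain,
#  - the amplifier noise is multiplied by the noise gain of the I/V stage,
#  - the three contributions are uncorrelated, so they add in root-sum-square.
dac_gain = 1.0     # assumed reference-to-output gain at the code of interest
noise_gain = 2.0   # assumed noise gain of the transimpedance stage

e_out = math.sqrt((e_ref * dac_gain) ** 2
                  + e_dac ** 2
                  + (e_amp * noise_gain) ** 2)

print(f"Estimated output noise density: {e_out:.1f} nV/sqrt(Hz)")
```

Is something along these lines the right approach, or does the DAC's listed noise already include the reference contribution?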

The DAC is an LTC2757. I want to use it in the typical application circuit shown below:

[Figure: LTC2757 typical application circuit from the datasheet]

  • Include a link to the datasheet of the DAC. How that 13 nV/rtHz is measured should be described; it must be across some impedance that loads the DAC, as the DAC has a current output. – Bimpelrekkie Jun 17 '19 at 14:19
  • And draw a circuit. – Andy aka Jun 17 '19 at 14:29
  • and show all stray noise in the environment by some measurement and path of your signal in your "system". FYI https://electronics.stackexchange.com/questions/32257/noise-and-what-does-v-%E2%88%9Ahz-actually-mean – Tony Stewart EE75 Jun 17 '19 at 17:29
  • I added datasheet and circuit. I don't understand the last request. – justinis Jun 18 '19 at 16:46

0 Answers