I don't have a circuit I'm working on; this is more of a theoretical question - I am trying to remedy a flaw in my understanding.
Imagine I want to build a high-input-impedance amplifier to work in the low-mV range, with a few nV/√Hz of input noise. I want to amplify a differential signal in the 1 kHz to 100 kHz band. Initially, I would start with a good-quality instrumentation amplifier (e.g. the AD8421) and simply put capacitors in series with both inputs.
But that has a problem. There is no DC path to ground at the inputs, so the amplifier's input bias currents will slowly charge the coupling capacitors, and the inputs will drift away and rail the output. So I need to add a resistor to ground on each input; see the first circuit in the diagram below. Those resistors set the input impedance of my amplifier, which I want to be about 100 MΩ. But if I calculate the Johnson noise I expect from two 100 MΩ resistors, I get \$\sqrt{2} \times \sqrt{4k_BTR}\$ ≈ 1.8 μV/√Hz at room temperature.
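To make the arithmetic concrete, here is that calculation as a short Python sketch (assuming T = 300 K for room temperature):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K
R = 100e6           # one bias resistor, ohms

# Thermal (Johnson) noise density of a single resistor: sqrt(4 k_B T R)
e_single = math.sqrt(4 * k_B * T * R)

# The two bias resistors are uncorrelated noise sources in series across
# the differential input, so their densities add in quadrature (factor sqrt(2)).
e_diff = math.sqrt(2) * e_single

print(f"one 100 MΩ resistor:  {e_single * 1e6:.2f} uV/sqrt(Hz)")  # ~1.29
print(f"both, differentially: {e_diff * 1e6:.2f} uV/sqrt(Hz)")    # ~1.82
```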
So I came to the conclusion that I could have low noise or high input impedance, but not both. I then found a commercial input preamplifier which is specified at 3.6 nV/√Hz input noise and 100 MΩ input impedance. I had a look inside, and it seems that they use the circuit on the right.
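Taking that spec at face value (my assumption: the 3.6 nV/√Hz is broadband input-referred noise over my band of interest), the gap from my calculation above is enormous:

```python
e_resistors = 1.8e-6  # thermal noise of the bias network from above, V/sqrt(Hz)
e_spec = 3.6e-9       # preamp's specified input-referred noise, V/sqrt(Hz)
print(f"resistor noise exceeds the spec by ~{e_resistors / e_spec:.0f}x")  # ~500x
```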
Schematic created using CircuitLab: my capacitor-coupled in-amp with bias resistors (left), and the commercial preamp's input stage (right).
The two FETs on the right-hand side are a matched pair (datasheet found via Google), and they form the first stage of the amplifier. I didn't reverse-engineer any more of the circuit, but I can if necessary.
So my question is: What is wrong with my understanding? Why doesn't the second circuit have roughly 1–2 μV/√Hz of white noise from the resistors?