I have designed a simple TX/RX system in GRC with a single USRP N210 and CBX-40.
I am transmitting sinusoidal signals at various frequencies within the 2-6 GHz range, with a coaxial cable connecting the TX port to the RX port of the USRP, and I am measuring the amplitude and phase of the received signal at each frequency.
At the receiver I take an FFT and look at the bin corresponding to my IF (bin index = f_IF / (samp_rate / N_FFT)). The amplitude is therefore detected correctly.
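In case it helps, this is roughly how I index the bin and read out amplitude and phase (a minimal numpy sketch; samp_rate, fft_size and f_if are placeholders standing in for my actual flowgraph values, and the synthetic tone stands in for one captured frame from the UHD source):

```python
import numpy as np

samp_rate = 1e6    # placeholder: flowgraph sample rate
fft_size = 4096    # placeholder: FFT size
f_if = 100e3       # placeholder: IF offset of the transmitted tone

# Synthetic frame standing in for one capture from the UHD source block
t = np.arange(fft_size) / samp_rate
rx = np.exp(2j * np.pi * f_if * t)

spectrum = np.fft.fft(rx, fft_size)
bin_index = int(round(f_if / (samp_rate / fft_size)))  # f_IF / bin width

amplitude = np.abs(spectrum[bin_index])
phase = np.angle(spectrum[bin_index])  # radians
print(f"bin {bin_index}: amplitude = {amplitude:.3f}, phase = {phase:.3f} rad")
```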
However, I observe that the phase is only measured consistently (i.e., it repeats the same value on every measurement, as expected since the cable connecting the two ports has a fixed length) at certain frequencies: in my case 2 GHz, 2.1 GHz, 2.2 GHz, and so on (every 100 MHz).
I would like to get a repeatable phase at a finer frequency step, and I don't know which of the HW components (possibly the PLL) or which SW settings could solve my issue. Could it be related to the "tuning policies" of the USRP?
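For reference, the kind of software setting I have in mind is a tune request like the sketch below (UHD Python API; the address, the 2.1 GHz value, and the mode_n=integer daughterboard argument, which I have read can force the CBX synthesizer into integer-N mode, are my assumptions of the relevant knobs, not something I have verified):

```python
import uhd

# Assumption: example address and frequency, not my actual configuration
usrp = uhd.usrp.MultiUSRP("addr=192.168.10.2")

# Tune request passing daughterboard args; mode_n=integer is my guess at
# the setting that trades fractional-N tuning resolution for phase behavior
tune_req = uhd.types.TuneRequest(2.1e9)
tune_req.args = uhd.types.DeviceAddr("mode_n=integer")

usrp.set_tx_freq(tune_req, 0)
usrp.set_rx_freq(tune_req, 0)

print("TX LO:", usrp.get_tx_freq(0), "RX LO:", usrp.get_rx_freq(0))
```

Is something along these lines the intended way to control this, or is the 100 MHz behaviour fixed by the hardware?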