I am trying to work out what happens when an RF signal is downconverted, as in an SDR when it tunes to a given frequency. For example, if a zero-IF (direct-conversion) receiver is tuned to 399 MHz, then whatever signal is present at 400 MHz will appear at 1 MHz in the baseband, where it is then digitised.
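Here's a minimal numpy sketch of how I picture that mixing step (assuming an ideal complex mixer and an arbitrary 4 GHz simulation rate; the variable names are mine):

```python
import numpy as np

# Zero-IF downconversion sketch: multiplying by exp(-j*2*pi*f_lo*t)
# shifts every input frequency down by f_lo.
fs = 4e9                         # assumed simulation sample rate, well above 400 MHz
t = np.arange(0, 1e-5, 1 / fs)   # 10 us of signal

f_sig = 400e6                    # incoming carrier
f_lo = 399e6                     # tuned LO

rf = np.cos(2 * np.pi * f_sig * t)              # 400 MHz carrier
baseband = rf * np.exp(-2j * np.pi * f_lo * t)  # components at +1 MHz and -799 MHz

# After low-pass filtering away the -799 MHz image, what remains is a
# 1 MHz complex tone, which is what the SDR's ADC would digitise.
```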
Now imagine that at 400 MHz there is a signal consisting of just a carrier that switches on and off very rapidly: on for one cycle, off for one cycle, on for another cycle, off again. If you assign a binary 1 to each 'on' cycle and a binary 0 to each 'off' cycle, I believe this would let you transmit 400,000,000 bits per second, since each bit occupies exactly one 2.5 ns carrier cycle.
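This is how I would model that on/off keying (again a sketch at an assumed 4 GHz simulation rate, with a phase-coherent carrier gated one cycle per bit):

```python
import numpy as np

# The 400 MHz carrier is gated on for one full cycle (bit 1)
# and off for one full cycle (bit 0).
fs = 4e9                      # assumed simulation rate: 10 samples per carrier cycle
f_c = 400e6
cycle = int(fs / f_c)         # samples per carrier cycle (10)

bits = np.array([1, 0, 1, 0, 1, 0, 1, 0])   # example bit pattern
gate = np.repeat(bits, cycle)               # one carrier cycle per bit
t = np.arange(gate.size) / fs
ook = gate * np.cos(2 * np.pi * f_c * t)    # gated carrier

# One bit per carrier cycle means the bit rate equals the carrier frequency:
bit_rate = f_c                # 400e6 bits per second
```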
Now what happens when this signal is downconverted to 1 MHz, ready for the SDR to digitise? If the 1 MHz carrier switched on and off one cycle at a time, there would be only 1,000,000 on/off transitions per second, yet the original signal had 400,000,000 such transitions in the same period.
So what happens in this case? Does the 1 MHz carrier gate on and off at the original 400 MHz rate? Would that let you transmit the original 400,000,000 bits per second on a 1 MHz carrier? Or are the extra cycles lost somehow? What would the resulting signal at 1 MHz look like?
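To make the question concrete, here is a sketch of the whole chain that I could inspect (my assumptions: an ideal complex mixer, and a crude moving-average filter standing in for the SDR's baseband low-pass):

```python
import numpy as np

# End-to-end sketch: gated 400 MHz carrier, zero-IF mix with a 399 MHz LO,
# then a ~5 MHz low-pass approximating the bandwidth presented to the ADC.
fs = 4e9                                        # assumed simulation rate
f_c, f_lo = 400e6, 399e6
cycle = int(fs / f_c)                           # 10 samples per carrier cycle

bits = np.resize([1, 0], 4000)                  # on one cycle, off one cycle
gate = np.repeat(bits, cycle).astype(float)
t = np.arange(gate.size) / fs
rf = gate * np.cos(2 * np.pi * f_c * t)         # gated 400 MHz carrier

baseband = rf * np.exp(-2j * np.pi * f_lo * t)  # zero-IF mix: 400 MHz -> 1 MHz

taps = int(fs / 5e6)                            # crude ~5 MHz moving-average low-pass
filtered = np.convolve(baseband, np.ones(taps) / taps, mode='same')

# Comparing np.abs(np.fft.fft(rf)) against np.abs(np.fft.fft(filtered))
# shows which parts of the gated carrier's spectrum survive the narrow
# baseband, which is exactly what I am asking about.
```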