I'm a layman in this field. I have a question about RF transmission.
How does the data transferred by RF waves (for example in mobiles, wifi) reach the users accurately in spite of losses during transmission?
A receiver only needs to receive more energy than the background noise around it to be confident about whether a 1 or 0 was transmitted.
A typical sensitivity threshold for mobile receivers is around -100 dBm, about 0.1 pW or 10^-13 watts, whereas wifi/mobile transmitters are pumping out a good fraction of one watt.
That -100 dBm compares with the fundamental physical limit of -114 dBm in a 1 MHz bandwidth for the noise level of a perfect front end amplifier, a margin of 14 dB = 10^1.4 ≈ 25 times. That margin allows for the receiver not being ideal, and for losses between the antenna and the receiver.
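The dB arithmetic above can be checked directly. This is a minimal sketch (the function names are mine, not part of any standard library): it converts -100 dBm to watts, computes the thermal noise floor kTB for a 1 MHz bandwidth at room temperature, and evaluates the 14 dB margin as a power ratio.

```python
import math

def dbm_to_watts(dbm):
    """Convert a power level in dBm to watts (0 dBm = 1 mW)."""
    return 1e-3 * 10 ** (dbm / 10)

def thermal_noise_dbm(bandwidth_hz, temp_k=290):
    """Thermal noise floor kTB, expressed in dBm (290 K is the usual reference)."""
    k = 1.380649e-23  # Boltzmann constant, J/K
    noise_w = k * temp_k * bandwidth_hz
    return 10 * math.log10(noise_w / 1e-3)

print(dbm_to_watts(-100))       # ~1e-13 W, i.e. 0.1 pW
print(thermal_noise_dbm(1e6))   # ~ -114 dBm for 1 MHz
print(10 ** (14 / 10))          # 14 dB margin, ~25x in power
```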
The ratio of around a watt at the transmitter to a fraction of a picowatt at the receiver is the 'path loss' that all radio communication systems have to design around. If your transmitter only puts out milliwatts, then the path has to be shorter. A TV transmitter pushing out tens of kilowatts has a much greater range.
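To get a feel for how quickly that path loss builds up, here is a sketch using the free-space (Friis) path loss formula, which is the idealised best case with no obstacles; the 30 dBm (1 W) transmitter and 900 MHz frequency are illustrative assumptions, not values from the answer above.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c) (Friis)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

tx_dbm = 30  # assumed 1 W transmitter
for d in (100, 1_000, 10_000):
    rx_dbm = tx_dbm - fspl_db(d, 900e6)
    print(f"{d:>6} m: received about {rx_dbm:6.1f} dBm")
```

Note that every factor of 10 in distance costs another 20 dB, which is why receiver sensitivity matters so much.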
Of course, there's more 'noise' in the environment than just the fundamental physical limit of resistor noise. There are transmitters operating on different frequencies. These are handled at the receiver by first of all making the antenna somewhat tuned, which rejects far-off signals. Closer signals are rejected by RF filters before conversion in the receiver. Even closer signals are handled by channel filters after conversion in the receiver. Signals at the same frequency are handled by correlation and error correction techniques in the receiver.
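The "correlation" trick at the end is worth a small demonstration. In spread-spectrum systems each bit is transmitted as a known chip pattern, and the receiver correlates against that pattern, which averages the noise down. This sketch uses a made-up 8-chip code and noise level purely for illustration:

```python
import random

# Hypothetical 8-chip spreading code known to both ends.
CODE = [1, -1, 1, 1, -1, 1, -1, -1]

def spread(bit):
    """Transmit a bit as +CODE (1) or -CODE (0)."""
    return [c if bit else -c for c in CODE]

def despread(chips):
    """Correlate received chips with the known code; the sign gives the bit."""
    corr = sum(r * c for r, c in zip(chips, CODE))
    return 1 if corr > 0 else 0

random.seed(42)
bits = [random.randint(0, 1) for _ in range(100)]
recovered = []
for b in bits:
    # Add noise as strong as the signal on every chip;
    # correlation over 8 chips still averages it out.
    noisy = [chip + random.gauss(0, 1.0) for chip in spread(b)]
    recovered.append(despread(noisy))
print(sum(x == y for x, y in zip(bits, recovered)), "of", len(bits), "bits correct")
```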
This hierarchy of noise rejection is ensured by the designers of the radio systems, and I emphasise Systems. When new radio systems are designed, the allocation of certain bands to certain geographical areas is based on limiting the interference of one system to adjacent systems, to that which can be handled by existing receivers.
The basic approach is to ensure that the transmitter puts out so much power that a sensitive receiver can pick it up. We call this getting a good signal. Mobile systems constantly adjust signal levels to give good reception without wasting power.
After that, all sorts of tricks can be played to reduce or correct errors. With digital mobiles, these tricks get incredibly complicated. The data is diced and sliced with sufficient duplication built in that if a block is lost, the remaining blocks carry enough redundancy to reconstruct it. Signals reflecting off obstacles are characterised and "deconvolved" to remove the echoes. Timing delays due to the finite speed of light are measured and constantly resynchronised as the mobile moves around. The list goes on and on.
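The simplest possible example of that "built-in duplication" is a repetition code with majority voting; real mobile systems use far more sophisticated codes, but the principle is the same. The 5% error rate here is an assumption chosen just to make the effect visible:

```python
import random

def encode(bits, n=3):
    """Repeat each bit n times (the simplest forward-error-correction code)."""
    return [b for b in bits for _ in range(n)]

def decode(chips, n=3):
    """Majority vote over each group of n repeats."""
    return [1 if sum(chips[i:i + n]) > n // 2 else 0
            for i in range(0, len(chips), n)]

random.seed(1)
data = [random.randint(0, 1) for _ in range(1000)]
tx = encode(data)
# Flip 5% of the transmitted bits to simulate channel errors.
rx = [b ^ 1 if random.random() < 0.05 else b for b in tx]
decoded = decode(rx)
errors = sum(d != o for d, o in zip(decoded, data))
print(f"raw flip rate 5%, residual bit errors after decoding: {errors} / {len(data)}")
```

With a 5% raw error rate, majority voting only fails when 2 of the 3 copies of a bit are flipped, so the residual error rate drops to under 1%, at the cost of tripling the transmitted data.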