
I'm a layman in this field. I have a question about RF transmission.

How does data carried by RF waves (for example in mobiles or wifi) reach users accurately in spite of losses during transmission?

CHEMOJEE
  • Special modulation techniques can distinguish signal from noise and reject signals from other transmitters, even ones transmitted at the same frequency. – Marko Buršič Feb 27 '21 at 08:27
  • 7
    Why do you think it is efficient or what is your definition of efficiency? In most cases almost all the energy is lost as the receivers, if there are any, are so small in comparison with the spherical surface at that distance from the transmitter. – Transistor Feb 27 '21 at 08:33
  • @Transistor My definition of efficiency (for now) concerns minimising the loss of transmitted information. – CHEMOJEE Feb 27 '21 at 09:12
  • @MarkoBuršič This answers my question. Could you provide some examples? – CHEMOJEE Feb 27 '21 at 09:17
  • 1
    The basic transmission of a radio wave is lossless. However, an antenna will transmit in a lot of directions simultaneously and therefore the power received thins out with distance. – Andy aka Feb 27 '21 at 09:24
  • 2
    [RF energy harvesting](https://electronics.stackexchange.com/questions/175121/rf-energy-harvesting/175123#175123) might give you an insight. – Andy aka Feb 27 '21 at 09:25
  • Shannon's theorem is probably most relevant here: so long as the signal is above the noise floor, it can be resolved into symbols. – pjc50 Feb 27 '21 at 14:41 (a worked example of the capacity formula follows these comments)
  • Really interesting to know. – CHEMOJEE Feb 27 '21 at 17:55
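
pjc50's comment points at Shannon's theorem. Here is a minimal numeric sketch of the capacity formula C = B·log2(1 + S/N); the bandwidth and SNR values are illustrative choices of my own, not figures from the thread:

```python
# Shannon capacity C = B * log2(1 + S/N) for a 1 MHz channel at a few SNRs.
import math

def capacity_bps(bandwidth_hz, snr_linear):
    """Channel capacity in bits per second for a given linear SNR."""
    return bandwidth_hz * math.log2(1 + snr_linear)

for snr_db in (20, 0, -10):        # strong signal, signal = noise, below noise
    snr = 10 ** (snr_db / 10)      # convert dB to a linear power ratio
    c = capacity_bps(1e6, snr)
    print(f"SNR {snr_db:>4} dB -> {c / 1e6:.3f} Mbit/s")
```

Note that the capacity stays nonzero even at negative SNR; the processing-gain discussion under the first answer below exploits exactly that.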

2 Answers


A receiver only needs to receive more energy than the background noise around it to be confident about whether a 1 or 0 was transmitted.
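
As a minimal sketch of that decision, assuming simple on-off keying and Gaussian background noise (both my assumptions, not claims about any real air interface):

```python
# Decide 1 vs 0 by comparing the received level against a threshold set
# above the noise floor; count how often noise flips the decision.
import random

def transmit(bit, signal_level=1.0):
    """On-off keying: a 1 sends signal_level, a 0 sends nothing."""
    return signal_level if bit else 0.0

def channel(x, noise_rms=0.2):
    """Add Gaussian background noise to the transmitted level."""
    return x + random.gauss(0.0, noise_rms)

def receive(y, threshold=0.5):
    """Decide 1 if the received level clears the threshold."""
    return 1 if y > threshold else 0

bits = [random.randint(0, 1) for _ in range(100_000)]
errors = sum(b != receive(channel(transmit(b))) for b in bits)
print(f"bit error rate: {errors / len(bits):.5f}")   # small while SNR is decent
```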

A typical signal threshold for mobiles is around -100 dBm, about 0.1 pW (10^-13 watts), whereas wifi/mobile transmitters are pumping out a good fraction of one watt.

That -100 dBm compares with the fundamental physical limit of -114 dBm for the noise level of a perfect front-end amplifier in a 1 MHz bandwidth, a factor of 14 dB = 10^1.4 ≈ 25 times. That factor allows for the receiver being less than ideal, and for losses between the antenna and the receiver.
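
As a sanity check on those figures, here is a quick computation assuming the standard kTB thermal-noise model at the usual 290 K reference temperature (my assumption; the answer doesn't state its conditions):

```python
# Thermal noise floor kTB of an ideal receiver, expressed in dBm.
import math

k = 1.380649e-23    # Boltzmann constant, J/K
T = 290.0           # reference temperature, K
B = 1e6             # bandwidth, Hz

noise_w = k * T * B
noise_dbm = 10 * math.log10(noise_w / 1e-3)
print(f"noise floor: {noise_dbm:.1f} dBm")     # ~ -114.0 dBm in 1 MHz

margin_db = -100 - noise_dbm                   # vs the -100 dBm mobile threshold
print(f"margin: {margin_db:.1f} dB = {10 ** (margin_db / 10):.0f}x")  # ~14 dB ≈ 25x
```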

The ratio of around a watt at the transmitter to picowatts at the receiver is the 'path loss' that all radio communication systems have to design around. If your transmitter only delivers milliwatts, then the path has to be shorter. A TV transmitter pushing out tens of kilowatts has a much greater range.
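
To put numbers on that, here is a minimal link-budget sketch using the free-space path loss formula; the 1 W transmit power and 2.4 GHz carrier are illustrative assumptions, and real paths add obstruction and fading losses on top:

```python
# Free-space path loss between isotropic antennas: FSPL = 20*log10(4*pi*d*f/c).
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

tx_dbm = 30.0                        # 1 W transmitter
for d in (100, 1_000, 10_000):       # distances in metres
    rx_dbm = tx_dbm - fspl_db(d, 2.4e9)
    print(f"{d:>6} m: received {rx_dbm:6.1f} dBm")
# One watt has thinned out to roughly a picowatt (-90 dBm) by the 10 km mark.
```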

Of course, there's more 'noise' in the environment than just the fundamental physical limit of resistor noise. There are transmitters operating on different frequencies. These are handled at the receiver first of all by making the antenna somewhat tuned, which rejects far-off signals. Closer signals are rejected by RF filters before conversion in the receiver. Even closer signals are handled by channel filters after conversion in the receiver. Signals at the same frequency are handled by correlation and error correction techniques in the receiver.
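
As a toy illustration of the channel-filter step (the frequencies and filter order are made-up assumptions), the sketch below separates a wanted channel from a nearby interferer:

```python
# Bandpass-filter a wanted 100 kHz channel away from a 150 kHz interferer.
import numpy as np
from scipy.signal import butter, lfilter

fs = 1e6                                     # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)
wanted = np.sin(2 * np.pi * 100e3 * t)       # the channel we are tuned to
interferer = np.sin(2 * np.pi * 150e3 * t)   # a nearby transmitter
rx = wanted + interferer

b, a = butter(4, [90e3, 110e3], btype="bandpass", fs=fs)
filtered = lfilter(b, a, rx)

print(f"input power:    {np.mean(rx ** 2):.2f}")        # wanted + interferer
print(f"filtered power: {np.mean(filtered ** 2):.2f}")  # ~0.50, the wanted signal alone
```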

This hierarchy of noise rejection is ensured by the designers of the radio systems, and I emphasise Systems. When new radio systems are designed, the allocation of certain bands to certain geographical areas is based on limiting the interference one system causes to adjacent systems to a level that existing receivers can handle.

Neil_UK
  • Technically, there are ways to make it work even if the signal is lower energy than background noise; this is used in communication with deep space probes, for instance. I don't fully understand it myself, but they do it, and I doubt we could communicate with New Horizons without it. – Hearth Feb 27 '21 at 16:33
  • @Hearth: The trick is called "processing gain." GPS uses the same thing to allow really crappy little antennas to pick up a really low power signal. In the case of GPS, you transmit a bit stream at a high rate (1 MHz for the civilian signal). On that bit stream, you modulate the actual data at a much lower rate. GPS uses 50 bits per second. You then demodulate the fast stream (which is full of errors), then demodulate the low speed stream (which has nearly no errors). [Wikipedia article](https://en.wikipedia.org/wiki/GPS_signals#Demodulation_and_decoding) – JRE Feb 27 '21 at 16:43 (a toy version of this is sketched after these comments)
  • @Hearth Read my assertion more carefully, 'more energy than the background noise `around it`'. The meaning of `it` has to be read in the context of the communication system, it could mean time, or frequency, or sequency, or combinations of those. For CDMA, we are filtering in sequency, or 'liftering' as some authors like to call it. – Neil_UK Feb 27 '21 at 16:44
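
JRE's comment describes processing gain; here is a minimal direct-sequence sketch of the same idea, using a random ±1 chip code of my own choosing (far simpler than GPS's actual Gold codes). Correlating the noisy received chips against the known code averages the noise down, so each bit is recovered even though every individual chip is well below the noise:

```python
# Spread each bit over many chips; despread by correlating with the code.
import math
import random

CHIPS = 1000
code = [random.choice((-1, 1)) for _ in range(CHIPS)]   # shared PN sequence

def spread(bit):
    """Map the bit to +/-1 and multiply it onto every chip of the code."""
    s = 1 if bit else -1
    return [s * c for c in code]

def channel(chips, noise_rms=5.0):
    """Add noise whose RMS is 5x the chip amplitude (chip SNR ~ -14 dB)."""
    return [c + random.gauss(0.0, noise_rms) for c in chips]

def despread(rx):
    """Correlate against the known code; the sign of the sum gives the bit."""
    corr = sum(r * c for r, c in zip(rx, code))
    return 1 if corr > 0 else 0

bits = [random.randint(0, 1) for _ in range(200)]
errors = sum(b != despread(channel(spread(b))) for b in bits)
print(f"chip SNR {20 * math.log10(1 / 5.0):.0f} dB, bit errors: {errors}/200")
```

With 1000 chips per bit the correlation gain is 10·log10(1000) = 30 dB, which is why a -14 dB chip SNR still yields essentially error-free bits.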

The basic approach is to ensure that the transmitter puts out enough power for a sensitive receiver to pick it up. We call this getting a good signal. Mobile systems constantly adjust signal levels to give good reception without wasting power.

After that, all sorts of tricks can be played to reduce or correct errors. With digital mobiles, these tricks get incredibly complicated. The data is diced and sliced with enough redundancy built in that if one block is lost, the surviving blocks carry enough duplicate information to reconstruct it. Signals reflecting off obstacles are characterised and "deconvolved" to remove the echoes. Timing delays due to the finite speed of light are measured and constantly resynchronised as the mobile moves around. The list goes on and on.
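
As one concrete, much cruder instance of that kind of redundancy (a single XOR parity block, my illustrative stand-in for the far more elaborate forward error correction real mobiles use), losing any one block still lets the receiver rebuild it:

```python
# Append an XOR parity block; any single lost block can be reconstructed.
def xor_blocks(a, b):
    """Byte-wise XOR of two equal-length blocks."""
    return bytes(x ^ y for x, y in zip(a, b))

def add_parity(blocks):
    """Append one parity block: the XOR of all data blocks."""
    parity = blocks[0]
    for blk in blocks[1:]:
        parity = xor_blocks(parity, blk)
    return blocks + [parity]

def recover(blocks, lost_index):
    """Rebuild the missing block by XORing every block that survived."""
    survivors = [b for i, b in enumerate(blocks) if i != lost_index]
    out = survivors[0]
    for blk in survivors[1:]:
        out = xor_blocks(out, blk)
    return out

coded = add_parity([b"ABCD", b"EFGH", b"IJKL"])
print(recover(coded, lost_index=1))   # b'EFGH', rebuilt without receiving it
```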

Guy Inchbald