11

Someone gave me this analogy - turning on your TV or radio does not cost the broadcaster more money because the radio waves would otherwise be dissipated by the air or other objects. Is this correct? I find it difficult to believe - surely some energy must be drawn in by the receiver circuit of the radio and the fact that the radio is on or off does not change the properties of said objects.

(OK, it may only be milliwatts drawn by said devices, but I'm still curious as to how much it would make a difference.)

Thomas O
  • 31,546
  • 57
  • 182
  • 320
  • 3
    I think milliwatts would be a very optimistic amount. – Nick T Mar 01 '11 at 21:31
  • 1
    Think pico and nano watts. – Kortuk Mar 02 '11 at 11:31
  • 1
    It's not "dissipated in the air". We specifically choose radio frequencies because the air is transparent for them. http://en.wikipedia.org/wiki/Absorption_%28electromagnetic_radiation%29 http://en.wikipedia.org/wiki/File:Atmospheric_electromagnetic_opacity.svg Related: http://electronics.stackexchange.com/questions/4664/stealing-energy-from-radio-towers-or-power-lines – endolith Mar 02 '11 at 16:13

7 Answers

20
  • Turning on your TV or radio does not cost the broadcaster more money, because broadcast transmissions would otherwise be dissipated by some other object.
  • some energy is drawn in by the receiver circuit of the radio.

Both statements are true. Do you think one contradicts the other? I think of it as analogous to the big decorative water sprinkler at the local park. (I agree that "dissipated in the air" seems unlikely).

  • Letting your dog drink water from the sprinkler does not cost the park more money, because the same amount of water comes out of the sprinkler, whether some of the droplets land on the dog's tongue or whether they all fall on the ground.
  • Some water is drawn in by the dog.

Often several dogs catch water drops on their tongues at the same time at the same fountain, and still the vast majority of the water is "wasted" landing on the ground. Likewise you can have thousands of people tuning in to the same TV station, and still the vast majority of the photons pouring out of the transmission antennas never hit a receiver antenna, but instead are "wasted" hitting trees or mountains or escaping to outer space. There is no way to tell from looking at the park's water meter whether dozens of dogs drink water from this fountain, or no dogs at all -- the same water comes out of the sprinkler either way. There is no way to tell from looking at the broadcaster's electric meter whether thousands of people are tuned in, or no one is tuned in -- the same electromagnetic power comes out of the transmission tower either way.

This is very different from the way energy flows "through the air" in an air-core concentric-coil transformer, or "through the air" in an air-dielectric capacitor, or the way mains-powered devices "draw in" only the amount of current and power they need.

  • Do radio receivers use any power from a transmitter?

A few crystal radios have no batteries or mains connection -- all the power they have comes from the radio transmitter, and the radio uses power from the transmitter to drive the earphone.

You could argue that most radios extract only the signal from the station; all the power from the antenna ends up warming the BE junction of the first transistor in the pre-amplifier, and 100% of the power "used" by the radio in later stages and to drive the speakers comes from batteries or mains power or a clockwork spring.

How much would make a difference? Well, if we packed enough radios and their antennas all around the broadcast antenna, eventually we would form a Faraday cage -- those radios would absorb all the broadcast energy, and radios outside the Faraday cage could not hear any transmissions from inside it.

There are a few things this analogy does not capture perfectly. Although it is tempting to think of the antenna as a "bucket", since the bigger it is the more photons it catches, a tuned antenna can catch far more energy than one might expect from its size and the local energy density -- 'Energy-sucking' Radio Antennas. If a puppy is catching the droplets on his tongue, and then a German Shepherd steps over him and catches the droplets first, then nothing reaches the puppy -- unless the puppy moves a bit to the side to get out of the shadow of the big dog. Likewise, if you put one radio antenna close to and immediately "behind" another radio antenna (as seen from the transmission tower), the upstream radio will receive perfectly, as if the downstream radio were not even there, and the downstream radio will hear nothing -- until it moves a bit to the side, out of the shadow of the upstream antenna. However, if you move the downstream radio antenna further away from (and yet still behind) the upstream antenna, it will also start to hear the station -- the power from the station "curves around" (diffracts past) the upstream radio.

A typical FM broadcast tower puts out 100 kW ERP (+80 dBm) from an antenna roughly 300 m high. FM radio receivers are expected to work down to a field strength of 0.5 mV/m. A typical radio receiver has a sensitivity of -90 dBm with an antenna roughly 1 m long.
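As a sanity check on those figures, the dBm conversions take only a few lines (0 dBm = 1 mW is the standard reference point; the specific numbers below are just the ones quoted above):

```python
import math

def dbm_to_watts(dbm):
    """Convert a power level in dBm to watts (0 dBm = 1 mW)."""
    return 1e-3 * 10 ** (dbm / 10)

def watts_to_dbm(watts):
    """Convert a power in watts to a level in dBm."""
    return 10 * math.log10(watts / 1e-3)

print(watts_to_dbm(100e3))  # 100 kW ERP -> +80 dBm
print(dbm_to_watts(-90))    # -90 dBm sensitivity -> 1e-12 W, i.e. 1 pW
```

So the ratio between what the tower radiates and what a receiver at its sensitivity limit captures is 80 - (-90) = 170 dB, a factor of 10^17.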

davidcary
  • 17,426
  • 11
  • 66
  • 115
  • Another analogy, is that the receiver will cast a "shadow". – markrages Mar 02 '11 at 02:00
  • forgive me if I am wrong on this, but -90 dBm correlates to 1pW. Just adding this for those that do not do dBm conversions well. – Kortuk Mar 02 '11 at 11:33
  • on the note of the wall of receivers, in this case I think the antenna would start having near fields effects and reflected power effects, but that is above the scope of the OP, just want to note it. – Kortuk Mar 02 '11 at 11:38
  • @markrages: I don't think that's a very good analogy, since radio waves will diffract around any human-sized objects. – endolith Mar 02 '11 at 16:19
18

Does the Sun create more energy when the cat lies on the windowsill?

markrages
  • 19,905
  • 7
  • 59
  • 96
  • 10
    Only if the total power output of the Sun is coupled to the Enlightened Cat – Toby Jaffey Mar 01 '11 at 22:00
  • 1
    @JobyTaffey, well written. @Thomas, there is actually an important point in what Joby wrote, more than an excellent joke, in that if your receiver causes a change in output impedance for the transmitter you can change how much power the tower has to radiate. In reality any device at any range does give a feedback(a perfect antenna system radiates as much power as it receives) and this could couple back to the original tower causing mismatch issues. The theory can get very complicated, but related to building a metal wall nearby that uses power from 1/4 of the field, this would change things. – Kortuk Mar 02 '11 at 11:37
3

That is correct; your statement about dissipation is about right -- the energy keeps radiating out into space or into the distance regardless. Think about light: any number of people can look at a light source, and whether the light lands on their eyes or on the walls, the source doesn't draw any more power.

Some energy is received by the receiver, but this doesn't affect the transmitter.

Brian Carlton
  • 13,252
  • 5
  • 43
  • 64
3

You can only electrically couple to the evanescent standing waves in the near field (less than a wavelength away), where the ratio of E to H (the wave impedance) varies before finally settling to the free-space value of 376.73 Ω. The near-field terms decay rapidly, in proportion to 1/r^2 and 1/r^3. In contrast, the far field is no longer electrically coupled to the transmitter and decays much more slowly, in proportion to 1/r. The energy simply radiates away, whether or not it gets picked up by any receiving antennas, or walls, or whatever. In empty space it would propagate at the speed of light in the pattern set by the antenna design, such as the omnidirectional torus of a dipole antenna.
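The different fall-off rates are easy to see numerically. This is my own illustrative sketch (not from the answer): distances are in wavelengths and each term is normalized to 1 at r = 1, which is enough to show how quickly the near-field terms become negligible.

```python
def field_terms(r):
    """Relative magnitudes of the 1/r (radiating), 1/r^2 (induction)
    and 1/r^3 (electrostatic) field terms at distance r in wavelengths,
    each normalized to 1 at r = 1."""
    return 1 / r, 1 / r**2, 1 / r**3

for r in [0.1, 1.0, 10.0, 100.0]:
    far, induction, electrostatic = field_terms(r)
    print(f"r={r:>6}: 1/r={far:.4g}  1/r^2={induction:.4g}  1/r^3={electrostatic:.4g}")
```

Inside a tenth of a wavelength the 1/r^3 term dominates by a factor of 100; a hundred wavelengths out, it is down by a factor of 10,000 relative to the radiating term.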

Eryk Sun
  • 849
  • 7
  • 11
0

Yes, I would say they get dissipated one way or another. Any energy dissipated in the receiving devices would probably be in the range of picowatts.

Erik
  • 47
  • 1
0

I am neither physicist nor electrical engineer. I did not know the answer to your question, so read Wikipedia for some background. Caveat Emptor.

By definition:

Radio waves are a type of electromagnetic radiation with wavelengths in the electromagnetic spectrum longer than infrared light.

From Maxwell's Laws: (they seem important, I've heard of them)

Ampère's law with Maxwell's correction states that magnetic fields can be generated in two ways: by electrical current (this was the original "Ampère's law") and by changing electric fields (this was "Maxwell's correction").

Maxwell's correction to Ampère's law is particularly important: It means that a changing magnetic field creates an electric field, and a changing electric field creates a magnetic field.[1][2]

Therefore, these equations allow self-sustaining "electromagnetic waves" to travel through empty space

By the Particle Model of Electromagnetic Radiation

EM waves are emitted and absorbed as discrete packets of energy, or quanta, called photons. Because photons are emitted and absorbed by charged particles, they act as transporters of energy

The amount of energy received is defined by the Link Budget. But, it is not coupled to the transmitter.
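A link budget can be sketched with the Friis free-space formula. The numbers here (a 100 kW ERP FM station at 100 MHz, a listener 10 km away, a receiving antenna with 2 dBi of gain) are my own illustrative assumptions, not figures from the question:

```python
import math

def friis_received_dbm(tx_erp_watts, freq_hz, distance_m, rx_gain_dbi=2.0):
    """Received power (dBm) over a free-space path, per the Friis formula.
    tx_erp_watts already includes the transmit antenna's gain (ERP)."""
    wavelength = 3e8 / freq_hz
    tx_dbm = 10 * math.log10(tx_erp_watts / 1e-3)
    # Free-space path loss in dB
    fspl_db = 20 * math.log10(4 * math.pi * distance_m / wavelength)
    return tx_dbm + rx_gain_dbi - fspl_db

p_rx = friis_received_dbm(100e3, 100e6, 10e3)
print(f"received power ~ {p_rx:.1f} dBm")
```

Even a strong local signal like this is a tiny fraction of the 100 kW the tower radiates, which is why the amount received is decoupled from what the transmitter pays for.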

Could enough crystal radio sets mute the BBC World Service?

Toby Jaffey
  • 28,796
  • 19
  • 96
  • 150
-2

I don't think radio receivers affect the transmitter of the radio waves at all. A million radio receivers within the broadcast range of the transmitter will not affect the transmitter any more than one radio within range will. Radios look for the changes imposed on the radio waves by the content; i.e., radio waves are carriers at a specific frequency (and radios tuned to that frequency look only for that frequency); the "content" imposed on the carrier by the transmitter modifies the wave profile. Radios play back the differences between the carrier-wave frequency and the modifications imposed by the content. JMHO.