
What I'm thinking of is two-way (voice) communication between two systems that are in motion relative to each other (say, a 100 km/h difference). Will the data transmission and reception be disturbed, and by how much?

Is this present in modern transceivers?

I'm not fixed on a particular frequency. In that case, what difference will a 400 MHz vs. a 2.4 GHz carrier make to data throughput?

Background: I read something about Doppler shift. How much noise will be induced due to this effect?

I found 2 similar questions on the same forum (not the same questions, but about similar communication topics). I'll add the links once I get my hands on my PC.

Dave Tweed
seetharaman

5 Answers


I don't know why everyone else is focusing on the carrier frequency. The data throughput is a question about the modulating frequency (i.e., baud rate).

The Doppler effect applies to the modulation as well. If the transmitter is moving toward the receiver, the baud rate is increased by a factor of 1.000000092; if it is moving away, it is reduced by a factor of 0.999999908.

There is also relativistic time dilation, which is based on what fraction of the speed of light the relative velocity is. If it takes Δt seconds to transmit some number of symbols, they will arrive at the receiver in Δt' seconds:

$$\Delta t' = \frac{\Delta t}{\sqrt{1 - \frac{v^2}{c^2}}}$$

So yes, the data throughput from the receiver's point of view is reduced by a very tiny fraction at the speed you're talking about.

For a speed of 27.778 m/s, that fraction is about \$4.28669\times10^{-15}\$, giving a ratio of 0.9999999999999957.
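Both factors can be checked with a few lines (a quick sketch; \$c\$ is rounded to \$3\times10^8\$ m/s to match the figures in this answer):

```python
# Doppler and time-dilation factors for 100 km/h relative motion
c = 3.0e8          # speed of light, m/s (rounded)
v = 100 / 3.6      # 100 km/h = 27.778 m/s

beta = v / c                  # v as a fraction of c, ~9.26e-8
doppler_factor = 1 + beta     # baud-rate scaling when approaching
dilation = beta**2 / 2        # ~ 1/sqrt(1 - beta^2) - 1 for beta << 1

print(f"v/c             = {beta:.6e}")      # 9.259259e-08
print(f"Doppler factor  = {doppler_factor:.9f}")
print(f"dilation excess = {dilation:.5e}")  # 4.28669e-15
```

The small-velocity approximation \$\beta^2/2\$ is used instead of evaluating \$1/\sqrt{1-\beta^2}-1\$ directly, because the latter loses all its significant digits to double-precision rounding at these speeds.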

Dave Tweed
  • I don't think I ever said that the discussion in my answer applied to the carrier and not the modulation...I'll make it more clear. – The Photon Jan 17 '16 at 21:15

I read something about Doppler shift... Is this present in modern transreceivers?

The Doppler effect doesn't depend on what kind of transmitter and receiver is used. It is fundamental physics: electromagnetic signals propagate at a finite velocity (approximately c), so if you are travelling away from the source, after one wave crest reaches you, the next one takes slightly longer to reach you than the interval between the crests when they left the source.

Will the data transmission and reception be disturbed, by how much? ...

The Doppler effect changes the received frequency according to

$$f_r = (1+\frac{\Delta{}v}{c})f_s$$

where \$f_r\$ is the received frequency, \$f_s\$ is the source frequency, and \$\Delta{}v\$ is the velocity of the receiver relative to the source. As Dave Tweed points out in his answer, this applies to both the carrier frequency and the modulation frequency.

So, in your case, 100 km/hr is about 27.8 m/s, or 0.000000092 c. The data will arrive at the receiver at about 0.99999991 times the rate it left the source. It's unlikely that any practical communications system would be affected by this difference.

How much noise will be induced due to this effect?

The Doppler effect does not induce noise. It just changes the frequency of the signal arriving at the receiver.

The Photon
  • I agree with all the notes about doppler shift, but the frequency shift at the receiver is *a form of noise* (albeit tiny in this case) in the sense the inbound signal is being very slightly modulated due to relative motion of the link pair. – Peter Smith Jan 17 '16 at 17:23
  • 3
    @PeterSmith, if the tx and rx are have constant (or even controlled) relative velocity, I would not call it noise because it is not a random effect. If \$\Delta{}v\$ is random for some reason, then I suppose you could call it noise. – The Photon Jan 17 '16 at 17:24
  • 1
    One notion that it's not noise is the fact that you can predict and correct for it. Maybe it's better to call it an offset. – gbarry Jan 17 '16 at 18:07

Doppler shift at those speeds amounts to a few hundred hertz at most: roughly 40 Hz at 433 MHz and 220 Hz at 2.4 GHz. A narrow-band transceiver at 433 MHz might suffer reduced sensitivity with such an offset. An 802.15.4 transceiver at 2.4 GHz (Zigbee and similar), with its 5 MHz channel, will not even notice it. Throughput will be affected only if the link has an adaptive scheme of some sort.
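The shift \$\Delta f = f\,\Delta v/c\$ for the two carriers mentioned in the question can be computed directly (a quick sketch, with \$c\$ rounded to \$3\times10^8\$ m/s):

```python
c = 3.0e8        # speed of light, m/s (rounded)
v = 100 / 3.6    # 100 km/h relative speed, in m/s

for f_carrier in (433e6, 2.4e9):
    shift = f_carrier * v / c    # first-order Doppler shift, Hz
    print(f"{f_carrier/1e6:.0f} MHz carrier -> {shift:.0f} Hz shift")
```

Either shift is a tiny fraction of a 5 MHz channel, which is why a wide-band receiver never notices it.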

Lior Bilia
  • Could you add some reference for further reading. Thanks. – seetharaman Jan 17 '16 at 15:52
  • @seetharaman, links to further reading aren't the focus of this site. If you want to know more, just ask another question. – The Photon Jan 17 '16 at 16:34
  • @ThePhoton what I meant was some description that provides the mathematical explanation for the given answer. How would that be off the scope? – seetharaman Jan 17 '16 at 16:57
  • You should just ask Lior to give some more math, or edit your question to clarify that you want a mathematical answer. – The Photon Jan 17 '16 at 16:58

Yes, Doppler will, on many systems, influence throughput.

First of all, though, as mentioned in the other answers, the induced symbol-rate change will not be measurable on any system that I know of. The point is that receivers always have to do some kind of timing recovery if they want to achieve high throughput for any significant amount of time. That timing recovery is needed because no two oscillators are physically identical, so the transmitter's symbol clock will never be exactly the symbol clock the receiver would have generated by itself. Oscillator errors are typically in the parts-per-million range, down to parts per billion if an extremely good oscillator is used, so the Doppler-induced rate change is simply absorbed by that recovery loop. So yes, there is a rate change, but no, it won't be spottable at these speeds.

Then, of course, a transceiver is typically a complex system. Often, if not usually, if bidirectional, there's a method to ask for data that hasn't reached its target to be re-sent. If Doppler shifts your TX signal out of the "sweet spot" your receiver is currently tuned to, there's a good chance that will reduce SNR, and hence increase Symbol Error Rate, and hence increase packet error rate, and if forward error correction can't deal with that, throughput will be reduced by the amount of packets that have to be re-sent.

Many systems, among them WiFi, LTE etc., are rate-adaptive, in that they can reduce the modulation order, making the transmission more robust against errors, and/or increase coding redundancy, making errors easier to detect and correct. Both measures might kick in when Doppler stresses whatever control-loop structure is used to track the signal, actively reducing the available rate. Keep in mind, however, that these systems typically optimize the rate for the channel they observe:

Though Doppler doesn't introduce "new" noise, it can decrease SNR by reducing the signal power actually available for demodulation, or by increasing the average error. For example, a frequency offset in a quadrature demodulation receiver leads to an ongoing rotation of the constellation diagram. Hence, you get something that looks like noise: your average received constellation point is no longer on the spot it "should" be, but is off by a certain angle most of the time. That means less additional noise is needed to make a symbol be misinterpreted as a neighboring symbol.
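The rotation effect can be illustrated numerically. This is a sketch with hypothetical numbers (a 100 Hz residual offset on a 1 Msym/s QPSK link; both figures are assumptions, not from the question):

```python
import cmath
import math

delta_f = 100.0   # hypothetical uncorrected carrier offset, Hz
t_sym = 1e-6      # hypothetical symbol period (1 Msym/s)

# An uncorrected offset rotates every received point by
# 2*pi*delta_f*t_sym radians per symbol.
phase_per_symbol = 2 * math.pi * delta_f * t_sym

# Ideal QPSK constellation points at 45, 135, 225, 315 degrees.
qpsk = [cmath.exp(1j * (math.pi / 4 + k * math.pi / 2)) for k in range(4)]

# After 1000 symbols with no carrier tracking, the whole
# constellation has turned by:
n = 1000
total_deg = math.degrees(n * phase_per_symbol)
rotated = [p * cmath.exp(1j * n * phase_per_symbol) for p in qpsk]
print(f"rotation after {n} symbols: {total_deg:.0f} degrees")  # 36 degrees
```

QPSK decision boundaries sit 45 degrees away from each constellation point, so an untracked 36-degree rotation eats almost the entire noise margin; this is why receivers run a carrier-tracking loop.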

Reduced SNR will directly reduce the channel capacity:

Shannon's channel capacity is

$$C = b\,\log_2(1+\mathrm{SNR})$$

with \$b\$ being the bandwidth in Hz, SNR the (linear, not dB) ratio of signal to noise power, and \$C\$ the resulting capacity in bit/s.

No matter how good your modulation is or how efficient your forward error correction is, you will never be able to transmit more bits per second over a channel than the formula above allows.
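Plugging in some illustrative numbers (assumed, not from the question: a 5 MHz channel at 10 dB SNR):

```python
from math import log2

b = 5e6                       # bandwidth, Hz (e.g. one 802.15.4-style channel)
snr_db = 10.0                 # assumed signal-to-noise ratio, dB
snr = 10 ** (snr_db / 10)     # convert dB to a linear power ratio

c_cap = b * log2(1 + snr)     # Shannon capacity, bit/s
print(f"C = {c_cap/1e6:.2f} Mbit/s")  # 17.30 Mbit/s
```

Dropping the SNR by a few dB (for example through an untracked frequency offset) shrinks this ceiling directly, which is the mechanism the surrounding paragraphs describe.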

Hence, the central trick is making sure you extract as much of \$S\$ from your signal as possible while reducing \$N\$ as far as possible. As mentioned above, a frequency shift looks like increased \$N\$, but if you have a control loop that corrects it fast enough, you can nevertheless keep \$N\$ low.

Marcus Müller

Doppler of this small amount won't affect data/voice integrity, but moving the relative positions of the two communicating systems can play havoc at some point or other due to signal cancellation (multipath fading). It will be short-lived but could become a nuisance. Then again, two static systems can also suffer data-integrity problems because other things are moving, and those other things could cause an almost permanent loss of signal due to fading. So it's swings and roundabouts, really.

I'm not fixed on a particular frequency. In that case, what difference will it make for a 400 MHz vs a 2.4 GHz carrier on data throughput?

Regarding the transmit/receive frequency: lower frequencies require a larger antenna (to suit the wavelength of the carrier), and because the antenna must be larger, it will inevitably receive more power from the transmitter. An antenna may look like a simple straight piece of wire, but it has an effective area called an aperture, so: lower frequency = longer wavelength = bigger antenna = more square metres = more power received.

The extra loss in the link for operating at 2.45 GHz compared to 400 MHz is 20log\$_{10}\$(2.45/0.4) = 15.7 dB so it's quite a chunk of power loss going up to 2.45 GHz from 400 MHz. Of course, this assumes that the transmit power for both frequencies is the same.
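The 15.7 dB figure follows directly from the frequency ratio (a quick check):

```python
from math import log10

f_low, f_high = 400e6, 2.45e9                # the two candidate carriers, Hz
extra_loss_db = 20 * log10(f_high / f_low)   # free-space path loss scales with f^2
print(f"extra link loss at 2.45 GHz: {extra_loss_db:.1f} dB")  # 15.7 dB
```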

Some related Q&A: -

Long range (~15 km) low baud-rate wireless communication in a mountain environment (no LOS)

How to know (or estimate) the range of a transceiver?

Andy aka
  • Regarding the frequency of transmission, if the lower freq is used, it will increase antenna size, but the data distortion will be relatively less(compared to higher freq). Is that right? – seetharaman Jan 18 '16 at 00:39
  • The lower frequency HAS to use a bigger antenna THEREFORE it retrieves more power at a given distance from the transmit antenna. More power = lower susceptibility to interference = more robust. – Andy aka Jan 18 '16 at 08:42