
Sorry if this was asked before. I was unable to find such a question elsewhere in SE.

Why does space-wave communication (i.e., satellite links) use different frequencies for the ground-station (GS1) uplink and the satellite (S) (re)broadcast downlink? If "interference" is the answer, where does it originate? Would it be interference between GS1 and the receiving GS (without S)? That would seem to imply the broadcast signals are not roughly straight-line, and also that the receiving GS would suffer the same problem from other (unmatched?) ground stations.

I was thinking that any slight ionospheric reflection of the GS1 signal (if there is any at these higher frequencies) arriving at the receiving GS would interfere with the S (re)broadcast.

Now, if the frequencies really must be different, why is it usually written that f(GS1) > f(S)? If rain attenuation is the answer, how would this not also affect (a) the GS1 signal, and (b) transmissions that must use higher frequencies (perhaps no lower ones are available), in which case both the GS1 and S signals would suffer?

Thanks!

1 Answer


The main reason the uplink and downlink signals are separated is that you don't want to mix high-power and low-power signals in the same band. The LNA on the receiver input can be saturated by out-of-band signals that are too close in frequency to be filtered out. On the ground, the uplink will be extremely powerful while the downlink will be extremely weak, so to have any hope of receiving the downlink you have to separate the frequencies widely enough that you can filter out the uplink effectively.

Now, specifically which frequencies are used is a convention, but I imagine the uplink frequencies tend to be higher because ground stations generally have far more power available to run the transmitter than the satellite does, and so can work more effectively at higher frequencies, which have more path loss.
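To put a rough number on that last point: free-space path loss grows as 20·log10(f), so the higher-frequency leg of a link pays a few extra dB. A minimal Python sketch, using the classic C-band pairing (~6 GHz up / ~4 GHz down at geostationary range) purely as illustrative values, not as anything the answer specifies:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

d_geo = 35_786e3  # geostationary altitude, metres

up = fspl_db(d_geo, 6e9)    # ~6 GHz C-band uplink (illustrative)
down = fspl_db(d_geo, 4e9)  # ~4 GHz C-band downlink (illustrative)

print(f"uplink FSPL:    {up:.1f} dB")        # ~199.1 dB
print(f"downlink FSPL:  {down:.1f} dB")      # ~195.6 dB
print(f"uplink penalty: {up - down:.1f} dB") # 20*log10(6/4) ≈ 3.5 dB
```

The ~3.5 dB penalty on the higher leg is easier to absorb on the ground, where transmitter power is cheap, than on the satellite.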

alex.forencich
  • Thanks, Alex. The power use issue makes sense. Regarding "The LNA on the input can be saturated by out of band signals that are too close to be filtered out" ... does this imply that the uplink signal isn't beamed very tightly at all? I mean, if it's being sent via line-of-sight propagation, then for a ground station to unwittingly pick up that signal, it must have either been diffracted significantly or been sent out over 2π in the first place... if you follow what I mean. – Declan Mar 30 '15 at 11:22
  • It's even more of a problem on the satellite itself, because of the volume and weight constraints. If the satellite is broadcasting on its own input frequency, it won't be able to "hear" the signal from the ground at all. – Dave Tweed Mar 30 '15 at 12:40
  • The issue is that the difference in signal power is HUGE. The TX might be something like 50 dBm (100 W) while the RX might be something like -100 dBm. 150 dB is a crazy difference in power level (see the sketch after these comments). Antennas are directive, but they are not 150 dB directive. Maybe 50 dB for a really big one. If your RX LNA is saturated by a 0 dBm signal, there is no way you are going to be able to pull a -100 dBm signal out of that. It's really a restriction preventing other terrestrial sources (not just the satellite uplinks) from interfering with the downlink signal. – alex.forencich Mar 30 '15 at 17:53
  • Also satellites do not always use very directive antennas. I think high-bandwidth communication satellites and satellites that do not want to be listened to (e.g. spy satellites) are the only ones that bother with high-gain downlink antennas, either pointable or multi-beam. Note that in many cases the downlink antenna is already pretty high gain, but the satellite is so far from the earth that the ENTIRE PLANET is only a few degrees across. Also note that you need a large antenna to get a high antenna gain, and satellites are space and weight constrained. – alex.forencich Mar 30 '15 at 18:02
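As a sanity check on the dB arithmetic in the comment above (the 50 dBm and -100 dBm figures are the commenter's round illustrative numbers, not a specification), here is a minimal sketch:

```python
def dbm_to_watts(dbm):
    """Convert a power level in dBm (dB relative to 1 mW) to watts."""
    return 10 ** (dbm / 10) / 1000

tx_dbm = 50    # ground-station transmit power: 50 dBm = 100 W
rx_dbm = -100  # received downlink power at the same site

print(dbm_to_watts(tx_dbm))  # 100.0 (watts)
print(dbm_to_watts(rx_dbm))  # 1e-13 (watts, i.e. 0.1 pW)
print(tx_dbm - rx_dbm)       # 150 (dB between them)
```

A 150 dB gap is a factor of 10^15 in power, which is why frequency separation and filtering, rather than antenna directivity alone, is needed to keep the uplink out of the downlink receiver.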