
I am studying 6G communication. There are many publications about it covering frequency, latency, bandwidth (BW) and other KPIs [1], [2].

One of the most popular research topics now is the integration of non-terrestrial networks into 6G: satellites, HAPs, LAPs, … If we integrate a satellite into the terrestrial network, the propagation delay over the satellite link will be much longer than over a terrestrial link because of the satellite's altitude.
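To get a feel for the scale of that delay, here is a rough back-of-the-envelope calculation. The altitudes are illustrative values I picked (a typical LEO and the GEO altitude), and it assumes a straight-down free-space path; real slant paths are longer.

```python
# One-way propagation delay vs. platform altitude (straight-down path).
C = 299_792_458  # speed of light in m/s

altitudes_km = {
    "terrestrial cell (2 km)": 2,
    "HAP (20 km)": 20,
    "LEO satellite (550 km)": 550,
    "GEO satellite (35 786 km)": 35_786,
}

for name, h_km in altitudes_km.items():
    delay_ms = h_km * 1e3 / C * 1e3
    print(f"{name}: {delay_ms:.3f} ms one-way")
```

A 550 km LEO link adds roughly 1.8 ms one-way, while GEO adds about 120 ms, against tens of microseconds for a terrestrial cell.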

Considering realistic bandwidth, carrier frequency, path attenuation, polarization mismatch, and the large delay and Doppler shift/rate, we can conclude that a 6G system with a satellite segment will not reach extremely high data rates.

How can I compute how much data rate we lose if we integrate a non-terrestrial network?

winny
Aid22

1 Answer


Data rates are ultimately limited by the Shannon-Hartley law through bandwidth and SNR; practical rates are further reduced by compression and coding overhead, modulation efficiency, and fading losses.
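A minimal sketch of that limit, with an illustrative 400 MHz channel and a few example SNR values (not taken from any specific link budget), shows how quickly capacity falls as SNR degrades:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon-Hartley capacity C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

B = 400e6  # 400 MHz channel, illustrative
for snr_db in (25, 10, 0):
    c = shannon_capacity_bps(B, snr_db)
    print(f"SNR {snr_db:>3} dB -> {c / 1e9:.2f} Gbit/s")
```

Plugging the extra path loss, polarization mismatch, and fading margin of the satellite link into the SNR term is the direct way to quantify the capacity you lose relative to a terrestrial link of the same bandwidth.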

Latency does not limit data rates for large transfers. It can be tolerated for large packets of data, and the link may still compete with Gbps Wi-Fi in LANs, because the error-feedback protocol does not need to ACK every packet: it can NAK with the sequence number of the last valid packet received when an error occurs.

Latency will mainly show up in ping times, which use very small packets.

BER is then measured on large data packets, and throughput, as usual, depends on traffic and will be some fraction of the burst data rate with jumbo packets.

Thus the protocol must adapt to the latency to optimize throughput, with burst rates depending on initialization and on data-rate throttling in response to disturbances. When fading dynamically disturbs the error rate, keeping some memory of historical patterns may be needed, along with user preferences for moving versus static transceivers and contention and connection handling with anti-jamming protocols.
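The need to adapt the protocol to latency can be made concrete with the bandwidth-delay product: a sender limited to W bytes in flight can never exceed W/RTT, no matter how fast the link is. The window size and RTT values below are illustrative, not from any particular standard:

```python
# Throughput ceiling imposed by a fixed ARQ/flow-control window:
# at most `window_bytes` may be unacknowledged at once, so
# throughput <= min(link rate, 8 * window / RTT).
def max_throughput_bps(window_bytes, rtt_s, link_rate_bps):
    return min(link_rate_bps, 8 * window_bytes / rtt_s)

link = 10e9          # 10 Gbit/s link, illustrative
window = 64 * 1024   # classic 64 KiB window, for comparison
for name, rtt in (("terrestrial, 10 ms RTT", 0.010),
                  ("LEO, 50 ms RTT", 0.050),
                  ("GEO, 550 ms RTT", 0.550)):
    t = max_throughput_bps(window, rtt, link)
    print(f"{name}: {t / 1e6:.1f} Mbit/s")
```

With an unadapted 64 KiB window, the GEO RTT caps throughput below 1 Mbit/s; scaling the window to the bandwidth-delay product (or using selective/NAK-based repeat) removes that cap, which is exactly the adaptation described above.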

Tony Stewart EE75