
This topic has me thoroughly confused.

In terms of Wi-Fi, when we say that 802.11b has a 22 MHz channel bandwidth, what does that mean? When we refer to a particular channel number, say Channel 1, it has a center frequency of 2412 MHz, but it spans from 2401 MHz (-11 MHz) to 2423 MHz (+11 MHz) to form a 22 MHz channel. Why is that? Why can't the channel be just a 2412 MHz signal?

How does the data utilize the 22 MHz width?

Also, how does increasing the bandwidth help increase the throughput (not the data rate)? No highway examples please, because those are layman's analogies and they are inaccurate.

Thanks

MarkU
eecs
  • Because a pure 2412 MHz sine wave carries no information. And you'll find that any signal which is not a pure sine wave has nonzero bandwidth. – user253751 Apr 12 '16 at 22:54
  • How does the data/information get spread over a 22 MHz width then? – eecs Apr 12 '16 at 22:55
  • Not a snark answer! Wikipedia knows all. https://en.wikipedia.org/wiki/Bandwidth_%28signal_processing%29 –  Apr 12 '16 at 23:15
  • It's not intuitive to understand just from knowing that bandwidth is the difference between the highest and lowest frequency. My question is: what factors decide how wide the channel should be, and how does the information/data get spread over it (a sketch of this follows these comments)? A channel is a set of frequencies, yet we always point to the center frequency. – eecs Apr 12 '16 at 23:20
  • My earlier question may have sounded like a duplicate. I modified it to make it more specific, and so that it fits the explanation I'm looking for. – eecs Apr 12 '16 at 23:44
  • "Why can't the channel be just 2412 MHz signal?" is a duplicate of the first part of that other question. I'm not sure that "How does the data utilize the 22 MHz width?" is a question that makes sense in its current wording. That leaves "how does increasing the Bandwidth help in increasing the throughput?" – user253751 Apr 12 '16 at 23:54
  • 1
    Don't think of it as "how does increasing the bandwidth increase the data rate?"; think of it as "how does increasing the data rate increase the bandwidth?" – user253751 Apr 13 '16 at 02:10
  • Not the data rate but the throughput. From what I understand, throughput answers the question "how much?", whereas data rate answers the question "how fast?". I have read people claim that the higher the bandwidth, the higher the throughput, and that is what I am not able to understand. – eecs Apr 13 '16 at 06:38
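
The point made in the first comment above (a pure sine wave carries no information, and anything that is not a pure sine wave has nonzero bandwidth) can be checked numerically. Below is a minimal sketch of my own, not from the thread: it compares the occupied bandwidth of an unmodulated carrier with that of the same carrier whose phase is flipped by random data (BPSK with rectangular pulses). All frequencies are scaled-down assumptions, not Wi-Fi values, purely to keep the FFT small; the modulated signal's bandwidth comes out on the order of the data rate, while the bare carrier's is essentially zero.

```python
import numpy as np

# Demo numbers only (scaled-down assumptions, not Wi-Fi values)
fs = 1_000_000          # sample rate, Hz
fc = 100_000            # "carrier" frequency, Hz (stand-in for 2412 MHz)
bit_rate = 10_000       # data rate, bits per second
n = 100_000             # number of samples (0.1 s)

t = np.arange(n) / fs
carrier = np.cos(2 * np.pi * fc * t)

# BPSK: flip the carrier phase by 180 degrees according to random data bits
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, size=bit_rate * n // fs) * 2 - 1
modulated = carrier * np.repeat(bits, fs // bit_rate)

def occupied_bandwidth(x, fraction=0.90):
    """Width of the set of strongest FFT bins holding `fraction` of the power."""
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    order = np.argsort(power)[::-1]                  # strongest bins first
    cumulative = np.cumsum(power[order]) / power.sum()
    kept = order[: np.searchsorted(cumulative, fraction) + 1]
    return freqs[kept].max() - freqs[kept].min()

print("90% power bandwidth, pure carrier     :", occupied_bandwidth(carrier), "Hz")
print("90% power bandwidth, carrier with data:", occupied_bandwidth(modulated), "Hz")
```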

1 Answer


The original 802.11 standard used an 11-chip Barker code modulation. For each data symbol, an 11-chip sequence is transmitted. The symbol rate is 1 Msymbol/s, so the chip rate is 11 million chips per second. As a result, the spectrum has nulls at +/- 11 MHz around the center frequency (i.e. 22 MHz null to null). The exact shape of the spectrum is determined by the filtering of the modulated chips.
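
To see how those numbers produce the +/- 11 MHz nulls, here is a rough baseband sketch of my own (assumptions: random data at 1 Msymbol/s, each symbol multiplied by the 11-chip Barker sequence, rectangular chip pulses, no transmit filter). Frequencies are relative to the carrier, so 11 MHz here corresponds to 2412 +/- 11 MHz on Channel 1. In this idealized model roughly 90% of the power lands inside the 22 MHz main lobe, and the deepest dip sits at the chip rate.

```python
import numpy as np

barker11 = np.array([+1, -1, +1, +1, -1, +1, +1, +1, -1, -1, -1])
chip_rate = 11e6                         # 11 Mchip/s
samples_per_chip = 8                     # simulation oversampling (assumption)
fs = chip_rate * samples_per_chip        # simulation sample rate, 88 MHz

rng = np.random.default_rng(1)
symbols = rng.integers(0, 2, size=2000) * 2 - 1       # +/-1 data at 1 Msymbol/s

# Spread each data symbol into 11 chips, then hold each chip for one chip period
chips = (symbols[:, None] * barker11[None, :]).ravel()
waveform = np.repeat(chips, samples_per_chip).astype(float)

# Average periodograms over 10-symbol segments for a smoother spectrum estimate
seg_len = 10 * 11 * samples_per_chip
segments = waveform.reshape(-1, seg_len)
psd = (np.abs(np.fft.rfft(segments, axis=1)) ** 2).mean(axis=0)
freqs = np.fft.rfftfreq(seg_len, 1 / fs)

main_lobe = psd[freqs <= 11e6].sum() / psd.sum()
print(f"power within +/-11 MHz of the carrier: {main_lobe:.1%}")

search = (freqs > 8e6) & (freqs < 14e6)
null = freqs[search][np.argmin(psd[search])]
print(f"deepest spectral dip between 8 and 14 MHz: {null / 1e6:.1f} MHz")
```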

IEEE 802.11b introduced a new modulation format called Complementary Code Keying (CCK), but it employs the same 11 million chips per second as the original 802.11, so the output spectrum has the same nulls at +/- 11 MHz relative to the center carrier.
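
As a back-of-the-envelope check (using the rates commonly quoted for 802.11/802.11b rather than anything taken from the answer itself), the sketch below works out the symbol and bit rate of each mode. The chip rate, and with it the width of the spectrum, is 11 Mchip/s in every case; the higher data rates come from packing more bits into each symbol at the same chip rate, not from widening the channel, which is one concrete way a fixed bandwidth can carry different throughputs.

```python
# Assumed mode parameters (commonly quoted figures, listed here for illustration)
chip_rate = 11e6                            # chips per second, all modes

modes = {
    "802.11 DSSS 1 Mb/s (Barker, DBPSK)": dict(chips_per_symbol=11, bits_per_symbol=1),
    "802.11 DSSS 2 Mb/s (Barker, DQPSK)": dict(chips_per_symbol=11, bits_per_symbol=2),
    "802.11b CCK 5.5 Mb/s":               dict(chips_per_symbol=8,  bits_per_symbol=4),
    "802.11b CCK 11 Mb/s":                dict(chips_per_symbol=8,  bits_per_symbol=8),
}

for name, m in modes.items():
    symbol_rate = chip_rate / m["chips_per_symbol"]
    bit_rate = symbol_rate * m["bits_per_symbol"]
    print(f"{name:36s} {symbol_rate / 1e6:5.3f} Msym/s  "
          f"{bit_rate / 1e6:4.1f} Mb/s  (chip rate fixed at 11 Mchip/s)")
```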

See the IEEE 802.11-2012 standard, or the 802.11 Handbook: A Designer's Companion by Al Petrick and Bob O'Hara.

Chris Hansen
  • @Chris Hansen Does QPSK modulation actually transmit 16 bits per 8-bit CCK code word? I am referring to the following source https://books.google.hr/books?id=nSKNDAAAQBAJ&pg=PA422&lpg=PA422&dq=engineering+desk+reference+cck&source=bl&ots=Bxrr4P4_P4&sig=SCfcKL_eUyJwtpW0yUhuDna2OzQ&hl=hr&sa=X&ved=2ahUKEwit9pS2m8XcAhWnh6YKHc9hBZcQ6AEwAXoECAEQAQ#v=onepage&q=engineering%20desk%20reference%20cck&f=false , p. 422 and the table at the bottom left. I am trying to understand how CCK works and this got me a little confused. – Quirik Jul 29 '18 at 21:11