36

Bluetooth, WiFi, Zigbee, remote controls, alarms, cordless phones, etc.

Why do all of these protocols and devices use the 2.4 GHz band instead of, say, 3.14 GHz? What is so special about it?

cbr

  • We can also add **Microwave Ovens** to the list of 2.4 GHz devices. – Nick Alexeev Aug 19 '14 at 03:31
  • Because it's difficult to get a stable oscillation at *exactly* π GHz? – user Aug 19 '14 at 20:31
  • @MichaelKjörling For π GHz use a circulator! ;) – Phil Aug 20 '14 at 17:04
  • @NickAlexeev - Interestingly, [microwave ovens are apparently one of the original reasons why 2.45GHz was made an ISM band.](http://www.itu.int/dms_pub/itu-s/oth/02/01/S020100002B4813PDFE.pdf) (pg 466) – Compro01 Aug 21 '14 at 03:42

5 Answers

44

2.4 GHz is one of the industrial, scientific and medical (ISM) radio bands. ISM bands are unlicensed, which makes it easier to certify equipment with the FCC (or its counterparts in other countries).

However, what's special about 2.4 GHz in particular? There are about a dozen ISM bands, some at higher frequencies and some at lower. Not all ISM bands are international, but 2.4 GHz is.
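For a sense of the landscape, here is a minimal Python sketch listing a subset of the ITU ISM allocations (band edges recalled from the ITU table; treat the list as illustrative rather than authoritative):

```python
# Illustrative subset of ITU ISM allocations (band edges in MHz).
# The scope column is the point: only some ISM bands are worldwide.
ISM_BANDS = [
    (13.553,  13.567,  "worldwide"),
    (26.957,  27.283,  "worldwide"),
    (40.66,   40.70,   "worldwide"),
    (433.05,  434.79,  "ITU Region 1 only"),
    (902.0,   928.0,   "ITU Region 2 only"),
    (2400.0,  2500.0,  "worldwide"),
    (5725.0,  5875.0,  "worldwide"),
    (24000.0, 24250.0, "worldwide"),
]

for lo, hi, scope in ISM_BANDS:
    print(f"{lo:9.2f}-{hi:<9.2f} MHz  ({hi - lo:7.2f} MHz wide, {scope})")
```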


**Update:**

Microwave ovens also operate at 2.4 GHz, which is not a coincidence.
Short version in Q&A format:

Q: Why does so much wireless communication operate in the 2.4 GHz band?
A: Because it's an ISM band, and it's unlicensed, and it's international.

Q: Why is 2.4 GHz an unlicensed band?
A: The FCC originally set aside this band for microwave heaters (cookers, ovens). As a result, this band has been polluted by microwave ovens from the beginning.

Q: Why 2.4 GHz for microwave ovens? Microwave ovens can work at pretty much any frequency between 1 and 20 GHz. There's nothing special (like a resonance) about the absorption of microwaves by water at 2.4 GHz (see also here).

A: The frequency choice was based on a combination of empirical measurements of heat penetration for various foodstuffs, design considerations for the size of the magnetron, and frequency considerations for any resulting harmonic frequencies.

[These considerations were proposed by Raytheon and GE to FCC in 1946, when the decision about 2.4 GHz was made.]
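For a rough feel for the magnetron-size and harmonics considerations, here is a back-of-the-envelope Python sketch (the arithmetic is mine, not from the 1946 filing):

```python
C = 299_792_458  # speed of light, m/s

f0 = 2.45e9  # centre of the band chosen for ovens, Hz

# The free-space wavelength sets the scale of the magnetron cavity and
# the oven chamber: ~12.2 cm is compact enough for a consumer appliance.
print(f"wavelength at 2.45 GHz: {100 * C / f0:.1f} cm")

# A magnetron also radiates weak harmonics; the band placement had to
# keep multiples of f0 from landing on other allocated services.
for n in (2, 3, 4):
    print(f"{n}x harmonic: {n * f0 / 1e9:.2f} GHz")
```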

The long version can be found here. [That link goes to Indiegogo, because this bit of historical research was crowd-funded.]
Also, this FCC document (54 MB) from 1947 may be of interest. Thanks to @Compro01 for finding this reference.

Nick Alexeev
29

The "special" thing about 2.4GHz is that when spectrum was allocated for various needs in the 60's and 70's, no one wanted it, because it was thought that atmospheric water absorption made it useless.

Lior Bilia
  • When we were experimenting with 802.11-based wireless links in the 1990s (back then, only two companies provided the equipment: BreezeCom and Western Radio), we used directional antennas to shoot the signal 3 to 5 miles. Trees gave us a lot of problems in the summer because of the water in the leaves. One type of tree was particularly problematic, but I don't remember which it was. It effectively forced "line of sight" propagation. – jww Aug 19 '14 at 08:13
  • Ding ding ding ding ding ding! Nearly (9/10) the correct answer. Atmospheric water absorption DOES make it, not "useless," but a lot less useful than a lot of other microwave bands *for long-distance work*. But as "EE developer" said above, that actually is an advantage for what are supposed to be "local area networks". Note that the crazily long-distance WiFi contacts that are made each year around DefCon time are done in the Nevada desert, where there's very little water in the air (and, of course, big high-gain dish antennas). – Jamie Hanrahan Aug 19 '14 at 21:03
  • Have a look at [water absorption spectrum](http://www.csudh.edu/oliver/che230/textbook/watabsb.gif) it starts to climb around 1 GHz and keeps climbing and climbing through 2.4 GHz, 5.7 GHz and onwards. – Nick Alexeev Aug 19 '14 at 23:07
  • Incidentally this (water absorption) is exactly why microwave ovens operate at this frequency - the power couples well into the water in the food. And while this makes it not-so-great for long range transmission, this is an advantage for WiFi / Bluetooth etc operation, not a hindrance. – Floris Aug 20 '14 at 04:32
  • @Floris Just to clarify, there's nothing out of the ordinary (like resonance) happening with water at 2.4 GHz. Microwave ovens can work on pretty much any frequency between 1 and 20 GHz. Apparently, we don't know the specific reasons why 2.4 GHz was chosen. Maybe the other bands were already occupied. (There are even people who are [raising funds for historical research](https://www.indiegogo.com/projects/why-2-4ghz-chasing-wireless-history) on the subject of 2.4 GHz.) – Nick Alexeev Aug 20 '14 at 09:16
  • @NickAlexeev that project was funded 2 years ago; if you go to the updates tab you can see what he found. Short version was that 2.4 GHz was baked into GE's microwave oven design and close enough to what Raytheon originally wanted to work for them too. – Dan Is Fiddling By Firelight Aug 20 '14 at 13:02
14

It is 'special' because it does not go very far.

Strangely, this turns out to be an important advantage: many devices and people can use the same band in nearby areas without interference.

Teledensity is the term the phone industry uses for how many cordless phones operate per square mile. Early-generation cordless phones (25 years ago) used a few MHz to tens of MHz and went too far. Modern (as of 2014) cordless phones use GHz bands (some other than 2.4 GHz) for short range and high teledensity.

There is a social as well as a technical dimension behind this. My first job, 30 years ago, was on first-generation cordless phones using 1 MHz and 50 MHz, with a working range of a few miles: excellent for farmers and country-sized homes.

Cell phones were just coming out, priced at about 5% of a house; too costly in that social context, so cordless phones fit the demand.

As more people used them, interference became severe; a phone would at times have 10 blinking LEDs searching for an unused channel because the band was so crowded. The industry then moved to higher frequencies, 900 MHz and the like.

Then came spread spectrum. There was a time when spread-spectrum sessions at IEEE technical conferences were off limits to civilians. That changed, and spread-spectrum technology moved into consumer items: WLAN, GPS, cordless phones, 3G cell phones, remote-control models, Bluetooth.

The next move up, to 2.4 GHz, did the trick of balancing the social needs: short range (Bluetooth reaches a few meters, WLAN tens of meters), spread spectrum, interference resistance, and automatic channel search (old RC modelers flew colored flags on their antennas to tell others to stay off their channel).

As other responders pointed out, cost also played a part. My first 2.4 GHz WLAN adapter was a 4-by-10-inch PC plug-in card costing US$2,000. Now we have fingernail-sized USB dongles costing orders of magnitude less.

2.4 GHz was 'special' because it does not go very far.

Also, spread spectrum and the social demand of the time shaped the present situation described by the original poster: many devices use 2.4 GHz.

EEd
  • yet, strangely enough, I recall having a 2.4 GHz and a 5 GHz cordless phone, and the former had much better range. – Michael Aug 19 '14 at 16:12
  • Free-space path loss is proportional to the square of the frequency of the radio signal, so a higher frequency means much higher path loss (see the sketch after these comments). 2.4 GHz was chosen, among other reasons, because it has much greater loss than the tens-of-MHz bands used by first-generation cordless phones. http://en.wikipedia.org/wiki/Free-space_path_loss – EEd Aug 19 '14 at 16:20
  • See http://en.wikipedia.org/wiki/Cordless_phone for how cordless phones moved upward in frequency and changed to digital systems, reducing the working distance to what was needed and no more. Digital also prevents others from listening in and using your phone line for long-distance calls, which was very expensive at the time. – EEd Aug 19 '14 at 16:31
  • There's such a thing as a modern cordless phone? – Matti Virkkunen Aug 19 '14 at 23:45
  • In the context of frequency and short range, "modern" means present-day units at GHz versus first-generation units, about 20 to 25 years ago, at 1 MHz and 50 MHz. Some operated at 900 MHz in the interim and still do. – EEd Aug 20 '14 at 03:48
  • Edited to clarify the technical vs. social context. – EEd Aug 21 '14 at 09:37
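To put numbers on the path-loss comment above, here is a minimal Python sketch of the free-space path-loss formula (the 100 m distance and the choice of bands are illustrative assumptions):

```python
import math

C = 299_792_458  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Same 100 m path, different cordless-phone/WLAN bands.
for f_hz in (50e6, 900e6, 2.4e9, 5.8e9):
    print(f"{f_hz / 1e6:6.0f} MHz: {fspl_db(100, f_hz):5.1f} dB")
```

Each doubling of frequency adds about 6 dB of loss, so 2.4 GHz loses roughly 34 dB more than 50 MHz over the same path: that is the "does not go very far" property the answer describes.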
5

Some of the reason is cost (both financial and the power budget versus distance); some of it is which frequencies are reserved for other types of devices and communications, and the interference that would be caused by straying into those frequencies.

When a frequency is chosen for widespread use, it is cheaper to use off-the-shelf parts in your design than to start from scratch at a particular frequency. You can buy a ready-made transceiver used by millions of devices for a lower cost per unit than a custom-made transceiver.

http://en.wikipedia.org/wiki/ISM_band

and

http://en.wikipedia.org/wiki/Electromagnetic_interference_at_2.4_GHz

have some info regarding the frequency designations.

3

As others have said, it's an ISM band, and all of the other listed reasons are valid. But I think another part of the reason it's more popular than other ISM bands is that it is available in almost all countries, whereas some bands are only ISM in certain regions. It is also fairly wide compared to other ISM bands; as you go up in frequency, the ISM bands seem to get wider.

In fact, 5 GHz WiFi is getting more common all the time as 2.4 GHz gets more crowded. The 5.8 GHz band has 150 MHz of width whereas 2.4 GHz only has 100 MHz. 5 GHz can't go through walls quite as well, but they say it can go through smaller holes, like under doors.
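A quick sketch of that bandwidth arithmetic in Python (the naive 20 MHz channel division is an illustrative assumption; real channel plans have guard bands, and in practice 2.4 GHz yields only three non-overlapping 20 MHz channels):

```python
# How many 20 MHz Wi-Fi-style channels fit in each ISM band, by naive
# division. Real channel plans are tighter, but the ratio holds: the
# 5.8 GHz ISM band simply has more room than the 2.4 GHz one.
BAND_WIDTH_MHZ = {"2.4 GHz ISM": 100, "5.8 GHz ISM": 150}
CHANNEL_MHZ = 20

for name, width in BAND_WIDTH_MHZ.items():
    print(f"{name}: {width // CHANNEL_MHZ} x {CHANNEL_MHZ} MHz channels")
```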

Volker Siegel
EternityForest