18

Infrared has a frequency of 300 GHz – 430 THz, while WiFi has a frequency of 2.4 GHz or 5 GHz.

Since the frequency of infrared is greater than that of WiFi, the transfer rate (bitrate) of IR should be greater than that of WiFi.

In reality, IR transmission is in the kbps range, while WiFi in a WLAN is about 100 Mbps.

We know that a higher-frequency wave can carry more data, but it copes poorly with obstructions such as thick walls, so its coverage range is lower; the reverse holds for a lower frequency.

That statement only seems to hold when comparing the mobile cellular LTE band (2.3 GHz) with WiFi (2.4 GHz), where the WiFi bitrate is higher than LTE's, but not when comparing IR with WiFi.
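
For a rough sanity check (with assumed, illustrative numbers only), Shannon's capacity formula says the achievable bitrate depends on the channel *bandwidth* and the signal-to-noise ratio, not on the carrier frequency:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon capacity: C = B * log2(1 + SNR). The carrier frequency does not appear."""
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

# Assumed, illustrative figures (not measurements):
print(f"WiFi-like, 20 MHz channel @ 25 dB SNR : {shannon_capacity_bps(20e6, 25) / 1e6:7.1f} Mbit/s")
print(f"IR-remote-like, 5 kHz @ 25 dB SNR     : {shannon_capacity_bps(5e3, 25) / 1e3:7.1f} kbit/s")
print(f"Fiber-like, 50 GHz @ 25 dB SNR        : {shannon_capacity_bps(50e9, 25) / 1e9:7.1f} Gbit/s")
```

The 2.4 GHz versus hundreds-of-THz carrier never enters the formula; only how much spectrum is actually modulated and how clean the received signal is.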

Edit: I said kbps after reading a random article about an IR remote.

  • 9
    Fibre optic can exceed 100Mbps easily. I think by a factor of 100x :) – anrieff Apr 01 '23 at 13:46
  • 21
    I'd say your premise is faulty. You can get IR transmitters with giga or terabit per second transmission rates, the modern internet is based around them. Perhaps you meant to ask why the specific product you've picked is so slow? – user1850479 Apr 01 '23 at 13:47
  • 5
    It's no theoretical limitation - it's just how the protocols are designed. In particular, infrared communications *was* very low-cost whereas wi-fi requires quite sophisticated chips – user253751 Apr 01 '23 at 13:58
  • I said kbps after I read a random article about an [IR Remote](https://www.rfwireless-world.com/Terminology/IR-TV-Remote.html#:~:text=The%20typical%20bit%20rate%20is,using%20external%20mounted%20ceramic%20resonator.) – Muhammad Ikhwan Perwira Apr 01 '23 at 14:04
  • 11
    To be clear you asking why your TV remote sends information at a low rate? How fast were you expecting it to send button clicks? – user1850479 Apr 01 '23 at 14:35
  • 6
    When you say "IR", could you state what you mean _specifically_? I.e., if you mean a TV remote, then **edit your question** to say so. – TimWescott Apr 01 '23 at 15:51
  • The IR light itself may be very high frequency, but the electronics in the remote control modulate it at about 36 to 38 kHz. The presence (or absence) of this 36 kHz modulated IR light then signifies a "1" or a "0". Many bits are used to send a single button press ... a preamble, followed by the code (a rough timing sketch follows these comments). Holding the button usually just repeats this over and over ... – Steve Apr 02 '23 at 01:54
  • 2
    The important thing is the receiver in the TV or whatever needs to be able to distinguish between the remote sending a code, and all the other dazzling IR light it may also see. The receiver in the TV and the transmitter in the remote must also agree on what means what. And the code needs to be somewhat error-free, so trying to turn up the volume on the big part of a scene doesn't turn the TV off! The receiver should be able to see as wide of a field-of-view as possible, and the remote should be a nice, "bright" wide-angle LED too. – Steve Apr 02 '23 at 01:57
  • 4
    @anrieff, make it at least 250x to 1000x, as 25 Gb/s and 1000 Gb/s links exist and aren't exactly rare. Though AFAIK 100G usually uses four channels, so perhaps that's cheating. :) – ilkkachu Apr 02 '23 at 15:00
  • 1
    sigh, meant to write 25 and 100 Gb/s... Yes, there are faster ones yet, but 100 Gb/s is rather common already. – ilkkachu Apr 03 '23 at 06:51
  • just like why [UWB](https://en.wikipedia.org/wiki/Ultra-wideband) in modern phones has such low throughput despite using a very wide bandwidth: they're designed for different purposes – phuclv Apr 04 '23 at 06:37
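
As a rough illustration of the timing Steve describes in the comments above (assumed NEC-style values; real remote protocols vary), the effective bitrate of a remote lands in the hundreds of bit/s:

```python
# Assumed NEC-style remote timings (values differ between protocols).
HEADER_S  = 9e-3 + 4.5e-3   # leading burst + space
BIT0_S    = 1.125e-3        # 562.5 us burst + 562.5 us space
BIT1_S    = 2.25e-3         # 562.5 us burst + 1687.5 us space
STOP_S    = 562.5e-6        # final burst
DATA_BITS = 32              # address, ~address, command, ~command

# Because each byte is sent together with its inverse, half the bits are ones.
frame_s = HEADER_S + (DATA_BITS // 2) * (BIT0_S + BIT1_S) + STOP_S
print(f"One button press  : {frame_s * 1e3:.1f} ms on air")
print(f"Effective bitrate : {DATA_BITS / frame_s:.0f} bit/s")
```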

6 Answers

55

The premise is false. Infrared communication can be much higher bandwidth than wifi; just look at modern fiber-optic communications.

It looks like you're really asking why wifi is faster than the protocol used by infrared remote controls, though. The reason for that is quite simple: the remote control doesn't need to transmit quickly. It only needs to send a tiny amount of data intermittently. Designing a high-bandwidth communications protocol simply to let your TV know that you hit the power button is completely unnecessary. Remote controls are also designed to last a long time on small batteries, so simpler protocols are better--complicated modulation schemes require more electronics and more power consumption. And on top of that, the system used by remote controls was designed in the 1970s! Not exactly what you'd call modern communications systems.

Hearth
  • 27,177
  • 3
  • 51
  • 115
  • 12
    For that matter, some _other_ protocol operating at 2.4GHz could have a higher bit rate than wifi, especially if you ignore regulatory concerns. It's just a matter of power and bandwidth (and humorless government agencies and lawyers, if you ignore regulatory concerns). – TimWescott Apr 01 '23 at 15:52
  • 8
    I would also add that, since the system is unidirectional, a slower transmission rate helps keep the error rate down. You don't have the luxury of having the TV set ask the remote for a retransmission if it detects a bad packet. And in such an application even a single error is extremely annoying for the average user. – LorenzoDonati4Ukraine-OnStrike Apr 02 '23 at 11:47
  • I think that instead of "protocol" you wanted to say "modulation". – pabouk - Ukraine stay strong Apr 02 '23 at 14:23
  • @pabouk-Ukrainestaystrong I used the word protocol a handful of times, which one do you mean? – Hearth Apr 02 '23 at 15:22
  • I don't see any *bit* word in your answer. – Muhammad Ikhwan Perwira Apr 02 '23 at 17:33
  • @MuhammadIkhwanPerwira What do you mean? – Hearth Apr 02 '23 at 18:56
  • Weren't the remote control systems of the 1970s designed to run on ultrasonic? That would be a lot lower bandwidth than IR. – Mark Ransom Apr 02 '23 at 19:06
  • @MarkRansom There were ultrasonic ones first, yes. From what I found with some cursory research, the late 70s is also when IR ones started to be developed. I don't know if any were commercially available until the early 80s, but the technology was developed in the 70s. – Hearth Apr 02 '23 at 21:58
  • 1
    My point was that ultrasonic came first and had a very limited bitrate. The protocol for IR was probably copied verbatim, although I don't know that for a fact. – Mark Ransom Apr 02 '23 at 22:27
  • I was referring to the overall logic of your answer, not specific instances of the word. Modulation directly determines the possible bitrates of communication. The protocol uses the modulation with the available bitrate. Thus, a protocol is already limited by the modulation. You can see https://en.wikipedia.org/wiki/Modulation and https://en.wikipedia.org/wiki/Communication_protocol – pabouk - Ukraine stay strong Apr 03 '23 at 07:27
  • 3
    @pabouk-Ukrainestaystrong Yes, but the choice of modulation is a part of the protocol. – Hearth Apr 03 '23 at 13:11
15

Even free-space IR connections can be quicker than you think - or at least they were 20 years ago.

Consider some of the IrDA standards for line-of-sight IR. They can get into the hundreds of Mbit or even Gbit range.

When IrDA ports were first common, they had rates comparable to the high end of RS232, which was good for the time (late 90s). By the time I last made good use of IrDA, overlapping with early Bluetooth, IR connections were often quicker in practice - but you needed a continuous line of sight between devices, for a beam you couldn't see, which was inconvenient for large file transfers (I used to sync MP3s onto a PDA over IR from my desktop).

Chris H
  • 2,331
  • 11
  • 18
  • I remember using my Nokia 6230 to forward files from my friends' Sony Ericssons that use IrDA to other phones that use bluetooth and vice versa. It wasn't common for phones to have both IrDA and BT at that time because BT was still new – phuclv Apr 04 '23 at 06:32
  • @phuclv I never had a phone with both, but I did use a PDA with both. At the time I had nothing else with BT but my laptop had IrDA – Chris H Apr 04 '23 at 08:01
8

You are comparing apples to oranges. When considering the bitrate of a communications channel, you need to consider not only the frequency of the medium it's travelling through but also the bandwidth (the amount of spectrum) and the modulation scheme.

So to make a proper comparison between Wi-Fi and infrared, you'd also need to consider fiber-optic links or laser systems.
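
As a rough sketch of the "bandwidth plus modulation" point (all symbol rates and constellation sizes assumed for illustration), the bitrate is simply the symbol rate times the bits carried per symbol:

```python
import math

def bitrate_bps(symbol_rate_baud, constellation_size):
    """bitrate = symbol rate * bits per symbol, where bits per symbol = log2(M)."""
    return symbol_rate_baud * math.log2(constellation_size)

# Assumed, illustrative figures:
ook_remote = bitrate_bps(1_000, 2)            # ~1 kBd on-off keying, 1 bit per symbol
wifi_ofdm  = bitrate_bps(48 * 250_000, 256)   # ~48 subcarriers at 250 kBd, 256-QAM (8 bits/symbol)

print(f"On-off-keyed IR remote : {ook_remote / 1e3:6.1f} kbit/s")
print(f"WiFi-like OFDM         : {wifi_ofdm / 1e6:6.1f} Mbit/s")
```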

Voltage Spike
  • 75,799
  • 36
  • 80
  • 208
3

The bitrate of an optical IR communication system functionally similar to a WiFi connection is limited by multipath interference: light propagating directly to the receiver arrives sooner than light scattering off a wall or ceiling. If the bit period is on the same order as that difference in propagation time, the paths interfere, and this limits the bit rate.
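
A back-of-envelope sketch of that limit, assuming a 3 m direct path and a 5 m wall-bounce path (real systems push past this with equalization or OFDM):

```python
C = 3e8                     # speed of light, m/s

direct_path_m  = 3.0        # assumed straight-line path to the receiver
bounced_path_m = 5.0        # assumed path reflecting off a wall or ceiling

delay_spread_s  = (bounced_path_m - direct_path_m) / C
max_symbol_rate = 1 / delay_spread_s    # symbol periods shorter than the spread smear into each other (ISI)

print(f"Delay spread       : {delay_spread_s * 1e9:.1f} ns")
print(f"Rough symbol limit : {max_symbol_rate / 1e6:.0f} Msymbol/s")
```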

D Duck
  • 2,041
  • 1
  • 8
  • 18
1

WiFi uses extensive negotiation and handshaking processes to maximize channel efficiency. This makes it possible to achieve very high data rates, but at the expense of increasing the time required to establish a connection.
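
A minimal sketch of that trade-off, with an assumed half-second association time and remote-control-sized payloads (all numbers illustrative):

```python
def time_to_deliver_s(payload_bits, setup_s, raw_rate_bps):
    """Total latency = connection setup time + time on air for the payload."""
    return setup_s + payload_bits / raw_rate_bps

BUTTON_PRESS_BITS = 32

wifi_like = time_to_deliver_s(BUTTON_PRESS_BITS, setup_s=0.5, raw_rate_bps=100e6)  # assumed association/auth time
ir_remote = time_to_deliver_s(BUTTON_PRESS_BITS, setup_s=0.0, raw_rate_bps=500)    # starts blinking immediately

print(f"WiFi-like link : {wifi_like * 1e3:7.1f} ms to deliver one button press")
print(f"IR remote      : {ir_remote * 1e3:7.1f} ms")
```

Even at a raw rate hundreds of thousands of times lower, the remote delivers its few dozen bits sooner because it never has to negotiate anything first.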

Further, WiFi expects that while a channel is being used for transmission, the signal received from the transmitter on that channel will be stronger than any signals from any other sources that might simultaneously be received on that channel.

Infrared remotes need to be able to send data almost instantly, even when incandescent lights within the sensor's field of view are brighter than the remote control.

Typical remote control designs prioritize cost, robustness, and "instant" operation, over communications bandwidth. RF remote controls for things like garage door openers often have designs that are much closer to those of infrared remote controllers than to those of WiFi systems, and have communications rates which are likewise closer to those of infrared remotes than to WiFi. On the flip side, infrared communications can be very fast when using bidirectional communications links that allow receivers to continuously recalibrate themselves to recognize more complex forms of modulation.

supercat
  • 45,939
  • 2
  • 84
  • 143
-1

Because WiFi can go through walls.

Yes, WiFi does get blocked, or significantly degraded, by some types of walls and other objects. But infrared can be blocked by almost any solid object except clear glass or clear plastic. The usefulness of WiFi isn't just the raw speed; it's that it can easily be used in multiple rooms and without a direct (or near-direct, such as via a mirror) line of transmission. As others have pointed out, where infrared can be used at high speed and over long distances is through optical fiber. But if you could connect some "thing" between your computer and router, you could use optical fiber or perhaps copper wires (like Ethernet) and wouldn't need WiFi.

As a result, infrared through the air for consumer use is generally limited to remote controls and other low-bandwidth devices, with WiFi or copper wires or fiber used for higher speed where needed.

  • 6
    This is more an answer to "why don't we use free-space IR communications instead of wifi" than "why does wifi have a higher bitrate than IR". It's *true*--and "we don't use free-space IR for high-speed communications because it requires line-of-sight" is definitely a good answer to this question--but makes a bit of a leap of logic that may be hard to follow. You may want to clarify. – Hearth Apr 02 '23 at 16:08