10

Because Bluetooth earbuds struggle in my home, due to its layout and electronic noise, I hoped to switch to WiFi technology, which has a good signal throughout the house. Either to connect earbuds and a mobile device directly, perhaps with some app, or some kind of WLAN enabled connectivity.

Unfortunately, doing some searching, I have found no such earbuds so far, only WiFi soundbars and speaker sets, like those from Sonos.

Why is that? Is WiFi not technically feasible for earbuds, or is the dominance of Bluetooth impeding other hardware from challenging it?

ocrdu
thegreatwhatsit
  • WiFi stack is much more complex than BT. – Eugene Sh. Nov 23 '22 at 20:18
  • Which then probably increases power draw? Well, there was bound to be a reason. Too bad it's usability. ;) – thegreatwhatsit Nov 23 '22 at 20:27
  • WiFi is a power hog; your earbuds would need to plug into a power brick. – Neil_UK Nov 23 '22 at 20:39
  • @Neil_UK not sure I'd agree at that level of generality. I'm not convinced a BT system uses fewer joules per bit for the same distance if there are no constraints on latency and the data volume is asymptotically infinite. But yeah. Sleeping with a BT device is extremely efficient, the per-packet overhead is lower, and there's no need to wait for a full 20/40/80 MHz channel to be free before use – for our use case here, Wifi's the hog :) – Marcus Müller Nov 24 '22 at 09:05
  • BTW I've found big bluetooth headphones to work better in my house than little in-ear ones (both cheap things), and far better than a little receiver/adaptor used to run wired headphones over BT. The best I get with big headphones is about 5 m through walls and floor (maybe 8 m if I face the right way), while with the receiver I'm lucky to get 3 m line-of-sight, and it cuts out if I stand in the way. – Chris H Nov 24 '22 at 11:28
  • Vague thoughts about a test: streaming audio over WiFi to a Raspberry Pi with wired headphones, powered off a battery pack. – Chris H Nov 24 '22 at 11:29
  • @user253751 which doesn't imply much, because their physical and medium-access layers are fundamentally different. It's not even true that they get the same interference: BT is a relatively narrowband FHSS system, whereas (modern) wifi is wideband OFDM, so the interferers do not manifest in similar ways. – Marcus Müller Nov 24 '22 at 13:06
  • @user253751 They use the same band, yes. But they do not get the same interference and do not use frequencies the same way. For example, part of the power in BT is in rapid frequency hopping. Your statement is way too broad. – Mast Nov 24 '22 at 13:40
  • Traditional wifi also runs on the same band as bluetooth (2.4 GHz), so if the bluetooth signal is weak, the wifi signal may not be stronger (and 5 GHz is shorter-range, AFAIK). – JoSSte Nov 25 '22 at 15:51

4 Answers

35

Because wifi is the wrong technology for audio transport.

Wifi's highly nondeterministic, large-burst approach to data networking requires a large receive buffer. That inherently means high latency when you optimize for throughput – exactly what you want to avoid in an audio application.
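To make the buffering argument concrete, here is a back-of-envelope sketch; the buffer size and bitrate are illustrative numbers of my own, not figures from any Wifi or Bluetooth specification:

```python
# How much latency a receive buffer adds: a jitter buffer holding B bytes of a
# stream at R kbit/s delays playback by the time it takes to fill that buffer.

def buffer_latency_ms(buffer_bytes: int, bitrate_kbps: float) -> float:
    """Milliseconds of audio that `buffer_bytes` holds at `bitrate_kbps`."""
    bytes_per_ms = bitrate_kbps * 1000 / 8 / 1000  # kbit/s -> bytes per millisecond
    return buffer_bytes / bytes_per_ms

# A throughput-friendly 64 KiB buffer on a 256 kbit/s audio stream
# adds roughly two seconds of delay before the first sample can play:
print(f"{buffer_latency_ms(64 * 1024, 256):.0f} ms")
```

The same arithmetic run the other way shows why low-latency audio forces tiny buffers: staying under 20 ms at that bitrate leaves room for only about 640 bytes of buffered audio.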

This explains why it exists for soundbars: in playback-only applications, either the delay doesn't matter at all (pure audio playback) or it can be compensated for by delaying the video playback to match the audio. That doesn't work for phone calls or games, where you can't simply insert that delay between the video- and audio-rendering devices.

Wifi is also optimized for a different kind of coverage, as you noticed – and therefore less optimized for what matters most in earbuds: battery life in a short-range, small-packet scenario. The throughput advantage Wifi has when transporting large bursts of data comes at a steep power-budget price when your data actually consists of small packets sent at short, regular intervals.

Furthermore, there may be a commercial aspect: Bluetooth transmitters come in different classes, with different transmit powers and hence different ranges. But a long-range Bluetooth device whose battery drains faster might simply not be commercially attractive, so these modes are not what you find in every earbud.

Also, you might be overestimating how well Wifi actually works in your area of coverage: you probably don't count packets that fail to arrive on the first attempt, or within a tight time window, as failures – that's not how Wifi works; your station simply asks for a broken packet to be resent. That doesn't work well for audio streaming: if you miss the reception window for the next piece of audio buffer, you can't play it, and you get dropouts. Wifi can also work reliably over larger distances by using stronger error-correcting codes, but those codes are often very long – meaning you must receive a lot of data before you can correct errors (after which it works pretty well). That introduces additional latency.
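The retransmission problem can be put in toy-model form. The loss rate, per-retransmission delay, and play-out deadline below are invented parameters purely for illustration (real Wifi retransmits far faster at the MAC layer), but the mechanism is the one described above: a packet that needs too many retries misses its play-out slot and becomes a dropout.

```python
# Toy model: each failed delivery attempt costs one retransmission round-trip;
# a packet whose retries push it past the play-out deadline is a dropout.

import random

def simulate_dropouts(n_packets: int, loss_prob: float,
                      retx_ms: float, deadline_ms: float, seed: int = 0) -> int:
    """Count packets whose retransmissions push arrival past the deadline."""
    rng = random.Random(seed)
    dropouts = 0
    for _ in range(n_packets):
        elapsed = 0.0
        while rng.random() < loss_prob:  # this delivery attempt failed
            elapsed += retx_ms
            if elapsed > deadline_ms:
                dropouts += 1
                break
    return dropouts

# 10 % packet loss, 10 ms per retransmission, 20 ms of play-out slack:
print(simulate_dropouts(10_000, 0.10, 10.0, 20.0), "dropouts in 10000 packets")
```

With these numbers only packets needing three or more retries drop (probability 0.1³ per packet), which is why audio over a best-effort link degrades gracefully until loss spikes, then falls apart.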

Finally, don't forget that antenna size in earbuds is very limited. I won't guarantee it, but with the same antenna sitting right next to a large fleshy, watery absorber (a human head), you might simply find the wifi coverage underwhelming as well.

Marcus Müller
  • I'm just recollecting all the work that went into the Seiko Message Watch (when I was working with their team). The receiver soaked up most of the battery life (which was almost 2 years on a button battery, if you can believe it – they worked very hard on this). I'm guessing that Bluetooth has a market-driven need for lower power in some cases while Wifi does NOT have such market-driven pressure, so there may not have been similar engineering investments in lower-power receiver ICs. That's what popped to mind when reading your writing. – jonk Nov 23 '22 at 20:35
  • that is very true as well, though I'd presume that the billionfold smart-phone proliferation has, at the top end of things, pushed toward things being *very* power efficient. For example, when you look at an Intel wifi chipset, that thing can receive on a different channel while it transmits. Crosstalk and consequently nonlinear hilarity in the receiver ensues. Now, luckily, the chip knows its transmit signal and can cancel it (there's quite a bit of math involved), but that also inherently means it can deal with its own receiver nonlinearities better – and the most linear are the least efficient. – Marcus Müller Nov 23 '22 at 20:40
  • I'm sorry to say this, but your otherwise nice answer is contaminated with some… bovine faeces. Latencies "far above 40 ms" are nowhere near normal for any wifi network (even 802.11a/b would do better). I'm writing this on an ordinary laptop connected to an ordinary WiFi network with LAN round-trip latencies around 2-3 ms. And due to some recent global events, there's a counterexample to your "voice won't work over WiFi" in the form of probably a billion people with first-hand experience of frequent live voice and video conferencing, often involving WiFi. – TooTea Nov 23 '22 at 20:40
  • @TooTea rest assured I'm sitting on PC-style hardware connected to my home network as well, and my ping latencies are also much lower than 20 ms on average. But let another station actually use my channel, and the variance of my RTT increases – especially if I'm the far station! This should get a lot worse the more regularly I use the channel, as other stations then can't, and at some point I will need to yield the shared medium quite a bit. – Marcus Müller Nov 23 '22 at 20:42
  • Yes, I suppose wifi would require a distinct audio standard which would tolerate errors to maintain low latency, instead of correcting them. Which would mean the quality of the connection would probably not be much better than Bluetooth. Thank you. Excellent explanation. – thegreatwhatsit Nov 23 '22 at 20:43
  • @TooTea by the way, I marked your comment as helpful – it contains a good and correct perspective, and I think I might actually be exaggerating the problem in the answer, whereas the contention issue in Wifi is a bit realer than we admit here. It's also worth noting that BT doesn't solve the non-cooperative-devices case at all, other than by hopping around aggressively in hopes of a low probability of collision. (But also, the wider the reach in a non-cooperative scenario, the higher the contention, and thus the number of collisions.) – Marcus Müller Nov 23 '22 at 20:49
  • Re: contention, I have recently seen someone on Reddit's r/homelab state that they were seeing over a hundred different SSIDs in their apartment. By contrast, Bluetooth has extremely short range, while device density in a given area is similar – so there's far less contention. Most BT devices would probably not be able to communicate through a single wall. – jaskij Nov 24 '22 at 08:55
  • @TooTea the thing with video calls over WiFi is that both audio and video are transmitted together – so whatever delay you get, it is common to both signals, and thus unnoticeable. With BT (or the hypothetical WiFi audio OP is asking about), you add an additional delay *just to the audio* – which is a completely different thing. The threshold where a common delay starts being a bother in a call is well above 100 ms. As a data point, it's not uncommon to see 50 ms or even 100 ms of input lag in TVs – which is unnoticeable because audio and video are kept in sync. – jaskij Nov 24 '22 at 09:01
  • @jaskij Don't forget that wifi and bluetooth use the same 2.4 GHz band. A wifi packet will completely swamp a bluetooth packet if they happen to arrive at the same time on the same frequency. The fact that bluetooth is short-range means nothing if there's lots of 2.4 GHz wifi traffic: bluetooth's low power doesn't help if it's being swamped by lots of huge high-power wifi packets. – canton7 Nov 24 '22 at 10:17
  • Bluetooth 5 has a *best-case* latency of 20 ms, I believe, and bluetooth in general has higher latencies than wifi. Everything comes down to power consumption: low-power operation is a huge consideration for bluetooth, and it drives a lot of the design, including deliberate compromises which increase latency; but power consumption isn't really a consideration for wifi at all. The headphones wouldn't last any time at all if they were wifi. – canton7 Nov 24 '22 at 10:30
  • @canton7 I think that 20 ms comes from the audio codecs used; the medium-access and RF PHY latencies are more on the order of 200 µs, IIRC. (Codecs would still apply to IEEE 802.11-transported data.) (I'm not 100% sure about µ vs m, but 200 ms would sound very excessive.) – Marcus Müller Nov 24 '22 at 10:36
  • @canton7 same band, but do they use the same channels within that band? – jaskij Nov 24 '22 at 10:46
  • @jaskij there are no common "channels" within that band that apply across different technologies. Wifi uses channels from the full ISM/unlicensed band, and Bluetooth hops across the entire band. So, yes, they do collide. – Marcus Müller Nov 24 '22 at 10:47
  • @jaskij They're each doing their own thing within the 2.4 GHz band – Bluetooth divides it into 40 channels which it hops between, and wifi divides it into 13-ish channels which it does its own thing with. If you're watching bluetooth transmissions on a spectrum analyzer, you'll see wifi packets crash in and completely clobber them occasionally. – canton7 Nov 24 '22 at 11:09
  • There is also a significant user-interface issue. Bluetooth has pairing. How would you configure wifi on your tiny little earbud? There is no easy way to put in the SSID and password. Perhaps this could be overcome, but I don't see an obvious way. – user57037 Nov 25 '22 at 18:47
  • @mkeith surely WPS (supposedly mandatory on all devices) would do the trick? – somebody Nov 25 '22 at 23:54
  • So WiFi works A-OK for audio streaming – and indeed people use WiFi with all kinds of devices for audio streaming regularly. So I think it is absurd to say "WiFi is the wrong technology for audio". However, the question is about earbuds specifically, and that raises questions about size and power requirements, plus mechanisms for pairing/addressing and control, etc. There are now ultra-low-power, nearly microscopic WiFi chips available (since 2020), so if the addressing/pairing issue could be worked out, then WiFi earbuds are entirely viable and would work just fine. – Greg A. Woods Nov 25 '22 at 23:59
  • @somebody I have never used WPS in my life. I didn't even know it existed; I had to use a search engine. Many times when I connect to wifi it would not be practical to use WPS (hotels and cafes, for example). I guess if the audio source had its own AP, you could activate WPS on the audio source. I can see how it could fit the bill. – user57037 Nov 26 '22 at 00:02
  • @mkeith I'm not convinced using wifi earbuds in hotels and cafes is particularly desirable... sounds like anyone would be able to play music on your earbuds – unless you create yet another protocol to pair your device with the earbuds over wifi. – somebody Nov 26 '22 at 00:06
  • @somebody yes, I agree. Remember, I am arguing that there is a significant UI issue in using earbuds with wifi. So yeah, it may not be practical or desirable. UNLESS the audio source is the access point and supports WPS. Then it might not be too bad: initiate WPS, pair, and off you go. Assuming the AP maintains the same SSID, you wouldn't even need to pair again later. – user57037 Nov 26 '22 at 00:20
14

Although the answer from Marcus Müller is good, he leaves out an important factor: standardization. This is not the same thing as market share. Wi-fi is a general-purpose protocol designed to carry Internet data. Bluetooth is a function-oriented protocol designed to connect devices that do specific things. Bluetooth has a particular protocol designed to transmit audio. This provides a predefined way for the devices to agree on how to do all kinds of things: audio bitrate, format, channels for stereo and microphone, DRM, and user controls (pause, fast forward, etc).

Wi-fi doesn't have any of this. All of it would have to be implemented in software by the component developer, driving up the cost and increasing the chance of bugs. And since it would be implemented in software, this would seriously increase power consumption on both ends of the connection.
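As a hypothetical illustration of what that software would have to cover, here is a sketch of the session state two Wi-Fi audio devices would need to negotiate from scratch. The class and field names are invented for this example; Bluetooth's audio profiles already define equivalents of all of them.

```python
# Everything Bluetooth's audio profiles standardize would have to be reinvented
# for a Wi-Fi earbud. A sketch of the parameters both ends must agree on:

from dataclasses import dataclass

@dataclass
class AudioSessionParams:
    codec: str            # e.g. "sbc" or "aac" -- Bluetooth defines the set; Wi-Fi doesn't
    sample_rate_hz: int   # e.g. 44100 or 48000
    channels: int         # 1 = mono, 2 = stereo
    bitrate_kbps: int     # stream bitrate both sides can sustain
    latency_ms: int       # reported so the player can delay video to match

# A Bluetooth stack negotiates all of this automatically at connection time;
# a Wi-Fi earbud vendor would have to design, implement, and ship the protocol:
params = AudioSessionParams("sbc", 44_100, 2, 328, 150)
print(params)
```

And that is before user controls (pause, skip, volume), microphone back-channels, and DRM, each of which Bluetooth also specifies and a Wi-Fi vendor would also have to invent.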

This is where the standardization comes in. Any Bluetooth peripheral can connect to any Bluetooth-enabled computer or phone and they will just work without any particular software. To use Wi-Fi, the manufacturer would have to develop driver-level software for half a dozen different operating system platforms (Windows, Mac, Android, iOS, PlayStation, Xbox, Nintendo, Linux...), convince the platform's developer to allow it (probably impossible on the more tightly controlled platforms), convince media companies that it meets their DRM standards, and convince the user to install the software. Lots of trouble.

A Wi-Fi enabled soundbar will usually come with its own built-in server that a quickly and easily developed app can connect to, or will function as a client to something like Spotify or Alexa; in essence they just have a built-in app that is a normal client for those services. I am not aware of any soundbars that use Wi-Fi outside of this approach - they usually use a traditional audio connection like SPDIF, HDMI, or even Bluetooth for their general-purpose audio input.

But I think your premise that Wi-Fi just works better in your environment is probably also flawed. You might just have unrealistic expectations for your Bluetooth connection. Bluetooth is a "personal area network" whereas Wi-Fi is designed for longer range and higher power. The concept of Bluetooth is that your earbuds can communicate with your phone in your pocket or your computer on your desk - it's not designed to work from the other side of the house.

As for latency, as others have said, Wi-Fi's latency performance is probably better than Bluetooth's most of the time. But one of the things the Bluetooth protocol does is allow the devices to negotiate the latency, so the player software can delay the video to match. On my earbuds, there is a noticeable synchronization difference between software that knows how to do this correction (Youtube) and software that doesn't (Twitch).
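The compensation described above can be sketched in a couple of lines; this is a minimal illustration of my own, not an actual Bluetooth or player API:

```python
# If the audio sink reports its latency, the player schedules each video frame
# that much later, so picture and (delayed) sound arrive at the user together.

def schedule_video_ms(video_pts_ms: float, reported_audio_latency_ms: float) -> float:
    """Presentation time for a video frame, compensated for the audio sink's delay."""
    return video_pts_ms + reported_audio_latency_ms

# A sink reporting 150 ms of audio latency shifts every video frame by 150 ms:
print(schedule_video_ms(1000.0, 150.0))  # 1150.0
```

Software that skips this step plays video on its original timeline while audio arrives late, which is exactly the lip-sync drift described above.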

So, to sum up, Wi-Fi just isn't designed for this, and Bluetooth is. Doing it with Wi-Fi would be reinventing the wheel, and it probably wouldn't get any rounder.

fluffysheap
  • This is an excellent point, but soundbars, TVs, DVRs, and media-center software for PCs have long standardized on DLNA, so there *is* a standard. You *don't* need specific software! The point still stands, because the BT audio standardization was actually much more successful, but that might be mutually dependent on the commercial success of BT hands-free headsets and speakers as well as smart phones. – Marcus Müller Nov 24 '22 at 13:29
  • For example, if you have a modern smart TV and a media center (like a PS4) on the same network, you can control one with the other and play media from one on the other. – Marcus Müller Nov 24 '22 at 14:01
  • "This is where the standardization comes in. Any Bluetooth peripheral can connect to any Bluetooth-enabled computer or phone". Eh, no. I remember a meeting from GENIVI (automobile electronics, now called the Connected Vehicle Systems Alliance). Compared to WiFi, Bluetooth is a proper mess. You complain about "drivers for half a dozen OSes". We knew back in 2010 that we had to deal with hundreds of device profiles. Bluetooth Low Energy apparently is a whole lot better, at the expense of 0% compatibility with old BT. (Devices that do both basically need two BT stacks.) – MSalters Nov 24 '22 at 15:11
  • "Wi-fi doesn't have any of this. All of it would have to be implemented in software ... And since it would be implemented in software, this would seriously increase power consumption on both ends of the connection." – This is not true. The Bluetooth stack is also implemented in software. AFAIK there is no *hardware* (VHDL or similar) implementation of a Bluetooth stack. – brhans Dec 19 '22 at 14:54
3

One consideration is that using Wifi pretty much mandates using the Internet Protocol.

There's nothing about Wifi itself that mandates this, but most of the devices you might want to connect to will. On Android, for example, the only way for a normal app to access Wifi is through the networking API; you don't get access to layer 2. That means any device that wants to connect to an Android phone over Wifi must implement the Internet Protocol, and at least UDP on top of it.
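To put a rough number on what the IP requirement costs per packet, here is a quick sketch. The header sizes are the standard IPv4 and UDP minimums; the 80-byte audio frame is an invented example size, and link-layer framing is excluded entirely:

```python
# Back-of-envelope: per-datagram overhead of shipping small audio frames
# as UDP/IPv4, which is the minimum an Android-compatible device must speak.

IPV4_HEADER_BYTES = 20  # minimum IPv4 header, no options
UDP_HEADER_BYTES = 8    # fixed UDP header

def udp_overhead_fraction(payload_bytes: int) -> float:
    """Fraction of each datagram spent on IP+UDP headers (link layer excluded)."""
    headers = IPV4_HEADER_BYTES + UDP_HEADER_BYTES
    return headers / (payload_bytes + headers)

# A small 80-byte audio frame loses about a quarter of the datagram to headers:
print(f"{udp_overhead_fraction(80):.0%}")  # 26%
```

Small, frequent packets are exactly the audio use case, so this fixed 28-byte tax is paid on every frame, on top of the protocol-stack complexity itself.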

And that extra overhead can mean a bulkier microcontroller with higher power consumption, which in turn could mean a bulkier battery.

Rodney
  • There's a nice point made here: BT is designed with one master connected to multiple devices that have nothing to do with each other; wifi's architecture is optimized for networks with packetized data. That has consequences throughout their whole stacks! – Marcus Müller Nov 26 '22 at 10:39
-10

The reason is that it would cook your brain alive. There is no way it is physically possible to use WiFi waves like that; they are way too strong. Think about how they are supposed to pass through concrete walls and all. Using WiFi waves to operate earbuds is like trying to use a wrecking ball to make a nail hole for hanging a picture on the living-room wall.

Chad Branzdon
  • Cellphones have wifi, and people hold them next to their heads all the time. Why doesn't the wifi in the cell phone cook their brains alive? – user57037 Nov 25 '22 at 18:56
  • @mkeith A cellphone makes an open circuit, which is like touching a single terminal of a car battery. Or actually more like touching a live AC wire while standing isolated from the ground, so you get a little but not lethal shock because of your non-zero capacitance (cellphones aren't entirely harmless). Putting in earbuds places your brain between the terminals, which is like your wet left hand touching the cathode and wet right hand touching the anode, where as little as 100 mA passing through your meat pump terminates its operation and shuts your living function off, forever. – Chad Branzdon Nov 25 '22 at 20:43
  • Earbuds are not conductive. They are not "terminals." So I don't see how they could cause 100 mA to pass through your heart. I do not see how they resemble putting your wet left hand on a cathode and wet right hand on an anode. I think you are spouting nonsense. – user57037 Nov 25 '22 at 21:13
  • "Cellphone makes an open circuit which is like touching a single terminal of a car battery.... Putting earbuds places your brain between the terminals" Not quite. All of these devices are radio transceivers. In transmit mode they all put out radio waves, some of which get absorbed by the body. If the energy absorbed gets too high, then the body starts 'cooking', microwave-oven-style. The output level of wi-fi or Bluetooth is not enough to cook the body. Stand in front of a microwave uplink tower for rural TV, then cooking happens. – Triplefault Nov 25 '22 at 21:25
  • @mkeith Earbuds put radiation flux through the brain, not the heart. The 100 mA current flowing through the heart was meant to be just an analogy to a strong EM radiation flux flowing through the brain, which is what you apparently missed. So you might want to read more carefully next time before you start dismissive name-calling accusations of "spouting nonsense". – Chad Branzdon Nov 25 '22 at 22:58
  • @Triplefault I might not agree with all of what you said, but I wish the other user were as civil as you are in your response. – Chad Branzdon Nov 25 '22 at 23:00
  • But why would the wifi earbuds beam energy directly to each other through your brain in the first place? Presumably they would just be transmitting to the wifi router or access point or whatever. So it would be like holding one cellphone to your right ear and another one to your left ear at the same time. Are you of the opinion that using cell phones like that would cook your brain alive? – user57037 Nov 25 '22 at 23:21
  • mkeith is correct, though there was a bit too much edge in one comment. Wi-fi and Bluetooth transmitters don't complete a circuit. They merely emit radio waves, which hopefully get picked up by a receiver somewhere. Energy gets transmitted whether there is a receiver or not, and the transmitted energy is what may damage the body, if powerful enough. https://www.fda.gov/radiation-emitting-products/cell-phones/do-cell-phones-pose-health-hazard#:~:text=Cell%20phones%20emit%20low%20levels%20of%20non%2Dionizing%20radiation%20when,increases%20cancer%20risk%20in%20humans. – Triplefault Nov 26 '22 at 16:32
  • This answer would be improved by a credible citation. Would it be possible to add a link to some reputable experimental evidence? – SusanW Nov 27 '22 at 14:52