I'm someone who actually likes reading standards for fun (sometimes there are interesting things in them), and have read many of them as a result[1], and even implemented parts of some; but I still find Bluetooth, along with WiFi and GSM/LTE, very intimidating.
Is there something about wireless in particular that lends itself to this proliferation of complexity? I've read 802.3 (Ethernet) and it was nowhere near as dense and intimidating.
[1] MP3, JPEG, JPEG2000, JBIG2, H.261 to 264, MPEG 1 and 2, USB, SATA, IEEE1284, RS-232, a bunch of RFCs, and too many others to list, those are just the ones I remember reading recently...
Somehow you'd think it would also benefit participants if the standard were reliable and fulfilled end users' requirements. It is not theoretically impossible to have wireless headphones with low enough latency for gaming.
It's true that only paying companies (associate membership or higher) can participate in working groups and contribute to the specifications, but once a specification has been adopted it becomes public.
You should try USB 3.2, which is 1500 or 3000 pages. The original Intel version of USB 1.0 was a nice, comprehensible 150 pages.
And that is excluding the Type-C, Type-A and all the other connector specs.
And there is the newest WiFi, 802.11ax or WiFi 6. Even earlier this year, as I skimmed through it, the draft still had 2000+ unresolved comments. (One reason I am looking forward to 802.11be fixing this mess instead.)
Sometimes I do miss the days of the parallel port; everything was so simple. That may just be me getting old.
Wireless has a lot of stuff going on.
Layer 1 of wireless is split into many different levels.
This, combined with the fact that interoperability is paramount but there are only a few providers, makes for complicated standards. The interop-between-few-providers problem means the standards have to bow to actual usage a lot more.
This applies a lot more to 3gpp (cellular) stuff than WiFi though. Especially because 3gpp is commercial only, is a lot more 'trust the network', and cares about billing. Moreover, the interop is meant to give a pretty seamless roaming ability, whereas e.g. 802.3 is only about working on a single subnet.
Moreover, with WiFi, which partially uses time division (like Ethernet), you cannot have two devices sending at the same time. However, unlike old-style shared-medium Ethernet, you can have two devices that are out of range of each other but both reaching your WiFi router. Their transmissions then scramble each other at the router, but the devices themselves can't detect it.
Hence you cannot do CSMA/CD (collision detection) but need CSMA/CA (collision avoidance). This means you get ACK messages essentially at the bottom of the protocol stack, at the MAC layer.
In general, it might be the Multiple Access model that makes wireless hard.
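The hidden-node situation above can be sketched in a few lines. This is a toy model (hypothetical three-node topology, no real PHY behavior), just to show why carrier sensing alone fails: both stations see an idle channel, transmit, and only the router experiences the collision.

```python
# Toy hidden-node demo: stations A and B can both reach the router R,
# but are out of range of each other, so carrier sensing at A never
# detects B's transmission (and vice versa). Topology is made up.

in_range = {
    ("A", "R"), ("R", "A"),
    ("B", "R"), ("R", "B"),
    # no ("A", "B") edge -- A and B are hidden from each other
}

def channel_busy_at(listener, transmitters):
    """Carrier sense: the channel looks busy to `listener` only if
    some currently active transmitter is within its range."""
    return any((tx, listener) in in_range for tx in transmitters)

# Both A and B sense the channel before sending, see it idle, transmit.
transmitters = []
for station in ("A", "B"):
    if not channel_busy_at(station, transmitters):
        transmitters.append(station)

# Each sender still hears a clean channel, so CSMA/CD cannot work...
senders_detect_collision = any(
    channel_busy_at(tx, [t for t in transmitters if t != tx])
    for tx in transmitters
)
# ...but at the router both frames overlap and are lost.
collision_at_router = len(transmitters) > 1 and all(
    (tx, "R") in in_range for tx in transmitters
)

print(transmitters)               # ['A', 'B'] -- both sensed "idle"
print(senders_detect_collision)   # False -- neither sender can tell
print(collision_at_router)        # True  -- only the router knows
```

Since only the receiver can observe the collision, the sender needs an explicit ACK to learn whether its frame got through, which is exactly why 802.11 acknowledges frames so low in the stack.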
You stole my comment! But in all seriousness, this is a long-standing issue. The A) requirement to make BT-enabled devices backwards compatible, B) monstrous stack of protocols involved in nearly every use case, and C) assumption that spread-spectrum frequency hopping at the physical layer is a substitute for encryption all combine to make BT especially terrifying.
Take for example the Nike Fit Band, an iconic Bluetooth peripheral that's been on the market for some time:
What's your point about it being a dollar per chip? That it's implemented by super cheap labor or something?
Silicon is free; you are paying for the R&D. That's why volume matters. At the volume of Bluetooth chips, they are indeed almost free. Most of that dollar is distribution costs.
My point is, the cost of the chip isn't a signal of anything except its ubiquity.
True. The likes of Broadcom are hugely sophisticated and really, really understand the spec.
But _none_ of your consumer-level gadgets are made by Broadcom.
Broadcom will sell millions of these cheap-as-chips chips to second-tier players.
Who may sell on to third tier players.
Who buy or pull in a Bluetooth stack and sort of vaguely maybe know what they're doing, tie it all together, and get the BT compliance stamp (which sort of means it doesn't shit all over the RF spectrum and vaguely works)...
...and then they sell it to you.
i.e. the chip manufacturer hopefully has a clue... then it gallops rapidly downhill from there.
All Bluetooth chips come with SDKs and, depending on your solution, another API in the OS of your board. At that point clients don't need to know much about BT and, in my experience, don't care beyond "how do I make these bytes appear on the other side".
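Which is fair, but even the "just send these bytes" case bottoms out in a surprising amount of framing. A sketch of only the ATT layer (the PDU shape here, opcode then little-endian handle then value, follows the core spec's Write Request; the handle and payload values are made up):

```python
import struct

ATT_WRITE_REQ = 0x12  # ATT Write Request opcode (Bluetooth core spec, ATT)

def att_write_request(handle: int, value: bytes) -> bytes:
    """Build an ATT Write Request PDU: 1-byte opcode, 16-bit attribute
    handle (little-endian), then the attribute value. This is only the
    ATT layer -- below it the stack still adds L2CAP framing, and the
    link layer adds its own header, CRC, whitening and encryption."""
    return struct.pack("<BH", ATT_WRITE_REQ, handle) + value

# Hypothetical characteristic handle 0x0025, two-byte payload.
pdu = att_write_request(0x0025, b"\x01\x00")
print(pdu.hex())  # 1225000100
```

So "make these bytes appear on the other side" is exactly the abstraction the SDK sells you, and it is doing a lot of work.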
BT is massive, but that's mostly BT Classic, which is slowly going away as more and more chips are BLE-only, which is saner. That's why we're adding isochronous channels to keep it interesting.
Do you have any idea how bloody insanely complex bluetooth is?
Just the core stuff,
https://www.bluetooth.com/specifications/bluetooth-core-spec...
without the various protocol specs... https://www.bluetooth.com/specifications/protocol-specificat... or the GATT or ...
The bloody core spec is 3000 PAGES of standardese.
All in stuff that costs less than a dollar per chip.