AirPods Pro 2 — How Apple DESTROYS Bluetooth

$550 and the AirPods Max can’t even stream lossless audio. Never mind the $250 AirPods Pro. Why? Because they’re Bluetooth and — wait for it — Bluetooth is lossy. Well, lossy like trying to fit a river through a hose is lossy or… watching the cam version of No Way Home IMAX is lossy. You can have the best source in the multiverse but Bluetooth has such limited bandwidth, is such a narrow pipe, that you just can’t fit real, true lossless audio through it. Including Apple Music Lossless. Or hitting that subscribe button so we can build the best community in tech together… lossless.
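To put rough numbers on that narrow pipe, here's a back-of-the-envelope sketch in Swift; the roughly 2:1 lossless compression ratio is my assumption, not Apple's figure:

```swift
// Back-of-the-envelope math only; the compression ratio is an assumption, not Apple's number.
let channels = 2.0
let bitDepth = 16.0
let sampleRate = 44_100.0                    // CD quality, in Hz

let rawBitsPerSecond = channels * bitDepth * sampleRate
print("Raw CD-quality PCM: \(rawBitsPerSecond / 1_000_000) Mbps")   // ~1.41 Mbps

// Lossless codecs like ALAC typically shave that roughly in half, call it ~700 kbps,
// though it varies with the content.
let typicalLosslessKbps = rawBitsPerSecond * 0.5 / 1_000

// AAC over Bluetooth A2DP, the way AirPods stream today, tops out around 256 kbps.
let bluetoothAACKbps = 256.0

print("Typical lossless stream: ~\(Int(typicalLosslessKbps)) kbps")
print("Bluetooth AAC budget:    ~\(Int(bluetoothAACKbps)) kbps")
// The lossless stream just doesn't fit through the pipe.
```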

So, what do you do? I mean, besides hitting the button and cursing in the comments, obviously. Well, if you’re Apple… If you’re Apple, maybe you kill Bluetooth. MDK it just to watch it die. And then you replace it — or maybe superset it — with a silicon and signal solution that’s just way, way better.

That’s what Front Page Tech’s Jon Prosser said Apple was planning on doing — updating the AirPods, or AirPlay, or both, to support lossless audio. AirPlay being Apple’s wireless media streaming protocol. It’s what the iPhone uses to send video to the Apple TV or audio to the HomePod, for example.

I’ll get to how in a minute, but the why is super important. See, according to Apple lore, the original version of AirPlay was a weekend project, a hack, to get something like the old AirPort Express AirTunes protocol up and running for iOS.

It was really, really cool and compelling, and worked well enough at the time for Apple to ship it. But it was also extremely limited and inherited a ton of technical debt. Like a lot of Apple’s audio stack at the time. Something they found out the hard way when they wanted to ship that OG HomePod with features like… multi-room audio. So, AirPlay 2. Refactored to fit the needs of a way more modern media ecosystem. Ultra-low power Bluetooth to broadcast availability and negotiate connections, and ultra-high capacity point-to-point Wi-Fi to handle the actual media streaming.
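Conceptually, that split works something like this. Everything below is purely illustrative, my names, not Apple's actual APIs:

```swift
// A purely conceptual sketch of the AirPlay 2 split; illustrative names, not Apple's APIs.
enum LinkPhase {
    case advertise     // Bluetooth LE: sip power, announce "I'm here"
    case negotiate     // Bluetooth LE: agree on capabilities, keys, timing
    case stream        // point-to-point Wi-Fi: carry the actual audio
}

func nextPhase(after phase: LinkPhase) -> LinkPhase {
    switch phase {
    case .advertise: return .negotiate
    case .negotiate: return .stream
    case .stream:    return .stream   // stay on Wi-Fi while audio is playing
    }
}
```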

Which was really critical for a product like the OG HomePod, whose marketing pitch was based entirely on it sounding way, way better than Bluetooth speakers. And part of me still wonders if the reason Apple didn’t include Bluetooth on the HomePod was so that it could never, not ever, sound only as good as a Bluetooth speaker, not just in rooms, but in reviews, in Pepsi challenges all over YouTube, just blowing up that whole entire marketing pitch.

And yeah, I know, Apple spent millions on AirPlay 2 and Russia just took a pencil to space. By which I mean, why not just include an actual hard line-in for actual high-quality audio?

But all-caps love it, or just hate its breathing guts, Apple is all-in on the future being wireless. And for AirPods, which Apple was developing at the same time as the HomePod, AirPlay just wasn’t an option. Because, unlike the Apple TV or the HomePod, AirPods weren’t going to be plugged into a wall 24/7. They weren’t going to be plugged into anything, well, except for our ears and tiny, tiny batteries.

And while Wi-Fi can be more efficient than Bluetooth for specific use cases, basically race-to-sleep, or transmitting very fast bursts of data and then shutting the radio off to save power, streaming media is pretty much the exact opposite of that use case. And thanks to the way each protocol has been used, and how the chips and radios have evolved to support those uses, AirPods were limited to Bluetooth, its very, very low bandwidth, and all the lossy, lossy compression that came with it.
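To make the race-to-sleep point concrete, here's a toy comparison; all the power and throughput numbers are made-up placeholders, not measurements:

```swift
// Toy numbers to illustrate race-to-sleep; none of these are measured figures.
let wifiActiveMilliwatts = 300.0      // assumed Wi-Fi radio power while active
let bluetoothActiveMilliwatts = 30.0  // assumed Bluetooth power while streaming

// Bursty case: pull 5 MB over fast Wi-Fi, then shut the radio off.
let wifiThroughputMbps = 200.0
let burstSeconds = (5.0 * 8.0) / wifiThroughputMbps        // ~0.2 s of radio-on time
print("Wi-Fi burst: ~\(Int(wifiActiveMilliwatts * burstSeconds)) mJ, then the radio sleeps")

// Streaming case: the radio never gets to sleep, every single second.
print("Continuous Wi-Fi: \(Int(wifiActiveMilliwatts)) mJ per second")
print("Continuous Bluetooth: \(Int(bluetoothActiveMilliwatts)) mJ per second")
```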

Which brings us to Gary Geaves, Apple's VP of Acoustics, who’s legit awesome, and what he just told What Hi-Fi:

“We have to concentrate very hard on squeezing the most that we can out of the Bluetooth technology, and there’s a number of tricks we can play to maximise or get around some of the limits of Bluetooth. But it’s fair to say that we would like more bandwidth and… I’ll stop right there. We would like more bandwidth”, he smiles.

That first part involved spending some of Apple’s billions on one of the largest, most state-of-the-art audio labs in the world, just down from Apple Park in Cupertino. I got a tour back when the HomePod was released, and from the negative decibel… a quiet place… to rooms that could be molded and remodeled to match the acoustic profiles of a multitude of test environments… There’s just been a serious escalation in Apple’s physical and computational audio teams over the last few years. As all their recent headphone, speaker, and mic systems can attest.

But also in silicon. The original AirPods use the W1, Apple’s first wireless chipset. Basically tiny little computers in each pod, they’re what made the Bluetooth connections and synchronization so quick, easy, solid, reliable, low latency, and consistent. I mean, Bluetooth is still Bluetooth, so if you put enough of your ugly bag of mostly water body between, say, your iPhone and your AirPods, more than they can compensate for, you can still disrupt them. But, for anyone who previously lived their lives on the island of traditional Bluetooth, W1’s implementation was indistinguishable from radio magic.

For efficiency and specificity reasons, Apple eventually split W1 into W2 and W3, adding Wi-Fi handling, and integrating them into the Apple Watch system-in-package, or SiP. And then came H1, Apple’s first headphones chip, for the second-generation AirPods and, eventually, AirPods Pro and AirPods Max. H1 stuck to Bluetooth but became basically its own SiP, using up to 10 audio cores to support computation-heavy features like active noise cancellation, transparency mode, conversation boost, Dolby Atmos, and spatial audio with dynamic head tracking. But not lossless audio, never lossless audio, because it was still stuck to Bluetooth. And like Gary said… Apple would like more bandwidth.

And, just by way of showing how concerned Apple is about efficiency in the AirPods, the original HomePod used an A8 system-on-a-chip, same as the iPhone 6, and the HomePod mini uses an S5 system-in-package, same as the Apple Watch Series 5. If Apple isn’t recycling silicon but is instead spending actual, real, new money and resources on something custom for headphones, it’s because everyone from the execs on down considers it essential.

Which is why I wonder if Apple really could just flip a bit and turn on Wi-Fi-dependent AirPlay for existing AirPods, not just because of the power draw — the potential hit on battery life — but the system architecture itself. I mean, I’d all-caps love it, best surprise upgrade ever, but I’ll only ever expect it when I see it. And having not seen it in the year-and-however-long it’s been now since AirPods Pro and Max and Apple Music Lossless… those expectations are only dwindling.

Same with Apple going all-in on aptX, which is Qualcomm’s high-fidelity Bluetooth audio compression codec. Specifically, the recently announced aptX Lossless. Which, yes, all the nerd dreams and drool. All of it. But even though Apple and Qualcomm are playing nice with 5G modems these days, aptX Lossless is really aptX CD quality. 16-bit, 44.1kHz. Which is phenomenal for Bluetooth. Legit game-changing for Bluetooth. And a huge improvement for AirPods. But still Bluetooth. And Apple may want their dependency on Qualcomm to go only so far… if not in the other direction entirely.

Also, just licensing a codec doesn’t really seem to fit Apple’s style, let alone swagger. Not when their key product differentiator is better experience through tighter integration of hardware, software, and most recently, services.

Which is why it’s possible Apple will finally, the Rock-style FINALLY, move to just replace Bluetooth altogether with something much higher performance but also… even more efficient. Maybe it’s marketed as AirPlay 3, maybe not. Maybe it leans on a next-generation H2 system-in-package for AirPods Pro 2, and eventually AirPods Max 2, if there ever is an AirPods Max 2. Maybe not.

But supply chain exfiltrator extraordinaire Ming-Chi Kuo just released a report saying that AirPods Pro 2 would not only have a new design and speakers in the charging case to improve the Find My experience, but they’d also support Apple’s own lossless audio codec, or ALAC.

Now, I’ll drop a link to more on Apple’s upcoming AirPods in the description right below the like button, but here’s a question that pretty much immediately pops to mind:

Which version of ALAC, since Apple’s codec supports 16- to 32-bit audio at up to 384kHz? Unclear, but I’ll also assume the least for now so I can be surprised rather than disappointed later. Even if these are the freaks and geeks who just added 10-bit ProRes HQ to the iPhone Pro…
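For a sense of scale, here are the raw, uncompressed stereo bitrates for the formats ALAC can carry; compressed ALAC comes in lower, but the bandwidth need scales roughly with these:

```swift
import Foundation

// Raw (uncompressed) stereo PCM bitrates; actual ALAC streams compress below this.
let profiles: [(name: String, bits: Double, rateHz: Double)] = [
    ("CD quality (16/44.1)",        16, 44_100),
    ("Apple Lossless tier (24/48)", 24, 48_000),
    ("Hi-Res Lossless (24/192)",    24, 192_000),
    ("ALAC ceiling (32/384)",       32, 384_000),
]

for p in profiles {
    let mbps = 2 * p.bits * p.rateHz / 1_000_000
    print("\(p.name): ~\(String(format: "%.2f", mbps)) Mbps raw")
}
// Roughly 1.41, 2.30, 9.22, and 24.58 Mbps; compare that to Bluetooth's budget.
```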

But if Apple really wants to blow minds and ears, they’re really going to want more bandwidth than Bluetooth by itself allows, and that leaves some type of point-to-point Wi-Fi, or maybe some fusion of Bluetooth, Wi-Fi, even ultra-wideband like the U1 chip. None of which are ideal on their own, but maybe they could be in very clever combination. I mean, it’s really just an implementation detail at this point, but one I’m super beyond curious about.

Now, in Nilay Patel’s worst nightmare hellscape of a world — and mine too, honestly — this… as of right now almost entirely fanfic new protocol… would be all the next generation of AirPods and headphone silicon support. Like AirPlay 2 on the HomePods. But my own personal hope and dream would be for Apple not so much to rip and replace Bluetooth, but to superset it. Not just so that Nilay’s head is saved from actually exploding, but so, if you have an older device or a non-Apple device that can’t support lossless, you can gracefully fall back to good old-fashioned lossy Bluetooth. Maybe even aptX. I mean, make the kids happy — hell, make the Qualcomm patent licensing treasury, and those increasingly beyond acid salty regulators, happy! Just oh, so very, very happy. But… but… if you have the latest and the greatest, you get the absolute best lossless AirPods experience on the planet. If your human ears can even distinguish the difference… beyond audiophile LARPing, that is.
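If Apple does superset rather than replace, the negotiation might conceptually look something like this; every name here is hypothetical, a sketch of the fallback idea, not any real API:

```swift
// Entirely hypothetical fallback logic for a "superset, don't replace" approach.
enum AudioLink {
    case appleLossless       // hypothetical high-bandwidth, AirPlay-style link
    case aptXLossless        // Qualcomm's Bluetooth codec, where licensed
    case aacOverBluetooth    // the good old lossy fallback everything supports
}

func pickLink(peerSupportsLossless: Bool, peerSupportsAptX: Bool) -> AudioLink {
    if peerSupportsLossless { return .appleLossless }    // latest and greatest
    if peerSupportsAptX     { return .aptXLossless }     // better Bluetooth
    return .aacOverBluetooth                             // older or non-Apple gear
}
```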