Apple Glasses — Accessory or Platform?

Over the last couple of posts, I’ve talked about why Apple would make Glasses, basically a face-mounted computational display — and what could actually get people like you and me to buy a pair. Hit subscribe so you don’t miss the rest of this series.

One thing I haven’t talked about yet, though, is the developer story… apps.

When the original iPhone launched, it came with only the apps Apple had given it. Including the last-minute addition of YouTube.

Henri Lamiraux’s software team had just finished a forced march, a marathon of sprints, to get version 1 ready to ship, only to have to turn around and start another for version 2, this time to get the frameworks ready for the SDK event and the App Store ready to launch.

Umpteen million apps and billions of dollars later, the App Store made apps as mainstream, as popular, as music and TV.

So, it should have come as no surprise that when the original Apple Watch launched, it had not only the apps Apple had given it, but kinda… sorta… an App Store as well.

Back then, on that first version, the App Store lived on the iPhone and so did the app logic. Only the interface and interaction layers were… projected… onto the Watch. All based on Apple’s pre-existing Extensibility frameworks.

Now, thanks to their own forced marches, Apple got app logic onto the Watch in version 2 and brought the store on-device in version 6.
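For anyone who never shipped one, here’s a rough sketch of what that decoupling looked like from the developer’s side, using WatchKit’s split between an interface controller (the logic) and the storyboard-defined interface it drives (the projection). The controller name and heart-rate value are placeholders, not anything from a real app.

```swift
import WatchKit

// A minimal sketch of the original WatchKit model: this controller ran in an
// extension (on the iPhone in watchOS 1, on the Watch itself from watchOS 2),
// while the storyboard-defined interface was "projected" onto the Watch screen.
class WorkoutInterfaceController: WKInterfaceController {
    // Outlets only reference interface objects; the actual views live in the
    // Watch app bundle, not in this process.
    @IBOutlet var heartRateLabel: WKInterfaceLabel!

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        // App logic runs here, in the extension...
        let bpm = 132 // placeholder standing in for real sensor data
        // ...and only the resulting values get pushed across to the interface.
        heartRateLabel.setText("\(bpm) BPM")
    }
}
```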

Sure, it’s fair to say that, while there are some legit terrific Apple Watch apps, wrist-based apps in general just haven’t proven as popular or as critical to the Watch’s success as they were to the phone’s. Subject for another video. Let me know in the comments if you want to see it.

But, that same process, the process of projectables, could prove even more critical and more valuable for Apple’s next small thing.

I’m Rene Ritchie and this… is the potential for Apple Glasses apps. Please don’t call them glapps.

There’s this common misconception that iOS doesn’t multitask well. It does. Go back to the very first iPhone demo, when Steve Jobs plays music, surfs the web, checks email, takes a call, and the music comes right back, just to see how well it multitasks. Because of course it does: it’s built on the same UNIX underpinnings as macOS.

The difference is that Apple has always tightly controlled when processes can and can’t multitask on iOS. And they did it because they were paranoid about battery life.

Only Apple’s own apps had any kind of background access at first. Then only a tiny handful of specific functions, like streaming music and turn-by-turn navigation. Then things like background refresh. And now, while it’s still by no means a wide open, wild-wild-west of multitasking madness, pretty much every kind of app can persist in the background… at least for a while. RIP bloated, cross-platform social and gaming apps.
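To make that concrete, here’s roughly what sanctioned background work looks like on iOS today, using the BackgroundTasks framework. The identifier and the refresh work are placeholders, and the identifier would also have to be declared in the app’s Info.plist.

```swift
import BackgroundTasks

// Placeholder identifier for a hypothetical app's refresh task.
let refreshID = "com.example.app.refresh"

// Call early in app launch: register the handler iOS will invoke when it
// decides to grant the app some background time.
func registerBackgroundRefresh() {
    _ = BGTaskScheduler.shared.register(forTaskWithIdentifier: refreshID, using: nil) { task in
        // Do a bounded amount of work, then report completion; run too long
        // and the system simply ends the task.
        updateCachedContent()
        task.setTaskCompleted(success: true)
    }
}

// Ask (politely) for a future refresh window; iOS decides if and when it happens.
func scheduleBackgroundRefresh() {
    let request = BGAppRefreshTaskRequest(identifier: refreshID)
    request.earliestBeginDate = Date(timeIntervalSinceNow: 15 * 60) // no sooner than 15 minutes out
    try? BGTaskScheduler.shared.submit(request)
}

func updateCachedContent() {
    // Placeholder for the app's actual refresh work.
}
```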

iOS always had the potential to handle headless apps as well. That’s where interfaces and feature sets could pop up as remote views in the system and inside other apps. Because, again, same UNIX underpinnings as macOS.

The difference here is that Apple didn’t even start porting that functionality over until later. Because they were even more paranoid about security.

Then came iOS 8 and Extensibility, which I’ve often called one of the most important developments in iOS since… well, iOS.

With it, apps maintained their secure sandboxes but direct user action, things like tapping a button, could poke a small hole in them, just big enough to bring up a Share sheet, or widget, or photo filter, or custom keyboard, or document provider, or… you get the idea.

When that happened, iOS would pull the interface and data from the container app, and then present it to you so you could quickly, easily send a photo to Twitter, swipe down a calculator from anywhere, or get your bitmoji on.

All while never letting it mingle with the host app, so no one could try and trick you into giving them your personal, private information and content.
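If you’ve never built one, a basic share extension is a surprisingly small amount of code. Here’s a minimal sketch based on the stock Social framework template; the class name is just what Xcode generates, and the hand-off is left as a comment.

```swift
import Social

// The system hosts this view controller in its own sandboxed process and hands
// it only the content the user explicitly chose to share.
class ShareViewController: SLComposeServiceViewController {

    override func isContentValid() -> Bool {
        // Validate the shared text before enabling the Post button.
        return !contentText.isEmpty
    }

    override func didSelectPost() {
        // Hand the content off (for example, to the container app via an app
        // group), then tell the system we're done. The host app never sees our
        // data, and we never see the host's.
        extensionContext?.completeRequest(returningItems: [], completionHandler: nil)
    }
}
```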

Now, this wasn’t some test bed for Apple Watch apps. It wasn’t part of some master plan. But, when the Watch team was trying to figure out how they could get third-party apps to work on a device as constrained as the original Watch, and realized they’d have to decouple app logic from app interface to do it, wow was Extensibility handy.

Same thing in principle for AirPlay, even CarPlay.

Just like websites were broken down into web services that could exist on any site, binary app blobs were broken down into specific functions that could be called from anywhere.

Google, which values cloud the way Apple values local, has been doing something similar with app fragments for years as well. Because a good idea is a good idea.

At least in theory. If you’ve ever had to wait for an old-school Watch app to spool up or an app fragment to download, you know it never turned out nearly as well in practice.

Except, it kind of did. Just not for whole apps or big functions. It’s turned out really great for a bunch of other stuff we use every day, including and especially rich notifications and, on the Apple Watch, rich complications.

Part of it has to do with transmission. Locally, Bluetooth has gotten better, but we can still block it with our own ugly, mostly-bags-of-water body parts. From the cloud, even modern Wi-Fi and cellular protocols can and do fail us.

Part of it has to do with compute power. Even the latest Apple Watch can’t run apps, especially heavy apps, as well as an iPhone, iPad, or Mac.

Part of it has to do with interface. The Apple Watch screen is so much smaller it makes touch that much harder, and the Digital Crown is basically a scroll wheel minus the mouse.

Apple Glasses will likely be just as susceptible to wireless interference, have far less compute power, and even less interactive potential and precision than an Apple Watch.

Now, that’s not a show-stopper. Far from it. Just take a look at AirPods.

AirPods are just as dependent on a wireless connection, even AirPods Pro and their fancy 10-core audio engine have far less compute power, and when it comes to interface, there’s precisely zero screen and only a capacitive squeeze or several to work with.

Yet they can relay calls, turn-by-turn directions, workout instructions, even incoming messages.

And somewhere in the middle there, somewhere in the land of rich notifications and complications, is where I think Glasses apps will lie. At least at first.

Actually, if you go back to the original Apple Watch, Glances would probably be an even better conceptual fit for the Glasses than complications.

Developers would get the Glasses as an extension of the existing APNs (Apple Push Notification service) for, quote-unquote, free: it just works, but with some additional effort and customization, it would work even better.
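That “free, but better with effort” split already exists for rich notifications today: a plain APNs alert displays with zero extra code, but a notification service extension gets a chance to rework the content before it’s shown. Here’s a minimal sketch of that pattern, with the title trimming standing in for whatever Glasses-specific tailoring might eventually look like.

```swift
import UserNotifications

// A notification service extension: iOS hands it the incoming push payload
// and lets it modify the content before display.
class NotificationService: UNNotificationServiceExtension {

    override func didReceive(_ request: UNNotificationRequest,
                             withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void) {
        guard let content = request.content.mutableCopy() as? UNMutableNotificationContent else {
            contentHandler(request.content)
            return
        }
        // A glanceable surface wants the shortest possible string, not the
        // full alert body, so trim aggressively (placeholder logic).
        content.title = String(content.title.prefix(24))
        contentHandler(content)
    }
}
```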

And the same would go for Glances: a guided workout app, for example, could pull data from the Apple Watch and show it in your peripheral vision, so you can see your stats without having to pull up a sleeve, pull down a glove, or take your hands off your ski poles or kettlebells.
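There’s no Glasses SDK to write that against yet, obviously, but the closest conceptual model today is probably a WidgetKit timeline. Here’s a minimal sketch along those lines, with placeholder workout numbers standing in for real HealthKit data.

```swift
import WidgetKit

// One entry in a glance-style timeline: a timestamp plus the stats to show.
struct WorkoutEntry: TimelineEntry {
    let date: Date
    let heartRate: Int
    let paceMinutesPerMile: Double
}

// A timeline provider, the same shape a hypothetical Glasses glance might take.
struct WorkoutGlanceProvider: TimelineProvider {
    func placeholder(in context: Context) -> WorkoutEntry {
        WorkoutEntry(date: Date(), heartRate: 120, paceMinutesPerMile: 9.5)
    }

    func getSnapshot(in context: Context, completion: @escaping (WorkoutEntry) -> Void) {
        completion(placeholder(in: context))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<WorkoutEntry>) -> Void) {
        // A real app would pull live stats (say, from HealthKit); this just
        // schedules a single placeholder entry and asks for a refresh in a minute.
        let entry = placeholder(in: context)
        completion(Timeline(entries: [entry], policy: .after(Date().addingTimeInterval(60))))
    }
}
```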

Same thing with walking directions, AppleTag detection, and a host of other features that, again, may not be as functional as the Watch, much less the phone, but far, far more available. And far less obtrusive and, frankly, rude. Like, you’ll be able to check an event reminder or a to-do without even having to lift or turn your wrist or reach for your phone.

The reverse will also be true, though: Developers will be able to pull from the Apple Glasses sensors, particularly the LiDAR sensors, to make things like indoor mapping, item location, guided workouts, maybe even some level of location- and spatial-based gaming, possible without you having to hold your iPhone or iPad up for extended periods of time.
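Purely speculative on the Glasses side, of course, but here’s roughly how LiDAR depth and scene reconstruction are exposed to developers today through ARKit; presumably a Glasses framework would offer something analogous.

```swift
import ARKit

// Configure an ARKit session to use LiDAR-driven scene understanding,
// when the hardware supports it.
func startSceneReconstruction(in session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // A live mesh of the surrounding space, for indoor mapping and occlusion.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Per-pixel depth from the LiDAR sensor, for item location and placement.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    session.run(configuration)
}
```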

And, yeah, there’ll have to be things like Driving Mode, which cuts out any and all distractions when the Glasses detect a certain velocity in general and the shape of a steering wheel in front of you in specific. There’s already a crude version of that on the iPhone. But, believe me, no one wants an “LL Cool J is going live” Insta alert while careening down a highway in a blizzard. Including LL.

Now, maybe one day Apple’s silicon team will have a G4 or G5 system-in-package powerful enough for fully on-board, always-connected apps like something straight out of an Iron Man movie, with a voice-driven Siri interface that’s finally as functional as, never mind Jarvis or Friday, Google Assistant or Alexa.

I have a whole rant brewing on that, so if you want to see that video sooner rather than later, hit me up in the comments.

So, here’s how I think it’ll play out at first: Apple will announce the Glasses with support for pulling notifications and glances, pushing sensor data like LiDAR, and fine-grained visual verbosity controls to provide information but prevent distraction. Maybe with a cool demo from Twitter, Peloton, Pokémon Go, whatever’s trendy and looks compelling.

But, it will be more of a display than a computer when it comes to third-party apps, at least at first. And then, like the iPhone and Apple Watch, the Glasses will be iterated on and improved over time. Until, yeah, I start to dream of EDITH again.