iPhone 12: Apple’s Secret Weapon

Over the last week or so, a diagram has been making its way across the pillars of the social cross — Twitter, Insta, blogs, YouTube. It’s purportedly from that internal December iOS 14 build, the one that everyone and their Animal Crossing pals now seemingly have access to. And it shows the new iPad Pro LiDAR Scanner — that’s LIght Detection and Ranging — but in fully Digivolved iPhone 12 Pro form.

That’s led to a ton of people telling all of us, over and over again, just exactly what it is.

But… I’m way more interested in WHY it is.

See, Apple doesn’t talk about future products. You know it. I know it. Every financial analyst who’s ever squandered a question on Apple’s quarterly earnings calls… impotently trying to get Tim Cook to do just that… certainly knows it.

But there are a couple of future products that Tim Cook and Apple HAVE been willing to talk about publicly… and kind of frequently.

Those are automation and augmented reality.

Now, coverage often conflates automation with automobiles, but Tim Cook has flat out said it goes much further than that. And I can get into just how far exactly in a future video. Let me know in the comments if you want to see that.

They’re intertwined in that both require technology that can pull in data about the real world. In other words, scan and ingest the environment around them — and us — understand what it is, meaning, tell a person from a tree from a car from a rock from rain, and then help us in ways appropriate to that understanding, like by giving us better directions or warning us of dangers — bus, bus, BUS! — or just making us smile as adorable detective Deadpool Pikachu runs behind the chair in the coffee shop.

And, I think the reason Tim and Apple… Tim Apple… are willing to talk about these future products is that… they’re not really products. They’re core technologies.

Think of it this way: Most current Apple products have screens. Apple Watch, iPhone, iPad, Mac. Pro Display XDR… is a screen!

The screen isn’t a product. It’s a technology. A component. And a ubiquitous one at that.

I think, in the near future, automation will just be an extension of machine learning and augmented reality, just another end point, just another quote-unquote screen, for the visual display of data.

Same way things like AirPods and the HomePod are already bridging into near- and far-field audio AR as part of the bigger drive towards ambient computing.

There’ll eventually be a range of verbosity vs. visualization based on what we’re doing at the time and the capabilities of the device. From jogging to driving to sitting on the sofa, and from speakers to watches to cars. Again, topic for another video.

To get to really good AR, we’re going to need really good components along the chain. Starting with scanners.

Apple’s been working on computational cameras for almost a decade. Most of us started really noticing with the iPhone 7 Plus and Portrait Mode, where it could understand depth enough to generate virtual bokeh.

Next was the iPhone X and the TrueDepth system. Not just for Portrait Mode or Portrait Lighting, but because it started boiling us in the AR water of Animoji and, later, Memoji.

Sure, you could look at all that like a lame way to turn your face into a poop emoji… once and then never look at it again. But people aren’t always comfortable with new technologies, especially when they’re sensory things like AR, so getting us to play with changing the world we see through real-time depth effects, or even ourselves by throwing on a silly virtual mask, helps ease us into everything coming next.

And now it’s all just surfaced in better Snap and Insta filters.

Now, on the iPhone 11, we have things like semantic rendering, which can not only distinguish faces in scenes, but parts of faces. And the LiDAR camera on the back of the new iPad Pro, which scans objects in front of us, like a lower resolution but longer range version of what the TrueDepth camera does TO us.

Add that to ARKit, Apple’s framework for augmented reality, which has gone from horizontal to vertical to irregular surfaces, single-person to multi-person experiences, object placement to object occlusion, and the list goes on.
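
Speaking ARKit’s language, that list of capabilities mostly comes down to flags on a session configuration. Here’s a rough sketch — not Apple’s sample code, and assuming a RealityKit ARView hosting the session — of what turning on plane detection and people occlusion looks like:

```swift
import ARKit
import RealityKit

// Sketch only: an ARView (assumed to live in a real app's view hierarchy)
// running world tracking with plane detection and people occlusion.
let arView = ARView(frame: .zero)
let configuration = ARWorldTrackingConfiguration()

// Horizontal and vertical surfaces — the plane detection mentioned above.
configuration.planeDetection = [.horizontal, .vertical]

// People occlusion lets virtual objects render behind real people,
// but it's hardware-dependent, so check before enabling it.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

arView.session.run(configuration)
```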

The LiDAR Scanner in particular makes all of that not only faster — near instant — but better, including creating accurate measurements, topological 3D meshes, better object occlusion, and better real-world physics simulations.
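
The LiDAR-specific piece is scene reconstruction. As a hedged sketch — the availability check matters, because only LiDAR hardware like the 2020 iPad Pro reports support — it looks something like this:

```swift
import ARKit

// Sketch: ask ARKit for a live, classified 3D mesh of the surroundings.
// Only LiDAR-equipped devices report support for scene reconstruction.
let session = ARSession()
let config = ARWorldTrackingConfiguration()

if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
    // .meshWithClassification also labels surfaces (floor, wall, table, seat…),
    // which is what feeds better occlusion and real-world physics.
    config.sceneReconstruction = .meshWithClassification
}

session.run(config)
```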

The only problem… and it’s a big honking problem… is that there’s almost no reason for a regular person to use any of it yet. There are still few compelling AR experiences and no killer LiDAR apps.

That’s why I said in my iPad Pro 2020 review that most of its appeal would be for developers working on just exactly those kinds of LiDAR and AR apps. And, in fact, it was super smart for Apple to seed this iPad Pro to those developers now, well before the iPhone 12.

See, on a relatively niche device, no one is going to knock Apple too hard for not having relatively bleeding edge features fully fleshed out yet. At iPhone scale, they’ll be knocked, hard. Like influencer posting super inappropriate meme hard.

So, half the battle is getting devs with the new iPad Pro to ramp up and push out hella cool LiDAR apps.

The other half is Apple, if not leading the way — with the iPad Pro launching a couple of weeks ago, the way is already under way — then taking the lead this fall and showing what an AR camera system is really capable of.

Maps is an obvious example. Google’s already shown off AR in their maps. For people like me, who are EXTREMELY directionally challenged, being able to see exactly where to go, or exactly what something is, can be a game changer.

The new Find My network and AirTags are another. My AirPod is in the room. Great, I know that already, genius. Thanks. Where in the damn room? I can’t hear it… oh, look, AR is showing its outline right there, under the cat pillow on the couch… damn cat pillow.

For us nerds and creators — same thing, really — the ability to scan random objects into 3D USDZ AR files, like Alex Lindsay and Brianna Wu have been talking about for years now, just opens up so many more possibilities to so many more people.
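
To make that concrete, here’s a small sketch of handing a scanned model to Quick Look, which previews USDZ files in AR on device. The file name "scan.usdz" is a hypothetical placeholder for whatever a scanning app produces:

```swift
import UIKit
import QuickLook

// Sketch: preview a (hypothetical) bundled USDZ scan in AR via Quick Look.
class ScanPreviewController: UIViewController, QLPreviewControllerDataSource {

    func showScan() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // NSURL conforms to QLPreviewItem, so a file URL is all Quick Look needs.
        return Bundle.main.url(forResource: "scan", withExtension: "usdz")! as NSURL
    }
}
```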

For everyone, having high quality AR capabilities built into the rear camera, not just the front, where you can change the environment like the Clips app, but on Hulk serum, and add objects and characters, and basically paint into real world space, could, sure, be a one-time thing like Memoji, or a phenomenon when apps like Insta and TikTok pick it up and run with it.

But, even that…

Even that….

Just like the LiDAR camera on the iPad Pro seems like a developer tool for the LiDAR camera on the iPhone 12, the LiDAR camera on the iPhone 12 may end up seeming like a developer tool for what comes next.

Because having to hold an iPad or iPhone up to experience this wondrous new world of AR isn’t cool. Not having to hold up anything and experiencing it all — now that’s cool.