Next up, Brianna Wu, Christina Warren, and Simone de Rochefort were kind enough to have me join them on Rocket, one of my absolute favorite tech shows:
We've got another special guest this week: Rene Ritchie, who has just left iMore after a long tenure to fly solo with his own YouTube channel! But first, another installment of the WeWork saga, the Stalk Market, and then finally Resident Evil 3.
There will obviously be privacy concerns, but Apple has built up a solid reputation here and I'm hoping we can trust them to make sure data is collected with full disclosure and used ethically and appropriately.
That understood, this is fan-freaking-tastic. Kudos to both companies for making it happen.
Yesterday I talked about an iPhone 12 diagram supposedly ripped from a leaked internal iOS 14 build. Well, a couple of other diagrams leaked as well, one of which shows a new iPhone Home screen. Or, at least that’s how many people ran with it. What it really showed was a Rorschach test for whatever it is you secretly wish the iPhone Home screen, or Springboard, would look like.
We’ll only know for sure when Apple shows off iOS 14, presumably in June.
But, what it does give me, right now, today, is this excuse to tell you what I want to see:
Constrained customization. That’s it.
But hear me out.
I want a grid that I can fill up any way I like, and configure however the mood strikes me.
First, I want a search box, a whole horizontal row. Like Spotlight is now, but always there if I want it to always be there.
Yeah, just like Android has had since Cupcake or beignets, or whatever. Good is good.
If my primary means of finding secondary or tertiary apps, buried somewhere on some Home screen in the stack, is to swipe down Spotlight and start typing, then just let me save myself that swipe. Just let me tap and start typing.
Also, in addition to letting me type out search queries, let me type out commands. Actions.
Basically, anything I can ask Siri to do, I should be able to type into Spotlight. webOS, even BlackBerry 10, used to have this feature. It was great. It can be great again.
And I know, swiping down for Spotlight isn’t hard. Talking to Siri isn’t hard. But not everyone experiences the world in the same way, and the option for Spotlight and Siri object permanence, and the ability to just type when it’s socially awkward to talk, or if you can’t talk, isn’t just a nice-to-have.
Second, complications, like on the Apple Watch. One by one, two by one, two by two. Four by one. Hey, maybe even four by two. You can call them widgets if you like but, to me, Apple Watch complications have always been more glanceable, more information-filled, and just flat-out more useful than any widget system. At least for me.
And I know… I know…. for years we’ve been told that people don’t use widgets. That they look nice but almost no one ever changes the defaults or interacts with them in any meaningful way. For non-nerds, they’re decorative. Ornamental.
That’s why the Mac Dashboard went away and modern Android phones are far cleaner than back in the glory days of HTC.
And maybe iPhone owners only want them because we don’t have them.
I’m willing to bet the company that’s been iterating on the Watch system for years could figure out Home screen complications. And Lock screen complications too.
Third, a row or more of Siri suggested apps. Siri don’t always talk good, but man alive are the suggested apps almost always on point.
Thanks to a variety of signals from past behavior to current time and location, I can’t tell you how often I swipe down to Spotlight an app and it’s just already there, right there, staring back at me, good to go.
And I know… again, I know… it’s a slippery slope. When apps are static, always exactly where you expect them to be because they’re always exactly in the same place, you can build up muscle memory. You don’t have to scan for them. Don’t even have to think about them. There’s almost no cognitive load. You just tap, sometimes before you even realize it.
But having the same app in the same physical space is also limiting, because physical space is limited, and no two apps can occupy it at the same time.
Siri suggested apps add the element of time, effectively multiplying the space out towards infinity.
They could even work like Watch faces in a way, where you could set up a few different options based on time and location. Ones that are biased towards being at home or traveling, going to the Gym or when a game is on. Whatever.
Fourth would be those static apps, and for every reason I just said. There still needs to be a place for our most frequently used apps to be just exactly where we expect them to be, so we can hit them whenever we need them without a first thought, never mind a second.
The dock is already super static but it’s also super set in its ways. If you want to go crazy and have two rows, though, why should the dock stop you? It’s not the boss of you. Have two rows. Or a little arrow that pops it double, like Siri Suggested apps already have. Or, and I’m just spitballing here, an expander that opens it up to a full-on app drawer that contains everything you’ve ever downloaded from the store, sorted how you like, in grid or in list view like the Watch does and some other rumors have already suggested for iOS 14.
That’s what I want. And if you just spent this entire video thinking, wait, all Rene’s really done is describe what the minus one Home page already does, you know, the Today View or widget screen or whatever you want to call it, if you think all I’ve just done is describe that with a few tweaks…
Well, then, you’re exactly right.
See, I think Apple’s been not-so-secretly experimenting on us for years with the minus one Home screen. Trying out all the less conventional, more interesting ideas there, safely adjacent to the good old — really old — main Home screen grid.
They’ve just never had the nerve to let us make it the main Home screen. To take it from minus to one. To flip a switch, demote the grid, and promote this admittedly more complex but also more flexible system to prime time.
To do pretty much what the Apple Watch did from the very beginning and has gotten much better at in recent versions of watchOS — make the launcher not just a static launcher but a flexible, context sensitive informer as well.
I mean, I get it. Back when Steve Jobs introduced the original iPhone, with a Home screen that looks so very much like it still does today, it was never meant to be a destination. No one was meant to just sit on the Home screen and… abide.
It was meant to be a jump point. The place from which you could launch into any of a handful of apps at first, hundreds of thousands now.
Just like the Home button was meant to be an escape hatch. Something that jumped you back to a known state any time you finished what you were doing or simply got lost and wanted to go back.
Now, the Home button is fading and a better, more flexible, and yes, more complex gesture navigation system has taken its place.
So, maybe it’s time for the old Springboard to fade as well, and let the better, more flexible, and yes, more complex minus one Home screen take its place. If not as the default yet, then as an option for those who are ready.
Ready to stop chasing after our apps and to have those apps, their superficial information and modularized functionality, all come to us.
Over the last week or so, a diagram has been making its way across the pillars of the social cross — Twitter, Insta, blogs, YouTube. It’s purportedly from that internal December iOS 14 build, the one that everyone and their Animal Crossing pals now seemingly have access to. And it shows the new iPad Pro LiDAR Scanner — that’s LIght Detection and Ranging — but in fully Digivolved iPhone 12 Pro form.
That’s led to a ton of people telling all of us, over and over again, just exactly what it is.
But… I’m way more interested in WHY it is.
See, Apple doesn’t talk about future products. You know it. I know it. Every financial analyst who’s ever squandered a question on Apple’s quarterly earnings calls… impotently trying to get Tim Cook to do just that certainly knows it.
But there are a couple of future products that Tim Cook and Apple HAVE been willing to talk about publicly… and kind of frequently.
That’s automation and augmented reality.
Now, coverage often conflates automation with automobiles, but Tim Cook has flat out said it goes much further than that. And I can get into just how far exactly in a future video. Let me know in the comments if you want to see that.
They’re intertwined in that both require technology that can pull in data about the real world. In other words, scan and ingest the environment around them — and us — understand what it is, meaning, tell a person from a tree from a car from a rock from rain, and then help us in ways appropriate to that understanding, like by giving us better directions or warning us of dangers — bus, bus, BUS! — or just making us smile as adorable Detective Deadpool Pikachu runs behind the chair in the coffee shop.
And, I think the reason Tim and Apple… Tim Apple… are willing to talk about these future products is that… they’re not really products. They’re core technologies.
Think of it this way: Most current Apple products have screens. Apple Watch, iPhone, iPad, Mac. Pro Display XDR… is a screen!
The screen isn’t a product. It’s a technology. A component. And a ubiquitous one at that.
I think, in the near future, automation will just be an extension of machine learning and augmented reality, just another end point, just another quote-unquote screen, for the visual display of data.
Same way things like AirPods and the HomePod are already bridging into near- and far-field audio AR as part of the bigger drive towards ambient computing.
There’ll eventually be a range of verbosity vs. visualization based on what we’re doing at the time and the capabilities of the device. From jogging to driving to sitting on the sofa, and from speakers to watches to cars. Again, topic for another video.
To get to really good AR, we’re going to need really good components along the chain. Starting with scanners.
Apple’s been working on computational cameras for almost a decade. Most of us started really noticing with the iPhone 7 Plus and Portrait Mode, where it could understand depth enough to generate virtual bokeh.
Next was the iPhone X and the TrueDepth system. Not just for Portrait Mode or Portrait Lighting, but because it started boiling us in the AR water of Animoji and, later, Memoji.
Sure, you could look at all that like a lame way to turn your face into a poop emoji… once and then never look at it again. But people aren’t always comfortable with new technologies, especially when they’re sensory things like AR, so getting us to play with changing the world we see through real-time depth effect, or even ourselves by throwing on a silly virtual mask, it helps ease us into everything coming next.
And now it’s all just surfaced in better Snap and Insta filters.
Now, on the iPhone 11, we have things like semantic rendering, which can not only distinguish faces in scenes, but parts of faces. And the LiDAR camera on the back of the new iPad Pro, which scans objects in front of us, like a lower resolution but longer range version of what the TrueDepth camera does TO us.
Add that to ARKit, Apple’s framework for augmented reality, which has gone from horizontal to vertical to irregular surfaces, single person to multi-person experiences, object placement to object occlusion, and the list goes on.
The LiDAR Scanner in particular makes all of that not only faster — near instant — but better, including creating accurate measurements, topological 3D meshes, better object occlusion, and better real-world physics simulations.
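For the developers watching, those capabilities map pretty directly onto ARKit 3.5. Here’s a minimal sketch of turning them on; the API calls are real, but the function is just an illustration, not a complete app:

```swift
import ARKit

// A sketch of a LiDAR-aware ARKit configuration, assuming a device
// with the LiDAR Scanner (like the 2020 iPad Pro).
func makeLiDARConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()

    // Scene reconstruction builds the topological 3D mesh of the space
    // around you — the thing the LiDAR Scanner makes near-instant.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Depth-aware people occlusion lets virtual objects pass convincingly
    // behind real people.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    // Horizontal, vertical, and (with LiDAR) even poorly lit surfaces
    // get detected far faster than with the camera alone.
    config.planeDetection = [.horizontal, .vertical]
    return config
}
```

You’d hand that configuration to an `ARSession` (or an `ARView`’s session) with `session.run(makeLiDARConfiguration())`, and the mesh anchors and occlusion just start flowing.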
The only problem… and it’s a big honking problem… is that there’s almost no reason for a regular person to use any of it yet. There are few compelling AR experiences still and no killer LiDAR apps.
That’s why I said in my iPad Pro 2020 review that most of its appeal would be for developers working on just exactly those kinds of LiDAR and AR apps. And, in fact, it was super smart for Apple to seed this iPad Pro to those developers now, well before the iPhone 12.
See, on a relatively niche device, no one is going to knock Apple too hard for not having relatively bleeding edge features fully fleshed out yet. At iPhone scale, they’ll be knocked, hard. Like influencer posting super inappropriate meme hard.
So, half the battle is getting devs with the new iPad Pro to ramp up and push out hella cool LiDAR apps.
The other half is Apple, if not leading the way — because with the iPad Pro launching a couple of weeks ago, the way is already under way — then taking the lead this fall and showing what an AR camera system is really capable of.
Maps is an obvious example. Google’s already shown off AR in their maps. For people like me, who are EXTREMELY directionally challenged, being able to see exactly where to go, or exactly what something is, can be a game changer.
The new Find My network and AirTags are another. My AirPod is in the room. Great, I know that already, genius. Thanks. Where in the damn room? I can’t hear it… oh, look, AR is showing its outline right there, under the cat pillow on the couch… damn cat pillow.
For us nerds and creators — same thing, really — the ability to scan random objects into 3D, USDZ AR files, like Alex Lindsay and Brianna Wu have been talking about for years now, just opens up so many more possibilities to so many more people.
For everyone, having high quality AR capabilities built into the rear camera, not just the front, where you can change the environment like the Clips app, but on Hulk serum, and add objects and characters, and basically paint into real world space, could, sure, be a one-time thing like Memoji, or a phenomenon when apps like Insta and TikTok pick it up and run with it.
But, even that…
Just like the LiDAR camera on the iPad Pro seems like a developer tool for the LiDAR camera on the iPhone 12, the LiDAR camera on the iPhone 12 may end up seeming like a developer tool for what comes next.
Because having to hold an iPad or iPhone up to experience this wondrous new world of AR isn’t cool. Not having to hold up anything and experiencing it all — now that’s cool.
March 31 on the Dark Sky Blog, Adam Grossman dropped these bombshell announcements:
Dark Sky has joined Apple.
There will be no changes to Dark Sky for iOS at this time.
The Android and Wear OS apps will no longer be available for download.
Same for the actual weather parts of the website.
The API will no longer take new signups but will continue functioning through the end of 2021.
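For anyone who never used it, this is roughly what developers are losing. A Dark Sky forecast request is a single URL hit, and the response includes the minute-by-minute precipitation data the service was famous for. The key and coordinates below are placeholders, and only a slice of the response is modeled:

```swift
import Foundation

// Dark Sky forecast requests take the form:
//   https://api.darksky.net/forecast/[key]/[latitude],[longitude]
// This endpoint stops accepting new signups and shuts down at the
// end of 2021.
func forecastURL(key: String, latitude: Double, longitude: Double) -> URL? {
    URL(string: "https://api.darksky.net/forecast/\(key)/\(latitude),\(longitude)")
}

// A partial model of the response's "minutely" data points — the
// down-to-the-minute precipitation forecast.
struct MinutelyDataPoint: Decodable {
    let time: Int                  // Unix timestamp
    let precipIntensity: Double    // liquid water per hour
    let precipProbability: Double  // 0.0 through 1.0
}

// Example (placeholder key, Montreal coordinates):
// forecastURL(key: "YOUR_KEY", latitude: 45.50, longitude: -73.57)
```

Apps like CARROT and WeatherLine built their hyperlocal features on exactly this, which is why the 2021 deadline matters so much to them.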
Now, Dark Sky was really only available in the U.S. and U.K., so this doesn’t affect most of the world. Though, people in the U.S. and U.K. who preferred it are obviously super affected by this.
The API part sucks for apps like CARROT and WeatherLine, at least in those countries, but it sounds like they’ll be ok.
Congrats to the Dark Sky meatbags on the acquisition. To those concerned about what this means for me, I have lots of other amazing weather providers and 1.5 years to make the transition! I’ll be around until the sun finally sets on humanity.
A lot of people are upset about losing the Android app, which I totally get.
Dieter Bohn, in his Processor newsletter, said it felt petty, both fairly because that’s how Android users feel, and wildly unfairly because Apple doesn’t run a charity.
But… it’s actually not odd to me that Apple is killing the Android and Wear OS apps. It’s odd to me they’re keeping the iOS app.
The Dark Sky team is going to be just loaded up at Apple, similar to how the Workflows team was before Siri Shortcuts was announced. And, they kept the Workflows app going in maintenance mode, but they didn’t have an Android version to maintain.
Beddit, which hasn’t seen an Apple rebirth yet, dropped Android support a while back.
Beats, which became Apple Music, still has both iOS and Android versions.
My guess with Beats is that staying cross-platform gives them a better chance at throwing big numbers up against Spotify, especially for family subscriptions for mixed-device households. In other words, it’s a competitive space where Apple has to go to customers, not the other way around. Same reason TV+ was announced for Samsung before it even launched.
Dropping the Android app makes me think Apple isn’t planning a subscription weather service. At least not yet.
And if the idea of a subscription weather service sounds funny, remember many weather apps are subscription based because they have to pay for the weather data.
Apple could more easily swallow the costs for an iOS-only service, or, rather, subsidize them with hardware profits like it does iMessage and FaceTime.
But why would Apple want a hyper-regional, hyper-localized weather app?
My best guess is to make it less hyper-regionalized and add the hyper-localized features as data layers to iPhones, Apple Watches, and whatever comes next.
Apple has a mixed track record with international, I’ll give you that. TV+ and Music launched in well over 100 countries each. Apple Cash and Apple Card are still U.S. only and Apple News, barely more than that. Which is super frustrating to customers who feel like they pay more for the products but get less in terms of features. Again, if you want to see a video on that, let me know in the comments.
But, the ability to have precise precipitation forecasts, down to the minute, down to the meter, could be beyond compelling.
Neil Cybart on Above Avalon said much the same. That building it into Apple Watch today, and into a future range of wearables, would be a benefit — Apple would have the best solution for hyperlocal weather on the market.
And, I’ll add, in an incredibly convenient and context-appropriate way.
Apple also likes to control as much of its own core technologies as possible. They used to use Yahoo! for weather. Currently, The Weather Company.
As best as I understand it, and please correct me in the comments if I’m wrong, the data mostly comes from government organizations, but it’s the packaging and feature sets that get built on top of it that are key.
IBM bought the Weather Company and Weather Underground, and could go a long way towards locking up these kinds of services.
If Apple really can build out Dark Sky, big if, this could go a long way towards both protecting and improving a key data type for Apple’s customers.
According to Motherboard, Facebook approached the NSO Group to try and buy a tool that would let them better spy on a specific subset of their users. Because of course they did.
This all from court documents revealed as part of Facebook’s ongoing lawsuit against NSO for helping governments hack WhatsApp, which Facebook owns.
Facebook, as everyone knows, is the behemoth social network that’s been accused of a series of reprehensible privacy and decency violations against its users.
The NSO Group is the hugely controversial supplier of spyware like Pegasus and hacking tools to nation-states, among others.
The subset of users Facebook reportedly wanted to target was us, iPhone users.
Now, Facebook apparently didn’t actually want Pegasus proper, which is used to remotely infect and spy on phones. They wanted Pegasus technology to make their own, existing Facebook spyware better.
See, in a previous scandal — it’s legit hard to keep up — Facebook was accused of pushing a VPN app named Onavo Protect on their users without properly disclosing that Facebook owned it and was using it to harvest our behavioral data to monetize, and to target competition.
You know that old cliche that big internet companies don’t sell our data because it’s too valuable? Yeah, Cambridge Analytica proved they do indeed sell our data because metastasizing is even more valuable.
Now, Facebook was eventually forced to pull the VPN from the App Store and Google Play Store, and sunset it.
But, like any vampire, its blood-sucking continued. TechCrunch reported that Facebook had rebranded Onavo Protect as Research, used its massive advertising engine to target desirable demographics, including teenagers, offered them $20 for participating, and then abused Apple’s enterprise distribution system to infect their iPhones with it.
And yeah, that’s how valuable our data is to them. Block them from offering free services to get it and they’ll become desperate enough to pay for it. Topic for a future video.
But, according to Motherboard’s report, even after all that, Facebook still couldn’t harvest as much data from iPhone users as they could from Android users, and so wanted the Pegasus parts to make sure we were all being equally violated.
Now, Facebook says NSO is misrepresenting all this to try and distract from the lawsuit. But, Facebook doesn’t refute the claim in any other way. No denial, no context, no explanation.
And here’s a little secret decoder ring for corporate PR: If a company is in the right, they’ll usually hold their ground and say so in simple, plain language. If they’re on shaky ground, they’ll try to angle and change the point of discussion. If they’ve done wrong, they’ll flail about in a desperate attempt to distract from the topic.
NSO Group just said they only sell to government agencies and law enforcement, so they talk-to-the-handed Facebook and wouldn’t comment further.
In terms of reactions so far, there haven’t been many. At least not that I can find. It’s up on Techmeme but, Apple sites aside, almost none of the major tech sites have covered it, and I haven’t seen it talked about in my Twitter feed either.
The big exception is John Gruber from Daring Fireball, who repeated what he’s said before:
Facebook is a criminal enterprise.
Ryan Mac from BuzzFeed tweeted a reminder to take whatever NSO says with 10 lbs of salt, but that it was an incredible accusation in a formal court filing.
A New York Times reporter tweeted that it was very disturbing if true, but that tweet seems to have been deleted.
So, it’s possible that reporters just don’t think there’s any there there, that commenters see it as two controversial companies just going at it, or that people are just busy with other stuff during all this shelter-in-place.
It’s the latter possibility that concerns me, though.
We’ve seen with Zoom, which has become massively popular thanks to social distancing, that people have been willing to put longer range concerns about privacy and security on hold for the immediate convenience of social connection.
Zoom has a history of, at best, playing fast and loose with security and privacy, and at worst having a reckless, borderline malicious disregard for it. Everything from secretly installing servers on the Mac to secretly channeling data to Facebook to secretly routing data through China, to failing to provide basic safeguards for users.
It’s a chilling mess that I would never use or recommend using during normal times, but that the people I care about and want to stay connected with are using right now.
Facebook is the same. I deleted almost all my data and stopped logging into the blue app following Cambridge Analytica, but now I’m worried about family and friends for whom Facebook is effectively the Internet. And, of course, I never stopped using Instagram, because Insta.
Walt Mossberg, who pretty much invented personal technology journalism, expressed just exactly that a couple weeks ago.
The deal with the devil that I’ve made for myself is that I won’t actively use Zoom or Facebook but if the people I care about reach out to connect through them, I’ll respond.
But I'm terrified that if I sacrifice privacy and security in the name of convenience and connection, I'll end up with none of it.