The folks who run /r/iPhone on Reddit were kind enough to ask me on for an AMA. We did it on Discord and it worked incredibly well. Thanks to everyone in the community for their time and for all the questions!
Over the last couple of posts, I’ve talked about why Apple would make Glasses, basically a face-mounted computational display — and what could actually get people like you and me to buy a pair. Hit subscribe so you don’t miss the rest of this series.
One thing I haven’t talked about yet, though, is the developer story… apps.
When the original iPhone launched, it came with only the apps Apple had given it, including the last-minute addition of YouTube.
Henri Lamiraux’s software team had just finished a forced march, a marathon of sprints, to get version 1 ready to ship, only to have to turn around and start another for version 2, this time to get the frameworks ready for the SDK event and the App Store ready to launch.
Umpteen millions of apps and billions of dollars later, the iPhone made apps as mainstream, as popular, as music and TV.
So, it should have come as no surprise that when the original Apple Watch launched, it had not only the apps Apple had given it, but kinda… sorta… an App Store as well.
Back then, on that first version, the App Store lived on the iPhone and so did the app logic. Only the interface and interaction layers were… projected… onto the Watch. All based on Apple’s pre-existing Extensibility frameworks.
Now, thanks to their own forced marches, Apple got app logic onto the Watch in version 2 and brought the store on-device in version 5.
Sure, it’s fair to say that, while there are some legit terrific Apple Watch apps, wrist-based apps in general just haven’t proven as popular or as critical to the Watch’s success as they were the phone’s. Subject for another video. Let me know in the comments if you want to see it.
But, that same process, the process of projectables, could prove even more critical and more valuable for Apple’s next small thing.
I’m Rene Ritchie and this… is the potential for Apple Glasses apps. Please don’t call them glapps.
There’s this common misconception that iOS doesn’t multitask well. It does. Go back to the very first demo when Steve Jobs plays music, surfs the web, checks email, takes a call, and the music comes right back just to see how well it multitasks. Because of course it does, it’s built on the same UNIX underpinnings as macOS.
The difference is that Apple has always tightly controlled when processes can and can’t multitask on iOS. And they did it because they were paranoid about battery life.
Only Apple’s own apps had any kind of background access at first. Then only a tiny handful of specific functions, like streaming music and turn-by-turn navigation. Then things like background refresh. And now, while it’s still by no means a wide open, wild-wild-west of multitasking madness, pretty much every kind of app can persist in the background… at least for a while. RIP bloated, cross-platform social and gaming apps.
iOS always had the potential to handle headless apps as well. That’s where interfaces and feature sets could pop up as remote views in the system and inside other apps. Because, again, same UNIX underpinnings as macOS.
The difference here is that Apple didn’t even start porting that functionality over until later. Because they were even more paranoid about security.
Then came iOS 8 and Extensibility, which I’ve often called one of the most important developments in iOS since… well, iOS.
With it, apps maintained their secure sandboxes but direct user action, things like tapping a button, could poke a small hole in them, just big enough to bring up a Share sheet, or widget, or photo filter, or custom keyboard, or document provider, or… you get the idea.
When that happened, iOS would pull the interface and data from the container app, and then present it to you so you could quickly, easily send a photo to Twitter, swipe down a calculator from anywhere, or get your bitmoji on.
All while never letting it mingle with the host app, so no one could try and trick you into giving them your personal, private information and content.
Now, this wasn’t some test bed for Apple Watch apps. It wasn’t part of some master plan. But, when the Watch team was trying to figure out how they could get third-party apps to work on a device as constrained as the original Watch, and realized they’d have to decouple app logic from app interface to do it, wow was Extensibility handy.
Same thing in principle for AirPlay, even CarPlay.
Just like websites were broken down into web services that could exist on any site, binary app blobs were broken down into specific functions that could be called from anywhere.
Google, which values cloud the way Apple values local, has been doing something similar with app fragments for years as well. Because a good idea is a good idea.
At least in theory. If you’ve ever had to wait for an old-school watch app to spool up or an app fragment to download, you know it never turned out nearly as well in practice.
Except, it kind of did. Just not for whole apps or big functions. It’s turned out really great for a bunch of other stuff we use every day, including and especially rich notifications and, on the Apple Watch, rich complications.
Part of it has to do with transmission. Locally, Bluetooth has gotten better, but we can still block it with our own ugly, mostly-bags-of-water body parts. From the cloud, even modern Wi-Fi and cellular protocols can and do fail us.
Part of it has to do with compute power. Even the latest Apple Watch can’t run apps, especially heavy apps, as well as an iPhone, iPad, or Mac.
Part of it has to do with interface. The Apple Watch screen is so much smaller it makes touch that much harder, and the Digital Crown is basically a scroll wheel minus the mouse.
Apple Glasses will likely be just as susceptible to wireless interference, have far less compute power, and even less interactive potential and precision than an Apple Watch.
Now, that’s not a show-stopper. Far from it. Just take a look at AirPods.
AirPods are just as dependent on a wireless connection, even AirPods Pro and their fancy 10-core audio engine have far less compute power, and when it comes to interface, there’s precisely zero screen and only a capacitive squeeze or several to work with.
Yet they can relay calls, turn-by-turn directions, workout instructions, even incoming messages.
And somewhere in the middle there, somewhere in the land of rich notifications and complications, is where I think Glasses apps will lie. At least at first.
Actually, if you go back to the original Apple Watch, glances would probably be an even better conceptual fit for glasses than complications.
Developers would get the Glasses as an extension to the existing Apple Push Notification service, APNs, for quote-unquote free: it would just work, but with some additional effort and customization, it would work even better.
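For context, an APNs alert is just a small JSON payload pushed to the device. Every key below is a real, documented APNs key; the idea that Glasses would simply mirror this payload is my speculation, not anything Apple has announced. A minimal sketch in Python:

```python
import json

# A standard APNs alert payload. All keys under "aps" are real,
# documented APNs keys; a glasses display mirroring them is a guess.
payload = {
    "aps": {
        "alert": {
            "title": "Workout",
            "body": "Interval 3 of 5, 2:00 remaining",
        },
        "sound": "default",
        "thread-id": "workout-session",  # groups related notifications
        "category": "WORKOUT_UPDATE",    # maps to app-registered actions
    }
}

print(json.dumps(payload, indent=2))
```

That’s the sense in which it comes for “free”: the payload already exists for the iPhone and Watch, so a lighter-weight display would just need a new way to render it.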
Same for glances. So, for example, a guided workout app could pull data from the Apple Watch and show it in your peripheral vision, so you can see your stats without having to pull up a sleeve, pull down a glove, or take your hands off your ski poles or kettlebells.
Same thing with walking directions, AppleTag detection, and a host of other features that, again, may not be as functional as the watch, much less the phone, but will be far, far more available. And far less obtrusive and, frankly, rude. Like, you’ll be able to check an event reminder or to-do without even having to lift or turn your wrist or reach for your phone.
The reverse will also be true, though: Developers will be able to pull from the Apple Glasses sensors, particularly the LiDAR sensors, to make things like indoor mapping, item location, guided workouts, and other things, maybe even some level of location and spatial-based gaming possible, without you having to hold your iPhone or iPad up for extended periods of time.
And, yeah, there’ll have to be things like Driving Mode which cuts out any and all distractions when the Glasses detect a certain velocity in general and the shape of a steering wheel in front of you in specific. There’s already a crude version of that on the iPhone. But, believe me, no one wants an LL Cool J is going live Insta alert while we’re careening down a highway in a blizzard. Including LL.
Now, maybe one day Apple’s silicon team will have a G4 or G5 system-in-package powerful enough for fully on-board, always-connected apps, like something straight out of an Iron Man movie, with a voice-driven Siri interface that’s finally as functional as, never mind Jarvis or Friday, but Google or Alexa.
I have a whole rant brewing on that, so if you want to see that video sooner rather than later, hit me up in the comments.
So, here’s how I think it’ll play out at first — Apple will announce the glasses with support for pulling notifications and glances, pushing sensor data like LiDAR, and fine-grained visual verbosity controls to provide information but prevent distraction. Maybe with a cool demo from Twitter, Peloton, Pokémon Go, whatever’s trendy and looks compelling.
But, it will be more of a display than a computer when it comes to third-party apps, at least at first. And then, like the iPhone and Apple Watch, the Glasses will be iterated on and improved over time. Until, yeah, I start to dream of EDITH again.
After a few years and a steady stream of reports, noted supply-chain exfiltrator extraordinaire Ming-Chi Kuo updated his predictions, saying he now expected Apple Glasses to launch sometime in 2022.
At that point, the always ascending Jon Prosser of Front Page Tech retorted, saying he’d heard an announcement was planned for as early as this fall’s iPhone event, but may be postponed to next year. Also, that they’d be called Apple Glass and cost $500. Then he went on a plethora of live-streams and podcasts, from the GadgetCast to VypDrive to the CultCast, and added details like they’d have no built-in speakers, but would rely on AirPods, and that there may even be a limited edition version in the shape of Steve Jobs’ famously rounded frames. Think Harry Potter, kids.
Which, of course, set an internet already primed by gold-watch, XDR stand, and Mac Pro outrage once again on fire.
Bloomberg’s biz-prophet, Mark Gurman, who’s also been reporting on the Glasses for years, called the Steve Jobs frames in specific, and the rest of the stories in general, fiction.
Prosser replied that he respected Gurman. Gurman that there were both mixed reality headsets and glasses in the pipeline, which might be confusing. Prosser that he wasn’t confused. And Benjamin Mayo of 9to5Mac that hahahaha. Sorry, strike that, ahahaha.
Gurman had previously reported that the glasses were coming in 2020, but now said that the announcement had been delayed to 2021 and the release until 2022 or 2023. With the headset being the thing coming earlier, but maybe also not until 2022.
Noted code extractor, Steve Troughton-Smith, said he’d laugh at these rumors floating around if not for people taking them seriously.
Troughton-Smith based that on all the glasses software that had accidentally been included in previous versions of iOS, and suggested that a bunch of noise was being deliberately generated to drown out any real signals.
Others suggested alternate sources, different projects, different prototypes, or just bored work-from-homers, and I paraphrase, trolling for the lulz.
Max Weinbach of XDA Developers, who’s contributed some of his own reporting on live streams and the EverythingApplePro channel, seemed to be on the same page as Prosser.
And Prosser, for his part, basically said wait, watch, and see what happens.
Now, I personally try to never fall victim to any of the classic blunders. The most famous, of course, is never bet against a dread pirate when death is on the line. But, only slightly less well known, is never get involved in a rumor war on the internet.
But none of these people are your dad, so no hashtag Jern or hashtag Gurm in the comments, people.
Personally, as you know, the when of the release date and the what of the hardware aren’t even the parts that fascinate me most.
Though the rumors and the reporters being so conflicted right now is, in and of itself, fascinating in terms of seeing how the small, loud, but important social part of the market responds to the idea of the product itself.
Because, while I covered why I think Apple is making glasses last week — make sure you hit subscribe so you don’t miss any of these videos — the bigger question I have right now isn’t why Apple would make them but… why would any of us buy them?
That wasn’t a question with the iPhone. People need phones. Feature phones were already a mature market and smartphones were primed to move from early adopters to mainstream acceptance. They just needed a push, which turned out to be exactly the iPhone.
It was a big question for the Apple Watch, though. Feature watches, from classic Swiss to deeply digital, were beyond a mature market, but for a variety of reasons, they weren’t at all ubiquitous. Smartwatches were also still firmly in the early adopter phase, primed to move not at all.
Would people be willing to even give the Apple Watch a chance on their wrist, either the relatively few who still wore traditional watches for actual time keeping or as jewelry, or the much bigger group that no longer or simply never wore watches at all?
Of course, the ground is now littered with the reputation-corpses of the many pundits who said nope, big old nope, even and especially years after the reverse proved obvious.
Because, it turned out, if you created or nurtured a product with a compelling enough feature set, enough people would switch from traditional watches, or go back to them, or start wearing them for the first time. And now, Apple Watch owns that small market and is almost single-productly driving its expansion.
In the case of the Apple Watch, the most compelling features of the set have been health, fitness, and medical/emergency alerts, with a good peppering of notifications and communications thrown in there to round things out.
It works with your phone, does a few general things the phone does, not as in-depth but more conveniently, but it also does a few very specific things that the phone can’t do at all.
But, glasses feel like an even bigger ask. Watches go on your wrist. Glasses go on your face.
Now, unlike watches, a lot of people, myself included, simply have to wear glasses or contacts for vision correction. We’re used to them. We wear them all day, every day. And switching from dumb to smart glasses won’t be a big ask. Basically, Apple will have us at One More Thing…
But, also unlike watches, I don’t think there’s a large group, at least not in established markets, that has stopped wearing glasses or simply never worn them at all, ever.
I think most have worn them opportunistically, for safety or fashion reasons. Sunglasses are the biggest example.
Because safety and fashion are also absolutely compelling enough reasons to put on glasses. I don’t want anything flying into my eyes or hurting my vision, or I just want to look all Top Gun Terminator badass. They’re just worn intermittently. Not all the time. So, rabid fanbase and technophiles aside, will Apple need a ton of stuff after the one more thing to win this group over?
Whatever Activity and Workouts and heart rate monitoring and fall detection and international emergency calling and messages are… but for glasses?
Now, I fully realize the dangers of trying to predict this stuff. I barely anticipated half the features that now make the Apple Watch all but indispensable to me. Yeah, when asked by Ford, I just said I wanted more convenient horses.
And that’s where I am with Apple Glasses as well. Currently, I just want more convenient horses.
Like I said in the last video: a heads-up display, a HUD, like Tony Stark has in the Marvel movies.
A way to get everything from notifications to information, from messages to walking directions, never mind without reaching for my phone but without even having to lift my wrist.
But, I also want a much more granular, nuanced, less obtrusive system for it, because as we’ve all learned in the age of always-on, notifications can so easily turn into annoyance and information into distraction. Always on the wrist is both better and worse for that. Always on the face… I can’t wait and I couldn’t be more terrified.
We’re flattening compute down from the desktop to the laptop to the palm top to the wrist top to, finally, the face top. Shrinking the distance between our devices and ourselves. Bringing our external cybernetics, because that’s really what all this is — everything that amplifies our senses and memories and communications — as close as they can possibly come to being internal.
But I digress. See, I don’t think Apple will have or even need any of those really compelling things at launch.
And they won’t need mainstream adoption at launch either.
They didn’t have it with the iPhone. It took years to build out the feature set and grow the market there. Same with Apple Watch.
But they had cool stuff that the rabid fanbase and technophiles glommed onto, things like pinch-to-zoom and rubber banding, activities and sketches, that got a lot of attention and made for great, show-offy demos for every early adopter’s more decidedly mainstream family, friends, and colleagues.
And that, combined with new and actually compelling features over the years, be it the App Store or Swim Proofing, iMessage or Always On, increased appeal and removed roadblocks, and step by step, year by year, the products grew and grew.
So, my guess is the Apple Glasses will be the same. They’ll start just as simply, just as crudely even as the iPhone and Apple Watch did. They’ll have a few features that are cool enough to capture the attention of the always hangry media and the wallets of those who just always have to have the next and newest thing.
They’ll provide timely data, like indoor and outdoor walking directions. Critical notifications like who’s messaging or calling you. Things that make it so you don’t even have to look down at your phone or watch, or put down whatever you’re holding or doing to look at any other device.
They’ll be even less capable but they’ll be even more available, and that’ll be good enough to get things started.
Then we’ll get rapid iterations on speed and power, then a redesign, then a lower-priced model, and then, before you know it, I’ll be making a video about how the Apple Glasses need to be set free of the iPhone, even the Apple Watch.
And that’s before we even get to the real killer feature — being able to sit on calls or really anything boring at all, and secretly be watching Nebula the whole time.
Beyond the rumors, Apple will eventually announce Glasses. They’ll have a few features that excite us nerds but still be very much a version-one device. The potential will be there, though. They’ll get a bunch of coverage, including a ton of articles and videos about how very doomed they are and how doomed Apple is for making them, but they’ll be iterated on and they’ll grow over the next few years, and within half a decade we’ll have what everyone who blew the initial analysis calls the version Apple should have shipped to begin with. The iPhone 4, iPad Air, or Apple Watch 4 version.
I’ve seen a lot of discussion lately about the iPhone — or just phones in general — being “good enough”. That most of them already do most of the things most of the people need most of the time, and Apple, Samsung, Google, and everyone else should just chill out and stop updating them every year. iPhone 12? Who needs it. It’s exhausting. Enough already.
But… is it? I mean really?
Now, hear me out… Maybe the iPhone, even the hotly rumored upcoming iPhone 12, maybe all current and expected phones this year, are just scratching the surface of their potential. Of what they can be and what they should be doing for us.
Read why in my latest column for OneZero on Medium:
Spent the better part of the last week working on this. It's my first non-Apple/tech video, and it's just gone live on Nebula Originals.
It's for the Working Titles series and it's all about one of my favorite shows: Buffy the Vampire Slayer.
I go into everything from the contrast of classic and modern horror elements, to the subversion of the genre and its many stereotypes, to its legacy as part of popular culture.
You can watch it here:
I'm also doing an AMA on the /r/watchnebula subreddit to go along with it:
So ask me anything, and let me know what you think!
Apple has been pushing out updates to their education offerings all year, including things that are valuable in general but also really help out people using their products in education. That includes recently updated products like the $899 (with education discount) MacBook Air, and recently added features like trackpad support in iPadOS.
For education in specific, though, Apple is currently beta-testing Schoolwork 2.0, which has an all-new design and, frankly, makes a lot of other, more utilitarian software look like something out of Windows 95.
Workflows have been streamlined, interfaces have been flattened, and views have been created that almost let teachers see the "health" of their classroom from overall to each specific student.
Layouts are also consistent with existing apps and tools, everything from similarity to the Files app for document handling to the compose sheet for handout distribution. That engineered familiarity lets teachers who already know how to use iPadOS get ramped up and ready much faster than they would if they had to learn entirely new conventions.
If a teacher is using Apple systems like Messages and FaceTime, those are also built right in now so they can instantly follow up directly with students, via text or call.
For students, shared iPads can now let them just grab-and-go with temporary sessions. In cases where no data needs to be preserved, no login needs to be made. They can literally pick it up, use it, tap out, and everything is reset, ready for the next student.
For the Mac, Apple is bringing over assessment mode. So, for cases when individual workflows and learning paces aren't as important as mass teaching and standardized lessons, Mac apps can now lock the environment and feature set down so students have no choice but to stay on the educational rails.
Through all of it, even while ramping up features and improving apps like Pages, Keynote, and Numbers, Apple is also keeping the spotlight right on ClassKit apps, letting those do all the educational heavy lifting, with what looks like an unprecedented level of integration.
If it seems prescient given the current shelter-in-place guidelines and school closures, I think that’s just emblematic of good educational approaches being good educational approaches, regardless of whether the location or modality changes.
In other words, if you build for humans rather than checklists, humans will get value whether they’re in a building or at home.
Because of current conditions, though, Apple has also rapidly expanded all of their educational resources to better help teachers, students, and parents.
That includes a new Learn from Home portal:
But also more on-demand courses for teachers and parents-become-teachers:
A learning center for materials and ideas to help better engage students:
And a variety of curated App Store collections for home and remote teaching and learning:
Even a super-simple starting point with 30 creative activities for kids:
Of course, the biggest challenge for Apple in education remains inertia, which is simply schools and boards that have gone Microsoft or Google and don’t even think about alternatives, along with the more IT-centric management technologies that likely drew in those schools and boards to begin with.
Apple is very much still carefully picking and choosing their points of engagement and areas in which they believe they can provide highly differentiated value.
I'm super happy, though, that a company as big and as successful in the consumer space as Apple is choosing to spend significant resources on things like education, accessibility, and health.
It's one of the things I value most about the company.
In 2014, Tim Cook introduced the big and bigger iPhone 6 and iPhone 6 Plus, and followed them up with one more thing… the Apple Watch.
In 2020, could Tim Cook announce a smaller and bigger iPhone 12 and iPhone 12 Pro Max, and follow them up with one more thing… the Apple Glasses?
Hold that thought. I’ll come back to it in a scalding minute.
For years, the worst of the worst analysts have been saying Apple desperately needs a next big thing, and I’ve been eye-rolling them just adamantium hard. Because there isn’t anything as big as the iPhone yet and there won’t be for a good long while.
But… what there is is a series of next small things that, when taken in aggregate, will provide for exceptional value. Literally one more thing. But several of them in succession.
Apple Watch. AirPods. Services. And, if the rumors are true, at some point in the next year or two, Apple Glasses.
Now, there’s a bunch of information already available about Apple Glasses, because Mark Gurman’s been reporting on rOS for a while, the StarBoard frameworks were accidentally included in an iOS release last year, and Jon Prosser just added that he heard they’ll be branded Glass and have a starting price of $499.
But, I’m not here to re-write anyone’s blog post or re-shoot anyone’s video. No. Not today, Satan. You can go read or watch that stuff straight from the original sources.
Because, you know, like Drax, I’m never as interested in the who or the what.
I want to know the why…
Apple Glass Architecture
I think a lot of people assume the first iteration of the Apple Glasses will be like the original Apple Watch.
Another wearable shuttlecraft utterly dependent on Starship iPhone for anything and everything major.
Maybe even more dependent. Somewhere between the Apple Watch and, say, AirPods.
And that certainly makes the kind of sense that does. Given there’d be no room on glasses for an iPhone-style A-series system-on-a-chip, or SoC, and all the memory and storage and battery volume that goes with it.
There probably won’t even be room for an Apple Watch-style S-series system-in-package, which basically wraps an entire computer in an itty-bitty-box.
But maybe something between that and the H-series system-in-package in the current-gen AirPods, which is smaller but also far more specific.
An Apple G1, so to speak.
Now, Apple had over a decade to study personal computers, including their own Apple II, before launching the iMac. A decade of smartphones and tablet PCs before launching the iPhone and iPad.
But barely half that with smart watches before launching the Apple Watch.
And it showed. Because the Watch wasn’t anywhere nearly as focused in purpose or messaging at launch, and Apple ended up experimenting, learning, and rapidly improving it over the next few years. In public. Which was just interesting as hell to watch but also… unprecedented.
Glasses are way more like Watches than phones or tablets. There was Google Glass, but those were literally an experiment. More recently, Snap-ticles, Amazon Echo Frames, North’s Focals, and a couple others, which all look more like normal glasses and less like something the Borg would wear out on the town.
But none of them do much yet, not even as much as the Apple Watch did at launch.
Apple Glass Capabilities
Despite its shortcomings, the Apple Watch did have a few things going for it. Just like the iPhone wasn’t really a smartphone but the first full-on pocket computer, the Apple Watch wasn’t really a smartwatch but the first full-on wrist computer.
It could do a much smaller, much briefer, but still critically important subset of what the iPhone could do — just like the iPhone could do a much smaller, much briefer, but still critically important subset of what the Mac could do — but by virtue of it being always on your body, the Apple Watch could also do a few compelling things all its own.
Then, heart rate monitoring for health and fitness, now for medical as well, along with fall detection and international emergency calling.
Also, and just as importantly, convenience. Just like you could do that smaller subset of important things on your iPhone without having to go back to your Mac, you can do an even smaller subset of important things on your Apple Watch without even having to reach for your iPhone.
Now, what about glasses?
Working backwards, convenience is easy. With glasses, you could do a much smaller subset of things but without even having to lift your wrist and look at your Watch.
Meaning, a zero-effort heads-up display, a HUD, though not anything like what Tony Stark has in the Iron Man armor, at least not yet.
As to sensors, I don’t know if heart rate would be possible or even needed, but you’d have LiDAR, which could quickly provide a rough scan of 3D objects that surround you.
It’s what the 2020 iPads Pro shipped with and what’s rumored to be coming on the 2020 iPhones Pro as well. Likely to help get the system adopted, vetted, and ramped up in time for Apple Glasses.
Because holding a giant tablet up, even a phone up, to see augmented reality layers on top of the real world isn’t cool. Having it painted up in front of your eyes, now that’s cool.
It also sounds like the Apple Glasses will be just like the Apple Watch in another way — maybe glorious, maybe terrible: They won’t have an RGB camera.
See, on the iPads Pro and the upcoming iPhones Pro, like the TrueDepth camera on the front, the back reality cameras will have not just LiDAR but actual, you know, conventional cameras.
Those are the ones that let you take photos and videos, sure, but also let you do things like scan documents and QR codes, identify Lego models, read signs and landmarks, and do basically any level of sophisticated real-world ingestion and computer vision, like at all.
It’s glorious in that not having a conventional camera is critical for privacy. One of the biggest push-backs against Google Glass was that people didn’t like the idea that they could be secretly recorded in a changing room, bathroom, hell, dining room. And a blinking record light, again, just makes you look like creepy cousin Borg.
It’s terrible in that, without a conventional camera, a lot of the convenience of a face mounted computer goes away.
No walking into Starbucks, glancing at a QR code embedded in the logo, and having the daily deals pop up for you. No looking at a sign while you’re traveling and having the translation written right over it.
Maybe measurements will work based off LiDAR, maybe maps will work based off GPS from your phone or watch, maybe Find My will still light up for devices with U1 chips, maybe we’ll even be able to have AR characters, or AR versions of new Apple products, dance with our nicely occluded friends.
But a lot of the sophisticated stuff that relies on depth and image data just won’t have that second set of data to rely on, to the point of maybe not having them at all, and still having to go back to our iPhone or iPad for it.
Is that tradeoff, privacy instead of functionality, a benefit or a drawback for you, or do you think Apple will find a way for us to have both? Let me know in the comments.
Apple Glass Experience
For me, all of that is going to come down to the experience the Apple Glasses provide when you put them on. Limited in the first iteration, for sure, but also whatever path is glimpse-able ahead.
With the Apple Watch, when they were still just a rumor, I said my killer feature would have to be convenience. Those few, brief, critical things they could do more conveniently than a phone. But, over the years, it’s grown to be more. The things they do no phone can. The health and fitness. The way they save lives.
I feel very much the same way about Apple Glasses. I want that heads-up display, that HUD in my life, I want it to show me important notifications more easily than even the Apple Watch can. I want it to give me walking directions in a way beyond what the Apple Watch’s Taptic Engine can.
But I also need those other features. Not taps or heartbeats or sketches, the stuff that came and went from the Apple Watch quickly, though I’m sure we’ll get some version of those too, but the features no iPhone or Apple Watch can do. The life changing ones.
And that’s key not just for us but for Apple as well. Because, for Glasses to be truly successful, for them to be part of that series of next small things, that, when taken in aggregate, will provide exceptional value, Apple will need you to buy them in addition to, not instead of.
Just like you buy the Apple Watch in addition to the iPhone, and the AirPods in addition to both.
Apple Glasses have to literally be more than one more thing; they have to be the next in a succession of next small things.
When that happens, I fully expect we’ll see exactly what we saw with the iPhone 6 and Apple Watch. Just with iPhone 12… or iPhone 13… or iPhone 14… and Apple Glasses.
From Apple and Google:
One of the most effective techniques that public health officials have used during outbreaks is called contact tracing. Through this approach, public health officials contact, test, treat and advise people who may have been exposed to an affected person. One new element of contact tracing is Exposure Notifications: using privacy-preserving digital technology to tell someone they may have been exposed to the virus. Exposure Notification has the specific goal of rapid notification, which is especially important to slowing the spread of the disease with a virus that can be spread asymptomatically.
To help, Apple and Google cooperated to build Exposure Notifications technology that will enable apps created by public health agencies to work more accurately, reliably and effectively across both Android phones and iPhones. Over the last several weeks, our two companies have worked together, reaching out to public health officials, scientists, privacy groups and government leaders all over the world to get their input and guidance.
Starting today, our Exposure Notifications technology is available to public health agencies on both iOS and Android. What we’ve built is not an app — rather public health agencies will incorporate the API into their own apps that people install. Our technology is designed to make these apps work better. Each user gets to decide whether or not to opt-in to Exposure Notifications; the system does not collect or use location from the device; and if a person is diagnosed with COVID-19, it is up to them whether or not to report that in the public health app. User adoption is key to success and we believe that these strong privacy protections are also the best way to encourage use of these apps.
Today, this technology is in the hands of public health agencies across the world who will take the lead and we will continue to support their efforts.
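The privacy design the announcement describes can be sketched with a little code. To be clear, this is a hypothetical, heavily simplified illustration, not the actual Exposure Notification cryptography (the real specification derives identifiers with HKDF and AES); it just shows the shape of the idea: short-lived, random-looking identifiers derived from a key that stays on the device, with no location data involved.

```python
import hmac
import hashlib
import os

def daily_key() -> bytes:
    """A random key generated on-device each day. It never leaves the
    phone unless the user chooses to report a positive diagnosis."""
    return os.urandom(16)

def rolling_identifier(key: bytes, interval: int) -> bytes:
    """The identifier broadcast over Bluetooth for one short time interval.
    It changes constantly, so observers can't track a person over time.
    (HMAC here is a stand-in for the spec's HKDF/AES derivation.)"""
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

key = daily_key()
# Identifiers for consecutive intervals look unrelated without the key...
id_a = rolling_identifier(key, 100)
id_b = rolling_identifier(key, 101)
# ...but anyone who later downloads a voluntarily reported key can
# re-derive the identifiers locally and check for a match against
# identifiers their own phone overheard nearby.
assert id_a == rolling_identifier(key, 100)
assert id_a != id_b
```

The key property is that matching happens entirely on the receiving phone: the server only ever relays keys that diagnosed users chose to share, which is what lets the system notify you of exposure without collecting who you were near or where.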
Just when I thought I was out… You pulled me back in!
Yeah, I did pretty much every single MacBook Pro comparison I could think of. Hit subscribe to see them all. But I didn’t do 16-inch vs. 13-inch, because I figured anyone who wanted the 16-inch would know it.
But… that didn’t stop all you all from asking. And asking. And asking.
So, this… is the MacBook Pro 13-inch vs. 16-inch — FIGHT!
While the 13-inch MacBook Pro just got updated… the design part of it simply never got the memo. It’s still got pretty much the same look and build that it debuted with back in October of aught-16.
Same boxy, unibody aluminum chassis. Same silver and space gray options. Same four USB-C / Thunderbolt 3 ports on the higher-end model, and same 13.3-inch display.
The 16-inch MacBook Pro… well, late last year is when it went to 16 inches.
Not much changed, granted. It’s also got the same boxy, unibody aluminum chassis, silver and space gray options, and four USB-C / Thunderbolt 3 ports as it debuted with back in 2016, but the display has been stretched out from 15 to 16 inches to not just reduce the bezels but increase the usable work space.
And, in an age where competing laptops are going full edge-to-edge, it’s currently the only MacBook Apple has that’s coming close to keeping up.
The difference between the 13.3-inch and 16-inch displays doesn’t just come down to size alone. Not quite.
Sure, one is 2560 by 1600 and the other is 3072 by 1920, but they’re both LCD, both 500 nits of brightness, both wide P3 color gamut, and both have True Tone ambient color temperature matching.
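Those two resolutions actually work out to nearly identical pixel densities once you factor in the diagonal sizes, which is why neither panel looks meaningfully sharper than the other. A quick back-of-the-envelope check in Python, using Apple’s advertised 13.3-inch and 16-inch diagonals:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

print(round(ppi(2560, 1600, 13.3)))  # 13-inch MacBook Pro → 227
print(round(ppi(3072, 1920, 16.0)))  # 16-inch MacBook Pro → 226
```

So the extra pixels on the 16-inch buy you more working area at the same density, not a denser image.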
The subtle difference is that the 16-inch can change refresh rates between 60Hz and 48Hz, so if you produce video like I do for this channel, in glorious 24 frames per second — say it with me — the way nature and Hollywood intended, you can set the 16-inch display to properly show it to you that way while you’re editing.
Which is, yeah, nice to have more than anything else, but it’s really super nice to have.
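The reason 48Hz matters for 24fps footage is simple arithmetic: 48 divides evenly by 24, so every frame is held for exactly two refreshes, while at 60Hz the panel has to alternate between holding frames for two and three refreshes (the 3:2 pulldown cadence), which reads as judder. A tiny sketch of that check:

```python
def refreshes_per_frame(refresh_hz: int, fps: int):
    """Return how many refreshes each frame gets if the rates divide
    evenly, or None if the cadence has to be uneven (judder)."""
    ratio = refresh_hz / fps
    return int(ratio) if ratio.is_integer() else None

print(refreshes_per_frame(48, 24))  # 2: every frame held exactly twice
print(refreshes_per_frame(60, 24))  # None: 60/24 = 2.5, uneven 3:2 cadence
```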
But, the 13-inch is a tiny bit thinner, almost 3 inches skinnier, over an inch less deep, and over a pound lighter.
So, if size matters to you, get the 16-inch.
If portability is more important, get the 13-inch.
Audio / Video
Let’s just get this part out of the way: Both the 13-inch and the 16-inch MacBooks Pro have the same terribad 720p webcam that’s ok in full light but bad and worse in backlight and low light, and pretty much everyone wants Apple to apply their iOS prowess to fixing it up, and soon.
The speakers on the 13-inch aren’t bad, though. In fact, they’re pretty good. High dynamic range, wide stereo, spatial audio, even Dolby Atmos. They make it sound like whatever you’re listening to is right in front of you.
The speakers on the 16-inch, though, are next level. Like Apple shoved a HomePod under the keyboard next level. And they make it sound like whatever you’re listening to is all around you.
Same with the mics. The 13-inch has a 3-mic array that’s ok for video calls in a pinch. The 16-inch has a 3-mic array that’s tuned for high signal-to-noise, what Apple calls “studio quality” and what I’d call equivalent to a mid-range dedicated USB mic. Which is ok even for podcasting or voice-over in a pinch.
And yes, both still have 3.5mm headphone jacks.
So, if you want the best sound possible, you want the 16-inch.
The new high-end 13-inch MacBook Pro — and I’m really only going to focus on the new high-end model for this — starts off with a 2.0GHz quad-core Intel 10th-gen Core i5 but you can spec it out up to a 2.3GHz quad-core i7.
The 16-inch starts off with a 2.6GHz hex-core Intel 9th-gen Core i7 and can go all the way up to a 2.4GHz octo-core i9. Yeah.
Moreover, the high-end 13-inch only comes with Intel Iris Plus graphics, where the 16-inch has Intel UHD Graphics 630 but also comes with a discrete AMD Radeon Pro 5300M and can be upgraded to a Radeon Pro 5500M.
Both have the same T2 ARM-based co-processor for Touch ID, voice-activated Siri, camera and mic security, real-time encryption, accelerators, controllers, and all the other custom silicon advantages.
Neither has Wi-Fi 6, because Apple, while jumping right into it on iOS, seems to be staying the hell away from it on the Mac.
And both promise the same 10hrs of battery life for light workloads, which means roughly the same for heavy workloads as well.
But even though the 16-inch still has previous generation CPUs, it’s got much more powerful GPU options.
So, if performance matters more to you than portability, the 16-inch MacBook Pro is going to give you more.
The latest 13-inch MacBook Pro update means the high-end model now starts at 16GB of RAM and can finally — is finally fair here? — go up to 32GB of RAM.
Likewise, the storage now starts at 512GB but can also finally — also fair here? — go up to 4TB of SSD.
The 16-inch, though… well, that also starts at 16GB of RAM and 512GB of SSD but… but… they can go all the way to 64GB of RAM and 8TB of SSD. 8TB.
So, if you want to run a ton of virtual machines, simulators, or super pro apps, or you want to keep a ton of pro projects and content on your internal drive, you need the 16-inch MacBook Pro.
One of the most important updates to the 13-inch MacBook Pro is the new Magic Keyboard. The same one the 16-inch MacBook Pro got late last year.
That means they both have the new scissor switches, which maintain some of the stability people liked about the butterfly switches, but restore more travel and, so far, reliability, which is what people hated about the butterflies.
I just call them the best of both keyboard worlds.
They also both have distinct escape keys now, Touch Bars, combined Touch ID power buttons, and inverted T arrow keys.
So, while this is a draw, it’s an excellent, excellent draw.
The new high-end 13-inch MacBook Pro starts at $1,799 U.S. That’s for the 4-port, 10th-gen model, which can go all the way up to $3,599 U.S. with all the bells and whistles.
The 16-inch MacBook Pro starts at $2,399 U.S. But, because graphics, RAM, and SSD can go much higher, you can bell and whistle it all the way up to $6,099 U.S.
Now, some people have complained that the prices of the 13-inch and the 16-inch aren’t really that different… But… ok.
There is absolutely an overlap, especially if you tilt your head and squint to look just at that overlap.
But you’re not really paying per inch here. For some pros, they have a Mac Pro or iMac Pro at home and really just want something as light and capable as possible to travel with. For other pros, they work almost exclusively while mobile, so they want the biggest screen and highest performance possible.
And, yeah, those pros typically charge hardware purchases to clients and pay them off with big jobs, so things like portability vs. performance are far more important to them than price.
If price is important to you, though, the 13-inch starts lower. Much lower at the lower end.
So, if you need to do pro-level work on a Mac while mobile and you don’t want to carry a pound or an inch more than you have to, everything else be damned, then you want to carry the 13-inch MacBook Pro.
But, if you need to do most of your pro-level work on a Mac while mobile, and you need the absolute best Mac possible regardless of size or weight, or you just want the better audio and graphics, then you want the 16-inch MacBook Pro.
Apple has announced a new set of Apple Watch Pride bands — Sport Loop and Nike Sport Loop this year — and matching Pride Watch Faces for 2020:
This year’s artfully unique Pride Edition Sport Band is joined for the first time by the new Apple Watch Nike Pride Edition Sport Band. Both are available today from apple.com, the Apple Store app, and Apple stores, and pair beautifully with new matching Pride Watch faces that are coming soon as a part of watchOS 6.
The Pride Sport Loop is next-level manufacturing-at-scale:
The Pride Edition Sport Band is assembled by hand from individual strips of colored fluoroelastomer. The strips are then machine die-cut and compression molded together. This process results in subtle variations in the wavy rainbow design, making each band artfully unique. Durable and strong, yet surprisingly soft, the band drapes elegantly across your wrist and feels comfortable next to your skin. An innovative pin-and-tuck closure ensures a clean fit.
Apple Store: https://www.apple.com/shop/watch/bands