ARKit Mug Demo

Experimental ARKit version of the brand new RR mug, in case you want to try it out around your favorite drinking spots. Looks terrific. Doesn’t hold a beverage anywhere near as well as the real thing.

You can try it out here:

https://standard.tv/collections/rene-ritchie/products/rene-ritchie-logo-mug

Apple Watch 6 — Mental Health Features?

Some of the rumors swirling around the next version of watchOS this June or the next Apple Watch this fall — or both — concern mental health.

Since its inception, the Apple Watch has been getting better and better at helping with physical health. From preventive fitness features like activity rings and workouts, to warning systems like low, high, and irregular heart rate alerts and ECG, to emergency features like international SOS and fall detection.

There have been crossover features that offer some benefit to both mind and body, like the breathing app, and some behavioral assistance — the nature of the Apple Watch makes it not just impractical but impossible to get lost in social networks or games for minutes or hours on end, for example.

But this is the first time we’re hearing about features specifically for mental health, and, as usual, I’m really curious how that may play out.

So, I asked one of the smartest people I know, who also happens to be a licensed psychotherapist, Georgia Dow, to help me figure it out.

You can find Georgia's work at:

https://www.anxiety-videos.com

https://www.westmounttherapy.com

Apple Returns Price of RAM Upgrades on Entry-Level 13″ MacBook Pro to ‘Normal’

MacBook Pro RAM upgrade pricing is now exactly the same as MacBook Air RAM upgrade pricing. So, my very strong guess is that it was somehow “discounted” at launch and has now returned to “normal”.

Nothing to do with component cost/availability, or anything else.

(Apple should honor previous orders at the previous pricing.)

https://www.macrumors.com/2020/05/30/13-inch-macbook-pro-ram-upgrade-doubled/

New Mug. Free month of Nebula.

My brand new store has just gone live and there's a brand new mug to go with it.

Yeah, a mug. Because coffee. Also because I really wanted it to be something practical, something useful, something that could bring comfort right now. Even if you choose to fill it with tea or hot chocolate instead. There’s no wrong beverage here.

To make it even better, anyone who gets the mug also gets a month of Nebula, absolutely free. (If you already subscribe to Nebula, you’ll get the free month added to your existing subscription.)

https://standard.tv/reneritchie

iPhone 12 Camera: Mega Anamorphic Zoom

Rumors are making the rounds that the iPhone 12 or iPhone 13 — one of the upcoming iPhones — will finally increase the megapixel count on the camera system to a number greater than 12. With 64 being the flavor of the week.

I’m doing a whole series on the future of the iPhone. So, hit subscribe to make sure you don’t miss any of it.

Now, megapixels are all about quantity. They don’t always help, and can even hurt, image quality. So what is Apple even thinking?

Well…

Megapixels

Camera sensors are made up of pixels, just like screens are. But, instead of displaying images to you, these are all about capturing them for you.

A megapixel is just a million pixels. In other words, when you multiply the horizontal number of pixels by the vertical number of pixels, the result comes out in the millions.

For 12 megapixels, the current iPhone camera spec, it’s just over 12 million. 4032 x 3024 = 12,192,768, to be specific.
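
If you want to sanity-check that yourself, here’s the arithmetic as a tiny Swift snippet:

```swift
// Megapixel math for the current 12 MP iPhone camera spec.
let width = 4032
let height = 3024

let pixels = width * height                   // 12,192,768 pixels
let megapixels = Double(pixels) / 1_000_000   // ≈ 12.19 megapixels

print("\(pixels) pixels ≈ \(megapixels) MP")
```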

Now, more megapixels aren’t necessarily better. Because you also have to factor in the size of the sensor.

See, the more you chop up a sensor, the greater the number of pixels — and megapixels — you get, but the smaller each of those pixels becomes. And the smaller the pixel, the worse it performs when it comes to capturing light. Which means, the worse the photos it takes.

In other words, they’re super high resolution, they’re just super low quality at the same time. You get more but the more sucks.

What you really want to do to improve image quality is increase the sensor size, which increases the pixel size. Not pixel count. Pixel size. That way, each pixel can take in more light. That’s the kind of more that sucks less.

Within reason. End of the day, you want the best balance possible given the size constraints of the device, and the distortions and aberrations you can get if you push anything too far in any one direction.
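
To make that tradeoff concrete, here’s a rough sketch of how pixel size falls out of sensor size and pixel count. The sensor width here is a hypothetical, illustrative number, not an actual iPhone spec:

```swift
// Hypothetical numbers for illustration only, not real iPhone sensor specs.
let sensorWidthMM = 5.6           // assumed sensor width in millimeters
let horizontalPixels = 4032.0

// Pixel pitch: roughly how wide each pixel is, in microns.
let pitchMicrons = sensorWidthMM / horizontalPixels * 1_000      // ≈ 1.39 µm

// Double the pixels per row on the same sensor and each pixel is half
// as wide, so it catches roughly a quarter of the light.
let finerPitch = sensorWidthMM / (horizontalPixels * 2) * 1_000  // ≈ 0.69 µm
```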

And, for the last few years, Apple figured the best balance was holding the count at 12 megapixels while slowly improving just about everything else.

Pixel Binning

Now, some other phone makers have been really aggressive about increasing megapixel counts instead, especially in markets where they feel people care more about having the highest possible number on a spec sheet.

Recently, Samsung even put a 108 megapixel sensor in the Galaxy S20 Ultra. To keep those pixels from being too tiny, Samsung is using a fairly massive sensor. But, they’re also using pixel binning.

That’s a process where a group of pixels is averaged together to try and get a result that’s better than the sum of the individual parts. In this specific case, binning 9 pixels down to 1, or 108 megapixels back down to 12.

Same with things like quad Bayer filters, where the color filters — red, green, blue, green — cover the pixels in 2 by 2 grids of the same color instead of one by one.

You can shoot the full 108 megapixels if you really want to, and in extremely bright, outdoor, well lit situations, you’ll get usable results, but for indoors or lower light, that binning and that filtering will kick in, hard.
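
As a sketch of the concept — the naive version, not Samsung’s actual pipeline — binning just averages each n-by-n block of raw sensor values down to a single output pixel:

```swift
// Naive pixel binning: average each n x n block of raw sensor values
// down to one output value. With n = 3, that's 9 pixels in, 1 out,
// or 108 megapixels down to 12. Real pipelines are far smarter.
func bin(_ raw: [[Double]], factor n: Int) -> [[Double]] {
    let rows = raw.count / n
    let cols = (raw.first?.count ?? 0) / n
    var out = [[Double]](repeating: [Double](repeating: 0, count: cols),
                         count: rows)
    for r in 0..<rows {
        for c in 0..<cols {
            var sum = 0.0
            for dr in 0..<n {
                for dc in 0..<n {
                    sum += raw[r * n + dr][c * n + dc]
                }
            }
            out[r][c] = sum / Double(n * n)
        }
    }
    return out
}
```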

Image stacking

What Apple — and for that matter, Google — have been doing instead is something called image stacking.

Instead of shooting a ton of pixels and binning them down, the iPhone shoots a ton of exposures in rapid succession and then the image signal processor, the ISP on the A-series chipset, churns through them, pixel by pixel, and picks the best parts of each and every exposure, for each and every pixel.

It started with simple high dynamic range — HDR. Take a photo, take an overexposed one, take an underexposed one, combine them together, and you get a photo with more detail in both the highlights and shadows.
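
In its simplest form, the merge weights each pixel by how well exposed it is in each frame. This is a deliberately toy sketch of the idea, nothing like Apple’s actual Smart HDR pipeline:

```swift
// Toy exposure fusion: for each pixel, weight each bracketed frame by how
// close that pixel is to mid-gray (i.e., well exposed), then blend.
func fuse(_ exposures: [[Double]]) -> [Double] {
    let pixelCount = exposures[0].count
    return (0..<pixelCount).map { i in
        var weightedSum = 0.0
        var totalWeight = 0.0
        for frame in exposures {
            let v = frame[i]                  // 0.0 = black, 1.0 = white
            let w = max(1.0 - abs(v - 0.5) * 2.0, 0.01)
            weightedSum += v * w
            totalWeight += w
        }
        return weightedSum / totalWeight
    }
}

// Usage: under, normal, and over exposures of the same tiny "scene".
let merged = fuse([[0.1, 0.4, 0.9],
                   [0.3, 0.6, 1.0],
                   [0.5, 0.8, 1.0]])
```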

But now, thanks to machine learning, the ISP can detect textures like clouds and ropes, even faces and parts of faces, and then process everything slightly differently to render out the best possible results for each and every element in the scene.

The advantage of pixel binning is speed: it’s doing less work on more data. Image stacking is the opposite: the ISP is fast, but it’s doing way more work on less data.

It’s not hard to imagine that if Apple did both, binning down 4 to 1, from 64 to 16, or something like that, and then applying Smart HDR stacking, semantic rendering, tone mapping, and all the rest of the machine learning, they could potentially get even better results faster.

Zoom

The second part of the iPhone camera rumors involves zoom, which is something I’ve been complaining about for a while.

Huawei and Samsung have added periscopes to their camera systems to greatly increase optical zoom. Basically, because phones just aren’t thick enough for traditional zoom lenses, instead of having them stick half an inch out of the phone, they lay them on their sides, trading depth for width. Then they use mirrors to redirect the light. Like a periscope.

Google, by contrast, took their HDR+ image stacking technology and souped it up to make better digital zoom. Pretty much using the minute movements of the phone while taking a photo to capture extra data, and then computing the hell out of it to make the zoom look way sharper, and the colors way more accurate, than anything the sensor could capture by itself.

Apple could do something similar and adapt Smart HDR into Smart Zoom for the wide angle camera. Then, they could use a better telephoto, not even a periscope, because that’s probably overkill when it comes to return on optics with current technology, and add Smart Zoom on top of it. That could work out to 3x optical and much better 10x digital on the iPhone 12. Maybe even 5x optical and good 20x or more digital on the iPhone 13.

That way you could capture your kids playing sports, your pets in the park, all the sights when you’re traveling — you know, when we can do all that stuff again. One day.

And like I’ve said numerous times before, it’s an area where Apple is currently behind, and not just behind other camera phones, but behind cameras in general.

Anamorphic

The last of the most recent iPhone camera rumors involves a completely new kind of camera. Not a 2 megapixel macro, sorry Marques!

No, an anamorphic lens.

Now, there are historical reasons for why anamorphic lenses exist, but it’s safe to say they remain in use today primarily for the way they look. They’re not round, and so the bokeh they capture — the blurry lights in the background — isn’t round either.

And the lens flares you can get… well, if you’ve ever seen JJ Abrams’ Star Trek movie, you know all about the lens flares you can get with anamorphic.

So, if anamorphic lenses in the modern age are less about practicality and more about artistry, why would Apple of all companies include one?

Maybe, and I say this ever so trepidatiously, that’s exactly the point. Think about it: if Apple of all companies is planning or even considering an anamorphic lens, it would be precisely for the artistry.

I mean, just close your eyes, can’t you just imagine the keynote demo?

Maybe it could be done computationally as well, and on the main wide angle camera, but either way, it’d be something incredibly fun to see.

iPad mini 6 — Will Apple Make it Pro?

The original iPad mini was never something Steve Jobs wanted. When tiny Android tablets started coming out, Steve snarked that you’d have to file down your fingers just to use them.

I’ve talked about similar stories before. Hit subscribe so you don’t miss any of them.

Then Eddy Cue, senior vice president of internet services, read an article laying out all the reasons a smaller iPad made sense, picked up one of the Android half-tabs, came to the same opinion, and convinced Steve to do it.

That, by the way, is one of the biggest differences between Steve Jobs and almost everyone who tries to be the next Steve Jobs — he hired smart people and then, more often than not, he even listened to them.

That’s how Apple got the iPhone and not the Fire Phone.

But, this isn’t about Steve Jobs.

This… is about the iPad mini and where it goes next.

https://www.imore.com/ipad-mini-6-or-ipad-pro-mini-future-apples-tiny-tablet

/r/iPhone AMA

The folks who run /r/iPhone on Reddit were kind enough to ask me on for an AMA. We did it on Discord and it worked incredibly well. Thanks to everyone in the community for their time and for all the questions!

https://iphonediscord.info/ama/reneritchie/

Apple Glasses — Accessory or Platform?

Over the last couple of posts, I’ve talked about why Apple would make Glasses, basically a face-mounted computational display — and what could actually get people like you and me to buy a pair. Hit subscribe so you don’t miss the rest of this series.

One thing I haven’t talked about yet, though, is the developer story… apps.

When the original iPhone launched, it came with only the apps Apple had given it. Including the last-minute addition of YouTube.

Henri Lamiraux’s software team had just finished a forced march, a marathon of sprints, to get version 1 ready to ship, only to have to turn around and start another for version 2, this time to get the frameworks ready for the SDK event and the App Store ready to launch.

Umpteen million apps and billions of dollars later, the App Store literally made apps as mainstream, as popular, as music and TV.

So, it should have come as no surprise that when the original Apple Watch launched, it had not only the apps Apple had given it, but kinda… sorta… an App Store as well.

Back then, on that first version, the App Store lived on the iPhone and so did the app logic. Only the interface and interaction layers were… projected… onto the Watch. All based on Apple’s pre-existing Extensibility frameworks.

Now, thanks to their own forced marches, Apple got app logic onto the Watch in version 2 and brought the store on-device in version 5.

Sure, it’s fair to say that, while there are some legit terrific Apple Watch apps, wrist-based apps in general just haven’t proven as popular or as critical to the Watch’s success as they were the phone’s. Subject for another video. Let me know in the comments if you want to see it.

But, that same process, the process of projectables, could prove even more critical and more valuable for Apple’s next small thing.

I’m Rene Ritchie and this… is the potential for Apple Glasses apps. Please don’t call them glapps.

There’s this common misconception that iOS doesn’t multitask well. It does. Go back to the very first demo when Steve Jobs plays music, surfs the web, checks email, takes a call, and the music comes right back just to see how well it multitasks. Because of course it does, it’s built on the same UNIX underpinnings as macOS.

The difference is that Apple has always tightly controlled when processes can and can’t multitask on iOS. And they did it because they were paranoid about battery life.

Only Apple’s own apps had any kind of background access at first. Then only a tiny handful of specific functions, like streaming music and turn-by-turn navigation. Then things like background refresh. And now, while it’s still by no means a wide open, wild-wild-west of multitasking madness, pretty much every kind of app can persist in the background… at least for a while. RIP bloated, cross-platform social and gaming apps.
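
For a sense of what that looks like in practice today, here’s a minimal sketch of the long-standing UIKit mechanism an app uses to ask for a little extra runtime before the system suspends it:

```swift
import UIKit

// Ask the system for a short window of background execution so in-flight
// work can finish after the user leaves the app. iOS, not the app, decides
// exactly how long "a while" is.
func finishWorkInBackground(_ application: UIApplication) {
    var taskID: UIBackgroundTaskIdentifier = .invalid
    taskID = application.beginBackgroundTask {
        // Expiration handler: time's up, clean up and end the task.
        application.endBackgroundTask(taskID)
        taskID = .invalid
    }

    DispatchQueue.global().async {
        // ... finish the upload, save, or sync here ...
        application.endBackgroundTask(taskID)
        taskID = .invalid
    }
}
```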

iOS always had the potential to handle headless apps as well. That’s where interfaces and feature sets could pop up as remote views in the system and inside other apps. Because, again, same UNIX underpinnings as macOS.

The difference here is that Apple didn’t even start porting that functionality over until later. Because they were even more paranoid about security.

Then came iOS 8 and Extensibility, which I’ve often called one of the most important developments in iOS since… well, iOS.

With it, apps maintained their secure sandboxes but direct user action, things like tapping a button, could poke a small hole in them, just big enough to bring up a Share sheet, or widget, or photo filter, or custom keyboard, or document provider, or… you get the idea.

When that happened, iOS would pull the interface and data from the container app, and then present it to you so you could quickly, easily send a photo to Twitter, swipe down a calculator from anywhere, or get your bitmoji on.

All while never letting it mingle with the host app, so no one could try and trick you into giving them your personal, private information and content.
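
On the host-app side, triggering all of that machinery is close to a one-liner. Here’s a minimal sketch of an app bringing up the Share sheet, which is where those extension-provided options surface:

```swift
import UIKit

// Present the system Share sheet. The options inside it — Messages, Twitter,
// third-party extensions — are pulled from other apps' container extensions;
// the host app never touches their code or data directly.
func share(_ image: UIImage, from viewController: UIViewController) {
    let activityVC = UIActivityViewController(
        activityItems: [image],
        applicationActivities: nil
    )
    viewController.present(activityVC, animated: true)
}
```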

Now, this wasn’t some test bed for Apple Watch apps. It wasn’t part of some master plan. But, when the Watch team was trying to figure out how they could get third-party apps to work on a device as constrained as the original Watch, and realized they’d have to decouple app logic from app interface to do it, wow was Extensibility handy.

Same thing in principle for AirPlay, even CarPlay.

Just like websites were broken down into web services that could exist on any site, binary app blobs were broken down into specific functions that could be called from anywhere.

Google, which values cloud the way Apple values local, has been doing something similar with app fragments for years as well. Because a good idea is a good idea.

At least in theory. If you’ve ever had to wait for an old school watch app to spool up, or an app fragment to download, you know it never turned out nearly as well in practice.

Except, it kind of did. Just not for whole apps or big functions. It’s turned out really great for a bunch of other stuff we use every day, including and especially rich notifications and, on the Apple Watch, rich complications.

Part of it has to do with transmission. Locally, Bluetooth has gotten better, but we can still block it with our own ugly bags of mostly water body parts. From the cloud, even modern Wi-Fi and cellular protocols can and do fail us.

Part of it has to do with compute power. Even the latest Apple Watch can’t run apps, especially heavy apps, as well as an iPhone, iPad, or Mac.

Part of it has to do with interface. The Apple Watch screen is so much smaller it makes touch that much harder, and the Digital Crown is basically a scroll wheel minus the mouse.

Apple Glasses will likely be just as susceptible to wireless interference, have far less compute power, and even less interactive potential and precision than an Apple Watch.

Now, that’s not a show-stopper. Far from it. Just take a look at AirPods.

AirPods are just as dependent on a wireless connection, even AirPods Pro and their fancy 10-core audio engine have far less compute power, and when it comes to interface, there’s precisely zero screen and only a capacitive squeeze or several to work with.

Yet they can relay calls, turn-by-turn directions, workout instructions, even incoming messages.

And somewhere in the middle there, somewhere in the land of rich notifications and complications, is where I think Glasses apps will lie. At least at first.

Actually, if you go back to the original Apple Watch, glances would probably be an even better conceptual fit for glasses than complications.

Developers would get the Glasses as an extension to the existing APNs, the Apple Push Notification service, for quote unquote free, where it just works. But with some additional effort and customization, it would just work even better.
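
Rich notifications are the likely foundation here, and they’re something developers already ship. The Glasses relaying them is pure speculation on my part, but the building blocks would presumably look like today’s UserNotifications code:

```swift
import UserNotifications

// A standard rich local notification. Anything like Glasses delivery is
// speculative; this is simply the existing API such a relay would ride on.
func scheduleWorkoutReminder() {
    let content = UNMutableNotificationContent()
    content.title = "Leg Day"
    content.body = "Guided workout starts in 10 minutes."
    content.sound = .default

    let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 600,
                                                    repeats: false)
    let request = UNNotificationRequest(identifier: "workout-reminder",
                                        content: content,
                                        trigger: trigger)
    UNUserNotificationCenter.current().add(request)
}
```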

And also for glances. So, for example, a guided workout app could pull data from the Apple Watch and show it in your peripheral vision, so you can see your stats without having to pull up a sleeve, pull down a glove, or take your hands off your ski poles or kettlebells.

Same thing with walking directions, Apple Tag detection, and a host of other features that, again, may not be as functional as the watch, much less the phone, but far, far more available. And far less obtrusive and, frankly, rude. Like, you’ll be able to check an event reminder or a to-do without even having to lift or turn your wrist, or reach for your phone.

The reverse will also be true, though: Developers will be able to pull from the Apple Glasses sensors, particularly the LiDAR sensors, to make things like indoor mapping, item location, guided workouts, and other things, maybe even some level of location and spatial-based gaming possible, without you having to hold your iPhone or iPad up for extended periods of time.

And, yeah, there’ll have to be things like Driving Mode, which cuts out any and all distractions when the Glasses detect a certain velocity in general and the shape of a steering wheel in front of you in specific. There’s already a crude version of that on the iPhone. But, believe me, no one wants an “LL Cool J is going live” Insta alert while we’re careening down a highway in a blizzard. Including LL.

Now, maybe one day Apple’s silicon team will have a G4 or G5 system-in-package powerful enough for fully on-board, always-connected apps, like something straight out of an Iron Man movie, with a voice-driven Siri interface that’s finally as functional as, never mind Jarvis or Friday, even just Google or Alexa.

I have a whole rant brewing on that, so if you want to see that video sooner rather than later, hit me up in the comments.

So, here’s how I think it’ll play out at first — Apple will announce the glasses with support for pulling notifications and glances, pushing sensor data like LiDAR, and fine-grained visual verbosity controls to provide information but prevent distraction. Maybe with a cool demo from Twitter, Peloton, Pokemon Go, whatever’s trendy and looks compelling.

But, it will be more of a display than a computer when it comes to third-party apps, at least at first. And then, like the iPhone and Apple Watch, the Glasses will be iterated on and improved over time. Until, yeah, I start to dream of EDITH again.

Apple Glasses — Why Would Anyone Buy Them?

After a few years and a steady stream of reports, noted supply-chain exfiltrator extraordinaire Kuo Ming-Chi updated his forecast, saying he now expected Apple Glasses to launch sometime in 2022.

At that point, the always ascending Jon Prosser of Front Page Tech retorted, saying he’d heard an announcement was planned for as early as this fall’s iPhone event, but may be postponed to next year. Also, that they’d be called Apple Glass and cost $500. Then he went on a plethora of live-streams and podcasts, from the GadgetCast to VypDrive to the CultCast, and added details like they’d have no built-in speakers, but would rely on AirPods, and that there may even be a limited edition version in the shape of Steve Jobs’ famously rounded frames. Think Harry Potter, kids.

Which, of course, set an internet already primed by gold-watch, XDR stand, and Mac Pro outrage once again on fire.

Bloomberg’s biz-prophet, Mark Gurman, who’s also been reporting on the Glasses for years, called the Steve Jobs frames in specific, and the rest of the stories in general, fiction.

Prosser replied that he respected Gurman. Gurman that there were both mixed reality headsets and glasses in the pipeline, which might be confusing. Prosser that he wasn’t confused. And Benjamin Mayo of 9to5Mac that hahahaha. Sorry, strike that, ahahaha.

Gurman had previously reported that the glasses were coming in 2020, but now reported that the announcement had been delayed to 2021 and the release until 2022 or 2023. With the headset being the thing coming earlier, but maybe also not until 2022.

Noted code extractor, Steve Troughton-Smith, said he’d laugh at these rumors floating around if not for people taking them seriously.

Troughton-Smith based that on all the glasses software that had accidentally been included in previous versions of iOS, and suggested that a bunch of noise was being deliberately generated to drown out any real signals.

Others suggested alternate sources, different projects, different prototypes, or just bored work-from-homers, and I paraphrase, trolling for the lulz.

Max Weinbach of XDA Developers, who’s contributed some of his own reporting on live streams and the EverythingApplePro channel, seemed to be on the same page as Prosser.

And Prosser, for his part, basically said wait, watch, and see what happens.

Now, I personally try to never fall victim to any of the classic blunders. The most famous, of course, is never bet against a dread pirate when death is on the line. But, only slightly less well known, is never get involved in a rumor war on the internet.

But none of these people are your dad, so no hashtag Jern or hashtag Gurm in the comments, people.

Personally, as you know, the when of the release date and the what of the hardware aren’t even the parts that fascinate me most.

Though the rumors and the reporters being so conflicted right now is, in and of itself, fascinating in terms of seeing how the small, loud, but important social part of the market responds to the idea of the product itself.

Because, while I covered why I think Apple is making glasses last week — make sure you hit subscribe so you don’t miss any of these videos — the bigger question I have right now isn’t why Apple would make them but… why would any of us buy them?

That wasn’t a question with the iPhone. People need phones. Feature phones were already a mature market and smartphones were primed to move from early adopters to mainstream acceptance. They just needed a push, which turned out to be exactly the iPhone.

It was a big question for the Apple Watch, though. Feature watches, from classic Swiss to deeply digital, were beyond a mature market, but for a variety of reasons, they weren’t at all ubiquitous. Smart watches were also still firmly in the early adopter phase, primed to move not at all.

Would people be willing to even give the Apple Watch a chance on their wrist, either the relatively few who still wore traditional watches for actual time keeping or as jewelry, or the much bigger group that no longer or simply never wore watches at all?

Of course, the ground is now littered with the reputation-corpses of the many pundits who said nope, big old nope, even and especially years after the reverse proved obvious.

Because, it turned out, if you created or nurtured a product with a compelling enough feature set, enough people would switch from traditional watches, or go back to them, or start wearing them for the first time. And now, Apple Watch owns that small market and is almost single-productly driving its expansion.

In the case of the Apple Watch, the most compelling features of the set have been health, fitness, and medical/emergency alerts, with a good peppering of notifications and communications thrown in there to round things out.

It works with your phone, does a few general things the phone does, not as in-depth but more conveniently, but it also does a few very specific things that the phone can’t do at all.

But, glasses feel like an even bigger ask. Watches go on your wrist. Glasses go on your face.

Now, unlike watches, a lot of people, myself included, simply have to wear glasses or contacts for vision correction. We’re used to them. We wear them all day, every day. And switching from dumb to smart glasses won’t be a big ask. Basically, Apple will have us at One More Thing…

But, also unlike watches, I don’t think there’s a large group, at least not in established markets, that have stopped or simply never worn glasses at all, ever.

I think most have worn them opportunistically, for safety or fashion reasons. Sunglasses are the biggest example.

Because safety and fashion are also absolutely compelling enough reasons to put on glasses. I don’t want anything flying into my eyes or hurting my vision, or I just want to look all Top Gun Terminator badass. They’re just worn intermittently. Not all the time. So, rabid fanbase and technophiles aside, will Apple need a ton of stuff after the one more thing to win this group over?

Whatever Activity and Workouts and heart rate monitoring and fall detection and international emergency calling and messages are… but for glasses?

Now, I fully realize the dangers of trying to predict this stuff. I barely anticipated half the features that now make the Apple Watch all but indispensable to me. Yeah, if Ford had asked me, I’d have just said I wanted more convenient horses.

And that’s where I am with Apple Glasses as well. Currently, I just want more convenient horses.

Like I said in the last video: a heads-up display, a HUD like Tony Stark has in the Marvel movies.

A way to get everything from notifications to information, from messages to walking directions, never mind without reaching for my phone but without even having to lift my wrist.

But, I also want a much more granular, nuanced, less obtrusive system for it, because as we’ve all learned in the age of always-on, notifications can so easily turn into annoyance and information into distraction. Always on the wrist is both better and worse for that. Always on the face… I can’t wait and I couldn’t be more terrified.

We’re flattening compute down from the desktop to the laptop to the palm top to the wrist top to, finally, the face top. Shrinking the distance between our devices and ourselves. Bringing our external cybernetics, because that’s really what all this is — everything that amplifies our senses and memories and communications — as close as they can possibly come to being internal.

But I digress. See, I don’t think Apple will have or even need any of those really compelling things at launch.

And they won’t need mainstream adoption at launch either.

They didn’t have it with the iPhone. It took years to build out the feature set and grow the market there. Same with Apple Watch.

But they had cool stuff that the rabid fanbase and technophiles glommed onto, things like pinch to zoom and rubber banding, activities and sketches, that got a lot of attention and made for great, show-offy demos for every early adopter’s more decidedly mainstream family, friends, and colleagues.

And that, combined with new and actually compelling features over the years, be it the App Store or Swim Proofing, iMessage or Always On, increased appeal and removed roadblocks, and step by step, year by year, the products grew and grew.

So, my guess is the Apple Glasses will be the same. They’ll start just as simply, just as crudely even as the iPhone and Apple Watch did. They’ll have a few features that are cool enough to capture the attention of the always hangry media and the wallets of those who just always have to have the next and newest thing.

They’ll provide timely data, like indoor and outdoor walking directions. Critical notifications like who’s messaging or calling you. Things that make it so you don’t even have to look down at your phone or watch, or put down whatever you’re holding or doing to look at any other device.

They’ll be even less capable but they’ll be even more available, and that’ll be good enough to get things started.

Then we’ll get rapid iterations on speed and power, then a redesign, then a lower priced model, and then, before you know it, I’ll be making a video about how Apple Glasses need to be set free of the iPhone, even the Apple Watch.

And that’s before we even get to the real killer feature — being able to sit on calls or really anything boring at all, and secretly be watching Nebula the whole time.

Beyond the rumors, Apple will eventually announce Glasses. They’ll have a few features that excite us nerds but still be very much a version one device. The potential will be there, though. They’ll get a bunch of coverage, including a ton of articles and videos about how very doomed they are and how doomed Apple is for making them. But they’ll be iterated on for the next few years, they’ll grow over the next few years, and within half a decade we’ll have what everyone who blew the initial analysis calls the version Apple should have shipped to begin with. The iPhone 4, iPad Air, or Apple Watch 4 version.

The iPhone 12 Won’t Be Good Enough — Here’s Why!

I’ve seen a lot of discussion lately about the iPhone — or just phones in general — being “good enough”. That most of them already do most of the things most of the people need most of the time, and Apple, Samsung, Google, and everyone else should just chill out and stop updating them every year. iPhone 12? Who needs it. It’s exhausting. Enough already.

But… is it? I mean really?

Now, hear me out… Maybe the iPhone, even the hotly rumored upcoming iPhone 12, maybe all current and expected phones this year, are just scratching the surface of their potential. Of what they can be and what they should be doing for us.

Read why in my latest column for OneZero on Medium:

https://onezero.medium.com/how-apple-can-take-the-iphone-to-the-next-level-5becb48494b