The notch. The horns. The tab. The cutout. Whatever you want to call it. When Apple announced the iPhone X redesign in September of aught-17, when they killed the Home button, Touch ID, and the giant bezels that held and balanced it, when they took the screen corner-to-curved-corner, it was the TrueDepth array — the RGB and infrared cameras, the dot projector, the flood illuminator, the proximity and ambient light sensors, the speaker and microphone — that ruined it, that kept it from being truly full. That kept a splinter in our eyes.
Wow, that got dramatic.
But the simple fact is, it’s been three generations and almost as many years, and not only hasn’t the notch gone away, it hasn’t gotten any narrower or shallower. Not one bit. And some people, well, they still hate its breathing guts.
But, could the iPhone 12, the iPhone 13, any future iPhone really, do anything at all to lessen that pain?
When Apple wanted to take the modern iPhone design full screen, it meant they had to delete the bezels, the Home button, and Touch ID along with them.
Well, not really. Apple could have moved Touch ID to the back of the phone, where some Android phones have had it for years. The Apple logo being a cool-as-hell place to put it.
That, or the power button. Without a Home button, you’d theoretically have to press the power button to wake the phone anyway, so why not just use it to unlock at the same time?
But Apple had a few other things going on. First, they were getting into AR — augmented reality — and that meant getting depth-capable cameras on the front as well as on the back.
Second, Apple likes to flex. Hard. And by that I mean they like to implement things other companies just don’t have the processing power, integration, budgets, patents, or engineering to match. At least not for a couple or few years.
That’s why iOS 7 had so many damn layers of Gaussian blur effects built into it. Literally no other company had the silicon to copy it.
Same with Face ID. Other companies had done face scanning before, but it was comically, tragically bad. They’d take a photo and compare it to another photo which meant, say it with me, it could be fooled by a photo.
Because Apple had a depth camera, they could scan not for the pixels of your face but for the geometry that made them. The actual structure of your head in the triangle between your eyes and nose, which is way more secure. In all cases but an evil twin, even more secure than Touch ID.
But, yeah, the nose part is why it doesn’t work with a normal mask on. It can’t get enough facial geometry points to reliably authenticate you, and if you remove those points, it’s just not secure enough to use as a means of authentication. But that’s a topic for another video.
So, anyway, Apple went with the TrueDepth array and Face ID. And, yeah, almost three years later and only Google’s been able to duplicate it, and it took even them until last year to do it.
Wicked hard flex.
Once all the TrueDepth components were locked and loaded into the front of the phone, Apple had to figure out how to house them.
They could have given up on a corner-to-corner display, of course. Delete the chin and Touch ID, but keep the forehead.
I mean, a bunch of Android phones had gone real thin up top but couldn’t or wouldn’t fold the display back like Apple was doing at the bottom, so they had to keep a little or a lot of a chin. This would have just been the reverse of that.
Google, famously — infamously maybe? — would go on to make the mother of all bathtub notches on the Pixel 3, even before they had a real depth camera and needed one, likely just to dogfood it for other vendors, only to retreat, retreat, retreat back to a forehead when they actually added a real depth camera with the Pixel 4.
But Apple wanted corner-to-corner, and to be able to shove the status bar all up into those corners, to make it look even more expansive.
So, Apple spent another ton of engineering resources carving first the OLED on the iPhone X, then the LCD on the iPhone XR, around the TrueDepth array with super-ellipses, sub-pixel masking, the whole over-the-top bit.
In other words, Apple figured if they were going to have a notch — or horns as Craig Federighi referred to them back then — they’d lean all the way into them.
Even use them in the icons the way they used to use the Home screen.
But, when it came time to bring Face ID to the iPad Pro, Apple didn’t go with the notch. They deleted only about half the bezel and kept a partial forehead to house the TrueDepth array. Along with a partial chin and… I dunno… partial jowls? — along the sides to keep everything balanced.
Now, Apple treats the iPhone as a portrait-mode device, with a proper, definite top and bottom. The iPad, by contrast, they treat as a device worth rotating up, down, all the way around. So, having a notch would look even more visually disturbing on an iPad than it does on an iPhone.
But, it does show that Apple could have, and still could, flatten out the notch… if they wanted to.
When it came time for Samsung to go full screen, they did something different. They’d tried doing bad facial recognition and good iris scanning but they couldn’t or wouldn’t integrate the two into something more useful. So, they abandoned it for an under-display fingerprint reader of middling quality, at least at first, but that meant they didn’t have anything like a TrueDepth array to work around.
So, Samsung could literally cut the screen just around the single or dual RGB cameras on the front. Like hole punches. And they could get better and better at doing it until the holes were barely any bigger than the cameras.
That let the display go not just corner-to-corner but truly edge-to-edge. So, as much as the notch gives some extra pixels in the corners, the hole punches give even more along the edge and in between the cameras.
They still get in the way of doing anything truly expansive or clean, and they still stare at you, unblinking, like lidless eyes wreathed in OLED… just… less.
Apple could go with something similar but, unlike Samsung, they’d have to deal with more than just one or two RGB cameras. They’d have to have cutouts for the dot projector, flood illuminator, and infrared camera as well. Assuming the audio and sensors could be pushed up and maybe even on top, that’s still 4 cutouts instead of 1 or 2.
And… while it would be more in keeping with the spider-eye pattern on the new Mac Pro, objectively, it just doesn’t seem any better looking than a notch.
To avoid a notch, a hole punch, or anything that would tarnish the perfection of a true, full screen, some other companies have used mechanical choochers what pop the front-facing cameras up and down or spin them around.
The pros are obvious: Because the camera doesn’t have to be in the screen, you get that true, full screen. Also, if you use the spin-around version rather than the up and down version, you get the benefit of the rear-facing camera system which is typically much, much better than any front-facing one.
The cons are also obvious: No matter how fast a mechanical choocher chooches, it’s never going to be as fast as a camera that requires zero choochage. Is too a word. Also, mechanical parts are potential points of failure, not just in terms of breaking down but for dust and liquid ingress.
That might be why OnePlus, which had a peek-a-boo camera on the 7 Pro, ditched it for a hole punch on the 8 Pro.
It’s also probably why it’s not even rumored as being on Apple’s radar. Apple, the company that deleted the headphone jack in 2016, replaced the mechanical Home button with a Taptic one the very same year only to delete it entirely the very next year, and is even rumored to be deleting the Lightning port, likely to replace it with a smart connector, in another year or so.
I did a video on that already, hit subscribe so you don’t miss any of this series.
Anyway, if Apple is simplifying the mechanics, they probably won’t be adding any mechanisms like choochers, at least not any time soon.
On the experimental if not yet mainstream side, we also have companies like Oppo and Xiaomi prototyping in-display cameras. In other words, cameras that don’t have the screen cut away around them — at all — but rather have the screen paved right over them.
When you’re not using the camera, the screen just looks like a big, beautiful, full-on screen. When you do use it, though, the screen becomes transparent and you can shoot right through it.
I think it’s fair to say this stuff is all in the prototype stage still, the concept stage even, and comes with enough limitations and constraints that it’s easy to understand why it hasn’t even made its way into Samsung phones yet. And Samsung is willing to try almost anything even remotely commercially viable at scale.
There have been rumors of Apple testing under-display versions of Touch ID.
Both optical and ultrasonic versions of those have been shipping for a few years now. Most major Android vendors have shipped them, including Samsung.
The early ones were… rough… but they’ve been getting better each generation.
Would Apple double back, ditch Face ID, go full screen, and put Touch ID under it?
That would certainly solve the mask problem but bring back the glove problem. So, personally, I hope Apple would go for both.
And having them both under-display sounds like the most Apple of any of these solutions. At least eventually.
Before Apple goes fully notchless, there are some rumors saying they’ll go notch-less.
Moving some of the components like the speaker system up so that other components like the infrared system can be condensed closer together.
Still a notch but not as much of a notch, if the rumors prove true.