iPhone 14 Leaks — Pure 8K Camera FIRE

iPhone 14. Specifically, the camera. Maybe a new 48 megapixel camera. But megapixels don’t matter, right? Pixel size over pixel count, quality over quantity. But… what if Apple can do both?

I’m Rene Ritchie. YouTube says 70% of you watching still aren’t subscribed, so hit that button and bell, be part of this awesome community, and let’s do this!

According to a just-released report from randomly accurate Haitong International Securities analyst Jeff Pu, Apple is finally — finally — getting ready to ditch their traditional 12 megapixel sensor for a major, quadrupled upgrade to 48 megapixels. Which backs up a previous report from more reliable supply chain exfiltrator and TF International Securities analyst Ming-Chi Kuo.

Now, there are three really big reasons why, which I’ll get to in a minute, but it follows Samsung and other Android phone makers switching to monstrous megapixel counts a few years ago already, and even the equally hesitant Google going all-in on 48-plus megapixel sensors with the Pixel 6 series earlier this year.

See, with traditional cameras, you have huge, honking lenses that pull in a ton of light. But those don’t fit on phones that fit in pockets. Regardless, Samsung, Huawei, and others have been trying to live that best big glass life, strapping Locutus of Borg-like sensors onto the backs of their phones, but they’re still nowhere near big enough, and their algorithms — the computational photography they use to try and bridge the gap — are just still not good enough. And so very try-hard at the same time.

Google, until now, had been sticking with multi-generational hand-me-down sensors backed up first by massive server-side processing and, more recently, just ludicrous localized machine learning. HDR+. Super Res Zoom. And now, they can literally Magic Eraser anything from our memories this side of Tiger King Season 2.

Apple never offered Samsung’s big glass or Google’s big algo, but a really good balance of the two. 12 megapixel sensors with bigger pixel sizes, wider apertures, longer focal lengths, and increasingly useful modes, from Smart HDR to Deep Fusion to Night Mode to Macro to Dolby Vision. Basically, big compute driven in real-time by the best pound-for-pound silicon on the planet.

And, for a long time, 12 megapixels was a real sweet spot for that big compute. Enough data for photos and 4K video but still light enough to be super easy, barely an inconvenience to round trip through the image signal processor, Neural Engine, and GPUs for a ton of stacking, bracketing, analyzing, and rendering. Because that’s the bottleneck. Getting the data through the pipeline in a timely enough fashion. Especially for things like 4K30 Dolby Vision in ProRes.
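
Just to put rough numbers on that bottleneck, here’s some back-of-the-napkin math. The bit depth and frame rate are my assumptions for illustration, not anything Apple has published, but the scaling is the point:

```python
# Rough raw-readout bandwidth, assuming 12-bit raw at 30 frames per second (illustrative only).
bits_per_pixel = 12   # assumption
fps = 30

for megapixels in (12, 48):
    gbits_per_sec = megapixels * 1e6 * bits_per_pixel * fps / 1e9
    print(f"{megapixels} MP at {fps} fps: ~{gbits_per_sec:.1f} Gbit/s of raw sensor data")

# 12 MP -> ~4.3 Gbit/s, 48 MP -> ~17.3 Gbit/s, all before the ISP, Neural Engine,
# and GPU get to stack, bracket, analyze, and render any of it.
```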

So, why go 48 megapixels now?

Well, first, 8K. 12 megapixels is enough for 4K video but nowhere near enough for the four times bigger 8K video. Four because it doubles both vertical and horizontal resolution. So, going to 48 megapixels on the iPhone 14 means being able to capture 8K video on the iPhone 14, which is what a bunch of bigger glass phones have already been able to do for a while now.
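
If you want the napkin math on why 12 megapixels doesn’t cut it:

```python
# Pixel counts per frame at standard UHD resolutions.
uhd_4k = 3840 * 2160   # ~8.3 million pixels, fits comfortably in a 12 MP sensor
uhd_8k = 7680 * 4320   # ~33.2 million pixels, needs way more than 12 MP, fits in 48 MP

print(f"4K: {uhd_4k / 1e6:.1f} MP per frame")
print(f"8K: {uhd_8k / 1e6:.1f} MP per frame")
print(f"8K is {uhd_8k / uhd_4k:.0f}x 4K")   # 4x, because both dimensions double
```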

And even though Apple hasn’t really embraced 8K yet on any of their current products, those new ProRes engines on the M1 Pro and M1 Max can tear through it, and presumably the ones on the A15 Bionic already as well. Never mind what may be a major escalation with the anticipated 3 nanometer A16 that’s coming next. There could also be an Apple TV that supports 8K at some point, and depending on how Apple plays it, it could be super useful for capturing VR content for the dual 4K displays rumored to be on board the upcoming mixed reality headset.

8K sensors also typically make for much better 4K video, thanks to down-sampling. I’m shooting on 5.9K downsampled to 4K right now, for example.

Which brings us to the next reason, pixel binning. What that means is taking groups of pixels, like 4 or 9, and treating them as a single pixel. In this case, taking groups of 4 pixels and treating them as one, so 48 megapixels become 12 megapixels again. It’s pretty much the same idea as a Retina display, but the other way around.
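
If it helps to see it, here’s a tiny Python sketch of 2x2 binning. It’s just the general idea, nothing to do with Apple’s actual sensor or pipeline:

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Sum each 2x2 group of pixels into one, so 4 small pixels act like 1 big one."""
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# Scaled-down stand-in for a 48 MP sensor (the real thing would be roughly 8000 x 6000).
raw = np.random.poisson(lam=10, size=(800, 600))  # fake photon counts
binned = bin_2x2(raw)

print(raw.shape, "->", binned.shape)  # (800, 600) -> (400, 300): a quarter the pixels, 4x the signal each
```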

Now, in some cases, that can be worse, like if you just cut up the same size sensor into even smaller pixels and call it a day. That way you get more, smaller pixels that each capture even less light, less data, because you lose more to the increased number of borders in between.

But if, instead, you take those smaller pixels and make each one ever so slightly bigger again, then when you bin them back together, you end up capturing more light, more data than before.

Which is what Kuo’s report says Apple is going to do. With a 4x binned pixel size of 2.5 microns, which would be WAY bigger than the current iPhone 13 Pro’s already increased pixel size of 1.9 microns.
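
And the reason that 2.5 micron number matters is that light gathering scales roughly with pixel area, not width. Quick napkin math:

```python
# Light-gathering area scales roughly with pixel width squared (back-of-the-napkin only).
iphone_13_pro_pixel = 1.9   # microns, current main camera
rumored_binned_pixel = 2.5  # microns, per Kuo's reported 4x-binned pixel size

area_gain = (rumored_binned_pixel / iphone_13_pro_pixel) ** 2
print(f"~{area_gain:.2f}x the light-gathering area per binned pixel")   # ~1.73x
```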

So, why bother to pixel bin and not just keep the sensor at 12 megapixels with 2.5 micron pixels? Well, 8K, but also for photos, the advantage to pixel binning is that you also don’t have to do it. You can downsample 8K to 4K, but you can also just keep 8K. Same way you can pixel bin 48 megapixels down to 12 megapixels, but you can also just keep the 48 megapixels. And for computational photography, you can stack and bracket both. Another weapon in the war between detail and noise.

Also, it can help with the one giant hole left in Apple’s overall camera system — zoom. The iPhone 13 Pro can do macro, so super close is now covered. And it has a 3x punch-in, thanks to the new effective 77mm telephoto camera. But 3x is still just 3x, and even using Smart HDR to improve digital zoom the way Google uses HDR+-style techniques for Super Res Zoom can only do so much.

But, like with Google’s new Pixel 6 Pro, with 48-plus megapixels you can do more. You can give the people more. Or, more accurately, you can give the zoom more. Maybe.
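
Here’s the basic crop math on why extra megapixels translate into extra zoom. Again, rough numbers, not anything from the reports:

```python
# Cropping a high-megapixel sensor is "free" zoom until you run out of pixels.
sensor_mp = 48

for crop in (1.0, 1.5, 2.0):
    remaining = sensor_mp / crop ** 2
    print(f"{crop}x crop -> {remaining:.0f} MP left")

# A 2x crop of a 48 MP wide angle still leaves a full 12 MP photo with zero upscaling,
# which is exactly the kind of headroom a 12 MP sensor doesn't have.
```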

Samsung, Huawei, and others use periscope cameras for real, high-level optical zoom. Periscope because the lens is too long to be mounted normally so it’s folded sideways across the phone and they use mirrors to point it out the back. And that hasn’t been rumored for Apple until at least the iPhone 14s or iPhone 15 or whatever they call the 2023 model.

But for a real escalation in zoom, Apple would probably have to go to at least 48 megapixels on the telephoto as well as the wide angle. 12 megapixels is still more than fine for the ultra-wide angle. But you’d really want that extra resolution on the tele for 4x or 5x. And current reports suggest Apple is only focusing on 48 megapixels for the main, wide-angle camera this time around.

Now, Apple has always treated the separate cameras as more of an implementation detail. Fusing them together and using them according to need and conditions. So, if you hit 3x on the iPhone 13 Pro, you may get the 77mm f/2.8 telephoto in bright light, but in low light you’ll get the much faster 26mm f/1.5 wide angle instead. It cares way less about which camera it uses and way more about delivering the best photo it can. So maybe Apple will whip up some wild 3x telephoto, 4-5x wide-angle camera fusion with this type of megapixel enhanced system as well. That would dramatically improve the iPhone camera system… again… and also be in keeping with Apple’s traditional balance between optical and computational.
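
To make that concrete, here’s a purely hypothetical toy sketch of that kind of condition-based lens picking. The thresholds and the function name are mine for illustration, not Apple’s actual logic:

```python
# Hypothetical lens-selection heuristic, purely for illustration (not Apple's implementation).
def pick_camera(requested_zoom: float, scene_lux: float) -> str:
    """Prefer the optically matched lens, fall back to cropping the faster wide in low light."""
    if requested_zoom >= 3.0 and scene_lux >= 50:      # bright enough for the slower f/2.8 tele
        return "telephoto 77mm f/2.8, optical 3x"
    if requested_zoom >= 1.0:
        return "wide 26mm f/1.5, digital crop to the requested zoom"
    return "ultra-wide 13mm f/1.8"

print(pick_camera(3.0, scene_lux=500))   # daylight: use the tele
print(pick_camera(3.0, scene_lux=5))     # dim room: crop the faster wide instead
```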