Categories
Uncategorized

iPhone 12 and the Potential Return of Touch ID

Back in 2017, Apple deleted Touch ID and replaced it with Face ID. They didn’t keep both. They didn’t add Face ID to Touch ID. They went to the brave new world of biometrics and burned the old one down behind them.

But, the ashes had barely even settled before we started getting rumors of Touch ID’s return. Its resurrection.

Not just in the new iPhone SE, which some people loved precisely because it had Touch ID, but in the flagship iPhones as well.

But, why would Apple kill it just to bring it back to life?

Touch ID

There were fingerprint scanners before Touch ID, of course, but none of them were as elegant, reliable, or secure as what Apple shipped with the iPhone 5s back in 2013.

That’s how Apple typically rolls. They wait. They watch. They identify problems. They prototype solutions. And, if and when they’re happy, they ship.

In Touch ID’s case, Apple went with a capacitive scanner instead of optical, so it was harder to fool with an image or prosthetic. Apple converted the fingerprint data to math, transmitted it through a locked-down hardware channel, stored it in a secure element inside their custom 64-bit chipset, never let it get off-device, and only ever released a yes-or-no token through the system, so individual apps never saw the fingerprint data itself.
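That match-on-device, boolean-out design can be sketched in miniature. This is a toy model, not Apple’s actual implementation — the class, the hash choice, and the byte strings are all illustrative assumptions:

```python
import hashlib
import hmac

# Toy model of the "yes or no token" design: the raw scan is reduced to
# a one-way digest inside the "secure element", and callers only ever
# get back a boolean. (Illustrative only — not Apple's implementation.)

class SecureElement:
    def __init__(self, enrolled_scan: bytes):
        # Store a derivation of the scan, never the scan itself.
        self._template = hashlib.sha256(enrolled_scan).digest()

    def authenticate(self, scan: bytes) -> bool:
        # The only output that crosses the boundary: True or False.
        candidate = hashlib.sha256(scan).digest()
        return hmac.compare_digest(self._template, candidate)

element = SecureElement(enrolled_scan=b"ridge-pattern-of-owner")
print(element.authenticate(b"ridge-pattern-of-owner"))  # True
print(element.authenticate(b"attacker-finger"))         # False
```

The point is the narrow interface: no matter what a caller does, the only thing that ever leaves the secure boundary is a yes or a no.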

This when, even years later, other vendors were caught leaving full fingerprint photos in world-readable directories on their phones and other fun stuff.

But, Touch ID wasn’t perfect. Far from it.

It didn’t work with gloves on or when your finger was excessively dry or wet, and if you cut or burned your finger, you’d basically have to re-register with a different finger.

And you had to physically touch the sensor for it to scan your fingerprint.

Also, everyone started doing good fingerprint scanners eventually, so where’s the flex in that?

Face ID

There were also face scanners before Face ID, of course, but none of them were as elegant, reliable, or secure as what Apple shipped with the iPhone X back in 2017 either.

Apple didn’t scan the image of your face, they scanned the geometry, so the system couldn’t be fooled by images or even masks. They converted that geometry to math, then used an adversarial neural network not only to compare readings and models, but to evolve them over time and challenge them to reduce the chance of future spoofing.
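As a toy sketch of that “compare and evolve” idea — not Apple’s actual algorithm, and with a made-up threshold and learning rate — you can model it as comparing feature vectors and blending accepted scans back into the stored template:

```python
import math

# Toy sketch of "compare and evolve": match a scanned feature vector
# against a stored template, and blend accepted scans into the template
# so it tracks gradual changes. THRESHOLD and LEARN_RATE are made up;
# this is not Apple's algorithm.

THRESHOLD = 0.5
LEARN_RATE = 0.1

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_and_adapt(template, scan):
    """Return (accepted, updated_template)."""
    if distance(template, scan) <= THRESHOLD:
        # Accepted: nudge the template toward the scan so gradual
        # changes (beard, glasses) keep authenticating over time.
        updated = [t + LEARN_RATE * (s - t) for t, s in zip(template, scan)]
        return True, updated
    return False, template

template = [1.0, 2.0, 3.0]
ok, template = match_and_adapt(template, [1.1, 2.0, 2.9])  # close match
print(ok)  # True
ok, _ = match_and_adapt(template, [9.0, 9.0, 9.0])         # imposter
print(ok)  # False
```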

This when other vendors were using scanners that could be fooled by photos and videos or wouldn’t work in low light, much less the dark.

But, surprise surprise, Face ID isn’t perfect either. Not at all.

It doesn’t work with face masks on, or if the sun is blinding the camera, or if you’re wearing sunglasses that block infrared light.

And, you have to stick your face in front of the camera for it to scan you.

Also, other companies have now started shipping similar facial geometry sensors as well, so even that flex just isn’t quite so flexy any more.

In-Display

One of the reasons Apple switched from Touch ID to Face ID was because they wanted to get rid of the Home button that housed Touch ID and take the iPhone display corner-to-corner.

They could have left the Touch ID sensor there and made a notch for the bottom to match the one on top. Yeah, let that sink in for a hot second.

They could have moved Touch ID to the power button or to the back of the phone where the Apple logo is.

But, Apple didn’t like any of those options. So we got Face ID instead.

Other vendors, though, also wanted to go edge-to-edge but didn’t have Face ID as an option, so they started experimenting with something else — in-display fingerprint readers.

Now, sure, many of them already had their version of fingerprint scanners on the back or in the button, so they could have just left them there and gone about their full-screening. But Android phone makers want to flex too.

So, we first got optical in-display fingerprint scanners. Which had all the downsides of traditional optical scanners, including the need to light up the fingerprint to take a picture. And because they’re only taking a picture, they’re easier to fool with pictures and prosthetics, though vendors try to use hardware and software countermeasures to mitigate that.

Next, we got ultrasonic in-display fingerprint scanners. Those use sound waves to build a 3D model instead of a 2D image, making them slightly harder to spoof.

All of the early implementations were kind of terrible though. Slow. Unreliable. Error-prone.

But, over the last couple of years, they’ve gotten better and better.

Maybe, finally, good enough for Apple to copy?

iPhone 12

Back in January of this year, supply chain exfiltrator extraordinaire, Kuo Ming-Chi claimed that Apple was readying a full-screen, entry-level iPhone for early 2021 that would have Touch ID not in the display but in the power button. Presumably because a now-traditional capacitive fingerprint scanner in the button would be cheaper than either an in-display scanner or Face ID.

I’ll do a whole video on that phone, a full screen LCD iPhone that would sit somewhere between the classic design of the iPhone SE at $399 and the all-OLED iPhone 12 lineup starting at $649. So, make sure you hit subscribe if you haven’t already.

But this isn’t about that. No, because way back in August of 2019, Kuo also said that Apple was working on an ultrasonic in-display fingerprint scanner for the iPhone 13 in 2021.

Kuo figured that thickness, sensing area, power efficiency, and lamination would all be solved enough by then for Apple to implement their own in-display patents.

Touch ID wouldn’t be seeking revenge or anything, killing Face ID the way Face ID killed it. Rather, it would be peace in our time — both Touch ID and Face ID living together in perfect harmony.

Mark Gurman and Debby Wu of Bloomberg followed up on that in September of 2019, saying Apple could ship the in-screen Touch ID system as soon as the iPhone 12 in 2020, but that it might slip to the iPhone 13 refresh in 2021 instead.

Also, that it would work on a large portion of the display, which is handy when you don’t have a precise, tactile, physical target like a Home button to feel and hit, and that it would work in tandem with Face ID. Which could mean either one, for added convenience, or both, for added security.

Prior to Kuo and Bloomberg, way back in March 2019, Barclays analyst Blayne Curtis also reported that Apple would add ultrasonic, in-display Touch ID to the 2020 iPhone.

But, given everything that’s gone on with closures in Chinese manufacturing centers and U.S. design and engineering centers this year, it seems less likely we’ll get Touch ID back this year and more likely we’ll get it pushed back to next.

Bummer, I know, because it doesn’t look like the need for masks and gloves is going anywhere any time soon.


iPad Air 4 Rumor Analysis

Roll-of-the-dice accurate Mac Otakara dropped some rumor bombs this week about a new 4th-generation iPad Air. When you combine them with previous rumors from supply chain exfiltrator extraordinaire, Kuo Ming-Chi, and the sometimes lucid L0vetodream Twitter account over the last couple of months, it provides a compelling vision for what that next-gen iPad Air might be. That, or a tantalizing fantasy.

Read the rest in my weekly column at iMore:

https://www.imore.com/ipad-air-4-making-pro-features-mainstream


iPhone 12 and Deleting the ‘Notch’

The notch. The horns. The tab. The cutout. Whatever you want to call it. When Apple announced the iPhone X redesign in September of ought 17, when they killed the Home button, Touch ID and the giant bezels that held and balanced it, when they took the screen corner-to-curved-corner, it was the TrueDepth array — the RGB and infrared cameras, the dot projector, the flood illuminator, the proximity and ambient light sensors, the speaker and microphone — that ruined it, that kept it from being truly full. That kept a splinter in our eyes.

Wow, that got dramatic.

But the simple fact is, it’s been three generations and almost as many years, and not only hasn’t the notch gone away, it hasn’t gotten any narrower or shallower. Not one bit. And some people, well, they still hate its breathing guts.

But, could the iPhone 12, the iPhone 13, any future iPhone really, do anything at all to lessen that pain?

The Notch

When Apple wanted to take the modern iPhone design full screen, it meant they had to delete the bezels, the Home button, and Touch ID along with them.

Well, not really. Apple could have moved Touch ID to the back of the phone, where some Android phones have had it for years. The Apple logo being a cool-as-hell place to put it.

That, or the power button. Without a Home button, you’d theoretically have to press the power button to wake the phone anyway, so why not just use it to unlock at the same time?

But Apple had a few other things going on. First, they were getting into AR — augmented reality — and that meant getting depth-capable cameras on the front as well as on the back.

Second, Apple likes to flex. Hard. And by that I mean they like to implement things other companies just don’t have the processing power, integration, budgets, patents, or engineering to match. At least not for a couple or few years.

That’s why iOS 7 had so many damn layers of Gaussian blur effects built into it. Literally no other company had the silicon to copy it.

Same with Face ID. Other companies had done face scanning before, but it was comically, tragically bad. They’d take a photo and compare it to another photo which meant, say it with me, it could be fooled by a photo.

Because Apple had a depth camera, they could scan not for the pixels of your face but for the geometry that made them. The actual structure of your head in the triangle between your eyes and nose, which is way more secure. In all cases but an evil twin, even more secure than Touch ID.

But, yeah, the nose part is why it doesn’t work with a normal mask on. It can’t get enough facial geometry points to reliably authenticate you, and if you remove those points, it’s just not secure enough to use as a means of authentication. But that’s a topic for another video.

So, anyway, Apple went with the TrueDepth array and Face ID. And, yeah, almost three years later and only Google’s been able to duplicate it, and it took even them until last year to do it.

Wicked hard flex.

The forehead

Once all the TrueDepth components were locked and loaded into the front of the phone, Apple had to figure out how to house them.

They could have given up on a corner-to-corner display, of course. Delete the chin and Touch ID, but keep the forehead.

I mean, a bunch of Android phones had gone real thin up top but couldn’t or wouldn’t fold the display back like Apple was doing at the bottom, so had to keep a little or a lot of a chin. This would have just been the reverse of that.

Google, famously — infamously maybe? — would go on to make the mother of all bathtub notches on the Pixel 3, even before they had a real depth camera and needed one, likely just to dogfood it for other vendors, only to retreat, retreat, retreat back to a forehead when they actually added a real depth camera with the Pixel 4.

But Apple wanted corner-to-corner, and to be able to shove the status bar all up into those corners, to make it look even more expansive.

So, Apple spent another ton of engineering resources carving first the OLED on the iPhone X, then the LCD on the iPhone XR, around the TrueDepth array with super-ellipses, sub-pixel masking, the whole over-the-top bit.

In other words, Apple figured if they were going to have a notch — or horns as Craig Federighi referred to them back then — they’d lean all the way into them.

Even use them in the icons the way they used to use the Home screen.

But, when it came time to bring Face ID to the iPad Pro, Apple didn’t go with the notch. They deleted only about half the bezel and kept a partial forehead to house the TrueDepth array. Along with a partial chin and… I dunno… partial jowls? — along the sides to keep everything balanced.

Now, Apple treats the iPhone as a portrait-mode device, with a proper, definite top and bottom. The iPad, by contrast, they treat as a device worth rotating up, down, all the way around. So, a notch would look even more visually disturbing on an iPad than it does on an iPhone.

But, it does show that Apple could have, and still could, flatten out the notch… if they wanted to.

Hole punches

When it came time for Samsung to go full screen, they did something different. They’d tried doing bad facial recognition and good iris scanning but they couldn’t or wouldn’t integrate the two into something more useful. So, they abandoned it for an under-display fingerprint reader of middling quality, at least at first, but that meant they didn’t have anything like a TrueDepth array to work around.

So, Samsung could literally cut the screen just around the single or dual RGB cameras on the front. Like hole punches. And they could get better and better at doing it until the holes were barely any bigger than the cameras.

That let the display go not just corner-to-corner but truly edge-to-edge. So, as much as the notch gives some extra pixels in the corners, the hole punches give even more along the edge and in between the cameras.

They still get in the way of doing anything truly expansive or clean, and they still stare at you, unblinking, like lidless eyes wreathed in OLED… just… less.

Apple could go with something similar but, unlike Samsung, they’d have to deal with more than just one or two RGB cameras. They’d have to have cutouts for the dot projector, flood illuminator, and infrared camera as well. Assuming the audio and sensors could be pushed up and maybe even on top, that’s still 4 cutouts instead of 1 or 2.

And… while it would be more in keeping with the spider-eye pattern on the new Mac Pro, objectively, it just doesn’t seem any better looking than a notch.

Mechanical choochers

To avoid a notch, a hole punch, or anything that would tarnish the perfection of a true, full screen, some other companies have used mechanical choochers what pop the front-facing cameras up and down or spin them around.

The pros are obvious: Because the camera doesn’t have to be in the screen, you get that true, full screen. Also, if you use the spin-around version rather than the up and down version, you get the benefit of the rear-facing camera system which is typically much, much better than any front-facing one.

The cons are also obvious: No matter how fast a mechanical choocher chooches, it’s never going to be as fast as a camera that requires zero choochage. Is too a word. Also, mechanical parts are potential points of failure, not just in terms of breaking down but for dust and liquid ingress.

That might be why OnePlus, which had a peek-a-boo camera on the 7 Pro, ditched it for a hole punch on the 8 Pro.

It’s also probably why it’s not even rumored as being on Apple’s radar. Apple, the company that deleted the headphone jack in 2016, replaced the mechanical Home button with a Taptic one the very same year only to delete it entirely the very next year, and is even rumored to be deleting the Lightning port, likely to replace it with a smart connector, in another year or so.

I did a video on that already, hit subscribe so you don’t miss any of this series.

Anyway, if Apple is simplifying the mechanics, they probably won’t be adding any mechanisms like choochers, at least not any time soon.

In-display

On the experimental if not yet mainstream side, we also have companies like Oppo and Xiaomi prototyping in-display cameras. In other words, cameras that don’t have the screen cut away around them — at all — but rather have the screen paved right over them.

When you’re not using the camera, the screen just looks like a big, beautiful, full-on screen. When you do use them, though, the screen becomes transparent and you can shoot right through them.

I think it’s fair to say this stuff is all in the prototype stage still, the concept stage even, and comes with enough limitations and constraints that it’s easy to understand why it hasn’t even made its way into Samsung phones yet. And Samsung is willing to try almost anything even remotely commercially viable at scale.

There have been rumors of Apple testing under-display versions of Touch ID.

Both optical and ultra-sonic versions of those have been shipping for a few years now. Most major Android vendors have shipped them, including Samsung.

The early ones were… rough… but they’ve been getting better each generation.

Would Apple double back, ditch Face ID, go full screen, and put Touch ID under it?

That would certainly solve the mask problem but bring back the glove problem. So, personally, I hope Apple would go for both.

And having them both under-display sounds like the most Apple of any of these solutions. At least eventually.

Notch-less

Before Apple goes fully notchless, there are some rumors saying they’ll go notch-less.

Moving some of the components like the speaker system up so that other components like the infrared system can be condensed closer together.

Still a notch but not as much of a notch, if the rumors prove true.


How To Protect Your iPhone Privacy and Security

Out of the box, your iPhone tries to strike a balance between security and safety, privacy and convenience.

If you have location on, especially for social apps and photos, they can be used to find you and others in case of emergency. But they can also be used to place you and others in situations of tyranny.

Videos can help document and prosecute abusers but can also be used to identify and target people standing up to abuse.

Online backup can save your data even if your iPhone is taken or damaged, but can also expose your data to extralegal seizure from servers.

Notifications and biometrics can help you get things done faster but can also let others get to your things faster.

If you’re at home, you probably want to keep your iPhone set to maximum convenience. Also, if you’re a bad actor of any kind, leave it like that as well. Please and thank you.

But, if you’re putting yourself in harm’s way, if you’re on the front lines as press or health care, or to protect the rights, civil liberties, and simple dignity of your family, community, and country, then you should absolutely lock your iPhone down. Hard.

In situations where you fear for your security, the best thing to do is get a second, burner phone and lock that down. An older iPhone or iPhone SE works great. If you have the time or money.

Either way, here’s how to protect your data. In the extreme.

Passcode

Go into Settings > Touch ID or Face ID and Passcode, and make sure you have a 6-digit numeric passcode at the very least or, better yet, an alphanumeric one. 4-digit passcodes are far too easy for a bad actor to break. I’ll put a link in the description for how to choose a good one.
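The arithmetic behind that advice is simple: the search space is the alphabet size raised to the passcode length, so every extra character multiplies an attacker’s work. A quick sketch:

```python
# Brute-force arithmetic behind the advice: the search space is
# alphabet_size ** length, so each extra digit or symbol class
# multiplies an attacker's work.

def combinations(alphabet_size: int, length: int) -> int:
    return alphabet_size ** length

print(combinations(10, 4))   # 4-digit PIN:  10000
print(combinations(10, 6))   # 6-digit PIN:  1000000
print(combinations(62, 8))   # 8 chars of a-z, A-Z, 0-9: ~2.18e14
```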

Also, turn on Erase Data after 10 failed attempts. It sounds scary, but it’s extremely difficult to trigger by accident, even if you have kids, and even more difficult for anyone else to get around.

Lock screen

While you’re in there, turn off Allow Access When Locked for everything. Widgets, Notifications, Control Center, Message Replies, everything.

That way people can’t see who’s messaging you or what apps you’re using, or do things like put your iPhone into Airplane Mode so you can’t trace it later.

Then jump into Settings > Siri and turn off Allow Siri When Locked.

That way they can’t use voice to access your data or use your apps while your iPhone is still locked.

Basically, no passcode, no joy.

You’ll still be able to take photos and videos and answer calls quickly and easily, but everything else will require unlocking your iPhone first.

Stop Send as SMS

Go to Settings > Messages and turn off Send as SMS.

iMessage is end-to-end encrypted. Text messages are not. So, you want to prevent a private iMessage from unintentionally going out over an insecure text message channel where your carrier or anyone who exploits them can read it. By error, by accident, for any reason.

If you need to communicate with people who don’t have iPhones or iMessage, download, install, and get everyone on Signal. Not Facebook Messenger or WhatsApp. Signal.

It’s a hassle, I know, but it’s secure.

Break iCloud Backup

Go to Settings > Apple ID > iCloud, and turn off messages, photos, and any other backups you want to prevent.

For most people, most of the time, losing messages from loved ones or photos of family is a bigger risk and worse outcome than someone hacking them or having them subject to extralegal seizure. That’s why iCloud backup is fail safe, not fail secure.

But, if you’re in a situation where you absolutely can’t risk anyone else getting your data from a server — which is just a fancy word for someone else’s computer — shut all the online off and backup locally and with encryption via iTunes or the Finder app. I’ll leave a link for how to do that in the description as well.

Lock Down Local

If at any point you think you’re in imminent danger, squeeze your iPhone with your hand over the power and volume buttons. That puts it into SOS mode.

Technically, it puts your iPhone into an unauthorized state. Into PreBoard, where you have to enter your passcode or password to decrypt the device and regain access to Springboard, which is your apps and the data contained within them.

Also, if for any reason you’ve left biometrics on, it prevents that from working until you’ve authenticated as well.

If you keep squeezing, it’ll also sound an audio alarm and start calling emergency services for you.

If that’s helpful to you under whatever circumstances you find yourself in the moment, keep squeezing. Otherwise, let go and your iPhone is as locked as it can get.

Kill Significant Locations

Go into Settings > Privacy > Location and turn off everything you can. Especially Camera if you don’t want your photos GeoTagged — in other words, stamped with the coordinates of where they were taken.

Again, it can be useful for finding people but can also be used to place people.

While you’re there, go to Location Services > System Services > Significant Locations, and shut that down.

When your iPhone tells you to leave now for home or work or wherever to make it in time, it’s because it’s learned, over time, where you typically go and when. It’s end-to-end encrypted and Apple can’t read it, but if you hate even the idea of having that data stored on your phone, turn it off.

Slow Your Snaps

Go to Settings > Camera and turn off Live Photos if you don’t want your photos also capturing a few seconds of video and audio as well.

You might want that, but if you don’t, turn it off.

Also, on the iPhone 11, Capture Outside Frame can reveal what’s just outside your photo as well. Again, you might want that, but if you don’t, end it.

Think twice

Again, if you’re at home. If you’re using your iPhone to keep in touch with those you love, to organize, to share information, to post to Twitter, and Facebook, and TikTok so that people can see and hear what’s happening and not have to take any pundit’s or politician’s or giant news corp’s word for it, then you probably want to balance your security and convenience, your privacy and safety.

Otherwise, lock things down as needed, when needed.


ARKit Mug Demo

Experimental ARKit version of the brand new RR mug, in case you want to try it out around your favorite drinking spots. Looks terrific. Doesn’t hold a beverage anywhere near as well as the real thing.

You can try it out here:

https://standard.tv/collections/rene-ritchie/products/rene-ritchie-logo-mug


Apple Watch 6 — Mental Health Features?

Some of the rumors swirling around the next version of watchOS this June or the next Apple Watch this fall — or both — concern mental health.

Since its inception, the Apple Watch has been getting better and better at helping with physical health. From preventative fitness like activity rings and workouts, to warning systems like low, high, and irregular heartbeats and ECG, to emergency features like international SOS and fall detection.

There have been crossover features that offer some benefit to both mind and body, like the breathing app, and some behavioral assistance — the nature of the Apple Watch makes it not just impractical but impossible to get lost in social networks or games for minutes or hours on end, for example.

But this is the first time we’re hearing about features specifically for mental health, and, as usual, I’m really curious how that may play out.

So, I asked one of the smartest people I know, who also happens to be a licensed psychotherapist, Georgia Dow, to help me figure it out.

You can find Georgia's work at:

https://www.anxiety-videos.com

https://www.westmounttherapy.com


Apple Returns Price of RAM Upgrades on Entry-Level 13″ MacBook Pro to ‘Normal’

MacBook Pro RAM upgrade pricing is now exactly the same as MacBook Air RAM upgrade pricing. So, my very strong guess is that it was somehow “discounted” at launch and has now returned to “normal”.

Nothing to do with component cost/availability, or anything else.

(Apple should honor previous orders at the previous pricing.)

https://www.macrumors.com/2020/05/30/13-inch-macbook-pro-ram-upgrade-doubled/


New Mug. Free month of Nebula.

My brand new store has just gone live and there's a brand new mug to go with it.

Yeah, a mug. Because coffee. Also because I really wanted it to be something practical, something useful, something that could bring comfort right now. Even if you choose to fill it with tea or hot chocolate instead. There’s no wrong beverage here.

To make it even better, anyone who gets the mug also gets a month of Nebula, absolutely free. (If you already subscribe to Nebula, you’ll get the free month added to your existing subscription.)

https://standard.tv/reneritchie


iPhone 12 Camera: Mega Anamorphic Zoom

Rumors are making the rounds that the iPhone 12 or iPhone 13 — one of the upcoming iPhones — will finally increase the megapixel count on the camera system to a number greater than 12. With 64 being the flavor of the week.

I’m doing a whole series on the future of the iPhone. So, hit subscribe to make sure you don’t miss any of it.

Now, megapixels are all about quantity. They don’t always help, and can even hurt, image quality. So what is Apple even thinking?

Well…

Megapixels

Camera sensors are made up of pixels, just like screens are. But, instead of displaying images to you, these are all about capturing them for you.

A megapixel is just a million pixels. In other words, when you multiply the horizontal number of pixels by the vertical number of pixels, you get the total pixel count.

For 12 megapixels, the current iPhone camera spec, that’s just over 12 million. 4032 x 3024 = 12,192,768, to be specific.
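You can check the arithmetic directly:

```python
# The 12-megapixel spec is just the pixel grid multiplied out.

width, height = 4032, 3024       # iPhone still-photo resolution
pixels = width * height
megapixels = pixels / 1_000_000

print(pixels)      # 12192768
print(megapixels)  # 12.192768
```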

Now, more megapixels aren’t necessarily better. Because you also have to factor in the size of the sensor.

See, the more you chop up a sensor, the greater the number of pixels — and megapixels — you get, but the smaller each of those pixels becomes. And the smaller the pixel, the worse it performs when it comes to capturing light. Which means the worse the photos it takes.

In other words, they’re super high resolution, they’re just super low quality at the same time. You get more but the more sucks.

What you really want to do to improve image quality is increase the sensor size, which increases the pixel size. Not pixel count. Pixel size. That way, each pixel can take in more light. That’s the kind of more that sucks less.

Within reason. End of the day, you want the best balance possible given the size constraints of the device, and the distortions and aberrations you can get if you push anything too far in any one direction.

And, for the last few years, Apple figured the best balance was holding the count at 12 megapixels while slowly improving just about everything else.

Pixel Binning

Now, some other phone makers have been really aggressive about increasing megapixel counts instead, especially in markets where they feel people care more about having the highest possible number on a spec sheet.

Recently, Samsung even put a 108 megapixel sensor in the Galaxy S20 Ultra. To keep those pixels from being too tiny, Samsung is using a fairly massive sensor. But, they’re also using pixel binning.

That’s a process where a group of pixels are averaged together to try and get a result that’s better than the sum of the individual parts. In this specific case, binning 9 pixels down to 1, or 108 megapixels back down to 12.
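A minimal sketch of that 9-to-1 binning — each 3x3 block of sensor pixels averaged into one output pixel. The 6x6 “sensor” here is made up, and real pipelines are far more sophisticated:

```python
# Minimal 3x3 pixel binning: each 3x3 block of sensor pixels is
# averaged into one output pixel — 9 pixels down to 1, the same ratio
# as 108 megapixels down to 12. The 6x6 "sensor" here is made up.

def bin_pixels(sensor, factor=3):
    h, w = len(sensor), len(sensor[0])
    out = []
    for by in range(0, h, factor):
        row = []
        for bx in range(0, w, factor):
            block = [sensor[y][x]
                     for y in range(by, by + factor)
                     for x in range(bx, bx + factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

sensor = [[y * 6 + x for x in range(6)] for y in range(6)]
binned = bin_pixels(sensor)
print(len(binned), len(binned[0]))  # 2 2
print(binned[0][0])                 # 7.0
```

Resolution drops by the binning factor squared, but each output pixel now represents nine pixels’ worth of captured light.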

Same with things like quad Bayer filters, where each color filter — red, green, blue, green — covers a 2-by-2 grid of pixels.

You can shoot the full 108 megapixels if you really want to, and in extremely bright, outdoor, well lit situations, you’ll get usable results, but for indoors or lower light, that binning and that filtering will kick in, hard.

Image stacking

What Apple — and for that matter, Google — have been doing instead is something called image stacking.

Instead of shooting a ton of pixels and binning them down, the iPhone shoots a ton of exposures in rapid succession and then the image signal processor, the ISP on the A-series chipset, churns through them, pixel by pixel, and picks the best parts of each and every exposure, for each and every pixel.

It started with simple high dynamic range — HDR. Take a photo, take an over exposure, take an under exposure, combine them together and you get a photo with more detail in both the highlights and shadows.
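A toy version of that per-pixel picking — not Apple’s Smart HDR, just the core idea of choosing the best-exposed value from a bracketed stack, with a made-up mid-gray target:

```python
# Toy exposure stacking: for each pixel position, keep the value from
# whichever bracketed exposure is closest to mid-gray (best exposed).
# Real pipelines like Smart HDR are far more elaborate; MID_GRAY is an
# illustrative target.

MID_GRAY = 128  # middle of an 8-bit range

def stack(exposures):
    """exposures: equal-length pixel lists, e.g. [under, normal, over]."""
    out = []
    for values in zip(*exposures):
        best = min(values, key=lambda v: abs(v - MID_GRAY))
        out.append(best)
    return out

under  = [10, 40, 90]     # shadows crushed
normal = [60, 130, 200]
over   = [140, 220, 255]  # highlights blown
print(stack([under, normal, over]))  # [140, 130, 90]
```

Note how the dark pixel is recovered from the over-exposure and the bright pixel from the under-exposure — more detail in both highlights and shadows.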

But now, thanks to machine learning, the ISP can detect textures like clouds and ropes, even faces and parts of faces, and then process everything slightly differently to render out the best possible results for each and every element in the scene.

The advantage of pixel binning is speed. It’s doing less work on more data. The ISP is fast, but image stacking is still doing way more work, just on less data.

It’s not hard to imagine that if Apple did both — bin down 4 to 1, from 64 to 16, or something like that, and then apply Smart HDR stacking, semantic rendering, tonal mapping, and all the rest of the machine learning — they could potentially get even better results faster.

Zoom

The second part of the iPhone camera rumors involves zoom, which is something I’ve been complaining about for a while.

Huawei and Samsung have added periscopes to their camera systems to greatly increase optical zoom. Basically, because phones just aren’t thick enough for traditional zoom lenses, instead of having them stick half an inch out of the phone, they lay them on their sides, trading depth for width. Then they use mirrors to redirect the light. Like a periscope.

Google, by contrast, took their HDR+ image stacking technology and souped it up to make better digital zoom. Pretty much using the minute movements of the phone while taking a photo to capture extra data, and then computing the hell out of it to make the zoom look way sharper, and the colors way more accurate, than anything the sensor could capture by itself.

Apple could do something similar and adapt Smart HDR into Smart Zoom for the wide angle camera. Then, they could use a better telephoto, not even a periscope because that’s probably overkill when it comes to return on optics with current technology, and add Smart Zoom on top of it. That could work out to 3x optical, much better 10x digital on the iPhone 12. Maybe even 5x optical, even good 20x or more digital on the iPhone 13.

That way you could capture your kids playing sports, your pets in the park, all the sights when you’re traveling — you know, when we can do all that stuff again. One day.

And like I’ve said numerous times before, it’s an area where Apple is currently behind, and not just other camera phones, but cameras in general.

Anamorphic

The last of the most recent iPhone camera rumors involves a completely new kind of camera. Not a 2 megapixel macro, sorry Marques!

No, an anamorphic lens.

Now, there are historical reasons for why anamorphic lenses exist, but it’s safe to say they remain in use today primarily for the way they look. They’re not round, and so the bokeh they capture — the blurry lights in the background — isn’t round either.

And the lens flares you can get… well, if you’ve ever seen JJ Abrams’ Star Trek movie, you know all about the lens flares you can get with anamorphic.

So, if anamorphic lenses in the modern age are less about practicality and more about artistry, why would Apple of all companies include one?

Maybe, and I say this ever so trepidatiously, for the artistry alone. Think about it: if Apple of all companies is planning or even considering an anamorphic lens, it would be precisely for the artistry.

I mean, just close your eyes, can’t you just imagine the keynote demo?

Maybe it could be done computationally as well, and on the main wide-angle camera, but either way, it’d be something incredibly fun to see.


iPad mini 6 — Will Apple Make it Pro?

The original iPad mini was never something Steve Jobs wanted. When tiny Android tablets started coming out, Steve snarked that you’d have to file down your fingers just to use it.

I’ve talked about similar stories before. Hit subscribe so you don’t miss any of them.

Then, Eddy Cue, senior vice president of internet services, read an article about all the reasons a smaller iPad made sense, picked up one of the Android half-tabs, came to the same opinion, and convinced Steve to do it.

That, by the way, is one of the biggest differences between Steve Jobs and almost everyone who tries to be the next Steve Jobs — he hired smart people and then, more times than not, he even listened to them.

That’s how Apple got the iPhone and not the Fire Phone.

But, this isn’t about Steve Jobs.

This… is about the iPad mini and where it goes next.

https://www.imore.com/ipad-mini-6-or-ipad-pro-mini-future-apples-tiny-tablet