
I Want the HomePod Max Back!

Just over a year ago Apple introduced the HomePod mini. A cute, now colorful, smaller than Shoryuken-sized ball of Siri-served music, info, and HomeKit fun. But at the same time they MDK’d the OG HomePod Biggie. Killed it just to watch it die. And, swear to Jobs… I want it back.

I’m Rene Ritchie. YouTube says 70% of you watching still aren’t subscribed, so hit that button and bell so we can make this the absolute best community in tech, together!

Look, I want the original HomePod back. I’m not going to go so far as to say deleting it was a mistake, but I’m also not not going to go so far as to say deleting it was a mistake. I mean, have you seen how hard they are to get on the aftermarket these days? There were problems with it, for sure, both in terms of capabilities and market fit, and I’ll get to all of that in a minute, but there’s also still nothing that can take its place, that can fill that HomePod biggie sized gap. Just rumors, which I’ll also get to in a minute. Well, second minute, because time space. Whatever. Context time.

TL;DR: Apple wanted to take another swing at the home speaker market. iPod Hi-Fi never worked out, but the problem space still existed — Apple made all these personal music products but most people still didn’t have a decent sound system in every room, much less one that was easy to deploy, simple to use, and sounded fucking fantastic. So, they turned to computational audio. It would strip stereo music streams down to their constituent parts, Rick Beato every single note out of them, and then recombine them into a 3-dimensional spatial audio sound stage, constantly adjusted and tuned by bouncing sound waves off of walls and large objects.

Basically, this little bigger-on-the-inside box you could drop anywhere in a room and it would sound great — totally fill that room and blast that bass — no matter where it was, and no matter where you were, in the room. Living room, office, bedroom, cabin, kitchen, whatever, wherever. And because it had no display, they figured Siri would be the best way to control it. I mean, I got to talk to Phil Schiller about it at launch, and you could just feel how much he deeply, truly loved it.

And, in some ways, it worked out terrifically well. Siri was still all shades of inconsistent, but the setup was stone stump simple, and the sound was unbelievable, punching way, way above its weight.

But it also landed with a huge, gravity well of a problem: It had taken Apple like 5 years to develop it, and in that time Amazon had crapped out a ton of super cheap, super tinny-sounding Alexa home assistants in a Pringle can, totally commoditizing the market. So, HomePod ended up looking not so much like a comparatively cheap premium speaker with voice control, and more like an incredibly expensive home assistant that just happened to play great sounding music. Just total market inversion. And unlike almost all of Apple’s other I-can’t-believe-people-pay-this-much-of-a-premium-for-it products, the general market just didn’t see any value beyond the cost. Especially since Apple chose to alienate a whole entire chunk of the market by not including line-in or Bluetooth.

And because biz pubs that have been covering Apple for decades, but still don’t at all understand Apple, and have built up these kinda toxic love-hate relationships, saw an opportunity, the knives came out.

Yes, Apple absolutely brought a premium voice-controlled speaker to a commodity home assistant fight, but even Apple Watch was being covered as a failure and Amazon Echo as the next big thing, when Watch was killing it by the actual numbers. And sometimes, narratives matter.

Ok, I suck at TL;DR, but the bottom line is, Apple shifted to the HomePod mini and shafted the HomePod Biggie. And the mini is great for what it does, but leaves a huge gaping gap for what it doesn’t do — fill the room and blast that bass. Even a stereo pair of minis do not a biggie make.

And, turns out, bitterest of ironies, the vast majority of people — the 90% of the world not on tech YouTube or tech Twitter — barely even use the smart part of all the speakers, including Alexa, even Google. They get them, particularly over the holidays, plug them in, play around a bit, and then promptly settle on just exactly what we all secretly in our hearts know they settle on: Basic clock radio stuff. Set a timer, play that tune. Things the HomePod, even the OG, did… does just fine.

Which is why I want it back. Kinda. What I really want is it better. Same computational audio, speaker, and mic capabilities, but with Bluetooth and line-in so that it’s something the mainstream can still drop in, but power users can also add in. I mean, if AirPods Max can handle Bluetooth better than just about anything else, zero excuses for HomePod Max. Because, you just know that’s what the new version has to be called by now, right? That, and a better price. I hate when YouTubers and bloggers say something is too expensive, because they’ve almost never actually studied the cost structure of bringing that thing to market. But it’s absolutely fair to say you don’t see the value in it or you wouldn’t pay that much for it, or like me and my dream Lambo, it’s just out of your price range. Some products just can’t be made affordably for the size of their mainstream market, and that’s fine. But I think, given the time that’s passed and the growth the market has seen, Apple could absolutely fine-tune a HomePod Max maybe not for the mainstream, but for the premium segment. All recent audio product pricing to the contrary, sure… It just has to hit that sweeter than sweet spot where the value is seen as greater than the price. You know, the one Apple’s usually so damn good at hitting so damn always.

Now, there are rumors of a HomePod 2.0 and have been for a while. But there are also rumors of HomePod variants. Not to the Alligator Loki extreme, of course, alas, but of one that includes an iPad-style display and Center Stage FaceTime capabilities… a HomePad that’s better suited to things like following cooking videos in the kitchen. Or, whatever it is you choose to do in the privacy of your own bedroom. I’m not the Judge Dredd of you! I’ve always just wanted a HomePod that was set up to dock with an iPad, so they’d work together hands-free, but you could still peel the iPad off to go hands-on. Like the Magic Keyboard, but Smart Connecting to a Speaker instead. But I realize that may just be way too toaster fridge for Apple, even in the kitchen. Or bedroom. Stop it!

Also, mother of all finally’s, an Apple TV + HomePod hybrid. A TheaterPod that would provide Dolby Atmos and Spatial Audio for every living room, bedroom, or media room. Like the Apple VR headset, but far field instead of nearfield, for the whole family and not just ourselves. I’d love to see that as well, because I’m currently using two HomePod Biggies in connected mode as my home theater. I replaced a Sonos Home Theater system with them and I’ve never been happier. But I could be, with a proper home theater from Apple, especially — mega huge bonus points — if we can toss some HomePods mini or 2.0 around the room as well, and they just U1-chip spatial position themselves to provide truly immersive, not just room-filling, but room flooding home theater experiences.


Apple’s 2022 Mac Roadmap — Revealed!

New M2 MacBook Air. M2 MacBook Pro. Dual M1 Max iMac Pro. M1 Max Mac mini Pro. Quad M1 Max Mac Pro. And maybe, just maybe, a new Xeon AMD Mac Pro. Wait what?!

I’m Rene Ritchie. YouTube says 70% of you watching still aren’t subscribed, so hit that button and bell so we can make this the absolute best community in tech!

I’m going to get to the upcoming ultra low power Macs — the next generation — in a minute, but just a reminder: Nothing is official or CONFIRMED until Apple announces it on stage or drops a press release on the net. They’re a logical company that sticks to pattern right up until they don’t, so a lot of this really makes the kind of sense that does, but if 2020 and 2020 Jr have taught us anything, it’s that things can and will change, especially schedules… even in 2020… 2.

First up, a new MacBook Air with a sleeker, slimmer redesign, off-white bezels, and all the 24-inch iMac inspired taste the rainbow of skittle pop flavors. And I’m not even going to lie, I’m so beyond hype about this. Mini-LED display, because larger sized OLED panels are still problematic in ways Apple finds even more irksome than blooming or jittering, and fanless M2 silicon inside. Which should mean slightly better performance power cores, way better efficiency cores, way, way better graphics cores, maybe some hot media engine action, but… probably not much longer battery life. I’m guessing Apple will spend that bigger efficiency budget on a smaller casing again.

But, that’s what the new 2-port MacBook Pro will be for. Almost as portable as the MacBook Air, not quite as powerful as the real-deal MacBooks Pro, it may not get the colors, or at least the same colors, stick to a black bezel, go full-on mini-LED and M2, but in an enclosure that’s slightly roomier, with batteries that are slightly bigger, and with an active cooling system that’ll let it run slightly hotter, for slightly longer. Maybe with even better battery life, unless Apple shrink-wraps a new casing around it as well. Either way, think of it less as a MacBook Pro Air and more of a MacBook Air Pro.

As to whether or not Apple will put an M2 in the current M1 Mac mini, it’s legit hard to say. The entry-level mini has already suffered through a decade of neglect, including a couple of years from 2018 to 2020 where Apple blipped it completely out of existence. Gave its room to a pricier, space grayer model and everything. I’m a firm believer that Apple should update everything, every year. I know some people hate that. They want Apple to only ever make something when they’re ready to buy and then not make anything ever again until they personally are ready to buy again, but that’s so bleak. Apple should have piping fresh Macs ready every year, because any given year, someone is going to buy, and everyone deserves the most recent, up-to-date, longest lasting device they can get any time they want to buy.

Ditto the 24-inch M1 iMac. I can’t see Apple doing a big update to the entry-level all-in-one again so soon, not since it only just went all-new. But I’d love it if Apple refreshed it with M2. Just, yeah, M2 all the things. Same way they used to spec update Intel… whenever Intel could actually ship a chip… Do that with the Macs. Only trouble spot I see is M2 availability. iPhone will always get priority on any node, new or old. But then supply will go to the new machines, like the MacBook Air and the MagSafe iPad Pro. Only then, only if there’s enough yield, and it fits within the bill of materials, will Apple start looking at refreshes. And normally, I’d push hard for it. But, third year in a row, I figure nothing will be normal about 2022 either.

Now, before we get to Xeon, there are some ultra high performance Apple Silicon Macs to talk about.

The 27-inch iMac Pro leads that particular pack. And yeah, the latest rumors have moved away from 32-inch and settled on 27-inch, with matching stand-alone Apple Cinema displays to complement them. I just did a whole entire video on those, link in the description right below the like button. Thinner body, thinner bezels, maybe just silver and space gray but maybe more colors as well, just not those classic Apple logo colors like the 24-inch. Mini-LED display with ProMotion up to 120Hz adaptive refresh and, fingers crossed, a nano-texture option. I mean, I’m going to go glorious gloss, but I feel for those who can’t. MagSafe and more ports, as all pro machines should. And then M1 Pro or M1 Max inside, just like the recently unleashed new MacBooks Pro. Yes, MacBooks, it’s a West Wing thing, relax about it. And since Apple silicon runs so much cooler than Intel, maybe, just maybe, a dual M1 Max option to really kick the iMac up a notch.

And there’s rumored to be a new Mac mini Pro to go right along with it. Or instead of it. Not an all-in-one, but an all-in-none. None in one? You know what I mean. Smaller enclosure, maybe with a polycarbonate top, maybe with some iMac-matching colors splashed in. MagSafe and even more ports. And M1 Pro and M1 Max inside. Maybe a dual M1 Max option as well, depending on how much smaller Apple makes the enclosure, and how much the resulting thermals can really handle. Then just attach it to one of the hot new 24-inch, 27-inch, or 32-inch new displays, or an old broken one, and forget letting it rip. Let it roar. Because I sincerely think this’ll be the Mac mini every pro has really been waiting for.

Will there be M2 Max MacBook Pros in 2022? Never say never, but I’d bet against it at this point. Apple only just pushed out the M1 Pro and M1 Max, almost a full year after M1, so I wouldn’t expect M2 Pro and M2 Max until early 2023… or maybe never. And by that I mean, Apple could go straight to M3 Pro and M3 Max instead. Because Apple doesn’t always make extended or pro versions of every silicon generation. There was an A10X, an A12X and Z, and an A14X in the form of M1, but there was no A11X or A13X, so it’s not hard to imagine Apple might apply that same strategy here. Especially since it’ll let them push out even more impressive updates, thanks to A16 and M3 likely being on TSMC’s next-generation 4 or 3 nanometer nodes. Those… aren’t actual sizes. They’re just marketing terms. But they’re smaller, which will mean better efficiency and performance, and a bunch more transistors for Apple’s silicon team to play around with.

Then, the Mac Pro. First, a smaller version with M1 Max, Dual M1 Max, and… hold onto your butts, Quad M1 Max options. That’s the kind of space and thermal overhead Apple will save from ripping out those blast furnaces and associated helicarrier fans with Intel and AMD logos on them. Jury’s out on how modular it’ll remain with Apple Silicon and Unified Memory. Apple’s done off-package memory for iOS devices before, so I’m ruling nothing out, but it might also take another generation or two before Apple fully matures their next-generation expansion pack system. So, while even the idea of 4x M1 Max pretty much lights my brainstem on fire, for those who really still need a more traditional system…

Yeah, Intel Xeon and AMD RDNA Mac Pro. And I’ve been hearing whispers of this — one final big metal upgrade — since 2020, when Apple was pushing out all of their other last calls on the Intel line up. But, as time went on, Intel delays mounted, and shipping chips had little to nothing to offer for Apple’s set enclosure sizes, I kinda figured the odds we’d ever see it were getting less and less likely. Firefly reboot less likely. But now it sounds like Intel might just have its Xeons back in order, AMD is still pushing extra try hard, and Apple could spin out one last traditional Mac Pro to tide over power users and studios that still depend on traditional x86 software, especially niche and in-house software that needs even more time to make the Apple Silicon switch. Basically, Apple’s going to have the motorcycles. The cars. Even the SUVs and trucks. All converted over, all by Tim Cook’s 2-year deadline. But they’re still going to keep one Intel bulldozer out back, at least for another year or two, until Apple Silicon scales all the way to… forget trucks and dozers. To Optimus Prime.


The Untold History of iPhone

January 9, 2007. San Francisco, California. Mere minutes before he put sneaker to stage for the biggest keynote in history, Steve Jobs, Apple’s co-founder, called his team together and told them to remember this moment. This one, last moment… before the iPhone. Because everything was about to change.

Except… it almost didn’t. The original iPhone almost never happened. Like at all. Because Steve Jobs never intended for Apple to make the iPhone. He was just having dinner one day, with his wife, her friend, and her friend’s husband, who just happened to work at Microsoft, Apple’s biggest rival. And dude was going on and on about how Bill Gates and company were about to revolutionize the whole entire world with a breakthrough new product — the tablet PC, digital pen and paper. And Steve went from eye-rolling to full-on kill mode… so fast.

See, there are few motivators in this world better than competition… but spite is one of them. And there was no way in hell Steve Jobs was going to let Microsoft — janky, tasteless, Microsoft — redefine the future of ultra-personal computing. Not on his watch.

So, Apple began work on what would eventually become… the iPad. Yes, the iPad. But stick with me here! Because, Shyamalan-style plot twist… at the same exact time, the smartphone revolution was beginning. Handspring and Palm. BlackBerry. Nokia. It was still early days and they were all equal parts tantalizing and terrible. And to Steve Jobs and his team, they all kinda sucked in terms of user experience. But here’s the thing — It was also clearly an existential threat, like meteors-to-dinosaurs or, you know, Facebook-to-civilization.

Not a threat to the Mac, not for a long time. But an extinction level event for Apple’s other big business — the iPod.

A thousand songs in your pocket was cool, and Apple was flat out dominating the MP3 market. But a thousand songs on your phone… that was going to be way, way cooler, and basically game over for the iPod. And Steve Jobs… well, Steve was never one to mistake the company’s products for its business. He’d seen what fear and protectionism was doing to Microsoft. And if anyone was going to kill the iPod, it was damn well going to be Apple.

So, they ran a proof-of-concept with Motorola called the ROKR. It was a way to test not just iTunes on a phone, but to learn more about making phones. And at the same time, they pivoted the tablet project — that Microsoft-spiting tablet project — into a full-on Apple phone.

It was codenamed Purple Experience Project, PEP, or more commonly, just Purple. And the dream was for it to be based on this multitouch technology Apple had acquired from FingerWorks for the tablet project. But that was being prototyped as a massive, table-sized rig, nothing even close to what could fit in a pocket — to what Steve could actually sell.

So, he decided to split the team up. Tony Fadell, who’d been working on the iPod, would lead Purple 1, or P1, an iPod-based phone with a clickwheel interface. I kid you not! Something Apple could bring far more rapidly to market and buy time for Purple 2, or P2, led by Scott Forstall who’d been working on OS X, and would follow up with the multitouch Mac phone.

Secrecy for Purple was dialed up to 11. To 111. Steve insisted on disclosing each and every new addition to the team personally, something made hugely complicated by his growing health issues. So, managers had to get creative. Running to secured rooms to look at whiteboards, back to rooms filled with undisclosed new recruits, basically the worst MrBeast challenge ever, just doing their best to describe what needed to be implemented and how.

But then something super interesting began to happen: As much as there was an external rivalry between Apple and Microsoft, and, unbeknownst to them yet, Palm and BlackBerry. There was a growing internal rivalry between P1 and P2.

Fadell’s team really wanted to finish and ship fast, but Forstall’s team began beating them to key milestones like SMS, the texting system used on all mobile phones. The nail in P1’s coffin though, was the multitouch interface that P2 was developing, especially human interface designer Bas Ording’s inertial scrolling and rubber banding. Lists would slow down the way a spinner would in real life, and wouldn’t just stop cold or feel stuck, but bounce back at the ends. A type of digital faux physics that didn’t just make P2 feel better and more instinctive to use, but way more fun to use. When Steve saw that, when he realized that, it was second purple to the right, and straight on till keynote.
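That curve has since been reverse engineered many times. Here’s a tiny Python sketch of the two ideas — the rubber-band formula and the 0.55 coefficient are the commonly circulated approximation, not Apple’s actual code, and the deceleration helper just models simple exponential friction:

```python
def rubber_band(offset: float, dimension: float, coeff: float = 0.55) -> float:
    """Distance the content actually moves when dragged `offset` points past
    the edge of a view `dimension` points tall. The pull gets progressively
    harder, asymptotically approaching `dimension` -- so the list always
    feels attached, never stuck."""
    return (1.0 - 1.0 / (offset * coeff / dimension + 1.0)) * dimension

def deceleration_distance(velocity: float, rate: float = 0.998) -> float:
    """Total distance a flick travels before per-millisecond exponential
    friction (`rate`) brings it to rest -- the 'inertial' half of the feel."""
    return velocity * rate / (1.0 - rate)
```

Dragging 100 points past the edge of a 480-point screen only moves the content about 49 points, and no amount of pulling ever exceeds the screen height — which is exactly the "bounce back at the ends" behavior described above.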

It was this… perfect combination of everything Apple had been learning about miniaturization from the iPod, the media content and syncing they’d built up over the last few years for iTunes, the foundations of NeXT and OS X they’d spent so much time transitioning to, the growth of mobile carriers and especially mobile data, and AT&T, then Cingular’s, rabid desperation to catch up to Verizon… Verizon who’d been offered the Apple phone before, and just hard passed. The hardest of passes.

But that let Apple keep P2 secret even from the carriers, even from Cingular, and, more critically, free from their meddling and molestation, something that had frustrated and compromised the attempts of literally every other phone maker.

There were real problems though. The keyboard, for one. Palm, Nokia, and BlackBerry had been using full-on hardware keyboards, which were terrifically tactile but to Apple, maddeningly immutable. A software keyboard that could change not just over time but from app to app and experience to experience was going to be critical to Steve’s marketing, but it was just a complete failure at the time.

So, Forstall stopped everything, called the team together, and had everyone go off and brainstorm a way to solve for the keyboard. Ken Kocienda won that derby a few weeks later with an implementation that would change the tap target size of the individual characters depending on what you were typing — larger for more likely combinations, way smaller for less likely. Also, auto-correct, which would try to fix anything that didn’t come out right… even if it often made things… just more wrong. Ducking hilarious. But it took so long to nail the keyboard there was no time left to work on advanced text functions like… oh… copy and paste.
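As a toy illustration of that tap-target idea — with a made-up five-word vocabulary standing in for the real language model, and made-up function names — the resizing could look something like this:

```python
from collections import Counter

def next_letter_weights(prefix: str, vocabulary: list[str]) -> Counter:
    """Count how often each letter follows `prefix` in the vocabulary --
    a crude stand-in for whatever model the real keyboard used."""
    return Counter(
        word[len(prefix)]
        for word in vocabulary
        if word.startswith(prefix) and len(word) > len(prefix)
    )

def hit_target_scale(key: str, prefix: str, vocabulary: list[str]) -> float:
    """Scale a key's invisible tap target by how likely it is to come next.
    1.0 = normal size; likelier keys grow, unlikely ones shrink."""
    counts = next_letter_weights(prefix, vocabulary)
    total = sum(counts.values())
    if not total:
        return 1.0  # no signal from the prefix: leave every key alone
    probability = counts.get(key, 0) / total
    return 1.0 + (probability - 1.0 / 26)  # grow above chance, shrink below
```

After typing "th" against a vocabulary like `["the", "then", "they", "this", "that"]`, the invisible target for E balloons while Z quietly shrinks — the visible keys never change size, only the hit areas underneath them.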

Another tentpole was going to be the first real web browser on mobile. Don Melton had led the Alexander Project, the one that’d forked KDE’s KHTML rendering engine and Konqueror browser from Linux into WebKit and Safari for OS X. It had the advantage of a codebase that was just ridiculously small. So small… it could fit onto a phone. Richard Williamson was in charge of bringing MobileSafari to P2, but this time it didn’t go badly… it went way too well. So well, Apple never even bothered to make a WAP browser. One of those stripped down, barely functional little wireless access protocol — aka baby web browsers — every other phone was using at the time. But, too late, Apple discovered they needed WAP to support MMS. The multimedia version of SMS. You know, the protocol that let you display things like pictures in text message threads. So… they’d have to ship without it.

There were bigger issues ahead of them as well, like whether to build the interface with AppKit, the traditional frameworks on the Mac, with WebKit, going all-in on web technologies, or to create something new. They’d eventually settle on something new. UIKit. But there wasn’t any time to even think about an SDK, or software development kit, at that point anyway. Henri Lamiraux’s entire frameworks team was already running a marathon of sprints. Forstall was even hacking together basic controls for views and tables. But, they had planned to ship some of the built-in apps using web technologies, specifically the kind they’d debuted as Dashboard widgets on the Mac. Apps like Weather. But the performance was just terrible, so they had to switch them over to native code. And that pretty much settled that. At least at Apple. The dissenters would later take their hardware keyboards and web-based interfaces to Palm for webOS and the Pre. But that’s a whole separate secret origin spin-off!

Steve would sometimes change his mind, like deciding he hated the split screen email view Nitin Ganatra’s App team had been working on, so that got yoted. But he was also brokering deals with Google to add even more apps. Their then-CEO, Eric Schmidt, was on Apple’s board and… kinda… sorta… forgot to ever tell Steve they were working on Android at the same time. But that story of intrigue and double betrayal is for another video. Let me know if you want to see it! Either way, Apple would have access to Google’s services, like Maps and soon, YouTube as well. But Apple would have to design and build their own apps and interfaces for those services from scratch, which added to the already staggering load.

It wasn’t too bad when Forstall was around. He had this… singular talent, this ability to just know which… 3 out of a hundred of any particular design or implementation Steve would like enough to make a final call on. But whenever Forstall was doing something else and not around to effectively play pre-cog for a designer or engineer, the iteration cycles on which specific shade of blue or what exact texture could be near-endless. Because Steve didn’t want the iPhone to feel antiseptic. He believed photorealism not only made tables and lists less… naked and more visually interesting, but more relatable and distinguishable to actual people. The human part of human interface.

Industrial Design had to face some grim realities as well. Jony Ive’s platonic ideal aluminum slab was being cut off at the knees by a plastic kilt just so Bob Mansfield’s hardware team could ensure there’d be enough radio-frequency transparency for the GSM voice and 2G EDGE data signals, and 802.11b/g Wi-Fi, and Bluetooth 2.0 EDR, to pass through.

At least there was no Verizon to worry about yet, so no need to support their very different CDMA and EVDO networks. Not until the iPhone 4. But there also wasn’t even 3G UMTS/HSPA data, AT&T’s faster, better, longer, stronger network. But then, AT&T’s 2G network wasn’t in any shape to support what the iPhone would become, much less their still maturing 3G network. Something that would just be the subject of misery until long after the iPhone went on Verizon as well. Also no GPS, just Wi-Fi router mapping and cell tower triangulation. No removable, replaceable battery. No SD card support. Apple not only wanted to keep things simple, they needed every square millimeter of space to pack everything inside. And from the Mac on, Steve famously, infamously, veered away from computer kits and towards computing appliances — elegantly sealed boxes.

The industrial design team also had this longstanding dream to kill the headphone jack… kill it just to watch it die. Luckily for the vast majority of us, that was just nowhere nearly ready to become reality yet, especially not with the Bluetooth Apple had to work with at the time. So, they begrudgingly drilled into the bead-blasted aluminum shell so they could fit the super skinny hipster plug on Apple’s headphones, and then dropped the mother of all dongles for anything with even a little bit more chonk.

The main port was Apple’s 30-pin dock connector. That would tie P2 into the whole entire iPod ecosystem, giving it a huge advantage over every other phone on the market where it mattered most to Apple — mainstream customers. They did add a ringer switch, though. I mean, sure, the industrial design team hated buttons almost as much as headphone jacks, but no one, absolutely no one, wanted it to be their phone that interrupted Steve during a meeting. I kid. Kinda.

The basic principle was something like, if 80% of people won’t use it 80% of the time, it shouldn’t be built-in. But that’s also exactly why Apple added a physical Home button — so anyone not as technologically savvy, anyone who felt lost or confused at any time, could push it at any time, like an escape hatch, and return to the comfortably known state of the Home Screen and its springboard of app icons. That was going to be key to making P2 feel accessible to the far, far larger market.

Apple also had sensors that had to be perfectly calibrated to really sell the experience. Not just the multitouch, which was actually a 3-dimensional radiating field that some other companies would later dabble with for air gestures, but that Apple settled on, early on, as a way to model far more precise finger detection — and rejection. But also an accelerometer that had to automatically rotate the screen to match device orientation, and a proximity sensor that had to automatically turn off the display and multitouch when you held it close to your face.
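A cartoon version of that accelerometer logic, in Python — the axis directions, threshold, and orientation names here are assumptions for illustration; a real device also filters the signal and adds hysteresis so the screen doesn't flicker between states:

```python
def orientation(ax: float, ay: float, threshold: float = 0.6):
    """Pick a UI orientation from gravity components, in g.
    Assumes ax points right across the screen and ay points up it."""
    if ay <= -threshold:
        return "portrait"              # gravity pulls down the screen's y axis
    if ay >= threshold:
        return "portrait-upside-down"
    if ax >= threshold:
        return "landscape-left"
    if ax <= -threshold:
        return "landscape-right"
    return None                        # near-flat: keep the last orientation
```

Held upright, gravity reads roughly (0, -1) and you get portrait; tip it on its side and the x component takes over; lay it flat on a table and neither axis wins, so the UI just stays put.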

That display, by the way, was 3.5-inches… the biggest display that could be fit into a phone at the time. Even if by today’s standards it’s… tiny… toy sized… it could ride a Pro Max like a tauntaun.

And it was all powered by an ARM 1176JZF-S processor, PowerVR MBX Lite 3D graphics chip, and 128MB of RAM, manufactured by Samsung. Literally repurposed from a set-top box. Something that switched cable channels for a TV. But Apple was just beginning to look ahead, to assemble their Silicon Avengers. Tim Millet was on board, but Johny Srouji was still on his way. Because they knew even then, that it would become one of their biggest differentiators — from GPU accelerated, rock solid 60 frames-per-second animations to instant touch response for direct manipulation, and so much more to come.

Very, very late in the process, Steve managed to scratch up the then plastic screen of his prototype with his keys, and so he demanded something harder, and Apple’s hardware team and Corning had to race to get chemically hardened, ion-exchange glass ramped up and ready for production lines in mere weeks.

And, of course, because P2 was based on OS X, Apple’s native apps would have killer multitasking capabilities, and Steve really wanted to show those off. Start a song, make a call, switch to Mail, look up something from Safari, end a call, and have the music just fade back in. Something that, second only to pinch to zoom, would drop every jaw in the room.

But, for the demo, things were still MacGyvered together with so many paperclips and so much chewing gum, that it would only work if Steve stuck to that exacting sequence of tasks. To that golden path. Any deviation — or any bad luck — and it would all come crashing down. Literally.

It was also going to be priced at $499 for the 4GB and $599 for the 8GB model on-contract. Those prices weren’t unheard of at the time — early Motorola RAZR flip phones were incredibly expensive back in the day as well — but it meant P2 would be limited to early adopters and those able and willing to pay a premium. A far cry from the mainstream audience iPod had reached.

Then there was the small, tiny even issue of Cisco owning the trademark for the name Steve Jobs and Apple had settled on — iPhone. Something that might have stalled or even altered the plans of any company not run by Steve Jobs, who decided he was just going to roll with it, hope for a Nat 20 and not a fumble, to force a deal later, and not get mired in litigation.

Because January 9, 2007 was set. Moscone West would be jam-packed. The Macworld keynote had to be… beyond epic. It had to be iconic. And, of course, it was.

It shocked the world. Including and especially Apple’s new competitors, who Steve Jobs had opened up on with the keynote equivalent of a flame thrower. A mass driver. Saying their hardware keyboards were everything wrong, their resistive touch screens sucked, their styluses were just yuck, and their browsers could only show the baby web.

They were caught completely off-guard and flat footed, but they did manage to fire back.

Ed Colligan, the CEO of Palm said they’d “learned and struggled for a few years here figuring out how to make a decent phone. PC guys are not going to just figure this out. They're not going to just walk in.”

Mike Lazaridis, Co-CEO of BlackBerry, who literally believed the iPhone demo just had to have been faked, and that keyboards and proxy browsers were still key to the future of phones, said “Apple's design-centric approach [will] ultimately limit its appeal by sacrificing needed enterprise functionality. I think over-focus on one blinds you to the value of the other. [...] Apple's approach produced devices that inevitably sacrificed advanced features for aesthetics.”

Steve Ballmer, CEO of Microsoft said “You can get a Motorola Q for $99. [...] [Apple] will have the most expensive phone, by far, in the marketplace. There's no chance that the iPhone is going to get any significant market share. No chance.”

But Apple wasn’t looking at the puck where it was, or where it had just come from. They were Gretzky skating to where it was going to be. A world where dial-up-style mobile data was about to go broadband. Where enterprise and early adopters were going to get swarmed by mainstream consumers. Where bringing up a phone from a pager or PDA, or trying to make pocket versions of operating systems or browsers wasn’t going to be enough. They were going to have to be brought down from those PCs. All they needed was to… alter the bargain with AT&T to fix the up-front pricing, and it was light speed ahead.

Only Google, who had Android in Eric Schmidt’s back pocket, and Eric on Apple’s board, was able to see the same future, and pivot that Nokia communicator / BlackBerry style Sooner prototype into a more iPhone-competitive slider. And yes, that was the exact moment, when Google stopped worrying about Microsoft, Windows Mobile, and Pocket IE owning mobile, and turned their sights on Apple, that we entered the third age of Middle Earth. Or of connected computing. Both maybe.

The original iPhone would go on to sell over 6 million units in its first year on four carriers and in four countries. Now it sells hundreds of millions on almost every carrier in almost every country on earth.

Categories
Uncategorized

New Apple Silicon… Displays?! — Massive Leak Breakdown!

New Apple Cinema Display

A second-generation 32-inch Pro Display XDR. With Apple Silicon? But also, maybe the return of the Cinema Display? 24-inches like the M1 iMac, and 27-inches like the highly anticipated M1 Pro iMac.

I’m Rene Ritchie. YouTube says 70% of you watching still aren’t subscribed, so hit that button and bell, be part of this awesome community, and let’s do this!

There’s a new report out from DylanDKT, who’s rated 78% accurate by Apple Track, saying LG, yes Lucky Goldstar Life is Good LG, is working on three new, so far unbranded displays… And one, maybe all three of them, could be waiting on that Apple logo.

I love, all caps LOVE the idea of Apple getting back into the consumer display space. Since Tony Stark Snapping the old Thunderbolt Display a few years ago, our only real options have been crap commercial panels, LG Ultrafine, or Apple’s literal 6K for 6K Pro Display. Which were both problematic for reasons I’ll get into in a minute. But, do I believe this particular rumor?

Well, in addition to the Ultra kinda-fines, LG has been making panels for Apple displays for a long, long time. Including the LED and Thunderbolt displays that they released alongside the original IPS iMacs. Also, when Apple unceremoniously exited the consumer display business a few years ago, it was those LG Ultrafine 4K and 5K that they collaborated on to fill the giant sucking gaping gaps they left behind.

But if these are meant to be actual Apple branded displays the rumors are talking about, LG wouldn’t be working on anything other than the mini-LED for the panels, which they’d then hand off to Apple’s display team for literally everything else, as well as Industrial Design, Hardware Engineering, and Platform technologies, to turn them into actual Apple products. Apple specs for fabrication, they don’t white label. So whatever enclosures these panels are currently in would just be for testing and validation. If these are meant to be actual Apple branded displays.

Which could make the kind of sense that does, since Apple has often used LG panels in their displays.

Now, the first and most likely candidate for that Apple brand is a new 32-inch Pro Display XDR. Basically a second generation version of what Apple introduced back at WWDC 2019 alongside the re-cheese-grator’ed modular Mac Pro. Pro Display was good for its time. I mean, it was $5K — or $6K with the stand — which blew a lot of non-pro brains and made it the butt of a lot of jokes. But it was also a workable replacement for traditional, much more expensive reference monitors, which could easily cost you $42K. Even studios couldn’t afford to push those across the color management pipeline, but Pro Display XDR, that was way, way easier to scale.

But the OG Pro Display XDR was also a little… early. It wasn’t mini-LED… though it tried real hard. With nowhere nearly as many local dimming zones as we have on more modern displays like the iPad Pro and MacBook Pro. So it’s way more prone to blooming, vignetting, and off-angle issues. It had a lot of technological workarounds, a lot of cooling, and even though it’s cheap compared to traditional reference monitors, it’s still hella expensive for anyone not on a color-accurate production pipeline… which is the vast majority of people. Especially as Apple’s only current display.

So an improved version of that, an even better reference monitor, keeping the nano-texture option, adding up to 120Hz ProMotion adaptive refresh, even at the same price would be great. At a reduced price… fucking amazing.

As to having custom Apple silicon inside. Well, the iMacs have had a custom Apple timing controller in them since they went 5K. That’s how they went 5K, by stitching and syncing two streams together without any tearing down the middle. That’s also why we lost Target Display Mode. Now I know every nerd worth their Twitter snark is probably hoping there’d be like an external GPU in the display that’d take some of the burden off the Mac you’re plugging it into. But my guess is it might be even a little more sophisticated than that, given all the display engines Apple’s been building into A-series and M-series for the last few years. Basically something that lets us get even better performance from the mini-LED than the panel itself would otherwise allow. Including an actually good I/O controller for an actually reliable hub...?

The other two displays, the 24-inch and 27-inch displays, could mark Apple’s long-rumored return to the consumer display space. Which would be… terrific. LG makes great panels but crappy enclosures, sometimes detrimentally so like when they forgot to properly tin-foil up the first run of Ultrafines to prevent Wi-Fi interference. Yeah. Quad-major oof. Also, I’m still a firm believer in the horn effect, which is the opposite of the halo effect. The display is the most visible part of any machine, why give away that branding and that entry point to any other company?

If they get Apple logos, they could be the next generation of Apple Cinema Display — Not LED or Thunderbolt, but mini-LED and maybe marketed as Liquid Retina XDR, just like Apple’s other mini-LED displays?

Just big, beautiful, ultra-crisp, glossy displays, the kind that make Dave2D just ultra-happy?

Would the 24-inch version come in colors, and with an off-white bezel, to match the 24-inch iMac? Would the 27-inch version only be silver and space gray to match the iMac Pro? These are the questions that keep me up at night. I mean, not really. But kinda. It’d be ideal.

The 24-inch could just be an HDR display. Just HDR. Ha. How far we’ve come! The 27-inch could have some limited reference monitor capabilities as well, given the M1 Pro and M1 Max MacBook Pros have exactly those limited capabilities already.

Also, nano-texture option and 120Hz ProMotion, because that’s just what Pro versions get these days, right?

It’s unclear if they’d have dedicated Apple display silicon in them, like the rumored 32-inch, but I’d love to see an image signal processor for a 1080p webcam as well. Obviously, the original Pro Display is far too serious for such integrated video shenanigans, or it’s just meant for studios like ILM and Pixar where cameras are frowned upon. But consumer displays? Gimme. Especially if they’re Apple branded.

Although, that could be true whether they have Apple logos on them or… they just end up being the next generation of Ultrafines. Because finding a quality display, something truly on par with Apple displays, has been really really hard for the last few years. Like, our eyes are bleeding from all the poor quality pixels we’ve been forced to stare at.

Categories
Uncategorized

iPhone 14 Leaks — Pure 8K Camera FIRE

iPhone 14. Specifically, the camera. Maybe a new 48 megapixel camera. But megapixels don’t matter, right? Pixel size over pixel count, quality over quantity. But… what if Apple can do both?

I’m Rene Ritchie. YouTube says 70% of you watching still aren’t subscribed, so hit that button and bell, be part of this awesome community, and let’s do this!

According to a just-released report from randomly accurate Haitong International Securities analyst Jeff Pu, Apple is finally — finally — getting ready to ditch their traditional 12 megapixel sensor for a quad-major upgrade to 48 megapixels. Which backs up a previous report from more reliable supply chain exfiltrator and TF International Securities analyst Kuo Ming-Chi.

Now, there are three really big reasons why, which I’ll get to in a minute, but it follows Samsung and other Android phone makers switching to monstrous megapixel counts a few years ago already, and even equally hesitant Google going all-in on 48 megapixel + sensors with the Pixel 6 series earlier this year.

See, with traditional cameras, you have huge, honking lenses that pull in a ton of light. But those don’t fit on phones that fit in pockets. Regardless, Samsung, Huawei, and others have been trying to live that best big glass life, strapping Locutus of Borg-like sensors onto the backs of their phones, but they’re still nowhere nearly big enough, and their algorithms — the computational photography they use to try and bridge the gap — are just still not good enough. And so very try hard at the same time.

Google, until now, had been sticking with multi-generational hand-me-down sensors backed up first by massive server-side processing and, more recently, just ludicrous localized machine learning. HDR+. Super Res Zoom. And now, they can literally Magic Erase anything from our memories this side of Tiger King Season 2.

Apple never offered Samsung’s big glass or Google’s big algo, but a really good balance of the two. 12 megapixel sensors with bigger pixel sizes, wider apertures, longer focal lengths, and increasingly useful modes, from Smart HDR to deep fusion to Night Mode to Macro to Dolby Vision. Basically, big compute driven in real-time by the best pound-for-pound silicon on the planet.

And, for a long time, 12 megapixels was a real sweet spot for that big compute. Enough data for photos and 4K video but still light enough to be super easy, barely an inconvenience to round trip through the image signal processor, neural engines, and GPUs for a ton of stacking, bracketing, analyzing, and rendering. Because that’s the bottleneck. Getting the data through the pipeline in a timely enough fashion. Especially for things like 4K30 Dolby Vision in ProRes.

So, why go 48 megapixels now?

Well, first, 8K. 12 megapixels is enough for 4K video but nowhere nearly enough for the four times bigger 8K video. Four because it doubles both vertical and horizontal resolution. So, going to 48 megapixels on the iPhone 14 means being able to capture 8K video on the iPhone 14, which is what a bunch of bigger glass phones have already been able to do for a while now.
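If you want to sanity-check that four-times claim, the pixel math is simple enough to run yourself — these are just the standard consumer UHD resolutions:

```python
# Why 12 MP covers 4K video but nowhere nearly covers 8K.
uhd_4k = 3840 * 2160   # pixels per 4K UHD frame
uhd_8k = 7680 * 4320   # pixels per 8K UHD frame

print(uhd_4k)           # 8294400  (~8.3 MP)
print(uhd_8k)           # 33177600 (~33.2 MP)
print(uhd_8k / uhd_4k)  # 4.0 — double the width times double the height
```

So a 12 megapixel sensor clears 4K’s roughly 8.3 million pixels with headroom left over for stabilization cropping, but 8K’s 33 million pixels need something like that 48 megapixel jump.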

And even though Apple hasn’t really embraced 8K yet on any of their current products, those new ProRes engines on the M1 Pro and M1 Max can tear through it, and presumably the ones on the A15 Bionic already as well. Never mind what may be a major escalation with the anticipated 3 nanometer A16 that’s coming next. There could also be an Apple TV that supports 8K at some point, and depending on how Apple plays it, it could be super useful for capturing VR content for the dual 4K displays rumored to be on board the upcoming mixed reality headset.

8K sensors also typically make for much better 4K video, thanks to down-sampling. I’m shooting on 5.9K downsampled to 4K right now, for example.

Which brings us to the next reason, pixel-binning. What that means is, taking groups of pixels, like 4, 6, 8, etc. and treating them as a single pixel. In this case, taking groups of 4 pixels and treating them as one, so 48 megapixels become 12 megapixels again. It’s pretty much the same idea as a Retina display, but the other way around.

Now, in some cases, that can be worse, like if you just cut up the same size sensor into even smaller pixels and call it a day. That way you get more, smaller pixels that capture even less light, less data, because you lose more to the increased number of borders in between.

But, if instead, you take those smaller pixels and make each one ever so slightly bigger again, when you bin them back together, you end up capturing more light, more data than before.

Which is what Kuo’s report says Apple is going to do. With a 4x binned pixel size of 2.5 microns, which would be WAY bigger than the current iPhone 13’s already increased pixel size of 1.9 microns.
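If binning is hard to picture, here’s a toy sketch — made-up sample values, obviously not Apple’s actual pipeline — of how four pixels become one, and why that 2.5 micron number matters:

```python
# Toy 2x2 pixel binning: a 4x4 "sensor" becomes a 2x2 image by
# summing each group of four neighboring pixels into one value.
sensor = [
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 10, 11, 12],
    [13, 14, 15, 16],
]

def bin_2x2(px):
    """Combine each 2x2 block of pixels into a single binned pixel."""
    return [
        [px[r][c] + px[r][c + 1] + px[r + 1][c] + px[r + 1][c + 1]
         for c in range(0, len(px[0]), 2)]
        for r in range(0, len(px), 2)
    ]

print(bin_2x2(sensor))  # [[14, 22], [46, 54]]

# Light-gathering area scales with the square of pixel pitch:
# the rumored 2.5 micron binned pixel vs the iPhone 13's 1.9 micron pixel.
print(round((2.5 / 1.9) ** 2, 2))  # 1.73 — roughly 73% more area per pixel
```

Same sensor, same data, but each binned pixel pools the light of four physical ones, which is where the noise advantage comes from.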

So, why bother to pixel bin and not just keep the sensor at 12 megapixel with 2.5 micron pixels? Well, 8K, but also for photos, the advantage to pixel binning is that you also don’t have to do it. You can downsample 8K to 4K, but you can also just keep 8K. Same way you can pixel bin 48 megapixels to 12 megapixels, but you can also just keep the 48 megapixels. And for computational photography, you can stack and bracket both. Another weapon in the war between detail and noise.

Also, it can help in the one giant hole left in Apple’s overall camera system — zoom. The iPhone 13 Pro can do macro, super close is now covered. And it has a 3x punch in, thanks to the new effective 77mm telephoto camera. But 3x is still just 3x and even using Smart HDR to improve digital zoom the way Google uses HDR+-style techniques for Super Res Zoom can only do so much.

But like with Google’s new Pixel 6 Pro, with 48 megapixels plus, you can do more. You can give the people more. Or, more accurately, you can give the zoom more. Maybe.

Samsung, Huawei, and others use periscope cameras for real, high-level optical zoom. Periscope because the lens is too long to be mounted normally so it’s folded sideways across the phone and they use mirrors to point it out the back. And that hasn’t been rumored for Apple until at least the iPhone 14s or iPhone 15 or whatever they call the 2023 model.

But for a real escalation in zoom, Apple would probably have to go to at least 48 megapixels on the telephoto as well as the wide angle. 12 megapixel is still more than fine for the ultra-wide angle. But you really want that tele for 4x or 5x. And current reports suggest Apple is only focusing on 48 megapixels for the main, wide-angle camera this time around.

Now, Apple has always treated the separate cameras as more of an implementation detail. Fusing them together and using them according to need and conditions. So, if you hit 3x on the iPhone 13 Pro, you may get the 77 mm f/2.8 in bright light, but in low-light you’ll get the much faster 26 mm f/1.5 wide angle instead. It cares way less about which camera it uses and way more about delivering the best photo it can. So maybe Apple will whip up some wild 3x telephoto, 4-5x wide-angle camera fusion with this type of megapixel enhanced system as well. That would dramatically improve the iPhone camera system… again… and also be in keeping with Apple’s traditional balance between optical and computational.

Categories
Uncategorized

Apple Myths Debunked — And Confirmed!

They’re listening to you over your mic. They’re slowing down your phones. Force quitting apps saves battery. Charging destroys battery. Steve Jobs would never… Antenna-gate was overblown. Apple hates Nvidia. Myths, misconceptions, misinformation about Apple… it spreads faster than an Android Virus… Kidding. Or am I? Because there’s no such thing as malware on Mac… or is there? Never mind reading it in black and white, we see it on Facebook. People doing their own research. Retweeting at the speed of social. But what’s even real… and what’s just really, really fake?

I’m Rene Ritchie, I’ve been interrogating Apple for almost a decade and a half, and I’m going to debunk some of the juiciest iPhone and Mac myths on the planet… and maybe just confirm some of the biggest ones as well!

MYTH: Social Media Apps are Listening In on Us

Social Media apps like Instagram are just so damn good — as-in-creepy — at showing us ads for things we talk or text about, some people think they simply have to be listening in, spying on us through our iPhone mics. And no, they aren’t… pretty much because they just don’t have to.

It’s been investigated a bunch of times and no one’s found any evidence to support it. Which is actually pretty compelling because Facebook’s been caught doing other creepy stuff like spying on the apps we use to figure out which to buy or copy. And recording us and uploading those recordings for analysis and exploitation on their servers would be such a big battery and data hit it’d be pretty easy to catch as well.

But they already track us like… the Military in a MrBeast video, through their apps and across the web, they know who we’re connecting with through their social graphs, and they even get our IRL transactions through data brokers. So, they basically have full-on behavioral profiles on all of us, which… forget predictive, border on precognitive at times. Also, we only really remember the super creepy ads anyway, and filter out the dozens of dumb ones we see all day, every day. So that just fuels all the stories.

But it’s also exactly why Apple’s been building features like App Tracking Transparency and Private Relay into their products.

MYTH: Apple is sabotaging older iPhones

Apple is secretly slowing down our older iPhones to annoy us into upgrading to new ones. That’s something that bubbles up especially when new versions of iOS and iPhones hit all the world, all at once, and everyone’s Spotlight search and photo indexes are rebuilding and updated versions of Apps and libraries are downloading for a good day or two.

Also, when Apple decides not to push some of the most intensive new features to some of the oldest phones, because they think it’ll legit thrash performance, they get accused of withholding to force upgrades, so there’s some damned if you do, damned if you don’t at work there as well.

But, because Apple makes iPhones that can last 5 or more years, and provides iOS updates for up to 5 years, and as batteries and processors age out, and software and apps get heavier, bits rot, junk builds up in Other storage, people complain and conspiracies spread. Even if fresh batteries and clean installs can take a lot of that load off. And, newer phones with faster processors and bigger batteries run everything smoother for longer by comparison. Because obviously.

But this really came to a head a few years ago when Apple was accused of secretly throttling performance on older phones. See, the year before, Apple found out their increasingly high performance chipsets were causing brown outs on phones with worn out batteries. Basically, if the processor spiked, and the battery couldn’t meet demand, the phone would shut down to prevent damage, and you had to plug in to reboot it, which could be super inconvenient or even dangerous. So, Apple fixed the reboot hassle, but also throttled those processors on worn out batteries to prevent spikes, which would prevent the brown outs. But they didn’t explain it well, or that a battery replacement could fix it, and they were hella aggressive about the throttling, and nobody even remembered it happened until a year later when Geekbench and Reddit exploded it all over the net.

Apple apologized, provided super-cheap battery swaps, added a switch for people who preferred the potential of a brown out to the constant throttling, and built better battery management software to greatly reduce the potential for it occurring on newer phones.

MYTH: Force Quitting Apps Saves Your Battery.

Some people insist on force quitting every app, all the time, thinking it’ll save battery life or increase performance. Others insist you should never, not ever force quit any app. The truth is, most of the time, force quitting apps actually wastes battery life. iOS doesn’t work like a traditional operating system, not like Windows, Android, even the Mac. It has no concept of swapping or paging memory out to SSD. So, apps stay in memory, in RAM, in a suspended state until you go back to them, and they wake up, or iOS needs that memory for something else, and so jettisons the oldest apps on the stack to make room for the new ones.

The exact same lack of swap is also why having 16 GB on an iPad Pro doesn’t mean you can have bigger apps, it just means you can have more big apps in memory for longer. But that’s another video.

Now, waking an app takes far less power than launching an app, so if you make it a habit of force quitting all your apps, just killing them to watch them die, you’re forcing iOS to relaunch them all from scratch each and every time, which takes more power.

The only exception is if an app goes rogue, doesn’t shut down properly, and leaves processes just churning away. Usually a super thirsty social network app like Facebook or cross-compiled game like Pokemon Go. Then you’ll feel your iPhone get warm and literally see your battery drain. In that case, just force quit away. Check battery usage in Settings if you’re not sure which app to yote, or if you’re desperate and don’t have time to narrow it down, go ahead and force quit everything so you can call that Lyft or whatever it is you need to do to be safe and sound, and then narrow it down the next time. Just don’t do it unless it’s an emergency, and don’t ever make a habit of it.

MYTH: Charging destroys your battery

Batteries are fuel and using them depletes them. But some people believe if you don’t micromanage your charging, plug in long before it gets to zero and plug out way before it gets to 100, you’ll deplete them way way faster. And here’s the thing, I’ve talked to the actual battery engineers and testers at the actual companies that make this stuff, and they’ve taken all of that into account already.

Basically, there are really only three things that damage lithium-ion batteries: Exposing them to heat, keeping them in a high charge state for any length of time, and storing them at a low charge state for an extended length of time.

iOS will gate against excessive heat by cutting the brightness, throwing a temperature warning, and eventually shutting down, but the point is — don’t leave your iPhone out in the sun on a hot day, like on your dashboard or pool side table, and don’t leave it on a heater in winter, like in a car dock mounted on a hot air vent. Cold it can recover from when it warms back up, but heat causes permanent damage.

For high and low charge states, Apple manages that by kinda lying about the percentages. 0% isn’t really 0%. Your iPhone will shut down a while before it actually discharges completely. If you do plan on storing your iPhone for a long period of time, charge it halfway before turning it off, but otherwise you’re fine. For high charge state, if you leave the setting on, iOS will idle it at 80% and only go to 100% based on machine learning algorithms that figure out when you’ll actually need maximum charge, like right before you typically leave for work in the morning.
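To make that optimized-charging behavior concrete, here’s a toy model — my illustrative guess at the shape of the logic, absolutely not Apple’s actual algorithm, with made-up parameter names:

```python
# Toy sketch of optimized-charging logic: idle at 80% overnight,
# then finish to 100% just before the predicted unplug time.

def target_charge(hour, predicted_unplug_hour, hours_to_top_off=1):
    """Return the charge ceiling (%) for a given hour on the charger."""
    if predicted_unplug_hour - hour <= hours_to_top_off:
        return 100  # close to departure: finish the charge
    return 80       # otherwise hold below full to reduce battery wear

# Plugged in at 11pm, predicted unplug at 8am the next morning (8 + 24):
print(target_charge(hour=23, predicted_unplug_hour=8 + 24))  # 80 — held
print(target_charge(hour=7, predicted_unplug_hour=8))        # 100 — topping off
```

The real system learns the unplug prediction from your routine; the point of the sketch is just that "sitting at 100% all night" is the thing being avoided.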

Now, you still might be able to reduce gross charging cycles by micromanaging when you plug in and out, but you’re exchanging mental health for charging health, and losing time and convenience for what might amount to a few percentage points or a not very expensive battery swap a few years down the line. Especially when you consider impact like from drops or ambient heat can erase any and all effort you put into micromanaging at any time anyway.

MYTH: Antenna Gate was Overblown

Back in the Jurassic period, when BlackBerry and Palm still roamed the earth, Apple had a problem with the iPhone 4 — You could kill the cellular connection by putting your finger on the gap between the antenna bands on the bottom. In typical counter-conspiracy fashion, some people then and now think this was exaggerated, even all made up, that there was in fact no unintentional network pause play button.

Partially, because it just seems so ridiculous now, and because de-tuning and attenuation kept being conflated back then. See, you could attenuate pretty much any phone by wrapping your ugly meat paw of mostly water around it and disrupting enough of the antenna to interfere with the signal. But, the iPhone 4… that you could also de-tune by placing your capacitive flesh finger right on that naked stainless steel bridge spot.

Cases prevented both, but not everyone used a case and if you happened to be in an area with poor cellular service or signal strength, which wasn’t exactly uncommon back then, the drop could be enough to kill your reception.

But yeah, anyone could objectively see the de-tuning by going into Field Test Mode by dialing *3001#12345#* and then just measuring the drop.

Apple ended up giving away free bumpers, which insulated the antennas from our death touches, and engineered full fixes for de-tuning with the Verizon iPhone 4 and, half a year later, all versions of the iPhone 4s.

Since then, antenna systems have gotten way smarter and more advanced, and RF transparency, way more sophisticated — insert your mmWave jokes here — so neither is really a problem any more anyway. But it sure as hell was back then.

MYTH: Apple Overcharges

The Apple Tax is a passive aggressive — or just aggressive aggressive — way of saying people want Apple products but wish they were less expensive. See, other companies typically offer a range of price points, from budget to premium. They have options with worse displays. Slower SSDs. Lower performance chipsets. Less durable materials and build quality. Sometimes supported by ads or other business goals. And Apple doesn’t do that. They’re not Toyota where you can get the Lexus version if you want. They’re BMW or Porsche, where that’s the only version. Even if every once in a while they throw out a Boxster.

And because enough of Apple’s customers are super satisfied with those premium products, Apple hasn’t been forced to sell at near zero margin or produce budget versions, the way many other companies have. And when Apple dabbled with it, like the iPhone 5c, or the crappy HD in iMacs, it’s been largely rejected by the market in favor of older premium products, like the previous few generations of iPhone, or with updated internals, like the iPhone SE or baseline iPad or MacBook Air.

Now, Apple’s margins actually stayed pretty much the same for many, many years after Steve Jobs. Hardware margins even went down as design started spending more on chamfered edges and 3D Touch layers, engineering wanted the latest technologies first, like 7 and 5 nanometer processes and endlessly more custom silicon, and marketing felt it critical to invest in more expensive parts like OLED and 5G. Which is one of the main reasons Apple is now so all-in on services, because those margins are way, way, way higher — I mean, not makeup or fashion higher — but they do lift everything else up.

If you want Apple stuff but just can’t or won’t pay Apple prices, it’s still going to make you hella mad, and there’s no getting around that. But, if you go beyond simple up-front cost, and factor in how long Apple hardware typically lasts, how often you get software updates, all the free software, and the typically high resale values, the total cost of ownership on Apple products can end up offering way more value than way cheaper alternatives. And there’s no getting around that either.

MYTH: Apple Hates Nvidia

Apple used to use Nvidia graphics cards. And then they didn’t. Much to the consternation of Cuda-stans everywhere. And yes, there was an incident, a defect Apple felt Nvidia failed to take responsibility for, and that caused some Tay Tay level Bad Blood.

But, ultimately, these are two hugely successful companies with two very different agendas. Nvidia wants to reduce PCs to commodity front ends for their massive graphics cards, so it doesn’t matter who you buy your box from or how you build your rig, it just has to have Nvidia inside and you’re golden. And Apple wanted to reduce GPUs to commodity cores for their Metal framework, so it didn’t matter which cards they sourced, they’d all just work with macOS. But where AMD was willing to let Apple do what they wanted, including bypassing everything and going straight to the lowercase metal if and when they wanted to, Nvidia was absolutely not.

In hindsight, now that Apple’s switched to M1 silicon, it’s easy to see why they wanted what they wanted. Apps that used Metal did in fact just work on M1, even the Intel versions, even under x86 emulation. And now Apple can focus on making better GPU and more specific accelerators, and profit. But it’s hard to fault Nvidia for being so… Nvidia about it either, even though going on a year later you still can’t so much as sell an organ for a 3090.

Because, when you have two companies that hugely successful, powerful, and, yeah, bull-headed, neither is going to flinch until the market hurts or motivates one or both of them enough to force it.

MYTH: There’s No Malware on the Mac

The idea that Macs don’t get viruses dates back to the days of Windows 98 and XP, where Microsoft had almost complete dominance of the personal computer market… but hardly a lick of security built into their operating system. And since malware is an economic crime, where bad actors want to hit the biggest target at the least expense, they cared almost nothing about the Mac, which not only had better-for-the-time security thanks to its BSD UNIX foundations, but also just nowhere near the install base worth bothering with.

Now, Windows security has steadily improved over the last decade, and thanks to the iPhone and iPad, Apple’s profile has risen considerably. Also, the types, goals, availability, and distribution methods of malware have evolved. Especially with the growth of the internet, which has made adware and especially phishing a problem for everyone, because it really doesn’t care what OS you have. It just wants your accounts.

And you can see the effect of all of this just by looking at how much effort Apple’s been putting into hardening the Mac over the last few years. Everything from XProtect, which is their constantly updated Malware scanner, to Gatekeeper, which can prevent non-App Store or signed apps from installing, to sandboxes, secure boot, and read-only system volumes to mitigate against malware-level tampering.

But it’s never been less true, so Apple has to keep tightening up those blast doors and we have to make sure we don’t open them back up every time a fake free app or dodgy porn link slides into our DMs.

Steve Jobs would never

Ritchie’s law — pretty much any time someone says “Steve Jobs would never do that…”, you can find several examples of Steve Jobs doing pretty much exactly that. The only major exception that comes to mind is letting the Butterfly keyboard stay on the market as long as it did, given how fast he killed the G4 Cube and reverted the buttonless iPod shuffle and wide-body iPod nano. It’s just hard to see him not Mobile Me flame throwering that thing within a year… 18 months max.

But Steve was Steve and Tim is Tim and that was then and this is now, so either way it’s still one of the laziest logical fallacies to drop on any particular argument.

For actual insight into how Steve Jobs actually ran Apple, check out Ken Kocienda’s Creative Selection: Inside Apple's Design Process During the Golden Age of Steve Jobs… Ken helped build the Safari Web Browser, the iPhone keyboard, and Apple Watch Faces, and he details how Steve set up small teams, assigned direct responsibility, and insisted on intense focus to produce… amazing results.


M1 Pro vs M1 Max — The Ugly Truth

Ok, real talk. I haven’t seen this many comments, tweets, chats, DMs, just questions and confusion in general about anything since 8GB vs. 16GB on last year’s M1 Macs:

Should you get the M1 Pro or M1 Max MacBook Pro?

And yeah, because despite all the reviews, all the recommendations, all the benchmark LARP, it still seems super hard, a real inconvenience, to get a simple, straight, easy-to-understand answer. Well, I’m Rene Ritchie and I’m going to give it to you. Right now.

I think the reason 8GB vs. 16GB was so confusing last time was because M1 was only in ultra-low power Macs and people who would normally buy higher end Macs were desperate for M1, didn’t want to wait, and were trying to figure out if they could fit into the more widely available 8GB models.

And I think the reason M1 Pro vs. M1 Max is so confusing now is broadly the same. Those higher-end chips are only in MacBook Pros right now and people who would normally buy iMacs or even Mac Pros are desperate for Apple Silicon, don’t want to wait, and are trying to figure out if they can fit into a laptop for now.

And because, in a way, it’s simpler than the old Intel i5 vs i7 vs i9 with Iris integrated or AMD discrete graphics, and because of the reputation M1 has built for punching just way, way above its weight, and, honestly, 2021 being as awful in its own way as 2020, especially in terms of how much everything costs, I think people really do want to figure out if they need to go Max or can save some money and go with the Pro.

And it doesn’t help that every second tweet is from someone saying how awesome or terrible the battery life is, because they don’t realize surfing the web in Safari hits so different than rendering video in Premiere Pro. Or if there’s a memory leak, or if you’re stuck on Intel emulation in Rosetta and the app never bothered to update for Metal, or any of a dozen other things or contexts.

So for people who really just want to know which MacBook Pro they can use as a bridge machine for the eventual iMac or Mac Pro, or pros looking to switch to the MacBook Pro for the first time and wondering how much they really need to spend, well, here’s the honest answer.

If battery life is the absolute most important thing to you, if you need to be able to use your MacBook Pro away from main power, a lot, the added size and weight isn’t an issue for you, because you just need your laptop to last as long as it can possibly last, get the 16-inch M1 Pro.

It’ll save you $200 to $400 off the cost of an M1 Max, or let you put that money towards more RAM or storage instead.

The 16-inch has a bigger battery, so you have more potential power to begin with, and a bigger thermal envelope, so it doesn’t get as hot and there’s even less chance the fans will have to power up. The M1 Pro also has fewer transistors, in other words, fewer graphics cores, fewer media engines, and potentially less RAM to light up, which means it uses less power as well. It does have a bigger display to drive, but all in all, the combination of bigger battery, bigger thermal envelope, and less silicon, makes it the battery life champ.

Of course, if you do light up all the transistors in M1, like rendering video with a ton of effects applied, with screen brightness all the way up, on a shoddy Wi-Fi connection, on a hot Arizona day, you can still kill even the 16-inch battery pretty quick. That’s why I keep saying potential. Battery is fuel and a bigger tank is just a bigger tank. The more and faster you go, the more and faster it’ll drain.

Now, if performance is the absolute most important thing to you, if time is literally money, or far more precious to you than money, and the faster you can get through your work, or the more work you can get done in any given day, is everything, then get the 16-inch M1 Max.

The CPUs are mostly all the same. Mostly. And I’ll get to the exceptions in a binned-down minute. So, if you’re doing music with a ton of plugins, for example, or anything CPU bound, it won’t really make a difference. But the M1 Max has up to double the GPU cores, which means it’ll just tear through any heavy graphics work, from 3D modeling to textures, transformations, effects, and more, and double the Media Engines, so H.264, HEVC, and ProRes will render almost literally twice as fast. And while they’re rendering, because they’re on the Media Engines and not CPU or GPU bound, you can still use that CPU and GPU for other work. If you have time. Because so damn fast.

Because the 16-inch has a bigger thermal envelope, those extra GPU cores have more room to breathe before they start saturating the cooling system, and the 16-inch even has a high power mode so you can unleash the fans and the cores to get the most performance possible, at least within the limits of physics. I’m not going to say it’s like hitting nitro on that fuel tank, all the speed but all the drain. Because I honestly don’t know enough about racing to land that particular analogy. But I’m not not going to say it either.

So, what about the 14-inch models, if you really need to save some extra cash or you just need something smaller and lighter to carry around, or work on planes, trains, buses, and in coffee shops, hotels, and venues?

Well, they have physically smaller batteries and smaller thermal envelopes, so even though they have smaller displays to drive, they still offer less potential power. There is a binned down 8-CPU and 14-GPU version of the M1 Pro for the 14-inch, which means even less silicon that can be lit up, so even less drain even if or when fully lit. There’s also a low power mode in macOS Monterey that reduces overall draw, so you can get even more time on the battery you have, if you’re not doing anything too demanding or intense. That’d be the best bet if you want or need the 14-inch but battery life is still critical to you.

And if you want or need the 14-inch, don’t care about battery life, but do want as much performance as you can get, the M1 Max is absolutely, positively, still a monster even in the smaller thermal envelope. You can’t go as long between charges, of course, but you can still get a ton done.

Just remember, Macs are built like… aluminum bricks. They typically last 5 or more years. So don’t just think about what present you needs today. Consider what future you, or whoever you give it to, hand it down to, or sell it off to, will need tomorrow. You can get batteries replaced if and when they age out. You can hang external storage off a USB4 port. You can’t add or take out RAM, or swap an M1 Pro for an M1 Max chipset, or vice versa. If you have limited cash on hand, of course, only ever get what you can afford. But if you’re at all flexible, don’t just consider up-front cost, think carefully about long term value.

Like on Twitter or in comments, you may see people warn you away from the 14-inch M1 Max. And I for one got the 16-inch Max for all the reasons I just went over. But you know who got the 14-inch M1 Max? Apple’s VP of Silicon, the guy who helped create it, and Apple’s VP of Mac Product Marketing, the guy who helped spec it all out, because they want the power and the portability, and everyone knows their own workloads the best, so seriously, ask all the questions you want, but make sure you do you and get what you need.


Apple 2022 Mega Preview — iPhone 14, Apple VR, M2 MacBook Air, More!

iPhone 14. MagSafe iPad Pro. Mac mini Pro. iMac Pro. Mac Pro. New AirPods Pro. Just… so much Pro! But also, new MacBook Air. iPhone SE 3. Apple Watch SE 2. Apple Watch… Extreme. And maybe, just maybe, Apple VR.

I’m Rene Ritchie, and if you thought 2021 was Apple-packed… just wait’ll we get a load of 2022!

Ok, so… Apple mostly holds events in Spring. Mostly. Last year April, year before canceled, year before that, March… for Oprah. But Spring 2022 could indeed have more than a few highly-anticipated new and updated devices on deck.

Last year it was a purple iPhone 12, so maybe a purple iPhone 13. That or the long-rumored orange… I wouldn’t say no. Also, with the original iPhone SE debuting in the spring of 2016 and iPhone SE 2 in the spring of 2020, Apple could be speeding up that schedule and going with iPhone SE 3 as soon as spring of 2022. But with nowhere nearly as big of an update. Just swapping out the Intel LTE modem for a Qualcomm 5G modem, and the A13 chipset for an A15. But, yeah, no idea how they’re going to manage battery life with all that new iPhone 13 silicon in all that same old iPhone 8 shell…

Too soon for an AirTags update, but maybe, finally, AirPods Pro: The Next Generation. They launched in 2019, about 8 months after AirPods 2. But now we’re on AirPods 3… so, carry the 8… divide by zero, and it sure feels like time. Same great active noise cancellation, new smaller stems, more like Beats Studio Buds. But with H1, Apple’s original headphones chip, or are we finally going to see H2?

We got the A12Z iPad Pro in the spring of 2020 and the M1 in the spring of 2021, which is faster than the 18 month cycle they were on before. So, if M2 is good to go, we could see a new iPad Pro for a third spring in a row. And that would include better single core performance, up to two extra graphics cores, ProRes Media Engines, same great mini-LED on the front but now with MagSafe on back. Maybe MacBook plug style, maybe iPhone puck style, but either way, with full-on structural integrity so I don’t have to watch any more unboxers or tear-downers bend any more iPads not ever again. And I’ll have my full preview for the 2022 iPad Pro up soon, so hit that subscribe button and bell like they stole your Ronin suit. Or whatever.

If Apple wants to roll out all the M2, all at once, like they did with M1 at the end of 2020, we could also see the redesigned MacBook Air… or just MacBook… Depending on how Apple wants to brand it. I vote the latter. The original Air and original 12-inch MacBook both debuted in the first half of their respective years. But Apple isn’t nostalgic. They’re operational. They’ll announce what they want to announce as soon as it’s ready to ship. And that means yeah, M2, but also mini-LED, MagSafe, and an all-new, all-white bezeled design. Maybe alongside an M2 two-port MacBook Pro as well. Just… update all the ultra-low power things.

Last year we also got the new 24-inch M1 iMac, but what many of us are still waiting for is the newer-than-new 27- to 32-inch iMac please brand it Pro. Mini-LED display, nano-texture option, more ports, and M1 Pro, M1 Max, and maybe even a dual M1 Max chipset option. Because what every pro needs is a little workstation built right into their display. Speaking of which, there are still rumors of a new, way-less-than-$5,000, stand-included Apple display on the agenda, but nothing to indicate it’s any time soon. But please, I need it.

Fast-forward to summer, and Apple’s annual Worldwide Developers Conference, WWDC, should hit us the first week of June. The biggest question is whether or not it’ll be the first in-person event since the fall of 2019, you know, just before the world started ending. But as with 2020 and 2020 jr., we’ll have to wait and see if we actually get a for-real 2022 yet, or if it’s just 2020: Let There Be Carnage. The second biggest question is whether Apple will announce or even just tease the next Mac Pro. That’s what they did in 2013 for the trashcan and 2019 for the revenge of the cheesegrater. Same modular design, but instead of Intel Xeon and AMD Navi blast furnaces in the belly, dual or quad M1 Max instead, for even better performance… and no water cooling needed. That, and a Mac mini Pro with the new design and M1 Pro and M1 Max inside, would make every developer in the house — or at home — supremely happy. Along with iOS and iPadOS 16, tvOS 16, watchOS 9, and macOS… a little help Craig? Um… No

Now, some WWDCs, like last year’s, there’s no new hardware. Others, like 2017, there’s, well… all the new hardware. Most years, it’s mixed. And there have been rumors of a more powerful, more gaming focused Apple TV, a hybrid Apple TV HomePod for home theater, and a hybrid HomePod and iPad for kitchens and bedrooms. The Apple TV would have an A14 or better and an Apple-designed gamepad. The HomePod Theater would have room-filling — and shaking — Atmos and spatial audio. The Home… Pad would have a display and Center Stage for full-on family FaceTime fun. But if and where any of that is on the actual roadmap for next year or the year after, we’ll have to wait and see.

Assuming Apple and the world stays on track, new iPhones in September should be fully back to being just the safest bet in tech. And that means iPhone 14. Which is expected to be another redesign, this time even more iPhone 4 than iPhone 5, deleting the iPhone mini, but adding a new iPhone Biggie… like a Max but non-Pro. So, goodbye slightly less expensive small phone, hello much less expensive big phone. And with hole punches instead of notches so you can still tell the Pros apart. But a tiny process node for the A16 chipset — presumably TSMC’s brand new 3 nanometers, just in time for Kang and the Quantum Realm. Also, better cameras, of course. Of course. Of course. And maybe a port. Maybe no port. Just hopefully something, anything, that can pull ProRes way, way faster than still 2012-era Lightning. Because… damn.

There’ll be an Apple Watch Series 8, but also an Apple Watch SE 2. Six to five and pick’em which one ends up having that flat-edged design that was all over the digital papers last year. And, maybe, a ruggedized Apple Watch Extreme for those who need even more watch for even more on-the-job or cross-country action.

October used to be iPad and Mac month, but for the last couple of years we got iPads in September. Air and Gen 8 in 2020, then mini and Gen 9 in 2021, so this year maybe Air and Gen 10? That’s how Apple seems to be rolling now. A16 on board, maybe even mini-LED, because Apple seems not just intent, but full on content to push that out across the line. Except for the baseline iPad, which should stay on a regular, non-mini-LED LCD, but maybe with the rumored size boost to 10.5 inches.

As for Macs, whatever Apple doesn’t manage to push out earlier in the year should close out at the end of the year, at the very outer limits of Tim Cook’s promised two-year transition. And that just leaves Apple’s augmented virtual mixed reality headset. Apple’s… vision… so to speak.

Now, the original iPhone and iPad were both shown off at January events, so the hardware wouldn’t leak and, in the iPad’s case, developers could get a head-start on porting their apps over to the bigger screen. The original Apple Watch debuted in September, much later in the year but for essentially the same reasons. So, it really just depends on timelines. When exactly Apple thinks the product will be stable enough to demo, and how long thereafter they think it’ll take to ship.

My guess is that, if not WWDC, it’ll be late in the year, because that’ll give Apple the most time and the world the most opportunity to return to normalcy, at least enough for an in-person event to really show off the next big thing. Then, whatever new frameworks they snuck into iOS 16 will be ready for primetime building for the new device, and either the whole entire fall, or whole entire winter to get ready to launch.

That way, when we all get our hands — and faces — on dual 4K lenses, spatial audio, M3 and R1 or whatever Apple calls the Reality chip, rOS, or whatever Apple calls the Reality Operating System, and all those sensors and pass-throughs, there’ll be apps aplenty to support it. FaceTime RoundTable, Fitness+ Classes, TV+ Theater, Music+ Concerts, Apple Arcade in actual VR arcades, the whole stack.


M2 MacBook Pro (2022) Leaks — Faster but what else?!

The MacBook Pro… but not the recent M1 Pro and Max models, the M1 regular model, the 2-port model, but with a next-generation M2, maybe a Mini-LED display, maybe even a new design. Like… the MacBook Pro having a love baby with the MacBook Air design. Again.

I’m Rene Ritchie, hit subscribe so you don’t miss any of my upcoming 2022 previews, and then, yeah, we gotta talk about this!

So, back in November of 2020 when Apple announced the first wave of custom silicon Macs, it was all about ultra low power, entry level, getting M1 into the existing enclosures of the MacBook Air, Mac mini, and 2-port MacBook Pro.

That’s the MacBook Pro that was first released back in 2016, the one Phil Schiller said was for people who wanted something just a little more than the MacBook Air. Including a Retina display and a better-than anemic Intel Y-Series chipset. And, poetically, without a Touch Bar back then, but the only one remaining with a Touch Bar now…

It was meant for a new wave of customers, of people for whom Pro was less Mac Pro as in professional and more AirPods Pro as in premium. Maybe coders or designers with lighter workloads, maybe photographers or videographers always on the go, maybe just hipsters or founders who coveted the pro brand but didn’t need all the pro power, size, and weight that traditionally came with it. Whatever. It just made the word Pro… more accessible, affordable, and to some, more infuriating than ever.

But, it legit let those who wanted an ultra-light that could sustain performance and last on battery just a little longer than a MacBook Air, and didn’t want to pay full price for a MacBook Pro Pro, just… get their goldilocks on.

And now it’s time for Round 2. More specifically, M2. Now, I’ll get to what that means in a minute. Because the actual reports are… kinda thin. Like, thinner than iPhone 6 thin. But with the M1 iPad Pro and M1 Pro and Max MacBook Pros going mini-LED… and the M2 MacBook Air rumored to be going mini-LED as well, I think chances are good the M2 MacBook Pro will go mini-LED right along with them. Because having the worst display in the line-up right in the middle of the lineup… would just make the kind of sense that doesn’t. And I’ve already got a whole entire video up previewing the M2 MacBook Air, so I’ll drop a link to that in the description below the like button.

As to 120Hz ProMotion adaptive refresh rate… probably not. Apple seems to be keeping that feature for serious pros, at least for now.

Notch? Maybe? The combination of the distinctiveness and the ability to have both smaller bezels and a bigger camera is a winner in every way but aesthetics. And I’m still here for it. As long as we get that better 1080p camera. Otherwise, f the notch. Let it burn. Right along with the Touch Bar, as I imagine it will.

We should also get MagSafe back, because Apple seems to be all-in on that, all-around, going forward. And hopefully that means in addition to the two USB4, or combo USB-C / Thunderbolt 4 ports. Because going down to one, like the old 12-inch MacBook Nothing, would just make it a non-starter, a game-stopper, for way too many people.

HDMI and SDXC… like anything above and beyond those two ports… just don’t seem likely either, though.

Back to M2. Where M1 was based on the iPhone 12’s A14 generation silicon, M2 should be based on the iPhone 13’s A15 generation silicon. And that means, what we’re looking at, is even better performance and efficiency. Where the M1 cores are a… song of Fire and Icestorm, M2 should be getting Avalanche and Blizzard.

Specifically, 4 Blizzard efficiency cores, though they offer over 20% higher performance as well. And 4 Avalanche performance cores, which are over 10% faster, but also considerably more efficient.

They’re still 5 nanometer, though probably the second-generation TSMC N5P process, and still ARMv8 instruction set, but out of the three major improvements offered by ARMv9, only the new matrix multipliers sound like they’ll be anything even remotely novel or useful to Apple in any way. So what that all means is processors that are still ultra low power but that can run cooler, so they can sustain heavier workloads longer.

The G14 graphics cores, well, they’re on average 20% better core-for-core, but in the relatively tiny iPhone 13 thermal envelope, they can throttle hard.

The MacBook Pro is nowhere nearly that tiny, and M2 is also reportedly going from M1’s 8 GPU cores to 10. And that could mean a 50% improvement overall, if not more.
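For a rough sense of where that 50% figure comes from, here’s the back-of-envelope math. Both inputs are rumors discussed above, not confirmed specs: roughly 20% better per core, and 10 cores instead of 8.

```python
# Back-of-envelope sketch of the rumored M2 GPU uplift.
# Both figures below are rumors, not confirmed Apple specs.
m1_cores, m2_cores = 8, 10
per_core_gain = 1.20  # G14 cores reportedly ~20% better core-for-core

overall = (m2_cores / m1_cores) * per_core_gain
print(f"estimated overall GPU uplift: {overall:.2f}x")  # 1.50x, i.e. +50%
```

More cores times faster cores multiply together, which is how a 20% per-core gain plus a 25% core-count bump lands at 50% overall.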

So, anyone doing any amount of graphics work would be in for just a much better time.

And A15 also includes ProRes Media Engines for the iPhone 13 Pro. Similar to the ProRes Media Engines Apple added to the M1 Pro and M1 Max this past October.

The M1 MacBook Pro already had H.264 and H.265, aka HEVC, custom encode/decode blocks, but if Apple doesn’t keep those exclusive to the higher-end Pros, and brings them over to the low-power models as well, that could be a real advantage to anyone who does a lot of video but just can’t afford or just doesn’t want to carry around a lot of MacBook Pro yet.


M1 Pro / Max — Apple’s Intel-Crushing Silicon Power Explained!

2 Icestorm high efficiency cores. Up to 8 Firestorm high performance cores. Up to 32 G13 graphics cores. 16 Neural Engine cores. With up to 64 GB of unified memory and 400GB/s of bandwidth to keep it all fed. A new display engine, not just to drive XDR displays, but multiple XDR displays, a third Thunderbolt and USB controller for more I/O, a new media engine for super fast, ultra efficient H.264, HEVC, and ProRes encode/decode. Up to 57 billion transistors. And maybe, just maybe, our first glimpse at what’s coming next for the iMac Pro and full-on Mac Pro.

I’m Rene Ritchie, hit subscribe so you don’t miss the next video, and this… is the M1 Pro and M1 Max deep dive!

Scalable Architecture

X as in Extra

Apple’s been making ‘extra’ — as in extra but also as in just totally extra — versions of their custom chipsets almost since forever. Their first SoC, or system on a chip, was the A4 in the original iPad and iPhone 4, followed by the A5 in the iPad 2 and iPhone 4s. Now, an SoC just means most everything is integrated into the same die. So, instead of having a silicon… platter… with CPU over here, GPU over there, memory on the left, controllers on the right, you have a silicon sandwich with all the cores, all the features, all stacked together. There are a ton of advantages to this approach, which I’ll get to in a minute, but one of them is scalability. Not just generation over generation, as new architectures and processes are introduced, but even within the same generation as extra cores and extra features get added.

Enter the A5X

Where the OG A5 had dual ARM Cortex A9 CPU cores, dual Imagination PowerVR SGX543MP2 GPU cores, and 512 MB of package-on-package RAM, A5X kept the same CPU but escalated the GPU to quad core PowerVR SGX543MP4, doubled the size of the memory interface, and the amount of RAM… to 1 GB… but shifted it off-package, which, I don’t know, maybe could be something again in the future…

Apple needed those extra GPU cores and memory to power their first-ever tablet-sized Retina display for the iPad 3 aka The New iPad. Even though, turned out, only barely, and Apple ended up having to get the iPad 4 and A6X out just 6 months later. And let me know if you want to hear more about that whole story in the comments!

A6X was similar to A5X, keeping the same dual CPU cores, though this time, they weren’t ARM Cortex designs but Apple’s first custom Swift CPU cores, and it took the A6’s triple core PowerVR SGX543MP3 GPU to a quad-core PowerVR SGX554MP4 GPU, and the memory to quad-channel.

So, A4, but no A4X. Then A5X and A6X, but no A7X. That’s right, Apple stuck with their first 64-bit chipset, the A7, in its origin forme not altered forme, for the first iPad Air. The iPad Air 2 though, yeah, that got an A8X. Instead of dual Typhoon CPU cores, it had three. And instead of a quad cluster customized PowerVR Series 6XT, it had an octo cluster, and again with an external RAM module. 2GB worth.

There was also an A9X and an A10X, the latter of which was part of the first generation to use Apple’s fused version of big.LITTLE, or performance and efficiency cores. Triple Zephyr e-cores and triple Hurricane p-cores, to be exact, along with 12 customized PowerVR GT7600 GPU cores.

No A11X, because by the time the iPad Pro came around, Apple had fallen into less of an every 12 months and more of an every 18 months cadence for upgrades, but yes A12X. Which was the big one, because it most directly set the stage for everything that would come with M1.

Bargain Binning

A12 was Apple’s second generation Bionic architecture, which unlike the paired Fusion architecture before it, could use any or all cores separately or together. In other words, multicore wasn’t just 2 e-cores or 2 p-cores, it was the 2 e-cores plus the 2 p-cores. All 4 cores together, like Voltron.

4 Tempest e-cores and 2 Vortex p-cores to be exact, along with 4 custom G11 graphics cores, and 8 neural engine cores, or ANE. Also, custom encode/decode blocks for H.264 and HEVC, which I’ll get to in a minute because they’ll become a much bigger deal with the M1 Pro and M1 Max as well. And, Apple’s increasingly secret sauce — their performance and, soon, machine learning controllers.

A12X kept the same number of e-cores but doubled the p-cores to 4, and almost doubled the GPU cores… almost. See, at the time, Apple announced 7 GPU cores on the A12X for the 2018 iPad Pro, but it turns out there were actually 8 cores; Apple was only making use of 7. They didn’t start using all 8 until the A12Z, the second iteration of that SoC, for the 2020 iPad Pro.

A12X also had 4GB of integrated RAM for most models, but 6GB for the highest tier model, which it needed to support 1 TB of storage. A12Z, though, had 6GB for all storage tiers. And all of this, from binning to memory levels, was something we’d start to see really play out for the M1… but especially for the M1 Pro and M1 Max.

Which, no surprise, because the A12Z also just so happens to have been the chipset they used for the Apple Silicon Mac dev kit — the iPad guts in the Mac mini case intended to help get apps ported over and ready for M1.

And yeah, there was no A13X. Even though Apple introduced the A13 for the iPhone 11 back in September of 2019, they were still perfectly happy to ship the A12Z for the iPad Pro and Dev Kit in 2020.

Same way there was no A14X… because it essentially became M1… and even though Apple introduced the A15 for the iPhone 13 back in September of this year, Apple was likewise still perfectly happy to ship the M1 Pro and M1 Max for the new MacBook Pros just one month later. And wow are they extra. Like triple X as in Extra.

M1 Pro & Max CPU

M1 is more than just A14X with new branding, of course. It has specific silicon IP for the Mac. But that didn’t stop Apple from following up on the A12Z iPad Pro with the M1 iPad Pro. Because the architecture was and is so broadly similar between the generations. 4 e-cores, 4 p-cores, 8 GPU cores, but 16 ANE cores now instead of the 8 on A12X and Z. And fabbed, or fabricated, on Taiwan Semiconductor’s 5 nanometer process. Giving it even greater performance efficiency.

Instead of A12 generation Tempest and Vortex cores, M1 has A14 generation Icestorm and Firestorm cores for the CPU, which provided a really good balance between that efficiency and performance for Apple’s initial wave of ultra low power Macs — the MacBook Air, 2-port MacBook Pro, Mac mini, and redesigned 24-inch iMac. A whole lineup, from iPad Pro to iMac non-Pro, ultra-long lasting portables to ultra-low thermal desktops. Talk about your scalable architecture.

But with M1 Pro and M1 Max, Apple wasn’t as concerned with ultra low power. What they needed to deliver was ultra high performance. So, instead of 4 e-cores, they dropped those down to just 2. Bigger batteries and adaptive refresh rate displays would offset any real differences there anyway. And then they bumped the p-cores up from 4 to 6 or 8 for the M1 Pro and a solid 8 for the M1 Max.

The 6 p-core version in the M1 Pro being a binned down version. Same as what Apple did with A12X and even M1 on the GPU side. See, when monolithic chips like Apple’s SoCs come off the fab, especially on leading edge process nodes like TSMC’s 5 nanometer, there can be defects and some of the cores can be non-viable. If you just throw away every chip without a full set of perfectly functional cores, you end up with a lot of waste, which means low yield, low volume, and a high price per remaining unit. But, by keeping the ones with only 7 out of 8 GPU cores, or 6 out of the 8 p-cores, they throw away fewer chips, which means a better yield, which keeps volume up and prices down per unit. Then Apple passes on some of those savings to people who are fine buying fewer cores if it costs them less money.
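To see why binning helps so much, here’s an illustrative sketch using a simple binomial yield model. The 97% per-core number is made up for illustration; Apple’s and TSMC’s real defect rates aren’t public.

```python
from math import comb

# Illustrative binomial yield model for core binning.
# p_core_ok = 0.97 is a made-up per-core defect-free probability,
# NOT a real TSMC or Apple figure.
def die_yield(n_cores: int, p_core_ok: float, min_good: int) -> float:
    """Probability a die is sellable if at least min_good of n_cores work."""
    return sum(
        comb(n_cores, k) * p_core_ok**k * (1 - p_core_ok)**(n_cores - k)
        for k in range(min_good, n_cores + 1)
    )

p = 0.97
perfect = die_yield(8, p, 8)  # only perfect 8-core dies are sellable
binned = die_yield(8, p, 7)   # 7-of-8 dies ship as binned parts too
print(f"all-8 yield: {perfect:.1%}, binned 7-or-8 yield: {binned:.1%}")
```

Even with a high per-core success rate, requiring all 8 cores perfect throws away a meaningful fraction of dies; accepting 7-of-8 parts recovers most of that loss, which is exactly the volume-and-price effect described above.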

So, yes, both M1 and the binned down version of M1 Pro have 8-core total for the CPU. But where M1’s 8 cores are the sum of 4 e-cores + 4 p-cores, the binned down M1 Pro’s 8 cores are the sum of 2 e-cores + 6 p-cores.

In other words, instead of 4 Toyotas and 4 Ferraris, you’re getting 2 Toyotas and 6 Ferraris. Which is more Ferraris.

And then the regular M1 Pro and the M1 Max both have 10 cores total for the CPU. The sum of 2 e-cores and 8 p-cores, or 2 Toyotas and 8 Ferraris. Which is even more Ferraris.

Why 6 or 8 p-cores for the M1 Pro instead of 7 or 8, like the GPU cores on the regular M1? It might just come down to the realities of the fab, or it could have to do with the 8 p-cores actually being 2 clusters of 4 p-cores each. Also, each cluster has its own 12MB L2 cache, and each cluster can dynamically clock its CPUs independently, meaning a single active core on each can go all the way up to 3.2GHz, two cores can cut down to 3.1GHz, and 3 or all 4 cores, down to 3GHz. Sacrificing a little serial speed for a lot of parallelism.
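That per-cluster clock stepping can be sketched as a tiny lookup. The frequencies come from this section; treating it as a clean step function is a simplification of the real, more dynamic behavior.

```python
# Simplified sketch of M1 Pro/Max p-cluster clock stepping.
# Frequencies are the ones cited above; real behavior is more dynamic.
def p_cluster_clock_ghz(active_cores: int) -> float:
    """Clock for one 4-core p-cluster, by how many of its cores are active."""
    if not 1 <= active_cores <= 4:
        raise ValueError("a p-cluster has 1 to 4 cores")
    return {1: 3.2, 2: 3.1, 3: 3.0, 4: 3.0}[active_cores]

for n in range(1, 5):
    print(f"{n} active core(s) -> {p_cluster_clock_ghz(n)} GHz")
```

The shape of the table is the whole point: a lone core gets the highest clock, and each additional active core trades a little peak frequency for more total throughput.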

The 2 e-cores are clocked at 2GHz, but get the same 4MB of L2 cache that the M1 has for its 4 e-cores. On top of all that, where M1 has 16MB of system level cache, M1 Pro has 24MB and M1 Max, 48MB of SLC.

So, each individual e-core and p-core is the same, meaning any single core task will perform the same on M1, M1 Pro, or M1 Max. Like driving any one individual Toyota or Ferrari. But there are more p-cores even on the binned down M1 Pro, and many more on the regular M1 Pro and the M1 Max, meaning any multi-core task will run just that much faster. Because so damn many Ferraris. And that’s not even counting the improved memory system, which I’ll get to in a minute.

And that’s the first way Apple’s M1 Pro and M1 Max feel so fast. Just the overall speed of the cores. Everything gets done faster.

But because they’re all still Apple cores, not Ferraris, cores that have to fit in the tiny thermal envelopes of iPhones and the relatively small thermal envelopes of iPads, even the performance cores are still wildly efficient. Which is just the starkest of contrasts to the previous Intel chipsets, which just… chugged power rather than sipped it, and hit thermal max pretty much at startup, only to ramp up and down… incessantly, constantly, thereafter.

At just 30 watts, fully fired, inside the relatively roomy chassis of the MacBook Pro, the M1 Pro and M1 Max CPUs can sustain… pretty much indefinitely.

And if you’re worried about battery life, you can turn on the new low power mode in macOS to maximize the efficiency. Conversely, on the 16-inch MacBook Pro, because of the even bigger thermal envelope, you can turn on high power mode. That lets the fans and chips loose, so you maximize the performance.

It’s a cool idea made possible precisely because M1 architecture is so cool.

M1 Pro & M1 Max GPU

Carrying on the theme, where M1 had 7 or 8 slightly tweaked A14 generation G13 GPU cores, M1 Pro has 14 or 16 of those GPU cores and M1 Max… 24 or 32. And that’s just… such a ridiculously massive escalation. To help put it in context, M1 had 16 billion transistors. M1 Pro has 33.7 billion and M1 Max… a brain-blowing 57 billion, with all those GPU cores being a significant part of that budget.

But Apple’s always leaned heavily on the GPU for everything from interface acceleration, down to the literal Core Graphics and Core Animation frameworks, to things like the old OpenCL and the new Metal APIs.

And doing it this way, integrated vs. discrete in a laptop, really turns out to be more than just an implementation detail. Especially when you’re talking SoC sandwiches rather than old-fashioned board platters.

Because Apple is keeping the CPUs fed with 16 to 32 GB of LPDDR5 memory on the M1 Pro and a whopping 32 to 64 GB on the M1 Max. Which, yeah, sure, isn’t anything new or novel for a MacBook Pro CPU, but because of the SoC architecture, that RAM isn’t just for the CPU. It’s a massive memory pool that’s also available to all the other compute engines, including the GPU. Compare that to the traditional board architecture, where the GPU might have 8 GB of dedicated VRAM if you’re lucky, 16 at the highest, most expensive end. Here it goes up to 64 GB. Which is just unheard of on a laptop.

And to keep all that fed to the GPU, Apple’s opened up the memory bandwidth. All the way up. M1 does 70 GB/s. M1 Pro does 200 GB/s, and M1 Max… a jaw-dropping 400 GB/s. And you guessed it, because of the unified architecture, the CPU and other compute engines also get access to all that bandwidth, which is also unheard of. Just… unheard of things all the way down.
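
To make those bandwidth numbers a little more concrete, here’s a quick back-of-envelope sketch: how long would it take to sweep each chip’s entire maximum unified memory pool once at peak bandwidth? The peak figures are from Apple’s specs; real workloads won’t saturate them, so treat this as an upper-bound illustration, not a benchmark.

```python
# Peak memory bandwidth and max unified RAM per chip (Apple's published specs).
chips = {
    "M1":     {"bandwidth_gbps": 70,  "max_ram_gb": 16},
    "M1 Pro": {"bandwidth_gbps": 200, "max_ram_gb": 32},
    "M1 Max": {"bandwidth_gbps": 400, "max_ram_gb": 64},
}

for name, spec in chips.items():
    # Seconds to read (or write) the whole RAM pool once at peak bandwidth.
    sweep_s = spec["max_ram_gb"] / spec["bandwidth_gbps"]
    print(f"{name}: {spec['max_ram_gb']} GB / {spec['bandwidth_gbps']} GB/s "
          f"= {sweep_s:.2f} s per full sweep")  # 0.23 s, 0.16 s, 0.16 s
```

In other words, even with four times the maximum RAM, M1 Max can turn over its whole pool faster than M1 can, and every compute engine on the SoC shares that headroom.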

It’s the second way M1 Pro and M1 Max feel so damn fast — the instant responsiveness afforded by that unified memory system and overall architecture. It makes the Mac feel as utterly instant as the iPad, even more utterly… instant-er now.

Also, where a company like Nvidia essentially abstracts away the computer into an interface for their CUDA cores, Apple’s Metal frameworks abstract away the GPU instead, so anything written against Metal on previous Intel or AMD graphics will work on M1, Pro, and Max GPUs, and because Apple’s GPUs are so damn good, chances are it’ll just work better. Massively better.

And even though M1, M1 Pro, and M1 Max vary so much in capability, the scalable architecture means they present as very, very similar targets to developers. I mean, Apple had to do a ton of work on the fabric that brings together and binds all these cores and all this RAM, but anything already written for M1 will just fly on M1 Pro and… go full-on orbital on M1 Max.

And again, because these GPUs were designed for performance through efficiency, and have to scale from the iPhone 12 to the iPad Air to the iPad Pro to the MacBook Air to the MacBook Pro to the iMac, they still only just sip power.

Even firing the CPU and GPU and… basically everything… the M1 Max flat out uses slightly less power than the 100-watt baseline of an Intel Alder Lake CPU — a chip that can also reach over 300 watts when overclocked, as much as a giant helicarrier-looking Nvidia Ampere card.

Put those two things together, and even in a desktop, where Intel + Nvidia would require near-cryogenic levels of cooling, Apple could easily throw multiple M1 Max dies into even smaller, thinner enclosures and still offer ridiculous levels of performance.

And you better believe I already have a video up on just exactly that, linked in the description below the like button.

M1 Pro & M1 Max Media Engines

I’ll get to the media engines in a supremely hot second, but in addition to the GPU, the M1 Pro and M1 Max have a third USB and a third Thunderbolt controller, which not only lets them power more ports than the original M1, but more displays. Up to two 6K displays with the Pro and three with the Max, in addition to a 4K TV over HDMI.

That’s HDMI 2.0, not 2.1, which has much higher bandwidth. But when these chips were being specced out a couple of years ago, HDMI 2.1 was even less of a thing than it is now, and Apple figured that I/O bandwidth should go to a third Thunderbolt port for more pro-centric displays rather than faster HDMI for then-bleeding-edge TVs. Same with the SDXC card slot. Feel utterly free to quibble in the comments about that, but they’ll eventually just amp up the I/O in a future generation.

Now, those media engines.

Apple’s been adding custom encode and decode blocks to their silicon for years. And, honestly, hardware acceleration for video playback isn’t at all uncommon. Video transcoding has been a little more hit and miss, but not by much. And over those years, Apple has added support for H.264, the original 1080p standard, and H.265, aka HEVC, the 4K standard. Also Google’s alternative codecs, including the current VP9.

Apple even switched from the original T1 chipset in the Mac, which was a repurposed S2 system-in-package from the Apple Watch, to the T2 chipset, which was a repurposed A10 Fusion, in part because Intel failed to deliver H.265 encoding in a timely fashion, and Apple’s older iPhone chip could just do it faster and way more efficiently than leaving it to the CPU or offloading it completely to the GPU.

That’s why Apple Silicon Macs don’t have T2 or T-anything chips anymore. Everything Apple had to work around Intel to provide, including the Secure Enclave for Touch ID, the Neural Engine for machine learning acceleration, all the custom controllers, and, yeah, the media engines, is already built into M1. Because where T2 was an A10… M1 is an A14. And that’s how Apple silicon and SoCs work.

So, M1 has those A14 media engines for H.264 and HEVC, among other things. But what M1 Pro and M1 Max add are ProRes video engines.

Which is… not exactly a first. When Apple introduced the current Intel Mac Pro back in 2019, they introduced a reprogrammable FPGA card along with it. Branded Afterburner, it was a ProRes and ProRes RAW accelerator that could handle up to 12 streams of 4K or 3 streams of 8K.

Then, just a couple of months ago, Apple introduced the A15 Bionic for the iPhone 13, which for the iPhone 13 Pro included an extra G14 GPU core and… a ProRes media engine. That’s what lets the iPhone 13 Pro shoot ProRes 422 HQ, including a new storage controller that can write those massive 6 GB-per-minute files to the SSD without skipping a frame.
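
A quick sanity check on that 6 GB-per-minute figure. The exact ProRes 422 HQ bitrate varies with resolution and frame rate, so this is illustrative, working just from the rate quoted above.

```python
# ProRes 422 HQ capture rate on iPhone 13 Pro, as quoted: ~6 GB per minute.
GB_PER_MINUTE = 6

# Sustained write rate the new storage controller has to keep up with
# (decimal megabytes, the way drives are rated).
mb_per_second = GB_PER_MINUTE * 1000 / 60
print(f"{mb_per_second:.0f} MB/s sustained")  # 100 MB/s

# And how fast that eats storage.
gb_per_hour = GB_PER_MINUTE * 60
print(f"{gb_per_hour} GB per hour of footage")  # 360 GB/hour
```

That’s a steady 100 MB/s write, every second, for as long as you roll — which is exactly why the storage controller had to change, and why an hour of footage will happily eat a third of a 1 TB phone.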

Now, Apple’s also brought those ProRes engines to the M1 Pro and M1 Max, which is super interesting for a couple of reasons.

First, because it means Apple isn’t restricting features to specific IP generations. In other words, A15 generation ProRes Engines can show up on A14 generation M1 Pro and M1 Max chipsets. Apple cares less about abstract numerical branding sequences and more about delivering the capabilities they need to deliver in the most economical, efficient, and performant way possible.

Or as the silicon team says, their one job is to run iOS and macOS and apps faster than anything else on the planet, constrained only by time and the thermal envelope of the device, and the rest is all just implementation details.

Second, because moving them off an ASIC board like Afterburner and putting them on the SoC is again like moving them off the platter and putting them in the sandwich, so they have the same immediate access to that huge pool of unified memory and bandwidth. Which just makes them even faster.

Third, that Apple is putting such a focus on video capture and production this year. Because these new engines let you not just capture ProRes on your iPhone 13 Pro, but edit it with jaw-dropping speed and efficiency on your MacBook Pro.

Forget 12 streams of 4K on Afterburner, M1 Max can handle 30. And just sit down with your 3 streams of 8K. M1 Max can handle 7.
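
To put those stream counts in raw-throughput terms, here’s a rough sketch. It assumes UHD resolutions (3840×2160 for 4K, 7680×4320 for 8K) at 30 fps; Apple doesn’t specify the frame rate, so the fps is an assumption, and this counts pixels, not ProRes bits.

```python
# Pixel counts for UHD 4K and 8K frames (assumed resolutions).
PIXELS_4K = 3840 * 2160
PIXELS_8K = 7680 * 4320
FPS = 30  # assumed frame rate

afterburner_4k = 12 * PIXELS_4K * FPS  # Afterburner: 12 streams of 4K
m1_max_4k = 30 * PIXELS_4K * FPS       # M1 Max: 30 streams of 4K
m1_max_8k = 7 * PIXELS_8K * FPS        # M1 Max: 7 streams of 8K

print(f"Afterburner, 12x 4K: {afterburner_4k / 1e9:.1f} gigapixels/s")  # 3.0
print(f"M1 Max, 30x 4K:      {m1_max_4k / 1e9:.1f} gigapixels/s")       # 7.5
print(f"M1 Max, 7x 8K:       {m1_max_8k / 1e9:.1f} gigapixels/s")       # 7.0
```

So under those assumptions, a laptop chip is pushing roughly two and a half times the pixel throughput of a dedicated accelerator card from a two-year-old desktop workstation.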

That’s thanks to M1 Max having not just one ProRes encode block and one ProRes decode block like M1 Pro, but two of each. And two H.264 and H.265 decode blocks as well.

And sure, Apple could have kept doing ProRes on the CPU, like they did with Intel Macs in the past, but moving it to dedicated silicon meant they could do it faster, with less power draw, and in a way that left the CPU free for other tasks.

And that’s really important. Because, prior to M1 Pro and M1 Max, when you went to render ProRes, it could thrash the CPUs, meaning anything else you tried to do at the same time was maple syrup on snow slow — I’m Canadian, you know what I mean — and made the render slower as well. Like almost untenable. Now though, you can hit render, and only the ProRes engines get thrashed. You can keep working away on the CPU as if nothing else is happening. Almost like you have a second Mac ready and waiting for you while the first one’s off exporting your video.

And to see why they’re willing to spend their transistor budget like this, I have a whole entire video up for you with Apple’s VP of Silicon and VP of Mac Product Marketing where they explain just exactly why. Link in the description below the like button.

But it’s the third way M1 Pro and M1 Max just devastate on speed. Yes, it’s the pure performance of the cores, and yes it’s the utter responsiveness of all that unified memory and bandwidth, but it’s also those off-core features that are essentially giving us multiple parallel pro workflow engines in one.