M1 Pro vs. Max — What Apple Didn’t Tell You!

Apple's brand-new M1 Pro and M1 Max MacBook Pros: you had questions, you wanted answers, you wanted the truth. I can give you all the truth you can handle. Let's do this. As always, supporters get Q&A priority.

So, William Edstrom: "Is it possible that Apple could enable Face ID on these laptops, or is the necessary hardware not included?" And yeah, the necessary hardware is not included. Face ID is more than just the RGB camera, which is what's on these new MacBook Pros. It's the infrared flood illuminator and dot projector that let it actually scan the geometry of your face. And I think the reason we don't have Face ID on the MacBooks, on any Mac yet, is that it's a fundamentally different concept that Apple still has to figure out. To just authenticate someone, to look at their facial geometry and say, "Yes, it's you. Unlock." That's fine. That's transparent. You sit down in front of it, you open it, and it just works. But authorizing purchases is trickier, because you don't want a purchase to go through any time you happen to be looking at the device. That's where a lot of fraud could take place, and Apple has to be very careful to avoid it. So on the iPhones and iPad Pros that have Face ID, you have to double-click the button to add a direct user interaction, to show intent, when you wanna authorize those purchases. And it's unclear how that would work on a MacBook. Do you reach over and double-click the power button? That's where the Touch ID sensor already is, so you're basically being forced into a Touch ID-style action anyway. It doesn't save you anything. It doesn't increase the utility at all. And I think that's why Apple is sticking with Touch ID.
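To make that authenticate-versus-authorize distinction concrete, here's a toy Python sketch of an intent-gated flow. Everything in it, the class, the method names, the double-click gesture, is invented for illustration; it is not Apple's implementation, just the general idea of requiring a deliberate action on top of a passive biometric match.

```python
# Toy sketch (not Apple's code) of passive authentication vs.
# intent-gated authorization. All names here are invented.

class BiometricSession:
    def __init__(self):
        self.face_matched = False   # passive: camera recognizes the owner
        self.intent_shown = False   # active: e.g. a double-click gesture

    def scan_face(self, is_owner: bool):
        self.face_matched = is_owner

    def press_confirm_button_twice(self):
        self.intent_shown = True

    def unlock(self) -> bool:
        # Unlocking only needs the passive match; glancing at the
        # device is enough, and the fraud risk is low.
        return self.face_matched

    def authorize_purchase(self) -> bool:
        # Purchases need the match AND a deliberate gesture, so merely
        # looking at the screen can never spend your money.
        return self.face_matched and self.intent_shown

session = BiometricSession()
session.scan_face(is_owner=True)
print(session.unlock())             # True: passive match is enough
print(session.authorize_purchase()) # False: no intent shown yet
session.press_confirm_button_twice()
print(session.authorize_purchase()) # True: match plus explicit intent
```

The open question on a MacBook is what that second, intent-showing gesture would even be, which is exactly the problem described above.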

Robert Rehders: "Curious if there is any insight available regarding thermal throttling on the 14-inch versions." The 16-inch MacBook Pro is just bigger than the 14-inch MacBook Pro, so theoretically, well, not theoretically, practically, it has a bigger thermal envelope. I think with the M1 Pro you'll never see a difference. You know, quote-unquote "never." If you're in Arizona in 110-degree heat, that's gonna change things. But with the M1 Max, when you're up to 32 GPU cores and you're hitting other parts of the silicon IP, I think for sustained loads you could potentially fill up the envelope of the 14-inch and still have extra capacity in the 16-inch, again, because physics. But I don't think most people in most workloads will ever come close. You'd have to do it deliberately in testing, or be an ultra-high-end 3D artist layering on every effect imaginable.
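The thermal-envelope argument is easy to model. This Python sketch uses entirely made-up wattages (Apple doesn't publish power figures for these chips or chassis) just to show the shape of the effect: a bigger chassis changes sustained performance under full load, not peak performance.

```python
# Hypothetical numbers throughout; Apple publishes no such figures.

def sustained_fraction(chip_watts, chassis_watts):
    """Fraction of peak performance a chassis can sustain when the
    chip's full-load power exceeds what the cooling can dissipate."""
    return min(1.0, chassis_watts / chip_watts)

M1_PRO_FULL_LOAD = 35   # W, assumed
M1_MAX_FULL_LOAD = 60   # W, assumed (all 32 GPU cores lit up)
ENVELOPE_14 = 50        # W, assumed 14-inch cooling capacity
ENVELOPE_16 = 70        # W, assumed 16-inch cooling capacity

for chip, watts in [("M1 Pro", M1_PRO_FULL_LOAD), ("M1 Max", M1_MAX_FULL_LOAD)]:
    for chassis, cap in [("14-inch", ENVELOPE_14), ("16-inch", ENVELOPE_16)]:
        print(f"{chip} in {chassis}: {sustained_fraction(watts, cap):.0%} of peak")
```

With these assumed numbers, the M1 Pro never fills either chassis, while a fully loaded M1 Max sustains only part of its peak in the smaller envelope, which is the "because physics" point above.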

Jeffrey David Marraccini, and I hope I'm pronouncing that right: "Curious if this is 2x Retina or not?" And it is. It'll default to 2x Retina, and you should be able to pick lower resolutions for increased legibility and higher resolutions for increased information density, just like you can on current MacBook Pros. It's just that the default will finally be pure 2x Retina.

PK: "Do we know how scalable the silicon architecture is? How big can the SoC go with GPU and compute cores for desktop models?" We've seen rumors from Mark Gurman on this, where they're gonna get hella big. For the Mac Pro, they could be up to 40 CPU cores and 128 GPU cores. So it's full-on scaling. There doesn't look to be a limit within a reasonable range for the kind of products Apple wants to make; up to and including a Mac Pro, it seems able to scale just fine.

Craig Doran: "Why the reduction in efficiency cores by half? Can they still be referred to as Icestorm? Are the remaining two e-cores changed to be more powerful or efficient?" And yeah, I think it's a difference in focus between the M1 and the M1 Pro and M1 Max. They are still Icestorm cores. M1 Pro and M1 Max are the same silicon generation as M1, which is A14-generation silicon, and that means Icestorm efficiency cores and Firestorm performance cores. There's just a different arrangement: instead of four and four, we now have two and eight. Because while the battery life is really good on these computers, that's not the focus. That was the focus on the MacBook Air and the two-port MacBook Pro. Here, the focus is on performance, so they're giving us as much performance as they possibly can, and that means double the number of performance cores. And there are some things, like low-power mode, that will help claw back some battery if you really need to push it. But when you start cranking up the workload and firing up all those cores, especially all those graphics cores, it's gonna be all about performance, not battery life anymore. And that's the substantive change here.

Arthur Kahwa: "How loud is the 16-inch when the fans kick in? Do we have a comparison with the latest Intel models?" There haven't been any reviews released yet, so we don't really know. My guess is that they're gonna be really quiet. With the M1 two-port MacBook Pro, a lot of people didn't even realize there was a fan in it, because it was so hard to get that fan to kick in. I think in normal modes it'll still be hard to get the fan to kick in, and hard to hear it when it does. There is also a high-power mode in these new MacBook Pros that you can enable manually, and it will force the fan to ramp up. That's something people have been asking for for a while: "Apple, you throttle things so aggressively. Just give me the ability to crank the fan. I don't care about the noise, I care about the performance. Give me that option. Just let me do it." And they are. They're putting us in command of our own destiny there.

Steven Kaplan: "I would love to see some information on how much time you save in your workflow from these new laptop models." One of the biggest advantages of M1 over Intel, to me, with the MacBook Air and the two-port MacBook Pro, was that everything was just so instant. iPad-level instant. And you forget. You think about things like render time, and a render finishing five minutes faster, and whether five minutes is really worth it. Well, if you're rendering a bunch of videos, yeah, five minutes adds up quickly. But I'm basically living a life without beach balls on M1. And when you count up all the beach balls, all the little delays, all the seconds stacking on top of each other every minute, every hour, every day, that's not just a phenomenal increase in quality of life, it gives you back a significant part of your life.

Next up, YouTube.
And if you didn't manage to get your question in, make sure you hit the bell and the subscribe button, because I typically hang out in the comments for the first hour or so after any video goes live so I can answer anything extra that you wanna throw at me. Throw everything at me.
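To make Steven's "seconds add up" point concrete, here's some back-of-the-envelope Python. Every input is an assumption, not a measurement; the point is only how quickly small per-hour stalls compound over a working year.

```python
# Back-of-the-envelope version of the "seconds add up" argument.
# All inputs are assumptions, not measurements.

beachball_seconds_per_hour = 90   # assumed stalls avoided per working hour
hours_per_day = 8
work_days_per_year = 250

saved_per_day_min = beachball_seconds_per_hour * hours_per_day / 60
saved_per_year_hr = saved_per_day_min * work_days_per_year / 60

print(f"{saved_per_day_min:.0f} minutes/day, {saved_per_year_hr:.0f} hours/year")
# 12 minutes/day, 50 hours/year
```

Even at a modest minute and a half of stalls per hour, that's more than a full work week returned per year, which is the quality-of-life argument in numbers.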

Philip Dunphy: "Some context on which workflows would match well with each tier of chips, because I don't wanna overpay for power that I don't need." If you need higher-end gear, you typically know it, because you're hitting up against some sort of constraint, some roadblock in your current workflow. So, for most people, I imagine the M1 Pro will be plenty. But if you're working with any of those high-end apps that are GPU-bound, where the more cores you throw at them, the better, faster, and more they can do, that's when literally double the GPU cores makes sense. If you've never had a problem in that area, you certainly won't have a problem in that area now. But if you've always been like, "More!", rock-style, "we've gotta give the people more," then the M1 Max is for you. Likewise, if you do a lot of ProRes editing, or a lot of video editing in general, you're getting twice the media engine blocks: the encode block for standard H.264 and H.265, and both encode and decode for ProRes. So if your workloads are bound on video compression, you're gonna see a doubling of that capacity on the Max, and you're probably gonna wanna consider the Max as well.

UltimatePokeGamer7: "Based on these chips, what are your predictions for M2?" If we go back to M1 being based on A14-generation silicon, I'd expect M2 to be based on A15-generation silicon, the chipset Apple just shipped in the iPhone 13. And that means, instead of Icestorm efficiency cores and Firestorm performance cores, we'd get Blizzard efficiency cores and Avalanche performance cores, and instead of the G13 GPU, the G14 GPU. In the performance cores, it's not a huge difference; they ticked the frequencies a little higher. But the efficiency cores got close to 30% better performance, plus double the cache, and the GPU performance is just way better, although throttled in the iPhone because it's such a small thermal envelope, and probably not throttled anywhere near as much in something the size of a MacBook or MacBook Pro. So just better again over the M1, in ways that continue to push the envelope, not just in overall performance but in efficiency, because it's just a way, way cooler chip overall.

Omer Horovitz: "Does it have HDMI 2.0 or 2.1?" It has HDMI 2.0, like the M1 Mac Mini, not HDMI 2.1, like the current Apple TV 4K (the A12-based one, I believe). And I think that's just because Apple isn't delivering anything yet that requires HDMI 2.1. Don't get me wrong, I would take it. I love it. I want all the ports at all their capabilities, even if they're not being used for anything, just because I'm a nerd and nerds can dream. But the main differences with HDMI 2.1 are that it can handle up to 8K at up to 120Hz, and it adds things like variable refresh rate. I don't think Apple is outputting that from these Macs now, or from these models ever. So HDMI 2.0 probably fit their overall bill of materials and the capabilities they wanted to deliver, and they just saw no need for HDMI 2.1 yet. That'll come in a future Mac designed for an 8K, 120Hz world.
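For a sense of why HDMI 2.0 was probably deemed enough, here's rough bandwidth arithmetic in Python. The link rates are real (18 Gbit/s for HDMI 2.0, 48 Gbit/s for HDMI 2.1); the per-mode figures ignore blanking intervals and line-encoding overhead, so real links need somewhat more than these raw numbers.

```python
# Uncompressed video bandwidth, ignoring blanking and encoding
# overhead (real requirements are a bit higher).

def gbit_per_s(width, height, hz, bits_per_pixel):
    return width * height * hz * bits_per_pixel / 1e9

modes = {
    "4K60, 8-bit":   gbit_per_s(3840, 2160, 60, 24),
    "4K120, 10-bit": gbit_per_s(3840, 2160, 120, 30),
    "8K120, 10-bit": gbit_per_s(7680, 4320, 120, 30),
}
HDMI_2_0 = 18.0   # Gbit/s link rate
HDMI_2_1 = 48.0   # Gbit/s link rate

for name, need in modes.items():
    if need <= HDMI_2_0:
        verdict = "fits HDMI 2.0"
    elif need <= HDMI_2_1:
        verdict = "needs HDMI 2.1"
    else:
        verdict = "needs HDMI 2.1 plus DSC compression"
    print(f"{name}: {need:6.1f} Gbit/s -> {verdict}")
```

A standard 4K60 signal sits comfortably inside HDMI 2.0, while 8K at 120Hz blows past even HDMI 2.1's raw link rate and requires Display Stream Compression, which is why "an 8K, 120Hz world" really is a different class of Mac.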

Chidorirasenganz: "Why didn't they go for an even bezel like the iPad, instead of the notch?" I mean, I can't unsee it. I just wish it wasn't there. It's like a needle in my eye. But, honestly, 99% of the time I don't even notice it anymore. The difference between the iPad, or even the iPhone, and the Mac is that you have to be able to hold the iPad in your hands in every orientation, landscape and portrait, and that means you need a bezel on every edge. And if you have that bezel, you don't need a notch; you can't have one, there's no utility to it. On the iPhone, by contrast, the notch gives you extra screen. If there wasn't a notch, there wouldn't just be no notch; there would be a full forehead, and you'd lose those extra pixels where right now they're shoving things like the battery indicator. And it's a similar approach with the MacBook. Because you're not holding a MacBook in your hands, you don't need to hold onto the screen, so they can make the bezels really tiny and remove as much of them as possible. Now, they do need to leave some, because they've put a much bigger camera in there now, a 1080p camera. So they're doing exactly what they do with the iPhone: not adding a notch, but deleting the forehead as much as possible, then shoving the menu bar up there so that everything below the notch is completely usable screen. The alternative would be a full forehead with the menu bar in it, so you'd lose just that little bit more screen. And even though I don't find it attractive, understatement, at all, I will always take as much screen as you can give me.

Christmas Clatter: "Will there ever be a Mac Mini version?" I think there'll be an M1 Pro and an M1 Max Mac Mini. But I think Apple was really focused on getting this machine out, just like they were with the M1 iMac last spring. They have limited bandwidth in the Mac department, and they wanna really get these right, so they focus on one machine at a time. So I figure next we'll get the Mac Mini, and maybe the M1 Pro, M1 Max, M1 Extreme, whatever they call the even higher-end chipset, either together or staged out over next year, leading up to and including the Mac Pro by the end of next year, per the two-year transition Tim Cook promised us.

Eddiemcohen: "Does more RAM really make a difference with unified memory?" And yes, absolutely. I think there are still a lot of misconceptions about what unified memory really means. What it is is one big pool of memory that's accessible to every part of the SoC, the system on a chip: the CPU, the GPU, the Neural Engine, the image signal processor or ISP. It means they don't have to copy data back and forth, so it's way more efficient. On a classical computer, by contrast, you have the CPU fed by its own RAM, and then a GPU with its own VRAM, and especially in a laptop, I don't think the VRAM available to a GPU has ever exceeded 16 gigabytes. With these machines, you have 16 gigabytes that can go to the CPU or the GPU. I mean, not fully, there's some overhead and some sharing involved, but you have a larger pool. And when you get to 32 or 64 gigabytes, being able to feed that much to a GPU is just unheard of in a laptop. That said, unified memory isn't magic. Eight gigabytes of unified memory is not equal to 16 gigabytes of traditional memory. It just means that, thanks to things like memory compression and the ultra-fast SSD for swap, you get the most out of every gigabyte. But if you have memory pressure, if you're running a lot of really memory-hungry, memory-intensive applications, the only way to solve that is more memory. Unified memory doesn't help you there; you need more memory. And if you're doing ultra-high-end work, stuff you would previously do on a mid-range Mac Pro, you're still gonna want as much RAM as you can possibly get, to do things you could never do before on a MacBook Pro, a mobile computer.
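Here's a toy Python model of the split-versus-unified point. The sizes are illustrative only, and real systems reserve and share memory in far more complicated ways; this just shows how one big pool lets a GPU working set grow past any fixed VRAM cap, while total demand still has to fit.

```python
# Toy model of split (CPU RAM + GPU VRAM) vs. unified memory.
# All sizes in GB; numbers are illustrative, not measured.

def fits_split(cpu_need, gpu_need, ram=16, vram=8):
    """Classic layout: the GPU is capped by its own dedicated VRAM."""
    return cpu_need <= ram and gpu_need <= vram

def fits_unified(cpu_need, gpu_need, pool=32, overhead=2):
    """Unified layout: both draw from one pool, minus system overhead."""
    return cpu_need + gpu_need <= pool - overhead

# A 12 GB GPU working set overflows an 8 GB discrete VRAM...
print(fits_split(cpu_need=10, gpu_need=12))    # False
# ...but fits comfortably in a 32 GB unified pool.
print(fits_unified(cpu_need=10, gpu_need=12))  # True
# Unified memory is not magic, though: too much total demand
# still fails, and the only fix is more memory.
print(fits_unified(cpu_need=20, gpu_need=14))  # False
```

The last line is the "memory pressure" point above: sharing one pool changes how flexibly memory is divided, not how much of it exists.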

Okay. Twitter time.

Stephane Ceretti, huge fan of your work: "Which one did you order?" I went for the MaxBook: the just maxed-out 16-inch, everything I could put in it, because I'm in the position of time being way more important to me than the cost of this machine. I mean, I don't spend money on anything else. Literally nothing else. I spend it on gear, because that's how I earn my living. And being able to do these videos faster, at better quality, with less frustration really does pay for itself.

Dr. Ian Cutress, also a big fan of your work at AnandTech: "Why are there 80 SKUs? 160 if you account for two colors. Isn't that a logistical nightmare?" My guess is they've gotten so good at managing SKU complexity on the iPhone, all the colors, all the models, all the configurations, that on the relatively much smaller MacBook Pro line they can just handle it. It's a breeze compared to iPhone complexity.

Alex Miller: "Truly curious what the M1 Max means for potential AAA gaming." iOS is such a big platform that it enjoys a huge amount of support from developers, and that's just never been true of the Mac. The ability to run iOS games means you'll be able to run as much as developers allow; they still get to choose whether it's supported. For any developer that does support running their iOS games on the Mac, this hardware should be terrific. But the studios still have to decide to support and launch those games on the Mac, and that's more of a developer-relations problem than a hardware problem, and it has always been thus.

Lew Piper: "Why no Center Stage? Is the sensor on the camera not big enough?" What Center Stage is, on the latest generation of iPads, is basically an ultra-wide camera that Apple crops down to a regular wide field of view, then pans, scans, and zooms so that if you're talking it stays framed on you, but if you move around or somebody else enters the frame, it expands to capture everybody, follows them, and zooms back in on you if they leave. And this just does not seem like that kind of camera. It's not one of those ultra-wides. Apple really focused, I think, on the single-person perspective: just getting a really, really good webcam in here for all the Zoom calls and FaceTime calls we're making. Maybe in the future we'll see an ultra-wide, but that seems to be a deliberate product differentiation for them at this point.
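The crop-pan-zoom idea behind Center Stage can be sketched in a few lines of Python. This is a guess at the general technique, not Apple's implementation, and the sensor and output sizes are invented: a smaller output window re-centers on the subject within a wide sensor frame, clamped so it never pans off the sensor.

```python
# Sketch of the crop-and-pan idea: a wide sensor frame and a smaller
# output window that follows the subject. All sizes are invented.

def crop_window(subject_x, subject_y,
                sensor_w=4000, sensor_h=3000,
                out_w=1920, out_h=1080):
    """Return (left, top) of an out_w x out_h crop centered on the
    subject, clamped so the crop stays inside the sensor."""
    left = min(max(subject_x - out_w // 2, 0), sensor_w - out_w)
    top = min(max(subject_y - out_h // 2, 0), sensor_h - out_h)
    return left, top

# Subject in the middle of the sensor: the crop is centered on them.
print(crop_window(2000, 1500))  # (1040, 960)
# Subject near the left edge: the crop clamps at the sensor boundary
# instead of panning past the edge of the sensor.
print(crop_window(100, 1500))   # (0, 960)
```

The reason this needs an ultra-wide sensor is visible in the numbers: the pan only works because the sensor frame is much larger than the output crop, and the new MacBook Pro camera doesn't appear to have that spare field of view.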

Nitesh: "One billion colors equals 10-bit panel?" And yeah, it's still an eight-bit panel on the MacBook Pro in terms of hardware, and Apple is still doing things like temporal and spatial dithering to simulate more colors. But they've gotten so, so good at it that I think, for most people, it'll be indistinguishable from what you would see on a 10-bit panel. And through, I don't wanna call it trickery because it's not trickery, but through display technology, they're probably exceeding what you would get from a raw, unoptimized 10-bit panel anyway. It's less about the hardware now and more about the sum total, the complete package. I would love to see an actual 10-bit panel, and I would love to be able to compare this kind of eight-bit panel that simulates one billion colors against what an actual 10-bit panel provides. But based on what we're seeing on the iPad Pro now, it looks amazing.
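Temporal dithering, one of the techniques described above, is simple to sketch: flash adjacent 8-bit levels over successive frames so the time-average lands on an in-between, 10-bit-style level. This is a generic error-diffusion-over-time toy in Python, not Apple's actual display pipeline.

```python
# Temporal dithering sketch: an 8-bit panel can only show integer
# levels, but alternating between adjacent levels over several frames
# averages out to a fractional (10-bit-like) value.

def dither_frames(target: float, n_frames: int):
    """Emit n_frames integer panel levels whose running average tracks
    a fractional target level, by carrying rounding error forward."""
    frames, err = [], 0.0
    for _ in range(n_frames):
        want = target + err
        shown = round(want)
        err = want - shown   # diffuse the rounding error into time
        frames.append(shown)
    return frames

# A "10-bit" level of 128.25 rendered on an 8-bit panel:
frames = dither_frames(128.25, 8)
print(frames)                        # mostly 128, occasionally 129
print(sum(frames) / len(frames))     # 128.25
```

The eye integrates the flicker, so the perceived level is the average, which is how an eight-bit panel ends up advertising a billion colors.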