Apple Watch Hermès Series 7

Unboxing the Apple Watch Series 7 Hermès in stainless steel with the black Deployment band, new Hermès Watch Faces, watchOS 8.1 fall detection for running and cycling, and some thoughts on Series 7 one month later...!


Beats Fit Pro — Best AirPods Yet?

Fit because they’re designed to fit better in your ears, so they stay in while you’re working out… to stay fit… which all just… fits. Huh. Clever, Beats.

I all caps love what Apple’s been doing with Beats lately. Keep the AirPods minimal and iconic, while letting the Beats… pods get all shades of colorful and creative.

Ok, so, imagine AirPods Pro, but instead of just white they come in black as well, and also sage gray and stone purple. Yeah, I’d love to see blue, orange, and red too, but maybe they had to split colors with the HomePod mini this year, I dunno?

Then, instead of a stiff stem that shotguns down, you have a flexible hook-like wingtip that curves up to better secure the bud.

Well, you don’t have to imagine it, because that’s pretty much exactly what the mad geniuses in Beats’ labs have gone and done.

I mean, same Apple H1 headphone chip as AirPods Pro. Same active noise canceling and transparency mode, so you can block out the world or make it clearer than ever. Same spatial audio with dynamic head tracking, so you get a 3D sound stage that sounds like it’s being staged from your iPhone, iPad, or Apple TV as if they were the stationary source. Also Adaptive EQ, so they monitor the audio inside your ear and constantly adjust the sound so it sounds the way it’s supposed to, regardless of the shape of your ear or how the buds are moving around inside it.

I went over how a ton of this tech works in my Beats Studio Buds, AirPods 3, and AirPods Max reviews, so I’ll link those in the description below the like button.

Now, these do have silicone tips, so they are fully in-ear. Which I know some people find uncomfortable or painful, but that’s why Apple makes AirPods 3.

For me, personally, this style stays in my Brazilian Jiu-Jitsu mangled ears just way, way better. And while the Beats Studio Buds already stayed in better than AirPods Pro for me, thanks to the wingtip, these Fit Pros stay in even better, better.

I think AirPods Pro probably still have a slight edge when it comes to overall sound and ANC quality, but at $200, which is $50 less than AirPods Pro, the Beats Fit Pro have the edge on affordability. There’s also the Beats app for Android, which gives you pretty much everything but spatial audio with dynamic head tracking and Find My.

All that with 6 hours of battery life in the buds and 21 extra in the case. That’s with ANC on. Turn it off and you can get up to 7 hours per bud and 23 more from the case. And Fast Fuel, aka fast charging, still gets you 1 hour in 5 minutes.

Last time, I said the Beats Studio Buds fit well enough that I was tempted to switch away from AirPods Pro. This time, the Beats Fit Pro fit even better. So, I’m going to keep wearing them for now, and I’ll let you know if, not when, I’m tempted to switch back to the AirPods Pro.


HomePod mini — Unboxing Apple’s New Orange!

Yeah, now they come in colors.

The hardware hasn’t changed — the HomePod mini is still a 3.3-inch, softball- or… grapefruit-sized, mesh-wrapped orb that sits on your shelf, table, or desk and gives you a high-quality 3D audio soundstage, just not one as booming as the original HomePod biggie.

I still love the way it looks, classic white or black, and the new yellow, orange, and blue. My only regret is that Apple didn’t match all the new iMacs and go all in on green, purple, and red as well. But who knows, if these sell, maybe we’ll get more next spring or fall.

I still love the sound as well. Again, not as booming as the HomePod biggie, but those are almost impossible to find now, and if you want more than corner-filling sound, you can add more minis to fill out more corners for less than the cost of the OG.

For more on the hardware and base functionality, I’ll drop a link to my previous review in the description, right below the like button.

If you want to use Apple Music, but you don’t want to pay a lot for it, there’s a new $5 voice plan. Basically you can tell Siri to spin up songs and playlists, you just don’t get the full Music app interface experience to go with it.

That’s mostly how I use Apple Music now, but only mostly, and the value of the family plan is still terrific, so while I’m glad there are more options, I’m not downgrading any time soon.

And if Apple Music just isn’t your bag, baby, there’s an API that lets any and every other service tie in now instead. Just say which one you want to use and Siri will learn and start defaulting to your preference.

And, yeah, Siri is still… Siri. It’s been steadily improving ever since John Giannandrea took over AI at Apple, and it can answer more questions than ever, but that’s never really been the issue with Siri — it’s always been about consistency, or the lack thereof.

Before, it felt like 1 out of every 10 queries was being sent to some old, un-updated magic 8-ball server in a Results Way closet. No… 1 out of every 20?
But at the same time, functionality has disappeared and come back over the last year. “Play Infinity War on Apple TV” going from “Ok” to “Sorry, I can’t do that on HomePod” to “Ok, now playing” and back again just can’t ever happen. Not ever.

So Apple has to both enforce zero regressions at the engineering level and keep up the pressure on the intelligence level until errors are closer to 1 out of every 1,000, 10,000, or more. Because reliability is the cornerstone of any assistant, digital or otherwise.

Otherwise, I just love the whole idea of ambient computing. Voice isn’t fast: you can’t just tap, you have to talk. And you can’t just visually parse large quantities of text on a screen; you have to listen. That verbosity takes time and prevents skimming. But it’s just so convenient.

Often, when I’m working, I get Siri to send and read messages, open and close curtains and lights, spell words for me, even fact check, all while my hands and attention are still focused on my editing or writing.

It’s still not JARVIS or FRIDAY, or even the SiriOS of my dreams, but each year, every year, it’s getting just a little bit less HammerTech. So all the AI fingers crossed.

One of my favorite things lately is how much better Siri on HomePod has gotten at sending visual or mobile information to my iPhone, like longer search results and maps. I can ask while I’m getting ready, then take it with me as I walk out the door.


Why Your iPhone 13 is Recycled* (Ft. MrBeast)

Go to and for every $1 you donate, you'll remove 1 lb of trash from our oceans and rivers!

Thanks to for donating the sponsor time on this video to #TeamSeas

The antenna bands are made from upcycled plastic water bottles, and that's not all!

To do your part, please visit now!

Oh, so many things, Jimmy!

Jimmy: educate us!

Thanks to Curiosity Stream for donating the sponsor time for this video, so we can talk about Apple, the environment, and #TeamSeas

Ok, so, here’s the deal. Our oceans, rivers, and beaches are filled with plastic trash. Pollution from plastic has created a massive health crisis for the whole entire planet.

And there’s no conspiracy theory here. No one is lying about our water supplies looking like Jake Peralta’s locker… just so Apple can switch from plastic to paper wrappers and ruin our ASMR unboxing fun, or so Tim Cook can come to your house and rip the AC adapter from your cold, dead iPhone. If anything, the conspiracy is invested in keeping us in denial, so there’ll be less competition for sparse resources in a future Kingsman movie, or whatever.

Point is, one person, one movement, one idea can make a difference. And I’ll get to just exactly how in a MrBeast minute. But also: it can’t be just us, just individuals. It can’t be companies abdicating their responsibility by telling us to recycle without any plan behind it, or governments that want to regulate our USB plugs because it’s easier and more politically expedient than tackling the mega polluters in their own backyards and off their coasts.

No, it has to be the mega corporations as well, including and especially the biggest, like Apple.

Who just this week, as part of doubling down on their pledge to become carbon neutral by 2030, announced 10 new renewable projects around the world through their Power for Impact program, including partnering to protect and restore the 27,000-acre mangrove forests in Córdoba, Colombia.

When it comes to reducing plastics specifically, Apple is working on incorporating more recycled materials. Like the HomePod mini mesh, which is 90% recycled plastic. The Apple Watch Braided Solo Loop, which is 100% recycled polyester. And across the company, more than 70 parts now average 50% recycled plastics as of 2020, and 17 additional parts now use plastics made from bio-based content instead of fossil fuels.

They’ve also begun switching to paper wrapping for products, including the iPhone, avoiding 600 metric tons of plastics, and retail bags are now 80% recycled fiber. Overall, Apple has already reduced plastic in packaging by almost half in three years and they have a goal of removing it completely by 2025.

Other efforts include recycling aluminum and rare earth elements like magnets, and doing trade-ins and recycling programs for whole entire products. It’s all designed to minimize the impact e-waste has on our planet, our oceans, and, more importantly, our lives.

Is any of it enough? No, of course not. It will literally never be enough. Not from Apple, not from anyone. Not until our world and oceans are clean enough to ensure our children and descendants can not only survive but thrive for ages to come. To get to become Star Trek one day.

And that’s exactly where you and me and all of us can help: Thanks to MrBeast, Jimmy Donaldson, and Mark Rober, former NASA and Apple engineer, we’ve got thousands of creators coming together — basically the biggest and best team-up in the history of YouTube — to raise $30 million to clean 30 million pounds of plastic and trash out of the ocean.

Here’s how it works:

Now, if you’re at all worried about any of the methods or the organizations: I’ve personally been involved in a couple of weeks of discussions over them, I’ve seen the best objections and concerns from the brightest people, and all of them have been addressed by the organizers with consistent thoughtfulness and all the weight a campaign of this size can bring to bear to supervise, audit, and ensure every single penny is spent in the absolute best way possible.

And so, yeah, basically, every $1 donated results in 1 pound of trash being removed from our rivers and oceans. If that sounds like a lot to you, awesome, please donate a lot, as much as you can, to help us reach our goal. If it sounds like… a drop in the ocean… then yeah, it’s a big problem, huge, but it’s not only a start, it can be a lot more — a way to inform and inspire everyone to do just way, way more.

So, please, go to where you can contribute and see just how much progress we can make when we all work together. That’s, thanks again to CuriosityStream, and thanks to all of you for your support.


iPhone 13 ProRes — New Video God Mode!

Rene Ritchie (RR): It's here. It's time. Apple ProRes is hitting the iPhone 13 Pro and Pro Max right now. 422 HQ, to be exact. Six gigabytes per minute, to be even more exact. But what does all of that even really mean? And how would you use it?
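To put that figure in perspective, here's a back-of-envelope sketch in Python, assuming the rough 6 GB/minute number above (actual bitrate varies with resolution, frame rate, and scene content):

```python
# Rough storage math for ProRes 422 HQ on iPhone, assuming the
# approximate 6 GB/minute figure mentioned above. Real bitrates vary
# with resolution, frame rate, and scene complexity.
GB_PER_MINUTE = 6

def storage_needed_gb(minutes):
    """Gigabytes consumed by `minutes` of ProRes 422 HQ footage."""
    return minutes * GB_PER_MINUTE

def minutes_of_recording(capacity_gb, free_fraction=1.0):
    """Minutes of footage that fit in the free portion of a given capacity."""
    return capacity_gb * free_fraction / GB_PER_MINUTE

for capacity in (128, 256, 512, 1024):
    print(f"{capacity:>4} GB model ≈ {minutes_of_recording(capacity):.0f} min of 422 HQ")
```

Even a maxed-out phone fills in under three hours of continuous shooting, which is presumably part of why Apple restricts the highest-resolution ProRes capture to the larger storage tiers.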

Let's do this Alex Lindsay style, with the man who knows as much about ProRes as anybody outside Apple, and who you may know from ILM, Industrial Light & Magic, and Queen Amidala's ship, or from streaming pretty much every major event in big tech. We've seen ProRes in a bunch of products. We've seen it in Blackmagic cameras, and the Atomos Ninja and Samurai monitor-recorders capture in ProRes. What does it mean to have the ability to record native ProRes on an iPhone?

Alex Lindsay (AL): It's unbelievable. I mean, the thing is that the iPhone can capture a lot of footage that is actually fairly usable. One of the big problems we had was storage. We weren't able to capture at a data rate that made sense for production. You constantly had this problem of not really being able to get the full quality the iPhone was capable of, because it just couldn't save to something like Apple ProRes.

RR: Yeah, that's the thing that really blew my mind here. Two years ago, at WWDC, they announced the new Mac Pro, and it had a ProRes accelerator, the Afterburner card, in it. And now, just two years later, we have these silicon blocks on the A15 chip doing ProRes decode, encode, hardware acceleration, plus a whole new storage controller and storage system that can record fast enough to keep up. It feels like, in two years, Apple has taken a lot of a low-end Mac Pro and shoved it into my phone.

AL: Oh, it's incredible. You know, the thing is that the phone now is something you could shoot real footage with. Now, does it replace shooting with a Blackmagic camera or an ARRI camera? No. There's a bunch of other things they do that are great, and we hear a lot of filmmakers, or people trying to market it, overselling that idea. Being able to have interchangeable lenses, all the creature comforts, and a larger chip, which is gonna have better low-light performance and a lot of other things. These larger cameras are larger for a reason. That said, insert shots, documentaries where it might be hard to bring one of those cameras into a country or a location, training, corporate, and some films: all of these could be using footage shot with the iPhone, because the camera itself is unbelievable.

RR: Yeah, and we've even heard that sometimes in big Hollywood productions, they take shots with an iPhone just as a way of speccing things out, and then sometimes end up actually using them, because the iPhone's really good at high dynamic range, especially when you're transitioning from low light to bright light. Normal cameras don't adjust that fast, and Apple's got it to the point where the iPhone really can handle a lot of the stuff you throw at it.

AL: It really can. And we've definitely seen that. One of the frustrations was, well, we didn't get it in Apple ProRes, you know? You had this horrible little H.264, or H.265, where you couldn't do the final color adjustments you wanted to do with the rest of the film, because it was in this cheap format. So the fact that you can do your tests with a phone and know that, push comes to shove, you at least have more data than you would have had before.

RR: ProRes is Apple's high-fidelity codec, but what does that really mean in terms of what you're getting over H.264 or H.265?

AL: Yeah, well, one of the things is that when we're actually compressing footage, and all of it is compressed, so ProRes has some version of compression and H.265 has another, all of these things are taking the raw data and trying to make it smaller. And ProRes is just a lot gentler about it. With H.264 and H.265, there are two things we generally do to make them smaller. First, we do what we call spatial compression. We take every frame, or a certain number of frames, and we say, well, these colors are like each other. And this is very complex; that's the big difference between H.264 and H.265, the science behind that. And H.266 is already on its way, people are already talking about it, but these are refinements to how we take those pixels and combine them in a mathematical formula that lets us reproduce them at a smaller rate. But we still throw away data. You see that when things look posterized, or when they don't look as detailed as they could; the data had to go somewhere. And where you really see it is in complex images. Take video of a forest, take video of water. It's just too much data. Now, the other way we do compression is what we call groups of frames, or GOPs. H.264 and H.265 tend to be long-GOP solutions. We say, well, this isn't changing too much, so let's take this frame and call it a key frame, and then 60 frames later we'll take another one. We hold those whole frames, but in between we have what are called B frames and P frames, bi-directional frames and predictive frames, that look at the data on either side.
And they're guessing, just tracking the data that's changing. One of the things that's different about ProRes is that it carries the whole frame. Every frame is there; you don't have to jump from key frame, to key frame, to key frame. In addition, the data rate on most ProRes is obviously much, much higher than H.264 or H.265, so it can carry a lot more detail. It just has more data to work with. If you're working with 422 or HQ, there's just tons and tons of data there. And the place to test this is to take your phone and shoot a forest, once in H.265 and once in ProRes. You're going to see an enormous difference: what was kind of swirly little designs in H.265 will be leaves in Apple ProRes. And H.264 and H.265, by the way, are revolutionary. They have changed the way we look at video. As a delivery format, they're unbelievable. But as a production format, they're not so great.
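The long-GOP versus all-intra distinction described above can be sketched in a few lines of Python. The GOP layout here is a simplified, hypothetical pattern for illustration, not actual encoder behavior:

```python
# Illustrative sketch of the frame types described above: a long-GOP
# stream (H.264/H.265 style) carries occasional full key frames (I)
# with predictive (P) and bi-directional (B) frames between them, while
# ProRes is all-intra: every frame is a complete image.
def long_gop_pattern(num_frames, gop_size=12):
    """Toy long-GOP layout: an I frame starts each GOP, with B/P frames after."""
    frames = []
    for i in range(num_frames):
        if i % gop_size == 0:
            frames.append("I")          # full key frame
        elif (i % gop_size) % 3 == 0:
            frames.append("P")          # predicted from earlier frames
        else:
            frames.append("B")          # interpolated from frames on either side
    return frames

def all_intra_pattern(num_frames):
    """ProRes-style: every frame is self-contained."""
    return ["I"] * num_frames

print("".join(long_gop_pattern(24)))   # IBBPBBPBBPBBIBBPBBPBBPBB
print("".join(all_intra_pattern(24)))  # IIIIIIIIIIIIIIIIIIIIIIII
```

Seeking or cutting an all-intra stream can land on any frame; a long-GOP stream has to decode forward from the nearest I frame, which is part of why ProRes is friendlier in editing.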

RR: You just mentioned ProRes 422 HQ, which is what Apple's gonna be delivering in the camera app. We've also seen Filmic Pro use the APIs Apple's providing to do ProRes Proxy and LT. And on a computer you can use 4444, I forget how many fours are in there. What are the differences when you hear those numbers?

AL: Absolutely. So, Proxy is really designed as a preview format; it's not meant to be used for finishing. LT, which is kind of a lighter version, is just more compression, and that's really what we're talking about. All of these are 10-bit, for instance, instead of 8-bit, but they're more compressed as you go down. So 422 is kind of a good baseline. LT is a lighter version of that baseline. HQ has a higher bit rate. I would say HQ is the lowest Apple ProRes where you should do 10-bit, because 10-bit requires more data rate. Even though the other ones support it, the compression you're applying will oftentimes overshadow any of the extra bits you had in the color channels. So HQ is where you really start paying attention to 10-bit, and HQ and below are 4:2:2. What that means, in Y'UV terms, is you have four samples in the Y, and then two samples each in the U and the V. So essentially your color is half the resolution of your luminance, the black-and-white information in your image. Because we really cue on luminance for sharpness, we can throw away a lot of color data and it still works for us, since that black-and-white information is there. So 4:2:2 takes advantage of that, and that lowers the size of the files. 4444 is also 12-bit, so it has two more bits for the color. It also carries an alpha channel, which can control transparency. And that's really valuable if we want to key something over top of something else. You don't really use that when you're shooting footage.
It's when you're compositing computer-generated footage over top of something else. You can save it out as ProRes 4444, and now you can get really fine transparency, of smoke, of transparent images, and all of that's in the alpha channel. Then XQ is a higher version of that. And when it comes to pushing towards HDR, all of those bits make a difference.
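To make the subsampling concrete, here's a rough sketch of the uncompressed per-frame numbers behind that description. These are raw sample counts only; ProRes then compresses them, so real file sizes are much smaller:

```python
# Rough per-frame size comparison of chroma subsampling schemes,
# following the Y'UV description above: 4:2:2 keeps full-resolution
# luma (Y) but halves the horizontal resolution of the two chroma
# channels (U, V). Bit depths follow the discussion: 10-bit for
# 422/HQ, 12-bit (plus alpha) for 4444. Uncompressed sizes only.
def frame_bits(width, height, scheme="4:2:2", bit_depth=10, alpha=False):
    luma = width * height
    if scheme == "4:4:4":
        chroma = 2 * width * height          # U and V at full resolution
    elif scheme == "4:2:2":
        chroma = 2 * (width // 2) * height   # U and V at half horizontal res
    else:
        raise ValueError(scheme)
    alpha_samples = width * height if alpha else 0
    return (luma + chroma + alpha_samples) * bit_depth

uhd = (3840, 2160)
print(frame_bits(*uhd, "4:2:2", 10) / 8 / 1e6, "MB/frame, uncompressed 4:2:2 10-bit")
print(frame_bits(*uhd, "4:4:4", 12, alpha=True) / 8 / 1e6, "MB/frame, uncompressed 4444 12-bit + alpha")
```

The 4:2:2 frame carries two thirds as many samples as 4:4:4 before bit depth and alpha are even counted, which is the file-size saving Alex describes.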

RR: So where do you see this making the biggest impact? It's probably not what you're gonna shoot every birthday party in, but especially for pros who are using these as tools, where do you think it's gonna make the biggest impact?

AL: Documentary footage is gonna be something really interesting coming out of the iPhone. A lot of times we want to shoot great footage, but we have to get into places where we can't take a big camera, because it gets you into trouble, or it's hard to get there, or all these other bits and pieces. So for news gathering and documentary footage, it makes a big difference to have really high-quality footage. I also think that with training and a lot of corporate video, you could shoot and produce something very high quality with these phones. Most of the internal training we send to clients, on how to use a product or a kit we've sent them, I just do with my iPhone. I've been doing that already because it's so fast and compact and shoots a great image, but my biggest complaint was, it would have been nice to fix that up just a little bit, do a little more color correction, have a little better low light. All those things are now being addressed inside the current phone. So I think it has a lot of options there, and I'm gonna be shooting my kids' birthday parties with it. But here's the thing: when you're shooting those birthday parties, that graduation, you want to know the footage is something you'll be proud of way down the road. Twenty years from now, you don't want to look back and go, I really wish I'd shot that with the best footage I could. I had a phone, that's all I had, and all I needed. All I needed to do was have an extra drive, and I would've had all this great footage.

RR: Yeah, it's so good that the only thing, and I know it's not reasonable to want multiple Thunderbolt controllers in a phone, but it's gotten to the point where, if I'm shooting six gigabytes per minute of ProRes HQ into a phone, I just want Thunderbolt.

AL: You really want something faster. That is a big problem if you're actually using it in production. We were talking about the idea of using this phone, and we really were like, well, we might actually have to use a couple of them, because we just wouldn't be able to get the data off as fast as we were shooting it. So, yeah, it does add up pretty quickly, even with a one-terabyte storage capacity. It's still problematic. If I was gonna use these in production, I'd probably have two of them, so I could be pulling footage off one while I was shooting with the other.
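That offload bottleneck is easy to quantify. Here's a sketch assuming the rough 6 GB/minute figure from earlier and nominal link rates; real-world throughput is lower, and the interfaces listed are illustrative comparisons, not iPhone specs:

```python
# How shooting rate compares to offload rate, assuming ~6 GB/min of
# ProRes HQ and nominal (best-case) link speeds. Real-world throughput
# is lower; the interfaces are illustrative.
GB_PER_MINUTE = 6

def minutes_to_fill(capacity_gb):
    """Minutes of shooting before `capacity_gb` of storage is full."""
    return capacity_gb / GB_PER_MINUTE

def minutes_to_offload(footage_gb, link_gbps):
    """Minutes to copy `footage_gb` over a link rated at `link_gbps` gigabits/s."""
    gb_per_second = link_gbps / 8  # bits to bytes
    return footage_gb / gb_per_second / 60

print(f"1 TB fills in ~{minutes_to_fill(1000):.0f} minutes of shooting")
for name, gbps in [("USB 2.0", 0.48), ("USB 3.1 Gen 2", 10), ("Thunderbolt 3", 40)]:
    print(f"{name:>14}: ~{minutes_to_offload(1000, gbps):.0f} min to offload 1 TB")
```

At USB 2.0 speeds, offloading a full terabyte takes longer than shooting it did, which is exactly why shooting on one body while draining the other makes sense.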

RR: I think especially now that we're beginning this big transition to HDR, and I expect that to be our universe going further, and just having this kind of footage, I think it's gonna make a big step up for that too.

AL: Yeah, I think the phone's capability, both in the screen and in the camera, towards HDR is really starting to affect how people look at their own TVs, how they look at other content where standard dynamic range isn't enough. And the reason is that you're looking at it all the time. Even when you look at my phone and the footage I shot, it looks really bright and very vibrant, 'cause it's got Dolby Vision. And then you look at someone else's TV, and it's very drab. It doesn't look like what you want, and you're like, what's wrong? I have some stuff left over that's standard dynamic range, or SDR, and I keep looking at it like, what's wrong with this image? I try to fix it, trying to figure out what's going on, and then I realize, oh, it's just an SDR monitor, so it's okay.


macOS Monterey Review — The M1 Pro / Max Update!

Live Text, FaceTime for lockdowns, Quick Notes, Focus mode, Shortcuts, M1 Pro and Max acceleration, and yeah, Safari redemption. I'm Rene Ritchie. Thanks to Curiosity Stream with Nebula for sponsoring, and to John Gruber, of Daring Fireball, The Talk Show, and Dithering fame, for joining me for macOS Monterey, the review. Let's do this.

  • [Rene] So, do you wanna go through some of the big features sort of in the order Apple did, or do you wanna jump right into Safari?

  • [John] Were there any changes other than Safari? I don't remember, Rene.

  • Live Text.

  • [John] Oh!

  • Live Text for me on the iPhone has been transformational and I'm pretty sure you saw that example of the kid in school-

  • [John] Yep.

  • [Rene] Holding his phone up and just copying it. I don't use it to that extent on the Mac, but it has unlocked generations of text-layer PDFs for me.

  • I think Live Text is the best feature of the year. And if it was the only feature they had this year, if they were like, "You know what, this is gonna be a slow year, but we've got Live Text," we'd be like, "Yeah, good year." It's amazing. And the other thing about it is it works so perfectly every time for me. And I use it more often-

  • [Rene] Yes.

  • Than I expect.

  • I know they've brought it to Intel Macs now too-

  • Yeah.

  • They're not doing it in real time because they don't have a neural engine, but they're gonna do it, you know, whenever you need it to be done. But just the idea that because the silicon is the same now, it's like that slide they had where it suddenly grew and showed all the iOS features that you were getting on the Mac.

  • Right. Did you notice this at WWDC, back when they first unveiled Live Text? They purposefully showed some photos where the text was blurry in the background, like bokeh,

  • [John] Yeah.

  • They didn't wanna say, "Look at this, this text isn't even in focus and we can get the phone number out of it." Easily the feature of the year.

  • Some people have said that this was sort of the pandemic release, that a lot of the features were designed to make everybody's at-home lives easier. And it starts with FaceTime, because now you can blur out the ugly background in your home office, if you have to. And you can even share invitations with people who don't use Apple devices. There's no FaceTime for Windows or Android; they get a web interface, but you can include those in calendars, and it makes it really easy.

  • We're all on these voice calls now all the time. And they all stink in various ways.

  • [Rene] Yes.

  • We've made them better in a few ways, right? FaceTime now supports spatial audio.

  • Yeah.

  • And it's amazing. It is really immersive.

  • Yeah.

  • You cannot look at the feature set for this year's OSs and not think, "Okay, this was the year Apple got serious about remote meeting collaboration," right? It's probably the biggest win for most users this year. In my opinion.

  • Even though Messages didn't get as much as FaceTime this year.

  • Are you surprised by the fact that Messages for the Mac still doesn't have apps? And you know me, and I do send you a sticker once in a while.

  • [Rene] Yes, yes. You have like a ration of emoji and then like a smaller ration of stickers.

  • And it is weird where if I'm texting you from my Mac and I wanna send you a sticker, and I know you appreciate a good sticker, like a good-

  • I do. I do. Baby Yoda with a Tim Hortons cup sticker, like if you can combine things like that, perfect.

  • But it's so weird at this point, knowing the backstory of the Catalyst conversion and that they're on a shared code base, they're literally on a shared hardware platform now, right? That we've moved to Apple silicon. And you still have to switch to your iPhone or iPad to send a sticker. It's kind of weird.

  • They didn't catch fire, and they're not bad enough that they're gonna delete them, but they're not big enough that they're gonna really promote them anymore. The one thing that I'm not sure about is all this shared stuff. I know their argument is that people send you stuff every day. They send you news articles, and photos, and music, and you lose track of all of it. But now I go to my Photos app and I see all the dumb memes and screenshots my friends have sent me. I would almost prefer it if I could go to the person and see what they sent me. I know that's very personal, everyone's gonna have a different preference, but I just find having the app as the front page for that puts a lot of junk in my drawer.

  • Yeah. I have to say that that sounded good when they pitched it to me. So therefore I see why they green-lighted it. Maybe it is very personal, but in practice now that I'm living with it, it is not that useful, in my opinion.

  • Have you used the Focus modes much? I've used the bare functionality of Focus mode. And I like that it says, "You know, John has notifications off," things like that, but I haven't made 31 different Focus modes with 81 different home screen layouts or anything like that yet.

  • Honestly, this is one where I've mentioned it to a few friends over the last few weeks, and it's gotten the most, "Whoa Grubs, you're way off base," 'cause most of the people I talked to love it and they're really into it.

  • [Rene] Yes.

  • And they're setting up these great. And my angle on it is-

  • But you're talking to Merlin.

  • I know, but my angle is I hate it. I liked the binary nature of I'm in normal mode, or I'm in Do Not Disturb mode.

  • [Rene] Yeah.

  • And that's it. And maybe it's, I'm not trying to be self-effacing, but it's not that I'm a dummy, but that sometimes it's like, I need the reduction in complexity. Where Do Not Disturb, I love it. You know, what is one of the greatest inventions of all time? Everybody has to agree with this, the Light Switch.

  • Well, but a dimmer switch, you press it and you never know which way is off, so you go past it every time.

  • Right. Don't get me started on dimmer switches, Rene. I mean, seriously, it's gonna be the end of my marriage. You know, I hate 'em. I don't like 'em. I like lights on, lights off. Do Not Disturb is on, or it is off. So I don't blame Apple for making it, because I can see from talking to friends that some people really like having-

  • [Rene] Yes.

  • three, four different modes that they're in with different settings. I get it. And you know, not for me-

  • I've always wanted Apple to take the VIP setting and make it universal, make it not email-level, but contact-level. Because if I wanna get highlighted emails from you, for example, I also wanna see your messages when I might be in Do Not Disturb, or I wanna see, like, just everything on the system. And they never did that. We had the favorites in the Phone app, which had certain priority privileges and other things, and this sort of fixes that. So if you do have toxic family members, which is by no means unique, especially over the last two years, you can set it up so that those people think you're in Do Not Disturb, while the nontoxic family members think you're available for all sorts of different messaging. And I think that is valuable. It just takes a little bit of extra work to set it up.

  • And again, I guarantee you, there are people watching your videos who love these new features because people are super into it.

  • Yes.

  • And I'm enthused for them. I know what it's like to feel heard. You know, like, "Oh my God, this is the one thing I really wanted to nerd out with, and the fact that it syncs across all my devices..." I'm happy for you. But then some of the stuff they did go in and do, they broke, right? Like Shortcuts.

  • Yes.

  • So there were no Shortcuts for Mac, and clearly, you know, that was holding Shortcuts back as an Apple platform-wide automation technology. And they're like, "Okay, we'll do Shortcuts for Mac." And they sort of blew up the whole thing.

  • Well, it's interesting because SharePlay got postponed and Universal Control got postponed. So those sort of get some breathing room, but not Shortcuts, Shortcuts are-

  • And I do feel the difference is that SharePlay, it's either there or it's not there. And if it's not ready, you can hold it. Shortcuts, it's already there for iPadOS and iOS, and therefore they kind of need to ship what they have, because, you know, to their credit, they're all in on the new version. The new version just is not that great yet.

  • The one thing I thought was really interesting was Quick Notes. Like just the whole idea, the concept of a lightweight note-taking system that is persistent, not just persistent within the Notes app, but now it's persistent in Safari as well. Having it as part of the OS gives it a lot of advantages for me. So I've been using it actually more than I thought I would.

  • You know, it's interesting because if anything threw me off with it, it was that when it was first demoed at WWDC, I thought it was very pencil-centric, and that the Pencil was the only way to do these, and that they could have been called Pencil Notes instead of Quick Notes. And so I got it in my head that you have to be on your iPad, you have to be using your Pencil, and that's not true. You can do it from any platform. You know, I think they're committed to it, so I feel like this is it. But it's like you don't really notice how much you need it until you actually are jotting down a thing, right? You're on a website, you know, ordering something, and you're like, "Oh, I just need to get this order number stashed away so I don't forget the order number." Oh, boom, Quick Note.

  • The way I look at the cross-platform stuff is sort of like how I've always looked at it, where the iPhone is for very fast data entry and triage on the go. The iPad is sort of for sitting back and looking at a broader view and doing aggregation and reporting and that kind of stuff. And the Mac is just for mass entry or mass editing of everything, because it's still so much faster. The Pencil feels really organic. Everything from swiping up the Quick Notes, to writing them, to doing them, it feels like a lightsaber, like a more elegant weapon from a more civilized age. But the Mac can just chew through so much stuff. So I guess, let's dive into Safari now, because this has been so interesting for me. Because typically Apple is so good at nailing it internally and then pretty much ignoring feedback, where this time it feels like they totally missed it internally, but were really good at listening to the feedback.

  • The last five months of Safari evolution and then devolution, I mean, I don't know how else to say it, has been one of the most fascinating things as an Apple observer, ever. And it's exactly what you think, where there effectively were two tribes, maybe three tribes. So you've got the one tribe that was like, "These changes are really cool. We're evolving Safari in a modern way. Why don't we rethink this?" There's another tribe that was like, "These changes don't make any sense if you have more than three tabs in a window. What are you doing? We cannot break people's habits like this." And if there's a third tribe, it was people in the middle who maybe didn't have a strong opinion. But I didn't hear a lot from that tribe. You know, it was mostly people on one side or the other.

  • Can I also just clarify that the people I heard from most in terms of liking it were ecstatic. Very subjective, and variously ecstatic. And the arguments against it were almost purely objective and functional. And I'm not saying that as an overall criticism, because, you know, personal taste has a lot to play in this. But for the last couple of months, I have not been able to consistently tell a tab from an address bar. And I keep clicking into a tab thinking I can type into it. And that just should not happen.

  • A lot of people who like them, especially on the Mac in particular have sent me screenshots. It's unbelievable how many people have done this. They've sent me screenshots of my website-

  • [Rene] Yeah.

  • In Safari. And they're like, "Okay, I get your arguments, but you have to admit, your website looks really cool in Safari." And I'm like, "Well, it sort of does, but that's not the point. The point isn't to look cool. The point is to be cool."

  • That's not my menu, but it looks like it's part of my site, and it's not. I have this thing that I modestly call Ritchie's Law, where whenever somebody says, "Steve Jobs would never," you can always go back a very short period of time and find several examples of him doing exactly that. Now, that's not to say he would tolerate damage to things he cared about. If you damaged an app or a function that he cared about, he would be in the room throwing it against a wall. But you can also tell which apps he didn't ever use. So I'm not gonna say it's one of those things. But I do think that some of the executives were really famous for being on the betas. And I do imagine there was a point where, whether it was Tim Cook or somebody else, someone called and said, "Where are my tabs?" And somebody started to give him an answer, and he's like, "Why aren't you on that plane?" You know, why aren't you on the plane fixing my tabs? And then it was very clear that this was gonna get handled before-

  • But in the end it worked out right? Which is kind of, you know, the system works.

  • And I do think, especially on the brand-new, newly announced MacBook Pros with M1 Pro and M1 Max especially, they've done a really good job. The big advantage they have is that they control the silicon, the hardware, and the software. And they've been building these MacBooks for going on three years. And what's fascinating to me about it is you can go back to the Mac Pro controversies in 2019; they were building this stuff back then, they knew all of this was coming. You know, the 2019 16-inch MacBook Pro, they were building these MacBook Pros while all of that, with the butterfly keyboard, all of that was happening. And I think when they landed on these products, again, especially the new MacBook Pros, they are so fast. Like the one thing I enjoy the most about Monterey is that even on Intel, but especially on M1, M1 Pro, M1 Max, it's not just how fast apps launch or how fast websites render, how fast video renders, but I'm living a beach-ball-free life. I don't get slowdowns. I used to have to wait over and over again when I would skim a timeline, or I'd launch Mail and go to VIPs and it would literally take a minute to load. Everything is iPad-level instant. And that, I think, is that beautiful dream that Steve Jobs had when he slapped that iPad down on the table and said, "Why can't the Mac do that?" And we've gotten to a point now, with M1 and all the new processors and Monterey, where the Mac absolutely can do that.

  • They have definitely been approaching this for a long time. And I do think that now that we see, and again, the transition isn't over yet, we still don't know the full-size iMac story. We certainly don't know the Mac Pro story, which is, you know, arguably the most interesting technical question. But for most Mac users, they now have an Apple silicon Mac to buy, right? Because most Mac users are buying MacBooks. And the entire MacBook line has been transitioned.

  • Yeah, I think that's my overall take here. And because the new Macs are shipping with Monterey, I'm gonna be going all in on production. I usually wait six months to a year before upgrading a production machine, just because you never know, especially with plugins and all the little apps and everything, how long it's gonna take. But the gains are so good, the environment is so good, it is so much better to work on, that I'm immediately switching over to Monterey. And I think a lot of the credit goes to Monterey, because I've been using it all through the beta cycle. And aside from, you know, Safari, everything has been working great. I've had zero problems. That's the first time I've had the confidence to just make that switch immediately.

  • Yeah. I would say long story short, overall, Monterey feels to me like one of those, "Hey, if it ain't broke, don't fix it," OS releases. Where they're like, "You know, this platform is in great shape. Let's not mess with stuff that doesn't need to be messed with and keep it going."

  • But I also think that they had to rewrite so much, everything from the big stuff like APFS on down. Over the years, they basically rebuilt the entire understructure of macOS, and each of those rewrites was painful. And we experienced that pain as macOS users for five years under Intel; discoveryd was probably the biggest example of that. You know, rewriting foundation blocks one after the other, but it got us to a point where, when they had to make that transition, all that pain was spent. That was all budgeted out. We spent it over five years, and it was a really good transition. And then we just hit the ground running with Big Sur, and now straight into Monterey.

  • So, you know, overall, I would say to anybody who's already on Big Sur: you should probably upgrade. I mean, do you wanna upgrade on day one? Maybe wait 'til day two and see if anything bad happens on day one.


AirPods 3 Review — New Best Buds?

The AirPods 3 look like the AirPods Pro. Same oblong buds and shorter, more angled stems, with a virtual force sensor button for media control. But there are some critical differences: No silicone tips, for people who find those annoying or downright painful. Also, spatial audio for a truly immersive experience, but no active noise canceling. Adaptive EQ, which constantly corrects the audio based on how exactly the bud is sitting or moving in your ear, but no Transparency mode. Much better Find My if you lose a pod, longer battery life, MagSafe and sweat and water resistance in the case, and so much more functionality, thanks to years of AudioOS updates, that they're not just a whole entire 3D sound stage in your ears — they're basically external cybernetics. If only they weren't so annoying sometimes. Let me explain!

I’m Rene Ritchie, thanks to CuriosityStream with Nebula for sponsoring, hit that subscribe button so you don’t miss all the reviews I have coming your way, like this one, for the brand new AirPods 3!

So, yeah, AirPods 3 look like the lovechild of AirPods 2 and AirPods Pro. Which I think is great. Same iconic style but edgier design. And most importantly, it leaves an option for people who don't like anything literally in their ear, because it's uncomfortable or it hurts. I mean, I'm the opposite, my ears have been mangled by a decade of Brazilian jiu-jitsu, so I need those silicone tips for them to have even a chance of staying in, but that's literally why options are so important.

Except for the double tap. Switching to the force sensor for media control is way, way better, simply because it no longer feels like I'm combo-smacking myself in the brain every time I want to skip a track.

I'm still missing colors, though. The consumer ends of iPod, iPhone, iPad, even Mac and HomePod mini all have colors now. AirPods, only the high-end Max. I get it, white's iconic, it's the gloss that launched a million iTunes ads, but I'd still love to see the full range of Apple-chromatics here.

Setup is as easy as ever, which is great. Add it to your iPhone and iCloud propagates it to all your other devices. Only issue is, ever since Apple added auto-switching, a lot of us have experienced something way more like rando switching.

I’m on a live stream on my Mac, listening, talking, a message comes in, and all of a sudden I’m being torn between worlds. Guessing user intent is tough, no question, but active audio streams should just never, not ever be interrupted. It even puts a huge, persistent block up on the Apple TV now. It just… needs a lot of refinement still.

I'm with Apple on spatial audio, though. It's the future. But where the AirPods Pro make the 3D sound stage feel like it's in your head, because they're in your ears, and AirPods Max make it feel like it's all around you, like a theater, because they're over your ears, AirPods 3 are somewhere in between, because they're open-backed. It's not as claustrophobic or as lonely. And I really dig it.

The dynamic head tracking part is cool too! It lets you treat your iPhone or iPad or Apple TV as a static speaker, and walk around without that audio following your headphones; it stays locked to that speaker, like it would in real life. And it's especially cool with the new FaceTime group call feature, because people sound like they're where they are. It's almost freaky. But normal. But freaky!

There's no active noise canceling, Transparency mode, or Conversation Boost, which all require the additional, more expensive components of the Pro or Max. And I really like those for travel, work, and working out. But again, the openness of the AirPods 3 is still compelling, especially if you don't need those features.

The new, inset mic is pretty good, especially paired with the new AAC-ELD codec for better HD voice. I'm not going to VO any videos with them, except in a pinch, but for calls, they work more than well enough.

And the whole system sounds great. Again, I'm that monster that listens to podcasts on phone speakers, not the audiophile who won't get out their music app for any headphones under $10K, but they've got a good mix of fairly booming bass and crisp highs, chocolate overtones with just a hint of nut and fruit. Oh, sorry, that's the coffee. I'll link up some hardcore audio reviews from El Jeffe, TLD, and the like as soon as they hit. But these sound better than AirPods 2, if not as good as AirPods Pro, to me, which totally tracks.

You get up to 6 hours of battery life per bud now, up by 1 hour. 30 hours total with the case. And you can charge up to 1 hour of listening time in 5 minutes if you get stuck. I typically rock one bud at a time, because mono podcasts and audiobooks, so I can go for days.
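The battery math above is simple enough to sanity-check. A quick sketch, using only the figures quoted in this review (these are Apple's stated maximums, so real-world numbers will vary with volume and features):

```python
# AirPods 3 battery figures quoted above (Apple's stated maximums).
HOURS_PER_CHARGE = 6        # listening time per bud charge, up 1 hour
TOTAL_WITH_CASE = 30        # combined listening time with the case's reserve

# The case holds enough reserve for four more full bud charges.
extra_recharges = (TOTAL_WITH_CASE - HOURS_PER_CHARGE) / HOURS_PER_CHARGE
print(extra_recharges)      # 4.0

# The quick charge: 5 minutes in the case buys 60 minutes of listening,
# i.e. charging outpaces playback drain by roughly 12 to 1.
quick_charge_ratio = 60 / 5
print(quick_charge_ratio)   # 12.0
```

In other words, a few minutes in the case while you grab coffee covers most of a commute, which is why the one-bud-at-a-time habit stretches to days.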

And both the pods and the case are IPX4, which means no dust rating, but they are water and sweat resistant. Yeah, including the case. So don't swim or dive with them, but if they get rained on or splashed and you wipe them off after, they should still be good to go.

What continues to blow my mind though, is the beyond music stuff. The ambient computing. The stuff we’ve all wanted since Star Trek or just Tony Stark.

It's getting directions, workout metrics, notification announcements, and being able to ask for pretty much everything and anything now that really makes AirPods feel like external cybernetics.

I mean, clocks went from the town square to the living room to the pocket to the wrist and then stopped. Computers have now gone from mainframes to desktops to our hands, wrists, and now they’re taking over our heads.

While we're all waiting for AR glasses, we have AR for our ears here, now, today, already. And it's increasingly amazing.

Plus, I all caps love how Apple keeps introducing new features at the top of the line, paying them down, and then pushing them down to the entry level. Now the base model AirPods have the Pro design, a MagSafe inductive charging case, spatial audio, and Adaptive EQ, all for $179, which sounds like a lot, but packs in more technology than ever. And if that's just too much for you, Apple's keeping around the OG AirPods 2 for $129. They don't have all the features, but they don't cost all the money. And they work. So, if you've been waiting, you'll get the same for less money, or even more for the same money.

Now, if you already have AirPods, should you upgrade? If what you have still works well for you, has the features and battery life you need, then no. But if you want the new look, or the Pro look without the Pro tips, or the new features, even a new, longer battery, then yeah.

Bottom line, these are the new baseline iPhone buds to beat. And they make for one hell of a force multiplier for all your other Apple gear.


M1 Max MacBook Pro Review

I'm breaking down Apple’s brand new M1 Max 14-inch MacBook Pro. That’s 8 performance cores, 2 efficiency cores. 32 graphics cores. 64 gigs of unified memory. And 2 terabytes of ultra-fast SSD. Oh, yeah!


M1 Pro vs. Max — What Apple Didn’t Tell You!

Apple's brand new M1 Pro and M1 Max MacBook Pros: you had questions, you wanted answers, you wanted the truth. I can give you all the truth that you can handle. Let's do this. As always, supporters over at get Q&A priority.

So, William Edstrom: "Is it possible that Apple could enable Face ID on these laptops, or is the necessary hardware not included?" And yeah, the necessary hardware is not included. Face ID is more than just the RGB camera, which is what's on these new MacBook Pros. It's the infrared flood illuminator and dot projector that let it actually scan the geometry of your face. And I think the reason we don't have Face ID on the MacBooks, on any Mac yet, is that it's a fundamentally different concept that Apple still has to figure out. To just authenticate someone, to look at their facial geometry and say, "Yes, it's you. Unlock." That's fine. That's transparent. You sit down in front of it, you open it, and that just works. But to authorize purchases, it becomes trickier, because you don't want to just authorize a purchase anytime you're looking at the device. That's where a lot of fraud takes place, and Apple has to be very careful to avoid that. So on the iPhone and the iPad Pro that have Face ID, you have to double-click the button to add a direct user interaction, to show intent, when you wanna make those purchases, when you wanna authorize those purchases. And that is unclear on a MacBook. For example, do you reach over and double-click the power button? Because in that case, that's where the Touch ID sensor already is, so you're basically being forced to do a Touch ID-style action. It doesn't really save you anything. It doesn't increase the utility at all. And I think that's why Apple is sticking with Touch ID.

Robert Rehders: "Curious if there is any insight available regarding thermal throttling on the 14-inch versions." The 16-inch MacBook Pro is just bigger than the 14-inch MacBook Pro, so theoretically, well, not theoretically, practically, it has a bigger thermal envelope. I think with the M1 Pro you'll never see a difference. You know, quote-unquote never. If you're in Arizona in 110-degree temperatures, you know, that's gonna change things. But with the M1 Max, when you're up to 32 GPU cores and you're hitting other parts of the silicon IP, I think, for sustained loads, you will potentially fill up the envelope of the 14-inch and have that extra capacity in the 16-inch, again, because physics. But I don't think most people in most workloads will ever come close. You'd have to do that in testing, or if you're an ultra-high-end 3D artist just layering on every effect imaginable.

Jeffrey David Marraccini, and I hope I'm pronouncing that right: "Curious if this is 2X Retina or not?" And it is. It'll default to 2X Retina, and you should be able to pick lower resolutions for increased legibility and higher resolutions for increased information density, just like you can on current MacBook Pros; it's just that the default will finally be pure 2X Retina.

PK: "Do we know how scalable the silicon architecture is? How big can the SoC go with GPU and compute cores for desktop models?" And I think we've seen rumors from Mark Gurman on this, where they're gonna get hella big. Like for the Mac Pro, they could be up to 40 CPU cores and 128 GPU cores. So it's just full-on scaling. There doesn't look to be a limit within a reasonable range for the kind of products Apple wants to make; up to and including a Mac Pro, it seems to be able to scale just fine.

Craig Doran: "Why the reduction in efficiency cores by half? Can they still be referred to as Icestorm? Are the remaining two e-cores changed to be more powerful or efficient?" And yeah, I think it's a difference in focus between the M1 and the M1 Pro and M1 Max. And they are still Icestorm cores. M1 Pro and M1 Max are the same silicon generation as M1, which is A14-generation silicon, and that means Icestorm efficiency cores and Firestorm performance cores. There's just a different arrangement. So instead of four and four, we now have two and eight. Because while the battery life is really good on these computers, that's not the focus. That was the focus on the MacBook Air and the two-port MacBook Pro. Here, the focus is on performance. So they're really giving us as much performance as they possibly can, and that means double the number of performance cores. And there are some things, like Low Power Mode, which is gonna help make up for some of the battery, you know, if you really need to push it. But when you start cranking up the workload and you start firing up all those cores, especially all those graphics cores, it's gonna be all about performance, not about battery life anymore. And that's the substantive change here.
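The core arrangements described in that answer can be laid out as data. A minimal sketch, using only the counts given above (same A14-generation Icestorm efficiency and Firestorm performance cores, just arranged differently per chip):

```python
# CPU core layouts as described in the answer above.
CHIPS = {
    #           (efficiency, performance)
    "M1":       (4, 4),
    "M1 Pro":   (2, 8),
    "M1 Max":   (2, 8),
}

for name, (e_cores, p_cores) in CHIPS.items():
    total = e_cores + p_cores
    print(f"{name}: {e_cores}E + {p_cores}P = {total} CPU cores")
```

The trade is visible in the data: M1 Pro and Max give up two efficiency cores to double the performance-core count, which matches the performance-first framing above.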

Arthur Kahwa: "How loud is the 16-inch when the fans kick in? Do we have a comparison with the latest Intel models?" And there haven't been any reviews released yet, so we don't really know. My guess is that they are gonna be really quiet. I mean, with the M1 two-port MacBook Pro, a lot of people didn't even realize there was a fan in it because it was so hard to get that fan to kick in. And I think in normal modes, it'll still be hard to get the fan to kick in, and it'll be hard to hear the fan when it does. There is a high-power mode in these new MacBook Pros that you can enable manually, that will force the fan to ramp up. That's something that people have been asking for for a while. It's like, "Apple's stock throttling is so aggressive, just give me the ability to increase the fan. I don't care about the noise, I care about the performance. Give me that option. Just let me do it." And they are. They're putting you and us in command of our own destiny there.

Steven Kaplan: "I would love to see some information on how much time you save in your workflow from these new laptop models." One of the biggest advantages, to me, of M1 over Intel with the MacBook Air and the two-port MacBook Pro was that it was just so instant. Everything was instant. Like iPad-level instant. And you forget. You think about things like render time, and it renders five minutes faster, and is five minutes really worth it? Well, if you're rendering a bunch of videos, yeah, five minutes adds up quickly. But I'm basically living a life without beach balls on M1. And when you count up all the beach balls, all the little delays, all the seconds that add up on top of each other every minute, out of every hour, out of every day, that is not just a phenomenal increase in the quality of life, it gives you back a significant part of your life.

Next up, YouTube. And if you didn't manage to get your question in, make sure you hit the bell and the subscribe button, because I typically hang out in the comments for the first hour or so after any video goes live, so I can answer anything extra that you wanna throw at me. Throw everything at me.

Philip Dunphy: "Some context on which workflows would match well with each tier of chips, because I don't wanna overpay for power that I don't need." If you need higher-end gear, you typically know it, because you're hitting up against some sort of constraint, some sort of roadblock, in your current workflow. So, for most people, I imagine the M1 Pro will be plenty. But if you're working on something that needs literally double the GPU cores, any of those high-end apps that are GPU-bound, where the more cores you throw at them, the better, faster, and more they can do, that's when it makes sense. So if you've never had a problem in that area, you certainly won't have a problem in that area now. But if you've always been like, "More!", like in rock style, "We've gotta give the people more." Likewise, if you do a lot of ProRes editing, or a lot of video editing in general, you're getting twice the media encode and decode blocks. Well, the encode block on standard H.264 and H.265, but encode and decode on ProRes. So if you're doing workloads that are bound on video compression, then you're gonna see a doubling of that capacity on the Max, and you're probably gonna wanna consider the Max as well.

UltimatePokeGamer7: "Based on these chips, what are your predictions for M2?" So, yeah, if we go back to M1 being based on A14-generation silicon, I'd expect M2 to be based on A15-generation silicon, the chipset Apple just shipped with the iPhone 13. And that means, instead of Icestorm efficiency cores and Firestorm performance cores, we'd get the Blizzard efficiency cores and the Avalanche performance cores. And instead of the G13 GPUs, we'd get the G14 GPUs. In the performance cores, it's not a huge difference; they tickled the frequencies a little bit higher. But the efficiency cores were close to 30% better in performance, with double the cache, and the GPU performance is just way better, although throttled in the iPhone, because it's such a small thermal envelope, but probably not throttled anywhere nearly as much in something the size of a MacBook or a MacBook Pro. So just better again, over the M1, in ways that continue to push the envelope, not just in performance overall, but in efficiency, because it is just a way, way cooler chip overall.

Omer Horovitz: "Does it have HDMI 2.0 or 2.1?" So it has HDMI 2.0, like the M1 Mac mini, not HDMI 2.1, like the current-generation Apple TV (I think it's A12). And I think that's just because Apple's not delivering anything yet that requires HDMI 2.1. Like, don't get me wrong, I would take it. I love it. I want all the ports at all their capabilities, even if they're not being used for anything, just because I'm a nerd and nerds can dream. But the main difference with HDMI 2.1 is that it can handle up to 8K at up to 120Hz, and it can handle variable refresh rate. And I don't think Apple is outputting that from the Mac, not now, and from these models maybe ever. So HDMI 2.0 probably fit their overall bill of materials and the capabilities they wanted to deliver, and they just saw no need for HDMI 2.1 yet. That'll be in a future Mac that is designed for an 8K, 120Hz world.
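A back-of-the-envelope calculation shows why 8K at 120Hz is an HDMI 2.1-class problem. This sketch deliberately ignores blanking intervals and link-encoding overhead; the 18 and 48 Gbps figures are the HDMI 2.0 and 2.1 spec maximum link rates, which aren't in the passage, so treat them as my added context:

```python
def raw_gbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video data rate in Gbps (8-bit RGB by default),
    ignoring blanking and link-encoding overhead on purpose."""
    return width * height * fps * bits_per_pixel / 1e9

# HDMI spec maximum link rates, in Gbps (not from the passage).
HDMI_2_0 = 18.0
HDMI_2_1 = 48.0

print(raw_gbps(3840, 2160, 60))    # ~11.9: 4K60 fits inside HDMI 2.0
print(raw_gbps(7680, 4320, 120))   # ~95.6: 8K120 exceeds even HDMI 2.1's
                                   # raw link rate, hence DSC compression
```

So even before overhead, 8K120 is several times past what the port on these MacBook Pros can carry, which is consistent with the "no need for HDMI 2.1 yet" read above.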

Chidorirasenganz: "Why didn't they go for an even bezel like the iPad, instead of the notch?" I mean, I can't unsee it. I just wish it wasn't there. It's like a needle in my eye all the time. But, honestly, 99% of the time I don't even notice it anymore. And the difference between the iPad, even the iPhone, and the Mac, is that with the iPad you have to be able to hold it in your hands at every orientation, landscape and portrait, and that means you need a bezel in every orientation. And if you have that bezel, you don't need a notch. You can't have a notch; there's no utility to it. Where on the iPhone, for example, it gives you extra screen. If there wasn't a notch there, there wouldn't just be no notch, there would be a forehead, and you would lose those extra pixels where right now they're shoving things like the battery indicator. And it's similar to the approach they've taken with the MacBook. Because you're not holding the MacBook in your hands, they can make those bezels really tiny. You don't need to hold onto the screen, and because of that, they can remove as much of the bezel as possible. Now, they need to leave some, because they've put a much bigger camera in there now, a 1080p camera. So they're doing exactly the same thing they do with the iPhone, and that is deleting the forehead. They're not adding a notch, they're deleting the forehead as much as possible, and then shoving the menu bar up there so that everything below the notch is completely usable screen. The alternative would be a full-on forehead with the menu bar, so you'd lose just that little bit more screen. And even though I don't find it attractive, understatement, at all, I will always take as much screen as you can give me.

Christmas Clatter: "Will there ever be a Mac mini version?" I think there'll be an M1 Pro and an M1 Max Mac mini. But I think Apple was really focused on getting this machine out, just like they were with the M1 iMac last spring. They have limited bandwidth in the Mac department, and they wanna really get these right and focus on one of these machines at a time. So I figure next we'll get the Mac mini, and maybe the M1 Pro, M1 Max, M1 Extreme, whatever they call the even higher-end chipset, either together or staged out next year, leading up to and including the Mac Pro by the end of next year, like the two-year transition that Tim Cook promised us.

Eddiemcohen: "Does more RAM really make a difference with unified memory?" And yes, absolutely. I think there are still a lot of misconceptions about what unified memory really means. What it is is just one big pool of memory that's accessible to every part of the SoC, the system on a chip. That includes the GPU, the CPU, the Neural Engine, the image signal processor or ISP. And it means they don't have to copy data back and forth, so it's way more efficient. But compare it to a classical computer with a board: there, you have the CPU and the RAM that feeds the CPU, and then a GPU with its own VRAM, and especially in a laptop, I don't think the VRAM available to the GPU has ever exceeded 16 gigabytes. Now, with these machines, you have 16 gigabytes that can go to the CPU or the GPU. I mean, not fully, there is some overhead, there is some sharing involved, but you have a larger pool of memory. And when you get to 32 gigabytes, 64 gigabytes, being able to feed that to the GPU is just unheard of on a laptop. Also, unified memory isn't magic. Eight gigabytes of unified memory is not equal to 16 gigabytes of traditional memory. All it means is that, because of things like memory compression and the ultra-fast SSD for swap, it's really all that it can be. You get the most out of it. But if you have memory pressure, like if you're using a lot of really memory-hungry, memory-intensive applications, if you know what memory pressure is, basically, and you're hitting it, the only way to solve that is more memory. Unified memory doesn't help you there; you need more memory. And if you're doing ultra, ultra-high-end work, stuff that you would previously do on a mid-range Mac Pro, you're still gonna want as much RAM as you possibly can get, to be able to do things you could never do before in a MacBook Pro, a mobile computer.
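The shared-pool idea above can be illustrated with a toy model. To be clear, the class and its behavior here are my invention for illustration, not Apple's allocator or API; it just shows the structural difference from a fixed RAM-plus-VRAM split, and why memory pressure still means you simply need more memory:

```python
class UnifiedPool:
    """Toy model of a unified memory pool: CPU and GPU allocate from one
    shared budget instead of fixed RAM and VRAM partitions.
    Purely illustrative; real allocation is far more complex."""

    def __init__(self, total_gb):
        self.total = total_gb
        self.used = {"cpu": 0, "gpu": 0}

    def alloc(self, client, gb):
        # Memory pressure: once the shared pool is exhausted,
        # no amount of "unification" helps. You need more memory.
        if sum(self.used.values()) + gb > self.total:
            raise MemoryError("memory pressure: pool exhausted")
        self.used[client] += gb

# On a classic laptop, the GPU is capped by its own VRAM (historically
# 16 GB or less). With a 64 GB unified pool, the GPU can be handed far
# more than any laptop VRAM ever offered.
pool = UnifiedPool(64)
pool.alloc("cpu", 16)
pool.alloc("gpu", 40)           # fine: 56 of 64 GB in use
print(sum(pool.used.values()))  # 56
```

Asking this pool for another 10 GB would raise the `MemoryError`, which is the toy-model version of the point above: unified memory makes the pool flexible, not bigger.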

Okay. Twitter time.

Stephane Ceretti, huge fan of your work. "Which one did you order?" I went for the MaxBook: the just-maxed-out 16-inch, everything I could put in it, because I'm in the position of time being way more important to me than the cost of this machine. I mean, I don't spend money on anything else, like, literally nothing else. I spend it on gear because that's how I earn my living. And doing these videos, being able to do them faster, at better quality, with less frustration, really does pay for itself.

Dr. Ian Cutress, also a big fan of your work at AnandTech. "Why are there 80 SKUs? 160 if you account for two colors. Isn't that a logistical nightmare?" My guess is they've gotten so good at managing SKU complexity on the iPhone, all the colors, all the models, all the configurations, that the relatively much smaller MacBook Pro lineup is a breeze by comparison.

Alex Miller. "Truly curious what the M1 Max means for potential AAA gaming." iOS is such a big platform that it enjoys a huge amount of support from developers, and that's just never been true on the Mac. The ability to run iOS games means you'll be able to run as much as developers allow; they still get to choose whether it's supported or not. For any developer that does support running their iOS games on the Mac, this should be terrific. But the studios are still going to have to decide to support and launch those games on the Mac, and that's more of a developer relations problem than a hardware problem, and it has always been thus.

Lew Piper. "Why no Center Stage? Is the sensor on the camera not big enough?"
What Center Stage is on the latest generation of iPads is basically an ultra-wide camera that Apple crops down to just a regular wide camera, but pans, scans, and zooms in and out: if you're talking, it stays framed on you, but if you move around or somebody else comes into the frame, it expands to capture everybody, moves around with them, and zooms back into you if they leave. And this just does not seem like that kind of camera. This is not one of those ultra-wide cameras. Apple really focused, I think, on the single-person perspective: just getting a really, really good webcam in here for all the Zoom calls, all the FaceTime calls that we're making. Maybe in the future we'll see an ultra-wide, but that seems to be a deliberate product differentiation for them at this point.
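
The crop-and-pan idea behind Center Stage can be sketched in a few lines. This is purely illustrative, not Apple's algorithm, and the frame and window widths are made-up numbers: cut a regular-width window out of an ultra-wide frame, center it on the subject, and clamp it so it never leaves the sensor.

```python
# Hypothetical sketch of Center Stage's pan/zoom: crop a normal-width
# window out of an ultra-wide frame, centered on the detected subject.
# Widths are illustrative, not real sensor dimensions.

def crop_window(subject_x, frame_width=4096, window_width=1920):
    """Return (left, right) edges of a crop window centered on subject_x."""
    half = window_width // 2
    # Clamp the window center so the crop stays inside the sensor frame.
    center = min(max(subject_x, half), frame_width - half)
    return center - half, center + half

print(crop_window(2048))  # subject mid-frame -> (1088, 3008)
print(crop_window(100))   # subject near the edge -> window pinned at (0, 1920)
```

A real implementation would smooth the window's motion over time and grow it to fit multiple faces, but the clamp-to-frame step is the reason an ultra-wide sensor is required: a regular wide camera has no spare pixels to pan across.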

Nitesh, "One billion colors equals 10-bit panel?" It's still an eight-bit panel on the MacBook Pro in terms of hardware, and Apple is still doing things like temporal and spatial dithering to simulate more colors. But they've gotten so, so good at it that I think for most people it's going to be indistinguishable from what you would see on a 10-bit panel. And through, I don't wanna call it trickery because it's not trickery, but through display technology, they're probably exceeding what you would get from a raw, unoptimized 10-bit panel anyway. It's less about hardware now and more about the sum total, the complete package. I would love to see an actual 10-bit panel, and I would love to be able to compare this kind of eight-bit panel that's capable of simulating one billion colors against what an actual 10-bit panel would provide. But based on what we're seeing on the iPad Pro now, it looks amazing.
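
The temporal half of that dithering is easy to show with a toy example. A 10-bit level has four times the resolution of an 8-bit level, so a simple (simplified, not Apple's actual pipeline) scheme alternates between the two nearest 8-bit levels over a run of frames so their average lands on the 10-bit target:

```python
# Simplified temporal dithering: approximate a 10-bit level (0..1023) on an
# 8-bit panel by alternating the two nearest 8-bit levels across frames so
# the time-average matches the target. Real panels combine this with
# per-pixel spatial dithering; this is just the core idea.

def dither_frames(level10, n_frames=4):
    """10-bit level -> list of 8-bit frame levels whose mean ~= level10 / 4."""
    base, frac = divmod(level10, 4)   # 10-bit = 8-bit * 4 + remainder
    hi = min(base + 1, 255)           # clamp at the 8-bit ceiling
    # Show the higher level on `frac` of every 4 frames, the lower on the rest.
    return [hi if i < frac else base for i in range(n_frames)]

frames = dither_frames(513)           # 513 / 4 = 128.25
print(frames)                         # [129, 128, 128, 128]
print(sum(frames) / len(frames))      # 128.25, between 8-bit steps 128 and 129
```

Flicker at display refresh rates is too fast for the eye to resolve, so the viewer perceives the in-between level, which is how an 8-bit panel advertises "one billion colors."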


AirPods 3 — Apple Expert Reacts! (+ HomePod mini)

Apple has just announced new blue, orange, and yellow colors for the HomePod mini and — finally! — the 3rd Generation AirPods with spatial audio, Dolby Atmos, water resistance, longer battery life, and MagSafe charging. Here are my live reactions!

  • These are the new AirPods with spatial audio.

  • Hi, I'm Rene Ritchie. I'm reacting live.

  • HomePod Mini.

  • It was introduced last year in October, almost exactly a year ago as part of the iPhone 12 event.

  • HomePod Mini is a powerful smart speaker that delivers amazing sound.

  • And Apple has really been putting a lot of time and effort into their audio lately, sort of like what they did with their cameras a few years ago. They built a huge sound lab just down the street from Apple headquarters, from Apple Park, where they have all sorts of acoustical chambers, from a -2 decibel anechoic chamber to a room where they can actually model different acoustical patterns that they've sampled from different Apple employees. They can make the room sound like the rooms in the homes of those Apple employees, and they can use that to test how different products, like HomePods, sound in all those different rooms. So, it's not new HomePod mini hardware. It's not a new HomePod biggie, which I think a lot of people, given what OG HomePods are going for on eBay these days, have been asking for, begging for: a HomePod theater, something that replaces the Apple TV and provides Dolby Atmos sound, or a HomePod with an iPad attachment, a dock, or just a screen to use in your kitchen or your bedroom. It doesn't look like there's new hardware now, but it does look like there are new colors.

  • [Narrator] We infuse color into every element. From the beautiful mesh fabric to the tinted touch surface, to details like the volume icons and woven cable.

  • Apple is really leaning into color on their consumer lineups, not on the pros. It's not that pros don't like or want color, we like color and we want color, but it's distracting. If you're trying to do photo editing and there's color in your vision, it can really change the way you perceive the color in the photo or the video. So you want to keep interface colors to grays, different shades of gray, so that it doesn't distract you or literally change the response of the cones in your eyes. But for consumer products, things that are fun, that you aren't doing pro work on, color is not just a differentiator but a feature.

  • Hey Siri, set a reminder to buy wood glue.

  • [Siri] Okay Jordan. Added to reminders.

  • Apple is pushing Siri heavily here, even though Siri actually lost some features recently with the shipment of iOS 15 and iPadOS 15. They cut back on things like accessing your mail, which was hugely impactful to the accessibility community; it was a real negative, it hurt them. Hopefully that was just so that Apple could put this new system in place, because they've run out of free credits when it comes to Siri. They absolutely have to deliver.

  • [Narrator] These new colors join white and space gray and will be available for just $99, starting in November.

  • Same as the original HomePod mini.

  • We want to bring the full spatial audio experience to even more of our customers. So today, we are excited to announce the third generation of our most popular AirPods.

  • So it sounds like AirPods are finally getting a little bit of event time, which they haven't gotten since the original in 2016; like I mentioned, the AirPods Pro and the AirPods Max were both announced in press releases, not on stage.

  • By moving music beyond the simple left and right of traditional stereo sound and into three dimensional, immersive listening.

  • Going back to the new Dolby Atmos feature, the spatial audio that Apple's been adding to almost all of their audio products: it basically uses temporal phasing. They mess with timing to mess with sound, to adjust where it feels like the sound is coming from in a 3D sound stage around you. There are basically two kinds of spatial audio right now. One is where the object doesn't move, and as you move, the sound moves with you. The other, which has dynamic head tracking, uses sensors: if you've been watching an Apple TV for a long period of time, it assumes that's where the TV is, and then as you turn your head, the sound stays anchored there. That simulates a speaker experience. If you have a speaker in the room, it doesn't move with your head the way headphones do. So spatial audio tries to use computational audio to make up for that, to give you that speaker experience.
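
The compensation step in head-tracked spatial audio can be sketched numerically. This is a toy model, not Apple's renderer: the real system uses head-related transfer functions, but the same geometry applies. The source stays fixed in the room, so the renderer pans the sound by the source's angle relative to wherever your head is pointing.

```python
import math

# Toy head-tracking sketch: a fixed source (say, an Apple TV) plus a simple
# constant-power stereo pan standing in for a real HRTF renderer. Angles are
# in degrees; 0 = straight ahead, positive = to the listener's right.

def relative_azimuth(source_deg, head_deg):
    """Angle of the source relative to where the listener is facing."""
    return (source_deg - head_deg + 180) % 360 - 180  # wrap to [-180, 180)

def stereo_gains(azimuth_deg):
    """Constant-power pan: -90 = hard left, 0 = center, +90 = hard right."""
    theta = math.radians((azimuth_deg + 90) / 2)  # map [-90, 90] -> [0, 90]
    return math.cos(theta), math.sin(theta)       # (left_gain, right_gain)

# Source dead ahead; the listener turns their head 90 degrees to the left...
az = relative_azimuth(source_deg=0, head_deg=-90)
left, right = stereo_gains(az)
print(az)  # 90 -> the TV now renders fully to the listener's right
```

The key point is that without head tracking, `head_deg` is effectively always 0 and the sound field turns with you; with it, the pan is recomputed every few milliseconds from the motion sensors, which is what keeps the virtual speaker pinned in place.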

  • These are the new AirPods with spatial audio, featuring an all-new design inside and out.

  • And they look very much like AirPods Pro, probably without active noise cancellation. The differentiation is no longer so much in the shape, at least until we get new AirPods Pro, which will probably change things further, but in the feature set.

  • To deliver the best sound quality, we designed a brand new low-distortion driver, created just for the new AirPods, to provide powerful bass and crisp, clean high frequencies.

  • They look very much like AirPods Pro, except they don't have the silicone tips; they have sort of the traditional one-piece design of the previous models, going all the way back to the EarPods and then the first-generation, second-generation AirPods. The non-Pro, the non-Max, AirPods amateur, whatever you wanna call them. Now, some people were scared that Apple would actually put silicone tips on here. As much as some people love silicone tips, like I do, because they help keep the AirPods in my ears, other people don't like anything in their ear; it feels painful, it feels intrusive, and they really hoped Apple would keep that original EarPod, AirPod design. So luckily, Apple's doing the best of both worlds now: you'll be able to get the AirPods 3 with this design, or the AirPods Pro with the silicone-tip design.

  • Workouts or a jog in the rain, the next edition AirPods are sweat and water resistant.

  • And they're including Adaptive EQ, which some people love and some people hate. Some people hate it because they're used to doing EQ for themselves; either they have specific tastes or specific needs, and they wanna be able to change the EQ on their own. What Adaptive EQ does is take that over. You lose that feature, but you gain the ability for the AirPods to model your ear: they actually build an acoustic model of the inside of your ear, then adjust the EQ for its unique shape, even as things move and change, like as the bud goes further in or out, or as you're talking or not talking. It just makes sure the sound stays the same regardless of the shape of your ear. So it's a much more advanced feature, but you do lose out on traditional EQ, and Apple hasn't been able to marry the two, hasn't been able to let you adjust the EQ manually while keeping Adaptive EQ going.
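
The core loop of that kind of adaptive correction is simple to sketch. This is a toy model, not Apple's algorithm: the band names and numbers are made up, and a real system works on many more bands, continuously. An inward-facing mic measures how each band actually sounds in this particular ear, and the EQ applies the inverse so the result matches a target curve.

```python
# Simplified adaptive-EQ sketch: measure the per-band response inside the
# ear, then apply the inverse correction to hit a target curve. Band names
# and values are illustrative only.

TARGET = {"bass": 1.0, "mids": 1.0, "treble": 1.0}  # desired flat response

def corrective_gains(measured):
    """Per-band gain that maps the measured in-ear response onto the target."""
    return {band: TARGET[band] / measured[band] for band in TARGET}

# This ear's seal leaks bass and slightly emphasizes treble:
measured = {"bass": 0.5, "mids": 1.0, "treble": 1.25}
gains = corrective_gains(measured)
print(gains)  # {'bass': 2.0, 'mids': 1.0, 'treble': 0.8}
```

Because `measured` is re-sampled as the bud shifts or your jaw moves, the gains keep updating, which is exactly why a manual EQ slider is hard to layer on top: the system is already rewriting the per-band gains underneath you.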

  • And to extend your experience, we increased the battery life for up to six hours of listening time. Or if you're in a hurry, five minutes of charge time, gets you around an hour of use. And of course, the convenience of the charging case, giving you an additional four full charges.

  • If you use it like I do, to listen to mostly mono audio podcasts and audio books, you can just swap back and forth and basically never stop using them.

  • Even more convenient, we added MagSafe and wireless charging to the case.

  • Which is what Apple introduced with the iPhone 12 last year. So if you have a MagSafe charger for your iPhone, these will work with that. And we have spatial audio, a new Apple custom driver, longer battery life, personal engraving, and now notifications with Siri.

  • [Narrator] All of this for $179.

  • Oh, but the OG, not the OG, the second generation AirPods are now down to $129, which is good, which is gonna push them even further, which is exciting because they're already a meme. So, this is just gonna make them what? More meme-able? More of a meme? Brokie meme? I don't know. You tell me.

  • This is our new AirPods lineup, starting at just $129.