
How Apple Private Relay Kills Data Profiling

Apple’s new Private Relay. Part of iCloud+. The not-a-geo-hop, not-a-corporate-tunnel, not-a-VPN… that will soon be protecting your IP and identity online. But what does any of that even mean?

What Private Relay will do, what it's designed to do, is encrypt all your connections, including any legacy connections not using HTTPS; obfuscate your real IP address; and prevent anyone, even Apple, from knowing both who you are and where you're connecting to on the web.
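That split of knowledge is the whole trick: Private Relay routes traffic through two separate hops, an Apple-run ingress relay and a third-party egress relay, so neither party ever holds both your IP address and your destination. Here's a toy Python sketch of that idea. It's an illustration only, not Apple's actual protocol (the real thing is built on standards like QUIC proxying, with real public-key encryption; the `seal`/`unseal` helpers below are hypothetical stand-ins):

```python
# Toy model of Private Relay's dual-hop split of knowledge.
# NOT Apple's implementation; seal/unseal stand in for real encryption.

EGRESS_KEY = "egress-secret"  # only the egress relay can read destinations

def seal(destination: str, key: str) -> str:
    """Stand-in for encrypting the destination to the egress relay's key."""
    return f"sealed[{key}]:{destination}"

def unseal(blob: str, key: str) -> str:
    prefix = f"sealed[{key}]:"
    assert blob.startswith(prefix), "wrong key"
    return blob[len(prefix):]

def ingress_relay(client_ip: str, sealed_dest: str):
    """Hop 1: sees the client's real IP, but only an opaque destination.
    Forwards traffic onward with the client IP stripped."""
    seen = {"client_ip": client_ip, "destination": sealed_dest}
    forwarded = {"source": "ingress-relay", "payload": sealed_dest}
    return seen, forwarded

def egress_relay(forwarded: dict):
    """Hop 2: can decrypt the destination, but only sees the ingress relay
    as the traffic's source. Hands the site a coarse, region-level IP."""
    dest = unseal(forwarded["payload"], EGRESS_KEY)
    seen = {"client_ip": forwarded["source"], "destination": dest}
    return seen, {"apparent_ip": "region-level-ip", "destination": dest}

# Client side: only the egress relay can read where we're going.
sealed = seal("example.com", EGRESS_KEY)
ingress_view, hop = ingress_relay("203.0.113.7", sealed)
egress_view, out = egress_relay(hop)

# Neither relay alone can link who you are to where you're going.
assert ingress_view["destination"].startswith("sealed[")  # hop 1: no destination
assert egress_view["client_ip"] == "ingress-relay"        # hop 2: no client IP
assert out["destination"] == "example.com"                # the site still loads
```

The point of the sketch is the asymmetry: the ingress view contains a real IP and an opaque blob, the egress view contains a readable destination and no real IP, and profiling requires both halves at once.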


‘WTF’ M1 iMac Hot Takes — Reviewer Reacts!

One month later and reviews of Apple's all-new, all-colorful, all-custom silicon 24-inch M1 iMac are still coming in. Then... there's this!

The Washington Post, one of the papers of record, just published their iMac review. And it's a lot. Sponsored by Ting. I am not going to defend either the new iMac or Apple here, a trillion-dollar company perfectly capable of doing that themselves. But what I am going to do is what I increasingly seem to be doing on this channel, and that is just Tron-ing the mainstream user from the tyranny of the technorati. For anybody who is interested in this new iMac, I just want to make sure that you have the best possible information so that you can make the best possible choice. And if you ultimately love the new iMac or hate it, at least you are loving it or hating it smart.

"Apple's newly redesigned iMac measures just 0.45 inches thick. That's a hair thinner than the original iPhone. It's thin enough to wedge under a wobbly table." And this whole thin thing is going to come up here over and over again, like it's a triple word score for Yahtzee or something. So brace yourselves. "An obsession with thin design has taken over consumer tech, and Apple is its leader."

And this whole thing is just such a thick obsession over this supposed thin obsession. And honestly, it's also just such a tired, old, clichéd trope by now. Like the horse has been beaten beyond death, beyond Dormammu. It's not even a zombie horse anymore. It's like the 300th zombie horse movie, the one that's not even bid on by Netflix. And yet part of me just totally gets it, because if you're not into industrial design, that's a super easy, barely inconvenient mistake to make. But thinness is almost never the goal. It's almost always a side effect. iPhones, iPads, MacBooks, mobile devices in general, they're increasingly thin because they're meant to be increasingly light. But the iMac is a desktop, so it's not at all about the lightness, even though this one is so light, it's like 10 pounds. You can carry it to the coffee shop with you for that ultimate post-lockdown flex, if you so choose. The reason it's thin here is the same reason for the rainbow of colors. It's this generation's version of the transparency of the OG, the original iMac. Same reason for the white bezels: so that it feels light and part of the decor and almost blends away. So that not only can you fit it into more spaces and more rooms, but you'll want to. Your family will allow you to, precisely because it doesn't look like a traditional computer, a bead-blasted, aluminum, big-bubble-back hunk of hardware.

And this thinness is a side effect of that design goal. "The new iMac I've been testing for a few weeks is a departure from my old one. It's essentially an iPad on a stand." And that is totally true, but kind of the whole entire point. Apple even made that point. They showed off exactly that, the iPad next to the iMac, when they announced it. It's like that famous Steve Jobs story, where he slapped the iPad down in front of the Mac team and said, why can't you do this? And that moment has defined Apple's Mac culture, for good and sometimes for ill, ever since, for the last decade. Because the iPad was just Keyser Söze'd into existence to reach a customer base far beyond the Mac's traditional single digits. And once it started doing that, once Apple saw it accelerating faster than even the iPhone did at launch, they wanted to see if they could do that, reach that larger customer base, with the Mac as well. And that's what got us the MacBook Air, which redefined the ultrabook for a generation. And now this new iMac, this whole new kind of iMac, which I think is meant to be the MacBook Air of iMacs, or yeah, the iPad of iMacs. And for cynical old reviewers yelling at clouds, that's a bug. I could see that. I totally get that. But for the mainstream market, it might just be one hell of a feature. "There are good arguments for thin handheld devices. Skinny phones fit better in skinny jeans. But even with Apple's mobile tech, we've been burned. The ultra-thin butterfly-style keyboard Apple introduced for laptops in 2015 created so many problems that Apple eventually abandoned it." And just, wow, skinny jeans and butterfly keyboards in the same graf. Is that cliché bingo? Do any of you have cliché bingo on that? Because look, this is a total tangent, but I'll risk it.

The problem wasn't that Apple tried the butterfly keyboard, it's that they didn't fix it iPhone 4 antenna fast, or abandon it buttonless iPod shuffle fast, which, yeah, totally unacceptable. But that same team also shipped the Force Touch trackpad at the same time, to fit over the terraced battery of that same MacBook. And that's been just a huge, huge hit, a really good progression. And exactly that kind of experimenting, that kind of continuously pushing the manila envelope, is part of what makes Apple, Apple and the Mac, the Mac. And I hope Apple keeps doing that with the Mac, keeps pushing that manila envelope, just, you know, fixes things a hell of a lot faster when and if they have to, even if it ends up being used as just the laziest spice on the blandest of faux-edgy hot takes. So, always. "On the minus side, the thinner imperative took over." Thinner again, does that mean drink? Do we have to drink now? "Now that it could fit the brain of an iMac into a half-inch body, Apple cut other parts that just didn't fit. First went the power supply, the part that transforms electricity coming out of the wall. Now the iMac has an external power brick. Maybe you'll just throw yours on the floor, or maybe it's one more thing for your cat to chew on." And now I'm left wondering, is nobody feeding this theoretical cat? Was it not satiated by the MacBook brick, by PC laptop bricks, by all the other bricks, all the other cords, on just everything in the house?

Does it have a particular hate on for the iMac brick? Did the iMac brick hurt it? Did you soak the iMac brick in tuna so it'd get all Schrödinger and bite it? Do I need to call PETA now? I am legit existentially worried about this cat at this point. I don't see any way USB-A would fit within half an inch. Maybe it could fit below the chin line on the side, like the 3.5-millimeter headphone jack does. But my guess is that, like six years after the introduction of USB-C, Apple simply doesn't think USB-A is that compelling a feature anymore for the more new-to-iMac customers that they're targeting with this new iMac. You know, like in the tradition of ditching the floppy drive. When you look out another six years of potential lifespan, being able to dongle USB-C into so many other types of connections, even new and upcoming connections, is something you can't really do with any of those legacy ports. It's just a huge advantage of this architecture, especially for a MacBook Air or iPad-style iMac. "Apple also cut the compact flash card reader included in past iMacs, making one more thing photographers need to buy."

And it was actually an SD card reader, which has escalated over the last few years from SDHC to SDXC, so you still had to buy extra dongles to get the faster speeds when you have one of those static older slots. But yes, it's the same thing. SD cards are kind of a narrow prosumer middle ground now, where pros are increasingly moving to CFexpress, and the vast majority of people are just using iPhones now for their photos and videos. And Apple is investing in everything from photo sync to AirDrop to make that as easy and appealing to all of those people as possible. "My old 27-inch iMac has a little door on the back to upgrade memory after the fact, a real help for future-proofing." And, oh wow, total plot twist, but like later, lamer, M. Night Shyamalan, the-trees-are-alive-and-angry plot twist: that's the old iMac that was waiting to be replaced with the higher-end 27-inch model. So we got kind of bait-and-switched. We got bamboozled.

And maybe that explains the cognitive dissonance here, the expectational debt, because the new 24-inch M1 iMac absolutely does not replace the 27-inch Intel iMac. And you can tell that because Apple still sells that 27-inch iMac. They updated it less than a year ago, and it has yet to be replaced. This new 24-inch iMac only even kind of replaces the 21.5-inch Intel iMac, which Apple also still sells a model of, because this M1 iMac is really something new, like I've been explaining. But either way, anyway, the 21.5-inch iMac didn't have user-replaceable memory either. And I totally get why that can be beyond aggravating to traditional computer users, but it's something that's been beyond aggravating for years. So it's just all shades of cringe-weird that so much blame is being so sloppily laid on the M1 here. "And what about when your iMac inevitably just can't keep up in six years? As recently as 2014, iMacs could transform into a monitor for another computer, but Apple no longer supports what it calls Target Display Mode."

And yes, RIP. I really, really miss Target Display Mode. Apple ditched it when they began supporting 5K displays and had to make their own custom timing controllers just to get them to work, just to fuse two streams together so the screen wouldn't show any shearing down the middle. And we still don't have that back, which is so hella irksome. I am irked. But we do have AirPlay to the Mac, and now Universal Control coming this fall with macOS Monterey, which won't be the same as a wired connection, I totally get that, but I'm hoping it will make that kind of repurposing at least feasible, at least possible again. "This is partly a philosophical divide between Apple and people like Wiens. Turning computers into appliances can simplify them. You don't need to know about what's going on if it just works." And this is again true, partially true. Apple is making this new iMac for a very specific type of customer, a very different one than for the new Mac Pro, with very different priorities. And that's not bad. It's actually the opposite of bad. It's good, because options are good. And whether it's having something like this new iMac on the market for people who want it, instead of, say, a Mac mini or a traditional iMac, or having PCs from other vendors entirely that are less appliance and more customizable, so that people who may just hate the breathing guts of this new iMac can go get that something else, including something else with USB-A, hell, even VGA, from anybody else. Because that's what real choice is. Not everything being the same, all computers from all companies being the same, but being able to get something like this new iMac from Apple, pretty much only from Apple.

Or being able to get a more traditional PC from literally everybody else. "But Apple's appliance mindset is also self-serving, because it means we have to keep buying new stuff." And there it is. For anybody keeping track at home, the cliché clincher, the trope trifecta: Apple customers are such cultists, such sheep, that no matter what Apple does, whatever it does to us, we'll just keep on buying and buying from them. But no, people don't like being burned. So if I feel burned by this new iMac, I'm going to go and get a Microsoft or HP or whatever instead. Because we may need to keep buying new stuff inevitably, eventually, but we certainly don't have to keep buying it from Apple. Just like if we feel burned by a review, if we feel it's being disingenuous or disrespectful or toxic or trying to rage-bait us, we'll stop reading that reviewer. We'll go to Joanna or Marques or Justine or Andre or whomever. Because ultimately, that's what this is all about. Apple is making calculations based on what they think will be most appealing to the market, for their current Mac customers and customers that they're hoping to lure away from the PC or halo up from the iPhone or iPad. And if they calculate wrong, if they're not satisfying customers new and repeat, we'll vote with our wallets and get something else. And we saw that briefly with higher-end pros before Apple started re-engaging with us so strongly a couple of years ago. And I expect we'll see more of that this year and next year as well.


Apple Watch VP — It Has to Work in 2 Seconds!

Apple's Kevin Lynch, VP of Technology, and Deidre Caldbeck, Product Marketing, chat with us about their histories with Apple Watch and Health, new features like ID cards and keys, how they decide on new features, how they made Assistive Touch for Watch, bringing Mobility to iPhone, the visual representation and security of Health information, security vs sharing, and the future of Apple Health!

  • I'm Kevin Lynch. I work on software across Apple Watch and Health.

  • And I am Deidre Caldbeck, and I am on the Product Marketing team for Apple Watch and Health.

  • And it's cool that Deidre had been off the paddleboard and came into the, yeah--

  • No, it's underneath my feet. I'm just very stable.

  • I know we didn't get to see you dance again this year, Kevin. We didn't get to see you and Jules do a duet. And I did file a Radar: expected behavior, dancing.

  • Off camera, that was all off camera this year.

  • I'm actually really curious. If we go back to the beginning, the earliest days of the watch, when you were first read in or disclosed, and you heard that Apple was making a watch, what went through your head? What was your immediate reaction?

  • Well, for me, that was actually day one when I started at Apple. The morning I started, the story was: we wanna build a watch, and we've got to get going. In fact, we wanna get going right now. We're feeling like we've got to go fast. And actually, I was advised to not even go to orientation. Like, you don't even have time to go to orientation, just get going, you know Apple well enough already, get started. And so it was a very rapid start. The thing that went through my mind... there's always, of course, occasionally rumors about what Apple may be working on. But when you look at Apple Watch, I felt like that was inevitable. To have Apple make a watch, the first thing that went through my head was, of course, we're gonna do that. 'Cause if you look... I love the history of computing. If you look at the history of computing, it's a series of evolutions of smaller and smaller computing devices that are more and more powerful. And it goes from mainframes to minicomputers, to desktop computers, laptops, portables, mobile phones. And so that evolution of power and miniaturization leads you to, well, what's next, what's next after the phone? And that really leads you to something as small as a watch. And the watch is a natural location on your body. If you start thinking about something that's even smaller than a mobile phone, which is already pretty small, it's like, well, where would you put it? How would you carry it? How would you look at it? And so attaching it to your arm was a really, really natural place to do that. So that was really interesting. And so of course, going through my head then was: this is gonna be an incredibly challenging project. How do you even get a compelling interaction on a really tiny display? And battery life, of course, is gonna be an issue, which is a major challenge. And so designing around that was, from an engineering perspective, a really interesting challenge, which I love.
And then also this great opportunity, and really a sense of responsibility, for working on a device that is actually on your body, that you wear with you. That responsibility is really significant. How do we make something that people will feel really good about, that's supportive and empowering for them, not bugging them? So that kind of philosophy around how we design all of our interactions is really sensitive to that.

  • Unlike Kevin, I had no idea what was going on. Apple is very good at making sure that we keep our surprises surprises, for both employees and our users. And so I was working on iOS, and actually what I worked on initially was the Health app, because I worked on iOS, and Kevin's team was working on the Health app, but I was on the iOS product marketing side. And so we were working out how to help tell the story of this whole new Health app in iOS 8. And what I thought was amazing was we were gonna be able to talk about how you could bring all of your health data together from third-party apps and devices, and little did I know that one of those devices was soon to be Apple Watch. So it was actually a really exciting thing for me, as you said, to be read in, or sworn in, on the secret of Apple Watch much closer to when we actually announced it. I really wasn't told about it until, I think, just a few weeks before the announcement. And I was very fortunate to be one of the people to show it off to our audience at the event, in the hands-on area. So I learned about it, honestly, probably just the same way you did, Rene, and that was a really neat thing, to work at Apple and hear about it like everyone else. So it was cool.

  • My dad worked at IBM, and he got an Apple II Plus so we wouldn't have to drive downtown to use the mainframes. And I had an iPhone so I wouldn't even have to go back to my Mac to do certain really important things. And then I got an Apple Watch, and it can't do as much as an iPhone, but it can do these brief, frequent, but really critically important things without me even having to reach into my pocket or reach onto the desk for my iPhone. And it occurred to me that it took until, like, iOS 5 and iCloud before the iPhone was really that independent from the Mac or the PC running iTunes. But systematically over the years, with on-device logic and an on-device App Store, and bringing LTE over, and just continuing, now you have Family Setup where you can set up a watch for somebody, you are systematically just increasing the functionality. And I know that Apple is not shy about cannibalizing their own devices. You'd much rather do it than have somebody else do it. But is that, like... not a predatory process where you're looking at the iPhone and going, "What can we do next?" But is it more of an organic thing, where, as time and technology allows, you'll just keep building up the functionality and the things that we're capable of doing from our wrist?

  • The work we're doing there is really how can we empower people with Apple Watch in different situations, whether you're near your phone or away from your phone. And you've seen us doing that more and more over time, for sure. But really the main focus is how all of our devices work really well together. And kind of, when you interact with each one of them, what are you trying to accomplish? And each of the devices has its own kind of personality and way of going about things. So like you were saying, with Apple Watch, the little brief interactions are really core to how Apple Watch works. And we started with that, and that's how it's different than, like, using your Mac; you'll tend to use your Mac for longer periods of time, and the phone, like you were saying, less than your Mac but more than your watch. So there are these kinds of interaction times that are very different across the devices. And when we started working on Apple Watch, we thought that time of interaction might be maybe something like 10 seconds. We were kind of estimating what those typical interactions might be like. And it was really interesting: when we started actually living with the watch internally, we started learning that really, you wanna do those interactions in even shorter times than that. More like two seconds rather than 10 seconds. And that's a tall order, to do some of these tasks in two seconds. We weren't sure how we could do that. So we paused what we were doing for a moment, and we had everybody go and brainstorm for a bunch of different areas of the system: how could we possibly make those happen in two seconds? Like, no rules. You can do whatever you want. You can change the interaction with the system, you can make shortcuts, you can take stuff out, whatever it is, make it two seconds. And we made a list of those things, and that was super productive.
One of the examples of that is the Messages app. When you receive a message, initially, when you replied, you had to kind of go into reply, and then go to a compose view, and then press send. And that took more than two seconds. Now in Messages, and it's been this way for a long time now on the watch, when you receive a message, you can just scroll up with your Digital Crown, there's a bunch of suggested replies that we've generated for you, and there's some you can edit if you want, and you just tap them. So scroll and tap. When you receive a message, that scroll and tap you can totally do in less than two seconds. And so we went across and did a bunch of that same work for, like, Wallet, doing Apple Pay, just a whole bunch of areas of the system we went and tuned in. And that type of investment in time and engineering makes a huge difference. And it's sometimes, like, what are you removing more than what are you adding?

  • That's one of the things I love most about Apple Watch. Because, like, the downside of focus is sometimes tunnel vision, but it really seems like over the last few years, even though health and fitness have become so popular, have gotten so much attention, the original sort of promises of the Apple Watch also included payments and authentication, and remote control, and communication, all these other things. And you've been systematically improving those as well, whether it's Walkie-Talkie, or now the Intercom system, the new Home app, way more robust. And one of the things that interests me the most is keys and IDs. Because this just takes... it both simplifies the process of authentication, identity, and control, but also greatly empowers it. Can you talk just a little bit about how those new technologies work?

  • This is an area we've been working on for many years, and it's really exciting. The infrastructure is really well understood now, and we're at the beginning of it kind of rolling out, this generation of access to your home, your office, your car, hotel rooms. So each of these categories has its own way of implementing this, and then we've designed it on Apple Watch so you can just use tap to access, just like you do with Apple Pay, using NFC. You can make them so they have Express Mode turned on, we call it, so you can just tap and access whatever it is you're using. And with the proximity of NFC, there's an intent: you're waving your device in front of the lock to cause it to unlock. So there's not this guesswork about whether it's unlocking or not. You still have a gesture that's involved to do the unlock. So I think we've got a really great approach to this now, and I'm super excited for it to start rolling out.

  • Previously, we talked a lot about how Messages and Photos and things like that were really what kept you connected. And now we have this evolution of the watch where it's your identity. It allows you to control the things around you and access the places you love. And I mean, if you think about Apple Watch as your identity, we started with Apple Pay. When it's on your wrist, that's your identity. And we've recently used it as your identity for when you wanna unlock your iPhone and you're wearing a mask. And now with these watchOS 8 updates, with things like state IDs and the Home app redesign and these other Wallet features, it's really starting to be more of this identity, access, and control, where it's the safety and security of a device that's always on you. So it's really perfect for all of these capabilities.

  • I'm always continuously impressed and inspired by the accessibility technologies that Apple brings to all of their products. But I think that demo of AssistiveTouch on the Apple Watch just next-level blew people away, like straight-up science fiction away. And I was wondering, like, how do you sort of realize all of this potential, and sometimes maybe even take it further than anybody imagined?

  • Yeah, these are deep collaborations across multifunctional teams. So it's the design, accessibility, engineering, storytelling, marketing, all the different teams working on projects like this. And it's the combination that really makes them awesome. And on that one, there were lots of explorations, like more gross movements of your arm: shaking your arm, rotating your arm, lots of things like that. Those require energy as well. So we thought about, like, how much energy does it take to do these different gestures? And if you're using AssistiveTouch, you wanna be really thoughtful about how much energy every time and every interaction takes. And so we settled on a couple that work really well from a low-energy, human-energy perspective, but are really hard to sense. There are four signals: clench, double clench, and then we did pinch and double pinch. And the sensing of that is happening on your watch. It's using the gyro and the accelerometers, looking at micro-movements of your arm when you do those gestures, and it actually uses the heart rate sensor as well. It gets the imaging feedback from your arm, included with the gyro and the accelerometer. So those are all blended together with machine learning, and we've created models to capture these gestures. This wouldn't have been possible a few years ago, and it's just as we're starting to understand how we can use these approaches in terms of building software. And also, the capability and the performance of Apple Watch has gotten so incredible, we can run these things live now. And we've been working on accessibility in Apple Watch since day one, since we first shipped Apple Watch. We were thinking about how to enable people with this, and it's been such a pleasure working across all the teams on this. And the team members in the assistive technology area have just really been brilliant, and have been working as part of the integrated Apple Watch team since the beginning.

  • I love it, I love your dedication, like the plural your, dedication to accessibility and also health, just the way you've been ramping up health. And please correct me if I'm wrong, but Apple Watch has been a big focus of health, but now you have mobility, and you're sort of tying in the iPhone in a way that's really Apple Watch style, like using the sensor fusion and the machine learning on device, and taking all of that and giving us really useful preventative information. And in my head, I'm thinking, well, is the Apple Pencil one day gonna be able to tell me I have RSI or carpal tunnel? Like, is this the beginning of some great Apple ecosystem health thing? But I'm curious, like, when you see this escalating across multiple Apple devices now. Deidre, well, first, can you talk about bringing this kind of technology to the iPhone, and maybe how people will use it or get the most out of using it?

  • Yeah, I mean, I think you said it perfectly when you said that we do kind of think about our entire ecosystem of devices, and the capabilities within each of those devices. And really, our goal is always to empower people to better manage their health. And whether there are sensors on the watch, or the phone, or Apple Pencil that can offer that empowerment for our users, that's actually where we'll focus the investment and the energy. And, you know, with walking steadiness, it's something that, as you said, we really wanted to think about: how can we prevent these falls? We introduced fall detection with Apple Watch Series 4. And of course, it's been incredibly rewarding to hear the stories from our users who have benefited from that feature. But if you think about it, gosh, wouldn't it be better if that fall had never occurred in the first place? And we know that Apple Watch has amazing health sensors: the motion sensors, the heart rate sensor, even, we're using the microphone for health features with the Noise app. But the iPhone also has, as you know, motion sensors. And in this particular case, the motion sensors on the phone, and where you actually carry your iPhone, typically in your back pocket or your front pocket, or maybe a shoulder bag or a crossbody bag, those sensors actually can pick up those subtle signals from your hip movement that can give us the fidelity that we need to provide you with those mobility metrics, that can then give you the signal for when you might be at an increased risk of falling. So of course, the watch similarly has powerful motion sensors, and some of those mobility metrics are from the watch, things like stair ascent speed and stair descent speed. We can only get that from the watch, because of the way the watch is positioned, and you often may not have your phone with you when you're going up and down stairs. And so we're trying to think about the best place to get the best data, to provide the best experience.
And in this case, the iPhone was really best positioned for this, and that certainly gives us the opportunity to offer it to all of the iPhone users, and not just Apple Watch users. But we know that combination of phone and watch is really powerful for a lot of these health features.

  • It always felt to me like the phone was the starship and the watch was the shuttlecraft.

  • Ooh.

  • And you can just get so much done by going back and forth--

  • I love it, I love it.

  • Between those things.

  • Walking steadiness actually came initially from fall detection. So we're doing fall detection on Apple Watch, we're all working on it, and it's incredible how that's been helping people, and we still get letters about that. But we were thinking, as we're working on it, well, how can we prevent people from falling? Like, rather than just detecting that they're falling, can we actually stop them from falling? And that just led through this really interesting journey of discovery and brainstorming about, well, we may not be able to stop you in the moment from falling--

  • You very carefully walked around disclosing any potential anti-gravity fall-prevention technology that you might have in play, after you teased us so mercilessly with the teleportation a couple of years ago, and then Craig escalated to portal technology through his Aperture Science app just this week. So I appreciate that whatever vibranium casing you're using is still in the future, but you're trying to give us tools we can use today.

  • In the meantime exactly doing what we can, exactly.

  • Yes.

  • We actually did this really deep work that has taken years, of sensing these different attributes that Deidre was mentioning, and then doing studies, where we partly used the Apple Heart and Movement Study for this; there were about a hundred thousand people in that. And so we were able to use the insights that we've gathered from people's movements in that study to actually see what would be predictive of falls. And so that's where this walking steadiness metric came from: the modeling of those metrics, and then seeing which ones indicated most clearly that you had potential to fall. And that was supplemented by studies specifically on this topic too, where we had cohorts of people, where we were doing traditional walking studies with them, like observations and questionnaire filling-out and things like that, kind of a traditional characterization of potential falls. And then we tracked those users; you know, they were part of the study, and as people did fall over the next year or two, we were able to go back and look at those authentic falls, and see what the indicators were earlier and what led to those. And you can't fake that. You can't just do fake falls; you have to have actual falls. So it takes a long time to really get that pre-work done, see it actually happening, and model it correctly.

  • It feels like we're getting into phase two of all of this technology, where in the beginning you were just building up more and more sensors and more and more data, and maybe Labs is like the current ultimate expression of that. But now that you have all of this rich data, you're providing things like trends. And not only are you looking at the trends, you have this Edward Tufte-like ability to take all of this medical information and make it just so human-digestible, in terms of the interface and the graphing and all of that. How do you deal with these arcane, sometimes very inhuman data sets and make them so that people who are legitimately stressed, like they have health anxiety, can get so much benefit from just seeing them easily displayed, maybe for the first time?

  • Yeah, so trends is: how do we draw insights across all kinds of different data types in a simple way? And again, we use a lot of modeling from the studies that we've been doing to look at patterns and the significance of those patterns for people. And so we basically built systems that do statistical relevance on these insights, so we're not just showing you numbers that may be bigger or smaller, but which ones might matter. So there's a lot of thoughtfulness, by data type, about the variances, and which types of variances might matter for which data types. And then we compare the past month against the past six months, and we look for the changes that might be notable in that time. And there was also a lot of work on how many of these trends we might show people. We don't wanna show people too many trends, 'cause that's kind of overwhelming, but you also don't wanna show no trends at all. And so we did a lot of experimentation on tuning that to get it to a point where it showed useful, relevant trends without being too many.
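The tuning described above, comparing the past month against a past-six-months baseline and capping how many statistically notable changes get surfaced, can be sketched as a toy example. Everything here (the metric names, the z-score test, the thresholds) is a hypothetical illustration, not Apple's actual model:

```python
from statistics import mean, stdev

def notable_trends(history, z_threshold=2.0, max_trends=3):
    """Toy trend detector: flag metrics whose past-month average
    deviates notably from a past-six-months baseline.

    history maps a metric name to a list of ~180 daily values,
    oldest first. Returns at most max_trends (name, z) pairs."""
    flagged = []
    for name, values in history.items():
        baseline, recent = values[:-30], values[-30:]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma == 0:
            continue  # no variance in the baseline; a toy model can't judge change
        z = (mean(recent) - mu) / sigma
        if abs(z) >= z_threshold:
            flagged.append((name, round(z, 2)))
    # Show the most significant changes first, and cap the count so
    # the user isn't overwhelmed (the tuning problem described above).
    flagged.sort(key=lambda t: abs(t[1]), reverse=True)
    return flagged[:max_trends]
```

Run on 180 days of toy data, a resting-heart-rate series that jumps in the last month gets flagged, while a series that stays flat does not.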

  • One of the things I'm curious about, and I get asked this question all the time, and I'm sure you do as well, especially now that you have the iPhone involved in health. And I was joking about the Apple Pencil before, but take the iPad, for example: when you look across the Apple ecosystem, and you start thinking about a future where, you know, maybe it doesn't make as much sense to have these particular features on an iPad, because you don't carry it the same way, it's not with you the same amount of time, it doesn't have the same sensors or connectivity all the time, but maybe it's really great for reporting, or as a big screen to see all these infographics that you're doing such a good job on. Is health something that you really do look at in terms of the entire Apple ecosystem?

  • Yeah, absolutely. I mean, if you think about the health features we have today, there are obviously several on Apple Watch and iPhone, and there are also some health features with AirPods and some of our audio products, right? So there's absolutely an opportunity for us to leverage the ubiquity of our devices to discover new ways that we can empower people to better manage their health. And we'll continue to investigate those areas, we'll continue to invest in those areas. I think with the new features that we introduced this year, we are excited to hear from more and more users who are going to take advantage of them. For example, with health sharing, we hear from so many of our users that they're feeling this burden, really, this overwhelming responsibility, to care for both their children and their aging parents. And now they're able to do that in one place, in their Health app, where they can see their own health data but also the health data of their loved ones. And in a way that's private and secure, where not only do their family members feel safe when they're sharing their data, but the people they're sharing with know that it's a secure connection between the two of them.

  • I love that so much. And I love your approach to privacy and security so much, because I think it's easy to be extremist, infosec-extremist, about these things. Yes, you wanna keep your health data private, but if you're an absolutist about it, you can't share it with your doctor, or with your kids, or see your parents' data; or maybe you're not physically capable of using it yourself, but your caregiver is. And I love that you're doing informed consent, which to me is everything. When people are willing to share, you're empowering them to use that data, beyond just locking it up, and making it actually functional for them.

  • Exactly. And to your point about informed consent, that's why, in the way we built the experience, as with all of our features, privacy is at the core. You are in complete control over the data you share and who you share it with. You can, of course, stop sharing at any time. You can choose, very granularly, which pieces of data you wanna share. If you've had a chance to go through the experience, you can even see a preview of what the person you're sharing with will be able to see, so that you can be sure that's what you wanna share. And it's not just control, right? It's also transparency. You'll never share data that you can't see yourself. All of those principles are at the core of these features, and we'll continue to ensure that's the case in the future as well.

  • One of the things we've done, which is wonderful, is we've added this discipline to the mix of making products, which is the clinical side. So clinicians with great health backgrounds are working on this, and that's really helped us make some great strides in the work that we're doing around health, and have it be something that's super relevant medically to people, especially when we do the doctor-sharing stuff. Like, how do we make a dashboard that a physician can look at and understand very quickly? Because these interactions tend not to be very long, so you've got to really understand someone's information really fast. And so we put a lot of energy into the design of those views, so that they would be super relevant to a doctor. And the privacy side of that was really interesting too, because the endpoint is not usually an Apple device; it's a computer running an electronic health record system, and they're viewing the patient's records inside the health institution. So how do we get this information from their Apple Watch and their iPhone to show up in that other device, and yet have it be completely end-to-end encrypted and secure, in a way that Apple can't see the data? We're actually serving that to the EHR through a web view inside of those systems. So we generate a view of the data that comes from the person's phone into this web view and download it onto the EHR, so they can see it right in place. And it's in context: it knows which person's data to bring up, because of the person whose records they're currently viewing in the EHR. But the encryption of that data is handled with an encryption key that's shared between the user's phone and that health institution, a key that Apple doesn't have. So the package of data is encrypted on the user's device, sent via the cloud down to the EHR, and the institution gets this encrypted package of data.
If you looked at it in transit, you couldn't tell what it was; it's this encrypted blob. But it shows up and it's decrypted at the last second, inside the browser of the doctor's view. That's where it unpacks itself and shows the data in the view. And then, when the doctor stops looking at it, it's not there anymore; it's encrypted.
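The flow Kevin describes, a payload encrypted on-device with a key shared only between the phone and the health institution, opaque to the relay in the middle, and decrypted only at display time, can be illustrated with a deliberately simplified sketch. This is a toy XOR cipher for illustration only, not Apple's actual protocol and not production-grade cryptography (a real system would use an authenticated cipher such as AES-GCM):

```python
import secrets

def xor_cipher(shared_key: bytes, data: bytes) -> bytes:
    """Toy one-time-pad-style XOR; encrypting and decrypting are
    the same operation. Illustration only, not real crypto."""
    assert len(shared_key) >= len(data)
    return bytes(k ^ b for k, b in zip(shared_key, data))

# Key shared between the user's phone and the health institution;
# the relay in the middle (standing in for the cloud) never holds it.
record = b'{"resting_hr": 58, "walking_steadiness": "OK"}'
shared_key = secrets.token_bytes(len(record))

blob = xor_cipher(shared_key, record)          # leaves the phone encrypted
assert blob != record                          # opaque in transit
assert xor_cipher(shared_key, blob) == record  # decrypted only at the endpoint
```

The property that matters in the description above is that the middle hop only ever routes ciphertext; decryption happens client-side, in the doctor's browser view, using a key the relay never had.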

  • I have this beautiful dream where, one day, when the world stops ending, I'm back in the South Bay with all of you, and maybe I'm at Bitter and Sweet having a red velvet latte. And my Apple Watch goes off and alerts me to a potential health concern, and I'm just a Canadian on US soil. So I go to the local clinic, and I can just tap my Apple Watch, and it immediately authorizes all my health records, all my allergies, my medications, everything that has to do with me. Maybe it even Apple Pays my copayment or my insurance deductible. It does everything that I would otherwise have to sit there, with endless clipboards and endless phone calls, in a panic, having to go through. And it handles all of that as easily as a transit interaction is handled today. So my question is not so much about future technologies, but about how, when you're approaching all of these things, how much of it to you is just yearly updates versus the stars that you see hanging above the horizon?

  • We're at the beginning of this work in health I would say, even as we've done a number of things already, there's so much potential here. And then some of the things you just outlined are lots of great challenges that still exist in the world around health and sharing information. And we're at the beginning. So the work we do is long-term. So we do work that, you know, may show up kind of on an annual basis or so, but that work is really backed by in some cases, years of work, that just happened to align at that time. So there's a pattern of that. Some of it's short-term, some of it's long-term, but our thinking is definitely long-term about what we can do here for Apple watch and for health. And for both of those things, it's early still, like we've done a bunch of great stuff, but the ideas are still ahead for us to do a lot more. It's just really exciting.

  • Deidre I mean, Tim Cook said that health is one of the things that Apple's gonna be remembered for. So zero pressure, right?

  • Zero pressure. But if you think about it, and we talk about this a lot, it really warrants continued emphasis: we are just so moved by these letters we hear from our users. And I think that's why Tim makes these comments, because we are all just so fortunate to be able to read these letters. And you hear, oh gosh, I had no idea I had this condition, and I bought an Apple Watch because I wanted to get my messages when I'm on a run, or I bought an Apple Watch because I wanted to be motivated to work out more, and I had no idea I had this issue. Or, I was out kite surfing, and I had an issue, and I remembered I had a cellular Apple Watch that's also water resistant, and so I could just make a phone call. So the way I think about it is, in some ways we have already made quite an impact on our users' lives, in a way that I don't think we would have imagined when we were first building Apple Watch. But I also think, to exactly Kevin's point, this is just the beginning, because those stories are really what inspire us to do more and to invest in these areas where we think we can make a real difference in people's lives. And so we're so excited to continue to do this.

  • I've said a few times that I think, and I don't want Cayenne or Craig to gift it to me, but I've said several times that I think Apple Watch is just the most important device Apple has ever made. Because, yes, a PC or a phone can save lives, but the Apple Watch has so many features that are designed specifically for the purpose of improving, protecting, and even saving lives. And I just want to say thank you, to you personally and to the teams, for all the work and all the effort and all the thoughtfulness and compassion and empathy that you've put in each year, every year, making it just a better and better device, not just for me but for many people. I've given it to my parents, I've given it to my siblings, and it's made a tremendous, tremendous difference. So, sincerely, thank you.

  • Thank you, Rene, it's an honor. The whole team feels this excitement and dedication to working on this project, for very much the reasons you're talking about, and it feels like a responsibility as well. It comes back to the Apple Watch being worn. I mean, it's on your body, it's there all the time, and that's a unique thing. A lot of goodness can come from that, and a lot of support for people. I've never worked on something as meaningful and compelling as this project. I've never worked on something where we constantly get users writing to us about how it has affected their lives. I've worked on some really great things over time, but this one really is special in that regard. And it just keeps coming, Deidre; we get these stories all the time, and we share them across the team. People write Tim, Tim forwards them around to us, we collect all these letters up, it's emails mostly, and the whole team reads them, and that's super inspiring. Some of them are great letters; some of them are like, hey, you guys can do better on XYZ. We share those too, which is really helpful. If you do write those in, know that we actually read them, and they really affect our thinking about what we're working on in the product. So yeah, it's an amazing thing to be working on.

Categories
Uncategorized

Beats Studio Buds — Better Than AirPods?

Beats Studio Buds. Black. White. Red. Active Noise Cancellation and Transparency Mode. Small, medium, and large tips. Physical buttons. Full-on, native iOS AND… and Android Support. 8 hours of battery in the buds, 16 more in the USB-C charge case. No H1. No Lightning. No head-tracking. No stems. No inductive charging. But — $149.

So, are they better than AirPods?

Categories
Uncategorized

Is Apple Leaving Intel Macs Behind?

While all recent Macs are getting new macOS 12 'Monterey' features like Universal Control, AirPlay to Mac, Shortcuts (in AppKit!), redesigned Safari, and ecosystem features like SharePlay, Quick Notes, Spatial Audio, iCloud+, and more, Intel Macs aren't getting some Apple silicon-dependent features like Live Text, system-wide Translation, Portrait Mode in FaceTime, and Apple Earth. And that's making some people hella mad!

So, what's the deal?

"For the first time, some of Apple's own features will only be available on computers with the company's homegrown silicon." Well, yes. It's because it's the first time Apple has had computers with homegrown silicon.

So just like in my video yesterday, where I went over why some people were hella mad, hella angry, that Apple didn't use WWDC and the M1 as an opportunity to bring macOS to the iPad Pro, I'm gonna go over today why some people are just as mad, just as angry, that Apple, with macOS Monterey, is bringing some features exclusively to the M1 Macs and not to the previous Intel Macs. "Apple starts leaving Intel Macs behind in macOS Monterey."

And that includes the features that they wanna bring to those products. So it is absolutely not a surprise that, once Apple has the same kind of capabilities in Macs that they've had in iOS devices for over a decade now, they are gonna just use the ever-loving stuffing out of those capabilities. "As MacRumors spotted after Apple's developer conference this week, the teeny-tiny footnote at the very bottom of the macOS Monterey preview page indicates that certain new upgrades will only be available on Macs with M1 chips." And first, kudos to MacRumors for not only finding that story and posting it so early, but the minute I gave some context to it on Twitter, Joe Rossignol immediately updated with that context. "A portrait mode in FaceTime that lets you blur your background during calls won't be available for Intel Macs. Neither will Apple's new Live Text feature, which allows you to copy text directly from photos." So in the case of these specific features, of several of the features that are exclusive to the M1, to the Apple silicon Macs, it's because the Neural Engine, the 16-core Neural Engine, is exclusive to the M1 Macs. It's part of the same package that went into the A14 and into the M1, the same silicon generation, the same architecture, that Apple is using across their devices this year, from the iPhone to the iPad Air to the iPad Pro to the new Macs.

And what it does is allow Apple to perform all of these machine learning operations much, much more quickly than if they had to farm everything out to the GPU, for example. But that doesn't exist on Intel, and part of the reason Apple has been so frustrated is that they couldn't do the same kind of features on the Mac that they could do on iOS, because Intel lacked these things. Apple has been working around these deficits in Intel chips for years, using things like the T2 chip, to which they offloaded not only real-time encryption but also things like H.265 encode and decode, because it just worked way better on Apple silicon than it could on generalized Intel chips. But the T2 is equivalent to an A10 processor, and that's the generation before Apple introduced neural engines; those came with the A11, the iPhone X's chipset. So those features were simply not available on T2-equipped Macs. And the T1 was based on the S2, the second-generation Apple Watch system-in-package, which didn't have any concept of a neural engine either. And of course, older Macs had nothing, not even a T2 or a T1 chip.

So these features were specifically designed for iOS 15 and iPadOS 15. And because Apple is now running those same chipsets on the Mac, and I never wanna use the word trivial when it comes to software development, it is probably as close to trivial as it has ever been for Apple to take features from iOS and bring them over to macOS, because they will, writ large, in theory, just work. Essentially, the Mac is now getting those features for quote-unquote free. "Some of Apple's revamped Maps features, including a detailed city mode and the ability to manipulate an interactive globe, will also be M1 exclusives." Now, those features don't rely on the Neural Engine, but they do rely on the performance efficiency of the M1. You can actually go into debug mode and turn a flag, switch a flag, flip a bit, and enable those even on the Intel Macs. And they will work, but the frame rate is not good; it is bad. And I'm not sure exactly what the battery cost is, but there is an additional battery cost of doing it that way as well. So Apple just disabled it rather than ship a bad experience. And there's sort of a rock-and-a-hard-place situation going on there.

And again, it's never my intent to make excuses for Apple. They're a multi-trillion-dollar company; they have whole departments they can use to make excuses for anything they wanna do. I wanna give you the information so you can decide for yourself if Apple made a good call, a bad call, or whether everything is a compromise, everything is a trade-off, and it all could have gone either way. Totally up to you to decide. But Apple always wants to provide a good experience, and if anything doesn't meet their sometimes ridiculously high standards, especially when it comes to things like frame rate, which they have been obsessed with at a fundamental, core-DNA level since the original iPhone. Like, if anything didn't hit 60 frames per second on an iPhone, it did not ship. And these features are in exactly the same class, so Apple just turns them off. So, given that people can find a way to be royally pissed, sometimes absolutely justifiably so, either way, Apple just draws a line in the sand and says, "This, we will not cross. If it doesn't do this, we are not shipping it." And they'll take their lumps for that decision if they have to. "Apple's in the habit of making very bold moves if it helps them tighten up their ecosystem," says Linn Huang, research vice president at the tech analyst group IDC.

"And this certainly feels like the start of something like that." For years, for years, the Mac was just behind the iPhone and the iPad. Features that existed on the iPhone and the iPad took forever to come to the Mac, like Shortcuts just did this year. And apps that were on the Mac were just so far behind the versions of those apps on the iPhone or the iPad, things like Messages and Maps. And the whole idea that if Intel can't have them, M1 shouldn't have them feels just so utterly desolate and cynical and pessimistic to me that I have a hard time entertaining it. "Apple wants to tap into the machine learning capabilities of its new M1 chip," Patrick Moorhead, founder and principal analyst at Moor Insights and Strategy, said in an email. Pat says that while Intel has chips that could enable these features in theory, Apple is concentrating its efforts on its homegrown silicon, and is therefore likely only motivated to do the work for go-forward processors. And one thing to note, because Wired completely fails to note this, is that Pat includes Intel on his client list, Intel but not Apple. Pat consults for Intel; he does not consult or work for Apple, because Apple just doesn't believe in those kinds of consultants. So while Pat is absolutely upstanding, and he has a reputation to protect, and he's gonna always give you the best insight that he can, if you're a publication and these are the analysts you're asking, you really need to note any potential conflicts of interest right there in the article, because otherwise they can come back to bite you. "Their points about the neural engine are very apt," Huang says. "But that's not to say none of this could be done on Intel with a little bit of elbow grease."

And yeah, so exactly what I said. But it's not only that it would require that elbow grease to implement those M1 features on Intel; it's that those elbows could then not implement any Mac-specific features as well. It's the simple reality of the development process. The reason we haven't been getting day-and-date features on the Mac previously is because it entails more work. And I say "we" as someone who's still using an Intel 16-inch MacBook Pro, because they just don't make a 16-inch MacBook Pro with Apple silicon yet. So, you people who are using M1 Macs are getting those features day and date now. Apple absolutely could assign engineers to go and take these features that are already built, that are already ready to deploy for Apple silicon, and recreate them for Intel. But it would take the same amount of time again to do them, probably more, because Apple has specifically optimized their own custom silicon to do exactly the kind of features that they have roadmapped out for their products, where Intel is this big generic box that they'd have to do any number of engineering workarounds in to achieve the same results.

And that is what would cause the delays, not only in how long it would take to get these iPhone and iPad features ported over to the Mac, but in the opportunity cost: those Mac-specific engineers would otherwise be spending that time on Mac-specific features, on features that don't exist on the iPhone and iOS. And for good and for ill, whether you agree with it or you hate it, Apple has always, always been absolutely relentless in leaving the past behind, in destroying it whenever they possibly could, so they could embrace the future, move towards the future as fast as inhumanly possible. "It's a really clever move by Apple to force obsolescence and get users to buy new Macs to increase revenue," Moorhead says. But in a different universe, where Apple had provided these features and they didn't work as well on Intel as they do on Apple silicon, because they're purpose-built for Apple silicon and Apple silicon just simply performs so much better than Intel, they'd find another analyst, or that analyst would find another angle, just to say, "Apple is making sure things don't perform as well on Intel as they do on M1 to force people to upgrade." And I can just see, like, a Fringe-verse, my nightmare, a coffee-free Fringe-verse, where Apple decided to withhold features from M1 Macs just because they didn't wanna make the Intel Mac owners salty.

And so, once again, iOS, the iPhone, and the iPad get Live Text and Translate and blurry backgrounds, all of these things, and they're totally absent from the Mac, or they take until next year or the year after to show up on the Mac, when Apple can finally get Intel compatibility working up to their standards, at a time when Intel is further and further from consciousness anyway. And then the articles that would come out from the exact same publications would be so super salty about that: "Apple deliberately denies M1 features to new Mac owners." And then, I feel zero sympathy for Apple, because trillion-dollar company, and they can take every lump they need to take. And they should take every lump that they can possibly take. But as someone who deeply loves the Mac, as someone who understands that the whole reason Apple transitioned from Intel to their own custom silicon was to be able to deliver features, not just in number but in quality, and on time, that were just impossible with Intel: this is the entire reason for doing that.

And I just, I refuse to be rage-baited about that. Because eventually I will get that M1X MacBook Pro of my dreams, and it'll have all of these features. And if you're waiting two or three more years, you will get whatever additional features come by then, and you will be better off as well. So again, there are very specific, very pragmatic reasons why Apple made the choices they did when it comes to having these features exclusive to M1 Macs. But now, after hearing those reasons, if you think that's the best decision Apple could have made, that they made the right choice, that's great. But if you still think that Apple made the wrong choice, that they should have spent the time and resources to port these features over to Intel as well, even if it would have taken more time, even if it would have been at the cost of other Mac-specific features, then I think that's great too. You know, I'll never, ever tell you not to hate, but I will always, always make sure that if you hate, you can hate as smart as absolutely possible.

Categories
Uncategorized

Why You Hate iPadOS 15

I'd just posted my WWDC 2021 iOS 15 for iPhone keynote reaction video when you all started commenting — raging in the comments! — about iPadOS 15. So, I scrapped the video I was working on about that and, instead of talking about my thoughts, I decided I should listen to you and answer your questions and concerns.

I asked you why you were angry or disappointed about iPadOS 15 — what specifically you were angry/disappointed about — and you let me know!

And here are the answers!

Literally everybody is gonna lose their minds over this video. Sponsored by CuriosityStream with Nebula. So I finished watching Apple's WWDC 2021 keynote. I got my iOS 15 for iPhone video live, my reactions to the keynote, and then I started editing my iPadOS 15 reactions, and I started noticing a lot of comments, a lot of your comments, coming in, and they were just hella mad, hella angry, about iPadOS 15. And so I stopped. I took a break. I set aside everything I had already edited, and instead of talking to you about what I felt happened or was announced at the keynote for the iPad, I started listening to what you were saying, what you were telling me, what you were asking me. And I just decided, ultimately, I could not post that video, even though it was more than halfway done being edited and would have been live a day or two ago. I had to stop, burn it down, and start over with this video, so that I could provide you not the information I wanted but the information you wanted. So that's what I'm gonna do now. I'm gonna go through all the questions, all the concerns that you raised, and address them as best I can. And by that I mean I'm going to explain, to the best of my ability, why Apple made the decisions that they made. I'm not apologizing for Apple in any sense of the word. I'm just providing you with information.

Some of Apple's choices here I will personally agree with. Some of them I will personally very much not agree with. But I wanna make sure that each and every one of you has the best possible information, so that you can decide for yourselves which choices you agree with and which ones you don't, so that if you're gonna love things, you're loving them informed, and if you're gonna hate things, you're hating them smart. Because what I think is happening all too often lately is that Apple, who just never explains implementation details, who doesn't consider implementation details worth explaining, leaves this huge void. And instead of using that as an opportunity to inform, to educate, to empower, all too often people use it as an excuse to rage-bait, to just pander to that negative sentiment in order to get their own sort of negative attention out of it. And that, to me, is the second-worst cardinal sin, after not explaining things to begin with. So let's do that now.

Why did Apple put a Mac chip in the iPad Pro and then not put macOS on the iPad Pro? So a couple things there. First, Apple didn't really put a Mac chip in the iPad, and I think Apple, in hindsight, made a really big mistake in how they handled and announced this, everything from choreographing Tim Cook's "Mission Impossible" sequence where he literally took the M1 chip out of a Mac and put it into an iPad. And also just using the name M1 for the iPad chip, that created a huge amount of expectational debt because to just most people, the idea of the M1 is synonymous with the idea of Apple silicon on the Mac. So bringing the M1 to the iPad gave a lot of people the idea that Apple was, in fact, bringing the things that they wanted from the Mac to the iPad, including macOS, including Mac apps, things like that. And in truth, it's not a Mac chip going to the iPad. It's an iPad chip that went to the Mac. We saw that as early as last year's WWDC when Apple first announced custom silicon for the Mac. The developer kit used the A12Z from the iPad in a Mac mini shell to run macOS Big Sur for a developer so they could begin to transition their apps to what became the M1. It has always been part of Apple's A-series of chips. They just used to use a different name for it. We got, for example, the A8 and the A8X, the A10 and A10X, the A12 and the A12X and then A12Z. And this year, instead of having an A14 and an A14X, Apple sort of took an A14X Plus concept and called it the M1. 
But when you look at it architecturally, it's very, very similar to what we got with the A12Z and there are some things under the hood, like the A12 had greater memory bandwidth and the A14X or the M1 has specific things for the Mac, like virtualization acceleration, x86 translation acceleration, and those Thunderbolt ports, which turn out to also be useful for the iPad Pro now, but it's very much an iPad chip working in a Mac and it's just because Apple has been making such good iPad chips for so long that it can run rings around what Intel was providing.

And I think the second part of this is while a lot of people on tech Twitter and tech YouTube will say they really want macOS on an iPad, we represent an extreme minority of the market, like maybe 10% at most, and, of course, you know, I'm part of this, and we all consider our opinions to be majority opinions, what we want to be what the majority of people want, and they're not. They're really small, minority opinions. The vast majority of the market, the 80-plus, 90%, have very different needs and expectations than all of us caught up in the echo chamber of Twitter and of YouTube. And for a large percentage of those people, macOS would not be an advantage on the iPad, it would be a detriment. The same way Windows wouldn't be an advantage on the iPad. It would be a detriment, a deterrent. When Steve Jobs originally conceived of the iPad, and I think why he said it was the most important project he ever worked on, was that he had this constant relentless drive to popularize, to democratize computing technology to make them even more and more accessible to more and more people.

You know, from the Apple II to the Mac to the iPad, it was just ever-simplifying. And as much as he loved the Mac, he believed that it was still not accessible enough, that there were people who found even the Mac just too complicated, too inaccessible, too alienating, too off-putting, too intimidating, and so he wanted to make the iPad. And originally, the iPad was just going to be literally a big iPhone, even using the exact same apps just blown up to fill the iPad screen, but the app teams convinced him that that wasn't the best approach on a big screen, that they could do much more with that real estate. So they came up with the split view, with the master list and then the detailed item view, which became signature to the iPad. And then, of course, Steve, as he usually did, would go all in on that new idea and would end up complaining about or belittling how Android tablets were just blown-up Android phone apps.

But he really wanted to keep the iPad simple. He was really enamored with the idea of full-screen apps where, for example, a normal, regular person would never have to worry that they'd lost their email app just because it was behind their browser app and they had no idea how any of that worked. So you only really had that full-screen app and the Home button, that escape button, which would restore you to a known state all the time. And he pushed back hard on any more complication than that, even disallowing apps from doing sort of widgetized, multi-window views. That became a hard rule, and I think to this day you still can't make apps that do that on the iPad. Apple, Steve Jobs, just did not want the iPad to be a big Mac. You already had that. That device already existed.
And I'll go so far as to say we nerds, we power users, we traditional computer users, we have the Mac, we have the PC, we have Android, we have Linux and the various flavors of Unix, and there's very little that mainstream customers have.

They have the iPad and they have Chrome OS. But because we covet the iPad hardware, because it is such nice hardware, we want to take that too, and we want to make it into a traditional computer, because that's what we're used to. Those are the workflows we have, and we want to be able to do those workflows on the device we want to do them with, and we don't really have the empathy or the perspective-taking for the vast majority of the market who just want something simple. And by no means are they any less intelligent, any less sophisticated than we are. They just could not care less about the underlying computing technology or the comp-sci metaphors that have evolved over the years, the cruft that has built up. They're architects and designers and scientists and people who just want to get their work done, and they can do that perfectly well, yes, even on an iPad Pro that handles all the apps and all the activities they want to handle, without any need to manage file systems or multi-window views of apps. It's just a different type of pro. And I know we all think a pro is basically anybody who has our exact workflow, or what we consider to be slightly more complicated, like a northerner is anybody who comes from one town further north than we are. It's all completely relative. But there is just a vast swath of people with pro needs that are very different pro needs, and the iPad was designed to be the computer for everybody else, not a Mac. For anybody who wanted the Mac, there was the Mac.

And I think sometimes we confuse wanting macOS on the iPad, petitioning Apple for macOS on the iPad, with just wanting a better Mac, petitioning Apple for a better Mac, up to and including touchscreen, hybrid, transformable Macs. Because if that's the system, the paradigm we want, that product already exists; we just need it to better suit our needs. Why include M1 hypervisor acceleration and then not let us run macOS in a virtual machine? I think, in the normal course of events, Apple would have just made an A14X for the iPad, the way they made the A12X and Z for the iPad, the A10X for the iPad. But because they needed something to go in the Mac, they tweaked that to be the M1, and it is just so similar to what an A14X would have been that there was no need to make a separate chip. It would have been a waste of money. It would have lost the economies of scale and the efficiencies that come from Apple making custom silicon for this many premium devices. The same way putting the A14 into the iPhone 12 and the iPad Air gives the iPad Air the economies of the iPhone's scale, this gives the Mac the economies of the iPad's scale, and vice versa. And then the sum total of us nerds just spat from our heart cannon all of our hopes and all of our expectations around that, only to set us up for the mother of disappointments, or to set up Apple to deliver us the mother of disappointments, depending on which way you want to look at it. But I do think the concept of the iPad being able to bring up a virtual machine of macOS is a really, really good idea, an idea worth lobbying for. It's just not an idea that Apple either has or is willing to ship at this point. What's even the point of an M1 iPad Pro if it can't do anything more than a 2020 iPad Pro? I feel that sentiment so much, but if we're being intellectually honest about it, the M1 can do way more than the A12Z.
It's a two-generations-newer silicon architecture, so the cores are higher performance. It has the Thunderbolt controller, so it can do Thunderbolt, which the previous one couldn't, and it has updated versions of the image signal processor and of all the encode/decode blocks. All of that is two generations newer.

But on top of that, I don't believe Apple internally thinks about year-over-year upgrades the way that so many of us, again, the 10% on tech Twitter and tech YouTube, obsess about them. We produce 90% of the content based on year-over-year upgrades. Should you upgrade from the last iPad Pro to the new one? From the last iPhone to the new one? Where 90% of the market just never, ever, not ever, thinks that way. And I believe the typical upgrade cycle for the iPad, even the iPad Pro, is every four to five years. So this is, designed is too hard a word, but this is intended for people who are still on the 2017, if not the 2015 iPad Pro, for people who have been using one for several years as it starts to feel its age and stops getting all the iPad OS upgrades going into the future, or for people looking at the sum total of everything that has come out since 2015, 2017, whether it's the mini-LED display or Thunderbolt, Apple Pencil 2, the Magic Keyboard, all of those things together making it a compelling upgrade. So whenever you think it's not worth it compared to last year's model, that's just not how the market thinks at all. The market asks: is it or is it not worth it compared to my four- or five-year-old model? Is the M1 just ridiculously overpowered for the iPad Pro? Yes, absolutely, totally, 100%, ludicrously OP, but not much more than the A12X and Z were for the previous iPad Pro, or the A10X before that. That is sort of the way Apple operates, and the only difference here, again, is that Apple named it the M1 instead of naming it the more understandable, I think, for most people, A14X. But the way that Apple operates is to build significant silicon headroom into all of their devices. They want these devices to last, both in terms of hardware engineering and software support, for a number of years, four, five, six years even. I mean, iOS 15 is going all the way back to the iPhone 6S.
iPad OS 15 is going back to the 2014 iPad Air 2, which shipped with the A8X chipset, and the original 2015 iPad Pro. So that's, math, carry the one, divide by zero, like five, six more years of useful, relevant life on these devices. The value of these devices isn't in the year you buy them but in every subsequent year you keep getting value from them.

And that includes not only iPad OS 15 and its features and capabilities, and the apps we have now, the heavier apps, the AR apps and the video apps and 3D apps, all of those things, but subsequent versions of iPad OS, 16, 17, 18, 19, 20, and whatever apps we get in the future, like the way we got Photoshop on the iPad after the 2018 iPad Pro update. There will be next-generation apps that take further advantage of this hardware. So instead of a chipset that performs well now but starts to stutter and can't run the apps that come out next year or the year after or the year after that, instead of thinking about it being ludicrously overpowered now, I think it's better to think about how overpowered it will still be in 2025, 2026, when we're running the kinds of apps that come out over all those intervening years. Why put up to 16 gigabytes of RAM on the iPad Pro if iPad OS won't let an app access more than five gigabytes? I've seen this question so often, and I think it's due to the fundamental difference in how Apple sees the iPad compared to how they see the Mac.

The Mac is a full-on traditional, preemptive-multitasking, multi-windowing computer that manages memory by swapping, paging out from RAM to the hard drive, the SSD now. iPad OS was just never designed to work that way. Especially in the early days, that style of multitasking was prohibitive in terms of battery usage, and that kind of swapping was prohibitive in terms of read/write cycles and the power cost of keeping the drive busy all the time, so Apple just didn't do it. Instead they came up with a process called jetsam: if any app started putting pressure on memory, it would just be flushed. It'd be booted out, yeeted, straight out of memory. And that has just evolved over the years. So the way Apple looks at it is, if they have a five gigabyte ceiling on any individual app's RAM use and then they give you eight or 16 gigabytes, that means you can just have more of those big apps sitting in memory. If you go back to a time of lower-memory iPads, even iPhones, you would open up a few apps, try to go back, and an app would have to respawn completely, just reload completely. That could be really frustrating, because you might have to wait a long time on a load screen, and if the app didn't properly save state, you might not end up exactly where you left off, especially if it was a game you had just left to look up something quickly, or answer the phone, or send a message, and you came back and your game reloaded.
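
To make the contrast concrete, here's a toy sketch of the jetsam idea: instead of paging memory to disk the way a traditional OS does, suspended apps get evicted wholesale under memory pressure. This is purely illustrative Python, not Apple's implementation; the app names, the 5 GB per-app ceiling, and the 16 GB budget are assumptions taken from the discussion above.

```python
from collections import OrderedDict

PER_APP_LIMIT_GB = 5   # per-app ceiling discussed above (assumed value)
TOTAL_BUDGET_GB = 16   # total RAM on a high-end iPad Pro

class ToyJetsam:
    """Toy model: evict whole suspended apps instead of swapping pages."""

    def __init__(self, total_gb=TOTAL_BUDGET_GB):
        self.total = total_gb
        self.resident = OrderedDict()  # app name -> GB, least recent first

    def used(self):
        return sum(self.resident.values())

    def launch(self, app, gb):
        if gb > PER_APP_LIMIT_GB:
            raise MemoryError(f"{app} exceeds the per-app limit")
        # Under pressure, evict the least recently used apps entirely --
        # they must cold-launch from scratch next time.
        while self.used() + gb > self.total and self.resident:
            evicted, _ = self.resident.popitem(last=False)
            print(f"jetsam: evicted {evicted}")
        self.resident[app] = gb

    def foreground(self, app):
        """True if the app is still resident (instant resume),
        False if it was evicted and must reload completely."""
        if app in self.resident:
            self.resident.move_to_end(app)
            return True
        return False

sim = ToyJetsam()
sim.launch("Procreate", 5)
sim.launch("Safari", 3)
sim.launch("LumaFusion", 5)
sim.launch("Mail", 1)   # fits: 14 GB used
sim.launch("Game", 4)   # pressure: Procreate (oldest) is evicted
print(sim.foreground("Procreate"))  # False -> cold relaunch
```

More total RAM doesn't raise the per-app ceiling in this model; it just means more big apps stay resident, which is exactly the "apps stay in memory for days" behavior described below.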

And as the RAM has increased, Apple's been better able to keep apps in memory. On the iPad Pro, I've gone days with apps in memory; I've gone back to an app I haven't used in several days, big apps even, and they're still there in memory. So the increased memory on the iPad Pro wasn't meant to be used the same way it is on the Mac, in terms of making the sum total of that memory available to every app all the time. And maybe that's just anachronistic at this point. Maybe iPad OS needs to investigate things like swap or, at the very least, look at the actual workloads of real people, because there are a lot of what people would consider pro apps for which this five gigabyte RAM limit will never be a problem, but there are certain types of users with certain workflows in certain apps, like Procreate, who maybe want as many layers as memory allows on a device, and they're hitting that limitation hard. So everything is a trade-off. Everything is a compromise.

And Apple has to balance the needs of all of their users, from the ones who will never come close to five gigabytes to the ones who hit it all the time, and figure out the balancing act there, whether that's continuing to tweak the current implementation or investigating a whole new, or a whole old, implementation. How can Apple keep calling it an iPad Pro without putting their own pro apps, like Xcode, Final Cut, and Logic, on the iPad Pro? And yeah, I was really, really hoping for that moment, like the original iPad keynote moment where they showed off iWork for the iPad and just redefined the expectations for what a mobile app, a tablet app, could do, or especially the iPad 2 event, where they showed off GarageBand and iMovie for the iPad 2 and it just blew me away how well they brought that software to mobile. I was hoping we would have that again: here is Final Cut Pro and Logic Pro completely reimagined for the iPad, Xcode completely reimagined for the iPad. And my understanding is they're just not there yet. They're not at a place where they have an interface and a value proposition that they think is worth it on an iPad. But I have a suspicion, a deeply rooted suspicion, that we are seeing the move towards that, although it's way slower than I think anybody, maybe even including Apple, would like. We see it in Swift Playgrounds gaining features that let you write more code than ever before and, for the first time, deploy to the App Store, taking maybe the lowest-pressure tier of Xcode, the educational and iteration level, and allowing it to move over to the iPad. We also see it in Xcode Cloud, because one of the remaining challenges with the iPad is that even eight to 16 gigabytes of RAM isn't a lot, and the interface certainly still isn't a traditional computing interface.

But if you start moving things like Xcode to the cloud, and then, conceivably, Final Cut and Logic Pro to the cloud, the devices themselves become endpoints. Your projects, if you choose, live in that cloud-based infrastructure, and you can collaborate on them more than ever before, which is a feature Final Cut and Logic are really lacking; you have to look at third-party alternatives now, like Frame.io. But if, conceivably, you have the project in the cloud, multiple endpoints, multiple people, can interact with it, and it just becomes you choosing whichever device is most apropos for the task at hand. You could sit there coding or editing on your Mac, on your iMac, with the big screen, then pick up your iPad, connected to the same cloud account, and just continue the same project there, with everything syncing back and forth. You're not worried about moving enormous files; you're always working on small, lightweight proxy files. You're editing on your iPad, but maybe it's syncing down to your Mac or your Mac Pro, which has way more cores, way more memory, way more storage, while you sit there using your Apple Pencil to edit something in a way you couldn't, or just didn't want to, on the Mac at that particular time and place. I think that's what we're slowly moving towards. But in typical Apple fashion, they're releasing it step by step.

They're putting the components in place, frustratingly slowly at times, but eventually we'll see more and more of that come out until the whole big announcement is made. And again, that is completely a guess. But rather than just having Final Cut or Logic Pro or Xcode on the iPad, I would really like to see the complete next generation of those technologies on all of Apple's devices. Why is Apple intentionally holding the iPad back just to sell more Macs? And this is, again, one of those examples where I think we do a profound disservice to our audience, because it is so easy to rage-bait, to perpetuate these kinds of urban myths, rather than to go through the effort of sharing more and better information.

And Apple has this doctrine, I like to call it the Phil Schiller doctrine, where they will happily cannibalize any of their products with another one of their products, because their worst fear isn't losing sales on one product to another of their own products but losing those sales to a competitor's products, to another company's products. Just like they replaced the iPod mini with the iPod nano, and they replaced the iPod, one of their most successful products, with the iPhone, they allowed the original iPad to compete head-to-head with the Mac. The iPad's popularity soared, then it came back down when the MacBook Air got its redesign and the Mac's popularity soared again, and Apple didn't care, because they were competing against themselves. Whether you bought an iPad or a Mac, you still bought from Apple. That's rather than being protectionist, almost Microsoft-y, insisting everything has to be Windows while people are jumping off the ship. You might be riding those profits, but you're riding them straight into the ground if you're not willing to take the next leap, if you're mistaking your products for your business instead of understanding that your products will change even as your business evolves over time. So the thinking within Apple, as I understand it, is that the iPad's job is to get so good that a majority of people no longer need a Mac. And again, I'm not talking about us 10% tech Twitter, tech YouTube types who really have very specific Mac-ity Mac needs; I'm talking about the majority of people who were buying a Mac because they had no other choice. The iPad's job is to get better and better so that it serves more and more of their needs, and fewer and fewer of them have to get a Mac that maybe doesn't fit them as well.
But then it's the Mac's job to fight back, to push back, to prove that it deserves to survive, by taking on more and more characteristics that appeal to that wider audience. Which is why, you know, Will keeps saying Apple's dumbing down the Mac, or Apple's putting too many iPad or iOS features on the Mac, when what they're really doing is trying to take what's best about iOS and iPad OS, the friendliness of those things, and bring it to the Mac.

And to totally abuse Steve Jobs' cars and trucks analogy: we still need trucks, but there's no reason we can't have the amenities, the comforts, of the car at the same time. The good sound system, the comfy seats, all of those things. That's what Apple has been bringing to the Mac and using to help the Mac fight back, up to and including Apple silicon, which came from the iPad to the Mac to give it the same kind of performance potential the iPad has had, but for users and workflows that make much more sense on the Mac.

So whenever I hear things like Apple's intentionally holding the iPad back, I think instead that they have very set philosophies and markets in mind for these products. Why have one product that fits well in one market and badly in another when you can have two products that maybe overlap in the middle but are each best suited to a particular market? In their minds, the Mac is the traditional computer for people who want and need traditional computing, and the iPad is the think-different, compute-different machine, the computer for everybody else. Why is the Files app still trash? I just really don't know. I've been personally lobbying, you can go back and look at iMore, for years to get better file support on the iPhone. I was asking for a file picker and a Files app what feels like eight years ago, and then every year, year after year, I kept asking, and we slowly got the document picker and we slowly got the Files app, but it always felt like Apple was doing it begrudgingly.

And I understand, from a certain point of view, that they don't want to just port the complexity of the Finder over to the iPad. They want to make something more accessible to a much wider audience. But there are simple things, simple affordances, simple considerations, like showing how big files are, or progress bars. Even though the progress bar is a lie, it's purely psychological, humans operate off those kinds of psychologies. And there's an old joke in engineering that you don't want to have settings, because settings are complicated, but you end up engineering around them in such a way that the complexity far outstrips the relative simplicity of just having settings. This feels like that to me. It feels like Apple is doing everything possible to work around the simple thing, which is just making a functional Files app, and that's all they really need to do. This is one place, one app, where I think Apple really can lean totally into doing fan service for nerds. Why is external display support still trash? The issue here is that all iPads have always been close to a 4:3 aspect ratio, where most external displays are 16:9, if not wider. So when you just mirror an iPad display, all you get are these giant pillar boxes on both sides, and it makes your eyes bleed, makes any power user's eyes bleed.
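
The pillar-box problem above is simple arithmetic: fit a roughly 4:3 image onto a 16:9 panel and see how much of the screen goes to black bars. A quick sketch (the display resolutions are just illustrative examples):

```python
def pillarbox(display_w, display_h, src_aspect=4 / 3):
    """Fit a source of the given aspect ratio onto a display by
    scaling to the display's height and centering horizontally.
    Returns (scaled image width, width of each side bar)."""
    scaled_w = display_h * src_aspect
    bar = (display_w - scaled_w) / 2
    return scaled_w, bar

# Mirroring a ~4:3 iPad onto a 1920x1080 (16:9) monitor:
w, bar = pillarbox(1920, 1080)
print(round(w))    # 1440 px of actual image
print(round(bar))  # 240 px of black bar on each side
```

That's 480 of 1920 pixels, a quarter of the screen, spent on bars, which is exactly why mirroring alone feels so wasteful to power users.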

But one consideration here is that external displays are not used by the vast, vast majority of iPad customers. It is really one of those 10% features. But Apple does continue to show it off, both when they introduced USB-C on the 2018 iPad Pro and now when they introduced Thunderbolt. External displays are front and center in the presentations, and Apple is careful to show them doing what the current capabilities allow, which is not mirroring but playing video or playing video games, and they leave it to developers to implement that. If you just want to use the display as extra desktop space, mirroring is your only option. So far, Apple hasn't provided anything more, never mind extended desktop, where you would fill the screen with more app icons, because the home screen has no concept of multiple states, one for the iPad display and one for the external display, but even letting developers create arbitrary aspect ratios for their apps and have them shown on external displays that are 16:9 or 16:10 or ultra-wide, wrap-around-your-desktop displays. It just hasn't been built. Apple needs to build it. But because so many people don't use it, it is just not high on their priority list. Where is multi-user support?

And this one, again, is personally really frustrating for me, especially because Apple already has multi-user support built into the iPad. It is just only surfaced in the educational applications, in Classroom, where they provide really, really good multi-user support, but only for students. And maybe you could argue that education is a controlled environment, where the traditional lack of resources on an iPad, the lack of RAM, the lack of storage, wouldn't cause too many problems, especially when the devices are essentially wiped every night and there's very little personal information stored, and no personal information persisted, on them. It's a more complex problem in a home or work environment, where you want your stuff available any time, all the time, in a very Mac-like way.

But now that Apple has iPads coming with one and two terabytes of storage and eight and 16 gigabytes of RAM, and as we've seen over the years pro features get pushed down to the Air and then to the base model as the resource limitations evaporate, I really hope we see Apple extend that currently student-only multi-user support across the whole iPad line, because there are just so many situations, so many environments, where it makes so much sense.


iOS 15 for iPhone — MEGA REACTION! (WWDC 2021)

Apple has just announced iOS 15 for the iPhone, and I'm watching along, giving you my live analysis of all the new features and functionality, including FaceTime, Messages, Keys, Maps, SharePlay, Safari, Photos, and more!

  • [Woman] Hi, Wanderlust Creamery.

  • [Woman] Morning to Starbucks.

  • Oh, whoops, sorry. Just applying a deep neural network here to a photo of your shop.

  • [Speaker] Yes, I'd like to order 4,000 lattes to go please.

  • No actual need for ice cream at this time.

  • No, just kidding. Wrong number. Thank you.

  • Gotta drop, bye.

  • Bye-bye.

  • WWDC 2021, live reactions, iOS 15. Will it next-level the iPhone? Let's do this. Sponsored by Curiosity Stream with Nebula. Now hit that subscribe button and bell so you don't miss any of the reviews.

  • This year, we've set out to make FaceTime calls feel more natural, comfortable, and lifelike.

  • Makes a certain amount of sense, because of how much we've all become dependent on video conferencing over the last year and a half.

  • [Speaker] We're bringing spatial audio to FaceTime.

  • And spatial audio is Apple's technology that takes a Dolby Atmos signal, or any signal, and turns it into a 3D soundstage, so in FaceTime voices come from where people are positioned on the FaceTime stage.
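
To give a flavor of what "voices come from where people are positioned" means in signal terms, here's a minimal sketch of positional audio, deliberately not Apple's spatial audio (which does full 3D rendering with head tracking), just classic constant-power stereo panning, which maps a horizontal position to left/right channel gains:

```python
import math

def pan_gains(position):
    """Constant-power stereo pan. position in [-1.0, 1.0]:
    -1 = far left, 0 = center, +1 = far right.
    Returns (left gain, right gain); left^2 + right^2 is always 1,
    so perceived loudness stays constant as a voice moves."""
    theta = (position + 1) * math.pi / 4  # map [-1, 1] -> [0, pi/2]
    return math.cos(theta), math.sin(theta)

# A voice centered on the call comes out of both channels equally:
left, right = pan_gains(0.0)
print(round(left, 3), round(right, 3))  # 0.707 0.707
```

Multiplying each participant's audio by gains derived from their tile's position is the simplest version of the effect; Apple's implementation is far more sophisticated, but the underlying idea of position-dependent gains is the same.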

  • The first is voice isolation.

  • Which is trying to do a better job of noise canceling, removing everything that could be considered background noise, so that your part of the conversation is more distinct and not lost in that background noise. It's almost the reverse of active noise cancellation, because it's not removing the noise for you; it's removing the noise for everybody else.

  • Let me fix this. Better, right?

  • And equal and opposite to that, they're doing wide spectrum, which, I don't want to call it the opposite of transparency, but it kind of is: it's making sure that everything, all the ambience, all the din going on around you, comes through clearly, if that's what you want to share with everybody.

  • I wish I could be there.

  • Me too.

  • There's a grid view at long last, because while the first FaceTime group view was animated and interesting to look at, once you're doing it all day, every day, it gets really tiring. So now you can just drop into grid mode and see everybody in a normal, steady view.

  • We're introducing portrait mode.

  • So FaceTime bokehs the background, to sort of blur out the clutter that could be behind you.

  • We've created FaceTime links.

  • So that you can set up a FaceTime call ahead of time, the way you can with a Zoom or WebEx call now.

  • [Speaker] Your friends with Android and Windows devices can use your FaceTime link to join the call, right from their browser.

  • And there is a compelling argument for Apple making FaceTime more applicable to things beyond interpersonal communication, because FaceTime calls are end-to-end encrypted, and Apple is serious about it, while a lot of other tools have ongoing security concerns. So FaceTime as an option here will be a good option for a lot of people.

  • SharePlay is a powerful new set of features, for shared experiences that you can enjoy, when you're on a FaceTime call.

  • So you can listen to music together, you can watch videos together, and, yes, something that's been on my list for a long time: FaceTime screen sharing, which some of you have been able to do on macOS since the days of iChat, but which has been completely missing on iOS forever.

  • [Speaker] When you're on a call, all you need to do is press play in Apple Music, and you can choose to listen together.

  • So Apple is using all these new audio technologies to enable the music sharing, which is interesting. They've improved the voice quality too, but you don't need that much quality for voice; it sounds good, but it's not mission critical. For music, though, poor quality would be a bad experience.

  • When you're on a FaceTime call, jump to a streaming App and press play. And boom, you and your friends are watching the same video, totally in sync.

  • Oh, that's a really nice integration of Picture in Picture with Messages, because FaceTime and iMessage are already connected, in that you can move back and forth between the same group calls and the same group messages, and now they're integrating the playback experience into that, which is legit terrific.

  • And when the moment calls for a bigger screen, you can extend the video to your Apple TV and watch it there, while sharing the moment with your friends.

  • It's not just Apple Music, it's not just the Apple TV app; there's an API so that other developers can integrate. The question remains: will they? Companies like Spotify have been good at complaining but very bad at implementing. Netflix, of course, will be a big question, because they've chosen not to play nicely with the Apple ecosystem; they don't want to lose ownership of the audience, don't want their interfaces to be intermediated. But to me, that just means they're disqualifying themselves from the greater conversation, and I'm forgetting to open Netflix more and more these days.

  • Check out screen-sharing. You can use it to browse Zillow listings live with your future roommates, or show off a tricky move in your favorite game.

  • And that's over-delivering. They're under-promising, but over-delivering.

  • We're introducing great new ways to enjoy all of this amazing shared content, and the exchanges that follow.

  • So they're using new interface paradigms, stacks and collages, so that multiple photos don't just create a skyscraper in your iMessage app. You can stack them, then expand or spread them out, so they keep the conversation intact, not a wall in the middle of the conversation, while all the content is still available to you.

  • [Woman] I'm not going to read this right now, but I'd love to the next time I catch up on the news. And I can tap from Casey to pick up our conversation and remind her, that deep dish, is obviously the best kind of pizza.

  • And we'll have to see if there's an API for this as well, because Apple News is probably only one very limited type of content that people share.

  • [Woman] So now, if I wanna check out that playlist Leah sent earlier, I can head over to Apple Music.

  • They're in Shared with You in Photos as well, so those stacks and those collages will be featured for you the next time you go to Photos. Again, though, that's a pull interface, not a push interface. It requires you to remember to go back, or to happen to be willing to interrupt whatever reason you had for going to the Photos app. In general, humans aren't good at state change. It's why we forget what we were doing when we go into another room; that change is enough to distract us.

  • And if something especially important comes through, you can quickly pin it.

  • Which is great, because once you have something pinned, it's available when you go back to Messages, and of course you can always still use Siri's "remember this," and your reminder is populated with it.

  • We wanna free up space, to focus and help you be in the moment.

  • Because, as we've already seen with Do Not Disturb, when everything is important, nothing is. Notification quickly becomes interruption.

  • [Man] They have a fresh, new look. Contact photos for people and larger icons for Apps.

  • And there's a new notification summary, which is like a digest of all the notifications that have come in over a period of time, or that you can schedule to look at at certain times. And it's tying into the on-device intelligence we saw Sebastien Marineau-Mes introduce last year, which has just been getting better and better. It's a really good side of Apple AI right now.

  • [Man] Notifications in the summary are ordered by priority with the most relevant ones, rising to the top. Notifications from people, will not end up in the summary so you won't miss important messages as they come in.
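
The ordering logic described here is easy to picture. Here's a minimal sketch, assuming a made-up `Notification` type and a relevance score standing in for the on-device intelligence; this illustrates the general idea, not Apple's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Notification:
    app: str
    text: str
    priority: float       # hypothetical relevance score from on-device intelligence
    from_person: bool = False

def build_summary(notifications):
    """Split notifications the way the keynote describes:
    messages from people are delivered immediately, while
    everything else goes into a digest ordered by priority,
    most relevant first."""
    immediate = [n for n in notifications if n.from_person]
    digest = sorted(
        (n for n in notifications if not n.from_person),
        key=lambda n: n.priority,
        reverse=True,
    )
    return immediate, digest
```

The point is the split: person-to-person messages bypass the digest entirely, while everything else gets batched and sorted.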

  • And there's also status now. So, for example, if you're in Do Not Disturb, people will know that. They won't just be sending you, like I often do, a wall of text for you to come back to later. Maybe they still will, but at least they'll be doing it knowingly and with intent.

  • [Man] And for truly urgent messages, there's a way to get through.

  • And so, the superset of Do Not Disturb is Focus, which includes, yes, Do Not Disturb, but also Work, and Personal, and Sleep, so that you can have various statuses within these apps. And it goes beyond notifications. It's also being used as a system-wide organizational paradigm.

  • [Man] When setting up a focus, we use on-device intelligence, about your past activity, to suggest Apps, and people, to allow notifications from.

  • I'm not quite sure about the modes yet, I'm going to have to try those out, but I do like that Focus is such a focus this year.

  • [Man] You can even dedicate a page on your home screen to match your focus, and organize your Apps and widgets in a way that reduces temptations.

  • And next is Live Text, which looks like it basically turns all text in all images into something that is searchable throughout the operating system. That's something other apps and other operating systems have been doing for a long time, but something that has been only barely and narrowly implemented in iOS until now. It really should have been universal years ago, but it's great that it's becoming universal now. And it looks like Apple is going about it the way they did with face recognition, finding the faces of your friends in your photos: going back and churning through everything on device, using machine learning, to recognize, understand, categorize, and index all of this information. Which, as much as I just said it took way too long for this to come to iOS, is how Apple has been rolling with these things: taking a lot of time, but doing it in an implementation that they believe really prioritizes and respects privacy and security, one that doesn't go to a server, doesn't share all your information with a server, but keeps it all personal to you.
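
To make the "recognize, index, then search" idea concrete, here's a toy inverted index in Python. The `extract_text` callback is a hypothetical stand-in for the on-device OCR step; this is an illustration of the general technique, not how Apple actually builds Live Text:

```python
from collections import defaultdict

def build_photo_index(photos, extract_text):
    """Build an inverted index mapping each word found in a photo
    to the set of photo IDs containing it. `extract_text` stands in
    for the on-device text recognizer."""
    index = defaultdict(set)
    for photo_id, image in photos.items():
        for word in extract_text(image).lower().split():
            index[word].add(photo_id)
    return index

def search(index, query):
    """Return the photo IDs containing every word in the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())
    return results
```

Done once, up front, on device, the expensive recognition step never has to leave your hardware; search afterward is just set intersection.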

  • [Man] I can just select it like this, and I have the option to use a lookup, to find out more about this restaurant, and even look up its location.

  • And they're tying it into that computer vision system. The system lets us search for specific things in photographs. So for example, dogs, and types of flowers, and, as we got the hint with the accessibility system, Apple is getting much more sophisticated about describing, ingesting, understanding and describing what is in these photos. Will they be as good at it as Google out of the gate? Almost certainly not.

  • Live text works with photos across the entire system, including screenshots, quick look, and even photos on the web.

  • But will a lot more people, or at least some people for whom those security and privacy issues are fundamental, prefer or only use this rather than something provided by Google? Equally absolutely. They're also integrating Apple Music into the Photos app, taking the existing For You system, the part that would put together a personalized movie for you based on an event or a person, and tying it into Apple Music. So instead of the generic, soundtrack-y, very limited stuff they've been using previously, you'll have access, or they'll have access, to everything in the Apple Music catalog. At the same time, that just pushes Apple Music all the more, because they have that integration, that shared platform. What will the experience be like if you don't have Apple Music? Probably not as good as if you do, which again increases that ecosystem, that added value, and, from Apple's side, that average revenue per customer. The music is being chosen, again, by on-device intelligence, and they're applying filters and effects based on device intelligence too, doing things like adjusting color and contrast to create, I don't want to say cinematic, more TV-commercial-like energy and effects in real time. And then, of course, because you're the human, you get to overrule all of this: you can either search for your own tracks or choose something you've listened to recently that might be especially relevant to you. And that's sort of my favorite approach with all of this: have machine learning do all the heavy lifting, all the grunt work, the sorting, everything that would be tedious and boring for us to do, get that out of the way, and let us do that last 10% or 5% of polishing, or again, overruling, that makes it truly human, truly representational of us.

  • We started with car keys, which we're improving with ultra wideband support for an unlock-and-drive experience. No need to take your iPhone out of your bag or pocket.

  • But now we're adding home key, so that if you have a system that supports cardless, keyless entry, you can now put that card onto your iPhone.

  • Just add your corporate badge, and tap to enter your building or office. And wherever you travel, with hotel keys.

  • You'll be able to scan your driver's license, your state ID, all of these things. And the TSA will be implementing it, so at least if you're traveling within the US, or to and from the US, that'll be beneficial. Apple always does amazing, amazing things with the Wallet app, but they're always just so limited to the US or a very small handful of countries, and I hope there's some way Apple can bend their will toward making them more universal, more international.

  • The weather App. We're introducing a beautiful new design with data and layouts that change based on conditions.

  • Not just the background animations, although yes, that too, but the kind of information it's giving you, the information density, both graphical and numerical, with better animations, better looks, high-resolution maps, and also just a better swath of available data around what's happening with the weather.

  • [Woman] Maps will offer incredible new details, for commercial districts, marinas, buildings and so much more.

  • And you can look at this as a precursor to what we're eventually going to need for an augmented reality world, when you have your future Apple glasses product on your face and you're looking around, and you need this layered on top of the real world for you, whether you're a tourist, or in an area of town you don't go to a lot, or trying to find a specific thing. Even in an area you do go to a lot, all of that will have to be available then, so Apple's obviously getting it all prepped now. And it's got a much more distinctive look. Before, it was sort of a bad reality, a low-polygon-count, melty reality. Now it's almost looking, I don't want to call it a cartoon look, but it's a very stylized look. And it's starting off, it looks like, with the new driving experience that's coming to the iPhone and then later to CarPlay. Oh, and they're adding another very Googley feature: the ability to scan buildings, to scan your surroundings, to help you more accurately place yourself in the real world and then provide you directions based on that.

  • [Man] The new tab bar appears with just a tab, and it's directly under your thumb.

  • And, you know, for years now, ever since the iPhone 6 and 6 Plus, when Apple went to big and bigger designs, they've kept the interface elements largely at the top, forcing you to either juggle your phone like you're Penn and Teller, or to use Reachability, which is an extra step. So the more interface elements they can push to the bottom of the screen, the better, especially as devices keep getting bigger and bigger.

  • [Speaker] It's now easier and more natural to access your other tabs too. You can just swipe over the tab bar, to move quickly between them.

  • And of course, one of the huge benefits Apple's providing is that these groups, these tabs, will be available for you across your devices. I mean, it may be easier to create them on the Mac, but it's much more convenient to access them on the go with your iPhone.

  • And the new customizable start page, syncs with Mac, so you have the same Safari experience everywhere.

Categories
Uncategorized

iPad mini Leaks — Tiny Air Incoming?

Redesigned A14 iPad mini with… maybe no Home button? Updated M2 iPad Pro with a glass back for MagSafe charging! Thinner iPad 9 and, yeah, oh yeah, AirPower. Take 3. Again.

Let’s do this!

Mark Gurman and Debbie Wu, writing for Bloomberg: Apple is planning to release the new iPad Pro in 2022 and the iPad mini later this year. And that makes the kind of sense that mostly does. Apple was updating the iPad Pro every 18 months, but then this last time they updated it in the spring of 2020 and again in the spring of 2021. So whether they go to spring 2022 and keep that 12-month cadence, or wait until fall of 2022 and go back to the 18-month cadence, either way it would be well within the next-year window. The iPad mini, though, hasn't been upgraded as aggressively as the other iPads in quite a while. So either way, getting a new iPad mini, even if it's later this year, is just about time. The updated iPad mini is planned to have narrower screen borders.

And yeah, because the iPad mini is currently the only premium iPad, the only modern iPad that is no longer very modern, that hasn't gotten its redesign. So whether it's Thanos snapping those bezels in half like the iPad Pro, or just by a quarter like the iPad Air, which is my guess as to what's going to happen, it just needs that redesign. While the removal of the home button has also been tested. Testing the removal of the home button sounds weird, though. Apple knows how to remove home buttons. They've removed them from everything, from the smallest iPhone to the biggest iPad. Removing home buttons over the last few years is what Apple does, whether it's Face ID or, maybe more likely, Touch ID in the power button, like we just saw on the new iPad Air. Technologically, that's just a done deal. But whether Apple's testing keeping the iPad mini as a premium iPad, or maybe dropping it down to be sort of an entry-level, low-cost iPad, like the iPad-nothing, which retains its home button to keep those costs down, maybe that's what they're testing.

But if my opinion counts for everything, just keep the iPad mini premium all the way and yeet that home button completely away. The main design change in testing for the iPad Pro is a switch to a glass back from the current aluminum enclosure, in part to enable wireless charging for the first time. This is interesting because you don't need an entirely glass back for inductive charging. You may not even want an entirely glass back just for inductive charging. What you want is a sweet spot centered on the Apple logo so you have an easy target, made just big enough for the magnets to latch on around the edges, and then reinforced so that the cut-out doesn't suddenly become a point of structural weakness, like a folding point. So if Apple is going to glass, that is an industrial design decision, continuing to move that retro-future-chic design language forward.

And it does fill me with a little bit of anxiety. But material science in general, physics in general, is a jerk. And just like metal bends, plastic cracks, and ceramic shatters, glass breaks. So you've just got to pick your substance and then mitigate against it as best you can. And maybe Apple thinks some application of Ceramic Shield or similar technologies will end up making a better product. Wireless charging will likely be slower than directly plugging a charger into the iPad's Thunderbolt port. And I'm curious about the use of the term likely here. I don't know if it's just more hedging, or if it's implying there'll be some kind of technological leap at least attempted here, because we have seen some companies go with multiple cells, specifically two cells, and charge them in parallel. You do lose a little capacity that way, but you gain the ability to charge at basically twice the speed. And then, theoretically, you could do it at four times the speed with four cells. And some companies just have zero F's to give about this at all. They'll let you charge at ridiculous speeds that will burn down the battery health and longevity of your device, but look really, really great on boxes and tweets.
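
The parallel-cell arithmetic works out simply: splitting one battery into N cells that can each accept the full charge rate cuts the charge time by roughly N, minus whatever capacity the extra packaging costs. A rough back-of-the-envelope model, with illustrative numbers rather than measured ones:

```python
def charge_time_hours(capacity_wh, charge_rate_w, cells=1, capacity_penalty=0.0):
    """Rough charge-time estimate for a split-cell battery design.
    Assumes each cell can be charged in parallel at the full rate,
    and that splitting costs a small fraction of total capacity."""
    effective_capacity = capacity_wh * (1 - capacity_penalty)
    per_cell = effective_capacity / cells
    return per_cell / charge_rate_w
```

One 40Wh cell at 20W takes 2 hours; split into two cells charged in parallel, it takes about 1 hour, and with a 5% capacity penalty, about 0.95 hours. That's the whole trade: time for capacity.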

But Apple has historically been really conservative about battery life and wants to maintain the battery health of their devices for years and years and years. But it's also great that there's just nothing here about them deleting any more ports on the iPad pro at least anytime soon. Apple is also trying out technology called reverse wireless charging that would allow users to charge their iPhone or other gadgets by laying them on the back of the tablet. We've seen that from other companies already as well. You can drop earbuds or a watch on the back of your Samsung phone and it'll just charge them up at the expense of your phone's battery life and not optimized really in any way. So you lose more power than you gain. But in a pinch it's a really helpful feature especially with a battery as big as you get in an iPad pro.

In addition to the next generation iPad pro and iPad mini Apple is also working on a thinner version of its entry-level iPad planned to be released as early as the end of the year. And that too just makes all the kind of sense that does because we've seen Apple updating the iPad every year on the year for the last few years. Apple keeps taking the technologies that they've been introducing and paying down in the higher end iPads and then pushing those down into the entry level iPads. Everything from the Apple pencil to the smart keyboard, to the larger displays. So the only real question remaining is when Apple will do that major transition to the more modern design with the smaller bezels and the Apple pencil too.

And a version of the Magic Keyboard, because all this says is thinner. So it could be a thinner version of the previous iPad Air design, which would mean around a 10.5-inch screen but still include a home button. And then maybe it doesn't go to the iPad Pro 2018-style design until the iPad Pro 2022 takes that to the next level. Apple is still exploring a future wireless charger that works similarly to its failed AirPower.

And yet that is also complicated by the Apple Watch, where everything else is Qi-standard charging. Even third-party vendors have made mats now that charge multiple kinds of Qi-standard devices, but the Apple Watch draws more power than Qi and also has that more distinctive magnetic locking mechanism for its much smaller puck. That increases the complexity quite a lot, especially if you can arbitrarily place the Apple Watch down anywhere and Apple doesn't make a specific divot or extension expressly for it. Apple is also internally investigating alternative wireless charging methods that can work over greater distances than an inductive connection. Which is something we've seen other companies more openly, more publicly experiment with and work on as well. And it's pretty much still science fiction at this point, with an incredible number of challenges to overcome, because of what I said at the beginning: physics being kind of a jerk.

Categories
Uncategorized

Intel ATTACKS… Intel Macs!

"In a recent call with Intel's Ryan Shrout about the performance of its 11th Generation H-series laptop processors, he took some time out to throw some serious shade Apple's way."

Of course they did. And not just the new M1 Macs but the Intel MacBooks as well. Because it’s not like Intel has anything better to spend their brain-space on right now, do they?

Oh… Wait...

Previously on Intel Apple drive-bys

So, after pretty much everyone in the industry, including Windows stans, dunked on Intel like a donut over the last couple of ill-conceived, ill-advised, ill-executed anti-Apple marketing mess-ups, Intel has to have learned, right? No way could some fragile-ego micro… peon in marcomms or the exec team screw this up again… or worse. Could they?

According to Intel, it's the poor gaming experience of a Mac that makes its own devices superior.

The poor gaming experience. Gotcha. Not the poor computing or creating experience? If you want that, for sure, get a Mac. It's just the poor gaming experience that should drive you to…

Well, not Intel, I guess, because both Xbox and PlayStation went AMD…

This wasn't an idle boast either, Intel has thrown some research behind its feeling.

Now I’m not going to pretend that money wasted on attack marketing would be better spent on getting Intel’s chipset business turned around, because those are totally different departments and staff… but honestly at this point would it really hurt to have everyone in the company aligned and focused on making Intel better? Getting out and helping to push?

And don’t say you can do both at the same time because the current product clearly, clearly shows absolutely otherwise.

Not only are Apple Macs rubbish because they run Apple's own M1 silicon, which is obviously inferior to what Intel used to make,

The M1, which is an ultra-low-power part, has some of the best single-core pure performance in the industry, and the absolute best performance efficiency in the industry, but it's rubbish and obviously inferior to Intel's aggressively hot, egregiously power-hungry chips, which were so delayed they had to invent a tick-tock-tock-tock-optimize-optimize-optimize-optimize cycle while Apple was pushing out new silicon architecture generations and gains year over year? Does Webster's really need to subtweet the definition of rubbish here? Or is that abundantly, hilariously obvious by now?

but over half of the most popular games don't even run on the platform.

That’s totally fair. Apple has never focused on high-end gaming on the Mac, and even when they included higher-end AMD graphics cards, they never optimized them for gaming, like at all. Maybe that’ll change with a higher-end Apple TV or VR headset product, maybe not. But for now, totally fair.

What of emulators, I hear you ask? Yeah, they had that covered too, by showing off Valheim running appallingly through Parallels: trees popping in and out of existence, textures appearing and disappearing. It was not pretty.

Now, normally I wouldn’t be all in-before cliche toxic pedant comment guy, but Parallels is virtualization not emulation. And I point that out only because you can’t virtualize x86 Windows on M1, only Windows for ARM, which has nothing to really do with Intel. Unless we’re, River-Song-Spoilers talking about Intel Macs, in which case you can boot camp into Windows on an Intel Mac for Native performance.

Which just… Aiya. Just Uncle Roger-level Aiya… once again has Intel own-goaling themselves, not as a significant differentiator in the PC space, but as a commodity part where everything significant or differentiated comes from running Windows on top of them.

The red in this graph is actually an Intel-powered MacBook Pro, using a Core i9 9980HK alongside an AMD Radeon Pro 5600M to be exact, compared to the Core i5 11400H alongside a GeForce RTX 3060. Proof, if it were needed, that PCs are just better.

So… 11th generation Intel with latest generation Nvidia runs faster than 9th generation Intel with nearly 2 year old AMD…

Wait, I take it all back. This is actually 5D 9000IQ level marcomms… If what Intel wants to do is show why Apple had no choice but to yote them for custom M-series silicon.

And so the fact that you can't game properly on a Mac means that Apple should probably give up and go home.

Yup, shut it down, return the money to shareholders and Michael Dell. Fun and trillions of dollars while it lasted… If only there were a reality where a small percentage of PC customers — so small that it would be incomprehensibly weird for Apple to be living so rent-free, for so long, in Intel's brain stem, if Intel wasn't either no-wrath-like-a-supplier-scorned or worried the mindshare of M1 all by itself is an existential threat to their stalled silicon business — if only there were a reality where people who preferred the overall experience of the Mac but also wanted higher-end gaming couldn't also get dedicated gaming consoles or rigs… yes, running AMD.

I mean, it’s what a crapton of PC laptop owners have done. And with the price-to-performance ratio of the M1 MacBook Air… that might even be the killer… est combo in the business right now.

To be fair, gaming on a Mac absolutely is rubbish, and it's not something I'd want to do. But I'm still surprised that Intel is so angry about it all. I even asked if Intel was burning its bridges with Apple? The answer was that Apple has been very public about moving to its own silicon and that it is now a competitor. All's fair in love and war I guess.

It is. It totally is. And it’s not like Apple hasn’t taken more than their fair share of pot shots at Android and Windows over the years. They even came close to an Intel slap-back during the M1 iMac intro in April.

Other than that, though, Apple’s been surprisingly… polite about Intel over the years. Bitten their tongue nearly completely off as Intel’s constant delays pushed back the release date of new Mac models by months or more, and their failure to meet process shrink roadmaps had MacBooks running like TDP-max space heaters for nearly half a decade. Even after direct questions during press briefings, hell, between them, Apple marcomms would just smile and keep Intel completely un-thrown from under any buses. For years. And now they pretty much don’t talk about Intel at all.

And… maybe that’s what’s gotten Intel so pissy? Not that Apple walked away, but that Apple never looked back? Never answered their late-night thirst texts? Never reacted at all to their constant self-owning, beyond-cringe-embarrassing attention seeking. That’s like watching the death throes of a once-apex predator, dinosaured down not by an M1 meteor from the heavens but by a shame comet of their own complacency. Maybe that’s what’s got Intel so mad?

And about the only thing I can think of that could help them now is if the entire exec and marcomms team sign up for Brilliant. Stat.

It’s this awesome website and app that teaches algorithms and neural networks, math, science, and computer science, critically — critically! — logic and deduction, but also physics, quantum mechanics, game theory, cryptocurrency, and so much more.

And it’s built on learning while doing and solving real challenges in real-time. With no memorizing long messy formulas or fact sheets, no tests or grades. Just instant feedback that coaches you bit by bit so you can rapidly improve and learn fundamental concepts… literally before you even realize it.

Categories
Uncategorized

M1X MacBook Pro — ALL Leak Bombs!

14- and 16-inch displays. Maybe… mini-LED displays! New, retro-future chic designs. Massively multicore M1X processors. Up to 64GB of RAM and 8TB of SSD. And ports… USB-C, MagSafe, Thunderbolt 3, HDMI, SD… all… the… ports…

It’s what I’ve been waiting for, what so many of us have been waiting for, and it could be coming as soon as next week’s WWDC keynote…

Let’s do this!

Design

Ok, so, ever since the 2018 iPad Pro, Apple’s been going all-in on this iPhone 5-inspired design language, and after assimilating the iPhone 12 and M1 iMac, it sure sounds like it’s coming for the MacBook Pro next. Biggest visual refresh in half a decade. All squared off edges, extruded round rects, and… flat… just flatter than iOS 7 flat. Flat bottom Macs make the rocking chassis calm down flat.

And… that might leave less space inside, but once Apple rips the Intel and AMD out and replaces them with their own custom silicon, they won’t need anywhere near as much space.

Especially if they’re yoting the TouchBar as well. Which, true confessions, I’ll miss as a way of scrubbing tabs and easily accessing shortcuts, but since Apple hasn’t done anything to improve it in the 5 years post-launch, not haptics, not textures, not anything… well, then, it had its chance, and nature and Apple have selected it for extinction. RIP.

Display

As for the bezels, Apple already Thanos-snapped them in half on the big MacBook Pro, taking it from 15 to 16 inches, so it remains to be seen if they’ll whittle the casing down now as well — ratio them even further — or just let it be.

But the 13-inch still needs its snapping, still needs to be pushed out to 14 inches in the same or similar-sized frame.

Likewise, those displays are already Retina high density, P3 wide gamut, True Tone dynamic white balancing, and the 16-inch can even be manually switched between 48 and 60Hz, so you can better edit 24 frames per second movies as opposed to 30 frames per second TV…

Which is good… but could be better… as in ProMotion-adaptive-refresh-rate-on-the-iPad-Pro better… which can ramp all the way up to 120Hz for smoother-than-buttery… just ghee-smooth scrolling, or down to 24Hz for static images.

Would Apple go there? Could Apple go there? No rumors yet. But I want it.
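
The reason adaptive refresh matters for video is divisibility: judder disappears when the panel refresh rate is an integer multiple of the content frame rate, so every frame is held for the same number of refreshes. That's why 48Hz suits 24fps film and 60Hz suits 30fps TV. A tiny sketch of that selection logic, simplified (real display pipelines also weigh power and touch latency):

```python
def pick_refresh_rate(content_fps, available=(24, 30, 48, 60, 120)):
    """Pick the lowest available refresh rate that's an integer
    multiple of the content frame rate, so each frame displays for
    an equal number of refreshes. Falls back to the maximum rate
    if nothing divides evenly."""
    for rate in sorted(available):
        if rate % content_fps == 0:
            return rate
    return max(available)
```

24fps content maps to 24Hz, 30fps to 30Hz, and an awkward rate like 25fps falls back to the fastest panel mode, where the per-frame error is smallest.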

Same for mini-LED, the other edge the iPad Pro currently enjoys.

It provides almost OLED levels of dynamic range and contrast without OLED level issues like off-axis color shifting, where it can look all red or blue at an angle, or lack of consistent brightness, where it can look splotchy when there’s a lot of white on the screen. But, it has its own issues like blooming around bright areas, which is especially wicked obvious in the dark.

Still, mini-LED seems more practical, available, and affordable at MacBook display sizes… So the only question is whether Apple will have it ready in time for this year’s models, or if it’ll be pushed to next?

And whether or not it’ll be on both the 16-inch and 14-inch MacBook Pro? Because on the iPad Pro, mini-LED only went on the bigger, better able to absorb the $100 price bump, 12.9-inch, not the price-held 11-inch.

AV

Same for the sound systems. Right now the 16-inch is the only MacBook Pro with Apple’s top-of-the-line spatial audio speaker system and studio quality mics. Will the 14-inch finally get the same treatment?

And… yeah… those webcams though… Will Apple stick with 720p like the M1 MacBooks, or will they finally find a way to cram a thick-as-in-thicc 1080p sensor like the M1 iMacs into that ultra-thin lid?

Ports

For ports, we’re supposed to be getting HDMI, SD, and don’t-call-it-MagSafe-anymore back.

The HDMI could be 2.0 like last November’s M1 Mac mini… or maybe 2.1 like the just-released A12 Apple TV 4K. Would the difference be a big deal to you and your workflow or setup?

The SD slot should be SDXC (UHS-II), just like the M1 Mac mini.

And MagSafe, which will have to look different than the Doc Ock tentacle version on the iMac, but could function similarly, transiting Ethernet over USB on the power brick.

But the Thunderbolt ports… the Thunderbolt ports… The big hope here is that we’ll be getting more than the two currently supported on all the M1 models. But maybe not as many as the four on the current higher-end Intel MacBook Pros, because HDMI, MagSafe, and SD will already be taking up room.

That’ll be great for people who love those specific, soon-to-be static locked in hardware ports… not so great for those pro-est of pros who already dongle different with CFExpress or just want to plug in all the TB3 displays and accessories.

M1X

Powering these higher-end MacBooks Pro should be a higher-end version of the M1 chip. An M1X, so to speak. Not an M2, because if Apple holds to pattern, M2 would be the next generation of ultra-low power chip for future MacBook Airs and 24-inch iMacs, not the ultra-high performance version we want for these MacBooks Pro and future iMacs Pro.

And I know some people are worried about the numbers, but just remember Apple released an A12Z in the 2020 iPad Pro after releasing an A13 in the 2019 iPhone 11… and it was fine. No one ran head first into a wall and fell down. Twice.

For the last decade, the number in the name has referred to the silicon architecture generation, not the time stamp on the release date. And X or Z has referred to extra cores, which is what it sounds like we’ll be getting here.

Instead of 4 efficiency cores and 4 performance cores like M1, rumor has it there’ll be 2 efficiency cores and 8 performance cores on M1X. Which means, unless Apple feels like they can nudge the frequencies, each core should be the same speed as on M1; there’ll just be twice as many pCores to throw at any given multicore workload.
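
You can rough out what that core swap might mean for multicore throughput with some napkin math. The 0.3 efficiency-core ratio below is an assumed, purely illustrative figure, not a benchmark:

```python
def multicore_throughput(p_cores, e_cores, e_core_ratio=0.3):
    """Crude multicore throughput estimate: each performance core
    counts as 1.0, each efficiency core as some fraction of that
    (0.3 here is an assumed ratio for illustration only)."""
    return p_cores * 1.0 + e_cores * e_core_ratio

m1  = multicore_throughput(p_cores=4, e_cores=4)   # 4 + 1.2 = 5.2
m1x = multicore_throughput(p_cores=8, e_cores=2)   # 8 + 0.6 = 8.6
speedup = m1x / m1                                 # roughly 1.65x
```

So under those assumptions, swapping 2 eCores for 4 pCores buys roughly a 1.65x multicore bump even with zero frequency changes, which is the whole appeal of the rumored configuration.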

Likewise, Instead of 8 graphics cores like M1, rumor has it there’ll be 16 or 32 core options. Whether that’s 16 for the 14-inch and up to 32 for the 16-inch — or both options for both sizes — we’ll have to wait and see.

Same with RAM and SSD. Up to 32 GB and 4 TB on the 14-inch and 64 GB and 8 TB on the 16-inch, like the current comparative Intel boxes, or… all options for all sizes. I’ll be your huckleberry and guess the former but you’d love to see the latter.

And then I’m just super curious to see what kind of active cooling system and battery sizes Apple throws at these new pros to balance out sustained performance and battery life with the increased performance levels and power draw. Because damn.

And instead of 16 neural engine cores, rumor has it there’ll be… more or just better there as well.

Which will be terrific for machine learning, algorithms, neural networks, and all the next-level, next-generation technology you can get in on and help shape the future… with Brilliant.

Also math, science, and computer science, logic and deduction, physics, quantum mechanics, game theory, cryptocurrency, and so much more.

Brilliant is this awesome website and app built on learning while doing and solving real challenges in real-time. With no memorizing long messy formulas or fact sheets, no tests or grades. Just instant feedback that coaches you bit by bit so you can rapidly improve and learn fundamental concepts… literally before you even realize it.