iOS 26! It feels like just last year we were here discussing iOS 18. How time flies.
After a year that saw the debut of Apple Intelligence and the subsequent controversy over the features that it didn’t manage to ship, Apple seems to have taken a different tack with iOS 26. In addition to the expansive new Liquid Glass design that spans all of its platforms, Apple has largely focused on smaller, “quality of life” improvements rather than marquee new features.
That’s not a bad thing, either—these are often the types of things that Apple does best, and which actually make a meaningful impact on the lives of their customers: saving them time waiting on hold on the phone, helping them avoid dealing with spam, and improving their driving experience.
It’s also worth noting that almost all of the iOS 26 features that Apple demoed during its WWDC keynote this year are available in this initial version. (The exception seems to be the Digital ID feature that lets you use your U.S. Passport to make an ID in the Wallet app, which Apple says is still forthcoming in a future update.) Most of it has been there since the earliest beta builds this summer, showing that Apple really is trying not to get out over its own skis.
While this update is probably going to be most remembered for its Liquid Glass overhaul—a redesign that feels more than a little ill-conceived—there are definitely things to like in iOS 26. Let’s dive in.
Liquid Glass half-full or half-empty?
Apple’s new design language, dubbed Liquid Glass, applies across all its platforms, but unsurprisingly, it feels most at home on the iPhone and iPad. That’s in part because of the touch interface; the literal hands-on nature makes the interface feel responsive, more like physical objects that you’re interacting with. For example, dragging the new magnifying loupe across the screen, watching the way it magnifies and distorts text and images as it passes over them—this interaction has always been unique to iOS for practical reasons, but the way it feels here doesn’t have a direct analogue on other platforms.
Perhaps the truest “liquid glass” interaction: as the loupe moves back and forth, it deforms like a water droplet.
Controls now overlay content rather than sitting in designated toolbars or areas of the screen reserved for those controls, and are rendered in transparent glass that refracts and distorts the colors of whatever passes behind it. That’s impressive but also, at times, distracting: sometimes you see the text you’re reading distorted through the UI itself, which is odd. Or, when scrolling past content that goes abruptly from light to dark, the buttons might similarly flip appearance from, say, black icons to white icons in a way that can feel jarring.
Let’s break Liquid Glass up into its component parts and take the latter first.
“Glass” is the overall look of these updates, and it’s everywhere. Transparent, frosted, distorting. In some places it looks quite cool, such as in the edge distortion when you’re swiping up on the lock screen. But elsewhere, it seems to me that glass may not be quite the right material for the job. The Glass House might be architecturally impressive, but it’s not particularly practical.
It’s also a definite philosophical choice, and one that’s going to engender some criticism—much of it well-deserved. Apple has argued that it’s about getting controls out of the way, but is that really what’s happening here? It’s hard to argue that having a transparent button sitting right on top of your email is helping that email be more prominent. To take this argument to its logical conclusion, why is the keyboard not fully transparent glass over our content?
Apple has designed extensive rules to try and minimize some of the most distracting impacts of Liquid Glass. For example, if you’re viewing black-on-white content and suddenly scroll past a darker image, the UI widgets will only flip from light to dark mode based on the speed of your scrolling: scroll past it quickly and they won’t flip; it’s only if you slow down or stop with the widgets over the image that they’ll shift into dark mode.
While clever, this also feels remarkably over-engineered to work around the fundamental nature of these devices. It’s a little reminiscent of the old apocryphal story about how the American space industry spent years and millions of dollars designing a pen that could write in space while the Soviets used a pencil. Perhaps they should have used a design that doesn’t require adjusting its look on the fly.
If the “Glass” is about the new look, the “Liquid” part of the moniker is more about the feel of this design. The way the aforementioned loupe moves, or how controls animate—dividing up and reintegrating, in a fashion that’s clearly building on what Apple first did with the Dynamic Island. Tap the Select button in the Library view in Photos, and you’ll watch the buttons bounce and morph into new controls. It’s eye-catching, to be sure, and does do a nice job of subtly making you aware of how the UI is changing, but it’s applied inconsistently—why does going into a mailbox in Mail show the Edit button morph into Select and … buttons, but tapping the Select button here does…nothing?—and after the fifteenth time, the animations are less novel and more distracting. It brings to mind the Genie effect of the earliest days of Mac OS X (and still there to this day!): impressive to see what the device can do, but to be turned off at the earliest opportunity.
No doubt Liquid Glass will continue to be refined. Some parts of it have already shifted during this summer’s betas: Control Center seems in better shape in the final shipping version than it did earlier on. But the biggest challenge is that Apple is never going to be able to test transparency with every single possible background, and as hard as it’s worked to design systems that can adapt on the fly, the end result is often distracting and sometimes unsightly.

The redesign is more than skin deep, however. Apple has rethought the way some of its most fundamental interactions work. For example, the increasingly long horizontal popover menus that hid options behind an interminable scroll have morphed into a dual-stage design. Tapping and holding on the screen still brings up a popover with a few common options, but it no longer makes you scroll; instead, there’s an arrow indicating more options. Tap that, and you’ll get a big pop-up panel of all the available commands in a much easier-to-read and use format. As someone who frequently finds myself swiping through a very long list to find the one command I want (and somehow, it’s always the last one), this is a tangible improvement.
It would be nice if that first menu were more customizable, though. Apple does try to detect contextually what items should be at the top of the list, but it would be nice to be able to pin something to the top or add a custom shortcut to the list. And although the menu varies from app to app, some of the organizational choices are puzzling. In the Safari screenshot above, I’m not sure why Writing Tools is visible. After all, I’m looking at uneditable text on a web page. Am I rewriting the web now? This feels less like a choice focused on user needs and more like a reflexive promotion for Apple’s AI tools.
Other system-level features have been expanded as well. For example, while you used to be able to swipe from the left side of the screen to go back or up a level in a hierarchy, that gesture now works when swiping anywhere on the screen, making it both more discoverable and easier to use.
You paid for the whole lock screen, but you’ll only need the edge!
The design changes also extend to the look and feel of app icons, which are now built with transparent layers. Interestingly, if developers adopted the design changes to better support iOS 18’s “tinted” theme, their icons already get some of the benefits baked in. That tinted theme has been expanded with both light and dark options, and there’s also a new clear theme that turns all your app icons ghostly, which is a great way of testing your muscle memory for where you put your apps. I’m not sure it’s for me—everything looks a bit too same-y—but it is definitely a look, and I’m equally certain there will be folks who love it.
As with any change this sweeping, it’s always going to take some time to adjust. There are some who will decry this as change for change’s sake, but as undesirable as that might be, the countervailing argument is that you shouldn’t keep things the same just because it’s the way they’ve always been. My experience with Liquid Glass has had its ups and downs, ranging from interactions that feel interesting and dynamic to those that are downright frustrating. Much of design is highly subjective, and it will be interesting to see how the general public reacts to something that so far has been the province of early adopters and developers.
Lock step
After the last few years of Lock Screen customization options, this year’s additions are more muted, and mostly in step with updating the look for Liquid Glass.
The biggest addition—literally—is the new clock, which can expand to fit the empty space on your screen. If you have a rotating set of lock screen photos, it will dynamically adjust for each one, trying not to obscure people’s faces; in the customization screen, you can manually adjust the size of the clock, but I haven’t been able to figure out what that actually does in the photo shuffle mode—it still changes dynamically.
I don’t always love the way that the big clock interacts with pictures of people. I’m definitely not the best photographer, so I’m not always framing my shots ideally, and the photo shuffle algorithm seems agnostic to how good my pictures are, so I definitely still end up with the clock running into people’s faces, or uneven backgrounds that make the clock’s glass effect look a little more muddled. Is it possible to make a good-looking lock screen? Absolutely. But if you, like me, leave it to the system, then the results are mixed.

I’m also happy to say that one of my favorite features of last year, the rainbow-hued clock available in some lock screen styles like Live Photo, still exists; you just have to change the clock style to solid rather than glass in order to see it.
There’s also now an option to move the widgets that used to sit below the clock all the way to the bottom of the screen, right above the notification stack, so as to not block the subject of your photo. I kind of prefer this location—I find it easier to tap a widget and open the app if I want, and the information they display doesn’t get lost as easily.
Your Lock Screen photos can also be more dynamic now, with the addition of spatial scenes. That’s a feature imported from the Apple Vision Pro where iOS will apply a three-dimensional effect to an image, animating depth as you move the phone around. How effective that is varies from photo to photo, although it feels less compelling here than viewing true spatial scenes on a Vision Pro; the animation of the spatial versions is sometimes a little jerky, and some people with motion sensitivity might find them off-putting. As with the lock screen images, Apple’s attempts to identify what makes a “good” spatial scene can be hit or miss: I’ve ended up with some odd results at times.
Speaking of images that move, the lock screen also now has an animated artwork option for music apps—note that I said “apps” not “the Music app” since it’s an API available to developers of third-party apps. But it will need adoption from the producers of albums in order to take full effect. When it shows up, it takes over the entire lock screen rather than being constrained within a little album square. It’s an interesting approach, although one that you may not notice depending on how often you actually visit the lock screen while music is playing. So, while it’s a cool idea, I’m not sure it does much for me. Maybe it’s time to commission some animated artwork for the Six Colors podcast?
Point and shoot
I’d venture a guess that Camera is the most used app on the iPhone, though I’ve got no real numbers to back that up. But given the amount of time Apple has spent upgrading the camera hardware on the iPhone over the years, I feel pretty confident in my assessment.
As a result, redesigning the Camera app—hot on the heels of last year’s redesign of Photos—is a bold choice. But it’s not surprising that the company’s alterations here are focused on the minimal, reinforcing the way that most people already use the app. (And if anybody’s got the metrics to know how people use its apps, it’s obviously Apple.)
For example, controls for more advanced features like aperture, exposure, and photographic styles are now buried in a separate menu, available by tapping a button at the top of the screen or by swiping up from the bottom. Given that I’ve definitely ended up in these controls by accident over the years—and I suspect I’m not alone—that’s not a bad thing.

Likewise, what used to be an at times overwhelming carousel of modes—panorama, portrait, photo, video, slo-mo, time lapse, etc.—has now been visually reduced, by default, to just the two most popular: photo and video. The others are still there if you scroll left or right, but you’re less likely to accidentally find yourself shooting spatial video at a lower resolution when you don’t mean to. Similarly, those resolution and format choices are also now squirreled away behind a separate button, there if you need them without being omnipresent.
The redesign reflects the fact that most people want to get in, snap a picture or shoot a video, and get out. Not to mention that Apple has spent a lot of time designing its phones so that they take great photos without having to tweak those details. Those advanced features are still there—and, arguably, more accessible using something like the Camera Control button on the latest iPhones—and for those who long for more than Apple’s Camera app offers, there is an assortment of popular and powerful third-party camera apps to fill in the gaps.
The counterargument, of course, is that by hiding those features away, they are less discoverable. This is the eternal battle in interfaces, especially in someplace as constrained as an iPhone screen. In other places, Apple has done its part to pop up hints about features you might not see at first glance, including here in the Camera app. Personally, I think this redesign walks a solid line—the new interface is not so different from the old that I had any trouble with it, and I appreciate that there are fewer distractions, though I do wonder if it might be beneficial to offer an option to pin a frequently used mode or two for those who want to get to panoramas or portraits, for example, without trying to remember where to swipe.
There are also a couple of AirPods-related features in Camera: first, if you’ve got the latest models with H2 chips, the microphones should be improved. Apple touts them as “studio quality”, a meaningless qualifier that could mean anything from “suitable for a recording studio” to “you can use these in your studio apartment,” but at least it doesn’t sound like you’re in a wind tunnel anymore. In one of my test calls, my wife was genuinely impressed when I asked, at the end, how I’d sounded. “I wouldn’t have known you were on your AirPods if you hadn’t told me.”
And you can now use the AirPods’ stem controls to take a picture or start recording a video: handy for people using a selfie stick or tripod, or even just a quick way to snap a group photo (as long as you don’t mind having an AirPod in your ear in said photo). Bear in mind, this is a feature you’ll have to turn on in Settings under your AirPods, though it does let you choose between a simple press or a press-and-hold.
Calling cards
An update to the Phone app? Are we sure iOS 26 doesn’t stand for 1926? People knock the Phone app, but, well, I still make phone calls. In addition to a couple of handy features, there are also some substantial design changes afoot.

A redesign strikes again! The new Unified view pins your favorites to the top, then shows you your recent calls, missed calls, and voicemails all in a single very long list on the Calls tab, with separate tabs for Contacts and the Keypad. Some might not care for this approach, but I find it kind of a no-brainer. It did encourage me to pare my Favorites list down a bit to the one line of people I actually call as well as finally update their contact pictures to the more recent poster format. I don’t mind having voicemails mixed in; I don’t get very many. But if you hate this new interface, don’t worry: Apple will let you switch back.
Unquestionably good is the new set of Filtering features available in the menu at the top right. By default, this includes options to view just Missed calls or Voicemails, but there’s also now, praise the heavens, a Spam section for calls that are recognized as such. Apple’s using a combination of carrier tagging (those calls that you’ve seen flagged as “Spam Risk”) and its own analysis. You can manually mark a call as spam by swiping left on it in your recents list and choosing Block and Report Spam.
The real challenge, as always, is the calls that fall in between your contacts and out-and-out spam. For this there’s the new Screen Unknown Callers feature. You might remember that Apple previously added a Silence Unknown Callers feature in iOS 13 that would mute calls from numbers that weren’t recognized—with the challenge that if you got a call from a doctor’s office, tradesperson, or even food delivery, you might not see it. That was followed by Live Voicemail in iOS 17, which helped somewhat mitigate the issue, but Screen Unknown Callers goes a step further: when activated, which you can do in the Phone app or in Settings > Phone, callers not in your contacts will be greeted with a robotic message and asked to provide more information before the call rings through. You can also choose to leave unknown calls totally silenced, or turn screening off entirely to have all calls ring your phone.
There’s a separate but connected feature in iOS 26 called Call Filtering. Once you turn this on, you’ll see an Unknown Callers category in the filter list in Phone. From there, you can choose to mark the numbers as known, at which point they will ring through—without having to be added to your contact list, which is nice. However, I’m not sure how you move a number back to “unknown” if you accidentally mark it as known—you can delete it from the list or block it, but I’m not sure what to do if you want to simply move it back to the “Unknown Callers” section. You can also choose to have calls detected as spam by your carrier simply not ring at all, which seems like an obvious win.
One criticism: my filter icon has a badge listing the number of unknown calls, a rather high 324. There’s no option to mark them all as read, meaning I’m apparently stuck with that badge forever? I don’t love it.
Overall, I’ve got mixed feelings about the Screen Unknown Callers feature. On the one hand, it will undeniably help weed out potentially spam calls. On the other, some part of my upbringing makes me feel embarrassed about the idea that someone—especially a likely underpaid person in a service industry—is going to have to justify their call to a robot. I’ve gotten calls from AI assistants from my dentist’s office recently, and frankly…I just hang up. I’m not going to spend my time chatting with a computer, and I don’t blame anybody else for feeling the same. That said, I have turned it on, though I haven’t actually encountered it much in use.

Along similar lines, Apple’s also added a feature called Hold Assist that automates the oft-annoying task of waiting on hold. I did get a chance to try this out during the beta period, and it worked fine except for one caveat. The idea behind the feature is that when you’re put on hold with some cheesy hold music or deafening silence, you can trigger this feature and be notified when somebody comes back on the line.
In my experience, however, one problem I encountered was that it registered the occasional recorded message while I was on hold with the Massachusetts Department of Revenue—”Your call is important to us!” or “Did you know you can go to our website?”—as a human coming back, and notified me, leaving me to scramble for the phone only to find that I wasn’t talking to a live person after all. My understanding is that the feature should be able to distinguish between a regular recorded message and a human, but that was not my experience at the time—I haven’t yet had the need to put the feature through its paces in the shipping build.
Just browsing

While Safari may not have gotten quite the expansive overhaul of some of Apple’s other built-in apps this year, it’s still gotten a bit of a revamp. And as another of Apple’s most popular apps, even small changes here have a tendency to reverberate.
Apple’s taken a variety of stabs at UI minimalism in Safari over the years, both on iOS and macOS. Often those first, more substantial changes get dialed back. In iOS 26, these changes aren’t quite as radical, but they’re more than just a coat of Liquid Glass. Gone is the longstanding toolbar with its back/forward arrows, Share icon, bookmarks, and tab menus beneath a location bar. In its place, by default, is a more reduced UI with a back button, location bar, and the now seemingly ubiquitous “More” button denoted with three dots.
You’ll find many of the previous controls under that More button, including both bookmark and tab management, as well as Share. But some controls are still accessed by tapping on the icon in the location bar—including Reader mode, if available, translation, extension management, and so on—and others are instead squirreled away under a long press on the location bar, including closing tabs, tab groups, and…another Share button. The button so nice they included it twice!
As with the Phone app you can, if you so wish, revert back to classic Safari—either with the location bar at the top or bottom. In a few weeks of usage, I’ve elected to stay with Apple’s new design, though I still struggle to remember whether the control I want is accessed via the location bar or the More button. Or…both? At least some common gestures, like swiping left and right on the location bar to switch tabs or flicking upwards on the URL to see your tab overview, have remained.
I never really felt like the old toolbar style was getting in the way of my content, so I’m not sure this change is anything more than an attempt to mix things up. And given that the UI still hovers a little bit above the bottom of the screen, causing any text or content behind it to fade, I’m not sure that it actually adds any more usable real estate.
Overall, I’ve largely gotten used to the look, though at times the effects of a non-uniform website background on Liquid Glass can lead to disparate effects like one pop-up menu being a light color while another is dark.
Beyond the design changes, most of Safari’s other updates are under the hood. One notable change: developers of web extensions don’t need to use Xcode or even a Mac anymore; they can just upload their extensions to the App Store in a ZIP file. Hopefully, that’s another step closer to some of the myriad extensions out there making their way to Safari, though the jury’s still out. And any web page can be opened as a full web app from the home screen now, rather than just essentially being a bookmark.
Let’s get visual…visual
Apple Intelligence may have been the big news in iOS 18, but this year its new features are somewhat more muted. While the capabilities that didn’t end up shipping in 2025—Personal Context and a smarter Siri among them—are still expected to arrive next year, with this initial iOS 26 release Apple has focused on some smaller capabilities, like integrating ChatGPT into Image Playground, the ability to combine two emoji in Genmoji, and summaries of voicemails. It’s also brought back summaries for notifications with more granular controls over what kinds of apps you want it to apply to—plus more stringent warnings about the fact that said notifications may be inaccurate, which certainly raises questions about their ultimate utility. Having used those for some time, I can’t say if they’re significantly improved—we’ll have to wait until the next brouhaha.
Perhaps the most significant of these Apple Intelligence-powered features in iOS 26, though, is an expansion of the Visual Intelligence feature launched last year. Instead of being confined to pictures taken with the camera, Visual Intelligence in iOS 26 now offers the same capabilities with screenshots. In fact, the feature is built right into the existing screenshot interface, so now whenever you squeeze the sleep/wake button and volume up button to take a picture of what’s on your screen, you’ll see two new controls at the bottom: Ask and Image Search.

The former lets you ask ChatGPT questions about the image, while the latter brings up Google results. You can even highlight a portion of the image by running your finger over it if you only want to search on a subset of the picture. (I’ve found myself accidentally triggering that feature on occasion while trying to crop an image, which is annoying.)
This is all a shot over the bow of Google’s longstanding Lens feature, with a dash of AI thrown in. I’ve barely used Visual Intelligence on my iPhone 16 Pro since its debut; to date, the screenshot integration hasn’t been enough to get me to change my ways, but it does open up some new possibilities for extracting information from your screen, in the same way that Live Text has done.
Speaking of Live Text, in case answers from two different tech giants aren’t enough, Apple is also using a bit of that same machine learning technology to pull out relevant details from the image, whether it be a URL or a calendar event, and present them in a little floating lozenge at the bottom of the screen. That can be handy, though it’s also at the whims of whatever information is captured in the screenshot.
It is a little odd that Visual Intelligence is offered in two different places with two different interfaces, but given there is a distinction between screenshots and taking photos, perhaps that’s not as jarring as it seems at first blush.
Bits and bobs
As with any major platform update, there’s simply too much to cover absolutely everything. Here, then, are a few other features that I’ve noticed in my time with iOS 26.
The Battery section of Settings has been redone, providing a quick glance at when you last charged or, if your phone is plugged in, how long until it’s charged. The main graph now compares your battery usage to your average daily usage—including in the app-by-app breakdown—rather than providing the somewhat less useful hour-over-hour view. I do find some of the visuals a little silly: an orange exclamation point letting me know that I’ve used an app more than usual when it’s an app I only occasionally launch seems overly alarmist.
There’s also a new Adaptive Power mode that supposedly helps prolong battery life if you’re using more than usual by toning down some things like display brightness. I’ve had this active for some time, and I don’t know that I’ve seen a marked difference.

As on the iPad, you can record your audio locally on the iPhone with the new Local Capture feature, whether it’s via the built-in mic, AirPods, or a USB-C microphone (not to mention a new audio input picker that lets you choose which mic you want to use). While it still needs controls for audio input volume—some mics, including my ATR-2100x, which I would be most likely to use with this feature, are distorted because they’re simply too loud—this does make it feasible to record a podcast on your iPhone. I honestly never thought I’d see the day, but it’s here. Mostly.
AirPods can also now have a Pause Media When Falling Asleep option. I like the idea in principle, but I find it a bit too eager. I wish there were a way to adjust the threshold or to opt out of certain apps. For example, when I’m traveling I sometimes sleep with white noise (usually from Dark Noise), and I find that when the audio is paused it wakes me up. I’m not sure what algorithm it’s using to figure out when I’m asleep, but it’s not right for me, I guess.
Notes may not support writing in native Markdown, but it does now let you import and export from the format. That includes any images that you’ve embedded in the note, which is handy. Despite being a Markdown fan, I’m not sure I’m likely to use this feature…I like Markdown because I want to write in it for the web, not have to take the extra step to export. But it’s nice that there’s at least an easy and accessible way to get your data in and out of the Notes app.

The Passwords app adds a history of changes you’ve made to passwords (only, of course, for changes since installing iOS 26). That’s a nice feature because I have definitely ended up not realizing I’ve already got a password and then reset it. In fact, in one of my favorite moves, it will even tell you when it created a password for a site, even if that password may not have actually gotten submitted—something that’s happened to me more than a few times.
Photos gets its toolbar back, mostly. There are now Library and Collections views rather than the unified version we got in iOS 18. Personally, I miss the ability to hide sections of the Collections view, but at least it maintains the rest of its customization options. It also gets the aforementioned feature to turn images into Spatial Scenes, though as above, your mileage will probably vary depending on how cool or off-putting you think that feature is.
Remember how even iTunes had the ability to crossfade between songs maybe twenty years ago? Well, Music‘s new AutoMix feature turns that up to eleven by trying to actually find the perfect DJ-style moment to segue into the next song. This feature is downright odd and, I think, way too aggressive. I’ve heard it slowing down a song for like thirty seconds to match beats with the following song. As a novelty, it’s kind of fascinating, but I think a lot of people are going to be annoyed with it tweaking their favorite songs. There’s also a new feature that lets you pin favorites to the top of your library, whether it’s a song, album, playlist, or artist; that’s a welcome addition to the app.
Can’t remember the name of that café you stopped at on your most recent road trip? If you opt into the new Visited Places feature in Maps, you can search or scroll through places you visited—even by category or city. All the information is stored privately and securely, so nobody else can see it, not even Apple, and you can easily remove items from the list (or correct erroneous ones). It’s also a great way to retroactively build a guide of places for a trip you’ve taken. There’s also a Preferred Routes feature that supposedly learns about how you make regular trips, but as someone who works from home, I don’t expect to get too much use out of this.
I don’t generally use alarms, so an adjustable snooze time in the Clock app doesn’t really do much for me, but I know some people will be excited, so here you go. However, this does come with one interface criticism about the redesigned alarm display on the lock screen, which now has equally sized Stop and Snooze buttons, leading to the possibility of sleep-addled people hitting the wrong button.

I order plenty of stuff on the internet, so I’ve used Wallet‘s new Apple Intelligence-powered tracking of orders a few times now. I’ve had mixed results with it automatically adding orders, but I do appreciate that there’s a little banner that appears in your mail message if an order is detected, and prompts you to add it to Wallet. Beyond that, though, it feels kind of half-done. For one thing, it seems to include items like orders for picking up food and digital orders, which is comprehensive, if not always useful. More annoyingly, though, while Wallet does show a carrier and tracking number in the entry in Wallet, you can’t tap through to see actual tracking details. Honestly, it seems a little disingenuous to call it “Order Tracking.” I’m not really sure what the point is. This probably won’t lure anybody away from their third-party tracking apps…yet.
Hallelujah, you can now select the text inside a bubble in Messages. I know it’s not the flashiest improvement, but it’s always seemed absurd that this was an all-or-nothing proposition. I mean, you can copy just some text out of an image these days, for heaven’s sake. A small but very meaningful improvement.
Last, but hardly least, CarPlay gets a handful of new features, including the new Liquid Glass design, a smaller call notification, tapbacks in Messages, and widgets. I really want to like widgets, but two things hold me back: first, my CarPlay screen is very small and can only show one widget at a time; second, I've struggled to figure out which widgets are actually useful while driving. Most of the time I really do just want my map and whatever media is playing. Maybe on a bigger screen they'd be more compelling. I was a little worried that tapbacks would encourage too much interaction with the screen, but having now used them, I find them pretty seamless, and certainly easier than trying to reply.
A qualified update
Some years have bigger updates, some more modest. iOS 26 definitely feels like the former, especially given the new design, but peel that away and you have mostly a bunch of smaller updates. Having used it for a few months, I don’t think I’m in “ringing endorsement” territory for Liquid Glass—there are far too many places where things are just a bit off, like you’re seeing your phone through a fun-house mirror. That said, it’s also not a phonepocalypse to run from in fear. I’ve largely acclimated to the changes, and I expect most users will as well. It’s pretty rare for iPhone owners not to update their phones, and I honestly just don’t expect a huge rash of people sticking with iOS 18 because they’re that upset.
The other question going into this year’s update was Apple Intelligence, an issue that Apple mostly seems to have sidestepped in iOS 26. Yes, there are new Shortcuts actions and the expansion of Visual Intelligence, and yes, the company seems determined to remind you that its AI-powered features are there, but if you don’t want them, you can more or less ignore them and just go on about your life.
As always, some features are going to require buy-in from third parties to really show off what they can do. Speaking of AI, the ability for developers to integrate Apple's own models into their apps has tremendous potential, but we're not going to get a great picture of that for a little while yet. Likewise, some of Liquid Glass's ultimate impact will depend on how many other apps embrace it—and given how many of them have their own cross-platform branding and style (WhatsApp, Netflix, Slack, Discord, etc.)—it may not have quite the far-reaching effect that Apple hopes.
Overall, iOS 26's quality-of-life improvements are what's likely to resonate most, and there's something here for most of us. Who wants to be stuck on hold, or get spam calls and messages, after all? How about the ability to talk with people in different languages, or remember that place you went to six months ago? Who isn't glad to have the ability to record a podcast on their phone? (Is that just me? It might be just me.)
And it’s important to remember that one of the most essential qualities of software is its dynamism: like life, its only constant is change. Unlike the phone you buy, which will look and feel the way it does until you move on to the next one, the software experience is malleable. Ultimately, it’s what our devices can enable us to do, how they make our lives easier, that will be more important to our daily lives than what color buttons are (or aren’t).
