
While Apple was quick to punt major Siri improvements to next year, the company still announced a lot at WWDC 2025. From a whole new look across all Apple devices to full-on windowing on iPads, it was a fast-paced event with a lot to cover.
Here’s a roundup of all the major announcements – please take our poll on your favorites, and share your reactions in the comments …
Liquid Glass
All Apple devices get a new look and feel, through a design language Apple calls Liquid Glass.
Apple today previewed a beautiful new software design that makes apps and system experiences more expressive and delightful while being instantly familiar. It’s crafted with a new material called Liquid Glass. This translucent material reflects and refracts its surroundings, while dynamically transforming to help bring greater focus to content, delivering a new level of vitality across controls, navigation, app icons, widgets, and more. For the very first time, the new design extends across platforms — iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, and tvOS 26 — to establish even more harmony while maintaining the distinct qualities that make each unique […]
Controls, toolbars, and navigation within apps have been redesigned. Previously configured for rectangular displays, they now fit perfectly concentric with the rounded corners of modern hardware and app windows — establishing greater harmony between hardware, software, and content. Controls are crafted out of Liquid Glass and act as a distinct functional layer that sits above apps. They give way to content and dynamically morph as users need more options or move between different parts of an app. And with thoughtful groupings, it’s easier for users to find the controls they need.
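Standard system controls pick up the new material automatically once apps are rebuilt against the new SDKs, and custom views can opt in from SwiftUI. Here’s a minimal sketch, assuming the glassEffect modifier Apple previewed for developers at WWDC25 (names and defaults could shift before release):

```swift
import SwiftUI

// A minimal sketch of adopting Liquid Glass in a custom SwiftUI control.
// Assumes the `glassEffect` modifier Apple previewed at WWDC25; exact
// defaults and availability may change before release.
struct PlaybackBar: View {
    var body: some View {
        HStack(spacing: 20) {
            Button("Back", systemImage: "backward.fill") {}
            Button("Play", systemImage: "play.fill") {}
            Button("Forward", systemImage: "forward.fill") {}
        }
        .labelStyle(.iconOnly)
        .padding()
        .glassEffect() // render the grouped controls on a Liquid Glass surface
    }
}
```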
Live translation
Messages, FaceTime, and Phone apps all get a new live translation feature.
For those moments when a language barrier gets in the way, Live Translation can help users communicate across languages when messaging or speaking. The experience is integrated into Messages, FaceTime, and Phone, and enabled by Apple-built models that run entirely on device, so users’ personal conversations stay personal.
In Messages, Live Translation can automatically translate messages. If a user is making plans with new friends while traveling abroad, their message can be translated as they type, delivered in the recipient’s preferred language, and when they get a response, each message can be instantly translated. On FaceTime calls, a user can follow along with translated live captions while still hearing the speaker’s voice. And when on a phone call, the translation is spoken aloud throughout the conversation.
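Live Translation in Messages, FaceTime, and Phone is a system feature rather than a developer API, but for a flavor of what on-device translation looks like in code, here’s a sketch using the separate Translation framework Apple shipped in iOS 18 (the strings and languages are placeholders, and this is not the Live Translation API itself):

```swift
import SwiftUI
import Translation

// On-device translation via the Translation framework (iOS 18+). This is
// not Live Translation itself, just an illustration of the same kind of
// on-device model doing the work.
struct TranslateDemo: View {
    @State private var translated = ""

    var body: some View {
        Text(translated)
            .translationTask(
                source: Locale.Language(identifier: "en"),
                target: Locale.Language(identifier: "es")
            ) { session in
                // Runs once the language assets are available on device.
                if let response = try? await session.translate("Where should we meet?") {
                    translated = response.targetText
                }
            }
    }
}
```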
On-screen visual intelligence
Visual Intelligence previously worked through the iPhone camera, but now also works with on-screen content.
Building on Apple Intelligence, visual intelligence extends to a user’s iPhone screen so they can search and take action on anything they’re viewing across their apps.
Visual intelligence already helps users learn about objects and places around them using their iPhone camera, and it now enables users to do more, faster, with the content on their iPhone screen. Users can ask ChatGPT questions about what they’re looking at on their screen to learn more, as well as search Google, Etsy, or other supported apps to find similar images and products. If there’s an object a user is especially interested in, like a lamp, they can highlight it to search for that specific item or similar objects online.
Visual intelligence also recognizes when a user is looking at an event and suggests adding it to their calendar. Apple Intelligence then extracts the date, time, and location to prepopulate these key details into an event.
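The detection itself isn’t exposed to third-party apps, but conceptually the final step is just a prefilled calendar entry. As an illustration only, here’s that step written with the long-standing EventKit framework, with the extracted values passed in as hypothetical parameters:

```swift
import EventKit

// Illustrative sketch: create a calendar event from details a feature like
// visual intelligence might extract (title, start/end time, location).
func addDetectedEvent(title: String, start: Date, end: Date, location: String) async throws {
    let store = EKEventStore()
    // iOS 17+ write-only calendar access prompt.
    guard try await store.requestWriteOnlyAccessToEvents() else { return }

    let event = EKEvent(eventStore: store)
    event.title = title
    event.startDate = start
    event.endDate = end
    event.location = location
    event.calendar = store.defaultCalendarForNewEvents
    try store.save(event, span: .thisEvent)
}
```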
Workout Buddy on the Apple Watch
If you’re exercising alone, the Apple Watch now aims to keep you motivated.
Workout Buddy is a first-of-its-kind workout experience on Apple Watch with Apple Intelligence that incorporates a user’s workout data and fitness history to generate personalized, motivational insights during their session.
To offer meaningful inspiration in real time, Workout Buddy analyzes data from a user’s current workout along with their fitness history, based on data like heart rate, pace, distance, Activity rings, personal fitness milestones, and more. A new text-to-speech model then translates insights into a dynamic generative voice built using voice data from Fitness+ trainers, so it has the right energy, style, and tone for a workout. Workout Buddy processes this data privately and securely with Apple Intelligence.
Foundation Models for developers
Developers now get access to on-device Apple Intelligence features to incorporate into their own apps.
Apple is opening up access for any app to tap directly into the on-device foundation model at the core of Apple Intelligence.
With the Foundation Models framework, app developers will be able to build on Apple Intelligence to bring users new experiences that are intelligent, available when they’re offline, and that protect their privacy, using AI inference that is free of cost. For example, an education app can use the on-device model to generate a personalized quiz from a user’s notes, without any cloud API costs, or an outdoors app can add natural language search capabilities that work even when the user is offline.
The framework has native support for Swift, so app developers can easily access the Apple Intelligence model with as few as three lines of code.
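Judging by Apple’s description, the basic call really is that compact. Here’s a sketch of the quiz example, using the session API from Apple’s WWDC25 materials (specifics may change before release):

```swift
import FoundationModels

// Ask the on-device Apple Intelligence model to turn a user's notes into a
// quiz. No network, no API key, no per-request cost.
func quiz(from notes: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Write a three-question quiz based on these notes:\n\(notes)"
    )
    return response.content
}
```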
Apple Intelligence-infused Shortcuts
Shortcuts can now take advantage of Apple Intelligence capabilities.
Shortcuts are now more powerful and intelligent than ever. Users can tap into intelligent actions, a whole new set of shortcuts enabled by Apple Intelligence. Users will see dedicated actions for features like summarizing text with Writing Tools or creating images with Image Playground.
Now users will be able to tap directly into Apple Intelligence models, either on-device or with Private Cloud Compute, to generate responses that feed into the rest of their shortcut, maintaining the privacy of information used in the shortcut. For example, a student can build a shortcut that uses the Apple Intelligence model to compare an audio transcription of a class lecture to the notes they took, and add any key points they may have missed. Users can also choose to tap into ChatGPT to provide responses that feed into their shortcut.
iPadOS 26 with windowing and more
Apple showed the iPad some major love. What started as a single-tasking device gradually gained multi-tasking capabilities, and the company has now given it what everyone was asking for – full windowing support.
iPadOS 26 introduces powerful new features that help users work with, control, organize, and switch between app windows — all while maintaining the immediacy and simplicity that iPad users expect. The new windowing system lets users fluidly resize app windows, place them exactly where they want, and open even more windows at once.
Familiar window controls allow users to seamlessly close, minimize, resize, or tile their windows. Window tiling is designed for the unique capabilities of iPad, and enables users to arrange their windows with a simple flick. If a user previously resized an app, it opens back in the exact same size and position when they open it again. With Exposé, users can quickly see all their open windows spread out, helping them easily switch to the one they need. The new windowing system works great with Stage Manager for those who want to group their windows into distinct stages, and with an external display for those who want even more space to work across their apps.
macOS 26 with major Continuity and Spotlight enhancements
macOS 26 of course gets a Liquid Glass makeover, but Apple also made major improvements to Continuity and Spotlight.
The Phone app arrives on Mac thanks to Continuity, which lets users relay cellular calls from their nearby iPhone. The Phone app on Mac has the familiar features of the Phone app on iPhone — including Recents, Favorites, and Voicemails — and the latest updates like Call Screening and Hold Assist. Call Screening automatically answers calls from unknown numbers and asks the caller for information so a user can decide whether or not to answer. And when a user is stuck on hold, Hold Assist allows them to keep their spot in line while they wait for a live agent, so users can continue working on their Mac.
Live Activities from a user’s nearby iPhone will now appear in the menu bar on their Mac so they can stay on top of things happening in real time, like an upcoming Uber ride, flight, or live sports score. When clicking on a Live Activity, the app opens in iPhone Mirroring to show more information so users can take action right from their Mac.
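Notably, those Live Activities originate from iPhone apps using the existing ActivityKit framework; nothing new should be required for them to surface on the Mac. A minimal sketch of starting one (the ride-sharing attributes here are hypothetical):

```swift
import ActivityKit

// Hypothetical attributes for a ride Live Activity. The iPhone app defines
// and starts this; macOS 26 mirrors the running activity into the menu bar.
struct RideAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var driverName: String
        var etaMinutes: Int
    }
    var destination: String
}

func startRideActivity() throws {
    let attributes = RideAttributes(destination: "Airport")
    let state = RideAttributes.ContentState(driverName: "Alex", etaMinutes: 7)
    // Requires Live Activities to be enabled in the app's Info.plist.
    _ = try Activity.request(
        attributes: attributes,
        content: .init(state: state, staleDate: nil)
    )
}
```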
Spotlight, the central place to search for things on Mac, makes finding what users are looking for easier than ever, and provides users with all-new ways to take action. During a search, all results — including files, folders, events, apps, messages, and more — are now listed together and ranked intelligently based on relevance to the user. New filtering options rapidly narrow searches to exactly what a user is looking for, like PDFs or Mail messages. Spotlight can also surface results for documents stored on third-party cloud drives. And when a user doesn’t know exactly what they’re searching for, Spotlight’s new browse views make it easy to scan through their apps, files, clipboard history, and more.
Users can now take hundreds of actions directly from Spotlight — like sending an email, creating a note, or playing a podcast — without jumping between apps. Users can take actions from both Apple apps and apps built by developers, because any app can provide actions to Spotlight using the App Intents API. Users can also run shortcuts and perform actions from the menu bar in the app they’re currently working in, all without lifting their hands off the keyboard. Spotlight learns from users’ routines across the system and surfaces personalized actions, such as sending a message to a colleague a user regularly talks to. Additionally, Spotlight introduces quick keys, which are short strings of characters that get users right to the action they’re looking for.
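The App Intents hook is existing public API. Here’s a minimal sketch of an app exposing a “create note” action that Spotlight (and Shortcuts) can run; NoteStore is a stand-in for the app’s own model layer:

```swift
import AppIntents

// Stand-in for the app's real data layer.
final class NoteStore {
    static let shared = NoteStore()
    private(set) var notes: [String] = []
    func add(_ text: String) { notes.append(text) }
}

// An action the system can surface in Spotlight and Shortcuts.
struct CreateNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Note"

    @Parameter(title: "Text")
    var text: String

    func perform() async throws -> some IntentResult {
        NoteStore.shared.add(text)
        return .result()
    }
}
```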
visionOS 26 gets major improvements
Improvements here include spatial widgets, shared spatial experiences, spatial scenes, and more powerful 360-degree video support.
Users can decorate their spaces with favorite widgets, including stunning panoramas and spatial photos of their favorite memories, clocks with distinctive face designs, and quick access to their go-to playlists and songs on Apple Music. The Widgets app helps users find widgets, including those from compatible iOS and iPadOS apps, and developers will also be able to create their own widgets using WidgetKit.
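Those widgets are built with the same WidgetKit code used on iOS and iPadOS; the spatial placement and persistence are handled by the system. A minimal clock widget as a sketch:

```swift
import WidgetKit
import SwiftUI

// Standard WidgetKit widget of the kind visionOS 26 can pin in a user's
// space. The widget code itself has nothing visionOS-specific.
struct ClockEntry: TimelineEntry {
    let date: Date
}

struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry { ClockEntry(date: .now) }
    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: .now))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        // Refresh the timeline roughly every minute.
        completion(Timeline(entries: [ClockEntry(date: .now)],
                            policy: .after(.now.addingTimeInterval(60))))
    }
}

struct SpatialClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "SpatialClock", provider: ClockProvider()) { entry in
            Text(entry.date, style: .time)
                .font(.largeTitle)
        }
        .configurationDisplayName("Clock")
        .description("A clock you can pin to your space.")
    }
}
```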
Users love how visionOS lets them connect with family, friends, and colleagues remotely, and with visionOS 26, they can share spatial experiences with other Apple Vision Pro users in the same room. They can come together to watch the latest blockbuster movie in 3D, play a spatial game, or collaborate with coworkers. Users can also add remote participants from across the world via FaceTime, enabling connection with people near and far.
visionOS 26 makes spatial photos even more realistic, leveraging a new generative AI algorithm and computational depth to create spatial scenes with multiple perspectives, letting users feel like they can lean in and look around.
visionOS 26 supports native playback of 180-degree, 360-degree, and wide field-of-view content from Insta360, GoPro, and Canon. Users can enjoy their exciting 3D action footage the way it was meant to be seen. Developers can incorporate this new playback capability into their apps and websites.
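Apple hasn’t detailed the playback API in the announcement. As a sketch only, here’s standard AVKit playback on visionOS, assuming the system applies the correct projection to natively supported immersive formats (the URL is a placeholder, and format metadata requirements may apply):

```swift
import AVKit
import SwiftUI

// Sketch: hand a 360° clip to the system player on visionOS and let native
// playback handle the projection.
struct ImmersiveClipView: View {
    private let player = AVPlayer(url: URL(string: "https://example.com/ride-360.mov")!)

    var body: some View {
        VideoPlayer(player: player)
            .onAppear { player.play() }
    }
}
```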
tvOS gets Liquid Glass and a new karaoke feature
tvOS of course gets the new look too, along with the ability to use your iPhone as a mic.
Liquid Glass brings a vibrant look to Apple TV, delivering a fresh and expressive design that beautifully reflects and refracts its surroundings using real-time rendering. Interactions with the Apple TV app are also enhanced to create a more delightful and engaging experience while watching shows and movies. The new design keeps the content central to the viewing experience while fast-forwarding or rewinding, starting a sleep timer, adjusting audio, or setting a “Movie night” scene in Control Center.
Sing-along sessions reach a new level of fun with tvOS 26, allowing users to transform iPhone into a handheld microphone for Apple TV and have their voices amplified as they belt out their favorite songs.
AirPods get camera remote, studio mic feature
AirPods also got most of the new features we’d spotted in the works, with Apple highlighting two of them.
Creating content gets even better with studio-quality audio recording. Interviewers, podcasters, singers, and other creators can record their content with greater sound quality, and even record while on the go or in noisy environments with Voice Isolation. Studio-quality audio recording and improved call quality work across iPhone, iPad, and Mac, while also supporting the Camera app, Voice Memos, dictation in Messages, video conferencing apps like Webex, and compatible third-party camera apps.
Capturing content at a distance is easier than ever with camera remote. While using the Camera app or compatible third-party camera apps on iPhone or iPad, content creators can press and hold the AirPods stem to take a photo or start a video recording, and one more press-and-hold will stop the recording. For users who like capturing themselves singing or dancing, the new features make it easy to perform in sync with a soundtrack while simultaneously recording the video.
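Apple didn’t name the API, but third-party camera apps on iOS already receive hardware capture triggers through AVKit’s AVCaptureEventInteraction; it’s an assumption on our part, not something Apple confirmed, that AirPods stem presses arrive the same way. A sketch under that assumption:

```swift
import AVKit
import UIKit

// Sketch: respond to hardware capture events in a camera view controller.
// AVCaptureEventInteraction (iOS 17.2+) is real API; routing AirPods stem
// presses through it is an assumption. Events only fire while the app is
// running an active capture session.
final class CameraViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let interaction = AVCaptureEventInteraction { [weak self] event in
            guard event.phase == .ended else { return }
            self?.toggleRecording()
        }
        view.addInteraction(interaction)
    }

    private func toggleRecording() {
        // Start or stop the app's capture session here.
    }
}
```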
Which are your favorites?
You can choose multiple features in our poll, and please share all your thoughts and reactions in the comments.