Key Announcements at Apple’s Annual Developer Conference
At its annual Worldwide Developers Conference (WWDC25), Apple announced significant improvements to its operating systems across all devices, including visual refreshes, a new naming system for software updates, and new features in the Apple Intelligence package.
“Liquid Glass” Design Language
Apple introduced a new design language called “Liquid Glass,” which brings translucency and a crystal-like sheen to application interfaces. Inspired by visionOS, the operating system for the Vision Pro headset, the design adapts to light and dark modes and is rendered in real time so that it reacts dynamically to movement.
- The new design will be implemented in buttons, multimedia controls, larger elements like tab bars and sidebars, as well as redesigned toolbars and navigation.
- Apple has released updated application programming interfaces (APIs) so developers can begin adapting their interfaces before the redesign ships later this year.
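As a rough illustration of what adopting the new design might look like, the SwiftUI sketch below applies a glass-style material to a button. The `glassEffect` modifier follows the SwiftUI API Apple announced for the redesign, but the exact signature is an assumption and may differ in the shipping SDK:

```swift
import SwiftUI

// Sketch: adopting the Liquid Glass material on a control.
// Assumes the SwiftUI `glassEffect` modifier announced at WWDC25;
// parameters and availability may differ in the final SDK.
struct GlassPlayButton: View {
    var body: some View {
        Button("Play") {
            // handle tap
        }
        .padding()
        .glassEffect() // apply the new translucent material
    }
}
```

In practice, developers who already use standard system controls are expected to pick up much of the new look automatically, with modifiers like the one above reserved for custom views.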
Operating System Updates
Apple is changing its naming convention for operating systems. Future iOS versions will be numbered based on the year following their release, similar to how automobile manufacturers number new models.
- Several parts of the iOS system are receiving a significant visual overhaul as part of the redesign.
- The Phone app now includes a call-screening feature, while the Messages app gains customizable chat backgrounds.
- Apple plans to integrate generative AI into its development tool, Xcode, to help developers write, test, and debug code. The company also intends to support third-party AI models, such as OpenAI’s ChatGPT, within Xcode.
Apple Intelligence Enhancements
The new operating system additions include Live Translation, which uses on-device AI models to translate conversations in real time across text messages, phone calls, and FaceTime.
- Apple Pay is also receiving integration with Apple Intelligence, allowing it to track orders even for purchases made outside of Apple Pay.
- Image Playground is enhanced with a new feature that enables users to generate images with the help of ChatGPT from OpenAI.
- Apple will now let developers tap into its on-device foundation model from their own apps through the new Foundation Models framework, enabling privacy-focused, offline-capable intelligent experiences.
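To make the framework concrete, here is a hedged Swift sketch of prompting the on-device model. The `FoundationModels` module and `LanguageModelSession` type follow the framework Apple announced at WWDC25, but the exact names and signatures are assumptions that may differ in the shipping SDK:

```swift
import FoundationModels

// Sketch: asking the on-device foundation model for a summary.
// Based on the Foundation Models framework announced at WWDC25;
// type and method names are assumptions pending the final SDK.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession() // inference runs on device
    let response = try await session.respond(
        to: "Summarize in one sentence: \(text)"
    )
    return response.content
}
```

Because inference happens on the device, a call like this would work offline and would not send the user’s text to a server, which is the privacy argument Apple is making for the framework.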
Visual Intelligence
Apple’s Visual Intelligence will enable users to learn more about what’s on their iPhone screens.
- Users can search for similar images or products on Google, Etsy, and other compatible apps.
- If the tool detects you’re viewing an event, iOS 26 will suggest adding it to your calendar.
- Users can invoke the feature with the same button combination used to take a screenshot on an iPhone.
Key Questions and Answers
- What is the main focus of Apple’s WWDC25 announcements? The primary focus is on enhancing operating systems, introducing a new design language called “Liquid Glass,” and providing developers with access to Apple Intelligence technology.
- How will the new “Liquid Glass” design language impact users? Users will experience elegant translucency and crystal-like brilliance in application interfaces, with dynamic adaptation to light and dark modes and real-time movement reaction.
- What changes are coming to Apple’s operating system naming conventions? Future iOS versions will be numbered based on the year following their release, similar to automobile manufacturers’ model numbering.
- How will Apple Intelligence be integrated into existing apps and tools? On-device models will power Live Translation in Messages, phone calls, and FaceTime; Apple Pay will track orders; Image Playground will generate images with help from OpenAI’s ChatGPT; and developers will be able to build on Apple’s on-device foundation model through the Foundation Models framework.
- What new features does Visual Intelligence offer? Users can search for similar images or products on various apps, and iOS 26 will suggest adding events to the calendar if detected.