Apple made several artificial intelligence (AI)-related announcements, branded as Apple Intelligence, at the Worldwide Developers Conference (WWDC) 2025 on Monday. During the keynote, the company recapped its existing AI features and unveiled new ones that are now available for testing and will roll out to users later this year. These include Live Translation, Workout Buddy on Apple Watch, ChatGPT integration in Visual Intelligence, updates to the Genmoji and Image Playground experiences, and AI capabilities in Shortcuts.
Craig Federighi, Apple's Senior Vice President (SVP) of Software Engineering, announced that the tech giant is opening up access to its on-device foundation models, which also power several Apple Intelligence features, to third-party app developers. Through the Foundation Models Framework, developers can use these proprietary models to build new features within their apps, or entirely new apps.
Apple highlighted that because these models run on-device, the AI capabilities will work even when a device is offline, and user data never leaves the device. Developers will also not have to pay any application programming interface (API) costs for cloud inference. The framework natively supports Swift, allowing developers to access the AI models seamlessly, and it also supports guided generation, tool calling, and more.
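To illustrate what calling the on-device model might look like in Swift, here is a minimal sketch. It assumes a session-based API along the lines Apple previewed; the type and method names (LanguageModelSession, respond(to:)) and the example function are illustrative and may differ from the shipping SDK.

```swift
import FoundationModels

// Hypothetical sketch: asking the on-device foundation model for a suggestion.
// Names follow Apple's WWDC previews but are assumptions, not confirmed API.
func suggestWorkoutName(for activity: String) async throws -> String {
    // Instructions steer the model's behaviour for this session.
    let session = LanguageModelSession(
        instructions: "You suggest short, motivating workout names."
    )
    // Inference runs on-device, so this works offline and incurs no API cost.
    let response = try await session.respond(
        to: "Suggest a name for a \(activity) session."
    )
    return response.content
}
```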
Federighi said Siri will not get the advanced AI features teased at last year's WWDC until 2026, which is when Apple will share more information about them. However, the Cupertino-based tech giant is planning to ship a few more Apple Intelligence features this year.
The biggest new arrival is Live Translation. The AI-powered feature is being integrated into the Messages app, FaceTime, and the Phone app to let users communicate easily with people who speak a different language. It runs on-device, which means conversations never leave users' devices.
In the Messages app, Live Translation can translate messages automatically. Users will see an option to translate a message as they type, and can then send it to friends and colleagues in a language they speak and understand. Similarly, when a user receives a message in a different language, the feature will instantly translate it.
On FaceTime calls, the feature will automatically add live captions in the user's language to help them follow along. During phone calls, Live Translation will translate what a person says in real time and speak it aloud.
Apart from Live Translation, Apple is also updating Visual Intelligence. iPhone users can now ask ChatGPT questions while looking through their device's camera; the OpenAI chatbot can see what the user is looking at and use that context to answer queries. Visual Intelligence can also search apps such as Google and Etsy to find similar images and products, and users can look for a product online just by highlighting it in the camera view.
Apple says Visual Intelligence can also recognise when a user is looking at an event and automatically show a suggestion to add it to their calendar.