Dive Brief:
- Apple announced a slew of new features for its Apple Intelligence artificial intelligence system during the keynote of its annual Worldwide Developers Conference on Monday. The features are currently available for testing and will become widely available in the fall.
- Updates include an expansion of visual intelligence, the feature that helps users learn about objects around them through their iPhone camera, to cover content on their screens. Consumers will be able to highlight specific products they’re interested in and search for them online.
- Other new features include Live Translation, enhancements to Image Playground and Genmoji and the ability for the Shortcuts app to tap into Apple Intelligence. The tech giant previewed its iOS 26 software update, which includes an extensive redesign.
Dive Insight:
It was during last year’s WWDC keynote that Apple unveiled Apple Intelligence and a much-anticipated partnership with OpenAI. While this year’s keynote carried a similar focus, it’s safe to say that many were left wanting the splashy displays the tech giant is known for at its annual showcase, especially as some Apple Intelligence features teased last year hang in the balance. Still, announcements like the expansion of visual intelligence could signal a way forward for Apple, particularly when it comes to search.
“This year’s event was not about disruptive innovation, but rather careful calibration, platform refinement and developer enablement — positioning itself for future moves rather than unveiling game-changing technologies in the present,” said Francisco Jeronimo, vice president of IDC, in comments shared with Marketing Dive.
With visual intelligence, iPhone users can already learn about the places and objects around them through their phone camera. That capability now expands to the content on their iPhone screen, allowing users to ask ChatGPT questions about what they’re looking at to learn more or search supported apps including Google and Etsy. If a consumer is interested in a particular item, like a jacket they see in a photo, they can highlight it to search for the item or something comparable online.
Consumers can access visual intelligence for on-screen content by pressing the same buttons used to take a screenshot, which will then give them the choice to save or share the screenshot or to use visual intelligence. The feature could signal interest by Apple in integrating its AI tools more deeply into the mobile search landscape, a move that could eventually have a critical impact on chief rival Google.
Other new Apple Intelligence features include the addition of Live Translation, which will be integrated into the Messages, FaceTime and Phone apps. With Genmoji, consumers can now mix emoji together and combine them with descriptions to make something new; previously, they could only turn a text description into a custom emoji. With Image Playground, consumers can use new ChatGPT-powered themes, like an oil painting style.
In the Shortcuts app, users can tap into Apple Intelligence or ChatGPT to power their shortcuts. In a provided example, a student could build a shortcut that uses an Apple Intelligence model to compare an audio transcription of a lecture to the notes they took, adding context they may have missed.
Beyond Apple Intelligence, the company previewed its latest iPhone software update, iOS 26, which carries a new design crafted with what it calls Liquid Glass, a translucent material that reflects and refracts its surroundings. The design extends to the Home and Lock screens and introduces new customization options for app icons. The update, which will become available in the fall, will also bring new functionality to apps including Apple Music, Maps and Wallet.
Apple also spent time highlighting new features for its Vision Pro headset as part of its visionOS 26 update, including the addition of spatial scenes. Spatial scenes leverage generative AI and computational depth to add lifelike depth to photos. Developers can use the Spatial Scene API to make a more immersive app experience, an opportunity real estate platform Zillow has already taken advantage of to allow users to see images of homes in richer depth.