My new favorite iOS 26 feature is a supercharged version of Google Lens - and it's easy to use

iPhone 16 Camera Control
Kerry Wan/ZDNET

Apple Intelligence has largely fallen flat among consumers, with delayed rollouts and underwhelming performance; a context-aware Siri was one of the AI features Apple failed to deliver. 

However, on Monday at WWDC, the company announced that it is expanding Visual Intelligence to include on-screen awareness, so you can search not only what your iPhone's camera captures but also what is on your screen. 

Also: Everything announced at Apple's WWDC 2025 keynote: Liquid Glass, MacOS Tahoe, and more

"Last year, we took the first steps on a journey to bring users intelligence that's helpful, relevant, easy to use, and right where users need it, all while protecting their privacy," said Craig Federighi, Apple's senior vice president of Software Engineering. "Now, the models that power Apple Intelligence are becoming more capable and efficient, and we're integrating features in even more places across each of our operating systems." 

Federighi added that Apple is giving developers access to Apple Intelligence's on-device foundation model. 
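For developers, that access comes through the Foundation Models framework Apple previewed at WWDC. As a rough Swift sketch of what calling the on-device model looks like, based on the API Apple showed on stage (exact names and signatures may differ in the shipping SDK):

```swift
import FoundationModels

// Minimal sketch: prompt the on-device Apple Intelligence model.
// Assumes the LanguageModelSession API Apple previewed at WWDC 2025;
// names and availability may change before the final SDK ships.
func suggestCalendarTitle(for flyerText: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Suggest a short calendar event title for this flyer: \(flyerText)"
    )
    return response.content
}
```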

Visual Intelligence already lets you search objects by capturing them with your camera, much like Google Lens. With iOS 26, users will be able to search, and ask ChatGPT questions, about what is displayed on their screen. That makes it easy to look up a pair of shoes you spot in a photo while scrolling Instagram, or to add an event to your calendar straight from a photo of a poster. 

Visual Intelligence surfaces when you take a screenshot: alongside the familiar options to edit, mark up, or crop, a new option lets you search with Visual Intelligence. The AI-powered feature processes what is on your screen to recognize the context and offers actions accordingly, including an Add to Calendar function that turns a screenshotted event flyer into a calendar entry. 

Also: 4 best iPadOS 26 features revealed at WWDC - and which iPads will get them

Developers will be able to use App Intents to build tools that tap this on-screen context awareness for search.
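App Intents itself is an existing framework; as a loose sketch of the kind of searchable intent an app might expose, here is a basic example. The intent and the toy catalog are hypothetical, and the Visual Intelligence-specific hooks Apple announced are not shown:

```swift
import AppIntents

// Loose sketch: a basic App Intent exposing an in-app text search.
// SearchCatalogIntent and the in-memory catalog are hypothetical stand-ins;
// the Visual Intelligence-specific integration points are not shown here.
struct SearchCatalogIntent: AppIntent {
    static var title: LocalizedStringResource = "Search Catalog"
    static var description = IntentDescription("Searches the app's catalog for a text query.")

    @Parameter(title: "Query")
    var query: String

    // Toy in-memory catalog standing in for a real data store.
    private static let catalog = ["running shoes", "hiking boots", "sandals"]

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // Return the first catalog entry containing the query, if any.
        let match = Self.catalog.first { $0.localizedCaseInsensitiveContains(query) }
        return .result(value: match ?? "No results for \(query)")
    }
}
```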

Apple also announced other updates to its AI features, including Live Translation in Messages and Phone, integrations with Shortcuts, new languages coming by the end of the year, and improvements to Genmoji and Image Playground. 

