- Visual Intelligence lets you search for anything around you by pointing your camera at it.
- The feature uses on-device intelligence along with Apple services to bring you answers.
- Visual Intelligence can be quickly triggered by the new Camera Control button on the iPhone 16 series.
- Visual Intelligence won't be available at launch on the iPhone 16; it will come to Camera Control later this year.
Apple released its latest lineup of iPhones at the 'It's Glowtime' event. The iPhone 16 and 16 Pro models are marketed by Apple as built from the ground up for Apple Intelligence. If you missed the memo, Apple Intelligence, which the tech giant unveiled in June at its annual WWDC conference, is its take on artificial intelligence.
However, the company conveniently left out one feature during WWDC, and rightly so, since it's tied to hardware changes coming with the iPhone 16 lineup. Visual Intelligence is making its way to the iPhone; it's similar to having Google Lens on the go on Android, but powered by artificial intelligence. It somewhat resembles the multimodal (image/video) input capabilities of Google's Gemini Live or OpenAI's ChatGPT-4o Advanced Voice Mode (neither available yet). But it seems that Visual Intelligence can only take images as input.
With Visual Intelligence, you can learn about the things around you by just pointing your iPhone's camera at them. It's triggered by pressing and holding the new Camera Control button on the iPhone 16 series.
You can ask Visual Intelligence about anything. For instance, if you point your camera at a restaurant, Visual Intelligence can quickly pull up restaurant hours, ratings, and options to check out the menu or make a reservation.
Similarly, you can point your camera at an event poster and have it added automatically to your Calendar. Notably, Google's Gemini can assist similarly on Android devices, and it can even go a step further and inform you if you have any prior commitments on that date.
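Apple hasn't shared how Visual Intelligence handles this under the hood, and there's no developer API for it yet. Purely as an illustration of what a poster-to-Calendar flow involves, here's a rough sketch that approximates it with Apple's public Vision and EventKit frameworks; the `addPosterToCalendar` helper is hypothetical and not how Apple's feature actually works.

```swift
import Vision
import EventKit
import UIKit

// Hypothetical sketch: approximates the "event poster → Calendar" flow
// with public frameworks. Not Apple's Visual Intelligence implementation.
func addPosterToCalendar(_ posterImage: UIImage) {
    guard let cgImage = posterImage.cgImage else { return }

    // 1. Recognize the text printed on the poster.
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n")

        // 2. Look for a date in the recognized text.
        let detector = try? NSDataDetector(types: NSTextCheckingResult.CheckingType.date.rawValue)
        let match = detector?.firstMatch(in: text, options: [],
                                         range: NSRange(text.startIndex..., in: text))
        guard let eventDate = match?.date else { return }

        // 3. Create a calendar event (requires calendar access permission).
        let store = EKEventStore()
        store.requestFullAccessToEvents { granted, _ in
            guard granted else { return }
            let event = EKEvent(eventStore: store)
            event.title = text.components(separatedBy: "\n").first ?? "Event"
            event.startDate = eventDate
            event.endDate = eventDate.addingTimeInterval(60 * 60)
            event.calendar = store.defaultCalendarForNewEvents
            try? store.save(event, span: .thisEvent)
        }
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

A real app would also need a calendar-access usage description in its Info.plist and proper error handling, but the three steps – read the text, find a date, save an event – capture the gist of what the feature does for you automatically.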
All this is powered by a combination of on-device intelligence and Apple services that don't store your images. Visual Intelligence can also act as your gateway to third-party services like Google and ChatGPT. The interface shows two buttons – Search and Ask.
With Search, you can ask Google anything about what's on your screen, like a bike you spot and want to buy. With Ask, you can invoke ChatGPT, and ask for help with, say, your notes. Information is only shared with third-party tools when you choose to use them.
The feature won't be available right away and will come later this year, with no concrete release date yet. It also seems the feature won't be coming to older iPhones that will support Apple Intelligence, though, since Craig Federighi specifically said it would be coming to "Camera Control later this year".