iPhone 15 Pro users just got a major AI upgrade with Visual Intelligence

Apple's second iOS 18.4 beta adds a cool new AI feature to the iPhone 15 Pro


Owners of the iPhone 15 Pro and Pro Max can now tap into a helpful AI-powered feature, courtesy of the latest iOS 18.4 developer beta.

Previously exclusive to the iPhone 16

Launched on Monday, the new beta gives users of these older phones the ability to set up and use Visual Intelligence. Previously accessible only on the iPhone 16, Visual Intelligence lets you run web searches and ask questions about the people, places, and things you view through the camera.

Also: Apple Intelligence now needs 7GB of storage, up from 4GB – here’s why

Beyond supporting the AI-powered feature, the new beta adds a couple of new ways to trigger it.

How to trigger Visual Intelligence on iPhone 15 Pro models

The four iPhone 16 models use the physical Camera Control to launch Visual Intelligence, but that button doesn't exist on the iPhone 15 Pro or Pro Max. Instead, users of the iPhone 15 Pro models, as well as the iPhone 16, can now use the Action button to activate Visual Intelligence. Since the Action button is customizable, you can set it to perform a variety of actions.

With the new beta, you can also launch Visual Intelligence directly from Control Center. Swiping down from the top-right corner of the screen reveals a new Apple Intelligence section with options for activating Siri, using the Type to Siri feature, and triggering Visual Intelligence.

Here’s how this would work on the latest developer beta.

Also: I bought an iPhone 16 for its AI features, but I haven’t used them even once – here’s why

On an iPhone 15 Pro, Pro Max, or any iPhone 16, you'd head to Settings and select the option for Action Button. Swiping through the next screen would show all the actions you can assign. After stopping on the one for Visual Intelligence, you'd exit Settings, and the selection takes effect.

Now let's say you spot an animal, plant, landmark, business, or other item that piques your curiosity. Aim your phone at the object and hold down the Action button, or select Visual Intelligence from Control Center. The next screen gives you two choices: tap Search to run a Google search on the object, or tap Ask to pose questions about it that ChatGPT will attempt to answer.

Official release

Last month, Apple representatives confirmed to Daring Fireball's John Gruber that the iPhone 15 Pro and Pro Max would receive Visual Intelligence. Though the company didn't specify when that would happen, Gruber said he believed it would arrive with iOS 18.4. With this version only on its second developer beta, a few more iterations remain before the official release, which is expected in April.
