iOS 18.4 adds a crucial Apple Intelligence feature to the iPhone 15 Pro — and it makes your phone more powerful


Yahoo · 11-03-2025

I have an iPhone 15 Pro in my possession, which means that I also have the ability to access most Apple Intelligence tools after Apple launched its suite of AI features with the iOS 18.1 release last fall. Most tools, but not all.
When Apple launched the iPhone 16 lineup last year, it also announced a new feature called Visual Intelligence. With the Camera Control button on those iPhone 16 models, you could summon Apple's answer to Google Lens and get more information and even a handful of actionable commands based on whatever it was you were pointing your phone's camera at.
Though the iPhone 15 Pro and iPhone 15 Pro Max have enough RAM and processing power to run Apple Intelligence features, Visual Intelligence hasn't been one of them. That means someone's $799 iPhone 16 could do something the phone you paid at least $999 for couldn't. And when the $599 iPhone 16e debuted last month, we learned that it, too, could access Visual Intelligence while iPhone 15 Pro and Pro Max owners remained shut out. Why, the very idea!
That's changed, though, with the arrival of the second public beta of iOS 18.4. If you're trying out that beta on an iPhone 15 Pro, you've now gained the ability to run Visual Intelligence. And while that's not necessarily a game-changing addition, it does give your older iPhone new powers it didn't have previously. And some of those powers are proving to be quite useful.
Here's a quick rundown of how iPhone 15 Pro and iPhone 15 Pro Max users can set up their devices to take advantage of Visual Intelligence, along with a reminder of just what you can use that AI-powered feature to do.
If you want to use Visual Intelligence on your iPhone 15 Pro, you'll need to find a way to launch the feature since only iPhone 16 models come with a Camera Control button. Fortunately, you've got two other options, thanks to the iOS 18.4 update.
iOS 18.4 adds Visual Intelligence as an option for the Action button, so you can use that button on the left side of your iPhone to trigger Visual Intelligence. Here's how.
(Image: © Future)
Launch the Settings app, and on the main screen, tap on Action Button.
(Image: © Future)
You'll see a list of possible shortcuts to trigger with the Action button. Swipe through until you land on Visual Intelligence, then exit Settings.
From that point, whenever you press and hold the Action button, it will launch Visual Intelligence.
If you don't want to tie up your Action button with Visual Intelligence, you can instead use iOS 18's customizable lock screen shortcuts: iOS 18.4 adds a Visual Intelligence control that you can place at the bottom of your lock screen.
To add that control, simply edit your lock screen and select the control you want to customize. (In the screen above, we're putting it in the bottom left corner.) Select Visual Intelligence from the available options — you'll find it under Apple Intelligence & Siri controls, though you can also use the Search bar at the top of the screen to track down the control. Tap the icon to add it to your lock screen.
So your iPhone 15 Pro is now set up to launch Visual Intelligence, either from the Action button or a lock screen shortcut. What can you do with this feature?
Essentially, Visual Intelligence turns the iPhone's camera into a search tool. We have step-by-step instructions on how to use Visual Intelligence, but if your experience is like mine, you'll find things very intuitive.
Once Visual Intelligence launches, point your camera at the thing you want to look up — it could be a business's sign, a poster or just about anything. The iOS 18.3 update that arrived last month added the ability to identify plants and animals, for example.
The information that appears on your screen varies depending on what you point at. A restaurant facade might produce the hours the place is open, and you can also collect phone numbers and URLs by capturing them with Visual Intelligence.
I captured a poster of an upcoming event with my iPhone 15 Pro, and Visual Intelligence gave me the option of creating a calendar event with the date and time already filled in. It would be nice if the location were copied over, too, since that information was also on the poster, but we'll chalk that up to this being early days for Visual Intelligence.
Visual Intelligence can also get flummoxed in situations like these. When I tried to add a specific soccer match to my calendar from a schedule listing multiple dates, Visual Intelligence got confused as to which one to pick. (It seems to default to the date at the top of the list.) Having to edit incorrect data defeats most of the purpose of this particular capability, but you'd expect Apple to expand Visual Intelligence's bag of tricks over time.
You have two other options for expanding on the info Visual Intelligence gives you. If you've enabled ChatGPT in Apple Intelligence, you can share the capture with ChatGPT by selecting the Ask button, or you can tap Search to run a Google search on the image you've collected.
Of those two options, ChatGPT seems to be the more fully featured in my experience. When I captured a recipe for a bean dip, ChatGPT initially summarized the article, but by asking a follow-up question, I could get the chatbot to list the ingredients and the steps, which I could then copy and paste into the Notes app on my own. To me, that's a lot more handy than having to pinch and zoom on a photo of a recipe or, worse, transcribing things myself.
Google searches of Visual Intelligence image captures can be a lot more hit and miss. A photo of a restaurant marquee near me produced search results with similarly named restaurants, but not the actual restaurant I was in front of.
Google Search did a much better job when I took a picture of a book cover via Visual Intelligence, and the subsequent search results produced reviews of the book from various sites. That could really be useful the next time I'm in a book store — it's a place that sells printed volumes, youngsters, ask your parents — and want to know if the book I'm thinking of buying is actually as good as its cover.
That's been my experience with Visual Intelligence so far, but my colleagues have been using it since it came out last year as everything from a virtual guide in an art museum to a navigation tool for getting out of a corn maze. If you've got an iPhone 15 Pro, you can now try out your own uses for Visual Intelligence.
Apple confirms Siri 2.0 is delayed to iOS 19 and possibly beyond
New to Apple Intelligence? Try these features first
This update improves my least favorite Apple Intelligence feature


Related Articles

iOS 18 promised the biggest update in history — here's what iOS 26 needs to fix

Tom's Guide · 2 hours ago

Apple's WWDC event is slated for next week, and there is a lot to be excited about. However, one of the most important announcements that we're expecting to see is the new iOS 26, which may also be called iOS 19. Over the last few months, we've seen a lot of rumors and leaks about the upcoming update, with Bloomberg's Mark Gurman calling it one of the biggest overhauls in Apple's history. However, the same thing was said about the last update.

When it was first announced, there was a lot of talk that iOS 18 was Apple's biggest update in history. Now, it wasn't strictly untrue, as iOS 18 held a lot of promise. However, there's no doubt that the actual release left a fair amount to be desired. So, if iOS 26 wants to be the biggest update ever, it needs to solve a few things.

There's no doubt that Apple Intelligence was the reason for the hype around iOS 18. However, while Apple's new AI promised the world, it left a lot to be desired, thanks to limited device support alongside delayed features. For instance, certain language support, priority notifications and more didn't launch until March of this year.

Now, in the interest of fairness, I will say that the features we have work just fine, but they're heavily limited. For instance, Apple's Image Playground has an irritating UI that's more annoying than revolutionary. Added to this is the simple fact that you can't create anything novel or interesting without a lot of work. If we compare this to the other options, then you see the issue. This is a problem that runs the gamut of AI features for Apple Intelligence: there are simply better options out there. I would have thought that Apple would have some of the best features available; it's what the company is known for, after all. As such, I would want to see iOS 26 push these features to their limits, but I also want Apple to be realistic, rather than promising the world and not delivering.
When it comes to missing features, there's no greater culprit than the long-awaited "Siri 2.0." Apple has been working on an improved AI assistant for a while now, but has reportedly only managed to get Siri 2.0 to work "two-thirds of the time." This is an issue because AI assistants have slowly become an integral part of phones, yet Apple is continuing to fall behind the competition. That isn't to say the current iteration is useless, though, as Siri can manage a lot of your basic tasks. The issue is that it pales in comparison to what assistants like Google Gemini offer. However, considering all we've heard about Siri 2.0, that could change so long as Apple releases it with iOS 26.

While we don't know all the features that are coming with the improved Siri, we do have some idea about the core functions. Reportedly, the new Siri offers more personal context, onscreen awareness and deeper app integration. Hopefully, Apple uses WWDC 2025 to give Apple fans the software they deserve.

Of course, it isn't just Apple Intelligence that needs solving, as one thing that I found to be severely lacking with iOS 18 was the (apparently) improved look of the home screen and menus. Again, in the interest of fairness, I don't want to imply that the update was awful; it just wasn't all that interesting. For instance, the ability to tint app icons was underwhelming at best, ugly at worst.

We do know that Apple is possibly planning a pretty major improvement with the so-called Project Solarium, although what this will contain is up for debate. For instance, we've seen several reports that Apple will completely overhaul the look of the UI with designs similar to Apple's VisionOS. Alternatively, one of the more interesting designs we've seen comes from a series of mock-ups from noted tech leaker Jon Prosser, which were posted on his Front Page Tech YouTube channel. While this might not be the exact look of iOS 26, it's certainly more notable and striking than what iOS 18 gave us.
Making the icons more circular helps them to look cleaner, while the transparent menus make it all seem more futuristic. This has two advantages: the first is that it makes the look of the screen memorable, while the second allows Apple to be more distinct when compared with Android.

The video also helps to show off how we could control our phones, and the interactions all look pretty great. For instance, the search bar appearing at the bottom of the screen is a nice touch. Not only that, the menus offering a more distinct swiping animation, as well as highlighting which tab you're on, will be a draw for many. This is the kind of change that we need to see from Apple, and it's one I hope iOS 26 brings.

At the end of the day, we can't know for certain what Apple will announce, and the company is well known for being tight-lipped in the run-up to release. For instance, one of the big questions at the moment is whether Apple will even focus on AI for the presentation. In one recent report, Mark Gurman stated that it was possible Apple would avoid AI entirely, so it isn't impossible. However, I hope that, even if Apple doesn't directly mention it, iOS 26 will at least solve some of the issues I've discussed. If Apple does try to quietly sweep it under the rug and ignore what it needs to solve, that might be interesting to watch, too.

On that note, what are you hoping to see during WWDC, and what would you like to see from Apple regarding iOS 26?

Apple under pressure to shine after AI stumble

Yahoo · 3 hours ago

Pressure is on Apple to show it hasn't lost its magic despite broken promises to ramp up iPhones with generative artificial intelligence (GenAI) as rivals race ahead with the technology. Apple will showcase plans for its coveted devices and the software powering them at its annual Worldwide Developers Conference (WWDC) kicking off Monday in Silicon Valley.

The event comes a year after the tech titan said a suite of AI features it dubbed "Apple Intelligence" was heading for iPhones, including an improvement of its much-criticized Siri voice assistant. "Apple advertised a lot of features as if they were going to be available, and it just didn't happen," noted Emarketer senior analyst Gadjo Sevilla.

Instead, Apple delayed the rollout of the Siri upgrade, with hopes that it will be available in time for the next iPhone release, expected in the fall. "I don't think there is going to be that much of a celebratory tone at WWDC," the analyst told AFP. "It could be more of a way for Apple to recover some credibility by showing where they're headed."

Industry insiders will be watching to see whether Apple addresses the AI stumble or focuses on less splashy announcements, including a rumored overhaul of its operating systems for its line of devices. "The bottom line is Apple seemed to underestimate the AI shift, then over-promised features, and is now racing to catch up," Gene Munster and Brian Baker of Deepwater Asset Management wrote in a WWDC preview note.

Rumors also include talk that Apple may add GenAI partnerships with Google or Perplexity to an OpenAI alliance announced a year ago.

- 'Double black eye' -

Infusing its lineup with AI is only one of Apple's challenges. Developers, who build apps and tools to run on the company's products, may be keen for Apple to loosen its tight control of access to iPhones. "There's still a lot of strife between Apple and developers," Sevilla said. "Taking 30 percent commissions from them and then failing to deliver on promises for new functionality—that's a double black eye."

A lawsuit by Fortnite maker Epic Games ended with Apple being ordered to allow outside payment systems to be used at the US App Store, but developers may want more, according to the analyst. "Apple does need to give an olive branch to the developer community, which has been long-suffering," Sevilla said. "They can't seem to thrive within the restrictive guardrails that Apple has been putting up for decades now."

As AI is incorporated into Apple software, the company may need to give developers more ability to sync apps to the platform, according to Creative Strategies analyst Carolina Milanesi. "Maybe with AI it's the first time that Apple needs to rethink the open versus closed ecosystem," Milanesi said.

- Apple on defensive -

Adding to the WWDC buildup is that the legendary designer behind the iPhone, Jony Ive, has joined with ChatGPT maker OpenAI to create a potential rival device for engaging with AI. "It puts Apple on the defensive because the key designer for your most popular product is saying there is something better than the iPhone," Sevilla said. While WWDC has typically been a software-focused event, Apple might unveil new hardware to show it is still innovating, the analyst speculated.

And while unlikely to come up at WWDC, Apple has to deal with tariffs imposed by US President Donald Trump in his trade war with China, a key market for sales growth as well as the place where most iPhones are made. Trump has also threatened to hit Apple with tariffs if iPhone production wasn't moved to the US, which analysts say is impossible given the costs and capabilities. "The whole idea of having an American-made iPhone is a pipe dream; you'd have to rewrite the rules of global economics," said Sevilla.

One of the things Apple has going for it is that its fans are known for their loyalty and likely to remain faithful regardless of how much time it takes the company to get its AI act together, Milanesi said. "Do people want a smarter Siri? Hell yeah," Milanesi said. "But if you are in Apple, you're in Apple and you'll continue to buy their stuff."

Apple's Siri Could Be More Like ChatGPT. But Is That What You Want?

Yahoo · 3 hours ago

I've noticed a vibe shift in the appetite for AI on our devices. My social feeds are flooded with disgust over what's being created by Google's AI video generator tool, Veo 3. The unsettlingly realistic video of fake people and voices it creates makes it clear we will have a hard time telling fiction from reality. In other words, the AI slop is looking less sloppy.

Meanwhile, the CEO of Anthropic is warning people that AI will wipe out half of all entry-level white-collar jobs. In an interview with Axios, Dario Amodei suggested that government needs to step in to protect us from a mass elimination of jobs that could happen very rapidly.

So as we gear up for Apple's big WWDC presentation on Monday, I have a different view of headlines highlighting Apple being behind in the AI race. I wonder: what exactly is the flavor of AI that people want or need right now? And will it really matter if Apple keeps waiting longer to push out its long-promised (and long-delayed) personalized Siri when people are not feeling optimistic about AI's impact on our society?

In this week's episode of One More Thing, which you can watch embedded above, I go over some of the recent reporting from Bloomberg that discusses leadership changes on the Siri team, and how there are different views on what consumers want out of Siri. Should Apple approach AI in a way that makes Siri into a home-grown chatbot, or just make it a better interface for controlling devices? (Maybe a bit of both.)

I expect a lot of griping after WWDC about the state of Siri and Apple's AI, with comparisons to other products like ChatGPT. But I hope we can use those gripes to voice what we really want in the next path for the assistant, by sharing our thoughts and speaking with our wallets. Do you want a Siri that's better at understanding context, or one that goes further and makes decisions for you?
It's a question I'll be dwelling on more as Apple gives us the next peek into the future of iOS on Monday, and perhaps a glimpse of how the next Siri is shaping up. If you're looking for more One More Thing, subscribe to our YouTube page to catch Bridget Carey breaking down the latest Apple news and issues every Friday.
