
WWDC 2025 — Apple can't afford to take a 'gap year' on AI
When talk first surfaced in March that this year's iOS update would feature very little in the way of consumer-facing Apple Intelligence features, it seemed hard to believe. Apple had put such an emphasis on AI at its 2024 developers conference that you would expect a similar push for WWDC 2025, especially after Apple Intelligence didn't exactly come roaring out of the gate.
Well, a lot has changed in the ensuing months, including the possibility that the update we thought was going to be iOS 19 will instead be renamed iOS 26. But just ahead of next week's WWDC, it seems the possibility of Apple Intelligence being a major focus has become even more remote.
That's my takeaway from a recent column by Bloomberg's Mark Gurman, in which the well-sourced reporter contends that Apple "will do little at WWDC to show it's catching up to leaders like OpenAI and Google." Developers will reportedly get access to Apple's large language models to incorporate AI features into their apps, and we've heard talk that some features, like new iPhone battery management capabilities, will tap into AI. But otherwise, words like "letdown" and "gap year" are being bandied about.
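To make that developer angle concrete: if Apple does open up its models, third-party use could be as simple as handing a prompt to a system-provided model and awaiting a string back. The sketch below is pure speculation on my part; OnDeviceModel and respond(to:) are hypothetical stand-ins, since Apple hasn't announced any such API at the time of writing.

```swift
import Foundation

// Speculative sketch only: "OnDeviceModel" and "respond(to:)" are
// hypothetical stand-ins for whatever interface Apple actually ships.
struct OnDeviceModel {
    // Imagined entry point for a system-provided on-device LLM.
    func respond(to prompt: String) async throws -> String {
        // A real framework would run the system model here; this
        // placeholder just keeps the sketch self-contained.
        return "(model output)"
    }
}

// Example use: an email app summarizing a long thread on-device,
// with no data ever leaving the phone.
func summarizeThread(_ messages: [String]) async throws -> String {
    let prompt = "Summarize this email thread in two sentences:\n"
        + messages.joined(separator: "\n---\n")
    return try await OnDeviceModel().respond(to: prompt)
}
```

Even something that minimal would let developers ship the kinds of summarizing and rewriting features Apple has struggled to deliver itself.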
On the surface, it still seems like a remarkable reversal on Apple's part, particularly in light of last month's AI-heavy Google I/O event, where one of Apple's chief rivals spent two hours demonstrating just how far ahead of Apple it is when it comes to integrating artificial intelligence into everyday activities. To go light on Apple Intelligence at WWDC 2025 would seem to relegate Apple to also-ran status.
Even so, if you look a little more closely at where Apple is with its AI efforts, the company may have no other choice.
A charitable recap of the year since Apple previewed its AI tools would describe the efforts as "hit and miss." As I've noted before, there are some Apple Intelligence features I really like, such as email summaries of long back-and-forth exchanges and Visual Intelligence, especially now that the image recognition feature works on the iPhone 15 Pro and iPhone 15 Pro Max after initially being limited to iPhone 16 models.
But for the most part, Apple Intelligence additions like Writing Tools, Image Playground and Genmoji are basic at their best and frivolous at their worst. I also get the sense that they haven't added much over time, at least if my experiences with Image Playground and Memory Movies are anything to go by. As a result, it's too easy to just ignore Apple Intelligence features — to go about using your iPhone like you always have.
That's not Apple's biggest miscue with Apple Intelligence, though. Instead, the biggest problem is that some promised features — like an AI-infused revamp of the Siri personal assistant — never actually materialized. We were told Siri was going to work across multiple apps and understand our personal data, but now Apple says it needs more time to get that feature working properly. At this point, we may not see substantial changes to Siri until 2026.
And that might explain why Apple is so reluctant to make much of a fuss over Apple Intelligence at WWDC 2025. The company faced serious backlash for previewing AI features last year that weren't ready to ship, and it probably decided that a repeat would damage its credibility further, with greater long-term consequences than showing off very little.
If there's any consolation in this year without significant Apple Intelligence updates, it sounds from Gurman's reporting like Apple has plenty of irons in the AI fire. The company is reportedly working on a new version of Siri's architecture so that the personal assistant can better execute those features we were promised last year. The Shortcuts app is getting a revamp, too, one that will work with more Apple Intelligence features. Beyond that, there's an AI-powered health coaching feature and an Apple-built chatbot in development.
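The Shortcuts piece, at least, has visible plumbing today: apps already expose actions to Siri and Shortcuts through Apple's existing App Intents framework, which is the obvious foundation for a revamp that plays nicer with Apple Intelligence. Here's a minimal sketch of how that looks now; the SummarizeNoteIntent name and its toy logic are my own illustration, not anything Apple has shown.

```swift
import AppIntents

// A minimal App Intent of the kind Shortcuts and Siri can run today.
// The intent and its stand-in logic are illustrative, not Apple's design.
struct SummarizeNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Note"

    @Parameter(title: "Note Text")
    var noteText: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // Stand-in logic; a real app would call its own summarizer
        // (or, someday, a system model) here.
        let summary = String(noteText.prefix(140))
        return .result(value: summary)
    }
}
```

Shortcuts can already run intents like this one; the rumored revamp would presumably let Apple's models find and chain them together more intelligently.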
The trouble is, none of this stuff is likely to be ready this year — at least not to the point where Apple likely feels confident about showing it off in public again. So we're back to a WWDC 2025 keynote that's going to largely focus on massive software redesigns, with AI playing a supporting role at most.
I can certainly understand Apple's motivation for keeping things low-key, and I appreciate how events like this are geared toward accentuating the positives, rather than dwelling on what went wrong. Still, if Apple's looking to re-establish credibility, it should address the AI-shaped elephant in the room next week.
I'm not talking about Apple executive Craig Federighi appearing on stage at WWDC in a hairshirt or cuts to other Apple higher-ups in the crowd rending their garments in penitence. But acknowledging publicly that Apple is taking a step back to make sure the next major AI feature push is done right would restore some confidence in Apple Intelligence's future.
A WWDC 2025 keynote without much in the way of Apple Intelligence isn't going to do much to dispel the notion that Apple's an also-ran in the AI race. But the way you minimize that perception is to let people know you're coming back strong in 2026.
Related Articles
Yahoo
Tesla's Optimus robot VP is leaving the company
The head of Tesla's Optimus humanoid robot program, Milan Kovac, is leaving the company. Kovac said Friday in a post on X that he "had to make the most difficult decision" of his life to leave. "I've been far away from home for too long, and will need to spend more time with family abroad," he wrote. Kovac said that was "the only reason" and that his support for Musk and Tesla is "ironclad." Kovac's departure was first reported Friday by Bloomberg News.

The departure comes as Tesla CEO Elon Musk has claimed the company will have "thousands" of Optimus robots operating in its factories by the end of this year. "And we expect to scale Optimus up faster than any product, I think, in history, to get to millions of units per year as soon as possible," Musk said last month.

Kovac worked at Tesla for nearly 10 years, with much of that time coming as a top engineer on the Autopilot team. He was tapped to help lead development of Optimus in 2022 and became a vice president overseeing the program in late 2024. "I'm driving the Optimus program (Tesla's humanoid robot) & all its engineering teams," Kovac previously wrote on his LinkedIn profile. "Separately, I'm also driving the engineering teams responsible for all the software foundations & infrastructure common between Optimus and Autopilot."

Ashok Elluswamy, the vice president of Tesla's AI software division, will take over the Optimus project, according to Bloomberg. This story has been updated with information from Kovac's X post about his departure.


Bloomberg
Investing Africa: 'Verdict Is Out' on African VC Funding
Enygma Ventures founder Sarah Dusek says that venture capital funding in Africa could get a boost in 2025, with investors potentially diverting their capital away from more traditional destinations. She speaks to Bloomberg's Jennifer Zabasajja.


Tom's Guide
5 features iOS 26 needs to steal from Google to catch up on AI
I've been enjoying Google's AI features on my Pixel phones for the last couple of years. Starting with the Pixel 8 Pro and continuing with the Pixel 9 Pro, Google has proven to me that the AI features in its Pixel phones are unmatched, and Apple's in trouble if it doesn't catch up. With WWDC 2025 right around the corner, it's Apple's chance to redeem itself by introducing more Apple Intelligence features for what's presumably going to be the next iteration of its phone software: iOS 26. While there's been a handful of useful AI features, such as Visual Intelligence and Photo Clean Up, iPhones could still stand to get more. In fact, there are a number of Google AI features I think Apple needs to copy that could boost the iPhone experience. I'm not saying Apple should steal the exact same features outright, but it should at least come up with something similar, if not better.

If there's one AI feature that Apple desperately needs to copy from Pixel phones, it's Call Screen. Not only is it one of the most underrated AI features I've tried on any phone, it's also one of the most helpful. Call Screen lets a Pixel phone take an incoming call on your behalf, using Google Assistant to listen to the caller and then offer you contextual responses to choose from. Think of it as an actual assistant fielding the call for you and relaying your response. I can't tell you how many times it's been a lifesaver when I'm stuck in a work meeting.

Although it technically debuted with the Galaxy S25 Ultra, the cross-app actions function has since migrated to Pixel phones, and it shows off some of AI's most impressive abilities. While Apple Intelligence can call on Siri to perform simple actions, it can't connect with third-party apps, which is exactly what makes cross-app actions such a game changer on Pixel phones. Through simple voice commands, it can work across several apps to complete a request. For example, you can ask Gemini on a Pixel phone to summarize an email, or to find a nearby restaurant that's pet friendly and add a calendar appointment for it.

Another feature that debuted with Samsung and eventually made its way to Pixel phones is Circle to Search. Apple currently doesn't have anything like it, although you could argue that Visual Intelligence effectively functions in almost the same way. Circle to Search is a quick and convenient way to perform searches directly on-device from whatever app you're using. When activated, you simply circle or select what you're looking at on your phone's screen to perform a search, which could mean answering a question, running a general Google search, identifying something, or even finding deals on a product.

One AI feature I've come to appreciate as a photo editor is the Pixel's Reimagine tool, which lets me select parts of a photo and transform them into something else through a text description. The closest Apple Intelligence feature would be Image Playground, but that generates images from scratch from a text description; it doesn't work with existing photos. Reimagine helps make existing photos look better, whether that means changing up the scene entirely or making minor edits. I personally love being able to select the sky in my photos and change it to something else, or using Reimagine to insert different elements with realism.
Even though it could benefit from a few enhancements, Pixel Screenshots is great at helping you recall information you might forget or need to remember later on. It's exclusively available on the Pixel 9, Pixel 9 Pro, Pixel 9 Pro XL and Pixel 9 Pro Fold, and it combines the screenshot function with AI to recall the details captured in your screenshots. For example, if you screenshot a pizza recipe you want to try later, or the details of an upcoming party you're going to, Pixel Screenshots lets you search for those exact details. Apple doesn't have a comparable AI feature, but wouldn't it be neat if Apple Intelligence could recall the most obscure (or detailed) information that you come across on your iPhone?