
Apple to open up underlying AI technology to developers
Apple says it will open up the underlying technology it uses for Apple Intelligence and has announced an overhaul of its operating systems.
The tone and content of the presentations at its annual Worldwide Developers Conference focused on incremental developments that improve everyday life, such as live translation for phone calls, rather than the sweeping AI ambitions that Apple's rivals are marketing.
Apple software chief Craig Federighi said the company is opening up the foundational AI model that it uses for some of its own features to third-party developers.
"This work needed more time to reach our high quality bar," Federighi, senior vice president of software engineering, said of the delays of some features such as improvements to the Siri virtual assistant.
In an early demonstration of how partners could improve Apple apps, the company added image generation from OpenAI's ChatGPT to its Image Playground app, saying that user data would not be shared with OpenAI without a user's permission.
Apple was facing an unprecedented set of technical and regulatory challenges as key executives kicked off the company's annual software developer conference on Monday.
Shares of Apple, which were flat before the conference, slipped 1.5 per cent after executives took to the stage in Cupertino, California.
Federighi also said Apple plans a design overhaul of all of its operating systems.
Apple's redesign of its operating systems centered on a design it calls "liquid glass" where icons and menus are partially transparent, a step Apple executives said was possible because of the more powerful custom chips in Apple devices versus a decade ago.
Federighi said the new design will span operating systems for iPhones, Macs and other Apple products.
He also said Apple's operating systems will be given year names instead of sequential numbers for each version.
That will unify naming conventions that have become confusing because Apple's core operating systems for phones, watches and other devices kicked off at different times, resulting in a smattering of differently numbered operating systems for different products.
In other new features, Apple introduced "Call Screening", in which iPhones will automatically answer calls from unknown numbers and ask the caller the purpose of their call.
Once the caller states their purpose, the iPhone will show a transcription of the reason for the call and then ring for the owner.
Apple also said it will add live translation to phone calls, as well as allow developers to integrate its live translation technology into their apps.
Apple said the caller on the other end of the phone call will not need to have an iPhone for the live translation feature to work.
Apple's Visual Intelligence app - which can help users find a pair of shoes similar to ones they have pointed an iPhone camera at - will be extended to analyse items on the iPhone's screen and will be linked with other apps.
Apple gave an example of seeing a jacket online and using the feature to find a similar one for sale on an app already installed in the user's iPhone.
Related Articles

Sydney Morning Herald – 2 hours ago
Inside the secretive labs where Apple's torturers put iPhones to the test
Most of us have, at some point, dropped a phone. Sometimes it hits at just the wrong angle, or on just the wrong surface, and shatters. Other times, it's miraculously unscathed, either because of sheer luck or because of the way it's been designed.

In Sunnyvale, California, inside an unmarked and nondescript building, a team of engineers drops more devices each day than you hopefully will in your entire life. The building is home to Apple's durability labs – among many similar facilities around the world – where phones and other products are thrown, dunked, sprayed, submerged, humidified, salted, buffeted, shaken and dismantled. Not only to test their durability and qualify for certifications, but to guide design decisions from the earliest development stages to help the final devices survive the dangers of the outside world.

When I visit, the staff are friendly and eager to discuss their meticulous and scientific brand of tech torture (though Apple has not allowed me to quote them). They also give the impression of lab workers who aren't used to visitors. Their work is largely out of the public eye, even more so than some of the work at the nearby main Apple campus in Cupertino.

Something that becomes immediately apparent is that, while Apple wants to simulate real-world scenarios, it can't just have its workers drop an iPhone down the stairs or slip an iPad into a soapy bath. The incidents have to be consistent and replicable, so any damage can be understood and mitigated, meaning there's an awful lot of science involved. And robots.

But the first area I find is largely robot-free. Here, devices are subjected to simulated worst-case environmental conditions. A massive walk-in cupboard has new iMacs operating in 90 per cent humidity, at 40 degrees. A month in there can simulate years of muggy real-world exposure.
Elsewhere, iPhones are being soaked in a high-density salt mist, or withstanding a vortex of artificial sand designed to simulate the particulate matter of the Arizona desert. A UV chamber simulates the long-term effect of the sun on devices. Sure, you could just put them outside, but the chamber can impart many years' worth of rays in just 50 hours. When Apple introduces a punchy new colour or sparkly new finish for one of its devices, it's one that's put up with this kind of punishment and come through fine. Other potential finishes may not be so lucky.

It's not all about making sure the devices stay nice on the outside, though. They're tested thoroughly to ensure 100 per cent functionality after their ordeals, and autopsied to check for corrosion or dust ingress.

The tests are developed against real-world data indicating the worst likely cases of what could happen to a consumer's device. Part of that comes from analysing damaged products that are sent in for repair or recycling, but a lot also comes from devices in the wild, with anonymised data including the amount of sunlight hitting the sensors and other analytics. When you set up an Apple product and it asks whether you want to send the company data to help improve its products, this is some of the stuff it's talking about.


9 News – a day ago
Exclusive: Apple executive admits Siri revamp was pulled for falling short
Exclusive: One year ago, Apple announced a suite of new features and capabilities for its products that tapped into the boom in artificial intelligence, calling them Apple Intelligence. This week, the company fronted up with its next round of software updates, but missing from the list was a core update to Siri announced in 2024.

The update was referenced only briefly in the keynote as needing "more time to reach a high-quality bar". I sat down with the company's Senior Vice President of Worldwide Marketing, Greg "Joz" Joswiak, to unpack Apple's approach to artificial intelligence and the misstep that is the missing Personal Context features in Siri.

Trevor Long speaks with Apple's Senior Vice President of Worldwide Marketing, Greg "Joz" Joswiak. (Trevor Long)

When talking about AI, Joswiak says it has to be like any Apple feature and "just work", telling 9News exclusively: "Our approach is to take generative AI and use it to make the features across our apps, across our operating systems, across our products to make those things better.

"Sometimes you don't even know or care that you're using generative AI, that you're using Apple Intelligence to do those things. They just - and you know us - it just works."

But 12 months ago, the company promised a new and innovative experience with the voice assistant Siri, one that could know more about you and your life.

"One of the things that we wanted to do for it (Siri), that we talked about last year, was to make it a more personal context," Joswiak says. "That it was able to use a semantic index of all the information about you on your device.

"So, for example, when's my mum's flight coming in? It knows who your mum is in that context. It knows that flight information, no matter where it came in, was that information on text or an email?"
Twelve months ago, the company promised a new and innovative experience with the voice assistant Siri. (Adobe Stock)

It didn't happen - the feature hasn't launched yet. Joswiak says "we said these things would be coming in the coming months, and that we thought we'd ship by later in the year".

"It just wasn't quite hitting our quality standards," he says. "So we said, okay, maybe we should do it by the spring. And it still again, was working, but too many times it was not working correctly.

"So while a demonstration on stage was possible, the concept of taking all the information on your phone and using that to help Siri make decisions based on questions you asked just wasn't working in the real world."

This brought the development teams to a decision point.

"So we had to make a decision and say, look, do we want to ship it to our customers and say, okay, look, we did it. We promised you it would, but it's again, it's not perfect," Joswiak says.

"Or do we want to wait until we can do it better? And we knew we were working on another version of Siri, a new generation of Siri, that would allow us the underpinnings to do it better, to do it with a much lower error rate.

"So we had to make what I would say is a tough call to hold it off."

Despite fierce criticism in recent months of this failure to deliver, Joswiak says he "would make that call again".

"To say that we want to deliver a better experience, not just a checkbox that we shipped it, but shipped an experience that hit our level of quality," he says. "And so, while we never want to disappoint people, I think we disappoint them more if we ship something that didn't work."

Apple announced several changes coming with iOS 26 - but not the promised Siri update. (Supplied)

Clearly though, Apple knows it is getting more attention on this than perhaps might be deserved, with Joswiak pointing the finger, albeit vaguely, at other companies that don't receive the same scrutiny.
"We want our customers to have great experiences with our product," he says. "And we've made mistakes where we ship something that we wish was more perfect than it was.

"And in the end, you learn from those and you say, look, we have a quality bar, we want to make sure that we hit it.

"Oftentimes you see our competitors will announce things, not ship them. And sometimes people don't even notice."

For Joswiak and Apple, this is about quality and the experience.

"People are so used to us delivering what we say or delivering at a quality level, expectations are different from that," he says. "And we welcome that. That's who we are. We love that our customers have high expectations for us. And like I said, we never want to disappoint them, but we also want to make sure that we deliver quality products to them."

With all that said, it appears unlikely we'll see any dramatic improvement to Siri in 2025; these fundamental updates and personal context features are likely to come in 2026, hopefully well before the company's next Worldwide Developers Conference.

Trevor Long travelled to the US as a guest of Apple.