
Latest news with #FoundationModels

WWDC 2025: Apple delivers biggest visual overhaul after iOS 7 in AI era

Hans India

31 minutes ago

Analysts on Tuesday said that this year, Apple highlighted one of its core strengths: delivering a unified and seamless experience across all its hardware, and added fresh visual appeal to it with the biggest visual overhaul since iOS 7. Apple previewed iOS 26 at its flagship developer conference, WWDC 2025 — a major update that brings a beautiful new design, intelligent experiences, and improvements to the apps users rely on every day. The new design provides a more expressive and delightful experience across the system while maintaining the instant familiarity of iOS.

Counterpoint Research Director Tarun Pathak told IANS that the updated visual design, featuring the new "Liquid Glass" aesthetic, is set to further enhance user engagement across Apple's device ecosystem. 'However, two announcements particularly captured our attention: on-device AI for developers and iPadOS reimagined. The ability for developers to access Apple's on-device models through the Foundation Models framework, coupled with the introduction of Apple Intelligence across iPhone, iPad, Mac, Apple Watch, and Apple Vision Pro, is a positive move in its AI journey, where it is perceived as being behind its competitors,' Pathak explained.

However, the AI of 2025 is different from the AI hype of 2024. Users are still warming up to AI use cases, and 'we have not seen a killer use case yet'. The most impactful revelation came from the iPadOS updates; these enhancements will solidify the iPad's position as a robust device for both content creation and consumption on the go, Pathak noted.

The tech giant also showcased watchOS 26, offering a beautiful new look and even more intelligence for a more personalised experience, to support users in staying active, healthy, and connected.
A new design with Liquid Glass makes features like the Smart Stack, Control Center, the Photos watch face, and in-app navigation and controls more expressive, while maintaining the instant familiarity of watchOS.

According to Prabhu Ram, VP at CyberMedia Research (CMR), Apple adopted a fundamentals-first approach, emphasising user interface (UI) enhancements over AI. Rather than chasing speculative AI trends, Apple focused on delivering immediate, tangible value to users. 'A key highlight was the introduction of on-device access to Apple's large language model through the new Foundation Models framework. This move signals a decentralised AI strategy, empowering developers to create differentiated experiences directly on Apple devices,' Ram told IANS.

However, the success of this approach will hinge on developer adoption, the strength of Apple's tooling, and the company's ability to iterate quickly. Apple's tightly integrated hardware ecosystem gives it a significant advantage and a unique opportunity to lead in AI monetisation. That window may close rapidly if Apple fails to keep pace, especially as agentic AI and cross-platform innovations accelerate elsewhere, said Ram.

Apple unveils design overhaul & new AI tools for developers

Techday NZ

2 hours ago

Apple has announced a series of updates to its developer tools and software platforms intended to support the creation of new app experiences across iOS, iPadOS, macOS, watchOS, and tvOS.

The company has introduced a new software design called Liquid Glass, which aims to bring an increased focus to content across Apple devices while retaining a sense of familiarity. This design extends from minor interface elements such as buttons and sliders to larger navigation features. Native frameworks like SwiftUI are designed to help developers adopt the new design. A new Icon Composer app has also been unveiled, intended to assist developers and designers in creating consistent and modern app icons with advanced features such as blurring, translucency adjustment, and previewing icons in multiple tints.

The Foundation Models framework is new to Apple's developer ecosystem and allows developers to build on-device experiences leveraging Apple Intelligence. The aim is to offer intelligent features that are available offline and prioritise user privacy. The framework supports Swift and is structured to simplify access to Apple's on-device AI models with minimal code.

"Developers play a vital role in shaping the experiences customers love across Apple platforms," said Susan Prescott, Apple's vice president of Worldwide Developer Relations. "With access to the on-device Apple Intelligence foundation model and new intelligence features in Xcode 26, we're empowering developers to build richer, more intuitive apps for users everywhere."

Automattic has already taken advantage of the Foundation Models framework for its Day One journaling app. "The Foundation Models framework has helped us rethink what's possible with journaling," said Paul Mayne, head of Day One at Automattic. "Now we can bring intelligence and privacy together in ways that deeply respect our users."
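The "minimal code" claim can be illustrated with a short sketch. The type and method names below (`LanguageModelSession`, `respond(to:)`) follow Apple's WWDC 2025 materials, but treat the exact API surface as an assumption; the code requires a device with Apple Intelligence and the iOS 26 / macOS 26 SDKs, and will not run elsewhere.

```swift
import FoundationModels

// Open a session with the on-device model and send a prompt.
// Everything runs locally: no network round-trip, no per-request fees.
func summarizeEntry(_ entry: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Summarize this journal entry in one sentence: \(entry)"
    )
    return response.content
}
```

This is the shape of the "three lines" Apple's press materials describe: create a session, send a prompt, read the response, with no model download, API key, or account setup in the app itself.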
Xcode 26

Apple has also updated its integrated development environment with the release of Xcode 26, adding intelligence features to assist developers in tasks such as code writing, test generation, documentation, error fixing, and more. Xcode now supports integration with large language models, including built-in support for ChatGPT, and allows for the use of third-party API keys or running models locally on Macs equipped with Apple silicon. Coding Tools in Xcode 26 offer developers suggested actions and support a range of tasks directly within the editor. Additional features include a redesigned navigation interface, improvements to localisation, and expanded voice control capabilities for coding and interface navigation.

App Intents and visual intelligence

This update includes enhancements to App Intents, with new support for visual intelligence. Developers can now provide visual search results that link users directly into their apps. For example, Etsy is incorporating visual intelligence to improve product discovery in its app. "At Etsy, our job is to seamlessly connect shoppers with creative entrepreneurs around the world who offer extraordinary items — many of which are hard to describe. The ability to meet shoppers right on their iPhone with visual intelligence is a meaningful unlock, and makes it easier than ever for buyers to quickly discover exactly what they're looking for while directly supporting small businesses," said Etsy CTO Rafe Colburn.

Swift 6.2 and new framework support

Swift 6.2 is set to bring improved performance, concurrency, and interoperability, now with expanded support for WebAssembly in conjunction with the open-source community. The new version also offers enhancements for writing single-threaded code. The Containerisation framework now enables developers to run Linux container images directly on a Mac, with an emphasis on isolation and security, and is optimised for Apple silicon.
Gaming tools and APIs

Updates for game developers include Game Porting Toolkit 3, new Metal 4 APIs designed for Apple silicon, and expanded support for running inference networks in shaders. Other features include MetalFX Frame Interpolation and Denoising, a new Apple Games app, Game Center enhancements, and support for managing in-game digital assets.

Child safety and accessibility

Developers can now use the new Declared Age Range API to deliver age-appropriate content, allowing parents to control what information is shared about their children. App Store Accessibility Nutrition Labels have also been introduced, providing users with more detailed accessibility information before downloading apps. The App Store Connect app also gains new capabilities, including viewing TestFlight feedback and crash reports, push notifications for tester feedback, enhanced support for webhooks, and expanded asset management features.

Apple Intelligence features require supported devices, including the latest iPhones, select iPads, and Macs with M1 or later processors, as well as compatible regional settings and languages. More languages are expected to be supported by the end of the year, with availability varying by region and device.

Apple Is Pushing AI Into More of Its Products—but Still Lacks a State-of-the-Art Model

WIRED

4 hours ago

Jun 9, 2025 8:22 PM

Apple took a measured approach to AI at WWDC. A new research paper suggests the company is skeptical about some recent AI advances, too.

Apple continued its slow-and-steady approach to integrating artificial intelligence into devices like the iPhone, Mac, and Apple Watch on Monday, announcing a raft of new features and upgrades at WWDC. The company also premiered the Foundation Models framework, a way for developers to write code that taps into Apple's AI models.

Among the buzzier AI announcements at the event was Live Translation, a feature that translates phone and FaceTime calls from one language to another in real time. Apple also showed off Workout Buddy, an AI-powered voice helper designed to provide words of encouragement and useful updates during exercise. 'This is your second run this week,' Workout Buddy told a jogging woman in a demo video. 'You're crushing it.'

Apple also announced an upgrade to Visual Intelligence, a tool that uses AI to interpret the world through a device's camera. The new version can also look at screenshots to do things like identify a product or summarize a webpage. Apple showcased upgrades to Genmoji and Image Playground, two tools that generate stylized images with AI. And it showed off ways of using AI to automate tasks, generate text, summarize emails, edit photos, and find video clips.

The incremental announcements did little to dispel the notion that Apple is playing catch-up on AI. The company does not yet have a model capable of competing with the best offerings of OpenAI, Meta, or Google, and still hands some challenging queries off to ChatGPT. Some analysts suggest that Apple's more incremental approach to AI development is warranted. 'The jury is still out on whether users are gravitating towards a particular phone for AI-driven features,' says Paolo Pescatore, an analyst at PP Foresight.
'Apple needs to strike the fine balance of bringing something fresh and not frustrating its loyal core base of users,' Pescatore adds. 'It comes down to the bottom line, and whether AI is driving any revenue uplift.'

Francisco Jeronimo, an analyst at IDC, says Apple making its AI models accessible to developers is important because of the company's vast reach with coders. '[It] brings Apple closer to the kind of AI tools that competitors such as OpenAI, Google, and Meta have been offering for some time,' Jeronimo said in a statement.

Apple's AI models, while not the most capable, run on a personal device, meaning they work without a network connection and don't incur the fees that come with accessing models from OpenAI and others. The company also touts a way for developers to use cloud models that keeps private data secure through what it calls Private Cloud Compute.

But Apple may need to take bigger leaps with its use of AI in the future, given that its competitors are exploring how the technology might reinvent personal computing. Both Google and OpenAI have shown off futuristic AI helpers that can talk in real time and see the world through a device's camera. Last month, OpenAI announced it would acquire a company started by the legendary Apple designer Jony Ive in order to develop new kinds of AI-infused hardware.

Even if Apple still lags behind in building advanced AI, the company is publishing AI research at a steady clip. A paper posted a few days before WWDC points to significant shortcomings in today's most advanced AI models — a convenient finding, perhaps, if you are still getting up to speed. The paper finds that the latest models from OpenAI and others, which use a simulated form of reasoning to solve difficult problems, tend to fail when problems reach a certain level of complexity.
The Apple researchers asked various models to solve increasingly complex versions of a mathematical puzzle known as the Tower of Hanoi, and found that they succeeded up to a point, then failed dramatically.

Subbarao Kambhampati, a professor at Arizona State University who previously published similar work on the limits of reasoning models, says Apple's research reinforces the idea that simulated reasoning approaches may need to be improved in order to tackle a wider range of problems. Reasoning models 'are very useful, but there are definitely important limits,' Kambhampati says.

But even if the work suggests that a more cautious approach to AI is warranted, Kambhampati does not believe Apple is being complacent. 'If you know what's going on inside Apple, they're still pretty gung-ho about LLMs,' he says.

Apple just leapfrogged Android by giving apps access to powerful AI tools

Android Authority

10 hours ago

TL;DR

  • Apple is giving third-party apps direct, offline access to its on-device AI model.
  • Developers can use Apple's Foundation Models framework with just a few lines of code.
  • Testing starts today, with full rollout this fall on supported devices.

Google's been trying to weave AI into Android apps for years. Apple Intelligence may have received some criticism in the past for being a little slow to the party, but at today's WWDC 2025, Apple just handed app developers the keys to its main LLM, with offline and instant access.

As part of a wave of updates to its Apple Intelligence platform, the company announced that developers can now tap directly into the on-device foundation model that powers these features. This means any iOS app — not just Apple's own — can add generative AI capabilities without an internet connection.

According to Apple's press release, the Foundation Models framework lets developers access Apple's large language model using as little as three lines of Swift code. The model supports guided generation, tool calling, and natural language processing, all locally and free of charge. Apple pitched examples like a quiz app that auto-generates questions from notes, or a hiking app that lets you search trails in plain language, even if you're completely offline. Because everything happens on-device, Apple emphasizes that it 'protects privacy by design.'

Google may have been quicker in baking AI into the OS, but Apple is making it available to every app in a more seamless manner. The new developer tools and Apple Intelligence features are available for testing now via the Apple Developer Program. A public beta will arrive next month, and full access will roll out this fall to users with supported devices.
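As a sketch of how guided generation could serve the quiz-app pitch: the `@Generable` and `@Guide` macros and the `respond(to:generating:)` method below are taken from Apple's WWDC 2025 sessions, so treat the exact names as assumptions rather than verified API. The idea is that the app declares the shape of the output as a Swift type and the on-device model fills it in, instead of the app parsing free-form text.

```swift
import FoundationModels

// Declare the structure we want the model to produce. The @Guide
// descriptions steer each field; the framework constrains the model's
// output to match this type, so no string parsing is needed.
@Generable
struct QuizQuestion {
    @Guide(description: "A short question derived from the user's notes")
    var question: String
    @Guide(description: "Four possible answers")
    var choices: [String]
    @Guide(description: "Index of the correct answer within choices")
    var correctIndex: Int
}

func makeQuestion(from notes: String) async throws -> QuizQuestion {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Write one multiple-choice question from these notes: \(notes)",
        generating: QuizQuestion.self
    )
    return response.content  // already a typed QuizQuestion
}
```

Because the model runs on-device, a call like this would work with no network connection and no usage fees, which is the crux of the article's "offline and instant access" framing.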

Apple brings ChatGPT and other AI models to Xcode

TechCrunch

10 hours ago

At WWDC 2025, Apple released a new version of Xcode, its app development suite, that integrates OpenAI's ChatGPT for coding, doc generation, and more. The company also announced that developers can use API keys to bring AI models from other providers into Xcode for AI-powered programming suggestions.

'Developers can connect [AI] models directly into their coding experience to write code, tests, and documentation; iterate on a design; fix errors; and more,' said the company in a blog post about the new version of Xcode, Xcode 26.

With the new AI integrations in Xcode, developers can use tools to generate a preview of code or handle other tasks. Developers can tap ChatGPT in Xcode without creating an account; paid ChatGPT users can connect their accounts to increase the rate limits.

In other AI-related news today, Apple also launched the Foundation Models framework to let developers tap into the company's AI models running on-device. The company said that developers need to write just three lines of code to access these models.
