Latest news with #CraigFederighi


Tom's Guide
2 days ago
- Tom's Guide
iOS 26 beta just teased the highly anticipated HomePod with a screen
The fourth developer beta of iOS 26 has arrived and, allegedly, it comes with code that teases HomePod settings, including one setting that seems to reference a HomePod with a display. The HomePod data was discovered by MacRumors, which found the foreshadowing phrase: "Your HomePod won't be able to show you the local weather, time, or respond to Siri requests about your area." The suggestive word is "show": right now the HomePod can't show you the weather or time, because there is no screen of any kind to display that information. But that could change with the upcoming Apple HomePod 3.

Beyond the slightly sketchy phrasing, there doesn't appear to be more in the code, but what's there is believed to be another hint at the long-rumored next-generation HomePod speaker. Up to this point, only leaks and rumors have painted a picture of an updated Apple speaker featuring an OLED, iPad-like display powered by a more AI-forward Siri 2.0, but no solid evidence has yet appeared. At one point, the new HomePod was supposed to launch in March of this year, but recent rumors have pushed the device into the third quarter of 2025. Part of the problem, reportedly, is Siri integration: many of the rumors claim that Apple wants to turn the HomePod into a home hub that uses AI and voice directives to control your smart home via the speaker, and that work has caused a delay in development.

In addition to advanced Siri support, the HomePod 3 is supposed to control HomeKit and Matter devices, run Apple apps and work in a similar fashion to an Amazon Echo Show. It's no secret that Apple is struggling to build its own AI features. Recently the company was dealt a major blow to its Apple Intelligence strategy, largely because Apple keeps getting in its own way when it comes to AI. It's gotten so bad that there are even reports that Apple may hand off AI upgrades to other companies like OpenAI or Anthropic.

Apple's Craig Federighi and Greg Joswiak spoke with Tom's Guide during WWDC 2025 and discussed Apple's approach to AI and how the company is working to improve Apple Intelligence going forward. "We will announce the date when we're ready to seed it, and you're all ready to be able to experience it," Federighi said of an updated Siri 2.0. Since then, though, nothing has been announced. Despite rumors that the HomePod 3 is still coming in 2025, it's not clear when Apple will finally release it, as the company continues to grapple with its AI problem.


Tom's Guide
3 days ago
- Business
- Tom's Guide
Apple's AI efforts dealt another major blow — this has caused 'an earthquake inside Apple'
Apple's push to catch up in the AI race has hit another serious roadblock. The company has lost Ruoming Pang, the highly respected leader of its foundation models team, to Meta, and insiders say the ripple effects are already being felt across Apple's AI division. Pang, who joined Apple in 2021 from Google DeepMind, was central to the company's efforts to build its own large language models (LLMs). His departure, along with that of several close collaborators, signals deeper unrest within Apple's AI ranks. As reported by The Information, Pang's exit and its aftermath have led to an "earthquake inside Apple."

Why Pang's exit matters

Pang was known for his hands-on technical contributions, including developing a key open-source training tool for Apple's AI models. Under his leadership, Apple made strides in shrinking LLMs to run efficiently on iPhones, a critical part of its "on-device AI" strategy. But those advances came with internal tensions. According to reporting from The Information, Pang's team had wanted to release some of Apple's AI models as open source earlier this year. This move could have shown progress while inviting collaboration from outside researchers. But Apple exec Craig Federighi reportedly shut it down, concerned it would expose performance compromises Apple made to run the models on iPhones. That disagreement was just one of many signs of friction between Apple's research-driven foundation models team and its product-focused leadership.

A shift in power (and priorities)

Earlier this year, Apple reorganized its AI efforts following delays to its revamped Siri assistant. The Siri team was pulled from longtime AI chief John Giannandrea and placed under Federighi, who also oversees Apple's software division. Meanwhile, Pang's team remained with Giannandrea, but the separation highlighted a growing divide between R&D and product execution. Now, with Pang gone and several of his top researchers either leaving or exploring offers from OpenAI, Anthropic, and Meta, Apple faces a major talent drain at a critical moment. Bloomberg recently reported that Apple is testing outside models, including those from OpenAI and Google, to power Siri, a move that reportedly disheartened many on the internal AI team.

The bigger picture

While Apple made headlines with its Apple Intelligence announcement in June, integrating ChatGPT into iPhones and showcasing writing and image-generation tools, the company's own foundation models remain behind closed doors. Insiders say there's still a lack of clear direction about whether Apple wants to compete head-to-head with models like GPT-4 or build more narrow, hardware-optimized tools. In an interview with Tom's Guide following WWDC 2025, Craig Federighi, Apple's senior vice president of software engineering, and Greg Joswiak, the senior vice president of worldwide marketing, made it clear that Apple doesn't want to make a chatbot. Without Pang's leadership and vision, some fear Apple's internal AI efforts could stagnate, or become overly reliant on outside partners. Others remain optimistic that the hiring of Zhifeng Chen, a former Google engineer now leading the foundation models team, will bring fresh momentum. Either way, Apple's AI ambitions face a decisive inflection point.

As rivals like Meta, OpenAI and Google continue to poach top researchers and ship headline-grabbing models, Apple must prove it's still a serious contender in the generative AI era.


Hindustan Times
15-07-2025
- Hindustan Times
Remember ‘Trash Can' Mac Pro? Apple now calls it vintage and here's what that means for owners
Apple has quietly added a familiar fan-favourite, or fan-frustrator, to its vintage products list. The 2013 Mac Pro, most often dubbed the 'trash can' Mac, has officially been marked vintage, more than a decade after its release. Apple's boldest desktop design just got demoted, and it might stir up some strong opinions all over again. Though sold for years, it now enters its final phase of support, alongside a small batch of Apple products that are gradually reaching the end of the line. While the move isn't unexpected, it closes the chapter on one of Apple's boldest and most unconventional designs.

A design Apple would rather forget

When Apple launched the cylindrical Mac Pro in 2013, it caught a lot of attention. Sleek, black, and turbine-like, it was unlike anything the company had made before. The idea was to deliver powerful performance in a small and futuristic form. At the time, it looked like something from a sci-fi film. But the excitement didn't last. The design was compact, yes, but that same sleek form made upgrades nearly impossible. Graphics, storage, thermals, all were boxed into a space that didn't leave much room to grow. Power users quickly hit limitations, especially those wanting to swap out GPUs or add memory. Even Apple's own engineers ran into trouble. 'I think we designed ourselves into a bit of a thermal corner,' said Craig Federighi back in 2017, admitting publicly that the design didn't allow the performance flexibility they'd hoped for.

The Mac Pro remained on sale until 2019, when Apple finally rolled out a new version based on a more familiar tower-style frame, often referred to as the 'cheese grater Mac Pro.' That design stuck, offering better airflow and vastly improved upgrade options.

What vintage means and what comes next

Once a product hasn't been sold by Apple for five years, it's labelled vintage. That means limited support may still be available, but only if replacement parts are in stock. Spend two more years on this list, and the product becomes obsolete, losing all official support altogether. You also won't find help at Apple-authorised service centres after that point. Other devices now joining the vintage list include the 2019 13-inch MacBook Air, 2019 iMac, 2018 iPad Pros, and the iPhone 8 (128GB) model. Meanwhile, older accessories like AirPort routers and Time Capsules have now shifted into the obsolete category. The 'Trash Can' Mac Pro may have been a bold swing that didn't quite land, but it's now part of Apple's design legacy: unusual, eye-catching, and finally, vintage.


Yahoo
14-07-2025
- Yahoo
The AI Mirage
'I'm not going to respond to that,' Siri responded. I had just cursed at it, and this was my passive-aggressive chastisement. The cursing was, in my view, warranted. I was in my car, running errands, and had found myself in an unfamiliar part of town. I requested 'directions to Lowe's,' hoping to get routed to the big-box hardware store without taking my eyes off the road. But apparently Siri didn't understand. 'Which Lowe?' it asked, before displaying a list of people with the surname Lowe in my address book. Are you kidding me? Not only was the response incoherent in context, but also, only one of the Lowe entries in my contacts included an address anyway, and it was 800 miles away—an unlikely match compared with the store's address.

AI may not ever accomplish all of the things the tech companies say it will—but it seems that, at the very least, computers should be smarter now than they were 10 or 15 years ago. It turns out that I would have needed an entirely new phone for Siri to have surmised that I wanted to go to the store. Craig Federighi, Apple's senior vice president of software engineering, said in an interview last month that the latest version of Siri has 'better conversational context'—the sort of thing that should help the software know when I'm asking to be guided to the home-improvement store rather than to a guy called Lowe. But my iPhone apparently isn't new enough for this update. I would need cutting-edge artificial intelligence to get directions to Lowe's.

This is effectively Apple's entire pitch for AI. When it launched Apple Intelligence (the company's name for the AI stuff in its operating systems) last year, the world's third-most-valuable company promised a rich, contextual understanding of all your data, and the capacity to interact with it through ordinary phrases on your iPhone, iPad, or Mac. For example, according to Apple, you would be able to ask Siri to 'send the photos from the barbecue on Saturday to Malia.' But in my experience, you cannot ask even the souped-up Siri to do things like this.

I embarked on a modest test of Apple Intelligence on my Mac, which can handle the feature. It failed to search my email, no matter how I phrased my command. When I tried to use Siri to locate a PDF of a property-survey report that I had saved onto my computer, it attempted to delegate the task to ChatGPT. Fine. But ChatGPT provided only a guide to finding a survey of a property in San Francisco, a city in which I do not live. Perhaps I could go more general. I typed into Siri: 'Can you help me find files on my computer?' It directed me to open Finder (the Mac's file manager) and look there. The AI was telling me to do the work myself. Finally, I thought I would try something like Apple's own example. I told Siri to 'show me photos I have taken of barbecue,' which resulted in a grid of images—all of which were stock photos from the internet, not pictures from my library.

These limitations are different from ChatGPT's tendency to confidently make up stories and pass them off as fact. At least that error yields an answer to the question posed, albeit an inaccurate one. Apple Intelligence doesn't even appear to understand the question. This might not seem like a problem if you don't use Apple products or are content to rawdog your way to Lowe's.
But it does reveal a sad state of affairs for computing. For years, we've been told that frictionless interactions with our devices will eventually be commonplace. Now we're seeing how little progress has been made toward this goal.

I asked Apple about the problems I'm having with Apple Intelligence, and it more or less confirmed that the product doesn't work—yet. Apple's position is that the 2024 announcement, featuring Malia and the cookout, represents a vision for what Siri can and should do. The company expects that work on functionality of this kind will continue into 2026, and it showed me a host of other forthcoming AI tools, including one with the ability to recognize an event in a screenshot of a text message and add the info to a calendar, or to highlight an object in a photo and search for similar ones on Google or Etsy. I also saw a demo of live language translation on a phone call, updated AI-created emoji, and tools to refine what you've written inside emails and in Apple software. Interesting, but in my mind, all of these features change how you can use a computer; they don't improve the existing ways.

After rolling around in my head the idea that Apple Intelligence represents a vision for how a computer should work, I remembered that Apple first expressed this vision back in 1987, in a concept video for a product called Knowledge Navigator. The short film depicts a university professor carrying out various actions of daily and professional life by speaking directly to a personified software assistant on a tablet-like computer—all of the things I long to do with my computer 38 years hence. Knowledge Navigator, per the video, could synthesize information from various sources, responding to a user's requests to pull up various papers and data. 'Let me see the lecture notes from last semester,' the professor said, and the computer carried out the task. While the professor perused articles, the computer was able to identify one by a colleague, find her contact info, and call her upon his request.

Although obscure outside computer-history circles, Knowledge Navigator is legendary in Silicon Valley. It built on previous, equally fabled visions for computing, including Alan Kay's 1972 proposal for a tablet computer he called DynaBook. Apple would eventually realize the form of that idea in the iPad. But the vision of Knowledge Navigator wasn't really about how a device would look or feel. It was about what it would do: allow one to integrate all the aspects of a (then-still-theoretical) digital life by speaking to a virtual agent, Star Trek style. Today, this dream feels technologically feasible, yet it is still, apparently, just out of reach. (Federighi promised in the June interview that a better Siri was right around the corner, with 'much higher quality and much better capability.')

Apple Intelligence—really, generative AI overall—emphasizes a sad reality. The history of personal-computer interfaces is also a history of disappointments. At first, users had to type to do things with files and programs, using esoteric commands to navigate up and down the directory structures that contained them. The graphical user interface, which Apple popularized, adapted that file-and-folder paradigm into an abstraction of a desktop, where users would click and move those files around. But progress produced confusion. Eventually, as hard disks swelled and email collected, we ended up with so much digital stuff that finding it through virtualized rummaging became difficult.
Text commands returned via features such as Apple's Spotlight, which allows a user to type the name of a file or program, just as they might have done 50 years ago. But now the entire information space is a part of the computer interface. The location and route to Lowe's gets intermixed with people named Lowe in my personal address book. A cookout might be a particular event I attended, or it might be an abstraction tagged in online images. This is nothing new, of course; for decades now, using a computer has meant being online, and the conglomeration of digital materials in your head, on your hard disk, and on the internet often causes trouble. When you're searching the web, Google asks if you're perhaps really looking for the thing it deems more common based on other people's behavior, rather than the thing you typed. And iCloud Drive helpfully uploads your files to the cloud to save disk space, but then you can't access them on an airplane without Wi-Fi service. We are drowning in data but somehow unable to drink from its wellspring.

In principle, AI should solve this. Services such as ChatGPT, built on large language models that are trained on vast quantities of online and offline data, promised to domesticate the internet's wilds. And for all their risk of fabrication and hallucination, LLMs really do deliver on that front. If you want to know if there exists a lens with specific properties compatible with a particular model of camera, or seek advice on how to carry out a plumbing repair, ChatGPT can probably be of use. But ChatGPT is much less likely to help you make sense of your inbox or your files, partly because it hasn't been trained on them—and partly because it aspires to become a god rather than a servant.

Apple Intelligence was supposed to fill that gap, and to do so distinctively. Knowledge Navigator never got built, but it was massively influential within the tech industry as a vision of a computing experience; it shows that Apple has expressed this goal for decades, if under different technological conditions and executive leadership. Other companies, including Google, are now making progress toward that aim too. But Apple is in a unique position to carry out the vision. It is primarily a personal-computer-hardware business focused on the relationship between the user and the device (and their own data) instead of the relationship between the user and the internet, which is how nearly every other Big Tech company operates. Apple Intelligence would make sense of all your personal information and grant new-and-improved access to it via Siri, which would finally realize its purpose as an AI-driven, natural-language interface to all that data. As the company has already done for decades, Apple would leave the messy internet mostly to others and focus instead on the device itself.

That idea is still a good one. Using a computer to navigate my work or home life remains strangely difficult. Calendars don't synchronize properly. Email search still doesn't work right, for some reason. Files are all over the place, in various apps and services, and who can remember where? If computationalists can't even make AI run computing machines effectively, no one will ever believe that they can do so for anything—let alone everything—else.

Article originally published at The Atlantic


Tom's Guide
14-07-2025
- Tom's Guide
Apple's infamous trash can Mac is now classified as 'vintage'
Apple's ongoing list of vintage and obsolete products got a recent update, and it adds some unique Apple products, including the infamous "trash can" 2013 Mac Pro, now considered retro 12 years after its introduction. In general, products added to the list are more recent than the trash can desktop, but Apple sold that version of the Mac Pro for years. To be added to the vintage list, a device must have gone at least five years since Apple last produced and distributed it for sale. Apple sold the trash can Mac Pro until December of 2019, when it was replaced with the equally unique "cheesegrater" Mac Pro, a design that has largely stuck around since then.

At the time, the Mac Pro's cylindrical design was smaller than the previous tower Mac Pro. However, the design was flawed, with a lack of space to upgrade components like GPUs and RAM. As MacRumors reported in 2017, even Apple had to admit that the design was a failure when it came to updating the system, even for Apple engineers. "I think we designed ourselves into a bit of a thermal corner, if you will," said Craig Federighi, who was then software engineering chief.

Beyond the turbine-looking Mac Pro, a few other Apple devices are now on the vintage list. Additionally, a number of devices are now considered obsolete. The transition occurs when a product has been considered vintage for two years; put another way, devices are obsolete seven years after Apple discontinued them. Most of the obsolete list consists of Apple's long-dead AirPort routers.

Apple will support vintage products with repairs if parts are available. However, once parts supplies are gone, repairs will no longer be offered. Obsolete devices won't be repaired by Apple Stores or Apple Authorized Service Providers.