Apple's smart glasses project may use a chip derived from the Apple Watch
Apple is developing a chip for smart glasses, according to Bloomberg's Mark Gurman, and it's based on the chip used in the Apple Watch. The company's silicon group has reportedly removed some components from the Watch chip to improve its power efficiency, and it has been working to give the chip the ability to control the multiple cameras the glasses would need in order to function. Gurman previously reported that the company has set aside plans for Mac-tethered augmented reality glasses but still intends to release standalone AR glasses in the future. The upcoming device won't be true AR glasses, however; it would be more of a direct competitor to Meta's Ray-Ban smart glasses.
The company is taking a slower, more cautious approach to developing the standalone glasses, so it could take some time before they're released. Gurman says Apple is looking to start mass-producing the chip for the product sometime in 2027, which means the company expects development to take at least two more years. Based on that timeline, Meta will likely release its first true pair of augmented reality glasses before Apple: the company is already working on one and hopes to launch it in 2027.
If the device is going to be a true competitor to Meta's Ray-Ban smart glasses, it will need to capture photos and videos, include speakers or earphones, and perform tasks via voice commands. Apple's glasses will reportedly use cameras to scan the wearer's surroundings and assist them with the help of artificial intelligence. So yes, it will be similar to Meta's Ray-Bans, but Bloomberg says Apple is still figuring out its exact approach for the product. Aside from the chip for its smart glasses, Apple is also reportedly working on chips for other devices, including a range of more powerful Macs and AI servers.

Related Articles
Yahoo
Watch These Apple Price Levels After WWDC 2025 Updates Fail to Boost Stock
Apple (AAPL) shares rose slightly Tuesday after losing ground the previous session following announcements from the company at its developers conference that failed to impress investors. The tech giant, which kicked off its week-long Worldwide Developers Conference on Monday, unveiled several AI-related improvements with iOS 26 but said enhanced Siri features needed more time to meet the company's quality standards. The lack of new Siri updates likely raised concerns that the company, which was slow to roll out its flagship Apple Intelligence software, is losing ground to other tech giants in artificial intelligence, and that further delays could slow iPhone sales as consumers postpone their upgrade cycle. The company's keynote presentation Monday delivered "slow but steady improvements to strategy," Wedbush analysts said, "but overall [it was] a yawner."

Apple stock is down 19% since the start of 2025, making it one of the weakest performers among the Magnificent Seven group of major technology companies, alongside Tesla (TSLA). Apple shares gained 0.6% on Tuesday to close at $202.67, after dropping 1.2% the previous session. Below, we break down the technicals on Apple's chart and point out important price levels worth watching.

After rebounding from their early-April low, Apple shares have traded mostly sideways, with the price recently forming a rising wedge. The price then ran into selling pressure near the downward-sloping 50-day moving average, potentially setting the stage for a continuation of the stock's longer-term downtrend that started in late December. Meanwhile, the relative strength index has struggled to reclaim its neutral threshold, signaling bearish price momentum. Let's identify important support and resistance levels on Apple's chart that investors will likely be watching.

A breakdown below the rising wedge could initially see the shares fall to around $193. This area may provide support near the low of the pattern, which also aligns with a range of corresponding price action on the chart extending back to May last year. A failure by the bulls to defend this level opens the door for a more significant drop to $180. Bargain hunters may seek buy-and-hold opportunities in this location, near a brief retracement in May last year that followed a breakaway gap above the 200-day moving average. This level also sits in the same neighborhood as a measured-move downside price target, which takes the decline in points that preceded the rising wedge and subtracts that amount from the pattern's lower trendline.

During upswings in the stock, investors should monitor the $214 level. The shares may encounter overhead selling pressure in this area near several peaks and troughs that formed on the chart between September and May.
Finally, a decisive close above this level could see Apple shares climb toward $235. Investors who bought shares at lower prices may look for exit points in this region near notable peaks that developed on the chart in July and October last year. The comments, opinions, and analyses expressed on Investopedia are for informational purposes only. Read our warranty and liability disclaimer for more info. As of the date this article was written, the author does not own any of the above securities. Read the original article on Investopedia.
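The measured-move target described in the analysis above is simple arithmetic: take the size of the decline that preceded the pattern and subtract it from the pattern's lower boundary. Here is a minimal Python sketch; the function name and all prices are hypothetical placeholders, not values from Apple's chart.

# A minimal sketch of the measured-move arithmetic, with made-up numbers.
def measured_move_target(decline_start: float, decline_end: float,
                         pattern_low: float) -> float:
    """Project a downside target by subtracting the points lost in the
    preceding decline from the pattern's lower trendline."""
    decline_points = decline_start - decline_end
    return pattern_low - decline_points

# Hypothetical example: a 26-point decline before a wedge whose lower
# trendline sits near $206 projects a target near $180.
print(measured_move_target(decline_start=233.0, decline_end=207.0,
                           pattern_low=206.0))  # -> 180.0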


Bloomberg
Starlink Is Best for Rural Airports, Not So Much for Urban Ones, Cruz Says
Urban airports should use fiber-optic communications, while Elon Musk's Starlink satellite service may be better suited for rural ones, Senator Ted Cruz said. 'In a dense urban environment, fiber makes a lot of sense,' Cruz, a Texas Republican and chair of the Senate Commerce Committee, told Bloomberg Television in an interview on Tuesday.


Fast Company
AI assistants still need a human touch
When I first encountered AI, it wasn't anything like the sophisticated tools we have today. In the 1990s, my introduction came in the form of a helpful, but mostly frustrating, digital paperclip. Clippy, Microsoft's infamous assistant, was designed to help, but it often got in the way, popping up at the worst moments with advice no one asked for. AI has evolved since then. Major companies like Apple are investing billions, and tools like OpenAI's ChatGPT and DALL-E have reshaped how we interact with technology. Yet one challenge from Clippy's era lingers: understanding and adapting to user intent. The original promise of AI was to create experiences that felt seamless, intuitive, and personal. AI was supposed to anticipate our needs and provide support that felt natural. So why do so many systems today still feel mechanical and rigid, more Clippy than collaborator?

When AI assistance is a burden

When it was first introduced, Clippy was a bold attempt at computer-guided assistance. Its purpose was groundbreaking at the time, but it quickly became known more for interruptions than useful help. You'd pause while typing, and Clippy would leap into action with a pop-up: 'It looks like you're writing a letter!' Its biggest flaw wasn't just being annoying: it lacked contextual awareness. Unlike modern AI tools, Clippy's interactions were static and deterministic, triggered by fixed inputs. There was no learning from previous interactions and no understanding of the user's preferences or current tasks. Whether you were drafting a report or working on a spreadsheet, Clippy offered the same generic advice, ignoring the evolving context and failing to provide truly helpful, personalized assistance.

Is AI destined to be like Clippy?

Even with today's advancements, many AI assistants still feel underwhelming. Siri is a prime example. Though capable of setting reminders or answering questions, it often requires users to speak in very specific ways. Deviate from the expected phrasing, and it defaults to 'I didn't understand that.' This is more than a UX flaw; it reveals a deeper issue. Too many AI systems still operate under a one-size-fits-all mentality, failing to accommodate the needs of individual users. With Siri, for instance, you're often required to phrase requests in a specific, rigid format for them to be processed effectively. This creates an experience that feels less like assistance and more like a chore. Building a smarter assistant isn't just about better models. It's about retaining context, respecting privacy, and delivering personalized, meaningful experiences. That's not just technically difficult; it's essential.

Helpful AI requires personalization

Personalization is what will finally break us out of the Clippy cycle. When AI tools remember your preferences, learn from your behavior, and adapt accordingly, they shift from being tools to trusted partners. The key to this will be communication. Most AI today speaks in a one-dimensional tone, no matter who you are or what your emotional state is. The next leap in AI won't just be about intelligence; it'll be about emotional intelligence. Intelligence isn't only about remembering facts. It's also about how an assistant communicates. For AI to truly feel useful, it needs more than functionality. It needs personality. That doesn't mean we need overly chatty bots. It means assistants that adjust tone, remember personal context, and build continuity. That's what earns trust and keeps users engaged.
While not every user may want an assistant with a personality or emotions, everyone can benefit from systems that adapt to our unique needs. The outdated one-size-fits-all approach is still common in many AI tools today and risks alienating users, much like Clippy's impersonal method back in the early days. For AI to thrive in the long term, it must be designed with real humans in mind.

Building Clippy 2.0

Now imagine a 'Clippy 2.0': an assistant that doesn't interrupt but understands when to offer help. One that remembers your work habits, predicts what you need, and responds in a way that feels natural and tailored to you. It could help you schedule meetings, provide intelligent suggestions, and adapt its tone to fit the moment. Whether it has a personality or not, what matters is that it adapts to, and respects, the uniqueness of every user. It might even respond with different tones or emotions depending on your reactions, creating an immersive experience. This kind of intelligent assistant would blend seamlessly into your routine, saving you time and reducing friction. Clippy may have been a trailblazer, but it lacked the technological foundation to live up to its potential. With the advances we've made today, we now have the tools to build a 'Clippy 2.0': an AI assistant capable of transforming the way we interact with technology. Although maybe this time, it doesn't need to come in the form of a paperclip with a goofy smile.
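To make the 'Clippy 2.0' idea concrete, here is a minimal Python sketch of an assistant that retains per-user context and adjusts its tone, the two properties the piece argues for. The AssistantContext and Clippy2 names, their fields, and the canned replies are hypothetical illustrations; a real assistant would delegate the actual response to a language model where the stub comment indicates.

# A minimal sketch, assuming a hypothetical per-user context store.
from dataclasses import dataclass, field

@dataclass
class AssistantContext:
    name: str
    preferred_tone: str = "neutral"            # e.g. "casual", "formal"
    history: list[str] = field(default_factory=list)

class Clippy2:
    def __init__(self, context: AssistantContext):
        self.context = context

    def respond(self, request: str) -> str:
        # Remember the interaction so later replies can build continuity.
        self.context.history.append(request)
        greeting = {
            "casual": f"Hey {self.context.name}!",
            "formal": f"Certainly, {self.context.name}.",
        }.get(self.context.preferred_tone, "Okay.")
        # A real assistant would call a language model here; this stub
        # only shows where remembered context would shape the reply.
        return (f"{greeting} That's request number "
                f"{len(self.context.history)}. You asked: {request}")

ctx = AssistantContext(name="Ada", preferred_tone="casual")
assistant = Clippy2(ctx)
print(assistant.respond("Schedule a meeting for Friday"))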