Google mocks Apple's delayed AI features in Pixel 10 series ad: Watch here
In the narration, Google appears to reference Apple's WWDC 2024 announcement, suggesting that you should 'just change your phone' if you bought 'a new phone because of a feature that's coming soon, but it's been coming soon for a full year.' The teaser plays an instrumental version of 'The Next Episode' by Dr. Dre in the background, a likely nod to Apple's acquisition of Beats by Dre in 2014, adding another layer to the dig.
At WWDC 2025, Apple's Senior Vice President of Global Marketing, Greg Joswiak, confirmed in an interview with The Wall Street Journal that the Siri upgrades will now arrive in 2026. He said the feature set did not meet Apple's quality standards in internal testing.
Google Pixel 10 series: What to expect
Google is expected to unveil the Pixel 10 series at its upcoming Made by Google event on August 20. The lineup is likely to include four models: Pixel 10, Pixel 10 Pro, Pixel 10 Pro XL, and Pixel 10 Pro Fold.
All four devices are expected to be powered by the new Google Tensor G5 chipset, reportedly built on TSMC's 3nm process. The chip should offer improved performance and power efficiency, along with a custom image signal processor (ISP) to boost photo and video quality.
In terms of cameras, Google may reuse the main and ultra-wide sensors from the Pixel 9a on the Pixel 10 and Pixel 10 Pro Fold — marking a slight downgrade compared to the Pixel 9 series. However, the 5x periscope telephoto lens from last year's Pixel 9 Pro Fold might now appear on the base model as well. Meanwhile, the Pixel 10 Pro and Pro XL are expected to retain the same camera setup as last year's models.
The Pixel 10 series is also expected to support Qi2 magnetic wireless charging, adopting the newer 25W standard for faster and more efficient charging. In addition to the phones, Google is reportedly preparing a new line of accessories under the 'Pixelsnap' branding, which will include magnetic chargers and custom-fit cases.
Related Articles


Hindustan Times, 11 minutes ago
iPhone 17 Pro launch date and price in India is…
Apple is gearing up for its major launch of the year with the iPhone 17 series, expected in the first week of September 2025. While no official announcement has been made, Bloomberg's Mark Gurman suggests the event could take place on September 9 or 10. The new lineup will likely include the iPhone 17, iPhone 17 Pro, iPhone 17 Pro Max, and a new slim model, the iPhone 17 Air, which replaces the iPhone 17 Plus. Alongside the phones, Apple is also anticipated to unveil the Apple Watch Series 11, Apple Watch SE 3, and Apple Watch Ultra 3.

In India, pricing is expected to start at around ₹1,45,990 for the iPhone 17 Pro and ₹1,64,990 for the iPhone 17 Pro Max. Pricing for the iPhone 17 Air and the standard model remains to be confirmed.

The iPhone 17 series is expected to maintain display sizes similar to the current models. The Pro and Pro Max should keep their 6.3-inch and 6.9-inch displays, while the iPhone 17 Air will sit between the two in size. The standard iPhone 17 might grow to 6.3 inches, matching the Pro. Earlier reports suggested the iPhone 17 Air would be a premium device priced above the Pro Max, but current sources indicate it will be positioned as a mid-range option: less expensive than the Pro models but more costly than the standard iPhone 17. This makes it a direct successor to the 'Plus' model, balancing size and price.

All four models are expected to receive ProMotion technology with a 120Hz refresh rate for smoother scrolling and video playback, previously exclusive to Pro models and enabled by LTPO OLED panels. Although LTPO displays support always-on screens, it remains uncertain whether that feature will be available on all models or stay limited to the Pro versions.
For the Pro models, Apple may replace the titanium frame with aluminium while keeping glass for MagSafe charging, creating a half-glass, half-aluminium build that improves durability. The current square camera bump may give way to a horizontal pill-shaped design, which the iPhone 17 Air is also expected to adopt.

Powering the series is expected to be Apple's A19 Pro chip, built on TSMC's latest 3nm process and promising better speed and power efficiency. For the first time, iPhones might come with up to 12GB of RAM, allowing smoother multitasking and support for advanced AI features under Apple Intelligence.

Camera upgrades are significant across the lineup. All models could jump from the 12MP front camera of the iPhone 16 series to a new 24MP sensor for sharper images and better cropping capability. The iPhone 17 Pro Max is expected to have three 48MP rear cameras (Wide, Ultra Wide, and Tetraprism Telephoto) and may support 8K video recording for the first time. The iPhone 17 Air might feature a single 48MP rear camera, while the standard iPhone 17 could come with a dual-lens setup of Wide and Ultra Wide cameras.

The Pro models may introduce a mechanical aperture for adjustable light intake, improving depth-of-field control, an industry first for iPhones, which have traditionally used fixed apertures. They could also include dual video recording to film with the front and rear cameras simultaneously, a feature popular among content creators but currently limited to third-party apps. The 5x Telephoto optical zoom will remain exclusive to the Pro models; the standard iPhone 17 and iPhone 17 Air will not feature Telephoto lenses or 5x zoom.


Indian Express, 11 minutes ago
NASA and Google develop an AI medical assistant to be used by astronauts on deep‑space missions
Google and NASA are working on a medical assistant powered by artificial intelligence that could be used on extended trips into space, starting with the Artemis campaign to return to the moon. The Crew Medical Officer Digital Assistant (CMO-DA) is powered by Google AI trained on spaceflight literature and is designed to support a crew medical officer or flight surgeon in keeping the flight crew healthy. The assistant provides real-time analysis of crew health and performance, enabling medical decisions driven by data and predictive analytics.

Google said the AI assistant is being tested with simulated scenarios and evaluated using a clinical framework designed to assess the performance of medical students. The model performed well in early tests, with a diagnostic accuracy of 88 per cent for an ankle injury case, 80 per cent for ear pain, and 74 per cent for flank pain, according to a TechCrunch report. 'Early results showed promise for reliable diagnoses based on reported symptoms,' the blog reads. 'Google and NASA are now collaborating with medical doctors to test and refine the model, aiming to enhance autonomous crew health and performance during future space exploration missions.'

The project is being implemented under a fixed contract with Google Public Sector, which includes cloud computing, infrastructure for application development, and model training. NASA owns the source code of the application, and the agency will participate in finalising the models. Vertex AI provides access to both Google models and third-party solutions. NASA plans to gradually expand the system's capabilities: future versions will incorporate real-time data from onboard medical devices and learn to detect spaceflight-specific health conditions, such as the effects of microgravity on the human body.
Google Cloud and @NASA have collaborated on a new AI-powered proof of concept tool to help astronauts autonomously diagnose and treat symptoms during deep space missions, a significant step that could also benefit remote medical care on Earth. — Thomas Kurian (@ThomasOrTK) August 7, 2025

Both Google and NASA are now working with doctors to refine the model, with the goal of improving autonomous medical care for future missions to the moon, Mars and beyond. The technology could also help deliver quality medical care to people in remote parts of Earth.

This isn't the first NASA project to incorporate artificial intelligence. Earlier this year, the space agency's Jet Propulsion Laboratory successfully tested a new AI system called Dynamic Targeting, which allows Earth-observing satellites to autonomously decide where, and where not, to point their cameras in under 90 seconds, without human intervention. Developed over more than a decade, the technology mimics how a human might interpret imagery.

Business Standard, 11 minutes ago
Gemini Live gets real-time access to Google Calendar, Tasks and Keep apps
Google's Gemini Live now supports real-time integration with Calendar, Tasks, and Keep, enabling users to manage schedules, reminders, and notes directly in live chats on Android and iOS.

New Delhi: Google has expanded Gemini Live's capabilities with real-time integration into Google Calendar, Tasks, and Keep for Android and iOS. The update allows users to manage schedules, reminders, and notes directly within live conversations. The integration, first teased at Google I/O 2025 in May, moves Gemini Live towards more personalised functionality by linking it with widely used Google apps. Users can now create Calendar events, set Task reminders, and add notes to Keep without leaving the chat interface.

How it works
The rollout, which began limited testing in late June, is now more widely available. It starts with integration for Google Maps, Calendar, Tasks, and Keep, enabling actions like adding events mid-chat or pulling location details instantly. On Samsung devices, Gemini Live also connects with Calendar, Notes, and Reminders. According to 9To5Google, when enabled, the interface shows the app name above fullscreen controls along with a loading indicator. Actions such as creating a list prompt confirmation messages and an 'Undo' option for quick edits. Users can reference apps directly, for example, 'Create a new task in Tasks', or make general queries like 'Do I have any reminders today?' to trigger responses. The features work alongside Gemini Live's video and screen sharing, allowing, for instance, immediate event creation when dates are detected in the user's environment or on-screen.

Rollout status
While some users gained access in late June, the rollout has been gradual. The integration is now appearing in both stable and beta versions of the Google app on Android, as well as on iOS, broadening access to Gemini Live's personal data tools.