Made by Google on August 20: What to expect from Pixel 10 Pro Fold, Watch 4

Promotional materials for the anticipated Google Pixel 10 Pro Fold foldable smartphone and Pixel Watch 4 have reportedly surfaced online. According to a report by 9To5Google, the leaked marketing assets showcase the devices from various angles and highlight new features, including Gemini AI integration on the Pixel Watch 4.
For the uninitiated, Google is set to host its annual Made by Google event on August 20, where the company is expected to launch its Pixel 10 series smartphones. The Pixel Watch 4 and Pixel Buds 2a are also likely to debut at the event.
Google Pixel 10 Pro Fold: What to expect
According to the report, the marketing material offers a closer look at the design of Google's next-generation book-style foldable smartphone. The images show the Pixel 10 Pro Fold in the same grey-blue colour that Google previously previewed for the Pixel 10 Pro. This new colourway is expected to be called 'Moonstone.'
Design-wise, the foldable appears similar to its predecessor, featuring a floating camera island that houses a triple-camera setup. The cover display is expected to be slightly larger, likely due to slimmer bezels. Google may also introduce changes to the hinge mechanism to improve durability.
The Pixel 10 Pro Fold is expected to be powered by the next-generation Tensor G5 chip, which is likely to power the entire Pixel 10 series. Built on TSMC's 3nm process, the new chip is expected to deliver improvements in both performance and efficiency, along with a custom image signal processor (ISP) for enhanced photo and video quality.
In terms of cameras, the foldable is rumoured to use the main sensor from the Pixel 9a, instead of the Pixel 9 sensor found in the Pixel 9 Pro Fold. Additionally, the device is expected to support Qi2 magnetic wireless charging, along with support for custom accessories that Google will likely brand as 'Pixelsnap.'
Google Pixel Watch 4: What to expect
The Pixel Watch 4 is also expected to retain the overall design language of its predecessor, but with a notable hardware change: its charging contacts will now be placed on the left side instead of the back. The images show a side-mounted magnetic charging pin, likely meant to work with a redesigned charger. According to 9To5Google, this new charger will be called the 'Quick Charge Dock' and will support up to 25 per cent faster charging.
The report also mentions that the Pixel Watch 4 will feature a new 'Actua 360' display, which is expected to be slightly larger and capable of reaching 3,000 nits of brightness. The smartwatch will also include dual-frequency GPS for more accurate location tracking, particularly in dense urban areas.
On the software front, the Pixel Watch 4 will come with built-in Gemini AI. According to the leaked marketing copy, the integration will allow users to 'raise your wrist for quick AI assistant responses and personalized help,' and to 'keep the conversation going with AI text suggestions that sound like you.'

Related Articles

iPhone 17 Pro launch date and price in India is…

Hindustan Times

Apple is gearing up for its major launch of the year with the iPhone 17 series, expected in the first week of September 2025. While no official announcement has been made, Bloomberg's Mark Gurman suggests the event could take place on September 9 or 10. The new lineup will likely include the iPhone 17, iPhone 17 Pro, iPhone 17 Pro Max, and a new slim model, the iPhone 17 Air, which replaces the iPhone 17 Plus. Alongside the phones, Apple is also anticipated to unveil the Apple Watch Series 11, Apple Watch SE 3, and Apple Watch Ultra 3.

In India, pricing is expected to start at around ₹1,45,990 for the iPhone 17 Pro and ₹1,64,990 for the iPhone 17 Pro Max. Pricing details for the iPhone 17 Air and the standard model remain to be confirmed.

The iPhone 17 series is expected to maintain display sizes similar to the current models. The Pro and Pro Max should keep their 6.3-inch and 6.9-inch displays, while the iPhone 17 Air will sit between these two in size. The standard iPhone 17 might grow to 6.3 inches, matching the Pro.

Earlier reports suggested the iPhone 17 Air would be a premium device priced above the Pro Max, but current sources indicate it will be positioned as a mid-range option, less expensive than the Pro models but more costly than the standard iPhone 17. This positions it as a direct successor to the 'Plus' model, offering a balance of size and price.

All four models are expected to receive ProMotion technology with a 120Hz refresh rate for smoother scrolling and video playback, previously exclusive to Pro models. This will be enabled by LTPO OLED panels. Although LTPO displays support always-on screens, it remains uncertain whether this feature will be available on all models or remain limited to the Pro versions.

For the Pro models, Apple may replace the titanium frame with aluminium while keeping glass for MagSafe charging. This would create a half-glass, half-aluminium build, improving durability. The current square camera bump may be replaced by a horizontal pill-shaped design, which the iPhone 17 Air is also expected to adopt.

Powering the series is expected to be Apple's A19 Pro chip, built on TSMC's latest 3nm process. This chip promises better speed and power efficiency. For the first time, iPhones might come with up to 12GB of RAM, allowing smoother multitasking and support for advanced AI features under Apple Intelligence.

Camera upgrades are significant across the lineup. All models could see a jump from the 12MP front camera in the iPhone 16 series to a new 24MP sensor for sharper images and better cropping capability. The iPhone 17 Pro Max is expected to have three 48MP rear cameras (a Wide, Ultra Wide, and a Tetraprism Telephoto lens) and may support 8K video recording for the first time. The iPhone 17 Air might feature a single 48MP rear camera, while the standard iPhone 17 could come with a dual-lens setup consisting of Wide and Ultra Wide cameras. The Pro models may introduce a mechanical aperture for adjustable light intake, improving depth-of-field control, an industry first for iPhones, which have traditionally used fixed apertures.
They could also include dual video recording to film with front and rear cameras simultaneously, a feature popular among content creators but currently limited to third-party apps. The 5x Telephoto optical zoom will continue to be exclusive to the Pro models. The standard iPhone 17 and iPhone 17 Air will not feature Telephoto lenses or 5x zoom.

NASA and Google develop an AI medical assistant to be used by astronauts on deep-space missions

Indian Express

Google and NASA are working on a medical assistant powered by artificial intelligence that could be used for extended trips to space, starting with the Artemis campaign to return to the moon. The Crew Medical Officer Digital Assistant (CMO-DA) is powered by Google AI trained on spaceflight literature, and is designed to support a crew medical officer or flight surgeon in keeping the flight crew healthy. The AI medical assistant provides real-time analysis of crew health and performance, enabling medical decisions driven by data and predictive analytics.

Google said the AI assistant is being tested with simulated scenarios and evaluated using a clinical framework designed to assess the performance of medical students. The model performed well in early tests, with a diagnostic accuracy of 88 per cent for the ankle injury case, 80 per cent for ear pain, and 74 per cent for flank pain, according to a TechCrunch report. 'Early results showed promise for reliable diagnoses based on reported symptoms,' the blog post reads. 'Google and NASA are now collaborating with medical doctors to test and refine the model, aiming to enhance autonomous crew health and performance during future space exploration missions.'

The project is being implemented under a fixed contract with Google Public Sector, which includes cloud computing, infrastructure for application development, and model training. NASA owns the source code of the application, and the agency will participate in the finalisation of the models. Vertex AI provides access to both Google models and third-party solutions.

NASA plans to gradually expand the system's capabilities. Future versions will incorporate real-time data from onboard medical devices and learn to detect spaceflight-specific health conditions, such as the effects of microgravity on the human body.

'Google Cloud and @NASA have collaborated on a new AI-powered proof of concept tool to help astronauts autonomously diagnose and treat symptoms during deep space missions, a significant step that could also benefit remote medical care on Earth,' Thomas Kurian (@ThomasOrTK) posted on August 7, 2025.

Both NASA and Google are now working with doctors to refine the model, with the goal of improving autonomous medical care for future missions to the moon, Mars and beyond. The technology could also help deliver quality medical care to people in remote parts of Earth.

This isn't the first NASA project to incorporate artificial intelligence. Earlier this year, the space agency's Jet Propulsion Laboratory successfully tested a new AI system called Dynamic Targeting, which allows Earth-observing satellites to autonomously decide where and where not to point their cameras in under 90 seconds, without human intervention. Developed over more than a decade, this technology mimics how a human might interpret imagery.

Gemini Live gets real-time access to Google Calendar, Tasks and Keep apps

Business Standard

Google's Gemini Live now supports real-time integration with Calendar, Tasks, and Keep, enabling users to manage schedules, reminders, and notes directly in live chats on Android and iOS.

Google has expanded Gemini Live's capabilities with real-time integration into Google Calendar, Tasks, and Keep for Android and iOS. The update allows users to manage schedules, reminders, and notes directly within live conversations. The integration, first teased at Google I/O 2025 in May, moves Gemini Live towards more personalised functionality by linking it with widely used Google apps. Users can now create Calendar events, set Task reminders, and add notes to Keep without leaving the chat interface.

How it works

The feature, which began limited testing in late June, is now more widely available. It starts with integration for Google Maps, Calendar, Tasks, and Keep, enabling actions like adding events mid-chat or pulling location details instantly. On Samsung devices, Gemini Live also connects with Calendar, Notes, and Reminders.

According to 9To5Google, when enabled, the interface shows the app name above fullscreen controls along with a loading indicator. Actions, such as creating a list, prompt confirmation messages and an 'Undo' option for quick edits. Users can reference apps directly (for example, 'Create a new task in Tasks') or make general queries like 'Do I have any reminders today?' to trigger responses. The features work alongside Gemini Live's video and screen sharing, allowing, for instance, immediate event creation when dates are detected in the user's environment or on-screen.

Rollout status

While some users gained access in late June, the rollout has been gradual. The integration is now appearing in both stable and beta versions of the Google app on Android, as well as on iOS, broadening access to Gemini Live's personal data tools.
