
Latest news with #TheAndroidShow

Android 16 could still adopt iPhone-style notifications and quick settings — what we know

Tom's Guide

3 days ago

  • Tom's Guide


The Android Show: I/O Edition gave us our first look at Android 16, but the rumored splitting of the Quick Settings and Notifications panel was seemingly absent. However, it looks like the feature is still in the works. At first glance, the option isn't available in the current Android 16 QPR1 beta 1 release, but Android Authority has discovered it hidden in the code of the recent beta during an APK teardown.

According to the report, code strings found in the beta's files indicate that Google plans to add a new 'Notifications & Quick Settings' option. This would allow users to switch the panel design from the current look, called 'Classic,' to the split panel design, which has been called 'Separate.' The Classic design lets users swipe down from anywhere along the top of the screen to see a merged list of Quick Settings and notifications. The Separate panel, by contrast, requires users to swipe down from the top right side of the screen to access Quick Settings, and from the top left to access notifications.

This isn't the first time we've seen Android try this: split notifications first appeared last year in an Android 15 beta. The change was dropped when the OS launched, only to appear again in the early Android 16 betas. After The Android Show and the announcement of the new Material 3 Expressive design didn't feature the change, many thought it had been removed again.

While this code indicates that isn't entirely the case, there could be a caveat to which devices get access to the new notification/Quick Settings style. Android Authority found a menu footnote that only appears on foldable phones like the Galaxy Z Fold 6, which states that the Combined option is only available on the outer display. In theory, this could mean the Separate design is only available on the large inner screens of foldables, but there's no clear indication either way just yet.
Right now, there are still a fair number of unknowns about the feature, including when it will launch. It could arrive as part of the Android 16 public release, which is expected in June. Alternatively, it could come in a later update for Android 16, which is perhaps more likely considering its currently hidden status. Given the mixed feelings about the option, we want to hear your thoughts: would you switch to the Separate option, or stick with the Combined screen?

Google I/O 2025 recap: AI updates, Android XR, Google Beam and everything else announced at the annual keynote

Engadget

20-05-2025

  • Business
  • Engadget


Today is one of the most important days on the tech calendar as Google kicked off its I/O developer event with its annual keynote. As ever, the company had many updates for a wide range of products to talk about. The bulk of the Android news was revealed last week, during a special edition of The Android Show. However, Tuesday's keynote still included a ton of stuff including, of course, a pile of AI-related news. We covered the event in real time in our live blog, which includes expert commentary (and even some jokes!) from our team. If you're on the hunt for a breakdown of everything Google announced at the I/O keynote, though, look no further. Here are all the juicy details worth knowing about:

Quelle surprise, Google is continuing to shove more generative AI features into its core products. AI Mode, which is what the company is calling its new chatbot, will soon be live in Search for all US users. AI Mode lives in a separate tab and is designed to handle more complex queries than people have historically used Search for. You might use it to compare different fitness trackers or find the most affordable tickets for an upcoming event. AI Mode will soon be able to whip up custom charts and graphics related to your specific queries too, and it can handle follow-up questions. The chatbot now runs on Gemini 2.5. Google plans to bring some of its features into the core Search experience by injecting them into AI Overviews. Labs users will be the first to get access to the new features before Google rolls them out more broadly. Meanwhile, AI Mode is powering some new shopping features.
You'll soon be able to upload a single picture of yourself to see what a piece of clothing might look like on a virtual version of you. Also, similar to the way Google Flights keeps an eye out for price drops, Google will be able to let you know when an item you want (in its specific size and color) is on sale for a price you're willing to pay. It can even complete the purchase on your behalf if you want.

AI Overviews, the Gemini-powered summaries that appear at the top of search results and have been buggy to say the least, are seen by more than 1.5 billion people every month, according to Google. The "overwhelming majority" of people interact with these in a meaningful way, the company said; this could mean clicking on something in an overview or keeping it on their screen for a while (presumably to read through it). Still, not everyone likes the AI Overviews; some would rather just have a list of links to the information they're looking for. You know, like Search used to be. As it happens, there are some easy ways to declutter the results.

We got our first peek at Project Astra, Google's vision for a universal AI assistant, at I/O last year, and the company provided more details this time around. A demo showed Astra carrying out a number of actions to help fix a mountain bike, including diving into your emails to find the bike's specs, researching information on the web and calling a local shop to ask about a replacement part. It already feels like a culmination of Google's work in the AI assistant and agent space, though elements of Astra (such as granting it access to Gmail) might feel too intrusive for some.
In any case, Google aims to transform Gemini into a universal AI assistant that can handle everyday tasks. The Astra demo is our clearest look yet at what that might look like in action.

Gemini 2.5 is here with (according to Google) improved functionality, upgraded security and transparency, extra control and better cost efficiency. Gemini 2.5 Pro is bolstered by a new enhanced reasoning mode called Deep Think. The model can do things like turn a grid of photos into a 3D sphere of pictures, then add narration for each image. Gemini 2.5's text-to-speech feature can also change languages on the fly. There's much more to it than that, of course, and we've got more details in our Gemini 2.5 story.

You know those smart replies in Gmail that let you quickly respond to an email with an acknowledgement? Google is now going to offer personalized versions of those so that they better match your writing style. For this to work, Gemini looks at your emails and Drive documents, and it will need your permission before it plunders your personal information. Subscribers will be able to use this feature in Gmail starting this summer.

Google Meet is getting a real-time translation option, which should come in very useful for some folks. A demo showed Meet matching the speaker's tone and cadence while translating from Spanish to English. Subscribers on the Google AI Pro and Ultra (more on that momentarily) plans will be able to try out real-time translations between Spanish and English in beta starting this week. This feature will soon be available for other languages.
Gemini Live, a tool Google brought to Pixel phones last month, is coming to all compatible Android and iOS devices in the Gemini app (which already has more than 400 million monthly active users). This allows you to ask Gemini questions about screenshots, as well as live video that your phone's camera is capturing. Google is rolling out Gemini Live to the Gemini iOS and Android apps starting today.

Google Search Live is a similar-sounding feature. You'll be able to have a "conversation" with Search about what your phone's camera can see. This will be accessible through Google Lens and AI Mode.

A new filmmaking app called Flow, which builds on VideoFX, includes features such as camera movement and perspective controls; options to edit and extend existing shots; and a way to fold AI video content generated with Google's Veo model into projects. Flow is available to Google AI Pro and Ultra subscribers in the US starting today, and Google will expand availability to other markets soon.

Speaking of Veo, that's getting an update. The latest version, Veo 3, is the first iteration that can generate videos with sound (it probably can't add any soul or actual meaning to the footage, though). The company also suggests that its Imagen 4 model is better at generating photorealistic images and handling fine details like fabrics and fur than earlier versions. Handily, Google has a tool designed to help you determine whether a piece of content was generated using its AI tools. It's called SynthID Detector; naturally, it's named after the tool that applies digital watermarks to AI-generated material.
According to Google, SynthID Detector can scan an image, piece of audio, video or text for the SynthID watermark and let you know which parts are likely to have one. Early testers will be able to try this out starting today, and Google has opened up a waitlist for researchers and media professionals. (Gen AI companies should offer educators a version of this tech ASAP.)

To get access to all of its AI features, Google wants you to pay $250 a month for its new AI Ultra plan. There's really no other way to react to this other than "LOL. LMAO." I rarely use either of those acronyms, which highlights just how absurd this is. What are we even doing here? That's obscenely expensive. Anyway, this plan includes early access to the company's latest tools and unlimited use of features that are costly for Google to run, such as Deep Research. It comes with 30TB of storage across Google Photos, Drive and Gmail. You'll get YouTube Premium as well, arguably the Google product that's most worth paying for. Google is offering new subscribers 50 percent off an AI Ultra subscription for the first three months. Woohoo. In addition, the AI Premium plan is now known as Google AI Pro.

As promised during last week's edition of The Android Show, Google offered another look at Android XR. This is the platform the company is working on in the hope of doing for augmented reality, mixed reality and virtual reality what Android did for smartphones. After the company's previous efforts in those spaces, it's now playing catch-up to the likes of Meta and Apple. The initial Android XR demo at I/O didn't offer much to get excited about for now. It showed off features like a mini Google Map that you can access on a built-in display and a way to view 360-degree immersive videos. We're still waiting for actual hardware that can run this stuff. As it happens, Google revealed the second Android XR device.
Xreal is working on Project Aura, a pair of tethered smart glasses. We'll have to wait a bit longer for more details on Google's own Android XR headset, which it's collaborating with Samsung on; that's slated to arrive later this year.

A second demo of Android XR was much more interesting. Google showed off a live translation feature for Android XR on a smart glasses prototype that the company built with Samsung. That seems genuinely useful, as do many of the accessibility-minded applications of AI. Gentle Monster and Warby Parker are making smart glasses with Android XR too. Just don't call it Google Glass (or do, I'm not your dad).

Google is giving the Chrome password manager a very useful weapon against hackers: it will be able to automatically change passwords on accounts that have been compromised in data breaches. So if a website, app or company is infiltrated, user data is leaked and Google detects the breach, the password manager will let you generate a new password and update a compatible account with a single click. The main sticking point is that it only works with websites participating in the program; Google is working with developers to add support for the feature. Still, making it easier for people to lock down their accounts is a definite plus. (And you should absolutely be using a password manager if you aren't already.)

On the subject of Chrome, Google is stuffing Gemini into the browser as well. The AI assistant will be able to answer questions about the tabs you have open, and you'll be able to access it from the taskbar and a new menu at the top of the browser window.
It's been a few years since we first heard about Project Starline, a 3D video conferencing project. We tried this tech out at I/O 2023 and found it to be an enjoyable experience. Now, Google is starting to sell this tech, but only to enterprise customers (i.e. big companies) for now. It's got a new name for all of this too: Google Beam. And it's probably not going to be cheap. HP will reveal more details in a few weeks.

Everything announced at the Google I/O 2025 keynote

Engadget

20-05-2025

  • Engadget


Today is one of the most important days on the tech calendar as Google kicked off its I/O developer event with its annual keynote. As ever, the company had many updates for a wide range of products to talk about. The bulk of the Android news was revealed last week, during a special edition of The Android Show. However, Tuesday's keynote still included a ton of stuff including, of course, a pile of AI-related news. We covered the event in real time in our live blog, which includes expert commentary (and even some jokes!) from our team. If you're on the hunt for a breakdown of everything Google announced at the I/O keynote, though, look no further. Here are all the juicy details worth knowing about:

It's been a few years since we first heard about Project Starline, a 3D video conferencing project. We tried this tech out at I/O 2023 and found it to be an enjoyable experience. Now, Google is starting to sell this tech, but only to enterprise customers (i.e. big companies) for now. It's got a new name for all of this too: Google Beam. And it's probably not going to be cheap. HP will reveal more details in a few weeks.

Google Meet is getting a real-time translation option, which should come in very useful for some folks. A demo showed Meet being able to match the speaker's tone and cadence while translating from Spanish. Subscribers will be able to try out Spanish-language translations in beta starting this week. This feature will soon be available for other languages. This story is developing...

Google I/O 2025 live coverage: Gemini, Android 16 updates, and more

TechCrunch

20-05-2025

  • TechCrunch


Google's biggest conference of the year is here, and we're expecting quite a bit of news from the event. Google hosted a separate event dedicated to Android updates last week, called The Android Show, but we're still looking forward to more, like improved notifications in Android 16, as well as support for Auracast, which should make it easier to switch between Bluetooth devices. We're also expecting lock screen widgets and a range of new accessibility features. On top of that, we're prepping for new additions to Google's flagship Gemini family of AI models, as well as news on Astra, Google's wide-ranging effort to build AI apps and 'agents' for real-time, multimodal understanding. An updated top-of-the-line Gemini Ultra model is on the way, and with it perhaps a pricier Gemini subscription. Be sure to keep up with everything announced right here.

Google I/O 2025: Live updates on Gemini, Android XR, Android 16 updates and more

Engadget

20-05-2025

  • Engadget


Ready to see Google's next big slate of AI announcements? That's precisely what we expect to be unveiled at Google I/O 2025, the search giant's developer conference that kicks off today at 1PM ET / 10AM PT. Engadget will be covering it in real time right here, via a liveblog and on-the-ground reporting from our very own Karissa Bell.

Ahead of I/O, Google already gave us some substantive details on the updated look and feel of its mobile operating system at The Android Show last week. Google included some Gemini news there as well: its AI platform is coming to Wear OS, Android Auto and Google TV, too. But with that Android news out of the way, Google can use today's keynote to stay laser-focused on sharing its advances on the artificial intelligence front. Expect news about how Google is using AI in search to be featured prominently, along with some other surprises, like the possible debut of an AI-powered Pinterest alternative.

The company made it clear during its Android showcase that Android XR, its mixed reality platform, will also be featured during I/O. That could include the mixed reality headset Google and Samsung are collaborating on or, as teased at the end of The Android Show, smart glasses with Google's Project Astra built in. As usual, there will be a developer-centric keynote following the main presentation (4:30PM ET / 1:30PM PT), and while we'll be paying attention to make sure we don't miss out on any news there, our liveblog will predominantly focus on the headliner. You can watch Google's keynote in the embedded livestream above or on the company's YouTube channel, and follow our liveblog embedded below starting at 1PM ET today. Note that the company plans to hold breakout sessions through May 21 on a variety of topics relevant to developers. Hello everyone!
Welcome to our liveblog of Google's annual I/O developer conference. I feel as if our liveblog tool has gotten more than its fair share of use these last two weeks. If it all feels very familiar to you too, that's likely because we had two liveblogged events just last week, one of which was the company's Android showcase.

Update, May 20 2025, 9:45AM ET: This story has been updated to include a liveblog of the event.

Update, May 19 2025, 1:01PM ET: This story has been updated to include details on the developer keynote taking place later in the day, as well as to tweak wording throughout for accuracy with the new timestamp.
