I'm not handing control of my wallet to an AI — and not even Google's AI shopping features can change that


Tom's Guide, 25 May 2025

The internet has absolutely revolutionized the way we shop. Whether it's groceries, clothes, the best phones or something else entirely, our first instinct is to head online and see where we can pick it up the cheapest.
Now, though, it seems like big tech is having another go at this — employing AI to do all that initial searching for you. Google just showed off a bunch of features related to this during the Google I/O 2025 keynote. And, if I'm being totally honest, I have very mixed feelings about the whole thing.
Especially when it's not entirely clear whether I will always have total control over how my money is being spent.
The idea of buying with AI, or some approximation of it, is nothing new. One of the reasons Amazon created Alexa was to give customers the option to purchase items with their voice, rather than using an app or website — with particular emphasis on buying from Amazon itself.
Shopping with Alexa has evolved a lot over the years, and the core feature is still around: you can ask Alexa to add something to your basket and then order it without ever looking at what it is. That's something that has never sat right with me.
When I'm shopping I tend to do a lot of looking around. If I want something specific, that means checking different retailers, or at the very least using Google to see who has what. Then, if it's a generic product, I'll be browsing the different options to check up on price, features, materials and all the other things on offer.
Odds are I'm not going to pick the first listing I see, or even the second and third. I need to find the right thing for me, and it may not be the most obvious or even the cheapest option. The whole shopping process is weighing up what I need, how much I have to pay, and when it needs to arrive — among other things.
These are the things being weighed up in my mind as I browse, rather than something I can fully articulate off the top of my head. It's why I could never fully trust Alexa to pick those items for me, even if I'm the one who verifies my basket before any money changes hands. And that's not likely to change anytime soon, even if it's a different AI involved in the process.
The Google I/O keynote was filled to the brim with AI news and previews, to the point where it was actually quite difficult to keep up with everything Google had on the table. But two sections immediately stood out to me, and both of them were shopping-related.
In the Project Astra demo, where Google's AI attempts to help fix a bike, Gemini calls a local bike shop and then places an order for a new tension screw. While Gemini never acts independently, the one thing I noticed was that it placed a pick-up order without ever telling the user how much it was all going to cost.
A tension screw for a bike is not going to be expensive, even from a small independent store that can't cut costs the way Amazon does. But the fact that Google did all that without divulging key information is slightly concerning. Sure, it's on the user to actually ask those questions, but Gemini shouldn't need to be prompted to tell you all the important details.
But that's at odds with later demos at I/O, where Google showed how Search's new AI Mode can make the act of shopping online easier. The short version is that, using Gemini and large language models, Google can now do the research for you and help you find the kind of things you may want to buy, without you needing to use very specific keywords.
But it was also made clear that this is all controlled by you: no matter how much research Google's AI does on your behalf, the actual act of making the purchase is entirely down to you, rather than being a blind purchase.
Obviously these are different systems that handle shopping in very different ways. But you'd also think there would be some level of consistency between the two, especially since these demos were not happening live.
I'm not a big fan of AI, and I've made my feelings very clear on that in the past. Features either feel too gimmicky to be of any use, or don't actually save me much in the way of effort.
On top of that, I'll always remain skeptical of Google's AI Search features, given how poor AI Overviews have been since they first started rolling out.
The new AI Mode shopping features seem rather interesting, and the ability to ask AI for recommendations on what to look for could be useful. That is, assuming it's able to do everything Google says it can do.
But no matter what happens, I absolutely will not be handing over all the decision-making process to a machine — and I sure as heck won't be letting it buy stuff for me.


Related Articles

Google Phone app is getting a visual makeover with Android 16's Material 3 Expressive

Yahoo

an hour ago

Material 3 Expressive design, coming with Android 16, has been spotted on the Phone by Google app: the app gets larger elements, new buttons, and more, while in-call "More" controls now appear as a pop-up menu.

Android 16 is a big release, not just in terms of new features but also because of the overhaul of the operating system's Material Design language. Google is calling it Material 3 Expressive, and the company is already working on introducing the design language to some of its popular apps, including Calendar, Photos, Files, and Meet. It's safe to assume that the Mountain View tech giant will introduce Material 3 Expressive to all its Android apps to ensure design consistency across the operating system.

While we're all excited to see how Material 3 Expressive transforms each of the Google apps on Android, we just got a solid look at what the Phone by Google app will look like with Android 16's design, courtesy of Android Authority's APK teardown of the app's version 177.0.763181107-publicbeta-pixel2024.

The design makeover was spotted on the incoming call screen and in-call menu. The incoming call screen shows the rounded call button, which still supports the vertical swipe gesture for answering or declining calls. This could be seen as a major hint that the company has no plans to replace the vertical swipe with a horizontal swipe and simple tap-to-answer/decline buttons. (Image source: Android Authority)

The in-call screen also shows a new animation for the caller's profile picture. However, the animation disappears once the call connects, with the screen showing the name, phone number, profile picture, buttons, and menu, all of which appear bigger than the current ones. The in-call screen is much more than changes in size: the shape of the in-call buttons has also changed from round to oval, and these buttons shift to a rounded square when pressed. We don't see any new buttons, but there is a noteworthy change in how the "More" menu appears.

Currently, the "More" button reveals additional control options, including "Add call," "Video call," and "Hold," all of which appear in the same container as the other buttons. But with Material 3 Expressive, the additional controls appear in a pop-up-style menu just above those buttons. Another major change is the redesigned reject call button, which is now pill-shaped rather than rounded.

Again, all these changes are currently going through internal testing and are not available to general users. As much as we'd love to see them on the Phone app, there is no clarity about when they will arrive. We expect the redesign to be available before Material 3 Expressive rolls out to Pixel phones via a Feature Drop later in the year.

Halton students take top honours in national Samsung STEM contest

Hamilton Spectator

2 hours ago

Students from the Halton District School Board's I-STEM program have earned national recognition in the Samsung Solve for Tomorrow competition, winning top prizes for their innovation and problem-solving using science, technology, engineering and mathematics (STEM). Out of four national finalist teams, two were from the HDSB I-STEM program. Aldershot School won first place, while Elsie MacGill Secondary School placed third in the Canada-wide challenge, which encourages students to use STEM to address real-world issues.

The Samsung Solve for Tomorrow contest aims to foster student innovation through STEM, while encouraging collaboration, critical thinking and a sense of social responsibility. It tasks students with developing solutions to global and local problems using emerging technologies and creative thinking.

Earlier this spring, teams from Aldershot School, Elsie MacGill Secondary School and Thomas A. Blakelock High School were selected as regional finalists, each receiving $2,500 in Samsung technology. The students created five-minute videos showcasing their ideas for building a more sustainable, inclusive and healthier future through tech-driven solutions. National finalists presented their projects at Samsung Canada's headquarters in Mississauga on May 28, following months of research and collaboration.

Aldershot School earned the top spot with a pitch by Grade 12 student Keerthana Srinivasan, who developed a project using quantum Monte Carlo methods to detect faults in photovoltaic farms. The school received $50,000 in Samsung technology to support STEM learning. Aldershot was also named School for Tomorrow, an award recognizing innovation in education.

Elsie MacGill Secondary School placed third with a proposal by Grade 11 students Hassan Rasheed and Karam Noori. Their idea featured kinetic floor plates that convert motion into electrical energy. The team was awarded $10,000 in tech resources.

These projects highlight the effectiveness of hands-on learning and the ability of students to apply STEM knowledge to pressing global challenges. The I-STEM program is a regional, four-year secondary school initiative that's open to students across and beyond Halton. It emphasizes design thinking, entrepreneurship, global competencies and adaptability, preparing students for rapid technological change, globalization and shifting workforce needs.

'This is a remarkable achievement that highlights the excellence of the I-STEM program in equipping students with the skills for future-ready innovation and problem solving,' said Curtis Ennis, Director of Education for the HDSB. 'Through this experience, students build critical thinking, creativity and collaboration skills, while developing innovative solutions that make a meaningful impact. Competitions like this support the goals outlined in the HDSB's 2024–2028 Multi-Year Strategic Plan, particularly in the areas of Learning, Engagement and Achievement, by connecting classroom learning to authentic, hands-on experiences. I extend my heartfelt congratulations to our HDSB students for their outstanding accomplishments on the regional and national stages.'

In a statement, the board said the 2024–2028 Multi-Year Strategic Plan sets direction for its work and aligns actions across its schools and programs. 'This plan guides our efforts to support more than 67,000 students and 11,000 staff, as well as the broader HDSB community. The four-year strategy is built around six commitments that intersect to ensure cohesive progress on key objectives,' the HDSB said.

Ads Ruined Social Media. Now They're Coming to AI.

Bloomberg

2 hours ago

Chatbots might hallucinate and sprinkle too much flattery on their users — 'That's a fascinating question!' one recently told me — but at least the subscription model that underpins them is healthy for our well-being. Many Americans pay about $20 a month to use the premium versions of OpenAI's ChatGPT, Google's Gemini Pro or Anthropic's Claude, and the result is that the products are designed to provide maximum utility. Don't expect this status quo to last. Subscription revenue has a limit, and Anthropic's $200-a-month 'Max' tier suggests even the most popular models are under pressure to find new revenue streams.
