
The Motorola Razr Ultra's coolest AI feature has one annoying hiccup
I like almost everything about Motorola's new Razr Ultra. It looks gorgeous, feels powerful, and its cover screen experience remains miles ahead of Samsung's Flex Window. I even think Motorola struck the right balance with its Moto AI rollout, pairing classic features like an image generator with useful note-taking tools and a quick way to summarize notifications. I've actually found ways to work Moto AI into my daily life, which isn't something I've always been able to say with Pixels or Samsung Galaxy devices.
And yet, there's a problem. One of the Razr Ultra's coolest features — and its one exclusive AI tool — makes me just a little bit nervous every time I try to use it. Here's why Look and Talk is Moto AI at its best… and worst.
No wake word, no problem
First, the good news — Motorola's Look and Talk is by far the easiest way to trigger an AI-powered assistant. It functions exactly like it sounds: you just look at your Razr Ultra, then you talk to it. There's no need for a wake word or a wave of the hand — make sure you're in range, and the phone will spring into action. You will, of course, have to wait for Moto AI to open; otherwise, you'll find yourself talking to a phone that's not ready to listen yet. Thankfully, the Moto AI interface is pretty distinct, so when you see the rainbow of pink, blue, and orange, you'll know it's ready to go.
From there, it's just a matter of knowing which assistant you might need and being mindful that you're staying within the Razr Ultra's range. According to Motorola, Look and Talk is designed to work up to an arm's length away, plus a little wiggle room in case you're working around a desk or in the kitchen. For me, that's primarily held true — I'm not tall, so I don't have very long arms, but the Razr Ultra has seemed pretty responsive within about three and a half feet or so. If nothing else, it's reassuring that my phone isn't always listening to me.
When it does listen, though, I've been impressed with Motorola's answers. The Razr Ultra has been able to help me plan a busy spring weekend in Baltimore while cooking dinner (and following a recipe on a different device). It's also pulled up marathon dates and locations while I try to work out my chances of running a qualifying time for the Boston Marathon this fall. Then, when I'm finished, I can simply walk back out of range and know that my queries are saved somewhere in my Moto AI history for the next time I need to refer back to them.
I just don't trust leaving a $1,300 flip phone in this position
So, if Look and Talk lets me chat with my AI-powered assistant hands-free and doesn't require a wake word or input other than my face, what could be wrong with it? Well, even the smallest of gestures still counts as a gesture. Unfortunately, Look and Talk doesn't work with the Razr Ultra closed, so I have to prop my phone into laptop mode or tent mode before it will even look for my face. That means leaving a very expensive flip phone in the relatively vulnerable position of half-open, half-closed while trusting the somewhat limited IP48 rating to keep it safe from harm.
And yes, I generally trust both the hinge and the water resistance rating to do their jobs — so far, both have worked just fine. However, I'm nowhere near learning the muscle memory to automatically put my phone into laptop or tent mode when I set it on my desk or kitchen counter. Because the Razr Ultra comes out of my pocket closed, it gets placed on my desk closed, and then I have to pick it up and open it halfway for Look and Talk to kick in. At that point, it almost becomes easier to use my hands to navigate the Moto AI menu, so I can use Remember This, the Playlist Studio, or wonder what my Next Move might be.
Look and Talk would be a much smarter feature if I didn't have to angle my phone just so.
I have, of course, only been using the Razr Ultra for a few days at this point, so I could prove myself wrong over time. I might eventually find that I'm pulling the phone out of my pocket and putting it into laptop mode automatically — I don't love the idea of facing the hinge to the sky — but I'm not there yet. Besides, I'm still running the risk of debris settling right next to the display and accidentally scratching the ultra-thin glass when I shut my phone to move along with my day. I don't know about you, but I haven't budgeted for expensive Razr repairs like that.
In the meantime, Motorola should allow Look and Talk to work with the Razr Ultra closed. It can still rely on Face Unlock and hang onto the limited range of three or so feet, but I think the feature has to be ready to work in a position that benefits the most people. For me, that means being ready to go from the second I pull it out of my pocket.

Related Articles


CNET
Get Your Hands on These Anker Soundcore Open-Ear Earbuds While They're Down to Just $53
There are a lot of options for good running earbuds out there, so finding the one that fits you can be tricky. One really good option is to go with an open-ear style, as it can help you feel a bit cooler and more comfortable. Right now, you can get your hands on the Soundcore C30i by Anker for $53, as long as you use the on-page coupon to save yourself 25%. This is a great chance to grab these buds for less, and they've got loads of features that make them really good for all kinds of workouts.

These open-ear earbuds are available in two colors, black and white. Their firm-shell design is lightweight and comfortable, and a secure clip-on design keeps them from falling out. You can take them anywhere you go, as they're water-resistant. We haven't tested these specific buds, but they promise clear, high-quality audio. If you're looking for more listening options, check out our list of the best deals on earbuds and headphones going on right now.

Why this deal matters

Anker's Soundcore open-ear earbuds are designed to fit comfortably with a tight grip so they don't fall out while exercising. At $53, they're now significantly cheaper than Bose's Ultra Open Earbuds, which run around $299.


Fast Company
The real data revolution hasn't happened yet
The Gartner Hype Cycle is a valuable framework for understanding where an emerging technology stands on its journey into the mainstream. It helps chart public perception, from the 'Peak of Inflated Expectations' through the 'Trough of Disillusionment,' and eventually up the 'Slope of Enlightenment' toward the 'Plateau of Productivity.'

In 2015, Gartner removed big data from the Hype Cycle. Analyst Betsy Burton explained that it was no longer considered an 'emerging technology' and 'has become prevalent in our lives.' She's right. In hindsight, it's remarkable how quickly enterprises recognized the value of their data and learned to use it for business advantage. Big data moved from novelty to necessity at an impressive pace.

Yet in some ways, I disagree with Gartner. Adoption has been widespread, but effectiveness is another matter. Do most enterprises truly have the tools and infrastructure to make the most of the data they hold? I don't believe they do. Which is why I also don't believe the true big data revolution has happened yet. But it's coming.

Dissecting the Stack

A key reason big data is seen as mature, even mundane, is that people often confuse software progress with overall readiness. The reality is more nuanced. Yes, the software is strong. We have robust platforms for managing, querying, and analyzing massive datasets. Many enterprises have assembled entire software stacks that work well. But that software still needs hardware to run on. And here lies the bottleneck.

Most data-intensive workloads still rely on traditional central processing units (CPUs) — the same processors used for general IT tasks. This creates challenges. CPUs are expensive, energy-hungry, and not particularly well suited to parallel processing. When a query needs to run across terabytes or even petabytes of data, engineers often divide the work into smaller tasks and process them sequentially. This method is inefficient and time-consuming, and it ends up requiring more total computation than a single large job would. Even though CPUs can run at high clock speeds, they simply don't have enough cores to efficiently handle complex queries at scale. As a result, hardware has limited the potential of big data. But now, that's starting to change with the rise of accelerated computing.

Breaking the Bottleneck

Accelerated computing refers to running workloads on specialized hardware designed to outperform CPUs. This could mean field-programmable gate arrays (FPGAs) or application-specific integrated circuits (ASICs) built for a specific task. More relevant to big data, though, are graphics processing units (GPUs). GPUs contain thousands of cores and are ideal for tasks that benefit from parallel processing. They can dramatically speed up large-scale data operations.

Interestingly, GPU computing and big data emerged around the same time. Nvidia launched CUDA (compute unified device architecture) in 2006, enabling general-purpose computing on graphics hardware. Just two years earlier, Google's MapReduce paper laid the foundation for modern big data processing. Despite this parallel emergence, GPUs haven't become a standard part of enterprise data infrastructure. That's due to a mix of factors. For one, cloud-based access to GPUs was limited until relatively recently. When I started building GPU-accelerated software, SoftLayer — now absorbed into IBM Cloud — was the only real option. There was also a perception problem. Many believed GPU development was too complex and costly to justify, especially for general business needs. And for a long time, few ready-made tools existed to make it easier.

Those barriers have largely fallen. Today, a rich ecosystem of software exists to support GPU-accelerated computing. CUDA tools have matured, benefiting from nearly two decades of continuous development. And renting a top-tier GPU, like Nvidia's A100, now costs as little as $1 per hour. With affordable access and a better software stack, we're finally seeing the pieces fall into place.

The Real Big Data Revolution

What's coming next will be transformative. Until now, most enterprises have been constrained by hardware limits. With GPU acceleration more accessible and a mature ecosystem of supporting tools, those constraints are finally lifting.

The impact will vary by organization. But broadly, companies will gain the ability to run complex data operations across massive datasets, without needing to worry about processing time or cost. With faster, cheaper insights, businesses can make better decisions and act more quickly. The value of data will shift from how much is collected to how quickly it can be used.

Accelerated computing will also enable experimentation. Freed from concerns about query latency or resource drain, enterprises can explore how their data might power generative AI, smarter applications, or entirely new user experiences.

Gartner took big data off the Hype Cycle because it no longer seemed revolutionary. Accelerated computing is about to make it revolutionary again.
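The chunked, sequential pattern the article describes can be sketched in a few lines. This is a toy illustration only — the dataset, the chunk size, and the "query" (a simple sum) are assumptions for demonstration, not taken from any particular big data framework:

```python
# Toy sketch of the CPU bottleneck described above: a query too large to
# run in one pass is split into fixed-size chunks and processed one after
# another, with per-chunk overhead accumulating along the way.

def run_query_in_chunks(data, chunk_size):
    """Aggregate `data` sequentially, one chunk ("task") at a time."""
    total = 0
    chunks_processed = 0
    for start in range(0, len(data), chunk_size):
        chunk = data[start:start + chunk_size]
        total += sum(chunk)       # each chunk is a separate small job
        chunks_processed += 1     # scheduling/overhead cost grows per chunk
    return total, chunks_processed

data = list(range(1_000_000))
total, chunks = run_query_in_chunks(data, chunk_size=50_000)
print(total, chunks)  # same answer as sum(data), but in 20 separate passes
```

A GPU attacks this differently: rather than walking through the 20 chunks in order, its thousands of cores can process them (and the elements within them) simultaneously, which is why parallel-friendly aggregations see such dramatic speedups.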


Forbes
FDA's New AI Tool Cuts Review Time From 3 Days To 6 Minutes
The U.S. Food and Drug Administration announced this week that it deployed a generative AI tool called ELSA (Evidence-based Learning System Assistant) across its organization. After a low-profile pilot that delivered measurable gains, the system is now in use by staff across the agency, several weeks ahead of its original schedule.

Dr. Marty Makary, the FDA's commissioner, shared a major outcome: a review task that once took two or three days now takes six minutes. 'Today, we met our goal ahead of schedule and under budget,' said Makary. 'What took one scientific reviewer two to three days [before] now takes six minutes.'

The FDA has thousands of reviewers, analysts, and inspectors who deal with massive volumes of unstructured data such as clinical trial documents, safety reports, and inspection records. Automating any meaningful portion of that stack creates outsized returns.

ELSA helps FDA teams speed up several essential tasks. Staff are already using it to summarize adverse event data for safety assessments, compare drug labels, generate basic code for nonclinical database setup, and identify priority sites for inspections, among other tasks. This last item, using data to rank where inspectors should go, could have a real-world impact on how the FDA oversees the drug and food supply chain and delivers its services.

Importantly, however, the tool isn't making autonomous decisions without a human in the loop. The system prepares information so that experts can decide faster. It cuts through the routine, not the judgment.

One of the biggest questions about AI systems in the public sector revolves around the use of data and third-party AI systems. Makary addressed this directly: 'All information stays within the agency. The AI models are not being trained on data submitted by the industry.'

That's a sharp contrast to the AI approaches being taken in the private sector, where many large language models have faced criticism over training on proprietary or user-submitted content. In the enterprise world, this has created mounting demand for "air-gapped" AI solutions that keep data locked inside the company. That makes the FDA's model different from many corporate tools, which often rely on open or external data sources. The agency isn't building a public-facing product. It's building a controlled internal system, one that helps it do its job better.

Federal departments have been slow to move past AI experimentation. The Department of Veterans Affairs has started testing predictive tools to manage appointments. The SEC has explored market surveillance AI for years. But few have pushed into full and widespread production. The federal government has thousands of employees processing huge volumes of information, most of it unstructured and sitting in documents, files, and even paper. That means AI is being focused most on operational and process-oriented activities. It's shaping up to be a key piece of how agencies process data, make recommendations, and act.

Makary made clear that ELSA is just the beginning for AI adoption within the FDA. 'Today's rollout of ELSA will be the first of many initiatives to come,' he said. 'This is how we'll better serve the American people.'