
I Run CNET's Testing Labs. Here's How Our Top-Rated Robot Vacuum Cleaned Up in Real Life
CNET's expert staff reviews and rates dozens of new products and services each month, building on more than a quarter century of expertise.
My good boy Torben was more than excited to see if the Dreame X50 was up to the task of cleaning up after him.
Jared Hannah/CNET
I've tested a lot of robot vacuums over the years, and while they've gotten smarter and more powerful, there's always been one annoying limitation -- they can't handle multiple floors without human help. No matter how advanced the tech gets, I still end up carrying the vacuum up and down the stairs.
That's why I was curious about the Dreame X50 Ultra, a vacuum and mop combo that made its debut at CES 2025 and promises a smarter approach to multi-level cleaning. After trying it out in my messy, pet-filled home, I was genuinely surprised by what it could do.
We tested it at CNET Labs along with dozens of other robot vacuums, and after lab testing it was named CNET's best for obstacles and pets. I was impressed by how the Dreame X50 performed in the lab, but I wanted to see how it held up in real-world conditions: How well could it tackle pet hair, climb over my stuff, mop and navigate random obstacles?
I spent years as a product development design engineer and now oversee operations at CNET's Testing Lab. Layer that expertise onto my home life -- I live with my wife, three teenage boys and two large shedding dogs -- and you have the perfect place to put the Dreame X50 to the ultimate real-world test. Here's what it did well, what surprised me and what fell short.
My first impressions of the Dreame X50 Ultra robot vacuum
Easy setup: Setting it up was pretty easy-peasy. I followed the guide, filled the water, added the cleaner and plopped in the dust bag. The app was a breeze on my Android phone, and I got it mapping the house in no time.
Mapping and navigation: It zipped around and mapped my main floor (about 1,200 square feet) in just 11 minutes. It figured out the different rooms and even knew where the carpets were and the flooring types of each room.
This is the live view mode in the app, which lets you see what the robot sees in real time from its front-facing camera. It's a neat way to check on your house when you're not home.
Jared Hannah
Cleaning: It started with a full deep clean, then vacuumed and mopped on its second run. During the app setup, I told it I had pets, so it emptied its dustbin a bunch during the first run. I found it to be pretty smart about lifting its mops while vacuuming the carpet.
Timing: It took a little over 2 hours to do everything, which is slower than my older Roborock Q5 robot vac, but hey, this one mops! For comparison, my Roborock Q5 will vacuum the whole house in 88 minutes, whereas the Dreame took 123 minutes to vacuum the same space.
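To put that gap in rough numbers (based on the two runs I timed, not a controlled lab comparison): 123 minutes divided by 88 minutes is about 1.4, so the Dreame X50 took roughly 40% longer to cover the same floor -- the price you pay for a machine that also mops.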
5 things I liked and what surprised me
1. Tangle-free roller: After a few runs, there was zero hair wrapped around the roller. To me, this alone is a huge win and solves a big pain point for my home. I call it the "Great Pyrenees challenge": I have two long-haired, large-breed shedding dogs (Great Pyrenees), so untangling hair from the roller is a constant annoyance -- and with the X50, I haven't had to do it at all.
A closer look at the Dreame X50 brush roller. It can be removed easily if you ever need to clear anything that might get caught between the dual rollers.
Jared Hannah
A comparison of the older Roborock Q5 (top) vs. the Dreame X50 Ultra (bottom) after cleaning one large area rug covered in dog hair. You can see that the Roborock already has lots of hair stuck on the roller and the Dreame has none.
Jared Hannah
2. Climbing: It conquered what I call my "Ikea chair challenge." My other robo-vacs always get stuck at the bottom of this chair. The Dreame X50 figured out how to use its auxiliary climbing arms to get over it. It's actually kind of entertaining to watch the robot struggle a little at first, then regroup and try a different strategy using one of the tools it has in its arsenal.
The Dreame X50 Ultra was able to use its lift arms to climb over the base of this chair that almost every other robot gets stuck on.
Jared Hannah
The robot successfully identified furniture that it might get stuck on and knew to either avoid it or use its lift arms to climb over it.
Jared Hannah
The Dreame X50 also did a great job of identifying cords and knew to avoid them.
Jared Hannah
3. Navigating furniture: I was surprised by how well it could handle furniture. The X50 has a feature that lowers its turret, allowing it to fit underneath low-clearance furniture that other robots could not reach.
These are some of the settings in the extensive menu where you can set it to lower the turret to fit under low-clearance furniture.
Jared Hannah
4. Mopping: I've never had a mopping robot before, but this one does a solid job. It's pretty much as good as when I mop myself. I was impressed with how well the base station cleaned off the mop pads between cycles. I just have to dump the dirty water tank and refill the clean one.
The clean water tank and dirty water tank are easy to remove and install. They also have a much larger capacity than those on other models we've tested, letting you get through more mopping cycles before a refill.
Jared Hannah
The Dreame X50 is mopping up muddy paw prints from vinyl plank flooring. It knows to avoid the area rug when it is performing the mopping function.
Jared Hannah
There's also a spot for cleaning solution, which is neat. When it parks itself on the base station, it sprays off the mop pads while spinning them to get all of the dirt out. Then it dries them out. I inspected the mop pads after their pad cleaning cycle, and they look very clean. By default, it'll do this after every mopping session so that you're never using dirty mop pads or dirty water on your floors.
I did notice a little dirt buildup in the base station where it cleans the mop pads, so that will eventually need cleaning.
5. Object avoidance: This was seriously impressive. It dodged shoes, socks, cords, toys, everything. I didn't have to tidy up before running it, which is amazing. It even recognized my dogs and was super gentle around them.
The Dreame X50 will take photos of your pets if you enable that setting. You can click on the pet icon on the cleaning map to see the photos after a cleaning cycle.
Jared Hannah
6 things I didn't like or am still unsure about
1. Vacuum performance: I'm not 100% convinced it's the best vacuum for getting all the dog hair off my rugs. Based on my home experience, it appeared to do an OK job. During lab testing, though, the Ecovacs Deebot T30S and the iRobot Roomba Combo J7 Plus performed better on our pet hair and carpet test. The Dreame X50 performed admirably during the hardwood-sand test, but compared with its competitors, it struggled on carpet, averaging under 50% pickup on midpile and low-pile carpet.
2. Lots of moving parts: The Dreame X50 has a ton of gadgets and moving parts. I'm curious how well it will hold up long term, but so far, so good.
3. Software quirks: The AI is supposed to be super smart, but it had some weird moments. It took its time figuring out where it was at the start of a cleaning cycle, and it got a little confused by my dining chairs. It also took way longer to clean the house than my old robot vac.
4. Voice command wonkiness: The voice commands were a bit hit or miss, too. I didn't always know what the robot would do when given a particular voice command. For example, when I said "mop the kitchen," the robot vacuumed the kitchen before mopping, even though the default setting was supposed to do both simultaneously. Similarly, when I said "clean the house," the robot cleaned only the hallway. I found that "start cleaning" would initiate a full house clean, but other commands didn't always produce the expected results. While most voice commands worked, there is still room for improvement.
5. Settings menu: The settings menu is super packed, which I found overwhelming. Feature bloat tends to be a common problem on robot vacuum apps. We've also seen this on the Roborock app while testing the Saros Z70.
6. Price: The Dreame X50 is one of the most expensive robot vacuums out there right now, at around $1,699 at full price. It might be worth it if you have numerous levels and thresholds in your home and really want a robotic vacuum, but I can't see it flying off shelves until Dreame figures out how to bring the cost down.
My overall experience with the Dreame X50 robot vacuum
The Dreame X50 Ultra mapped out the main level of my house quickly and got right to work cleaning.
Jared Hannah
This robot vacuum reminds me of one of my favorite movies from the 1980s -- Batteries Not Included. If you haven't seen it, I highly recommend it. With all of its tools and moving parts -- the lifting arms, the extendable side brush, the headlight and the lidar system -- the Dreame X50 Ultra has a lot in common with those little robots.
Overall, I'm pretty happy with this robot. It vacuums well, the mop feature is great, and the object avoidance is a lifesaver. It's a bit quirky and could use some software tweaks, but I honestly love all its little robot arms and how it tackles obstacles. It kind of feels like another personality running around the house.
