Latest news with #BostonDynamics


Qatar Tribune
2 days ago
- Science
- Qatar Tribune
Robots run out of energy long before they run out of work to do
Agencies

Earlier this year, a robot completed a half-marathon in Beijing in just under 2 hours and 40 minutes. That's slower than the human winner, who clocked in at just over an hour – but it's still a remarkable feat. Many recreational runners would be proud of that time. The robot kept its pace for more than 13 miles (21 kilometers). But it didn't do so on a single charge. Along the way, the robot had to stop and have its batteries swapped three times.

That detail, while easy to overlook, speaks volumes about a deeper challenge in robotics: energy. Modern robots can move with incredible agility, mimicking animal locomotion and executing complex tasks with mechanical precision. In many ways, they rival biology in coordination and efficiency. But when it comes to endurance, robots still fall short. They don't tire from exertion – they simply run out of power.

As a robotics researcher focused on energy systems, I study this challenge closely. How can researchers give robots the staying power of living creatures – and why are we still so far from that goal? Though most robotics research into the energy problem has focused on better batteries, there is another possibility: build robots that eat.

Modern robots are remarkably good at moving. Thanks to decades of research in biomechanics, motor control and actuation, machines such as Boston Dynamics' Spot and Atlas can walk, run and climb with an agility that once seemed out of reach. In some cases, their motors are even more efficient than animal muscles. But endurance is another matter. Spot, for example, can operate for just 90 minutes on a full charge. After that, it needs nearly an hour to recharge. These runtimes are a far cry from the eight- to 12-hour shifts expected of human workers – or the multiday endurance of sled dogs.

The issue isn't how robots move – it's how they store energy. Most mobile robots today use lithium-ion batteries, the same type found in smartphones and electric cars. These batteries are reliable and widely available, but their performance improves at a slow pace: each year, new lithium-ion batteries are about 7% better than the previous generation. At that rate, it would take a full decade to merely double a robot's runtime.

Animals store energy in fat, which is extraordinarily energy dense: nearly 9 kilowatt-hours per kilogram. That's about 68 kWh total in a sled dog, similar to the energy in a fully charged Tesla Model 3. Lithium-ion batteries, by contrast, store just a fraction of that – about 0.25 kilowatt-hours per kilogram. Even with highly efficient motors, a robot like Spot would need a battery holding dozens of times more energy than today's to match the endurance of a sled dog.

And recharging isn't always an option. In disaster zones, remote fields or on long-duration missions, a wall outlet or a spare battery might be nowhere in sight. In some cases, robot designers can add more batteries. But more batteries mean more weight, which increases the energy required to move. In highly mobile robots, there's a careful balance between payload, performance and endurance. For Spot, for example, the battery already makes up 16% of its weight.

Some robots have used solar panels, and in theory these could extend runtime, especially for low-power tasks or in bright, sunny environments. But in practice, solar power delivers very little power relative to what mobile robots need to walk, run or fly at practical speeds.
That's why energy harvesting like solar panels remains a niche solution today, better suited to stationary or ultra-low-power robots.

These aren't just technical limitations. They define what robots can do. A rescue robot with a 45-minute battery might not last long enough to complete a search. A farm robot that pauses to recharge every hour can't harvest crops in time. Even in warehouses or hospitals, short runtimes add complexity and cost. If robots are to play meaningful roles in society – assisting the elderly, exploring hazardous environments and working alongside humans – they need the endurance to stay active for hours, not minutes.

New battery chemistries such as lithium-sulfur and metal-air offer a more promising path forward. These systems have much higher theoretical energy densities than today's lithium-ion cells; some approach levels seen in animal fat. When paired with actuators that efficiently convert electrical energy from the battery to mechanical work, they could enable robots to match or even exceed the endurance of animals with low body fat. But even these next-generation batteries have limitations. Many are difficult to recharge, degrade over time or face engineering hurdles in real-world systems.
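The battery arithmetic above can be checked with a short back-of-the-envelope calculation. The Python sketch below uses only the figures quoted in the article – roughly 0.25 kWh/kg for lithium-ion packs, about 9 kWh/kg for animal fat, a ~7% yearly improvement, and Spot's 90-minute runtime – and the final projection is a deliberate simplification that assumes battery mass and power draw stay fixed.

```python
import math

# Figures quoted in the article above; treat them as rough, illustrative values.
LI_ION_KWH_PER_KG = 0.25   # lithium-ion pack energy density
FAT_KWH_PER_KG = 9.0       # metabolic energy density of animal fat
ANNUAL_IMPROVEMENT = 0.07  # ~7% year-over-year gain in lithium-ion energy density
SPOT_RUNTIME_HOURS = 1.5   # Spot's quoted runtime on a full charge

# How many years of 7% compounding does it take to double battery energy density?
years_to_double = math.log(2) / math.log(1 + ANNUAL_IMPROVEMENT)
print(f"Years to double at 7%/yr: {years_to_double:.1f}")   # ~10.2 -> "a full decade"

# How much denser is fat than lithium-ion? (the "dozens of times" gap)
density_gap = FAT_KWH_PER_KG / LI_ION_KWH_PER_KG
print(f"Fat vs. lithium-ion energy density: {density_gap:.0f}x")  # ~36x

# Naive projection: runtime if energy density alone scaled with the yearly improvement,
# holding battery mass and power draw fixed (a simplification, not a design claim).
for years in (5, 10, 20):
    runtime = SPOT_RUNTIME_HOURS * (1 + ANNUAL_IMPROVEMENT) ** years
    print(f"After {years:>2} years: ~{runtime:.1f} h per charge")
```

Run as written, the sketch reproduces the article's two key numbers: doubling takes roughly a decade at 7% per year, and fat stores about 36 times more energy per kilogram than lithium-ion – consistent with the "dozens of times" gap described above.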


Business Mayor
11-05-2025
- Science
- Business Mayor
robots and AI help humans exist in future cities at the venice architecture biennale 2025
At the Arsenale of the Venice Architecture Biennale 2025, robots and AI exist for and with humans, offering a glimpse of everyday life in future cities. The humanoids and robotics at the international exhibition, which runs until November 23rd, display their growing role in reshaping how structures and wearables are designed, built, and used, both on Earth and in space. They support human exploration and survival beyond Earth, take part in construction tasks, and carry systems that allow them to adapt to their environment and collaborate with humans on different tasks.

Take the BioSuit by Dava Newman and Guillermo Trotti. It's a 3D textile framework built with computational design and fiber integration, tailored to each wearer's body dimensions. It has wearable sensors and actuators, thermal protection, radiation shielding, and active materials for compression. The suit is designed to support astronaut activity on the Moon and Mars, and it even comes with real-time mission planning and metabolic monitoring that combines astronaut data with environmental inputs to guide astronauts during exploration.

Positioned next to the suit at the Arsenale, visitors see the Lunar Ark by IVAAIU City. Another application of robotics in space development, it depicts a data center on the Moon built with robotic systems. The goal is to mitigate risks related to climate change on Earth by storing critical data off-planet. The robots assemble the archive infrastructure and carry out system updates using optical laser communication. For the exhibition, the design team places a robot arm on top of Boston Dynamics' robot dog, Spot.

BioSuit by Dava Newman and Guillermo Trotti | image © designboom

Machines 'help' humans, not replace them

Robots and AI only take up a part of the Arsenale at the Venice Architecture Biennale 2025, but it's enough to signal their growing presence in people's lives and in the architecture industry. Bjarke Ingels Group, Laurian Ghinitolu, and Arata Mori, for example, present an installation in which traditional Bhutanese woodworking is assisted by a robotic arm. A six-meter, diamond-shaped wooden beam is partially carved by a human and partially by a robot using AI. The point isn't to show that robots will replace humans; instead, the installation demonstrates how robots can be called on for help, shouldering some of our workload.

Another pairing of robots and AI at the Venice Architecture Biennale 2025 shows how machines and humans can work together: CO-POIESIS by Philip F. Yuan and Bin He. Here, the duo built a temporary pavilion for two robots, made from salvaged timber using robotic fabrication. The structure hosts two wired, sensor-equipped robots: the one at the front plays a steelpan drum, while the one behind dances. Outside the installation, there's another steelpan drum that visitors can play. Once they do, the robot hits the same drum the visitors strike, and soon enough the second robot begins to dance.

Lunar Ark by IVAAIU City | image courtesy of IVAAIU City

Humanoids can gain self-awareness over time

Can robots and AI gain self-awareness? During the Venice Architecture Biennale 2025, the installation Am I A Strange Loop? by Takashi Ikegami and Luc Steels attempts to answer the question. It features a humanoid robot called Alter3. It has no skin around its body, but the machine has a face and two hands sculpted from a clay-like material.
The design team installed systems for perception, motion control, memory, and language processing. This means that Alter3 can converse with visitors and, using language models, move its hands and head as it talks.

There's also Machine Mosaic by Daniela Rus, which demonstrates the use of a humanoid robot in bricklaying and mosaic assembly. It has a computer vision system that enables the robot to sense and interpret its surroundings, so it can translate what it sees into action, even mimicking it. During the exhibition, the robot repeatedly assembles and dismantles components, showing how robotics can perform structured building tasks.

the installation depicts a data center on the Moon built with robotic systems | image © designboom

The Alter3 experiment looks into robotic self-awareness, which researchers believe can develop when feedback loops connect a robot's outputs to its inputs, creating a recursive cycle.

The robots and AI at the Venice Architecture Biennale 2025 mirror the machinery's already growing presence in space, architecture, and human life on Earth. Whether supporting astronaut performance, constructing lunar facilities, assisting with craftsmanship, or testing theories of consciousness, these robots – and the people behind them – try to expand the boundaries of design, construction, and space exploration. The machines are taking on more functions in both terrestrial and extraterrestrial environments, and the international exhibition, which runs until November 23rd, 2025, spotlights an increasingly interdependent relationship between human activity and robotic support.

Ancient Future: Bridging Bhutan's Tradition and Innovation by Bjarke Ingels Group, Laurian Ghinitolu, and Arata Mori | image courtesy of BIG
traditional Bhutanese woodworking assisted by a robotic arm | image © designboom
CO-POIESIS by Philip F. Yuan and Bin He | image © designboom
Yahoo
11-04-2025
- Science
- Yahoo
From brain Bluetooth to ‘full RoboCop': where chip implants will be heading soon
In the 1987 classic film RoboCop, the deceased Detroit cop Alex Murphy is reborn as a cyborg. He has a robotic body and a full brain-computer interface that allows him to control his movements with his mind. He can access online information such as suspects' faces, uses artificial intelligence (AI) to help detect threats, and his human memories have been integrated with those from a machine.

It is remarkable to think that the movie's key mechanical robotic technologies have now almost been accomplished by the likes of Boston Dynamics' running, jumping Atlas and Kawasaki's new four-legged Corleo. Similarly, we are seeing robotic exoskeletons that enable paralysed patients to do things like walking and climbing stairs by responding to their gestures.

Developers have lagged behind when it comes to building an interface in which the brain's electrical pulses can communicate with an external device. This too is changing, however. In the latest breakthrough, a research team based at the University of California has unveiled a brain implant that enabled a woman with paralysis to livestream her thoughts via AI into a synthetic voice with just a three-second delay.

The concept of an interface between neurons and machines goes back much further than RoboCop. In the 18th century, an Italian physician named Luigi Galvani discovered that when electricity was passed through certain nerves in a frog's leg, the leg would twitch. This paved the way for the whole study of electrophysiology, which looks at how electrical signals affect organisms.

The initial modern research on brain-computer interfaces started in the late 1960s, with the American neuroscientist Eberhard Fetz hooking up monkeys' brains to electrodes and showing that they could move a meter needle. Yet if this demonstrated some exciting potential, the human brain proved too complex for this field to advance quickly. The brain is continually thinking, learning, memorising, recognising patterns and decoding sensory signals – not to mention coordinating and moving our bodies. It runs on about 86 billion neurons with trillions of connections, which process, adapt and evolve continuously in what is called neuroplasticity. In other words, there's a great deal to figure out.

Much of the recent progress has been based on advances in our ability to map the brain, identifying the various regions and their activities. A range of technologies can produce insightful images of the brain (including functional magnetic resonance imaging (fMRI) and positron emission tomography (PET)), while others monitor certain kinds of activity (including electroencephalography (EEG) and the more invasive electrocorticography (ECoG)).

These techniques have helped researchers to build some incredible devices, including wheelchairs and prosthetics that can be controlled by the mind. But whereas these are typically controlled with an external interface like an EEG headset, chip implants are very much the new frontier. They have been enabled by advances in AI chips and microelectrodes, as well as the deep learning neural networks that power today's AI technology. This allows for faster data analysis and pattern recognition, which, together with the more precise brain signals that can be acquired using implants, has made it possible to create applications that run virtually in real time.
For instance, the new University of California implant relies on ECoG, a technique developed in the early 2000s that captures patterns from a thin sheet of electrodes placed directly on the cortical surface of someone's brain. In this case, the complex patterns picked up by the implant's 253 high-density electrodes are processed using deep learning to produce a matrix of data from which it's possible to decode whatever words the user is thinking. This improves on previous models that could only create synthetic speech after the user had finished a sentence.

Elon Musk's Neuralink has been able to get patients to control a computer cursor using similar techniques. However, it's also worth emphasising that deep learning neural networks are enabling more sophisticated devices that rely on other forms of brain monitoring. Our research team at Nottingham Trent University has developed an affordable brainwave reader using off-the-shelf parts that enables patients suffering from conditions like completely locked-in syndrome (CLIS) or motor neurone disease (MND) to answer 'yes' or 'no' to questions. There's also the potential to control a computer mouse using the same technology.

The progress in AI, chip fabrication and biomedical tech that enabled these developments is expected to continue in the coming years, which should mean that brain-computer interfaces keep improving. In the next ten years, we can expect more technologies that provide disabled people with independence by helping them to move and communicate more easily. This entails improved versions of the technologies that are already emerging, including exoskeletons, mind-controlled prosthetics and implants that move from controlling cursors to fully controlling computers or other machines. In all cases, it will be a question of balancing our increasing ability to interpret high-quality brain data with invasiveness, safety and costs.

It is still more in the medium to long term that I would expect to see many of the capabilities of a RoboCop, including planted memories and built-in trained skills supported with internet connectivity. We can also expect to see high-speed communication between people via 'brain Bluetooth'. It should be similarly possible to create a Six Million Dollar Man, with enhanced vision, hearing and strength, by implanting the right sensors and linking the right components to convert neuron signals into action (actuators). No doubt applications that haven't been thought of yet will also emerge as our understanding of brain functionality increases.

Clearly, it will soon become impossible to keep deferring ethical considerations. Could our brains be hacked, and memories be planted or deleted? Could our emotions be controlled? Will the day come when we need to update our brain software and press restart? With every step forward, questions like these become ever more pressing. The major technological obstacles have essentially been cleared out of the way. It's time to start thinking about how far we want to integrate these technologies into society – the sooner the better.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Amin Al-Habaibeh receives funding from Innovate UK, the British Council, the Royal Academy of Engineering, EPSRC, AHRC, and the European Commission.
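To make the decoding idea above more concrete, here is a minimal, hedged sketch of the general approach: slice a multichannel brain-signal recording into short windows, turn each window into features, and train a model to map those features onto a simple output such as 'yes' or 'no'. Everything in it – the channel count, sampling rate, frequency bands, and the synthetic data – is an illustrative assumption, not the pipeline used by the University of California or Nottingham Trent teams.

```python
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

# Minimal sketch of a yes/no brain-signal classifier of the kind described above.
# All data here is synthetic; a real system would use recorded EEG/ECoG windows.
RNG = np.random.default_rng(0)
FS = 256          # sampling rate in Hz (assumed)
N_CHANNELS = 8    # number of electrodes (assumed)
WINDOW_S = 2      # seconds of signal per decision window

def band_power(window, fs, lo, hi):
    """Average spectral power between lo and hi Hz for each channel."""
    freqs, psd = welch(window, fs=fs, axis=-1)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[:, mask].mean(axis=-1)

def features(window):
    # Stack alpha (8-12 Hz) and beta (13-30 Hz) band power per channel.
    return np.concatenate([band_power(window, FS, 8, 12),
                           band_power(window, FS, 13, 30)])

def synthetic_window(label):
    # Toy data: "yes" trials get a stronger 10 Hz rhythm on half the channels.
    t = np.arange(FS * WINDOW_S) / FS
    x = RNG.normal(0.0, 1.0, size=(N_CHANNELS, t.size))
    if label == 1:
        x[: N_CHANNELS // 2] += 1.5 * np.sin(2 * np.pi * 10 * t)
    return x

labels = RNG.integers(0, 2, size=200)
X = np.stack([features(synthetic_window(y)) for y in labels])

clf = LogisticRegression(max_iter=1000).fit(X[:150], labels[:150])
print("held-out accuracy:", clf.score(X[150:], labels[150:]))
```

A real system would replace the synthetic windows with recorded EEG or ECoG data, and for speech decoding it would swap the binary classifier for a deep sequence model running over a sliding window – which is what makes near-real-time output like the three-second-delay synthetic voice possible.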
Yahoo
19-03-2025
- Yahoo
Watch the Atlas robot bust a move in Boston Dynamics' latest video
Boston Dynamics has treated us to a lot of impressive videos over the years, and the company is back today with the latest example of its robotics mastery. In the clip above, its Atlas robot demonstrates several types of full-body movement, starting with a walk and advancing to a cartwheel and even a spot of break dancing. The different actions were developed with reinforcement learning, using motion capture and animation as source material. At this rate, our future robot overlords will one day be able to out-dance and out-tumble us humans as well as out-think us.

The video is part of Boston Dynamics' research with the Robotics and AI Institute, but the company has multiple partners aiding its work. For instance, NVIDIA CEO Jensen Huang touched on the company's GR00T model for robotics during the GTC 2025 keynote earlier this week. Yesterday, Boston Dynamics announced that it is deepening its collaboration with NVIDIA, focused on AI in robotics. It is using NVIDIA's Jetson Thor computing platform to run "complex, multimodal AI models that work seamlessly with Boston Dynamics' whole-body and manipulation controllers."
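The clip's mention of reinforcement learning trained on motion capture can be illustrated with a small sketch of a tracking-style reward, the kind commonly used in motion-imitation research (for example, DeepMimic-style setups): the policy is rewarded for matching the reference motion's joint positions and velocities frame by frame. The specific terms and weights below are illustrative assumptions, not Boston Dynamics' published method.

```python
import numpy as np

# Illustrative sketch of a motion-imitation reward of the kind commonly used when
# training controllers from motion-capture clips. The weights and terms here are
# assumptions for illustration only.

def imitation_reward(q, q_ref, v, v_ref, w_pose=2.0, w_vel=0.1):
    """Reward that is highest when the policy's joint positions and velocities
    match the reference motion frame at the same timestep."""
    pose_err = np.sum((q - q_ref) ** 2)   # joint-angle tracking error
    vel_err = np.sum((v - v_ref) ** 2)    # joint-velocity tracking error
    return float(np.exp(-w_pose * pose_err) + 0.5 * np.exp(-w_vel * vel_err))

# Toy usage: 12 joints, policy slightly off the mocap reference frame.
rng = np.random.default_rng(0)
q_ref, v_ref = rng.normal(size=12), rng.normal(size=12)
q = q_ref + 0.05 * rng.normal(size=12)
v = v_ref + 0.10 * rng.normal(size=12)
print(f"imitation reward: {imitation_reward(q, q_ref, v, v_ref):.3f}")
```

In practice, an imitation term like this is typically combined with task rewards and domain randomization before the trained policy is transferred to the physical robot.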


Globe and Mail
19-03-2025
- Business
- Globe and Mail
Fix4Bot.com Launches as World's First Humanoid Robot Repair and Parts Provider
Vancouver, Canada – Fix4Bot.com proudly unveils its groundbreaking launch as the world's first company dedicated to commercializing humanoid robot repair services and parts. Catering to the booming robotics industry, this innovative venture offers specialized robot maintenance, diagnostics, and genuine parts to keep humanoid robots operational across various sectors.

Fix4Bot.com stands out by providing expert repair services and authentic parts for an impressive lineup of humanoid robots from top manufacturers. Whether it's Boston Dynamics' agile Atlas and Spot, Tesla's advanced Optimus Gen 2, or Softbank Robotics' pioneering designs, Fix4Bot.com has it covered. The company also supports cutting-edge models like Figure AI's Figure 01 and 02, Agility Robotics' versatile Digit, and Unitree Robotics' sleek G1 and H1. From healthcare innovators like Diligent Robotics' Moxi to UBTECH's multifunctional Walker series, Fix4Bot.com ensures that these remarkable machines – from industrial powerhouses to service-oriented assistants – receive the maintenance they need to perform at their best.

'With the rise of humanoid robots in industries like healthcare, manufacturing, and beyond, the need for reliable robot repair and parts has never been greater,' said Theos, CEO of Fix4Bot.com. 'We're thrilled to lead the charge, offering unmatched expertise and support to keep these advanced robots running smoothly.'

Fix4Bot.com combines a skilled team of technicians, state-of-the-art diagnostic tools, and partnerships with leading robot manufacturers to deliver top-tier service. Customers can visit the site to easily diagnose issues, order parts, or schedule repairs for their specific robot model, all with just a few clicks. For more details, explore Fix4Bot.com.

About Fix4Bot.com: Established in 2025, Fix4Bot.com is the global pioneer in humanoid robot repair and parts distribution. With a commitment to innovation, the company supports an extensive range of models, empowering the robotics ecosystem with exceptional service.

Media Contact
Company Name: Instantly Press
Contact Person: William Tsui
Country: Canada