AI Agents Playing Video Games Will Transform Future Robots


Forbes | 01-05-2025

AI agents trained in video game environments are demonstrating a remarkable ability to transfer skills to new challenges, potentially revolutionizing how we build real-world robots.
Video games have played an important role in the development of AI. Many early demonstrations of machine learning involved teaching computers to play games. Eventually, Google DeepMind's mastery of the game StarCraft II was taken as proof that machines could compete with us across many fields in which we were previously undisputed champions.
Now, games are being used as a testbed for exploring some of the most exciting new areas in AI, including autonomous agents, real-world robots and perhaps even the quest for AGI.
At this year's Game Developers Conference, Google's DeepMind AI division demonstrated its research into what it calls Scalable Instructable Multiworld Agents (SIMA).
The idea is to show that machines can navigate and learn inside the 3D worlds of video game environments. They can then use what they've learned to navigate entirely different worlds and tasks, all with their own rules, using whatever tools are available to them to solve problems.
It might sound like child's play, but this research could dramatically impact the development of the agentic AI we'll use in our work and personal lives. So let's take a look at what it could mean, and whether it could even solve the ultimate AI challenge of creating machines capable of adapting to any situation, much like humans can.
Video games provide a great environment for training AI because the variety of tasks and challenges is almost infinite. Importantly, the player usually solves these challenges using a standard set of tools, all accessed via the game controller.
This corresponds well with the way AI agents tackle problems by choosing which tools to use from a pre-defined selection.
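That tool-selection loop can be sketched in a few lines of code. Everything below is a purely illustrative stand-in (a real agent would let a language model make the choice rather than simple keyword rules), but it shows the basic shape: a fixed toolbox, a decision about which tool fits the task, and the tool's execution.

```python
from typing import Callable

# A pre-defined selection of tools the agent may use, much like the fixed
# set of inputs a game controller offers. All names here are illustrative,
# not taken from any specific agent framework.
TOOLS: dict[str, Callable[[str], str]] = {
    "search": lambda q: f"results for '{q}'",
    "calculate": lambda expr: str(eval(expr)),  # toy example only, never eval untrusted input
    "move": lambda target: f"moving to {target}",
}

def choose_tool(task: str) -> str:
    """Stand-in for the model's tool-choice step: a real agent would let
    a language model decide which tool fits the task."""
    if any(op in task for op in "+-*/"):
        return "calculate"
    if task.startswith("go to"):
        return "move"
    return "search"

def run_agent(task: str) -> str:
    """Pick a tool from the fixed toolbox and run it on the task."""
    tool_name = choose_tool(task)
    argument = task.removeprefix("go to ").strip()
    return TOOLS[tool_name](argument)

print(run_agent("2+2"))
print(run_agent("go to the blue building"))
```

The key design point is that the toolbox is closed: the agent's creativity lies in choosing and sequencing tools, not in inventing new ones, which is exactly the constraint a game controller imposes on a player.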
Game worlds also provide safe, observable and scalable environments where the effects of subtle changes to variables or behavior can be explored at little real-world cost.
DeepMind's SIMAs were trained across nine different video game environments, taken from popular games including No Man's Sky, Valheim and Goat Simulator. The agents were given the ability to interact with and control the games using natural language commands like 'pick up the key' or 'move to the blue building.'
Among the standout findings, the research showed that the agents are highly effective at transferable learning—taking what they learn in one game and using it to get better at another.
This was backed up by observations that agents trained on eight of the nine games performed better on the held-out ninth game than specialized agents trained solely on that game.
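That experimental setup is a classic leave-one-out comparison, and its shape can be sketched as follows. The scoring function here is invented purely to make the protocol runnable and to mimic the reported finding; the real SIMA evaluation measures task success rates, not this toy formula.

```python
# Leave-one-out comparison: for each of nine games, train a generalist on
# the other eight and test it on the held-out one, then compare against a
# specialist trained only on that game.
GAMES = [f"game_{i}" for i in range(9)]

def success_rate(train_games: list[str], test_game: str) -> float:
    """Toy stand-in for 'train on these games, evaluate on that one'.
    The formula rewards training diversity, mimicking the reported
    finding that broad training transfers; it is not real data."""
    base = 0.3 if test_game in train_games else 0.2
    return base + 0.05 * len(set(train_games) - {test_game})

for held_out in GAMES:
    generalist = success_rate([g for g in GAMES if g != held_out], held_out)
    specialist = success_rate([held_out], held_out)
    print(f"{held_out}: generalist {generalist:.2f} vs specialist {specialist:.2f}")
```

The point of the protocol is that the generalist never sees the held-out game during training, so any advantage it shows there can only come from skills transferred across the other eight.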
This dynamic learning ability will be critical in a world where agents are working alongside us, helping us explore, interpret and understand messy real-world problems and situations.
But what about looking a little further ahead, to a time when it's commonplace for robots to help us out with physical tasks as well as digital ones?
The development of real-world robots that carry out physical tasks has accelerated in the last decade, hand-in-hand with the evolution of AI. However, they are still generally only used by large businesses due to the high cost of training them for specialist roles.
Using virtual and video game environments could dramatically lower this cost. The theory is that transferable learning will enable physical robots to use their hands, arms or whatever tools they have to tackle many physical challenges, even if they haven't come across them before.
For example, a robot that effectively learns how to use its hands to work in a warehouse might also learn how to use them to build a house.
Before it released ChatGPT, OpenAI demonstrated research in this field with Dactyl, a robotic hand trained in simulated environments that learned how to solve a Rubik's Cube. It was one of the first demonstrations that skills learned in virtual environments could transfer to complex physical-world tasks.
More recently, Nvidia has developed its Isaac platform expressly for the purpose of training robots to 'learn to learn' how to carry out real-world tasks inside virtual environments.
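One key technique behind this kind of sim-to-real transfer is domain randomization, which OpenAI described using for Dactyl: vary the simulator's physical parameters across training episodes so the learned behavior is robust enough to survive the jump to real hardware. A minimal sketch, with illustrative parameter ranges that are not taken from any real platform:

```python
import random

def randomized_sim_params() -> dict:
    """Draw a fresh set of physics parameters for one training episode.
    Ranges are illustrative assumptions, not values from Dactyl or Isaac."""
    return {
        "friction": random.uniform(0.5, 1.5),
        "object_mass_kg": random.uniform(0.05, 0.2),
        "motor_latency_ms": random.uniform(0.0, 40.0),
    }

def train_episode(params: dict) -> None:
    """Placeholder: run one simulated episode under these parameters and
    update the policy (a real trainer would do RL updates here)."""
    pass

random.seed(0)
for episode in range(1000):
    train_episode(randomized_sim_params())
```

Because the policy never sees the same physics twice, it cannot overfit to the simulator's quirks; the real world then looks like just one more variation it has already handled.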
Today, physical AI-assisted robots are put to work in warehouse roles, agriculture, healthcare, deliveries, and many other jobs. In most cases, however, these robots are still doing tasks they were specifically trained for—at enormous expense by companies with very deep pockets.
But new models of 'affordable' robots are on the horizon. Tesla plans to manufacture thousands of its Optimus robots this year and assign many of them to work in its factories. And Chinese robotics developer Unitree recently unveiled a $16,000 humanoid robot that can turn its hand to many tasks.
With the price of robots falling and their AI brains becoming more powerful by the day, walking, talking humanoid robots could be stepping out of science fiction into everyday reality sooner than we think.
Almost 30 years ago, machines scored their first big win over humans by defeating Garry Kasparov at chess. Few would have predicted then that a computer would one day beat world champions not just at one game, but at almost any game.
This ability to generalize, taking knowledge gained in one task and applying it to an entirely different one, was long considered uniquely human, but that could be changing.
All of this will be hugely interesting to those chasing the holy grail of AI development, artificial general intelligence (AGI).
Evidence that agents like DeepMind's SIMAs are able to transfer learning from one virtual game environment to another suggests they may be developing some of the qualities needed for AGI. It demonstrates that they are progressively building competencies that can be applied to solving future problems.
Google, OpenAI, Anthropic and Microsoft have all stated that developing AGI is their eventual goal, and it's clearly the logical endpoint of the current focus on agentic intelligence. With video games, could another part of the puzzle be in place?
