IOTE EXPO CHINA

IOTE 2026 The 25th International Internet of Things Exhibition-Shenzhen

2026.08.26-28 | Shenzhen World Exhibition & Convention Center (Bao'an District)

The Atlas robot learns new skills by observing human behavior using AI models

Boston Dynamics’ humanoid robot, Atlas, has demonstrated a new approach to learning complex new tasks using AI models that observe human behavior.

Programming machines to handle a wide range of challenges in human environments, from handling fragile objects to navigating cluttered spaces, has always been a daunting task. Now, Large Behavior Models (LBMs) promise to help robots learn tasks much faster.

Atlas Becomes a Robot Apprentice

In a demonstration by Boston Dynamics and the Toyota Research Institute (TRI), an AI model guided the Atlas robot through a lengthy "workshop" task. The robot coordinated full-body movements: picking up parts from a cart, folding them, and placing them on a shelf; then pulling out a low box to store other components; and finally loading the remaining items into a large truck.

But the real breakthrough came when things didn't go as planned. The initial version of the AI couldn't handle unexpected situations, yet the solution wasn't to write complex new code. Instead, the research team had an operator demonstrate how to recover from errors, such as parts falling to the ground or a trash can lid accidentally closing. After retraining the neural network with these new examples, the robot learned to react autonomously and resolve such problems on its own.

This capability stems from the robot’s ability to estimate changes in its surroundings using its sensors and react based on experience gained during training. Programming new behaviors no longer requires years of expert engineering experience, creating opportunities to rapidly expand the Atlas robot’s capabilities.

Human-Robot Collaboration

So, how is the robot trained? The operator puts on a VR headset and is fully immersed in the robot's workspace, seeing what the robot sees through its head-mounted stereo cameras. Trackers on the operator's hands and feet allow smooth, intuitive control of Atlas; their movements are mapped directly onto the machine.

It is this teleoperation system that enables the collection of high-quality data. Whether crouching down to pick up an object or carefully adjusting its position, the robot's movements are guided by the human, generating the raw data needed for learning.
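As a rough illustration of this kind of teleoperation pipeline, the sketch below retargets an operator's tracked hand position into the robot's workspace and logs each (observation, action) pair. Every name here is a simplification invented for illustration, and the pose is position-only; this is not Boston Dynamics' actual interface.

```python
from dataclasses import dataclass

# Hypothetical teleoperation sketch: VR tracker positions are retargeted
# into robot end-effector commands and logged together with the robot's
# sensor readings as training data.

@dataclass
class Pose:
    x: float
    y: float
    z: float  # position in meters (orientation omitted for brevity)

def retarget(operator_pose: Pose, scale: float = 1.0,
             workspace_limit: float = 1.2) -> Pose:
    """Map an operator hand position into the robot's reachable workspace."""
    def clamp(v: float) -> float:
        return max(-workspace_limit, min(workspace_limit, v * scale))
    return Pose(clamp(operator_pose.x), clamp(operator_pose.y),
                clamp(operator_pose.z))

def log_sample(dataset: list, image, proprio, command: Pose) -> None:
    """Store one (observation, action) pair, the raw material for learning."""
    dataset.append({"image": image, "proprio": proprio,
                    "action": (command.x, command.y, command.z)})
```

A command that would leave the workspace (here, `z = 2.0`) is clamped to the limit before being logged, so the dataset only ever contains actions the robot could actually execute.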

This data is then fed into the Atlas robot's "brain": a diffusion transformer with 450 million parameters. The model receives all of the robot's perceptual inputs: camera images, its own body position (proprioception), and high-level language instructions describing the task. From these it generates the motions needed to drive all 50 of Atlas's degrees of freedom.
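The policy's input-output contract can be sketched as follows. The denoiser below is a trivial stand-in for the actual 450-million-parameter diffusion transformer, and the chunk length and iteration count are assumed values; only the shape of the computation (noise in, refined action chunk out) reflects the article.

```python
import random

ACTION_DIM = 50   # Atlas's full-body degrees of freedom (from the article)
CHUNK_LEN = 8     # actions predicted per inference call (assumed value)
N_STEPS = 10      # denoising iterations (assumed value)

def denoiser_stub(noisy_chunk, obs):
    """Stand-in for the diffusion transformer: given a noisy action chunk
    and the observation (image, proprioception, language), the real model
    would predict the clean actions. This stub just pulls toward zero."""
    return [[0.0] * ACTION_DIM for _ in noisy_chunk]

def sample_actions(obs, seed=0):
    """Iteratively refine Gaussian noise into a chunk of 50-dim actions."""
    rng = random.Random(seed)
    chunk = [[rng.gauss(0.0, 1.0) for _ in range(ACTION_DIM)]
             for _ in range(CHUNK_LEN)]
    for step in range(N_STEPS):
        target = denoiser_stub(chunk, obs)
        alpha = 1.0 / (N_STEPS - step)  # blend fully on the final step
        chunk = [[(1 - alpha) * n + alpha * t
                  for n, t in zip(noisy, clean)]
                 for noisy, clean in zip(chunk, target)]
    return chunk

obs = {"image": None, "proprio": [0.0] * ACTION_DIM,
       "language": "place the part on the shelf"}
actions = sample_actions(obs)
```

The key design point is that a single observation produces a short chunk of future actions rather than one step at a time, which is common in diffusion-style robot policies.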

All of this is done in a continuous iterative cycle: collecting data, training the model, and evaluating performance to determine what data to collect next.
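The iterative cycle above can be written down as a loop. Every function in this sketch is a placeholder for real infrastructure (teleoperation, GPU training, on-robot evaluation); only the loop structure, collecting new data where the current policy fails, is the point.

```python
# Minimal sketch of the data flywheel: collect, train, evaluate, repeat.
# All functions are illustrative stubs, not real training code.

def collect_demos(task, n=5):
    """Gather n teleoperated demonstrations for one task (stub)."""
    return [{"task": task, "demo_id": i} for i in range(n)]

def train_policy(policy, dataset):
    """Retrain on everything collected so far (stub: more data = 'better')."""
    return {"weights": len(dataset)}

def evaluate(policy, tasks):
    """Score the policy per task (stub threshold on 'training amount')."""
    return {t: policy["weights"] >= 10 for t in tasks}

def flywheel(tasks, rounds=3):
    dataset, policy = [], {"weights": 0}
    for _ in range(rounds):
        scores = evaluate(policy, tasks)
        for task, ok in scores.items():
            if not ok:                      # collect where the policy fails
                dataset += collect_demos(task)
        policy = train_policy(policy, dataset)
    return dataset, policy
```

In this toy run, both tasks fail the first evaluation, demonstrations are collected for each, and later rounds collect nothing because the retrained policy now passes.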

Learning Complex and Unpredictable Tasks

By training the policy on a variety of tasks, researchers found that the robot can generalize better and recover from errors. The system can learn to perform tasks that are difficult to code manually due to their complexity and unpredictability.

Using a single language-conditioned AI model, Atlas has learned to tie ropes, lay tablecloths, and even manipulate a 22-pound car tire.

The research team found that for Large Behavior Models (LBMs), the learning process is the same whether the robot is stacking rigid blocks or folding a T-shirt: if a human can demonstrate it, a robot like Atlas can learn it. Furthermore, they found that the robot's movement speed can be accelerated in real time, allowing it to complete tasks typically 1.5 to 2 times faster than the human demonstrations without sacrificing performance.
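One simple way to realize that kind of speed-up is to compress the time axis of a recorded trajectory while leaving the path through space unchanged. This is only a sketch of the idea, not the team's actual method.

```python
def speed_up(trajectory, factor=1.5):
    """Compress a list of (time, joint_positions) waypoints so the motion
    plays back `factor` times faster; the spatial path is unchanged."""
    if factor <= 0:
        raise ValueError("factor must be positive")
    return [(t / factor, q) for t, q in trajectory]

# A 2-second demonstration replayed at 2x takes 1 second.
demo = [(0.0, [0.0]), (1.0, [0.5]), (2.0, [1.0])]
fast = speed_up(demo, factor=2.0)
```

In practice a speed-up like this is only valid while the faster motion stays within the robot's velocity and torque limits, which is presumably why the reported factor tops out around 2x.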

The next focus is on expanding this "data flywheel" to cover more diverse and more difficult tasks, while exploring new AI algorithms and ways to integrate other data sources. It is an important step on the long road toward a future in which humanoid robots like Atlas work alongside us and serve us in the real world.