Researchers are training home robots in simulations based on iPhone scans


There’s a long list of reasons why you don’t see a lot of non-vacuum robots in the home. At the top of the list is the problem of unstructured and semi-structured environments. No two homes are the same, from layout to lighting to surfaces to humans and pets. Even if a robot can effectively map each home, the spaces are always in flux.

This week, researchers at MIT CSAIL are showcasing a new method for training home robots in simulation. Using an iPhone, a user can scan part of their home and upload the scan into a simulation.

Simulation has become a bedrock element of robot training in recent decades. It allows robots to try and fail at tasks thousands, or even millions, of times in the time a single real-world attempt would take.

The consequences of failing in simulation are also significantly lower than in real life. Imagine for a moment that teaching a robot to put a mug in a dishwasher required it to break 100 real-life mugs in the process.

“Training in the virtual world in simulation is very powerful, because the robot can practice millions and millions of times,” researcher Pulkit Agrawal says in a video tied to the research. “It might have broken a thousand dishes, but it doesn’t matter, because everything was in the virtual world.”
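The economics Agrawal describes can be sketched as a toy loop: a policy attempts a task over and over in a simulated world, and each failure costs nothing but improves the policy slightly. Everything here (the `skill` variable, the learning increment, the failure count) is an illustrative stand-in, not CSAIL's actual training setup.

```python
import random

def try_task(skill: float) -> bool:
    """Simulated attempt: succeeds with probability equal to current skill."""
    return random.random() < skill

def train_in_sim(trials: int, seed: int = 0) -> tuple[float, int]:
    """Run many cheap simulated attempts, nudging a toy 'skill' value up on
    each failure. Returns the final skill and the number of failures --
    failures that would have cost real dishes outside the simulator."""
    random.seed(seed)
    skill, failures = 0.05, 0
    for _ in range(trials):
        if not try_task(skill):
            failures += 1
            skill = min(1.0, skill + 0.001)  # learn a little from each broken dish
    return skill, failures

skill, failures = train_in_sim(100_000)
print(f"final skill: {skill:.2f}, simulated failures: {failures}")
```

The point of the sketch is the ratio: a few hundred "broken dishes" out of a hundred thousand attempts, all of them virtual.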

Much like the robots themselves, however, simulation can only go so far when it comes to dynamic environments like the home. Making simulations as accessible as an iPhone scan can dramatically enhance the robot’s adaptability to different environments.

In fact, creating a robust enough database of environments such as these ultimately makes the system more adaptable when something is inevitably out of place, be it a moved piece of furniture or a dish left on the kitchen counter.
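That adaptability argument is essentially environment randomization: instead of training on one frozen scan, you generate many perturbed variants of it. A minimal sketch, assuming a scene is just a dict of named objects with 2D positions (the layout format and object names are hypothetical, not the researchers' representation):

```python
import random

def randomize_scene(base_scene: dict, rng: random.Random) -> dict:
    """Perturb a scanned scene: jitter every object's position and sometimes
    add a stray object, so a policy never trains on one fixed layout."""
    scene = {name: (x + rng.uniform(-0.3, 0.3), y + rng.uniform(-0.3, 0.3))
             for name, (x, y) in base_scene.items()}
    if rng.random() < 0.5:  # occasionally leave a dish on the counter
        scene["stray_dish"] = (rng.uniform(0.0, 3.0), rng.uniform(0.0, 2.0))
    return scene

# One scan of a room, then a thousand randomized training variants of it.
base = {"table": (1.0, 1.0), "chair": (2.0, 0.5)}
rng = random.Random(42)
variants = [randomize_scene(base, rng) for _ in range(1000)]
```

A policy trained across variants like these is less likely to fall apart when the real room deviates from the original scan.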
