
Leader in AI Research Introduces Object Manipulation in Robotics Testing Scenario

--News Direct--

The Allen Institute for AI (AI2) announced the 3.0 release of its embodied artificial intelligence framework, AI2-THOR, which adds active object manipulation to an already extensive testing environment. ManipulaTHOR is a first-of-its-kind virtual agent whose highly articulated robot arm, equipped with swiveling joints, brings a more human-like approach to interacting with a variety of objects.

AI2-THOR is the first testing framework to study the problem of object manipulation in more than 100 visually rich, physics-enabled rooms. By enabling the training and evaluation of generalized manipulation models, ManipulaTHOR allows faster training in more complex environments than current real-world training methods, while also being far safer and more cost-effective.
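For readers who want a sense of what working with the new agent looks like, the sketch below shows roughly how ManipulaTHOR's arm agent can be driven through AI2-THOR's Python API. It is a minimal illustration rather than code from the release: the action names and parameters (agentMode="arm", MoveArm, PickupObject, and so on) follow AI2-THOR's public documentation, but exact signatures may differ across versions.

```python
# Minimal sketch of driving ManipulaTHOR's arm agent through the
# AI2-THOR Python API. Action names and parameters follow AI2-THOR's
# public documentation but may vary by version; treat this as
# illustrative rather than canonical.
from ai2thor.controller import Controller

# Start a physics-enabled kitchen scene with the articulated-arm agent.
controller = Controller(
    agentMode="arm",          # select the ManipulaTHOR arm agent
    scene="FloorPlan1",       # one of the framework's 100+ rooms
    gridSize=0.25,
    visibilityDistance=1.5,
)

# Navigate toward a target, then reach for it with the arm.
controller.step(action="MoveAhead")
controller.step(
    action="MoveArm",
    position=dict(x=0.0, y=0.5, z=0.4),  # target point for the wrist
    coordinateSpace="armBase",
)

# Attempt to grasp whatever object the arm's magnet sphere touches,
# then check whether the action succeeded.
event = controller.step(action="PickupObject")
print(event.metadata["lastActionSuccess"])
```

Because the agent runs entirely in simulation, a failed grasp like the one checked above costs nothing to retry, which is the safety and cost advantage the release highlights over real-world training.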

“Imagine a robot being able to navigate a kitchen, open a refrigerator and pull out a can of soda. This is one of the biggest and yet often overlooked challenges in robotics, and AI2-THOR is the first to design a benchmark for the task of moving objects to various locations in virtual rooms, enabling reproducibility and measuring progress,” said Dr. Oren Etzioni, CEO at AI2. “After five years of hard work, we can now begin to train robots to perceive and maneuver through the world more like we do, making real-world usage models more attainable than ever before.”

Despite being an established research area in robotics, the visual reasoning aspect of object manipulation has consistently been one of the biggest hurdles researchers face. In fact, it’s long been understood that robots struggle to correctly perceive, navigate, act, and communicate with others in the world. AI2-THOR solves this problem with complex simulated testing environments that researchers can use to train robots for eventual activities in the real world.

By pioneering embodied AI through AI2-THOR, AI2 has changed the landscape for the common good. The framework enables researchers to efficiently devise solutions to object manipulation as well as other traditional problems in robotics testing.

“In comparison to running an experiment on an actual robot, AI2-THOR is incredibly fast and safe,” said Roozbeh Mottaghi, Research Manager at AI2. “Over the years, AI2-THOR has enabled research on many different tasks, such as navigation, instruction following, multi-agent collaboration, performing household tasks, and reasoning about whether an object can be opened. This evolution of AI2-THOR allows researchers and scientists to scale the current limits of embodied AI.”

In addition to the 3.0 release, the team is hosting the RoboTHOR Challenge 2021 in conjunction with the Embodied AI Workshop at this year’s Conference on Computer Vision and Pattern Recognition (CVPR). AI2’s challenges cover RoboTHOR object navigation, ALFRED (instruction-following robots), and Room Rearrangement.

Object manipulation with AI2-THOR's new ManipulaTHOR.

About AI2:

AI2 conducts high-impact research and engineering in the field of artificial intelligence, all for the common good. Founded in 2014, AI2 is the creation of philanthropist and Microsoft co-founder Paul Allen and is led by Dr. Oren Etzioni, a leading researcher in the field. AI2 employs a diverse team of over 150 top-notch researchers and engineers from around the world, taking a results-oriented approach to complex challenges across several important domains of AI.

Stay up to date with new research and news from AI2 by subscribing to the AI2 Newsletter and following AI2 on Twitter at @allen_ai.

To read AI2-THOR’s ManipulaTHOR paper: prior.allenai.org/manipulathor/manipulathor.pdf

Contact Details

Forrest Carman

+1 206-859-3118

forrestc@owenmedia.com

View source version on newsdirect.com: https://newsdirect.com/news/leader-in-ai-research-introduces-object-manipulation-in-robotics-testing-scenario-837356652
