As part of our MURI project, our team is building a virtual world simulator for the MDS robot to assist us in developing complex human-robot teaming behaviors.

Our simulator is based on USARSim, which builds on Unreal Tournament 2004 and adds elements (such as robots and environments) for simulating urban search and rescue scenarios. USARSim also includes maps that model the real test arenas in NIST's Reference Test Facility for Autonomous Mobile Robots for Urban Search and Rescue.

We have added an MDS model to USARSim that accurately reproduces the degrees of freedom of the physical robot. Several of the robot's perceptual systems are simulated as well, including the stereo camera pair in the head and the Hokuyo laser scanner in the base. The virtual MDS also provides sound and odometry sensors and can optionally be equipped with an IMU or a GPS receiver.
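USARSim reports simulated sensor readings as text messages over TCP, one per line, with the values packed into `{Key Value}` segments. As a rough sketch (the exact segment names for the MDS sensors depend on the configuration, so the `RangeScanner` message below is only an illustrative example), a client might parse these messages like this:

```python
# Sketch of parsing USARSim's line-oriented sensor messages.
# The example message follows the generic {Key Value} segment format;
# actual field names vary by sensor type and configuration.
import re

def parse_usarsim_msg(line):
    """Split one USARSim message into its kind and a dict of segments."""
    kind, _, body = line.strip().partition(" ")
    fields = dict(re.findall(r"\{(\S+) ([^}]*)\}", body))
    return kind, fields

kind, fields = parse_usarsim_msg(
    "SEN {Type RangeScanner} {Name Scanner1} {Range 1.2,1.3,1.5}")
# kind == "SEN"; fields["Range"] holds the comma-separated distances
ranges = [float(r) for r in fields["Range"].split(",")]
```

A visualization tool can then feed the resulting range list directly into a laser-scan display.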

Our team is also developing operator tools that visualize the robot's state (laser readings, localization on a map, etc.), let a remote operator control each degree of freedom of the robot, and display a live video feed from the cameras.
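Operator commands reach the simulator over the same text-based TCP protocol. A minimal client sketch is below; the robot class name `USARBot.MDS`, the spawn location, and the default port 3000 are assumptions for illustration and would need to match the actual simulator setup:

```python
# Sketch of issuing commands over USARSim's text-based TCP protocol.
# USARBot.MDS, the spawn location, and port 3000 are illustrative assumptions.
import socket

def usarsim_cmd(kind, params):
    """Format a USARSim command line, e.g. DRIVE {Left 1.0} {Right 1.0}."""
    segments = "".join(" {%s %s}" % (key, value) for key, value in params)
    return kind + segments + "\r\n"

def connect_and_spawn(host="localhost", port=3000):
    """Connect to the simulator and spawn a robot at an assumed location."""
    sock = socket.create_connection((host, port))
    sock.sendall(usarsim_cmd("INIT", [("ClassName", "USARBot.MDS"),
                                      ("Name", "MDS1"),
                                      ("Location", "0,0,1.0")]).encode("ascii"))
    return sock

if __name__ == "__main__":
    sock = connect_and_spawn()
    # Command both drive sides forward (skid-steer style DRIVE message).
    sock.sendall(usarsim_cmd("DRIVE", [("Left", 1.0),
                                       ("Right", 1.0)]).encode("ascii"))
    sock.close()
```

A teleoperation GUI would map joystick or slider input to such commands and stream them continuously.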

Our longer-term research touches on peer-to-peer collaboration between humans and robots, automated robot task allocation, and planning for teams of agents under uncertainty.

Participating universities include MIT, the University of Washington, Vanderbilt University, UMass Amherst, and Stanford.