Urban Madness is the first multi-humanoid-robot, single-participant study that the Personal Robots Group has performed. The study was run with two fully autonomous, battery-powered Mobile-Dexterous-Social (MDS) robots. A study participant entered a large arena in which both MDS robots operated in a mixed-initiative mode, dynamically tasking themselves whenever the human did not provide commands. The study was a collaboration among the MIT Media Lab, Stanford, and MIT CSAIL.
The human-robot team was given the task of finding as many items as possible under time constraints, and the highest overall score in the study earned a large cash prize. Speech recognition was used, with recognition errors corrected by human operators. Recognized speech was parsed into a small number of action policies that the robots could perform, and when a goal was achieved, a robot would dynamically retask itself. The robots were addressable by the proper names "Nexi" and "Maddox." An unmanned aerial vehicle, teleoperated by a human operator, provided hints to the participant. In addition, the participant had to search their own space for the items.
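The mixed-initiative loop described above can be sketched in a few lines. This is an illustrative reconstruction, not the study's actual software: the policy names, the keyword-based parser, and the self-tasking fallback are all hypothetical stand-ins for the real system.

```python
import random

# Hypothetical set of action policies a robot could perform (names are illustrative).
POLICIES = ["search_area", "go_to_location", "report_findings", "idle_scan"]

def parse_command(utterance):
    """Map a corrected speech transcript to an action policy.

    Returns None when no command is recognized, leaving the robot
    free to act on its own initiative.
    """
    utterance = utterance.lower()
    if "search" in utterance:
        return "search_area"
    if "go to" in utterance:
        return "go_to_location"
    if "report" in utterance:
        return "report_findings"
    return None

def next_policy(command):
    """Mixed initiative: obey a parsed human command if one exists,
    otherwise the robot retasks itself (random choice stands in for
    whatever task-selection logic the real system used)."""
    if command is not None:
        return command
    return random.choice(POLICIES)
```

For example, `next_policy(parse_command("Nexi, search the north corner"))` yields the search policy, while an unparsed utterance lets the robot pick its own task.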
Cognitive load was measured by how quickly the participant could respond to a flashing red light that was triggered at random times. The participant, who had to search their own space, task the mobile ground robots to search specific areas, ask the UAV for hints, and respond to the light whenever it flashed, was, to say the least, overloaded.
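This secondary-task probe reduces to pairing each random trigger time with the participant's response time and examining the latencies, where slower responses indicate higher load. A minimal sketch, with hypothetical function names and made-up timestamps:

```python
def reaction_times(trigger_times, response_times):
    """Pair each light trigger with the participant's response and
    return the latency (in seconds) for each probe."""
    return [resp - trig for trig, resp in zip(trigger_times, response_times)]

def mean_latency(latencies):
    """Average response latency: a simple summary of cognitive load."""
    return sum(latencies) / len(latencies)
```

A participant who takes noticeably longer to press the button while juggling the robots and the UAV would show a higher mean latency than in a single-task baseline.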
Back-channeled social cues were added in certain study groups to determine empirically what effect back-channeling has on easing cognitive load in high-demand situations such as disaster relief.