Recently, I was fortunate to lend my expertise in experimental design and HRI studies with human users, collaborating with researchers in the Humans 2 Robots Lab in Brown's Computer Science Department on a paper for the 2017 IROS conference.
Whitney, D., Rosen, E., Phillips, E., Konidaris, G., & Tellex, S. (Submitted). A virtual reality interface for robots. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).
Draft Abstract:
Teleoperation allows a human to remotely operate a robot to perform complex and potentially dangerous tasks such as defusing a bomb, repairing a nuclear reactor, or maintaining the exterior of a space station. Existing teleoperation approaches generally rely on computer monitors to display sensor data and joysticks or keyboards to actuate the robot. These approaches use 2D interfaces to view and interact with a 3D world, which can make using them difficult for complex or delicate tasks. To address this problem, we introduce a virtual reality interface that allows users to remotely teleoperate a physical robot in real time. Our interface allows users to control their point of view in the scene using virtual reality, increasing situational awareness (especially of object contact), and to directly move the robot's end effector by moving a hand controller in 3D space, enabling fine-grained dexterous control. We evaluated our interface on a cup-stacking manipulation task with 18 users, comparing the relative effectiveness of a keyboard and mouse interface, virtual reality camera control, and positional hand tracking. Our system reduces task completion time by 101 seconds (a reduction of 66%), while improving subjective assessments of system usability and workload.
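To give a flavor of what "directly move the robot's end effector by moving a hand controller" means in practice, here is a minimal sketch of a relative positional mapping. All names and parameters here (the `Pose` type, the `scale` factor, the clutch-origin idea) are illustrative assumptions of mine, not the paper's actual implementation:

```python
from dataclasses import dataclass


@dataclass
class Pose:
    """A simplified 3D position; a real system would also track orientation."""
    x: float
    y: float
    z: float


def end_effector_target(controller: Pose, controller_origin: Pose,
                        ee_origin: Pose, scale: float = 1.0) -> Pose:
    """Relative mapping: the controller's displacement since the user engaged
    control is applied (optionally scaled down for precision) to the end
    effector's pose at that same moment."""
    return Pose(
        ee_origin.x + scale * (controller.x - controller_origin.x),
        ee_origin.y + scale * (controller.y - controller_origin.y),
        ee_origin.z + scale * (controller.z - controller_origin.z),
    )


# Example: with scale=0.5, moving the controller 10 cm along x moves the
# commanded end-effector target 5 cm along x.
target = end_effector_target(
    controller=Pose(0.10, 0.0, 0.0),
    controller_origin=Pose(0.0, 0.0, 0.0),
    ee_origin=Pose(0.50, 0.20, 0.30),
    scale=0.5,
)
```

A relative (clutched) mapping like this is one common design choice for positional hand tracking, since it lets the operator reposition their arm comfortably without dragging the robot along; whether the paper's system works this way is not stated in the abstract.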