Bidirectional Human-Robot Handovers
Funding Sources
Principal Investigator
Collaborating Academic Organizations
- Collaborative Advanced Robotics and Intelligent Systems Laboratory
- Computer Vision and Systems Laboratory of the Université Laval
- Laboratoire de robotique de l’Université Laval
- Artificial Perception Laboratory at the McGill Center for Intelligent Machines
- Sensory Perception and Interaction Research Group
Thesis
Project Description
Fundamental human-to-human interactions – sharing spaces, tools, handing over objects, carrying objects together – are part of everyday experience; for most people, handing over an object to another person is natural and seemingly effortless. In the context of human-robot interaction, however, achieving a smooth and seamless handover remains an open problem of fundamental interest to robotics designers, integrators and users alike. This thesis explores how nonverbal cues exhibited during robot giving and receiving behaviours change how users perceive the robot and affect the handover task. It also investigates how robots can recognize and interpret expressions conveyed by a human giver to infer handover intent.
Through several user studies examining human-human and human-robot handovers, this thesis investigates how nonverbal cues such as gaze and object orientation contribute to the fluency and efficiency of robot-to-human handovers. These studies provide insights into how robots can be trained through observation of human-to-human handovers. The thesis then examines the role of nonverbal cues in the less-studied human-to-robot handover. In this exploration, kinematic features extracted from motion-captured skeleton models of a giver are used to detect handover intent, enabling a robot to react appropriately and receive the object. Additionally, changes in user perception, and in the geometry and dynamics of human-to-robot handovers, are explored by varying the robot receiver's initial pose, its grasp type during the handover, and its retraction speed afterwards.
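The intent-detection step described above can be illustrated with a minimal sketch. Assuming a short window of 3D wrist positions from a motion-captured skeleton sampled at 30 Hz, two hypothetical features (net reach and mean wrist speed) stand in for the thesis's actual learned model; the thresholds below are illustrative, not values from the work:

```python
import math


def handover_features(wrist_xyz, dt=1.0 / 30.0):
    """Net reach (m) and mean wrist speed (m/s) over a window of
    motion-capture frames. Illustrative features only; the thesis's
    kinematic feature set is richer."""
    reach = math.dist(wrist_xyz[0], wrist_xyz[-1])
    path = sum(math.dist(a, b) for a, b in zip(wrist_xyz, wrist_xyz[1:]))
    speed = path / (dt * (len(wrist_xyz) - 1))
    return reach, speed


def infer_intent(wrist_xyz, dt=1.0 / 30.0, reach_min=0.25, speed_min=0.30):
    """Flag handover intent when the giver's wrist both extends far
    enough and moves fast enough -- a simple threshold rule standing
    in for a trained classifier."""
    reach, speed = handover_features(wrist_xyz, dt)
    return reach >= reach_min and speed >= speed_min
```

In practice such a rule would be replaced by a classifier trained on labelled handover and non-handover motion sequences, but it captures the basic idea of mapping giver kinematics to a binary intent signal.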
Findings from this thesis demonstrate that nonverbal cues – gaze and object orientation in robot-to-human handovers, and kinodynamics in human-to-robot handovers – can significantly affect multiple aspects of the interaction, including user perception, fluency, legibility, efficiency, geometry and fluidity of the handover. Using a machine learning approach, handover intent could also be recognized effectively from the nonverbal kinematics of the giver's pose. The work presented in this thesis thus indicates that nonverbal cues can serve as a powerful medium for subtly communicating the details of a handover to a human partner, resulting in a more natural experience in this ubiquitous, collaborative activity.