Naturalistic robot-to-human bimanual handover in complex environments through multi-sensor fusion
Author(s)
Ovur, SE
Demiris, Y
Type
Journal Article
Abstract
Robot-human object handover has been extensively studied in recent years for a wide range of applications. However, it is still far from being as natural as human-human handovers, largely due to robots' limited sensing capabilities. Previous approaches in the literature typically simplify the handover scenario in one or more of the following ways: (a) conducting handovers at fixed locations, (b) not adapting to human preferences, or (c) focusing only on single-arm handovers with small objects, because of the sensor occlusions caused by large objects. To advance the state of the art toward a human-human level of handover fluency, this paper investigates a bimanual handover scenario in a naturalistic, complex setup. Specifically, we target robot-to-human box transfer while the human partner is on a ladder, and we ensure that the object is delivered adaptively according to human preferences. To address the occlusion problem that arises in such a complex environment, we develop an onboard multi-sensor perception system for the bimanual robot, introduce a measurement confidence estimation technique, and propose an occlusion-resilient multi-sensor fusion technique that positions visual perception sensors at distinct locations on the robot with different fields of view. In addition, we establish a Cartesian-space controller with a quaternion-based orientation representation and a leader-follower control structure for compliant motion. Four distinct experiments are conducted, covering different human preferences (such as the box being delivered above or below the hands) and significant handover location changes after the process has begun. For validation, the proposed multi-sensor fusion technique was compared to a single-sensor approach for the top and bottom sensors separately, and to simple averaging of both sensors. Thirty repetitions were performed for each experiment-method pair (four experiments, four methods), giving 480 handover repetitions in total. The multi-sensor fusion approach achieved a handover success rate above 86.7% in all experiments by successfully combining the strengths of both fields of view for human pose tracking under significant occlusions, without sacrificing handover duration. In contrast, due to the occlusions, the single-sensor and simple-averaging approaches failed completely in the challenging experiments, illustrating the importance of multi-sensor fusion in complex handover scenarios.
Note to Practitioners—This paper is motivated by enabling naturalistic robot-to-human bimanual object handovers in complex environments, a challenging problem due to occlusions. Existing approaches in the literature do not exploit multi-sensor fusion to handle occlusions, which is essential in such physical human-robot interaction scenarios. To this end, we have developed a multi-sensor fusion technique that improves the perception capabilities of robots with respect to their human co-workers. The developed framework has been tested with Microsoft Azure Kinect sensors and a bimanual mobile Baxter robot, but it can be adapted to any depth perception sensor and bimanual robotic platform. Furthermore, the introduced multi-sensor fusion technique is comprehensive and generic, as it can be applied to any intermittent sensor data, such as human pose tracking via RGBD sensors. The presented approach shows that enlarging the field of view of a robot's perception, combined with enhanced data fusion, can drastically improve its sensing capability.
As future work, the data fusion can be improved by introducing Bayesian filters, and the system can be validated with different sensors and robotic platforms. Moreover, the handover detection method, which relies on physical interaction, could further benefit from the incorporation of force sensors.
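To make the fusion idea above concrete, the following is a minimal sketch of confidence-weighted fusion of two intermittent keypoint estimates, assuming each sensor reports a 3D human keypoint in a common robot frame (after extrinsic calibration) together with a confidence in [0, 1] that drops to 0 under occlusion. All names and the weighting rule are illustrative assumptions, not the paper's exact confidence estimation or fusion method.

    # Hedged sketch: confidence-weighted fusion of two sensor views.
    # Assumes measurements are already expressed in a common robot frame.
    import numpy as np

    def fuse_keypoint(p_top, c_top, p_bottom, c_bottom, eps=1e-6):
        """Fuse top- and bottom-sensor keypoint estimates.

        p_top, p_bottom: np.ndarray of shape (3,), keypoint positions
        c_top, c_bottom: floats in [0, 1], per-measurement confidences
        Returns the fused keypoint, or None if both views are occluded.
        """
        total = c_top + c_bottom
        if total < eps:          # both views occluded: no reliable measurement
            return None
        w_top = c_top / total    # weights degrade gracefully to one sensor
        w_bottom = c_bottom / total
        return w_top * p_top + w_bottom * p_bottom

    # Example: bottom sensor occluded by the box, top sensor still tracking.
    fused = fuse_keypoint(np.array([0.4, 0.1, 1.2]), 0.9,
                          np.array([0.0, 0.0, 0.0]), 0.0)

Unlike simple averaging, the normalized weighting falls back to the unoccluded sensor when the other view is blocked, which mirrors why averaging fails in the challenging experiments described above.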
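Similarly, the Cartesian-space controller mentioned in the abstract can be illustrated with a quaternion orientation error and a leader-follower pose coupling. This is a hypothetical sketch only: the gains, the fixed grasp offset (standing in for the box width between the hands), and the proportional-twist form are assumptions, not the paper's controller.

    # Hedged sketch: quaternion orientation error and a leader-follower
    # Cartesian twist command. Quaternions are in [w, x, y, z] order.
    import numpy as np

    def quat_conj(q):
        """Conjugate of a unit quaternion."""
        w, x, y, z = q
        return np.array([w, -x, -y, -z])

    def quat_mul(a, b):
        """Hamilton product a ⊗ b."""
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return np.array([
            aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw,
        ])

    def orientation_error(q_des, q_cur):
        """Rotation error as a 3-vector from the vector part of
        q_des ⊗ q_cur*; the sign flip keeps the shorter rotation path."""
        q_err = quat_mul(q_des, quat_conj(q_cur))
        if q_err[0] < 0.0:
            q_err = -q_err
        return 2.0 * q_err[1:]

    def follower_twist(p_leader, q_leader, p_follower, q_follower,
                       offset=np.array([0.0, -0.3, 0.0]), kp=1.5, ko=1.0):
        """Proportional twist driving the follower arm toward the leader's
        pose shifted by a fixed grasp offset (hypothetical values)."""
        v = kp * (p_leader + offset - p_follower)          # linear command
        w = ko * orientation_error(q_leader, q_follower)   # angular command
        return v, w

The quaternion error avoids the singularities of Euler-angle differences, which is the usual motivation for quaternion-based Cartesian control of the kind the abstract names.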
Date Issued
2024-07
Online Publication Date
2024-10-18T11:13:44Z
Date Acceptance
2023-04-27
ISSN
1042-296X
Publisher
Institute of Electrical and Electronics Engineers
Start Page
3730
End Page
3741
Journal / Book Title
IEEE Transactions on Automation Science and Engineering
Volume
21
Issue
3
Copyright Statement
Copyright © 2024 IEEE. This is the author’s accepted manuscript made available under a CC-BY licence in accordance with Imperial’s Research Publications Open Access policy (www.imperial.ac.uk/oa-policy)
License URI
Identifier
https://ieeexplore.ieee.org/document/10167504
https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:001025530500001&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=a2bf6146997ec60c407a63945d4e92bb
Publication Status
Published
Date Publish Online
2023-06-27