Personalised assistive dressing by humanoid robots using multi-modal information

File: ws_icra2016_stamped.pdf (Accepted version, 1.9 MB, Adobe PDF)
Title: Personalised assistive dressing by humanoid robots using multi-modal information
Author(s): Gao, Y; Chang, HJ; Demiris, Y
Item Type: Conference Paper
Abstract: In this paper, we present an approach to enable a humanoid robot to provide personalised dressing assistance to human users using multi-modal information. A depth sensor is mounted on top of the robot to provide visual information, and the robot's end effectors are equipped with force sensors to provide haptic information. We use the visual information to model the movement range of human upper-body parts. The robot plans its dressing motions using the movement range models and the real-time human pose. During assistive dressing, the force sensors are used to detect external force resistance, and we present how the robot locally adjusts its motions based on the detected forces. In the experiments, we show that the robot can assist a human in putting on a sleeveless jacket while reacting to force resistance.
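
For illustration only, the sketch below shows the general shape of the force-reactive control loop the abstract describes: follow planned waypoints clipped to the user's modelled movement range, and back off locally when the end-effector force sensor reports resistance above a threshold. Every name (plan_dressing_path, FORCE_LIMIT, StubRobot, and so on) and every numeric value is a hypothetical assumption for this sketch, not the authors' implementation or API.

```python
# Minimal sketch (assumed interfaces, not the authors' code) of a
# force-reactive dressing loop: execute planned waypoints inside the
# user's modelled movement range; retreat locally on felt resistance.
import numpy as np

FORCE_LIMIT = 5.0   # N; assumed resistance threshold, not from the paper
BACKOFF = 0.02      # m; assumed size of the local retreat


def plan_dressing_path(pose, movement_range, n_waypoints=20):
    """Straight-line waypoints from hand to shoulder, clipped to the
    modelled movement range (here simplified to an axis-aligned box)."""
    lo, hi = movement_range
    path = np.linspace(np.asarray(pose["hand"]),
                       np.asarray(pose["shoulder"]), n_waypoints)
    return [np.clip(p, lo, hi) for p in path]


def dress(robot, force_sensor, pose, movement_range):
    """Execute the motion, locally adjusting on detected resistance."""
    for waypoint in plan_dressing_path(pose, movement_range):
        robot.move_to(waypoint)
        force = force_sensor.read()            # 3-D force vector (N)
        magnitude = np.linalg.norm(force)
        if magnitude > FORCE_LIMIT:
            # Retreat a short step along the sensed force direction
            # (away from the resistance), then continue the plan.
            robot.move_to(waypoint + BACKOFF * force / magnitude)


# Stub hardware so the sketch runs standalone.
class StubRobot:
    def move_to(self, p):
        print("move_to", np.round(p, 3))


class StubForceSensor:
    def read(self):
        return np.zeros(3)  # no resistance in this stub


if __name__ == "__main__":
    pose = {"hand": [0.4, -0.3, 0.9], "shoulder": [0.1, -0.2, 1.3]}
    box = (np.array([-0.5, -0.5, 0.5]), np.array([0.5, 0.5, 1.5]))
    dress(StubRobot(), StubForceSensor(), pose, box)
```

The point of the structure is the interleaving: force is checked after every waypoint rather than once per plan, so the robot reacts during the motion instead of replanning from scratch, which matches the local adjustment behaviour described in the abstract.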
Publication Date: 20-May-2016
Date of Acceptance: 21-Apr-2016
URI: http://hdl.handle.net/10044/1/33436
Copyright Statement: © 2016 The Authors
Conference Name: Workshop on Human-Robot Interfaces for Enhanced Physical Interactions at ICRA
Publication Status: Published
Start Date: 2016-05-16
Finish Date: 2016-05-20
Conference Place: Stockholm, Sweden
Appears in Collections: Faculty of Engineering; Electrical and Electronic Engineering


