A multimodal intention detection sensor suite for shared autonomy of upper-limb robotic prostheses

File: sensors-20-06097.pdf (Published version, 2.99 MB, Adobe PDF)
Title: A multimodal intention detection sensor suite for shared autonomy of upper-limb robotic prostheses
Authors: Gardner, M
Mancero Castillo, C
Wilson, S
Farina, D
Burdet, E
Khoo, BC
Atashzar, SF
Vaidyanathan, R
Item Type: Journal Article
Abstract: Neurorobotic augmentation (e.g., robotic assist) is now in regular use to support individuals suffering from impaired motor functions. A major unresolved challenge, however, is the excessive cognitive load necessary for the human–machine interface (HMI). Grasp control remains one of the most challenging HMI tasks, demanding simultaneous, agile, and precise control of multiple degrees-of-freedom (DoFs) while following a specific timing pattern in the joint and human–robot task spaces. Most commercially available systems use either an indirect mode-switching configuration or a limited sequential control strategy, limiting activation to one DoF at a time. To address this challenge, we introduce a shared autonomy framework centred around a low-cost multi-modal sensor suite fusing: (a) mechanomyography (MMG) to estimate the intended muscle activation, (b) camera-based visual information for integrated autonomous object recognition, and (c) inertial measurement to enhance intention prediction based on the grasping trajectory. The complete system predicts user intent for grasp based on measured dynamical features during natural motions. A total of 84 motion features were extracted from the sensor suite, and tests were conducted on 10 able-bodied participants and 1 amputee for grasping common household objects with a robotic hand. Real-time grasp classification accuracy using visual and motion features reached 100%, 82.5%, and 88.9% across all participants for detecting and executing grasping actions for a bottle, lid, and box, respectively. The proposed multimodal sensor suite is a novel approach for predicting different grasp strategies and automating task performance using a commercial upper-limb prosthetic device. The system also shows potential to improve the usability of modern neurorobotic systems due to the intuitive control design.
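
The abstract outlines the core pattern of the system: per-modality features (MMG, IMU, vision) are fused into a single feature vector and passed to a grasp classifier. The Python sketch below illustrates only that fusion-and-classify pattern; the feature definitions, channel counts, simulated data, and classifier choice (scikit-learn's RandomForestClassifier) are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch of the multimodal fusion idea described in the abstract:
    # per-modality feature vectors (MMG, IMU, vision) are concatenated and fed to
    # a generic classifier. All names, dimensions, and data here are illustrative.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    def extract_mmg_features(window):
        """Toy time-domain features from an MMG window (shape: samples x channels)."""
        return np.concatenate([window.mean(axis=0), window.std(axis=0),
                               np.abs(np.diff(window, axis=0)).mean(axis=0)])

    def extract_imu_features(window):
        """Toy trajectory features from accelerometer/gyroscope samples."""
        return np.concatenate([window.mean(axis=0),
                               window.max(axis=0) - window.min(axis=0)])

    def extract_vision_features(object_scores):
        """Stand-in for object-recognition output (per-frame class scores)."""
        return object_scores.mean(axis=0)

    def fuse(mmg, imu, vision):
        """Concatenate per-modality features into one vector for classification."""
        return np.concatenate([extract_mmg_features(mmg),
                               extract_imu_features(imu),
                               extract_vision_features(vision)])

    # Simulated dataset: 3 grasp classes (bottle, lid, box), 40 windows each.
    classes = ["bottle", "lid", "box"]
    X, y = [], []
    for label, _ in enumerate(classes):
        for _ in range(40):
            mmg = rng.normal(loc=label, scale=1.0, size=(200, 4))    # 4 MMG channels
            imu = rng.normal(loc=label, scale=1.0, size=(200, 6))    # 3-axis accel + gyro
            vision = rng.normal(loc=label, scale=1.0, size=(10, 3))  # per-frame scores
            X.append(fuse(mmg, imu, vision))
            y.append(label)
    X, y = np.asarray(X), np.asarray(y)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

In the paper itself, 84 motion features feed the real-time grasp classifier; this toy example merely shows how heterogeneous feature vectors can be concatenated before classification.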
Issue Date: 27-Oct-2020
Date of Acceptance: 23-Oct-2020
URI: http://hdl.handle.net/10044/1/85026
DOI: 10.3390/s20216097
ISSN: 1424-8220
Publisher: MDPI AG
Journal / Book Title: Sensors
Volume: 20
Issue: 21
Copyright Statement: © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Keywords: Science & Technology
Physical Sciences
Technology
Chemistry, Analytical
Engineering, Electrical & Electronic
Instruments & Instrumentation
Chemistry
Engineering
shared autonomy
prosthetic technology
mechanomyography
HAND
RECOGNITION
STATE
Analytical Chemistry
0301 Analytical Chemistry
0805 Distributed Computing
0906 Electrical and Electronic Engineering
0502 Environmental Science and Management
0602 Ecology
Publication Status: Published
Article Number: 6097
Appears in Collections: Mechanical Engineering
Bioengineering
Faculty of Engineering



This item is licensed under a Creative Commons License.