Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking
File(s)
ICORR2017endpointControl.RESUBMISSION.FINAL.pdf (1.85 MB)
Accepted version
Author(s)
Maimon-Mor, RO
Fernandez-Quesada, J
Zito, GA
Konnaris, C
Dziemian, S
et al.
Type
Conference Paper
Abstract
Eye movements are the only directly observable behavioural signals that are highly correlated with actions at the task level and predictive of body movements, and thus reflect action intentions. Moreover, eye movements are preserved in many movement disorders that lead to paralysis, including stroke, spinal cord injury, Parkinson's disease, multiple sclerosis, and muscular dystrophy, as well as in amputees. Despite this advantage, eye tracking is not widely used as a control interface for robotic systems in movement-impaired patients, owing to poor human-robot interfaces. We demonstrate here how combining 3D gaze tracking, using our GT3D binocular eye tracker, with a custom-designed 3D head-tracking system and calibration method enables continuous 3D end-point control of a robotic arm support system. Users can move their own hand to any location in the workspace by simply looking at the target and winking once. This purely eye-tracking-based system lets the end-user retain free head movement, yet achieves high spatial end-point accuracy on the order of 6 cm RMSE in each dimension, with a standard deviation of 4 cm. 3D calibration is achieved by moving the robot along a three-dimensional space-filling Peano curve while the user tracks it with their eyes. This fully automated calibration procedure yields several thousand calibration points, versus the dozen or so points of standard approaches, resulting in beyond state-of-the-art 3D accuracy and precision.
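The calibration idea described in the abstract, driving the robot through a continuous space-filling sweep of the workspace and regressing gaze signals onto the known 3D robot positions, can be sketched in a few lines. This is an illustrative reconstruction, not the paper's implementation: the serpentine raster below is a simplified stand-in for the Peano curve, and the linear gaze model and noise level are assumptions made purely for the demo.

```python
import numpy as np

def serpentine_grid(n=8, size=0.5):
    # Simplified stand-in for the paper's 3D Peano curve: a serpentine
    # (boustrophedon) raster that visits every cell of an n x n x n grid
    # along one continuous path, yielding n**3 calibration targets
    # instead of the ~dozen points of standard calibration.
    pts = []
    for k in range(n):
        ys = range(n) if k % 2 == 0 else range(n - 1, -1, -1)
        for j in ys:
            xs = range(n) if (k + j) % 2 == 0 else range(n - 1, -1, -1)
            for i in xs:
                pts.append((i, j, k))
    return np.array(pts, float) / (n - 1) * size  # workspace in metres

def fit_affine(gaze, targets):
    # Least-squares affine map from gaze features to 3D end-points.
    A = np.hstack([gaze, np.ones((len(gaze), 1))])
    W, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return W

rng = np.random.default_rng(0)
targets = serpentine_grid()  # 512 calibration points along the sweep
# Assumed toy gaze model: an unknown linear transform of the target
# position plus sensor noise (the real system uses binocular vergence).
true_map = rng.normal(size=(3, 3))
gaze = targets @ true_map.T + rng.normal(scale=0.01, size=targets.shape)
W = fit_affine(gaze, targets)
pred = np.hstack([gaze, np.ones((len(gaze), 1))]) @ W
rmse = np.sqrt(((pred - targets) ** 2).mean(axis=0))
print(rmse)  # per-dimension RMSE of decoded end-points
```

The dense, continuous trajectory is what makes the procedure fully automated: the user only has to follow the moving robot with their eyes, and every sample along the sweep becomes a calibration pair.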
Date Issued
2017-08-15
Date Acceptance
2017-05-15
Citation
IEEE Conference on Rehabilitation Robotics, 2017, pp.1049-1054
URI
http://hdl.handle.net/10044/1/48591
DOI
https://doi.org/10.1109/ICORR.2017.8009388
Publisher
IEEE
Start Page
1049
End Page
1054
Journal / Book Title
IEEE Conference on Rehabilitation Robotics
Copyright Statement
© 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Source
15th IEEE Conference on Rehabilitation Robotics (ICORR 2017)
Subjects
Science & Technology
Technology
Life Sciences & Biomedicine
Engineering, Electrical & Electronic
Robotics
Rehabilitation
Engineering
INTERFACE
FEASIBILITY
VISION
EEG
EMG
Adult
Eye Movements
Head
Humans
Robotics
Self-Help Devices
Young Adult
Publication Status
Published
Start Date
2017-07-17
Finish Date
2017-07-20
Coverage Spatial
London, UK
Date Publish Online
2017-08-15