3D gaze cursor: continuous calibration and end-point grasp control of robotic actuators
| File | Description | Size | Format |
|---|---|---|---|
| MarcosAbbottFaisal.aldo_Final.AldoHisto.FINALFINALFINAL.pdf | Accepted version | 2.11 MB | Adobe PDF |
Title: | 3D gaze cursor: continuous calibration and end-point grasp control of robotic actuators |
Authors: | Marcos Tostado, P; Abbott, WW; Faisal, AA |
Item Type: | Conference Paper |
Abstract: | Eye movements are closely related to motor actions and can hence be used to infer motor intentions. Moreover, for paralysed patients with severe motor deficiencies, eye movements are in some cases the only means of communicating and interacting with the environment. Despite this, eye-tracking technology still has very limited use as a human-robot control interface: its applicability is largely restricted to simple 2D tasks on screen-based interfaces, which do not suffice for natural physical interaction with the environment. We propose that decoding gaze position in 3D space rather than 2D yields a much richer spatial cursor signal that allows users to perform everyday tasks, such as grasping and moving objects, via gaze-based robotic teleoperation. Calibration for 3D eye tracking is usually slow; we demonstrate here that by using a full 3D trajectory generated by a robotic arm for system calibration, rather than a simple grid of discrete points, gaze calibration in all three dimensions can be achieved quickly and with high accuracy. We perform the non-linear regression from eye image to 3D end point using Gaussian Process regressors, which allows us to handle uncertainty in end-point estimates gracefully (see the illustrative sketch after this record). Our telerobotic system uses a multi-joint robot arm with a gripper and is integrated with our in-house GT3D binocular eye tracker. This prototype has been evaluated in a test environment with 7 users, yielding gaze-estimation errors of less than 1 cm in the horizontal, vertical and depth dimensions, and less than 2 cm in overall 3D Euclidean distance. Users reported intuitive, low-cognitive-load control of the system from their first trial and were immediately able to look at an object and, with a wink, command the robot gripper to grasp it. |
Issue Date: | 20-Jun-2016 |
Date of Acceptance: | 16-May-2016 |
URI: | http://hdl.handle.net/10044/1/33957 |
DOI: | https://dx.doi.org/10.1109/ICRA.2016.7487502 |
Publisher: | IEEE |
Start Page: | 3295 |
End Page: | 3300 |
Journal / Book Title: | 2016 IEEE International Conference on Robotics and Automation (ICRA) |
Copyright Statement: | © 2016 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. |
Conference Name: | IEEE International Conference on Robotics and Automation (ICRA) |
Publication Status: | Published |
Start Date: | 2016-05-16 |
Finish Date: | 2016-05-21 |
Conference Place: | Stockholm |
Appears in Collections: | Bioengineering (Faculty of Engineering) |
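
The abstract's key technical step, a non-linear regression from binocular eye-tracker features to a 3D end point using Gaussian Processes, can be illustrated in a few lines. The record does not include the authors' code, so the sketch below is a minimal stand-in under stated assumptions: it uses scikit-learn's `GaussianProcessRegressor`, a toy vergence-based eye model in place of the GT3D tracker, and a synthetic robot-arm trajectory for calibration. The workspace geometry, kernel choice, and noise levels are all hypothetical, chosen only for illustration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

rng = np.random.default_rng(0)

# Synthetic stand-in for the calibration trajectory: the robot end-effector
# traces a smooth 3D curve inside a ~20 cm workspace cube (assumed geometry).
t = np.linspace(0, 2 * np.pi, 200)
targets_3d = np.column_stack([
    0.10 * np.sin(t),            # x (metres)
    0.10 * np.sin(2 * t),        # y
    0.40 + 0.10 * np.cos(t),     # z (depth from the eyes)
])

def pupil_features(p3d, rng):
    """Toy binocular eye model: each eye's pupil angle is a nonlinear
    function of target azimuth/elevation, and depth enters via vergence
    (left/right disparity). Purely illustrative, not the GT3D model."""
    x, y, z = p3d.T
    baseline = 0.065                        # interocular distance (m), assumed
    az_l = np.arctan2(x + baseline / 2, z)  # left-eye azimuth
    az_r = np.arctan2(x - baseline / 2, z)  # right-eye azimuth
    el = np.arctan2(y, z)                   # elevation (shared)
    feats = np.column_stack([az_l, el, az_r, el])
    return feats + rng.normal(scale=0.002, size=feats.shape)  # tracker noise

X_train = pupil_features(targets_3d, rng)
y_train = targets_3d

# GP regressor from 4-D pupil features to the 3-D end point. The WhiteKernel
# absorbs eye-tracker noise; the predictive standard deviation quantifies
# uncertainty in the decoded gaze position.
kernel = ConstantKernel(1.0) * RBF(length_scale=0.1) + WhiteKernel(1e-4)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

# Evaluate on held-out fixation points sampled inside the workspace.
test_3d = rng.uniform([-0.08, -0.08, 0.32], [0.08, 0.08, 0.48], size=(50, 3))
X_test = pupil_features(test_3d, rng)
pred, std = gp.predict(X_test, return_std=True)

err = np.linalg.norm(pred - test_3d, axis=1)
print(f"mean 3D Euclidean error: {err.mean() * 100:.2f} cm")
print(f"mean predictive std:     {std.mean() * 100:.2f} cm")
```

Because the GP returns a predictive standard deviation alongside each estimate, the decoded 3D cursor carries a per-fixation uncertainty, which is what allows end-point estimates to be handled "gracefully" as the abstract describes; a controller could, for example, refuse to trigger a grasp when the predicted uncertainty exceeds the gripper's tolerance.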