HYGRIP: Full-stack characterisation of neurobehavioural signals (fNIRS, EEG, EMG, force and breathing) during a bimanual grip force control task
File(s)
fnins-14-00919.pdf (1.74 MB)
Published version
Author(s)
Ortega San Miguel, P
Zhao, T
Faisal, AA
Type
Journal Article
Abstract
Brain-computer interfaces (BCIs) have achieved important milestones in recent years, but the majority of breakthroughs in the continuous control of movement have relied on invasive neural interfaces with the motor cortex or peripheral nerves. In contrast, non-invasive BCIs have made progress primarily in continuous decoding using event-related data, while the direct decoding of movement commands or muscle force from brain data remains an open challenge. Multi-modal signals from the human cortex, obtained by mobile brain imaging that combines oxygenation and electrical neuronal signals, have not yet been exploited to their full potential owing to the lack of computational techniques able to fuse and decode these hybrid measurements. To stimulate the research community and bring machine learning techniques closer to the state of the art in artificial intelligence, we herewith release a holistic data set of hybrid non-invasive measures for continuous force decoding: the Hybrid Dynamic Grip (HYGRIP) data set. We aim to provide a complete data set that comprises the target force for the left/right hand, cortical brain signals in the form of electroencephalography (EEG) with high temporal resolution and functional near-infrared spectroscopy (fNIRS), which captures a BOLD-like cortical brain response at higher spatial resolution, together with the muscle activity (EMG) of the grip muscles, the force generated at the grip sensor, and confounding noise sources such as breathing and eye movement activity during the task. In total, 14 right-handed subjects performed a uni-manual dynamic grip force task within 25-50% of each hand's maximum voluntary contraction. HYGRIP is intended as a benchmark with two open challenges and research questions for grip-force decoding. First, the exploitation and fusion of data from brain signals spanning very different time scales, as EEG changes about three orders of magnitude faster than fNIRS. Second, the decoding of whole-brain signals associated with the use of each hand, and the extent to which models share features across hands or, conversely, differ between hands. Our companion code makes the exploitation of the data readily available and accessible to researchers in the BCI, neurophysiology and machine learning communities. Thus, HYGRIP can serve as a test-bed for the development of BCI decoding algorithms and approaches fusing multimodal brain signals. The resulting methods will help to understand the limitations and opportunities of benefiting people in health and, indirectly, inform similar methods addressing the particular needs of people in disease.
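As a hedged illustration of the first challenge mentioned in the abstract (fusing signals whose dynamics differ by roughly three orders of magnitude), the sketch below aligns a fast EEG-like stream and a slow fNIRS-like stream onto a common time grid and fits a simple linear decoder of grip force. This is not the authors' companion code; the sampling rates, channel counts, window lengths and synthetic signals are illustrative assumptions only.

```python
# Minimal sketch (not the HYGRIP companion code): naive early fusion of a
# fast EEG-like stream and a slow fNIRS-like stream for grip-force decoding.
# All recording parameters below are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assumed sampling rates (Hz) and channel counts, not HYGRIP specifications
fs_eeg, fs_nirs, fs_force = 500.0, 10.0, 25.0
duration_s = 120.0
n_eeg_ch, n_nirs_ch = 32, 16

# Synthetic stand-ins for the real recordings
eeg = rng.standard_normal((int(duration_s * fs_eeg), n_eeg_ch))
nirs = rng.standard_normal((int(duration_s * fs_nirs), n_nirs_ch))
force = rng.standard_normal(int(duration_s * fs_force))

# One feature vector per force sample (common decoding grid)
t_force = np.arange(force.size) / fs_force

def window_mean(signal, fs, t_targets, width_s):
    """Average each channel over a causal window ending at each target time."""
    feats = np.zeros((t_targets.size, signal.shape[1]))
    for i, t in enumerate(t_targets):
        stop = int(t * fs) + 1
        start = max(0, stop - int(width_s * fs))
        feats[i] = signal[start:stop].mean(axis=0)
    return feats

# Short window for fast EEG dynamics, long window for the slow,
# BOLD-like fNIRS response (about three orders of magnitude slower)
X_eeg = window_mean(eeg, fs_eeg, t_force, width_s=0.5)
X_nirs = window_mean(nirs, fs_nirs, t_force, width_s=10.0)
X = np.hstack([X_eeg, X_nirs])  # early fusion by feature concatenation

X_tr, X_te, y_tr, y_te = train_test_split(X, force, test_size=0.3, shuffle=False)
decoder = Ridge(alpha=1.0).fit(X_tr, y_tr)
print("held-out R^2:", decoder.score(X_te, y_te))
```

Concatenating windowed features is only the simplest possible baseline; the benchmark challenge posed by HYGRIP is precisely to find fusion models that respect the differing temporal statistics of EEG and fNIRS better than this.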
Editor(s)
Daly, I
Date Issued
2020-10
Date Acceptance
2020-08-10
ISSN
1662-453X
Publisher
Frontiers Media
Start Page
1
End Page
10
Journal / Book Title
Frontiers in Neuroscience
Volume
14
Copyright Statement
© 2020 Ortega, Zhao and Faisal. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) http://creativecommons.org/licenses/by/4.0/. The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
License URI
Sponsor
Engineering and Physical Sciences Research Council
Identifier
https://www.frontiersin.org/articles/10.3389/fnins.2020.00919/full
Grant Number
EP/L016796/1
Subjects
Electroencephalography
near-infrared spectroscopy
non-invasive
Brain-Computer Interfaces
sensor-fusion
Dataset
continuous decoding
1109 Neurosciences
1701 Psychology
1702 Cognitive Sciences
Publication Status
Published
Article Number
919
Date Publish Online
2020-10-26