Performance of multiple neural networks in predicting lower limb joint moments using wearable sensors
File(s)
fbioe-11-1215770.pdf (1.51 MB)
Published version
Author(s)
Altai, Zainab
Boukhennoufa, Issam
Zhai, Xiaojun
Phillips, Andrew
Moran, Jason
Liew, Bernard
Type
Journal Article
Abstract
Joint moment measurements represent an objective biomechanical parameter in joint health assessment. Inverse dynamics based on 3D motion capture data is the current 'gold standard' to estimate joint moments. Recently, machine learning combined with data measured by wearable technologies such as electromyography (EMG), inertial measurement units (IMU), and electrogoniometers (GON) has been used to enable fast, easy, and low-cost measurements of joint moments. This study investigates the ability of various deep neural networks to predict lower limb joint moments merely from IMU sensors. The performance of five different deep neural networks (InceptionTimePlus, eXplainable convolutional neural network (XCM), XCMplus, Recurrent neural network (RNNplus), and Time Series Transformer (TSTPlus)) was tested to predict hip, knee, ankle, and subtalar moments using acceleration and gyroscope measurements of four IMU sensors at the trunk, thigh, shank, and foot. Multiple locomotion modes were considered, including level-ground walking, treadmill walking, stair ascent, stair descent, ramp ascent, and ramp descent. We show that XCM can accurately predict lower limb joint moments using data of only four IMUs, with an RMSE of 0.046 ± 0.013 Nm/kg compared to 0.064 ± 0.003 Nm/kg on average for the other architectures. We found that hip, knee, and ankle joint moment predictions had a comparable RMSE, with an average of 0.069 Nm/kg, while subtalar joint moments had the lowest RMSE of 0.033 Nm/kg. The real-time feedback that can be derived from the proposed method can be highly valuable for sports scientists and physiotherapists to gain insights into biomechanics, technique, and form to develop personalized training and rehabilitation programs.
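The evaluation metric reported in the abstract, RMSE in Nm/kg between a network's predicted joint moments and the inverse-dynamics reference, can be sketched as follows. All shapes and values here are illustrative assumptions (a small random batch standing in for real IMU windows and a trained network's output), not the paper's data:

```python
import numpy as np

# Illustrative dimensions only (assumed, not from the paper's dataset):
# gait windows of 100 time steps, 4 IMUs x 6 channels (3 accel + 3 gyro).
n_windows, n_timesteps = 8, 100
n_channels = 4 * 6  # trunk, thigh, shank, and foot sensors
rng = np.random.default_rng(0)

X = rng.standard_normal((n_windows, n_timesteps, n_channels))  # IMU input
# Reference moments (hip, knee, ankle, subtalar), body-mass normalized (Nm/kg).
y_true = rng.standard_normal((n_windows, n_timesteps, 4))
# Stand-in for a network's predictions: reference plus small noise.
y_pred = y_true + 0.05 * rng.standard_normal(y_true.shape)

# Per-joint RMSE in Nm/kg, pooled over windows and time steps.
rmse = np.sqrt(np.mean((y_pred - y_true) ** 2, axis=(0, 1)))
print(dict(zip(["hip", "knee", "ankle", "subtalar"], rmse.round(3))))
```

With real data, `y_pred` would come from one of the tested architectures (e.g. XCM) applied to `X`; the abstract's figures of 0.046 and 0.064 Nm/kg are averages of exactly this kind of per-joint RMSE.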
Date Issued
2023-07-31
Date Acceptance
2023-07-14
Citation
Frontiers in Bioengineering and Biotechnology, 2023, 11, pp.1-12
URI
http://hdl.handle.net/10044/1/106225
URL
http://dx.doi.org/10.3389/fbioe.2023.1215770
DOI
https://doi.org/10.3389/fbioe.2023.1215770
ISSN
2296-4185
Publisher
Frontiers Media S.A.
Start Page
1
End Page
12
Journal / Book Title
Frontiers in Bioengineering and Biotechnology
Volume
11
Copyright Statement
Copyright © 2023 Altai, Boukhennoufa, Zhai, Phillips, Moran and Liew. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
License URL
https://creativecommons.org/licenses/by/4.0/
Identifier
http://dx.doi.org/10.3389/fbioe.2023.1215770
Publication Status
Published
Article Number
1215770
Date Publish Online
2023-07-31

Accessibility Modern slavery statement Cookie Policy

Built with DSpace-CRIS software - Extension maintained and optimized by 4Science

  • Cookie settings
  • Privacy policy
  • End User Agreement
  • Send Feedback