Multilevel Assessment of Exercise Fatigue Utilizing Multiple Attention and Convolution Network (MACNet) based on Surface Electromyography

Guofu Zhang, Banghua Yang, Peng Zan, Dingguo Zhang

Research output: Contribution to journal › Article › peer-review


Abstract

Background: Assessment of exercise fatigue is crucial for enhancing work capacity and minimizing the risk of injury. In recent years, surface electromyography (sEMG) has emerged as a technology for quantitatively assessing exercise fatigue. However, existing research primarily distinguishes between fatigue and non-fatigue states, offering limited and less robust findings for multilevel evaluation.

Methods: This study proposes a multiple attention and convolution network (MACNet) for three-level assessment of muscle fatigue based on sEMG. Under a designed 50% maximum voluntary contraction experimental paradigm, sEMG signals and ratings of perceived exertion (RPE) are collected from 48 subjects. MACNet assesses sEMG fatigue by combining an improved sliding-window temporal attention, multiscale convolution, and channel-spatial attention. Finally, Grad-CAM visualization is used to verify the model's interpretability, exploring the effects of sEMG channels and time-domain features on exercise fatigue.

Results: The average classification F1-score and accuracy of MACNet are 83.95% and 84.11% in the subject-wise setting and 82.83% and 82.43% in the cross-subject setting, respectively. The Grad-CAM visualization highlights the greater contribution of the flexor digitorum superficialis and flexor digitorum profundus to evaluating high fatigue, along with the varied impact of time-domain features on exercise fatigue assessment.

Conclusion: MACNet achieves the highest average classification accuracy and F1-score, significantly outperforming state-of-the-art methods such as SVM, RF, MFFNet, TSCNN, LMDANet, Conformer, and MSFEnet, and enhancing the extraction of exercise fatigue information from sEMG channels and time-domain features.
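
This record does not include the authors' implementation. As a rough illustration only, the following is a minimal PyTorch sketch of a MACNet-style pipeline assembled from the components the abstract names: parallel multiscale 1-D convolutions, CBAM-style channel-spatial attention, and a simple sliding-window temporal attention. Every name, shape, and hyperparameter here (MACNetSketch, 8 sEMG channels, 16 feature maps per branch, window length 32, three fatigue levels) is a hypothetical assumption, not the published architecture.

```python
# Hypothetical sketch of a MACNet-style classifier; NOT the authors' code.
# Input: sEMG of shape (batch, n_emg_channels, n_samples); output: 3 fatigue levels.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleConv(nn.Module):
    """Parallel 1-D convolutions with different kernel sizes, concatenated."""
    def __init__(self, in_ch, out_ch, kernel_sizes=(3, 7, 15)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv1d(in_ch, out_ch, k, padding=k // 2) for k in kernel_sizes
        )

    def forward(self, x):                      # x: (B, C, T)
        return torch.cat([F.relu(b(x)) for b in self.branches], dim=1)


class ChannelSpatialAttention(nn.Module):
    """CBAM-style attention: reweight feature channels, then time steps."""
    def __init__(self, ch, reduction=4):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(ch, ch // reduction), nn.ReLU(), nn.Linear(ch // reduction, ch)
        )
        self.spatial = nn.Conv1d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):                      # x: (B, C, T)
        ca = torch.sigmoid(self.mlp(x.mean(-1)) + self.mlp(x.amax(-1)))
        x = x * ca.unsqueeze(-1)               # channel attention
        s = torch.cat([x.mean(1, keepdim=True), x.amax(1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))  # temporal ("spatial") attention


class SlidingWindowTemporalAttention(nn.Module):
    """Mean-pool non-overlapping windows, then attention-weight them into one vector."""
    def __init__(self, ch, window=32):
        super().__init__()
        self.window = window
        self.score = nn.Linear(ch, 1)

    def forward(self, x):                      # x: (B, C, T)
        B, C, T = x.shape
        n = T // self.window
        w = x[:, :, : n * self.window].reshape(B, C, n, self.window).mean(-1)
        a = torch.softmax(self.score(w.transpose(1, 2)), dim=1)   # (B, n, 1)
        return (w.transpose(1, 2) * a).sum(1)                     # (B, C)


class MACNetSketch(nn.Module):
    def __init__(self, n_emg_channels=8, n_classes=3):
        super().__init__()
        self.ms = MultiScaleConv(n_emg_channels, 16)   # 3 branches -> 48 maps
        self.att = ChannelSpatialAttention(48)
        self.temporal = SlidingWindowTemporalAttention(48)
        self.head = nn.Linear(48, n_classes)

    def forward(self, x):                      # x: (B, n_emg_channels, T)
        return self.head(self.temporal(self.att(self.ms(x))))


if __name__ == "__main__":
    model = MACNetSketch()
    logits = model(torch.randn(4, 8, 1024))    # 4 trials, 8 channels, 1024 samples
    print(logits.shape)                        # torch.Size([4, 3])
```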

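The abstract also reports Grad-CAM visualization for interpretability. Below is a hedged sketch of 1-D Grad-CAM applied to the toy model above, assuming the multiscale convolution output (model.ms) as the target layer; the hook-based extraction and the normalization are illustrative choices, not the authors' tooling.

```python
# Minimal 1-D Grad-CAM sketch for MACNetSketch above; NOT the paper's tooling.
# Weights the target layer's feature maps by the gradient of the class score,
# yielding a per-time-step relevance trace for each trial.
import torch

def grad_cam_1d(model, x, target_class):
    """Return a (batch, time) relevance map for `target_class`."""
    feats = {}

    def hook(_module, _inputs, out):
        out.retain_grad()                    # keep grads of this non-leaf tensor
        feats["a"] = out

    handle = model.ms.register_forward_hook(hook)   # assumed target layer
    logits = model(x)
    handle.remove()
    logits[:, target_class].sum().backward()

    a, g = feats["a"], feats["a"].grad       # activations and their gradients
    weights = g.mean(dim=-1, keepdim=True)   # global-average-pool grads over time
    cam = torch.relu((weights * a).sum(dim=1))          # (B, T)
    return cam / cam.amax(dim=-1, keepdim=True).clamp_min(1e-8)

# Usage: cam = grad_cam_1d(MACNetSketch(), torch.randn(1, 8, 1024), target_class=2)
```
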
Original language: English
Pages (from-to): 243-254
Number of pages: 12
Journal: IEEE Transactions on Neural Systems and Rehabilitation Engineering
Volume: 33
Early online date: 26 Dec 2024
Publication status: Published - 31 Jan 2025

Funding

This work was supported in part by the National Key Research and Development Program of China under Grant 2022YFC3602700 and Grant 2022YFC3602703, in part by the National Natural Science Foundation of China under Grant 62376149, in part by the National Defense Basic Scientific Research Program of China (Defense Industrial Technology Development Program) under Grant JCKY2021413B005, in part by Shanghai Major Science and Technology Project under Grant 2021SHZDZX, in part by Shanghai Industrial Collaborative Technology Innovation Project under Grant XTCX-KJ-2022-2-14, and in part by China Scholarship Council under Grant 202206890012. This work involved human subjects or animals in its research. Approval of all ethical and experimental procedures and protocols was granted by the Ethics Committee of Shanghai University under Approval No. ECSHU 2023-097.

Funders and funder numbers

• National Defense Basic Scientific Research Program of China: JCKY2021413B005
• National Key Research and Development Program of China: 2022YFC3602703, 2022YFC3602700
• Shanghai Major Science and Technology Project: 2021SHZDZX
• National Natural Science Foundation of China: 62376149
• China Scholarship Council: 202206890012
• Shanghai Industrial Collaborative Technology Innovation Project: XTCX-KJ-2022-2-14
• Shanghai University: ECSHU 2023-097

Keywords

• attention mechanism
• exercise fatigue
• multilevel assessment
• multiscale convolution
• sEMG

ASJC Scopus subject areas

• Internal Medicine
• General Neuroscience
• Biomedical Engineering
• Rehabilitation
