Abstract
Human Activity Recognition (HAR) has been employed in a wide range of applications, e.g. self-driving cars, where safety and lives are at stake. Recently, the robustness of skeleton-based HAR methods has been questioned due to their vulnerability to adversarial attacks. However, the proposed attacks require full knowledge of the attacked classifier, which is overly restrictive. In this paper, we show that such threats indeed exist, even when the attacker only has access to the input/output of the model. To this end, we propose BASAR, the first black-box adversarial attack approach in skeleton-based HAR. BASAR explores the interplay between the classification boundary and the natural motion manifold. To the best of our knowledge, this is the first time the data manifold has been introduced in adversarial attacks on time series. Via BASAR, we find that on-manifold adversarial samples are extremely deceitful and rather common in skeletal motions, in contrast to the common belief that adversarial samples exist only off-manifold. Through exhaustive evaluation, we show that BASAR can deliver successful attacks across classifiers, datasets, and attack modes. By attacking, BASAR helps identify the potential causes of model vulnerability and provides insights into possible improvements. Finally, to mitigate the newly identified threat, we propose mixed manifold-based adversarial training (MMAT), a new adversarial training approach that leverages the sophisticated distributions of on/off-manifold adversarial samples. MMAT can successfully help defend against adversarial attacks without compromising classification accuracy.
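The abstract describes a decision-based setting: the attacker queries only the model's input/output and walks along the classification boundary to shrink the perturbation. The sketch below illustrates that generic idea on a toy classifier; it is not the BASAR algorithm (which additionally projects onto the natural motion manifold), and `toy_classifier`, `decision_based_attack`, and all parameters are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_classifier(x):
    # Stand-in for a skeleton-based HAR model: labels by the sign of the mean.
    return int(np.mean(x) > 0)

def decision_based_attack(x, target_label, classify, steps=500, step_size=0.05):
    """Generic decision-based (black-box) attack sketch: start from a point
    already classified as target_label, then repeatedly move toward the clean
    input x, keeping only moves that stay on the adversarial side of the
    boundary. Only input/output queries are used -- no gradients."""
    # Sample random starting points until one lands in the target class.
    x_adv = rng.standard_normal(x.shape)
    while classify(x_adv) != target_label:
        x_adv = rng.standard_normal(x.shape)
    for _ in range(steps):
        # Step toward the clean sample to reduce the perturbation size,
        # with a small random jitter to explore along the boundary.
        candidate = x_adv + step_size * (x - x_adv)
        candidate += 0.01 * rng.standard_normal(x.shape)
        if classify(candidate) == target_label:
            x_adv = candidate  # accept only moves that remain adversarial
    return x_adv

x_clean = -np.ones(10)  # classified as 0 by the toy model
x_adv = decision_based_attack(x_clean, target_label=1, classify=toy_classifier)
print("adversarial label:", toy_classifier(x_adv))
print("perturbation norm:", np.linalg.norm(x_adv - x_clean))
```

The accept/reject loop is what makes the attack black-box: every decision is driven by label queries alone. BASAR's contribution, per the abstract, is constraining this search with the natural motion manifold so the resulting skeletal motions remain plausible (on-manifold).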
Original language | English
---|---
Article number | 110564
Journal | Pattern Recognition
Volume | 153
Early online date | 5 May 2024
DOIs |
Publication status | Published - 30 Sept 2024
Data Availability Statement
No data was used for the research described in the article.
Funding
This project has received funding from the EU H2020 (No 899739), EPSRC, UK (EP/R031193/1), NSF China (No. 62302139, No. 61772462, No. U1736217), RCUK grant CAMERA, UK (EP/M023281/1, EP/T022523/1), FRFCU-HFUT, China (JZ2023HGTA0202, JZ2023HGQA0101).
Funders | Funder number
---|---
Research Councils UK Digital Economy Programme | EP/T022523/1, EP/M023281/1
FRFCU-HFUT | JZ2023HGTA0202, JZ2023HGQA0101
EU H2020‐MSCA‐RISE‐872049 | 899739
Engineering and Physical Sciences Research Council | EP/R031193/1
National Natural Science Foundation of China | 61772462, 62302139, U1736217
Keywords
- Adversarial robustness
- Black-box attack
- On-manifold adversarial samples
- Skeletal action recognition
ASJC Scopus subject areas
- Software
- Signal Processing
- Computer Vision and Pattern Recognition
- Artificial Intelligence