Human activity recognition (HAR) is essential for the development of robots that assist humans in daily activities. HAR must be accurate, fast and suitable for low-cost wearable devices to ensure portable and safe assistance. Current computational methods can achieve accurate recognition results but tend to be computationally expensive, making them unsuitable for the development of wearable robots in terms of speed and processing power. This paper proposes a light-weight architecture for recognition of activities using five inertial measurement units and four goniometers attached to the lower limb. First, a systematic extraction of time-domain features from wearable sensor data is performed. Second, a small high-speed artificial neural network and a line search method for cost function optimization are used for activity recognition. The proposed method is systematically validated using a large dataset composed of wearable sensor data from seven activities (sitting, standing, walking, stair ascent/descent, ramp ascent/descent) associated with eight healthy subjects. The accuracy and speed results are compared against methods commonly used for activity recognition, including deep neural networks, convolutional neural networks, long short-term memory and convolutional–long short-term memory hybrid networks. The experiments demonstrate that the light-weight architecture can achieve a high recognition accuracy of 98.60%, 93.10% and 84.77% for seen data from seen subjects, unseen data from seen subjects and unseen data from unseen subjects, respectively, and an inference time of 85 μs. The results show that the proposed approach can perform accurate and fast activity recognition with a reduced computational complexity suitable for the development of portable assistive devices.
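The first stage of the pipeline described in the abstract (systematic time-domain feature extraction from windows of wearable sensor data) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the channel count, window length and the particular feature set (mean, standard deviation, min, max, RMS) are assumptions chosen for the example.

```python
import numpy as np

def time_domain_features(window):
    """Compute simple time-domain features for one window of
    multi-channel sensor data (shape: samples x channels).
    Returns one concatenated feature vector per window."""
    return np.concatenate([
        window.mean(axis=0),                   # per-channel mean
        window.std(axis=0),                    # per-channel standard deviation
        window.min(axis=0),                    # per-channel minimum
        window.max(axis=0),                    # per-channel maximum
        np.sqrt((window ** 2).mean(axis=0)),   # per-channel RMS
    ])

# Illustrative numbers only: e.g. 5 IMUs with 6 axes each plus
# 4 goniometer channels = 34 channels, 125 samples per window.
rng = np.random.default_rng(0)
window = rng.standard_normal((125, 34))
features = time_domain_features(window)
print(features.shape)  # (170,) = 34 channels x 5 features
```

The resulting fixed-length feature vectors would then be fed to a small feed-forward classifier, which keeps the per-inference cost low compared with convolutional or recurrent models operating on raw sensor streams.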

Original language: English
Article number: 5854
Number of pages: 22
Issue number: 13
Early online date: 24 Jun 2023
Publication status: Published - 31 Jul 2023

Bibliographical note

Funding: This research received no external funding

Data Availability Statement: The study used an open-source dataset called ENABL3S, which can be accessed at https://figshare.com/articles/dataset/Benchmark_datasets_for_bilateral_lower_limb_individuals/5362627 (accessed on 28 March 2023).

Keywords

  • activity recognition
  • deep learning
  • lower-limb motion recognition
  • wearable sensors

ASJC Scopus subject areas

  • Analytical Chemistry
  • Information Systems
  • Instrumentation
  • Atomic and Molecular Physics, and Optics
  • Electrical and Electronic Engineering
  • Biochemistry


