
Hand-Crafted Features With A Simple Deep Learning Architecture For Sensor-Based Human Activity Recognition

Albadawi, Yaman
Shanableh, Tamer
Date
2024-07-10
Type
Article
Peer-Reviewed
Postprint
Abstract
With the growth of the wearable device market, wearable sensor-based human activity recognition systems have been gaining increasing research interest because of rising demand in many areas. This research presents a novel sensor-based human activity recognition system that combines a unique feature extraction technique with a deep learning method for classification. One of the main contributions of this work is dividing the sensor sequences time-wise into non-overlapping 2D segments. Statistical features are then computed from each 2D segment using two approaches: the first computes features from the raw sensor readings, while the second applies time-series differencing to the sensor readings prior to feature calculation. Applying time-series differencing to the 2D segments helps identify the underlying structure and dynamics of the sensor readings across time. This work experiments with different numbers of 2D segments per sensor reading sequence and reports results with and without different components of the proposed system. Additionally, it analyses the complexity of the best-performing models, comparing them with models trained by integrating the proposed method with an existing transformer network. All of these arrangements are tested with different deep-learning architectures supported by an attention layer to enhance the model. Several experiments are performed on four benchmark datasets, namely mHealth, USC-HAD, UCI-HAR, and DSA. The experimental results reveal that the proposed system outperforms the human activity recognition rates reported in the most recent studies. Specifically, this work reports recognition rates of 99.17%, 81.07%, 99.44%, and 94.03% for the four datasets, respectively.
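
To illustrate the segmentation and feature-extraction idea described in the abstract, the following is a minimal Python sketch. It is not the authors' implementation: the segment count, the particular statistics (mean, standard deviation, minimum, maximum), and the function name `extract_segment_features` are assumptions chosen for illustration; only the overall scheme of non-overlapping time-wise 2D segments with features computed from both raw and time-series-differenced readings follows the abstract.

```python
import numpy as np

def extract_segment_features(sequence, num_segments=4):
    """Split a (timesteps, channels) sensor sequence into non-overlapping
    time-wise 2D segments and compute statistical features from both the
    raw and the first-order differenced readings of each segment.

    The segment count and the chosen statistics are illustrative
    assumptions, not the paper's exact configuration.
    """
    timesteps, _ = sequence.shape
    seg_len = timesteps // num_segments
    features = []
    for i in range(num_segments):
        segment = sequence[i * seg_len:(i + 1) * seg_len]  # raw 2D segment
        diffed = np.diff(segment, axis=0)                  # time-series differencing
        for block in (segment, diffed):
            # per-channel statistics, concatenated into one feature vector
            features.extend([block.mean(axis=0), block.std(axis=0),
                             block.min(axis=0), block.max(axis=0)])
    return np.concatenate(features)

# Example: a 128-step window of 6-channel accelerometer/gyroscope readings
window = np.random.randn(128, 6)
print(extract_segment_features(window).shape)  # 4 segments x 2 blocks x 4 stats x 6 channels = (192,)
```

The resulting feature vectors would then be fed to a classifier; in the paper this is a deep learning architecture supported by an attention layer, which is not reproduced in this sketch.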