Autonomous Human Activity Classification From Wearable Multi-Modal Sensors
There has been a significant amount of research on human activity classification relying either on Inertial Measurement Unit (IMU) data or on data from static cameras providing a third-person view. Relatively little work has used wearable cameras, which provide a first-person or egocentric view, and even fewer approaches combine egocentric video with IMU data. Using IMU data alone limits the variety and complexity of the activities that can be detected. For instance, a sitting activity can be detected from IMU data, but it cannot be determined whether the subject has sat on a chair or a sofa, or where the subject is. To perform fine-grained activity classification, and to distinguish between activities that cannot be differentiated by IMU data alone, we present an autonomous and robust method using data from both wearable cameras and IMUs. In contrast to convolutional neural network-based approaches, we propose to employ capsule networks to obtain features from egocentric video data. Moreover, a Convolutional Long Short-Term Memory (ConvLSTM) framework is employed on both the egocentric videos and the IMU data to capture the temporal aspect of actions. We also propose a genetic algorithm-based approach to set various network parameters autonomously and systematically, rather than by manual tuning. Experiments have been conducted on 9- and 26-label activity classification, and the proposed method, using autonomously set network parameters, provides very promising results, achieving overall accuracies of 86.6% and 77.2%, respectively. The proposed approach, combining both modalities, also yields higher accuracy than using egocentric video data alone or IMU data alone.
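To illustrate the two-stream fusion idea described in the abstract, the following is a minimal PyTorch sketch of a video branch (per-frame CNN features unrolled through a ConvLSTM cell) fused with an IMU LSTM branch. It is a sketch under stated assumptions, not the paper's implementation: the capsule-network feature extractor is replaced by a plain CNN stem for brevity, and all layer sizes, the clip length, and the 6-channel IMU input are invented for the example.

```python
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    """One ConvLSTM step: the four LSTM gates are computed by a single 2-D conv."""
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.hid_ch = hid_ch
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = self.gates(torch.cat([x, h], dim=1)).chunk(4, dim=1)
        c = f.sigmoid() * c + i.sigmoid() * g.tanh()
        h = o.sigmoid() * c.tanh()
        return h, c

class TwoStreamActivityNet(nn.Module):
    """Egocentric-video stream (CNN stem + ConvLSTM) fused with an IMU LSTM stream."""
    def __init__(self, num_classes=9, imu_dim=6):
        super().__init__()
        self.stem = nn.Sequential(                    # per-frame spatial features
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU())
        self.convlstm = ConvLSTMCell(64, 64)
        self.imu_lstm = nn.LSTM(imu_dim, 64, batch_first=True)
        self.head = nn.Linear(64 + 64, num_classes)

    def forward(self, video, imu):
        # video: (B, T, 3, H, W) egocentric frames; imu: (B, T, imu_dim) readings
        B, T = video.shape[:2]
        feats = self.stem(video.flatten(0, 1))
        feats = feats.view(B, T, *feats.shape[1:])
        h = feats.new_zeros(B, self.convlstm.hid_ch, *feats.shape[-2:])
        c = torch.zeros_like(h)
        for t in range(T):                            # unroll ConvLSTM over the clip
            h, c = self.convlstm(feats[:, t], (h, c))
        vid_vec = h.mean(dim=(2, 3))                  # global average pool
        _, (h_imu, _) = self.imu_lstm(imu)
        return self.head(torch.cat([vid_vec, h_imu[-1]], dim=1))

# Smoke test on random data shaped like a short clip plus a matching IMU window.
net = TwoStreamActivityNet(num_classes=9)
out = net(torch.randn(2, 8, 3, 64, 64), torch.randn(2, 8, 6))
print(out.shape)  # torch.Size([2, 9])
```

The abstract also mentions a genetic algorithm for setting network parameters autonomously. Below is a generic sketch of such a search loop, assuming a toy search space, truncation selection, and a placeholder fitness function; in practice, fitness would be the validation accuracy obtained by training with the candidate parameters. None of the ranges or rates are taken from the paper.

```python
import random

# Hypothetical search space over a few network parameters.
SPACE = {"hidden": [32, 64, 128, 256], "lr": [1e-2, 1e-3, 1e-4], "kernel": [3, 5, 7]}

def random_genome():
    return {k: random.choice(v) for k, v in SPACE.items()}

def crossover(a, b):
    # Uniform crossover: each parameter inherited from either parent.
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(g, rate=0.2):
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in g.items()}

def fitness(g):
    # Placeholder: replace with validation accuracy after training with genome g.
    return g["hidden"] / 256 - abs(g["kernel"] - 5) * 0.1

def evolve(pop_size=8, generations=10):
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # keep the fitter half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())
```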
- Research Organization:
- Syracuse Univ., NY (United States)
- Sponsoring Organization:
- USDOE Advanced Research Projects Agency - Energy (ARPA-E)
- Grant/Contract Number:
- AR0000940
- OSTI ID:
- 1799120
- Journal Information:
- IEEE Sensors Journal, Vol. 19, Issue 23; ISSN 1530-437X
- Publisher:
- IEEE
- Country of Publication:
- United States
- Language:
- English
Cited By:
- Dance Motion Capture Based on Data Fusion Algorithm and Wearable Sensor Network (journal, June 2021)
- Activity Recognition for Ambient Assisted Living with Videos, Inertial Units and Ambient Sensors (journal, January 2021)
- Mission-Aware Spatio-Temporal Deep Learning Model for UAS Instantaneous Density Prediction (conference, July 2020)
Similar Records:
- Learning View-Invariant Features for Person Identification in Temporally Synchronized Videos Taken by Wearable Cameras
- An Effective Adversarial Attack on Person Re-Identification in Video Surveillance via Dispersion Reduction