OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Autonomous Human Activity Classification From Wearable Multi-Modal Sensors

Journal Article · IEEE Sensors Journal

There has been a significant amount of research on human activity classification relying either on Inertial Measurement Unit (IMU) data or on data from static cameras providing a third-person view. There has been relatively less work using wearable cameras, which provide a first-person or egocentric view, and even fewer approaches combining egocentric video with IMU data. Utilizing only IMU data limits the variety and complexity of the activities that can be detected. For instance, sitting can be detected from IMU data, but IMU data alone cannot determine whether the subject has sat on a chair or a sofa, or where the subject is. To perform fine-grained activity classification, and to distinguish between activities that cannot be differentiated by IMU data alone, we present an autonomous and robust method using data from both wearable cameras and IMUs. In contrast to convolutional neural network-based approaches, we propose to employ capsule networks to obtain features from egocentric video data. Moreover, a Convolutional Long Short-Term Memory (ConvLSTM) framework is employed on both the egocentric videos and the IMU data to capture the temporal aspect of actions. We also propose a genetic algorithm-based approach to autonomously and systematically set various network parameters, rather than relying on manual settings. Experiments have been conducted to perform 9- and 26-label activity classification, and the proposed method, using autonomously set network parameters, has provided very promising results, achieving overall accuracies of 86.6% and 77.2%, respectively. The proposed approach, combining both modalities, also provides increased accuracy compared to using either egocentric video data or IMU data alone.
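The abstract describes two mechanisms that a short sketch can make concrete: a two-stream network fusing egocentric video with IMU sequences, and a genetic algorithm that sets network parameters. The following is a minimal, hypothetical PyTorch sketch of the fusion idea only; the plain convolutional stem (standing in for the paper's capsule-network feature extractor), the layer sizes, and all class and parameter names are assumptions, since the abstract does not specify them.

import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    # Minimal convolutional LSTM cell: all four gates from a single conv.
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.conv = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = self.conv(torch.cat([x, h], dim=1)).chunk(4, dim=1)
        c = f.sigmoid() * c + i.sigmoid() * g.tanh()
        h = o.sigmoid() * c.tanh()
        return h, c

class TwoStreamActivityNet(nn.Module):
    # Egocentric-video stream (conv features -> ConvLSTM) fused with an
    # IMU stream (LSTM), followed by a joint classifier head.
    def __init__(self, n_classes=9, imu_dim=6):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.MaxPool2d(2))
        self.vid_rnn = ConvLSTMCell(16, 32)
        self.imu_rnn = nn.LSTM(imu_dim, 64, batch_first=True)
        self.head = nn.Linear(32 + 64, n_classes)

    def forward(self, frames, imu):
        # frames: (B, T, 3, H, W) video clip; imu: (B, T_imu, imu_dim)
        feats = [self.stem(frames[:, t]) for t in range(frames.shape[1])]
        h = feats[0].new_zeros(feats[0].shape[0], 32, *feats[0].shape[2:])
        c = torch.zeros_like(h)
        for x in feats:
            h, c = self.vid_rnn(x, (h, c))
        vid_feat = h.mean(dim=(2, 3))         # global average pooling
        _, (h_imu, _) = self.imu_rnn(imu)     # final IMU hidden state
        return self.head(torch.cat([vid_feat, h_imu[-1]], dim=1))

# Example: 9-label classification of an 8-frame clip with 6-axis IMU data.
model = TwoStreamActivityNet(n_classes=9)
logits = model(torch.randn(2, 8, 3, 64, 64), torch.randn(2, 100, 6))

The genetic-algorithm step can be sketched in the same spirit; the parameter encoding, value ranges, truncation selection, and mutation rate below are illustrative assumptions rather than the paper's exact procedure, and in practice the fitness function would return the validation accuracy of a model trained with the candidate parameters.

import random

# Hypothetical search space; the abstract does not enumerate the actual
# network parameters that are tuned.
SEARCH_SPACE = {"hidden": [32, 64, 128],
                "kernel": [3, 5, 7],
                "lr": [1e-2, 1e-3, 1e-4]}

def random_genome():
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def crossover(a, b):
    # Uniform crossover: each gene is inherited from either parent.
    return {k: random.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(g, rate=0.1):
    # With small probability, resample a gene from its allowed values.
    return {k: random.choice(SEARCH_SPACE[k]) if random.random() < rate else v
            for k, v in g.items()}

def evolve(fitness, pop_size=12, generations=10):
    # fitness(genome) -> score; expensive in practice, so usually memoized.
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)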

Research Organization:
Syracuse Univ., NY (United States)
Sponsoring Organization:
USDOE Advanced Research Projects Agency - Energy (ARPA-E)
Grant/Contract Number:
AR0000940
OSTI ID:
1799120
Journal Information:
IEEE Sensors Journal, Vol. 19, Issue 23; ISSN 1530-437X
Publisher:
IEEE
Country of Publication:
United States
Language:
English
