
General Orthopaedics

CLASSIFICATION OF STATIC AND DYNAMIC ACTIVITIES FROM A LEG-INSTRUMENTED WEARABLE SENSOR SYSTEM

International Society for Technology in Arthroplasty (ISTA) meeting, 32nd Annual Congress, Toronto, Canada, October 2019. Part 1 of 2.



Abstract

Objective

The emergence of low-cost wearable systems has permitted extended data collection for unsupervised subject monitoring. Recognizing the individual activities performed during these sessions gives context to the recorded data and is an important first step towards automated motion analysis. Convolutional neural networks (CNNs) have been used with great success to detect pixel patterns in images for object detection and recognition across many applications. This work proposes a novel image encoding scheme that creates images from time-series activity data and uses CNNs to accurately classify 13 daily activities performed by instrumented subjects.
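The encoding scheme itself is not described in this abstract. As a hedged illustration only, the sketch below shows one generic way to render a window of multichannel inertial data as a grayscale image; the function name, window shape, and per-channel min-max scaling are assumptions, not the authors' method.

import numpy as np

def encode_window_as_image(window, out_size=64):
    # Illustrative stand-in for the paper's (undescribed) novel encoding:
    # each sensor channel of a fixed-length window becomes a band of rows
    # in a grayscale image after per-channel min-max scaling to [0, 255].
    # window: ndarray of shape (n_channels, n_samples).
    lo = window.min(axis=1, keepdims=True)
    hi = window.max(axis=1, keepdims=True)
    scaled = (window - lo) / np.maximum(hi - lo, 1e-8)  # per-channel [0, 1]
    img = (scaled * 255.0).astype(np.uint8)
    reps = max(1, out_size // img.shape[0])             # tile channels into row bands
    return np.repeat(img, reps, axis=0)[:out_size, :out_size]

# Example: 12 channels (4 sensors x 3 orientation axes), 2 s at 100 Hz
image = encode_window_as_image(np.random.randn(12, 200))
print(image.shape)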

Methods

Twenty healthy subjects were instrumented with a previously developed wearable sensor system consisting of four inertial sensors mounted above and below each knee. Each subject performed eight static and five dynamic activities: standing, sitting in a chair/cross-legged, kneeling on the left/right/both knees, squatting, lying, walking/running, biking, and ascending/descending stairs. Data from each sensor were synchronized, windowed, and encoded as images using a novel encoding scheme. Two CNNs were designed and trained to classify the encoded images of static and dynamic activities separately. Network performance was evaluated over twenty iterations of a leave-one-subject-out validation process, in which a single subject was withheld as test data in each iteration to estimate performance on unseen subjects.
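The architectures of the two CNNs are not given in the abstract, but the evaluation protocol is. The sketch below reproduces the leave-one-subject-out loop on synthetic placeholder data, with a simple scikit-learn classifier standing in for the CNN; all shapes, counts, and the stand-in model are assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
n_subjects, windows_per_subject, n_features = 20, 25, 32

# Placeholder data: one flattened encoded image per window, an activity
# label per window (8 static classes), and the subject ID as the group.
X = rng.normal(size=(n_subjects * windows_per_subject, n_features))
y = rng.integers(0, 8, size=len(X))
groups = np.repeat(np.arange(n_subjects), windows_per_subject)

# Each of the 20 folds withholds one subject entirely, so test accuracy
# estimates performance on a subject never seen during training.
accuracies = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf = LogisticRegression(max_iter=500).fit(X[train_idx], y[train_idx])
    accuracies.append(clf.score(X[test_idx], y[test_idx]))

print(f"{np.mean(accuracies):.1%} +/- {np.std(accuracies):.1%}")

The mean and spread of per-fold accuracy printed here presumably correspond to how the figures in the Results below (e.g. 98.0% ± 2.9%) were aggregated across the twenty folds.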

Results

Using 19 subjects for training and a single withheld subject for testing per iteration, the average accuracy when classifying the eight static activities was 98.0% ± 2.9%. Accuracy dropped to 89.3% ± 10.6% when classifying the five dynamic activities with a separate model under the same evaluation process. Ascending/descending stairs, walking/running, and sitting in a chair/squatting were the most commonly misclassified activities.

Conclusions

Previous work on activity recognition using raw accelerometer and/or gyroscope signals fails to provide sufficient information to distinguish static activities. The proposed method, operating on lower-limb orientations, classified the eight static activities with exceptional accuracy when tested on unseen subject data. High accuracy was also observed when classifying dynamic activities, despite the similarity of the activities performed and the expected variance in individuals’ gait. Accuracy reported in the existing literature for classifying comparable activities with other wearable sensor systems ranges from 27.84% to 84.52% under a similar leave-one-subject-out validation strategy [1]. Incorporating these trained models into the previously developed wearable system is expected to permit activity classification on unscripted instrumented activity data for more contextual motion analysis.