Objective
The emergence of low-cost wearable systems has enabled extended data collection for unsupervised subject monitoring. Recognizing the individual activities performed during these sessions gives context to the recorded data and is an important first step toward automated motion analysis. Convolutional neural networks (CNNs) have been used with great success to detect patterns of pixels in images for object detection and recognition in many different applications. This work proposes a novel image encoding scheme that creates images from time-series activity data and uses CNNs to accurately classify 13 daily activities performed by instrumented subjects.
Methods
Twenty healthy subjects were instrumented with a previously developed wearable sensor system consisting of four inertial sensors mounted above and below each knee. Each subject performed eight static and five dynamic activities: standing, sitting in a chair/cross-legged, kneeling on the left/right/both knees, squatting, lying, walking/running, biking, and ascending/descending stairs. Data from each sensor were synchronized, windowed, and encoded as images using the novel encoding scheme. Two CNNs were designed and trained to classify the encoded images of the static and dynamic activities separately. Network performance was evaluated using twenty iterations of a leave-one-out validation process in which a single subject was withheld as test data to estimate performance on future, unseen subjects.
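As a concrete illustration of the windowing step described above, the following Python sketch splits synchronized multi-channel inertial data into fixed-length windows and maps each window to a grayscale image. The abstract does not specify the novel encoding scheme, the window length or overlap, or the channel count, so `window_signal`, `encode_window_as_image`, and all shape parameters here are hypothetical placeholders rather than the authors' method.

```python
# Hypothetical sketch: windowing synchronized inertial data and encoding each
# window as a grayscale image. The paper's actual "novel encoding scheme" is
# not described in the abstract; this normalize-and-stack step is a placeholder.
import numpy as np

def window_signal(data: np.ndarray, window_len: int, step: int) -> np.ndarray:
    """Split a (samples, channels) array into overlapping windows.

    Returns an array of shape (n_windows, window_len, channels).
    """
    n_windows = (len(data) - window_len) // step + 1
    return np.stack([data[i * step : i * step + window_len] for i in range(n_windows)])

def encode_window_as_image(window: np.ndarray) -> np.ndarray:
    """Placeholder encoding: scale each channel to [0, 255] and stack channels
    as image rows, yielding a (channels, window_len) uint8 image."""
    lo = window.min(axis=0, keepdims=True)
    hi = window.max(axis=0, keepdims=True)
    scaled = (window - lo) / np.maximum(hi - lo, 1e-8)
    return (scaled.T * 255).astype(np.uint8)

# Assumed example: 4 inertial sensors x 6 channels = 24 channels, 2 s windows
# at 100 Hz with 50% overlap. `raw` stands in for synchronized sensor data.
raw = np.random.randn(3000, 24)
windows = window_signal(raw, window_len=200, step=100)
images = np.stack([encode_window_as_image(w) for w in windows])
print(images.shape)  # (n_windows, 24, 200)
```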
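The abstract likewise does not describe the two CNN architectures. The sketch below is a generic small image classifier (assumed layer counts, filter sizes, and the Keras API), shown only to illustrate the overall setup of separate static and dynamic networks with eight and five output classes, respectively.

```python
# Generic CNN sketch for classifying the encoded images; the paper's actual
# architectures are not given in the abstract, so every layer choice here
# (filters, kernel sizes, dense width) is an illustrative assumption.
import tensorflow as tf

def build_classifier(input_shape=(24, 200, 1), n_classes=8):
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])

static_net = build_classifier(n_classes=8)   # eight static activities
dynamic_net = build_classifier(n_classes=5)  # five dynamic activities
```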
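Finally, a minimal sketch of the leave-one-subject-out protocol described above: each of the twenty subjects is withheld once as the test set while a model is trained on the remaining nineteen. `train_cnn` and `evaluate` are hypothetical stand-ins for whatever training and scoring routines are used.

```python
# Hypothetical leave-one-subject-out evaluation loop: one train/test iteration
# per subject, estimating performance on unseen subjects.
import numpy as np

def leave_one_subject_out(images, labels, subject_ids, train_cnn, evaluate):
    """Return a dict mapping each subject ID to test accuracy when that
    subject's data are withheld from training."""
    accuracies = {}
    for subject in np.unique(subject_ids):
        test_mask = subject_ids == subject
        model = train_cnn(images[~test_mask], labels[~test_mask])
        accuracies[subject] = evaluate(model, images[test_mask], labels[test_mask])
    return accuracies
```

With twenty subjects, this loop yields the twenty iterations reported in the abstract, and averaging the per-subject accuracies gives the overall estimate of generalization to new subjects.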