Towards Robust Tracking with an Unreliable Motion Sensor Using Machine Learning

Abstract

This paper presents solutions that improve reliability and work around the challenges of using a Leap Motion™ sensor as a gestural control and input device in digital musical instrument (DMI) design. We implement supervised learning algorithms (k-nearest neighbors, support vector machine, binary decision tree, and artificial neural network) to estimate hand motion data that the sensor does not typically capture. Two problems are addressed: 1) the sensor cannot detect overlapping hands, and 2) the sensor has a limited detection range. Training examples included seven kinds of overlapping hand gestures as well as hand trajectories in which a hand leaves the sensor's range. The overlapping gestures were treated as a classification problem; the best-performing model was k-nearest neighbors, with 62% accuracy. The out-of-range problem was treated first as a clustering problem, grouping the training examples into a small number of trajectory types, and then as a classification problem, predicting the trajectory type from the hand's motion before it goes out of range. The best-performing model was again k-nearest neighbors, with 30% accuracy. The prediction models were integrated into an ongoing multimedia electroacoustic vocal performance and an educational project named Embodied Sonic Meditation (ESM).
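To make the two learning setups described above concrete, the following is a minimal sketch (not the authors' code) using scikit-learn with synthetic placeholder data. The feature dimensions, the number of trajectory clusters, and the choice of k are assumptions made purely for illustration; in the paper the features would come from Leap Motion frames.

```python
# Minimal sketch of the two pipelines: (1) k-NN classification of overlapping
# hand gestures, (2) clustering of out-of-range trajectories into types,
# followed by k-NN prediction of the type from pre-exit motion.
# All data here is synthetic; dimensions and parameters are assumptions.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# --- Problem 1: classify overlapping-hand gestures (7 classes) ------------
# Assume each example is a fixed-length feature vector derived from the last
# valid frames before the hands overlap (e.g. palm positions and velocities).
X_gesture = rng.normal(size=(700, 30))          # 700 examples, 30 features
y_gesture = rng.integers(0, 7, size=700)        # 7 gesture classes

X_tr, X_te, y_tr, y_te = train_test_split(
    X_gesture, y_gesture, test_size=0.2, random_state=0)
gesture_clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
print("gesture classification accuracy:", gesture_clf.score(X_te, y_te))

# --- Problem 2: predict out-of-range hand trajectories --------------------
# Step 1: cluster recorded trajectories into a small number of types.
# Each trajectory is flattened into a fixed-length vector (assumption).
trajectories = rng.normal(size=(300, 60))
traj_types = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    trajectories)

# Step 2: classify the trajectory type from the hand's motion just before it
# leaves the sensor's range (here, arbitrarily, the first 20 dimensions).
pre_exit = trajectories[:, :20]
traj_clf = KNeighborsClassifier(n_neighbors=5).fit(pre_exit, traj_types)
print("predicted trajectory type:", traj_clf.predict(pre_exit[:1]))
```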
