Revgest: Augmenting Gestural Musical Instruments with Revealed Virtual Objects

Abstract

Gestural interfaces, which make use of physiological signals, hand or body postures, or movements, have become widespread for musical expression. While they can increase the transparency and expressiveness of instruments, they can also limit the sense of agency, for musicians as well as for spectators. This problem is especially acute when the implemented mappings between gesture and music are subtle or complex. Compared to physical interfaces, these instruments may also restrict the possibilities for appropriating controls. Most existing solutions to these issues rely on distant and/or limited visual feedback (LEDs, small screens). Our approach is to augment the gestures themselves with revealed virtual objects. Our contributions are, first, a novel approach to visual feedback that allows for additional expressiveness; second, a software pipeline for pixel-level feedback and control that ensures tight coupling between sound and visuals; and third, a design space for extending gestural control using revealed interfaces. We also demonstrate and evaluate our approach by augmenting three existing gestural musical instruments.
