MetaGesture Music

MetaGesture Music draws upon three complementary research fields to create gestural, interactive musical instruments for musicians and non-musicians alike. It draws upon Auditory Culture studies to understand the cultural significance of music, and uses User Centered Design (UCD) methods to involve end users in scenario building and the creation of design mock-ups. These ideas are then implemented as functional interactive musical instruments, built and programmed using techniques from the field of New Interfaces for Musical Expression (NIME). The project deploys machine learning to decode gesture captured by sensor technologies such as physiological biosignals, consumer accelerometers on iPhones, motion capture systems, and GPS, and to map those gestures to features of novel digital sound synthesis. This creates forms of embodied musical interaction: engagement with music that is physical, situated, social, and participatory. This vision opens up the creative process and embraces all acts of engagement with music, from selecting, to listening, to dancing, to performing.
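The machine learning component can be pictured as a supervised mapping from sensor-derived gesture features to sound synthesis parameters. The sketch below is a minimal illustration of that idea, assuming a scikit-learn regressor; the accelerometer features, window sizes, and parameter names are hypothetical stand-ins, not the project's actual pipeline.

```python
# Minimal sketch: learn a mapping from accelerometer gesture features to
# synthesis parameters by regression. Feature and parameter names are
# hypothetical illustrations of the gesture-to-sound mapping idea.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def gesture_features(accel_window):
    """Summarize a window of (N, 3) accelerometer samples as a feature vector."""
    return np.concatenate([
        accel_window.mean(axis=0),                        # mean acceleration per axis
        accel_window.std(axis=0),                         # variability per axis
        [np.abs(np.diff(accel_window, axis=0)).mean()],   # overall "jerkiness"
    ])

# Stand-in training data: gesture windows paired with the synthesis
# settings a user demonstrated for each gesture (normalized 0..1).
windows = rng.normal(size=(200, 64, 3))        # 200 recorded gesture windows
X = np.array([gesture_features(w) for w in windows])
y = rng.uniform(0.0, 1.0, size=(200, 3))       # e.g. pitch, cutoff, grain density

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X, y)

# At performance time, each incoming sensor window is mapped to
# synthesis parameters in real time.
live_window = rng.normal(size=(64, 3))
params = model.predict(gesture_features(live_window).reshape(1, -1))[0]
print({"pitch": params[0], "cutoff": params[1], "grain_density": params[2]})
```

In practice such a mapping would be trained interactively, with the performer demonstrating gestures alongside desired sounds, in the spirit of interactive machine learning tools used in NIME research.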