Towards Gestural Sonic Affordance – Paper presented at NIME 2013


In May 2013, part of the EAVI research group (myself included) travelled to NIME 2013 in Daejeon/Seoul to present our latest research and performances.

This year, I presented a research paper in the poster session, written with Atau Tanaka and Baptiste Caramiaux, on our current investigation of sonic affordances.

Following up on our research into sound-related affordances as a design principle for interactive musical interfaces and sonic interaction design, whose initial approach was presented at SMC 2012, I carried out a second user study early this year.

In this second user study, based on an experiment and follow-up interviews, we were interested in exploring topics such as gesture-sound relationships (Godoy, Leman and Caramiaux), the physical affordances of the control device, and the cultural associations between sound, control device, and everyday, non-specialist users.

We designed three sound stimuli based on the Schaeffer/Chion sound categories, using digital synthesis (PhiSem, STK, AM).
These sounds were then paired with three different sound-gesture mappings, as described in the image below.


The experiment was task-oriented: 7 participants (non-musicians) were asked to play digital sound by moving their limbs. To realise this active control of digital sound through movement, a 3D wireless micro-accelerometer (the Axivity WAX3) was attached to each participant's hand with a velcro strap. The accelerometer was mapped to a sound-synthesis engine running on a laptop, hidden from the participants and connected to loudspeakers in a dual-mono configuration.
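To give a rough idea of this kind of accelerometer-to-synthesis pipeline, here is a minimal sketch in Python. It is an illustration, not the experiment's actual code: the function names, scaling values, and the choice of driving a single normalised parameter are all assumptions.

```python
import math

def magnitude(x, y, z):
    """Overall movement energy from one 3-axis accelerometer frame (in g)."""
    return math.sqrt(x * x + y * y + z * z)

def map_to_param(mag, rest=1.0, ceiling=3.0):
    """Map acceleration magnitude to a normalised synthesis parameter
    in [0, 1]: a still hand reads ~1 g (gravity) -> 0.0, vigorous
    movement -> 1.0. The rest/ceiling values are illustrative guesses."""
    norm = (mag - rest) / (ceiling - rest)
    return max(0.0, min(1.0, norm))

# A hand at rest reads roughly (0, 0, 1) in g:
print(map_to_param(magnitude(0.0, 0.0, 1.0)))  # -> 0.0
```

A parameter like this could, for instance, drive the density of an iterative sound model or the modulation depth of a continuous one; the actual mappings used in the study are described in the paper.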

The participants were asked to explore the three sound-gesture mappings, randomly assigned every 90 seconds. No information was given to the participants about the technological system, the mapping, or the expectations and goals of the experiment.
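The random rotation of conditions can be sketched as follows; this is a hypothetical illustration of the protocol, with names and block count chosen for the example, not the code used in the study.

```python
import random

# The three gesture-sound mappings, labelled by their sound category.
CONDITIONS = ["impulsive", "iterative", "continuous"]
BLOCK_SECONDS = 90  # each mapping stays active for 1.5 minutes

def make_schedule(n_blocks, seed=None):
    """Build a presentation order: each 90-second block picks one of
    the three mappings at random."""
    rng = random.Random(seed)
    return [rng.choice(CONDITIONS) for _ in range(n_blocks)]

schedule = make_schedule(6, seed=1)
```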

The follow-up interviews revealed that participants' descriptions of the gestures they produced were influenced by their identification of plausible sound sources. These actions were often related to everyday actions and objects, which was particularly evident during the impulsive and iterative tasks.

For the continuous gesture-sound mapping, the sounds and the actions produced were described more abstractly. However, the complex but perceivable modulation parameters in the mapping encouraged further kinetic exploration, where changes in the sound were perceived as clearly related to qualities of the movement.

In this sense, I can observe that the visibility of the mapping, together with the more abstract quality of the sound stimuli, aided kinetic exploration and a process of articulating the experience that went beyond identifying everyday sounds and related actions. Participants also found the gesture-sound mapping intuitive, often describing the experience as "natural" or "easy".

This research can be seen as an initial step towards questioning the role of sound-related affordances, cultural constraints and the physical affordances of the device in Sonic Interaction Design.

More information, further results and observations can be found in the paper.