The CHI Conference is the premier international conference on Human-Computer Interaction (HCI); CHI 2015 will be held in Seoul, South Korea, from April 18 to April 23, 2015. We will be attending the conference to present the group's latest research in the field. This year we have one full paper, one note, one TOCHI paper, and one student research competition entry; details below:
Form Follows Sound: Designing Interactions from Sonic Memories (Full Paper)
Baptiste Caramiaux, Alessandro Altavilla, Scott Pobiner, and Atau Tanaka
- Abstract: Sonic interaction is the continuous relationship between user actions and sound, mediated by some technology. Because interaction with sound may be task-oriented or experience-based, it is important to understand the nature of action-sound relationships in order to design rich sonic interactions. We propose a participatory approach to sonic interaction design that first considers the affordances of sounds in order to imagine embodied interaction, and, based on this, generates interaction models for interaction designers wishing to work with sound. We describe a series of workshops, called Form Follows Sound, where participants ideate imagined sonic interactions and then realise working interactive sound prototypes. We introduce the Sonic Incident technique as a way to recall memorable sound experiences. We identified three interaction models for sonic interaction design: conducting, manipulating, and substituting. These three interaction models offer interaction designers and developers a framework on which they can build richer sonic interactions.
Using Interactive Machine Learning to Support Interface Development through Workshops with Disabled People (Note)
Simon Katan, Mick Grierson, and Rebecca Fiebrink
- Abstract: We have applied interactive machine learning (IML) to the creation and customisation of gesturally controlled musical interfaces in six workshops with people with learning and physical disabilities. Our observations and discussions with participants demonstrate the utility of IML as a tool for participatory design of accessible interfaces. This work has also led to a better understanding of challenges in end-user training of learning models, of how people develop personalised interaction strategies with different types of pre-trained interfaces, and of how properties of control spaces and input devices influence people’s customisation strategies and engagement with instruments. This work has also uncovered similarities between the musical goals and practices of disabled people and those of expert musicians.
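The core of the interactive machine learning loop described above is that end users (rather than programmers) supply example input–output pairs, train a model, try it, and refine it by adding more examples. The actual workshop systems are far more sophisticated, but as a loose, hypothetical illustration of that loop, a minimal mapper from sensor readings to sound parameters might look like this (all class and variable names here are invented for the sketch, not taken from the paper):

```python
import math

class IMLMapper:
    """Toy interactive-machine-learning mapper: maps a sensor input
    vector to a sound-parameter vector via 1-nearest-neighbour lookup
    over user-provided demonstrations."""

    def __init__(self):
        self.examples = []  # list of (input_vector, output_vector) pairs

    def add_example(self, inputs, outputs):
        """A user demonstrates a gesture and pairs it with desired sound settings."""
        self.examples.append((list(inputs), list(outputs)))

    def predict(self, inputs):
        """Return the sound parameters of the closest demonstrated gesture."""
        if not self.examples:
            raise ValueError("no training examples yet")
        return min(self.examples, key=lambda ex: math.dist(inputs, ex[0]))[1]

mapper = IMLMapper()
# Two demonstrations: gesture A -> pitch 440 Hz, volume 0.5; gesture B -> 220 Hz, 1.0
mapper.add_example([0.1, 0.2], [440.0, 0.5])
mapper.add_example([0.9, 0.8], [220.0, 1.0])
print(mapper.predict([0.15, 0.25]))  # close to gesture A -> [440.0, 0.5]
```

If the mapping misbehaves, the user simply adds another example and tries again; that tight demonstrate–train–try cycle, rather than the choice of learning algorithm, is what makes the approach accessible for participatory design.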
Understanding Gesture Expressivity through Muscle Sensing (TOCHI paper presented in the Journal Session)
Baptiste Caramiaux, Marco Donnarumma, and Atau Tanaka
- Abstract: Expressivity is a visceral capacity of the human body. To understand what makes a gesture expressive, we need to consider not only its spatial placement and orientation, but also its dynamics and the mechanisms enacting them. We start by defining gesture and gesture expressivity, and then present fundamental aspects of muscle activity and ways to capture information through electromyography (EMG) and mechanomyography (MMG). We present pilot studies that inspect the ability of users to control spatial and temporal variations of 2D shapes and that use muscle sensing to assess expressive information in gesture execution beyond space and time. This leads us to the design of a study that explores the notion of gesture power in terms of control and sensing. Results give insights to interaction designers to go beyond simplistic gestural interaction, towards the design of interactions that draw upon nuances of expressive gesture.
Can Specialised Electronic Musical Instruments Aid Stroke Rehabilitation? (Student Research Competition)
- Abstract: Stroke patients often have limited access to rehabilitation after discharge from hospital, leaving them to self-regulate their recovery. Previous research has indicated that several musical approaches can be used effectively in stroke rehabilitation. Stroke patients (n = 43), between 6 months and 19 years post-stroke, took part in specially created music workshops, playing music both in groups and individually using a number of digital musical interfaces. Feedback forms were completed by all participants, which helped to develop the prototypes and gain insights into the potential benefits of music making for rehabilitation. 93% of participants stated that they thought the music workshops were potentially beneficial for their rehabilitation. The research project contributes to the field of HCI by exploring the role of computer-based systems in stroke rehabilitation.