This year, the EAVI group, along with our collaborators in other departments and universities such as Queen Mary, has had several publications, workshops and demos accepted for the ACM CHI 2016 conference in San Jose this May. CHI is the top conference for Human-Computer Interaction, so it's a real honour to be able to present our work there. We're also proud to announce that one of our papers – Haptic Wave: A Cross-Modal Interface for Visually Impaired Audio Producers – by Atau Tanaka and Adam Parkinson, received a Best Paper award. Overall, it's great to have our research recognised at such a high level.
A. Tanaka, A. Parkinson. Haptic Wave: A Cross-Modal Interface for Visually Impaired Audio Producers.
O. Metatla, N. Correia, N. Bryan-Kinns, T. Stockman, F. Martin. Tap the ShapeTones: Exploring the Effects of Crossmodal Congruence in an Audio-Visual Interface.
S. Wiseman. Use Your Words: Designing One-time Pairing Codes to Improve User Experience.
A. Sarasua, B. Caramiaux, A. Tanaka. Machine Learning of Expressive Gestures: Case Study of Articulations in Music Conducting.
P. Kirk, M. Grierson, L. Stewart. Stroke Rehabilitation of the Upper Extremity: A Feasibility Study Using Specialised Digital Musical Instruments (DMIs) in the Home Environment.
M. Gillies, R. Fiebrink, J. Garcia, et al. Human-Centred Machine Learning.
A. Parkinson, A. Tanaka. The Haptic Wave: A Device for Feeling Sound.