The Design Patterns for Inclusive Collaboration (DePIC) project aims to develop new ways for people to interact with each other using different senses, thereby reducing barriers caused by visual and other sensory impairments. DePIC is an EPSRC-funded project that brings together researchers from Goldsmiths, Queen Mary, and the University of Bath.
Our interaction with the world around us relies on perception, which exploits combinations of the senses available to us: for instance, when we both see and hear someone speaking, we associate the spoken words with the speaker. Enabling people to use combinations of senses becomes critical when people who have different senses available to them interact with each other. These differences can arise from temporary or permanent sensory impairment, or from the technology being used. However, very little research has examined how people combine and map information from one sense to another, particularly for individuals with sensory impairments, and then used such mappings to inform the design of technology that makes collaboration easier. The aim of this multi-disciplinary project is to develop new ways for people to interact with each other using different combinations of senses. This will reduce barriers to collaboration caused by sensory impairment and improve social and workplace inclusion by optimising the use of available senses.
Specifically, DePIC aims to combine empirical studies of mappings between senses with participatory design techniques to develop new ideas for inclusive design grounded in Cognitive Psychology. We will capture these design ideas and mappings in the form of Design Patterns and demonstrate their usefulness through the development of interactive systems to support assisted work, living, and leisure.
At Goldsmiths, our work has focused on the design and manufacture of a haptic prototype that enables audio engineers with visual impairments to interact more quickly and efficiently with computer-based audio recording and editing. This work has been driven by participatory design and has resulted in the HapticWave (pictured). The HapticWave uses a modified Arduino board and a motorised fader to let users feel the amplitude of an audio waveform.
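To illustrate the core idea, here is a minimal sketch of how a device like the HapticWave might map a waveform's amplitude to fader positions. This is a hypothetical illustration, not the actual HapticWave firmware: the function names, the fixed-window peak envelope, and the 0-255 fader range are all assumptions made for the example.

```python
def amplitude_envelope(samples, n_positions):
    """Split the waveform into n_positions windows and take the peak
    absolute amplitude of each, giving one value per fader position."""
    window = max(1, len(samples) // n_positions)
    return [
        max(abs(s) for s in samples[i * window:(i + 1) * window])
        for i in range(n_positions)
    ]

def to_fader_position(amplitude, max_amplitude=1.0, fader_range=255):
    """Scale an amplitude in [0, max_amplitude] to an integer fader
    position (0-255 here, as for an 8-bit PWM-driven motorised fader)."""
    amplitude = min(amplitude, max_amplitude)
    return round(amplitude / max_amplitude * fader_range)

# Example: a short fake waveform with a loud middle section.
wave = [0.1, -0.2, 0.9, -0.8, 0.3, -0.1]
env = amplitude_envelope(wave, 3)   # one peak value per window
positions = [to_fader_position(a) for a in env]
```

As the user's hand moves along the timeline, the controller would drive the motorised fader to the position for the current window, so louder passages are felt as a physically higher fader.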