Adam went to this year's International Computer Music Conference in Perth, Australia, to present a paper called “Embodied Listening, Affordances and Performing with Computers” (email a.d.parkinson at gmail if you would like a copy, proceedings aren’t online yet).
Here’s the abstract:
I consider here some contemporary approaches to the live performance of computer music. Drawing upon ideas of the embodied mind and the extended mind, I will outline a theory of embodied listening which problematises some of our assumptions about music, gesture and performance. Against the background of this theory, I will argue against approaches which put too great a focus upon gestural legibility when designing new computer-based instruments.
I will consider a diverse array of practices in contemporary instrument design which do not necessarily adopt principles of gestural legibility, and which challenge some of our ideas about what instruments and musical performance should be. I draw upon the notion of affordances, and argue in favour of an approach to computer-based instrument design which seeks to explore the unique affordances and singular possibilities we find in new technologies.
The paper went down well at ICMC, with people at least pretending to understand Adam’s rambling northern accent.
There were some fantastic papers and people at ICMC. Too many to list (and apologies for everything I’ve not mentioned), but of particular interest to Adam were the following works:
Tim Barrass presented two papers. One was a very elegant piece of work with his brother that involved 3D printing steel bells: doing a spectrogram analysis of a bell, then using this spectrogram as the template for another bell, which is analysed in turn, and so on, recursively.
Also presented by Tim was Mozzi – http://sensorium.github.io/Mozzi/ – a synthesis library for the Arduino which was very inspiring. My next instrument is going to be on Arduino!
Oliver Bown (University of Sydney, ex-Goldsmiths) did a great performance using Raspberry Pis that were distributed through a concert hall, each with its own small speaker. He live coded onto them over wifi, and produced a wonderful soundscape with really compelling spatialisation.
Tae Hong Park (NYU) presented a great paper on lo-fi haptics with mobile devices: using, for example, an elastic band strapped round the device to create a ‘haptic guitar’ (coupling it with Touch OSC), or using a strip of foam under the device and looking at accelerometer data to make a velocity sensitive keyboard with haptic feedback. Seemed like super creative work.
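To get a feel for the velocity-sensitive keyboard idea, here is a minimal Python sketch of one plausible mapping (the thresholds and the linear scaling are my own assumptions, not from Park's paper): take the peak accelerometer magnitude during a tap and map it to a MIDI-style velocity between 1 and 127.

```python
def accel_to_velocity(peak_g, lo=0.05, hi=2.0):
    """Map a tap's peak acceleration (in g, hypothetical thresholds)
    to a MIDI velocity in the range 1..127.

    Taps at or below `lo` give the softest note; taps at or above
    `hi` give the loudest. Values in between scale linearly.
    """
    clamped = max(lo, min(hi, peak_g))          # ignore noise and spikes
    frac = (clamped - lo) / (hi - lo)           # normalise to 0..1
    return 1 + round(frac * 126)                # scale to 1..127

# A gentle tap maps to a quiet note, a hard one to a loud note:
print(accel_to_velocity(0.05))  # → 1
print(accel_to_velocity(2.0))   # → 127
```

The foam strip under the device matters here: it lets the phone move slightly on impact, so the accelerometer actually registers a peak proportional to how hard the key was struck.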
It was great to hear ex-Newcastle colleague Will Schrimshaw quoted for his sound art theory, too. I am looking forward to reading his new texts.
Looking forward to staying in contact with all the great people I had some really stimulating discussions with there.