Sensorimotor skill and cultural values in digital musical instrument design

Aug 25, 17:00, Film & Drama Studio (FADS), QMUL

Abstract. Every year, many new musical instruments are created in research and industry, but most drop out of regular use after just a few years, while classic acoustic and electronic designs remain ubiquitous in many styles of music. This talk will examine two specifically human facets of new instrument design: sensorimotor skill and sociocultural values. In the first case, the talk will explore ways of designing new instruments that extend and repurpose existing expertise on familiar instruments, drawing on examples from my previous work. In the second case, the talk will examine the complex interaction of technological and aesthetic values amongst the designer, their tools, the performer, and the surrounding musical ecology. Rather than propose any definitive set of guidelines, the talk will conclude with open questions and reflections on how instrument creators can consider skill and culture throughout the technology design process.

Andrew McPherson is a computing researcher, composer, electronic engineer, and musical instrument designer and a Senior Research Fellow of the Royal Academy of Engineering. He is Professor of Musical Interaction in the Centre for Digital Music at Queen Mary University of London, where he leads the Augmented Instruments Laboratory. Andrew holds undergraduate degrees in both engineering and music from MIT, an MEng in electrical engineering from MIT, and a PhD in music composition from the University of Pennsylvania. Andrew’s musical instruments are widely used by performers and composers across many genres, and his research has led to two spinouts: Augmented Instruments Ltd, which develops Bela, an open-source audio maker platform, and TouchKeys, a transformation of the keyboard into a versatile multi-touch control surface. He is deeply committed to teaching: Bela is used in the classroom by over 20 universities, and his online course on audio programming has been followed by learners around the globe.

Co-designing audio haptic interactions

Aug 26, 10:50, Peston Lecture Theatre, QMUL

Abstract. The importance of involving the persons intended to use a design in the design process leading up to the final product or service is increasingly acknowledged, and has always been a part of our design processes at Certec, Department of Design Sciences, Lund University. This talk is intended to provide both inspiration and practical suggestions for anyone interested in designing audio haptic interactions for and with persons with varying abilities. The talk focuses on co-design, but many of the adaptations and materials presented can also be used in more traditional design activities, such as usability testing. Our work rests on an inclusive mindset. In other words, I will focus on how to expand and enhance existing methods regarding who is involved, and how to provide means of participation to wider target groups, rather than how to create “special” methods for “special” users with “special” needs. I will discuss activities, materials and methods, drawing on concrete practical examples.

Charlotte Magnusson is Associate Professor in Certec – Rehabilitation Engineering and Design at Lund University (Sweden). She has two particular areas of interest. The first concerns non-visual interaction design and how senses other than vision can be used in applications and services. The second is design and design methodology for persons with and without disabilities. Charlotte is also an experienced programmer, with particular expertise in interactive multimodal applications. She has led and participated in several Swedish and European research projects, and was chair of the 7th HAID in 2012. Charlotte is a member of the Swedish Braille authority and of ISO/TC 159/SC 4/WG 9 (Tactile Haptic Interaction).

Multi-sensory displays using acoustic waves, holography and levitation

Aug 26, 15:20, Peston Lecture Theatre, QMUL

Abstract. There is something magical about being able to interact with a 3D display that shows a hologram in front of us, creating a Princess Leia effect. It gets even more magical when we can not only see the 3D image but also feel, hear, taste, and smell it. My research is driven by this vision of delivering novel multi-sensory experiences to users without instrumenting them. For example, we manipulate sound to levitate tiny objects in mid-air and move them 10,000 times a second, so that the object disappears and a 3D shape emerges in its place. These sensations are created in mid-air, so users don’t have to touch or hold any device to experience them. We use principles from acoustic holography to shape our wavefront, delivering it through many tiny speakers that are precisely timed. Such walk-up-and-use devices are starting to find their way into theme parks, ticketing stations and many other everyday places. I will talk about my work on using acoustic levitation and holography to create immersive experiences.

Sriram Subramanian is a Royal Academy of Engineering Chair in Emerging Technologies at University College London (UCL, UK). Before joining UCL he held faculty positions at the University of Sussex (UK), the University of Bristol (UK) and the University of Saskatchewan (Canada). His research is driven by a vision to deliver novel multi-sensory experiences to users without instrumenting them with wearable or head-mounted displays. He has done this by successfully blending the creation of novel hardware with the democratisation of access through many different projects. Notably, in 2013 he co-founded Ultrahaptics (now Ultraleap) to commercialise his mid-air haptic technology.

* Vincent Hayward’s (Sorbonne University & Actronica) keynote talk unfortunately had to be cancelled.