To industrial designer Gustavo Ostos Rios, music improvisation is all about emotion. He and his two supervisors in the Department of Industrial Design at Eindhoven University of Technology in The Netherlands have now found a way to understand the complex interactions that take place between instrumentalists and singers during a jam, with the aim of using those insights to add greater emotional expression to performances involving digital instruments.
“In human-computer-interaction (HCI), we are more and more moving away from designing interactions for single users, towards designing interactions for networked groups of single users; we move from a ‘one user-one technology’ paradigm towards a ‘multiple users-multiple technologies’ paradigm,” Mathias Funk, one of the co-authors, explains in the International Journal of Arts and Technology. Examples of this shift include the social media and social networking sites, such as Twitter and Facebook, that many people use daily, as well as the likes of Wikipedia and other collaborative ventures, such as the citizen science projects GalaxyZoo and SETI@Home.
In the Department of Industrial Design, Funk and his colleague Bart Hengeveld research novel musical instruments that translate this idea to musical performance. In some of the creative arts, such as painting and sculpture, much emphasis is placed on the solo endeavour, and the audience is usually detached from the art, viewing and interacting with the "product" some time after the creative process has ended. Live music is different: there is usually more than one person involved in creating a performance, and the audience is present the whole time. As such, there is a shared emotional response that can, in the case of improvisational performance, take the music in new directions. More commonly, the changes in direction are driven by the musicians and how they interact with each other, but audience response can nudge them too.
The team has developed a three-layer model that illuminates the relationship between band members and audience as a system in which emotions, expressivity and the generation of sound give shape to improvisation. They have used this model to focus specifically on how individual emotional arousal can serve as input for the group to control their digital musical instrument, EMjam. The system, the team says, “builds on the construct that when paying attention at a concert it is possible to see performers’ expressions; a guitarist playing a solo and reaching a peak at a certain point of it; a bass guitar player following with his face the lines played; a drummer making accents with the whole body; and in addition to this, the audience responding to the performance.”
Each instrumentalist receives a wristband with skin conductance sensors that can, in a sense, measure the musician’s emotional state. The percussionist’s wristband controls the rhythm generated by EMjam, the bass guitarist’s controls harmony, and the guitarist’s or keyboard player’s controls melody. EMjam then uses the music software Ableton Live to add a parallel second layer of sound driven by each individual input. The team adds that the same approach might be used to add expression to a light show or other visuals accompanying the music.
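To make that signal chain concrete, here is a minimal sketch of one plausible way to map per-musician skin-conductance readings onto control messages that Ableton Live can learn through its MIDI-mapping mode. It is not the authors' implementation, which is not published in the article: the sensor ranges, control-change numbers, role names and the use of the mido library are all illustrative assumptions.

    # Sketch only: route hypothetical wristband readings (skin conductance)
    # to MIDI control-change messages for Ableton Live. Not the EMjam code.
    import mido

    # Assumed mapping of each musician's layer to an arbitrary MIDI CC number;
    # in Ableton Live these CCs would be MIDI-mapped to the parameters that
    # shape the parallel "second layer" of rhythm, harmony and melody.
    MUSICIAN_CC = {
        "percussion_rhythm": 20,
        "bass_harmony": 21,
        "lead_melody": 22,
    }

    def arousal_to_cc(conductance_us, lo=1.0, hi=20.0):
        """Scale a skin-conductance reading (microsiemens, assumed range)
        into the 0-127 range of a MIDI control-change value."""
        clamped = max(lo, min(hi, conductance_us))
        return int(round((clamped - lo) / (hi - lo) * 127))

    def send_arousal(port, readings):
        """Send one CC message per musician, using their latest reading."""
        for role, value in readings.items():
            msg = mido.Message("control_change",
                               control=MUSICIAN_CC[role],
                               value=arousal_to_cc(value))
            port.send(msg)

    if __name__ == "__main__":
        # Open the default MIDI output and push one set of dummy readings.
        with mido.open_output() as port:
            send_arousal(port, {"percussion_rhythm": 7.5,
                                "bass_harmony": 12.0,
                                "lead_melody": 4.2})

In practice such a loop would run continuously, smoothing the raw sensor stream before mapping it, but the core idea is simply that each musician's arousal level becomes one more controller in the group's shared instrument.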
Ostos Rios, G., Funk, M. and Hengeveld, B. (2016) ‘Designing for group music improvisation: a case for jamming with your emotions’, International Journal of Arts and Technology, Vol. 9, No. 4, pp. 320–345.