Combining Audio And Gestures For A Real-Time Improviser

Publication Type  Conference Proceedings
Year of Publication  2005
Authors  Morales-Manzanares, Roberto; Morales, Eduardo; Wessel, David
Conference Name  International Computer Music Conference
Pagination  813-816
Publisher  International Computer Music Association
Conference Location  Barcelona, Spain
Abstract  Skilled improvisers are able to shape a musical discourse in real time by continuously modulating pitch, rhythm, tempo, and loudness to communicate high-level information such as musical structure and emotion. Interaction between musicians depends on their cultural background, their subjective reaction to the generated material, and their ability to resolve, in their own terms, the aesthetics of the resulting pieces. In this paper we introduce GRI, an environment that incorporates music and movement gestures from an improviser to acquire precise data and react much as a fellow improviser would. GRI takes music samples from a particular improviser and learns classifiers to identify different improvisation styles. For each style it then learns a probabilistic transition automaton that considers gestures to predict the musician's most probable next state. The current musical note, the predicted next state, and gesture information are used to produce adequate responses in real time. The system is demonstrated with a flutist, using accelerometers and gyroscopes to detect gestures, with very promising results.
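The abstract's per-style probabilistic transition automaton can be illustrated with a minimal sketch. This is not the paper's implementation; the class, state encoding (pitches), gesture labels, and training data below are all hypothetical, showing only the general idea of counting (state, gesture) → next-state transitions and predicting the most probable successor.

```python
from collections import defaultdict

class TransitionAutomaton:
    """Illustrative sketch (not GRI itself): counts (state, gesture) -> next-state
    transitions and predicts the most probable next state."""

    def __init__(self):
        # counts[(state, gesture)][next_state] = number of observed transitions
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, sequence):
        # sequence: list of (state, gesture) pairs in temporal order
        for (state, gesture), (next_state, _) in zip(sequence, sequence[1:]):
            self.counts[(state, gesture)][next_state] += 1

    def predict(self, state, gesture):
        # Return the most frequently observed successor, or None if unseen
        candidates = self.counts.get((state, gesture))
        if not candidates:
            return None
        return max(candidates, key=candidates.get)

# Hypothetical training data: states are pitches, gestures are 'up'/'down'.
seq = [("C4", "up"), ("E4", "up"), ("G4", "down"), ("E4", "up"),
       ("G4", "down"), ("C4", "up"), ("E4", "up")]
automaton = TransitionAutomaton()
automaton.train(seq)
print(automaton.predict("C4", "up"))  # -> "E4"
```

In the paper's setting, one such automaton would be learned per improvisation style, with gesture features coming from the flutist's accelerometers and gyroscopes rather than the toy labels used here.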
URL  http://cnmat.berkeley.edu/publications/combining_audio_and_gestures_real_time_improviser