A paper of mine detailing the design of TC-11 has been accepted into the New Interfaces for Musical Expression conference (NIME 2012). The conference will be held in Ann Arbor, MI, and I’ll be going to present and demo the instrument. Once the proceedings are online, I will link to the paper here (along with other neat research from the conference).
From the conference site:
The NIME conference brings together researchers and practitioners from a range of academic fields including computer science, electrical engineering, human-computer interaction, musicology, electro-acoustic music, dance and composition, and has routinely attracted interest from the electronic music industry as well.
This year NIME will be held at the University of Michigan in Ann Arbor. Ann Arbor has a long tradition of electro-acoustic music, having been the location of pioneering early work by Gordon Mumma, Robert Ashley, David Tudor, Alvin Lucier, and David Behrman. It has just celebrated the 25th anniversary of the Performance Art Technology program and is home to numerous electronic music groups, such as Steve Rush’s Digital Music Ensemble and, more recently, Georg Essl’s Michigan Mobile Phone Ensemble. The University of Michigan hosted the International Computer Music Conference in 1998.
The folks at NIME have uploaded the full NIME++ 2010 proceedings. Here is the permanent link: http://www.educ.dab.uts.edu.au/nime/PROCEEDINGS/
The direct link to my paper is here: ‘Relationship-Based Instrument Mapping of Multi-Point Data Streams Using a Trackpad Interface.’
I’m always happy to hear comments and feedback. You can see (old) videos of the instruments described in the paper on my Vimeo page.
There were some notable presentations and projects I’d like to link to from here. Alvaro Cassinelli presented a tracking system that follows the curvature of any drawn or sensed object with a laser, as described in the paper ‘scoreLight: playing with a human-sized laser pick-up.’ Demo video here.
Ajay Kapur presented his work in organizing a cross-departmental approach to building a robotic orchestra (‘A Pedagogical Paradigm for Musical Robotics’). Music and theater students work together to apply their skill sets to the creation of an ensemble where humans and machines perform together. Very inspirational.
Kris Kitani developed a performance-augmenting analysis algorithm for live percussion (‘ImprovGenerator: Online Grammatical Induction for On-the-Fly Improvisation Accompaniment’). If this were incorporated into the above-mentioned robot orchestra, I can imagine some really exciting results.
And finally, Andrew McPherson has brought the piano to its next evolutionary phase with ‘Augmenting the Acoustic Piano with Electromagnetic String Actuation and Continuous Key Position Sensing’. This one is best understood by watching a video of his elegant system in action.
The conference was fantastic – lots of creative energy and some truly inspired projects and research. As for my presentation, it went quite well. I got some very nice feedback from other presenters and conference attendees about the multi-point instruments.
Once the proceedings are published online I will link to my paper here. There are plenty of clever works worth mentioning as well. In the meantime, check out the Flickr photostream for a sense of the conference atmosphere.
My new paper on multi-point mapping strategies, titled ‘Relationship-Based Instrument Mapping of Multi-Point Data Streams Using a Trackpad Interface,’ has been accepted into the New Interfaces for Musical Expression (NIME) 2010 conference in Sydney. The paper describes my analysis system for dealing with multi-point interfaces. I will be presenting at the conference, and will post the link to the paper and proceedings when it gets published.
I’ve finished two demos of a multitouch instrument I’ve been programming. You can see the videos on Vimeo:
The synth uses the MacBook Pro trackpad, but it can accept any multitouch interface that adheres to the TUIO specification. I programmed the instrument to react to the relationships between the points, instead of simply mapping each point’s coordinates directly to synthesizer parameters. This adds a level of interpretation to the raw multi-point stream, and the result is a much more dynamic instrument. MDrumSynth, for example, allows you to ‘mute’ the trackpad like a drumhead by holding down multiple fingers before tapping. I mapped that parameter to the long decay sound you can hear in the video.
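To make the relationship idea concrete, here is a minimal Python sketch under some assumptions: touch points arrive as normalized TUIO-style (x, y) pairs in [0, 1], and the derived features (finger count, centroid spread, mean pairwise distance) and the synth parameter names are hypothetical stand-ins, not the instrument’s actual mapping.

```python
# A minimal sketch of relationship-based mapping, assuming normalized
# TUIO-style (x, y) touch points in [0, 1]. Feature choices and
# parameter names are illustrative, not the actual instrument code.
import math
from itertools import combinations

def relationship_features(points):
    """Derive inter-point relationships rather than raw coordinates."""
    n = len(points)
    if n == 0:
        return {"count": 0, "spread": 0.0, "mean_dist": 0.0}
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    # Spread: average distance of each touch from the centroid.
    spread = sum(math.hypot(x - cx, y - cy) for x, y in points) / n
    # Mean pairwise distance between all touches.
    pairs = list(combinations(points, 2))
    mean_dist = (sum(math.hypot(ax - bx, ay - by)
                     for (ax, ay), (bx, by) in pairs) / len(pairs)
                 if pairs else 0.0)
    return {"count": n, "spread": spread, "mean_dist": mean_dist}

def map_to_synth(f):
    """Map relationship features to hypothetical synth parameters."""
    return {
        # Extra fingers held down lengthen the decay, loosely echoing
        # the 'mute' gesture described above (an assumption, not the
        # actual MDrumSynth mapping).
        "decay_s": 0.1 + 0.5 * max(0, f["count"] - 1),
        "filter_cutoff_hz": 200 + 4000 * f["mean_dist"],
        "detune": f["spread"],
    }

if __name__ == "__main__":
    touches = [(0.2, 0.3), (0.5, 0.6), (0.8, 0.4)]
    print(map_to_synth(relationship_features(touches)))
```

The point of the design is that the parameters depend only on how the touches relate to each other, so the same chord of fingers produces the same sound anywhere on the trackpad.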
I’m still very much in the development stages. There is a lot of polishing to do before I look into distributing the programs.