Joo Youn Paek
Joshua Knowles

3001:MMMI

3001:MMMI is a new way for an audience to collaboratively perform music, using their mobile phones as controllers for a single shared performance.

http://auscillate.com/mmmi/

3001:MMMI is a new way for an audience to collaboratively perform music. Audience members interact with a large projection screen using their mobile phones and playfully control, create, and perform a live piece of music together. 3001:MMMI is a fun, social experience, encouraging interaction both between the audience and the performance and among individual audience members.

It works like this:

Each member of the audience uses their phone (or a loaner phone) to control a small avatar on the screen. How this avatar moves around and interacts with other audience members' avatars determines what sorts of sounds, melodies, and patterns play out of the main sound system. Audience members push buttons on their phones to control their avatars, see which other members they're connecting with, and are free to play and experiment in the musical space.

I have also developed a system that lets me "score" the performance in advance and allow the player interaction to algorithmically affect this score. This lets the audience have fun and feel the interactivity, but reins in the cacophony by introducing constraints such as amplitude dynamics and coherent chord structures.
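As a rough illustration of that kind of constraint (this is not the project's actual code, and the class and chord choice are invented), free pitch input can be "reined in" by snapping each raw note to the nearest tone of a chord from the pre-written score:

```java
// Hypothetical sketch: snap a raw MIDI pitch to the nearest tone of a
// C-major triad (pitch classes 0 = C, 4 = E, 7 = G). This is one simple
// way a score could constrain free audience input to a coherent chord.
public class ChordQuantizer {
    static final int[] CHORD = {0, 4, 7}; // C, E, G

    static int snap(int midiPitch) {
        int octave = midiPitch / 12;
        int pc = midiPitch % 12;          // pitch class within the octave
        int best = CHORD[0];
        int bestDist = Integer.MAX_VALUE;
        for (int tone : CHORD) {          // pick the nearest chord tone
            int dist = Math.abs(pc - tone);
            if (dist < bestDist) { bestDist = dist; best = tone; }
        }
        return octave * 12 + best;
    }

    public static void main(String[] args) {
        System.out.println(snap(61)); // C#4 snaps down to C4 (60)
        System.out.println(snap(66)); // F#4 snaps up to G4 (67)
    }
}
```

A real score would swap the chord array over time, which is where the pre-written structure meets the live input.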

The system currently works like this: each mobile phone runs a J2ME (Java) client that connects to a central Java server through a local Wi-Fi router. This Java server connects via a socket to a second computer running the visualization in Processing, with Max/MSP handling the audio output.
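The server-to-visualization hop described above is a plain socket connection. A minimal sketch of that pattern (message format, method names, and the loopback setup are all invented for illustration; the real system speaks to MUPE and Max/MSP) might look like:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;

// Hypothetical sketch: one end stands in for the visualization machine,
// the other for the phone server forwarding a control event to it.
public class SocketHop {
    // Send one message over a loopback socket and return what the
    // "visualization" end receives.
    static String roundTrip(String msg) throws Exception {
        try (ServerSocket viz = new ServerSocket(0)) { // ephemeral port
            int port = viz.getLocalPort();
            Thread sender = new Thread(() -> {
                try (Socket s = new Socket("127.0.0.1", port);
                     PrintWriter out =
                             new PrintWriter(s.getOutputStream(), true)) {
                    out.println(msg); // forward the control event
                } catch (IOException e) {
                    throw new RuntimeException(e);
                }
            });
            sender.start();
            try (Socket in = viz.accept();
                 BufferedReader r = new BufferedReader(
                         new InputStreamReader(in.getInputStream()))) {
                String line = r.readLine();
                sender.join();
                return line;
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("/avatar/3/key 5"));
    }
}
```

The real system keeps these connections open for the whole performance rather than opening one per message.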

My research has ranged from music interfaces to game design techniques to Java programming and computer networking to musical composition. There's a lot behind this project.

People interested in musical performance (especially of the NIME variety), video games, and bleeding-edge mobile tech should also be interested in this project. Between three and twenty people can use the system at the same time. Even though it's a mobile phone app, users need to be present at the site of the installation.

A user either installs our software on their phone or (more likely) borrows one of our loaner phones. They push the "0" button to check in their avatar and give themselves a name. When this happens, a circular avatar appears on the screen and immediately becomes a part of the on-screen performance space. They see the available controls on their phone's screen and use the phone's "rocker" to control their avatar's position on the screen and the number keypad to control the musical parameters. Audience members use this interface to collaboratively build melody lines and control other sounds.
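The split described above (rocker for movement, "0" for check-in, number keys for musical parameters) can be sketched as a simple key-to-action mapping. The key names and action strings here are invented; they are not the project's actual J2ME code:

```java
// Hypothetical sketch of the phone-side key mapping: the rocker moves
// the avatar, "0" checks the player in, and the remaining number keys
// set musical parameters.
public class KeyMap {
    static String actionFor(String key) {
        switch (key) {
            case "UP":    return "move avatar up";
            case "DOWN":  return "move avatar down";
            case "LEFT":  return "move avatar left";
            case "RIGHT": return "move avatar right";
            case "0":     return "check in and name avatar";
            default:
                if (key.matches("[1-9]")) {
                    return "set musical parameter " + key;
                }
                return "ignored";
        }
    }

    public static void main(String[] args) {
        System.out.println(actionFor("UP"));
        System.out.println(actionFor("0"));
        System.out.println(actionFor("5"));
    }
}
```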

3001:MMMI requires two Mac computers to run (one to serve the MUPE XML interface to the phones, and one to run the audio and visuals). It also requires mobile phones with the MUPE client installed and configured. A projector and a single sound system handle the audio-visual output.

I have learned much about networking and writing Java and J2ME applications, but this project has largely been about designing phone interfaces that are at once easy to use (and quickly learned) and expressive, given the limited interactions the handsets allow.
