Musical Turk

Mark Kleback

A crowd-sourced musical composition that invites the audience to participate in the performance

http://musicalturk.com



Musical Turk is a musical performance tool modeled after Amazon's Mechanical Turk crowd-sourcing service. It collects musical samples submitted by contributors through a website, then remixes those samples in a live performance space with help from the audience.

Background
Golan Levin's cell phone symphony

Dan Deacon's cell phone light app

Moldover's Octomasher

Nick Yulman's Bricolo robotic instruments

Audience
The website targets musical composers, prompting them to submit music based on a set of rules. The performance seeks to involve anyone willing to participate, regardless of prior knowledge of musical composition.

User Scenario
Prior to a performance, users at home submit 30-50 loops to musicalturk.com. On the day of the performance, a group of musicians begins playing together and recording loops that are uploaded to the website on the spot. The audience is given interfaces to remix both the website-submitted loops and the live loops. This way, the performance involves musicians and audience alike.

Implementation
This project exists as a continuing online database, as well as a series of physical modules used during a live performance.

The website (musicalturk.com) prompts users to submit musical samples that meet specific criteria and collects them in a database. An open API exposes these loops, returning the submitter's name, the title, and the file location of each sample.
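As a rough illustration only, reading that feed might look like the following; the endpoint URL and the field names are assumptions, not the actual schema.

    # Hypothetical example of reading the loop API; the endpoint and the
    # "name", "title", and "file" fields are assumptions, not the real schema.
    import requests

    API_URL = "http://musicalturk.com/api/loops.json"  # assumed endpoint

    for loop in requests.get(API_URL, timeout=10).json():
        print(loop["name"], loop["title"], loop["file"])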

During the performance, a Python script continuously polls the JSON feed, downloads any new files, and sends an OSC message to an openFrameworks sketch for each new sample. openFrameworks then creates an audioPlayer object and displays the new sample on the screen.
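A minimal sketch of what that polling script could look like, written here with the requests and python-osc libraries (the original may have used different libraries; the endpoint, OSC address, port, and field names are assumptions):

    # Sketch of the download/notify loop: poll the JSON feed, fetch any
    # samples we haven't seen yet, and tell openFrameworks about each one
    # via OSC. Endpoint, OSC address, port, and field names are assumed.
    import os
    import time
    import requests
    from pythonosc.udp_client import SimpleUDPClient

    API_URL = "http://musicalturk.com/api/loops.json"
    SAMPLE_DIR = "samples"
    osc = SimpleUDPClient("127.0.0.1", 12345)  # openFrameworks listens here

    seen = set()
    os.makedirs(SAMPLE_DIR, exist_ok=True)

    while True:
        for loop in requests.get(API_URL, timeout=10).json():
            url = loop["file"]
            if url in seen:
                continue
            seen.add(url)
            path = os.path.join(SAMPLE_DIR, os.path.basename(url))
            with open(path, "wb") as f:
                f.write(requests.get(url, timeout=30).content)
            # openFrameworks creates a player object for each /newSample message
            osc.send_message("/newSample", [path, loop["title"]])
        time.sleep(5)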

An Arduino manages the physical controls, sending volume and selection data to openFrameworks over serial.
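The serial protocol itself is not documented here. Purely as an illustration, a line-based format such as "channel,volume,selected" could be parsed on the computer side like this (in the actual setup the openFrameworks app reads the serial port directly; the port name and message format are assumptions):

    # Illustration only: read a hypothetical "channel,volume,selected" line
    # from the Arduino with pyserial and print it. The real parsing happens
    # inside the openFrameworks app.
    import serial  # pyserial

    port = serial.Serial("/dev/tty.usbmodem1411", 9600, timeout=1)  # assumed port

    while True:
        line = port.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        try:
            channel, volume, selected = line.split(",")
            print(f"channel {channel}: volume={volume} selected={selected}")
        except ValueError:
            pass  # ignore malformed lines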

A second computer serves as the recording input. It uses Max/MSP to record loops during the performance and sends an OSC message with each new loop's information to a separate Python script, which uploads the loop to the database. Since the other station is downloading loops in real time, these live samples are added to the mixer almost instantaneously.
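A sketch of that upload bridge, assuming Max/MSP announces each freshly recorded loop with an OSC message carrying the file path and a title (the OSC address, port, and upload endpoint are guesses, not the project's real values):

    # Sketch of the upload bridge: listen for an OSC message from Max/MSP
    # announcing a newly recorded loop, then POST the file to the website.
    # The OSC address, port, and upload URL are assumptions.
    import requests
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    UPLOAD_URL = "http://musicalturk.com/api/upload"  # assumed endpoint

    def on_new_loop(address, path, title):
        with open(path, "rb") as f:
            requests.post(UPLOAD_URL, files={"sample": f}, data={"title": title})

    dispatcher = Dispatcher()
    dispatcher.map("/recordedLoop", on_new_loop)

    # Max/MSP sends "/recordedLoop <path> <title>" to this port after each take.
    BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()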

Ideally, there are eight physical consoles, each with a separate function in the live performance. The hope is that the audience can be an active participant in a musical composition created solely from website submissions.

Conclusion
Getting people to submit music is easy, but getting them to follow the rules is difficult. For future iterations, I'd love to have an automated sorting algorithm that can analyze the music and organize it appropriately.
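One possible direction, not implemented here and shown only as a sketch (librosa, the 120 BPM target, and the tolerance are placeholders for whatever the submission rules actually specify), would be to estimate each loop's tempo and flag submissions that stray from the prompt:

    # Not part of the current project: a rough idea for automatically checking
    # whether a submitted loop matches the prompt's tempo, using librosa.
    import librosa

    TARGET_BPM = 120.0   # placeholder for whatever the rules ask for
    TOLERANCE = 5.0      # accept loops within +/- 5 BPM

    def loop_matches_tempo(path):
        y, sr = librosa.load(path)
        tempo, _ = librosa.beat.beat_track(y=y, sr=sr)
        return abs(float(tempo) - TARGET_BPM) <= TOLERANCE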

During the performance, I started out by simply handing the interface to the audience and seeing how they interpreted it. I discovered that it was far better to have the musicians give a sort of "performance" first to show how everything works and set a tone for the piece. People understood the intention much more readily when we did this.

Using OSC, Python, openFrameworks, and serial communication together was very challenging, and in the process I learned how to build something resilient enough to withstand dozens of users pressing buttons and turning knobs for hours. Getting the interface to download from and upload to the website in real time was also challenging, and I'm happy that I was able to accomplish it.