Not A Dance

Peter Holzkorn

A game/installation in which the participant navigates real and
virtual space via auditory cues.

http://blog.holzkorn.com/thesis



The player/participant navigates through virtual space by sound alone, using
free-movement gestural controls for input and wireless headphones for
output. The normally subconscious activity of spatial hearing becomes the
explicit challenge of collecting invisible objects that emit auditory cues.
Through the gestural interface, the task of spatial mapping is tied directly
to the player's body.
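The core of "navigating by sound alone" is that each invisible object's cue must sound as if it comes from a direction relative to the player. The installation does its spatialization in Max/MSP; purely as an illustration of the principle, here is a minimal sketch (all names and the equal-power panning law are my own assumptions, not the project's actual code) that turns a player position and heading into left/right headphone gains:

```cpp
#include <cmath>
#include <utility>

// Illustrative sketch only: compute equal-power stereo gains so that a
// target sound source appears to come from the correct direction relative
// to the player. The actual piece spatializes audio in Max/MSP.
struct Vec2 { float x, y; };

const float PI = 3.14159265f;
const float HALF_PI = PI / 2.0f;

// Angle of the target relative to the player's facing direction, in radians
// (positive = to the player's right).
float relativeAzimuth(Vec2 player, float headingRad, Vec2 target) {
    // World-space angle to the target, measured from the +y axis.
    float worldAngle = std::atan2(target.x - player.x, target.y - player.y);
    float rel = worldAngle - headingRad;
    while (rel >  PI) rel -= 2.0f * PI;   // wrap into [-pi, pi]
    while (rel < -PI) rel += 2.0f * PI;
    return rel;
}

// Equal-power pan: map azimuth in [-pi/2, pi/2] to (left, right) gains.
std::pair<float, float> panGains(float azimuth) {
    float clamped = std::fmax(-HALF_PI, std::fmin(HALF_PI, azimuth));
    float t = (clamped + HALF_PI) / PI;   // 0 = hard left, 1 = hard right
    return { std::cos(t * HALF_PI), std::sin(t * HALF_PI) };
}
```

With equal-power panning the two gains always satisfy left² + right² = 1, so a source keeps a constant perceived loudness as it moves around the player.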

The project is also an exploration of the tension between gameplay and performance:
for the spectator, the player's physical movement is a performance that,
paired with the generated soundscape and visual information, becomes an experience of its own.


User Scenario
The user puts on the wireless headphones and enters the play area. From that point on, they receive spoken instructions via the headphones on how to play. They complete three different stages (or game modes) one after the other, with further instructions at the beginning of each stage. Playing consists of moving in the play area to move through virtual space, and collecting sounds positioned in this space through various gestures.
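The basic play loop described above (move until you are close to a sound, then perform a gesture to collect it) can be sketched as a simple proximity-plus-trigger check. This is a hypothetical illustration; the function names, the collection radius, and the gesture flag are my assumptions, not the project's actual implementation:

```cpp
#include <cmath>

// Illustrative sketch of the "collect" check: a sound is collected when the
// player is near it in virtual space AND performs a trigger gesture.
struct Pos { float x, y; };

// True if the player is within `radius` of the sound's position.
bool withinReach(Pos player, Pos sound, float radius) {
    float dx = sound.x - player.x;
    float dy = sound.y - player.y;
    return std::sqrt(dx * dx + dy * dy) <= radius;
}

// Gesture detection itself would come from the Kinect skeleton data;
// here it is just a boolean input. The 0.5 m radius is a made-up value.
bool tryCollect(Pos player, Pos sound, bool gestureDetected,
                float radius = 0.5f) {
    return gestureDetected && withinReach(player, sound, radius);
}
```

In practice the radius would be tuned per stage, since it sets how precisely the player must localize a sound before the gesture counts.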

The audience hears most of the sounds the player hears, minus the instructions, and additionally sees a visual representation of the game state on a screen, so they can follow what is happening.

Implementation
- Kinect camera, tracking the player
- Wireless headphones for audio output
- Some clear delineation of the play area
- Code running on my laptop (Cinder, Max/MSP)
- 2 speakers for the sound the audience hears
- MOTU audio interface for splitting the audio signals
- A screen for the visual game overview for the audience
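The Kinect tracking in the list above ultimately has to feed the game logic a position in the play area. As a minimal sketch (the bounds and names are illustrative assumptions, not values from the actual setup), the camera-space player position in metres could be normalized into play-area coordinates like this:

```cpp
// Illustrative sketch: map the Kinect's tracked player position (metres,
// camera-relative x and depth z) into normalized [0, 1] play-area
// coordinates for the game logic. Bounds are made-up example values.
struct PlayArea {
    float minX, maxX;   // metres, camera space, left/right
    float minZ, maxZ;   // metres, camera space, near/far
};

struct Norm2 { float x, y; };  // both components in [0, 1]

Norm2 toPlayArea(const PlayArea& area, float camX, float camZ) {
    auto clamp01 = [](float v) {
        return v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v);
    };
    return {
        clamp01((camX - area.minX) / (area.maxX - area.minX)),
        clamp01((camZ - area.minZ) / (area.maxZ - area.minZ))
    };
}
```

Clamping matters here: if the player steps outside the delineated area, the virtual position should stay at the boundary rather than jump, so the audio cues remain stable.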