Alex Dodge
Alvin Chang


In this non-visual world, feel the wind, hear the sounds.

The Nature of Code

Most of us navigate space primarily with our legs and our visual sense. But what if moving were more about effortless exploration of a touch- and audio-based space? In this project, people move through space by slightly shifting their shoulders in whichever direction they choose -- forward, backward, left, or right. But here's the catch: the space isn't visual; it's represented by audio, and movement is felt as wind rushing past your face as you travel through this world.

I have long thought that alternative feedback methods are underutilized in interactive systems. Aside from vibration on mobile phones and haptic feedback on handheld game controllers, which seem to be on their way out, they haven't seen extensive implementation. Much of this can be attributed to path-dependent tendencies with interfaces, which persist even when the benefit of changing far outweighs the cost -- the QWERTY keyboard is the classic example. This is perhaps due in part to our inherent proclivity for adapting to less-than-optimal tools. Keyboards will most likely stay the same for some time, but other systems are worth experimenting with.

Target Audience
Anyone and everyone.

User Scenario
A user sits down in a chair, legs dangling, and puts on headphones and a blindfold. The program runs, and the user begins to feel a light breeze on his or her face. Then the user begins to hear sounds and leans his or her shoulders in that direction; that's when the user begins to feel more wind as he or she travels faster and faster.

We are using four 200 mm PC fans driven by PC power supplies and an Arduino, with Processing generating the sounds and a camera tracking the user's shoulder movements.
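To make the control loop concrete, here is a minimal sketch of how camera-tracked shoulder lean might be turned into duty cycles for the four directional fans. This is an illustrative assumption, not the project's actual code: the function name, the dead zone, and the lean thresholds are all hypothetical, and in practice the resulting values would be sent over serial to the Arduino's PWM outputs.

```python
def fan_duties(dx, dy, dead_zone=0.05, max_lean=0.5):
    """Map a shoulder offset to PWM duties (0-255) for four fans.

    dx, dy: shoulder offset from the resting position, as a fraction of
    the camera frame (positive dx = lean right, positive dy = lean forward).
    Returns (front, back, left, right) fan intensities. All thresholds
    here are illustrative assumptions.
    """
    def scale(v):
        if abs(v) < dead_zone:                 # ignore small wobbles
            return 0
        v = min(abs(v), max_lean) / max_lean   # clamp, normalize to 0..1
        return int(round(v * 255))

    # Moving toward a direction means wind hits you from that direction,
    # so a forward lean drives the fan in front of the user, and so on.
    front = scale(dy) if dy > 0 else 0
    back = scale(dy) if dy < 0 else 0
    right = scale(dx) if dx > 0 else 0
    left = scale(dx) if dx < 0 else 0
    return (front, back, left, right)


# A full forward lean runs the front fan at full speed:
print(fan_duties(0.0, 0.5))   # (255, 0, 0, 0)
# Tiny offsets inside the dead zone produce no wind:
print(fan_duties(0.02, -0.03))  # (0, 0, 0, 0)
```

The dead zone keeps the fans still when the seated user is merely breathing or fidgeting, and clamping at `max_lean` means a comfortable lean, not an extreme one, already yields full wind.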

We've learned how difficult it is to create a convincing non-visual world.