(Implementing Kinect body tracking with non-visual spatial feedback)
A Collaboration with Alvin Chang
Pictured above are three ubiquitous non-visual interfaces… Having focused primarily on visual media for so long, I find it odd that I keep fixating on concepts surrounding non-visual interfaces. Perhaps looking at a screen less often would let one direct attention to other things in the environment? In any case, I have long thought that alternative feedback methods are underutilized in interactive systems. Aside from “vibrate” on mobile phones and haptic feedback on handheld gaming controllers, which seem to be on their way out, they have not seen extensive implementation. Much of this can be attributed to path dependence in interface design: interfaces persist even when the benefit of changing far outweighs the cost, as the QWERTY keyboard above demonstrates. This is perhaps due in part to our inherent proclivity for adapting to less-than-optimal tools… keyboards will most likely stay the same for some time, but other systems are worth experimenting with.
Adapting a project developed by Alvin Chang that used a Kinect for body tracking to maneuver through a simple aquatic environment, we have designed an experimental system incorporating spatial audio and airflow feedback that will hopefully allow a user to navigate a mapped space. Our design requires the user to be blindfolded and to wear noise-canceling headphones while interacting with the system; this should help make the non-visual feedback prominent and effective.
A Kinect will track the user’s body position within a limited gestural space. To reduce disorientation and ensure reliable body tracking, the user will remain seated and navigate with a limited set of movements.
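One possible mapping, sketched below under the assumption that navigation is driven by torso lean read from the Kinect skeleton’s shoulder-center and hip-center joints; the joint choice, axis conventions, and thresholds are hypothetical, not the finished design.

```python
# Hypothetical sketch: map a seated lean to a 2D navigation velocity.
# Joint positions are assumed to arrive as (x, y, z) tuples in meters
# from the Kinect skeleton; all constants below are placeholders.
DEAD_ZONE = 0.05   # meters of lean ignored, so sitting upright means no movement
LEAN_RANGE = 0.25  # lean (m) that produces full-speed movement (assumed)
MAX_SPEED = 2.0    # virtual movement speed at full lean (assumed)

def lean_to_velocity(shoulder_center, hip_center):
    """Map torso lean (shoulder offset from hips) to a 2D velocity (vx, vy)."""
    # Axis conventions assumed: x is left/right, z is depth away from the sensor,
    # so leaning toward the sensor reduces the shoulder's z relative to the hips.
    lean_x = shoulder_center[0] - hip_center[0]        # lean right -> +x
    lean_forward = hip_center[2] - shoulder_center[2]  # lean toward sensor -> +forward

    def scale(lean):
        if abs(lean) < DEAD_ZONE:
            return 0.0
        sign = 1.0 if lean > 0 else -1.0
        magnitude = min((abs(lean) - DEAD_ZONE) / (LEAN_RANGE - DEAD_ZONE), 1.0)
        return sign * magnitude * MAX_SPEED

    return scale(lean_x), scale(lean_forward)
```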
The user’s movement will be indicated by an array of variable-flow, multidirectional fans. The user’s acceleration and speed in a given direction will be translated into airflow intensity from a corresponding fan aimed at the user’s face. An Arduino microcontroller, driven over serial, will regulate four 200 mm, 1300 rpm fans: three positioned in front of the user and one behind.
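A minimal sketch of the host-side fan mapping, assuming the Arduino accepts four 0–255 intensity bytes per frame over serial; the port name, frame format, and fan geometry below are placeholder assumptions rather than the finished protocol.

```python
# Sketch: translate a 2D velocity into four fan intensities and send them
# to the Arduino over serial (protocol and port are assumptions).
import serial  # pyserial

# Unit vectors pointing from the user toward each fan:
# front-left, front-center, front-right, rear.
FAN_DIRECTIONS = [(-0.71, 0.71), (0.0, 1.0), (0.71, 0.71), (0.0, -1.0)]
MAX_SPEED = 2.0  # virtual speed that maps to full fan output (assumed)

def fan_intensities(vx, vy):
    """Map a 2D velocity to a 0-255 intensity for each fan.

    A fan blows at the user's face when they move toward it, so intensity is
    proportional to the velocity component along the direction to that fan.
    """
    intensities = []
    for fx, fy in FAN_DIRECTIONS:
        component = max(0.0, vx * fx + vy * fy)  # ignore motion away from the fan
        level = min(component / MAX_SPEED, 1.0)
        intensities.append(int(level * 255))
    return intensities

if __name__ == "__main__":
    port = serial.Serial("/dev/ttyUSB0", 115200)    # placeholder port name
    port.write(bytes(fan_intensities(0.5, 1.2)))    # one frame of fan levels
```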
Spatial targets, in the form of mapped sound objects, will hopefully be perceived as directional by the user, so that a user can seek out a sound’s location effectively. We hope to position multiple looped sound effects in a 2D coordinate space, attenuating each logarithmically with distance and panning it left-right in a binaurally mapped audio space, possibly adding a Doppler frequency shift as well.
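As a rough illustration of the intended mapping, here is a minimal sketch of logarithmic distance attenuation and left/right panning for one sound object; the rolloff and reference-distance constants are placeholder assumptions, and Doppler shift is omitted.

```python
# Sketch: gain and pan for one looped sound object in a 2D coordinate space.
import math

REF_DISTANCE = 1.0  # distance (m) at which gain is 1.0 (assumed)
ROLLOFF_DB = 6.0    # attenuation in dB per doubling of distance (assumed)

def source_gain_and_pan(listener_xy, facing_rad, source_xy):
    """Return (gain 0..1, pan -1..+1) for a source relative to the listener."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    distance = max(math.hypot(dx, dy), REF_DISTANCE)

    # Logarithmic attenuation with distance, expressed in decibels.
    attenuation_db = ROLLOFF_DB * math.log2(distance / REF_DISTANCE)
    gain = 10 ** (-attenuation_db / 20.0)

    # Angle of the source relative to the listener's facing direction:
    # -1 = fully left, +1 = fully right.
    rel = math.atan2(dy, dx) - facing_rad
    pan = -math.sin(rel)
    return gain, pan

def stereo_levels(gain, pan):
    """Equal-power panning into left/right channel gains."""
    theta = (pan + 1.0) * math.pi / 4.0  # 0 (full left) .. pi/2 (full right)
    return gain * math.cos(theta), gain * math.sin(theta)
```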
Related: Shay Martin, a blind “video” game developer.
An active braille interface. Last year I saw a man on the subway in Tokyo using a much smaller portable device… possibly a braille mobile texting device or e-reader, though I couldn’t find a picture of it.