(Thoughts inspired by an interview with interaction designer Bill Verplank, conducted by Gideon D’Arcangelo, June 2006)
There is a large discussion revolving around interactivity and computers. What works, what works best, and what feels natural... in other words, how do we make systems that are intuitive? More interesting to me lately is not so much the question of how a system can be intuitive, but how a system becomes intuitive. When does an "unnatural" design teach us how to naturally interact with its "counterintuitive" logic? For example, typing on a keyboard is not a natural activity. I remember my first computer class in 7th grade and being given a typing test. I hated it. My brain struggled with the odd ordering of the letters on the keyboard, and I fought to use more than just my index finger and thumb. Of course, over time and through my dependence on computer technology, I learned. Now, typing is intuitive. But beyond simply being a skill I have acquired, typing has also changed the way I think. It has affected my "natural" way of being. Typing is how I speak through email, IM-ing, and even text messaging by phone. Beyond the simple content of words, I play with characters to express emotion, make funny faces, and reveal extraneous, contextual information. It is natural for me to quickly type a colon with a parenthesis the moment I feel a smile :) or a frown :(. The typing action and my mood become intertwined, naturally.
This "naturality" does not exactly address the topic of intuitiveness emerging from design. We think of intuitiveness as a state of being connected to our actions in a way that we do not have to think. Intuition is usually a good thing. It implies we are in tune with the world around us and making complex decisions with the deepest core of our experience and knowledge. It may be natural for us to smile when we are happy, but it is intuitive for us to know what will make our friend smile. The best moments of creativity come from an intuitive place. For musicians, this is a sonic intuitiveness: knowing how and when certain sounds unite to create rhythms and melodies that make people dance, cry, and laugh. The intuitive process of music making consists of playing, listening, finding, and repeating. Traditionally, this process emerges from a group of musicians playing their instruments live, starting with improvisation and narrowing down the options as certain parts (melodies and rhythms) become realized. But what happens when the intuitive process of making music is mediated by a computer? For one thing, the group can be eliminated. A computer allows one person to play all the instruments, leaving only one decision maker in the process. Secondly, the existence of virtual instruments (instruments that exist only inside the computer and are manipulated either through an external controller or through settings on the computer screen) invites instruments you can see but cannot touch. And finally, software for audio recording is visual. One does not have to rely so much on the ear, since sound waves are drawn on the screen and compositions are visually mapped out in linear blocks of color. We "see" our music as it sounds through the graphic displays of our computer software.
The notion that only one person is needed for a band, that instruments can be played without being touched, and that a sonic composition can be built by a process of "cutting and pasting" blocks of color in a line across a screen is completely counterintuitive to our common sensibility of what it means to make music. However, I would argue that it is not counterintuitive. In fact, what I see is simply a new kind of intuition being built around the new tools a computer offers. If making music is touching an instrument (playing), hearing its sound (listening), deciding on the sounds you want to hear (finding), and playing what you want to hear based on what was decided (repeating), then the core process remains the same. Only now, with a computer, our intuition is evolving to accommodate new intersections of our sensibilities. Rhythms are drawn that could never be played, but these rhythms are no less intuitive. They are just new.
Computer music may be considered unnatural, but it is not made without intuition. Perhaps music is not all about sound. In the same way a deaf musician like Beethoven could feel the piano keys and hear the melody in his mind, so too can a computer musician see a drum sequence in an editing window and hear the cadence of its rhythm in his mind. Such adjustments in our perception of how we relate to sound change our intuition but do not eliminate it. And these changes in our intuition invite new forms of expression that grow to become no less natural than the beating of a wooden drum.