At first I began by watching people use things like laptops, mobile phones, iPods, etc… but I soon began to consider something perhaps more at the core of all these devices: the limitations and unique abilities of what we could consider an interface of the first order… among its many other facets… our hands and their specific modes of interaction. I should mention that by first order I am asserting the inability to discern anything before it without entering the explicit realm of neuroscience or psychology; this is not to suggest that hands are in any way an isolate within such an order… but they are an apparent and primary avenue of input in a text-centric media culture.
We are most often busy designing things that fit the human body with enough specificity and generality to be both efficiently optimized by each unique user and applicable to the largest base of users possible (one size fits all)… (read: profit margin). However I thought: why not consider it in other terms; how our hands’ physiology has arbitrarily dictated certain design decisions with such subtlety that they often go unnoticed until much later on… some not even restricted to the physical… Have you ever wondered why we use a base 10 number system? Almost certainly because we count on ten fingers. Perhaps in the worlds of many popular animated series, where characters are drawn with four fingers per hand, they employ a base 8 system instead?
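As a throwaway illustration of the base comparison (a trivial Python snippet, nothing more), here are a few familiar quantities rendered in both bases:

```python
# The same quantities as a ten-fingered and an eight-fingered
# culture might write them: base 10 vs. base 8 (octal).
for n in (8, 42, 100):
    print(f"base 10: {n:>3}  ->  base 8: {n:o}")
# base 10:   8  ->  base 8: 10
# base 10:  42  ->  base 8: 52
# base 10: 100  ->  base 8: 144
```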
Similarly, a standard that makes relatively good sense in one place often inserts itself into another based entirely on precedent. Consider the Chinese typewriter seen here. The manual typewriter interface makes great sense when you have a 26-character phonetic writing system, easily manageable with 10 fingers… but when you have a gestural glyph system with a base set of over 5,000 characters the same approach makes far less sense… yet exceedingly tedious machines like this were in use for decades, and while a stroke- or radical-based system might have made much more sense, one wouldn’t arrive until the dawn of the PC. For lack of a better alternative you go with what is there until a better system arrives… and even when one does we tend to accept it as is… you might mediate your seemingly incompatible system with a western-keyboard-based phonetic input method, dubbed “Romaji” by the Japanese and “Pinyin” by the Chinese… though not without consequences: just as auto-spellcheck has essentially relieved westerners of the need to know how to spell our own languages, Japanese writers dependent on Romaji or kana phonetics for digital input have no trouble reading but eventually lose the ability to write kanji fluidly by hand. Perhaps as we offload former manual tasks to systems of automation we will find newer and most likely more abstract tasks to replace them.
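To make that mediation concrete: at its core a phonetic input method is a lookup from typed Latin syllables to candidate glyphs, which the user then disambiguates. A minimal Python sketch; the two-entry dictionary and the suggest function are invented for illustration, where a real IME ranks thousands of candidates statistically:

```python
# A toy pinyin-style input method: typed Latin syllables map to
# candidate characters; the user picks the one they meant.
CANDIDATES = {
    "ma": ["妈", "马", "吗"],   # mother / horse / question particle
    "shu": ["书", "树", "数"],  # book / tree / number
}

def suggest(syllable: str) -> list[str]:
    """Return candidate characters for a typed phonetic syllable."""
    return CANDIDATES.get(syllable, [])

print(suggest("ma"))  # ['妈', '马', '吗'] — ambiguity the typist resolves
```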
We seem to generally accept the manual interfaces given to us: a keyboard based on a typewriter, whose key layout (QWERTY) was developed to keep the typewriter’s mechanism from jamming rather than to provide easier access to frequently used letters… you could go and buy a Dvorak keyboard and spend a few months learning to use it, but you probably won’t… The fact that we’ve adopted the keyboard as the base standard of interaction with computer systems is not because it is the best method but because it is both good enough and just slightly better than that which came before it…
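The QWERTY claim is easy to check for yourself with a back-of-the-envelope script: count how many letters of an English text fall on each layout’s home row. The sample.txt filename here stands in for whatever text you have on hand; the home-row letter sets are the standard ones for each layout:

```python
# What fraction of letters in an English sample sit on the home row?
QWERTY_HOME = set("asdfghjkl")
DVORAK_HOME = set("aoeuidhtns")

def home_row_share(text: str, home: set[str]) -> float:
    letters = [c for c in text.lower() if c.isalpha()]
    return sum(c in home for c in letters) / len(letters)

sample = open("sample.txt").read()  # any non-empty English text
print(f"QWERTY home row: {home_row_share(sample, QWERTY_HOME):.0%}")
print(f"Dvorak home row: {home_row_share(sample, DVORAK_HOME):.0%}")
```

Run it on almost any English prose and Dvorak’s home row covers a far larger share of keystrokes, which is precisely the optimization QWERTY never attempted.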
Which is better than doing this:
Which is maybe still better than programming BASIC using only switches on this:
Even though the Altair came decades after the interfaces pictured in black and white… it was affordable… showing that convenience can, on rare occasion, be trumped if and when there is no other alternative (the Altair was the only affordable computer you could buy at the time). To prematurely conclude on the subject of the keyboard: it is probably here to stay in some form (until you can figure out how to code without one), with an already proven long and rich history dating as far back as the 1st century if you consider keyboard-based musical instruments within the same lineage.
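Before moving on, for a sense of what “programming using only switches” actually involved, here is a loose Python pantomime of front-panel entry. The memory, examine, and deposit names are my own invention, and the real panel distinguished DEPOSIT from DEPOSIT NEXT, but the three bytes are a genuine Intel 8080 instruction (JMP 0000h, an endless loop at address zero):

```python
# Toy simulation of toggling a program into an Altair-style front panel:
# set eight data switches to one byte of machine code, deposit, repeat.
memory = {}
address = 0

def examine(addr: int) -> None:
    """Set the address switches and press EXAMINE."""
    global address
    address = addr

def deposit(byte: int) -> None:
    """Set the data switches, deposit the byte, advance one address."""
    global address
    memory[address] = byte
    print(f"addr {address:04o}: switches {byte:08b}")
    address += 1

examine(0)
for byte in (0xC3, 0x00, 0x00):  # 8080 JMP 0000h
    deposit(byte)
```

Three deposits for a three-byte instruction; a BASIC interpreter ran to kilobytes, toggled in a byte at a time.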
We are steadily coming around with revisions to our accepted manual interfaces that will most likely enable new and previously unimagined forms of expression, new modes of teaching and learning, and no doubt newer pitfalls to match. The relatively recent emphasis on touch-based interfaces offers a great deal in gesture-based interaction, though most of the interfaces seen thus far remain buoyant by means of novelty rather than truly inspired functionality. I could type faster and more accurately on my Blackberry, but somehow I am happier with my iPhone; I sacrificed a great deal of functionality for sleekness and bulky re-mappings of keyboard- and mouse-derived menus. I imagine that it will take time for us to really understand the possibilities and true limitations of touch-based systems… but in the meantime we might consider other avenues to explore.
Looking to those with specialized means of communication, we might derive greater functionality and efficiency where before we saw only limitation:
The fingertips hold one of the greatest concentrations of nerve endings in the body, rivaled only by the eyes and the inner ear. It makes sense that we might consider the sense of touch localized to the fingertip, alone or in conjunction with touch-based systems, as a focus for new interfaces.