It's always an exhilarating and slightly embarrassing moment when you realize something that's been sitting in front of you all along. One of those moments was when I understood why it really is so important for Apple to build best-of-breed voice recognition and speech synthesis software into every major version of Mac OS X.
In a sense, multi-sensory interfaces (not just visual and kinesthetic, but also haptic, auditory, and even gustatory/olfactory) are one of the most obvious ways to innovate in computer hardware and software. At the same time, they run into the glass ceiling of the all-too-popular WIMP interface1) (Windows, Icons, Mouse, Pointer) — the so-called "direct manipulation" interface that actually isn't one. A somewhat more direct interface would be the one in the picture above, in which a pen moves the pointer on screen directly, rather than indirectly via a mouse on a pad, as is typical nowadays.
So. My theory is that Apple recognizes both the need to get rid of WIMP and the fact that there aren't good enough replacements for that interface yet (by good enough, I mean a nearly perfect user experience, not just something technically implementable). So Apple does the next best thing: it keeps advancing the alternative interaction technologies and finds places where they can serve as secondary or complementary solutions. Or, as in this case, as a service to disabled people, who cannot use WIMP in the first place.
From this perspective, the iPod represents a refreshing break from the WIMP-infested world. Its main mode of interaction is a combination of gestures and auditory feedback. The control "wheel" is not haptic — the finger just slides, with no bumps — but the audio complements that sliding with ticking sounds that make it feel as if the finger were turning a discrete dial. The simple text menus employ no fancy graphics or icons, and hence put less strain on the eyes. As Borat would say: Nice!
Inspired by: 1) Milekic, S. (2002). Towards Tangible Virtualities - Tangialities. Museums and the Web 2002 conference.