Tuesday 19 January 2010

Human-Computer Interaction

I think it is time that Human-Computer Interaction (HCI) left the realm of the computer science syllabus and the job titles of psychology-trained, IT-literate professionals, and hit the streets in high-tech devices that people actually want to use. That means expanding our current expectations of how people interact with machines.

As I mentioned in my previous tablet-related post, gesture-based interaction is about much more than touch screens on phones. Gestures can be made not only with the fingers and hands, but also with the head or the face; I have met robots that can assess and act on human facial expressions. I have used gestures in free space, rather than on a capacitive glass surface, to direct what a machine should do. That was years ago, so products using such techniques could launch soon. Gaming consoles such as the Wii have already shown how human movement can be interpreted. Eye tracking is already used in flight simulators and in laboratory settings when studying human cognition and behaviours, and the movement of the eye can convey a great deal of intent.
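
To make that concrete, here is a minimal sketch in Python of how a free-space gesture, a shake, might be recognised from raw accelerometer readings of the kind a Wii-style controller reports. The sample data, threshold and jolt count are all invented for illustration; this is a toy, not how any real console does it.

    import math

    # Hypothetical accelerometer samples (x, y, z) in units of g,
    # as a Wii-style controller might report them.
    SAMPLES = [
        (0.0, 0.0, 1.0), (0.1, 0.0, 1.0), (1.8, 0.2, 1.1),
        (-1.9, 0.1, 0.9), (2.1, -0.3, 1.0), (-2.0, 0.2, 1.1),
        (0.1, 0.0, 1.0), (0.0, 0.0, 1.0),
    ]

    SHAKE_THRESHOLD = 1.5  # assumed: acceleration (in g) that counts as a jolt
    MIN_JOLTS = 3          # assumed: jolts needed before we call it a shake

    def is_shake(samples):
        """Count samples whose overall acceleration magnitude exceeds the
        threshold; enough jolts in one window means a shake gesture."""
        jolts = sum(1 for (x, y, z) in samples
                    if math.sqrt(x * x + y * y + z * z) > SHAKE_THRESHOLD)
        return jolts >= MIN_JOLTS

    print("Shake detected!" if is_shake(SAMPLES) else "No gesture.")

Real gesture recognisers look at sequences and timing rather than a simple count, but the principle is the same: mapping raw sensor data to user intent.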

Voice recognition and synthesis have both advanced to a very usable stage and are ready to augment human interaction with computing devices; I already hold a fairly natural dialogue with my car and with my phone/music player. Finally, some wireless systems under development can not only convey information but also detect movement and presence, in a fashion similar to radar.
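
As a toy illustration of that radar-like idea, the sketch below flags movement when the variance of wireless signal-strength (RSSI) readings jumps well above a quiet baseline, since a person moving through the signal path perturbs the link. The readings and the threshold ratio are invented; real systems apply far more sophisticated signal analysis.

    import statistics

    # Hypothetical RSSI readings (in dBm) from a wireless link. A person
    # moving through the signal path perturbs the readings, much as a
    # moving object shows up on radar.
    QUIET_BASELINE = [-50.1, -50.0, -49.9, -50.2, -50.0, -49.8]
    LIVE_WINDOW = [-50.0, -47.5, -53.2, -46.8, -54.0, -49.1]

    VARIANCE_RATIO = 5.0  # assumed: variance jump that signals movement

    def movement_detected(baseline, window):
        """Flag movement when the live window is markedly noisier than
        the quiet baseline."""
        return (statistics.variance(window)
                > VARIANCE_RATIO * statistics.variance(baseline))

    print("Movement detected."
          if movement_detected(QUIET_BASELINE, LIVE_WINDOW)
          else "All quiet.")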

Some lab demonstrations have even shown people controlling devices simply by thinking about what they want to achieve.
The way we interact with our devices may never be the same again!
