Wednesday, 3 September 2008
Machines that understand users?
We often hear stories about users finding themselves baffled by machines: they don't understand why the computer has done something, or can't work out how it should be configured to achieve the desired outcome. Users understanding machines, then, is sometimes rather a challenge! So what about the other way around?
In the future, we will have machines that understand people. ZDNet reports that Intel has already announced research it is carrying out into sentient machines, which use a whole raft of sensors to perceive the user's world and so build an awareness of the user's situation.

One Intel project, called "everyday sensing and perception" (ESP), began during 2007. Its aim is to understand 90% of an average individual's daily routine with 90% accuracy. The sensors involved range from very basic measurements to higher-level interpretations of movement, emotions and words, as well as real-time object recognition. At present, the object recognition can manage at least 75% accuracy in automatically recognising seven objects, using video captured from a shoulder-worn camera; the hope is to scale this to hundreds of objects. This is an example of how discreetly worn devices will in future interact with other devices to provide real-time inputs - part of a wearable sensor network.
At present, the real-time event recognition requires about 4 teraflops of processing and around 10 kW of power. The eventual aim is to bring power consumption below 1 watt, so that portable devices can perform the task.
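To put those figures in perspective, here is a quick back-of-envelope calculation based only on the numbers above (4 teraflops, 10 kW today, a 1 watt goal); the variable names are illustrative, not from Intel:

```python
# Back-of-envelope: how far today's prototype is from the portable goal.
current_power_w = 10_000    # ~10 kW for the current system
target_power_w = 1          # the stated goal for portable devices
throughput_tflops = 4       # ~4 teraflops of processing required

# Power must fall by this factor at the same level of performance.
reduction_factor = current_power_w / target_power_w
print(f"Power must drop by a factor of {reduction_factor:,.0f}")

# Equivalently, energy efficiency (teraflops per watt) must rise accordingly.
current_eff = throughput_tflops / current_power_w
target_eff = throughput_tflops / target_power_w
print(f"Efficiency must improve from {current_eff:.4f} to {target_eff:.0f} TFLOPS/W")
```

In other words, a roughly 10,000-fold improvement in energy efficiency is needed before this runs on something you could wear.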
And this is only one of several research initiatives in this area. So in a matter of a decade or two, it may be machines understanding users that is the more common occurrence, rather than the other way around!