09 February 2006

Robots Crying? Is this Artificial Intelligence?


I found two fascinating accounts of research into machine intelligence, from the Developing Intelligence blog and from Science Blog.

From Chris Chatham at Develintel comes this excellent write-up about MIT researchers attempting to simulate emotion in a robot called "Kismet." Why bother putting emotions into robots, when emotions give people so much trouble? Allow Chris to explain:

Emotions are not typically considered mechanistic or functional - we tend to think of them as cognitive "byproducts," an evolutionary inheritance from ancestors capable only of feeling, and not of thinking. On the contrary, emotions may actually be an integral part of cognition. Below is a list of Kismet's emotions, and the functions each is thought to subserve; does this list accord with your subjective experience of these emotions?

* Anger: to mobilize and sustain activity at high levels; low levels of anger (frustration) are useful when progress toward a goal is slow
* Disgust: to create distance between oneself and an aversive stimulus
* Fear: to motivate avoidance or escape responses
* Joy: to broaden experience by encouraging social behavior and reward completion of a goal
* Sorrow: to slow responses in cases of poor performance, so as to encourage reflection and behavior change
* Surprise: to signal the occurrence of an unpredicted event, so as to improve future attempts at event prediction
* Interest: to motivate exploration and learning, and reinforce selective attention
* Boredom: similar to interest, except its purpose is to force an encounter with a new stimulus, which might then elicit interest

Certainly one can't ascribe intrinsic "functions" to emotions, but it is clear that emotional deficits can cause changes in behavior - for this reason they must have some behavioral consequence, which we may assume is evolutionarily advantageous. While it may not be possible to describe exactly what these behavioral consequences are, it may actually be possible to test hypotheses about possible "functions" of emotions in the creation of autonomous robots, in order to answer precisely those questions that are either impossible or unethical to test in humans.
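To make the idea concrete, here is a minimal sketch of how emotions-as-functions might steer a robot's behavior. This is purely illustrative and not Kismet's actual code; the thresholds and behavior names are invented for the example. The point is that each "emotion" reduces to a control signal computed from simple internal measurements.

```python
# Hypothetical sketch (not Kismet's implementation): emotions as
# control signals that modulate behavior selection.

def select_behavior(progress_rate, stimulus_novelty, stimulus_aversive):
    """Map internal measurements to an 'emotionally' driven behavior."""
    if stimulus_aversive:
        return "withdraw"           # disgust/fear: create distance, escape
    if progress_rate < 0.2:
        return "persist_harder"     # frustration: sustain activity at high levels
    if stimulus_novelty > 0.8:
        return "orient"             # surprise/interest: attend and explore
    if stimulus_novelty < 0.1:
        return "seek_new_stimulus"  # boredom: force an encounter with novelty
    return "engage"                 # joy/interest: continue rewarding behavior

print(select_behavior(progress_rate=0.5, stimulus_novelty=0.9,
                      stimulus_aversive=False))  # orient
```

Even a toy arbiter like this shows why the list above is framed in terms of function: each emotion earns its keep by biasing the robot toward a different class of action.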


Go to Develintel and read the whole thing. Chris is an excellent guide to all matters cognitive.

Next, from Science Blog comes this fascinating account of attempts to fuse sensory inputs in a machine neural model. We know that in humans, sensory inputs of several types can be fused into one conscious impression of "now." Many scientists believe this process, sometimes referred to as the "binding problem," is key to understanding the subjective experience of consciousness. If machines are to "experience" a consciousness closely analogous to our own, a similar type of "binding" is necessary. The "Sensemaker Project" at the University of Ulster is an early attempt to accomplish this.

To exploit these properties, the Heidelberg group developed a spiking neuron Application Specific Integrated Circuit device, so as to be able to emulate larger constituent components of biological sensory systems. A prototype device had been submitted for fabrication when the project ended; once fabricated, it will be exploited in a follow-up European project.

These circuits process data in a similar manner to the biological brain, focusing resources on the most data-rich sensory stream. A user interface on a PC lets researchers engage with the system.

The team concentrated on two particular senses, namely sight and touch. The experimental touch-sensor system, developed in Heidelberg and used by the SENSEMAKER partner Trinity College, Dublin, is itself quite novel. It features an array of small, moveable spring-loaded pins. This enabled psychophysical experiments on touch and vision to be conducted on humans and was a very valuable tool in exploring human responses to sensory integration. The results from these experiments helped to inform the sensory fusion model.

The project has created a sophisticated, biologically-inspired model of sensory fusion, for tactile and visual senses. Perhaps the greatest achievement of the project is the creation of a framework which allows extensive experimentation in terms of sensory integration. The project’s work can easily be extended into other sensory modalities; for example the project partners are currently planning to extend the work to auditory senses. The hardware implementation(s) of the model, which allow for extremely rapid learning as compared to biological timescales, will be exploited in follow-up projects.


In both of these projects, you can see how far machine intelligence research has come from the old "expert systems" of the 1980s. Researchers are becoming more sophisticated, and are taking a more multidisciplinary approach to the problem of machine "cognition." The huge gulf between human cognition and what machines can do will not be bridged anytime soon. All the same, this is genuine progress.
