USA: Technology - Duplicating the brain.

Scientists seeking a thinking computer have been studying the brain and seeking to simulate its processes. Pallab Ghosh reports on their progress.

Computers are extremely fast at performing calculations, but they remain fundamentally unsuited to the sort of ill-defined, unstructured problems that people find very easy: recognising objects and making quick, intuitive guesses at the solution to a difficult problem. Computer scientists have for some years been trying to develop a new type of computer better suited to these problems. Such a computer already exists: the brain. Some computer scientists, known as "connectionists", believe they can duplicate the brain's processes by copying the original model.

They are following Alan Mathison Turing, who in 1945 designed one of the first electronic stored-program computers. Turing, as a 10-year-old, had been struck by the observation in a children's science book that the body, though complex, was but a machine. He believed that building a "miraculous machine" which was intelligent and able to think should be possible - in the '40s a seemingly far-fetched idea. Less so now.

The American telecommunications giant AT&T recently developed what it describes as a brain-like chip. Like the human brain, the device consists of many switches, each of which is connected to a large number of others. In the brain these switches are neurons, which can be either in a "firing" or a "non-firing" state. Each neuron decides which state to be in by looking to see which state its neighbours are in: if enough of its neighbours are firing, it fires too. And the more a neuron fires, the more susceptible it becomes to the influence of its neighbours and the more prone to firing. Psychologists believe that this process is involved in learning.
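That decision rule can be sketched in a few lines of code. The Python function below is an illustration of the threshold behaviour described above, not AT&T's actual design; the names, weights and threshold are assumptions made purely for the example.

    # Illustrative sketch only (not AT&T's circuit): a binary threshold
    # unit that fires when enough of its neighbours are firing.
    def update_neuron(neighbour_states, weights, threshold=1.0):
        """Return 1 (firing) if the weighted sum of neighbouring
        states reaches the threshold, otherwise 0 (non-firing)."""
        total = sum(w * s for w, s in zip(weights, neighbour_states))
        return 1 if total >= threshold else 0

    # Two of three neighbours firing, with strong enough connections:
    print(update_neuron([1, 1, 0], [0.6, 0.7, 0.5]))  # prints 1

Strengthening or weakening a connection's weight changes how much influence that neighbour has - and that is exactly the knob the learning process turns.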

Researchers at AT&T Bell Laboratories in New Jersey have duplicated this process with electronic components. Conventional electronic memories store information in the components themselves; here the researchers have created a circuit which, as in the brain, stores information outside the components. Data is defined in terms of the number of components firing and their susceptibility to firing. Susceptibility here corresponds to the electrical resistance of the wires joining the components, which decreases as the chip learns. This may seem an eccentric way of storing information, but it could enable computers to solve problems as humans do. Unlike conventional electronic circuits, the chip can learn and solve problems by approximation - skills once unique to higher animals.
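Storing a pattern in the connections rather than in the components is the defining trick of connectionist models such as the Hopfield network. The Python sketch below shows the idea: a pattern is "written" into the connection strengths by strengthening links between units that fire together, and later recovered from a corrupted input. It is an illustrative model, not the Bell Labs circuit, and every parameter is an assumption.

    import numpy as np

    def train(patterns):
        """Hebbian learning: strengthen the link between any two
        units that fire together in the stored patterns."""
        n = len(patterns[0])
        w = np.zeros((n, n))
        for p in patterns:
            p = np.asarray(p)
            w += np.outer(p, p)
        np.fill_diagonal(w, 0)   # no unit connects to itself
        return w

    def recall(w, state, steps=5):
        """Each unit repeatedly looks at its neighbours and fires
        (+1) or stays quiet (-1) according to what it sees."""
        s = np.asarray(state)
        for _ in range(steps):
            s = np.where(w @ s >= 0, 1, -1)
        return s

    stored = [1, -1, 1, -1, 1, -1]   # the pattern lives in w, not in the units
    w = train([stored])
    noisy = [1, 1, 1, -1, 1, -1]     # one unit corrupted
    print(recall(w, noisy))          # recovers the stored pattern

The memory is the matrix of connection strengths: wipe the units' states and the pattern survives, and damaging a few connections degrades recall gracefully rather than destroying it outright.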

More recently has come the exciting development of what is claimed to be the world's first neuron made from silicon. It is said to have some of the characteristics of real nerve cells. Less than a tenth of a millimetre square, it uses remarkably little power.

The main purpose of the silicon neuron is to help researchers study how the human brain works, but according to its inventors, Misha Mahowald of the California Institute of Technology and Rodney Douglas of Oxford University, it could also be used in intelligent machines. The researchers hope to link together a few hundred of their silicon neurons to form what will in effect be an artificial brain.
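The behaviour such a device mimics can be illustrated with the textbook leaky integrate-and-fire model of a nerve cell: a membrane voltage that leaks away, integrates incoming current, and emits a spike when it crosses a threshold. The sketch below is a simple digital stand-in for the analogue silicon neuron, with all constants chosen purely for illustration.

    # Leaky integrate-and-fire neuron: an illustrative model, not the
    # Mahowald-Douglas device, which emulates real membrane currents
    # in analogue hardware. All constants here are assumptions.
    def simulate(current, steps=200, dt=1.0, tau=20.0, v_thresh=1.0):
        """Return the times at which the neuron spikes, given a
        steady input current."""
        v, spikes = 0.0, []
        for t in range(steps):
            v += dt * (-v / tau + current)   # leak plus input
            if v >= v_thresh:
                spikes.append(t)
                v = 0.0                      # reset after each spike
        return spikes

    print(simulate(0.08))   # regular spikes, roughly every 20 steps

Feed it a stronger current and it spikes more often - the firing rate encodes the strength of the input, much as in real nerve cells.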

Such a brain will look very much like any other silicon chip but, once fully developed, will have brain-like capabilities. It could well be linked into conventional electronic circuits to carry out tasks, such as accurate pattern recognition, that are difficult to achieve with conventional chips - a development so astonishing that scientists can only speculate on further applications.

But will all this lead to the development of real intelligence? Turing thought the question "Can a machine think?" meaningless and believed that, by the end of the century, developments would produce a climate in which it was irrelevant. Computers are now part of our mental furniture. With eight years to go before the end of the century and much promising research in progress, sceptics might think it advisable to wait and see.

Pallab Ghosh is a science and technology writer.
