Being analogue - when digital is not enough
© Copyright 1994-2002, Rishab Aiyer Ghosh. All rights reserved.
Electric Dreams #45
23/January/1995

People aren't digital. Despite enormous advances in computing technology, analogue intelligence will remain superior for a long time. In almost any human action, from the obviously intellectual one of writing weekly columns on information society to the more mechanical one of associating a pattern of light and dark with meaning, analogue bio-computing does a better job than digital silicon. No wonder many people are trying to make computers more analogue, for when digital is not enough.

The human eye is a marvellous piece of technology. Made of cheap, easily available components, this naturally evolved bag of water and hydrocarbon jelly provides greater sensitivity (down to a single photon) and a wider dynamic range (from a dark room to direct sunlight) than any single artificial optical device. Much of the eye's success is due to its neural image-processing circuitry, which adapts to changing inputs, condensing a flood of raw data into an ordered stream of visual information over the optic nerve - a process very unlike binary digital computing.

That computing is digital owes much to Boolean algebra, the logic of true and false developed in the 19th century. Boolean algebra shows how you can calculate anything (within some epistemological limits) with a system of just two discrete values, which suited early computers whose circuits could not reliably distinguish between many voltage levels. Though modern electronics is far more sophisticated, we stick to binary digital computing for its sheer conceptual simplicity. In doing so, we miss out on a lot.
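
To see how just two values suffice, consider the one-bit full adder, a standard construction built from nothing but Boolean operations; chained together, such adders add numbers of any size. The Python sketch below is my own illustration, not anything from the column:

    def full_adder(a, b, carry_in):
        """Add two bits plus a carry using only Boolean operations."""
        sum_bit = a ^ b ^ carry_in                   # XOR: the odd-parity bit
        carry_out = (a & b) | (carry_in & (a ^ b))   # carry when two or more inputs are 1
        return sum_bit, carry_out

    def add_numbers(x_bits, y_bits):
        """Ripple-carry addition over equal-length bit lists, least significant bit first."""
        carry, result = 0, []
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            result.append(s)
        return result + [carry]

    # 3 (bits 1,1,0) + 5 (bits 1,0,1) = 8 (bits 0,0,0,1)
    print(add_numbers([1, 1, 0], [1, 0, 1]))  # [0, 0, 0, 1]

Everything a digital computer does, from spreadsheets to video, ultimately reduces to circuits of this kind.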

Nature created the eye without having read George Boole, relying instead on the basic laws of physics. Having had millions of years to develop, this analogue computer is capable of far more than anything digital - the human eye, for example, performs more computation than all the supercomputers in the world. One reason is that digital computers depend, at the lowest level, on constraining the laws of physics to the far more limited laws of Boolean logic. So while one of an optical neuron's most basic operations is exponentiating and averaging multiple inputs directly in hardware, digital computers are built upon the comparison of two true-or-false values - exponentiation is way up the ladder of complexity.
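
As a toy illustration of that contrast - not a model of any real retinal circuit, just the primitive named above - here is what analogue hardware does in one physical step, and a digital machine only through long chains of true-or-false comparisons:

    import math

    def analogue_primitive(inputs, gain=1.0):
        """Exponentiate each input and average the results - roughly what
        an analogue circuit computes in a single physical step."""
        return sum(math.exp(gain * v) for v in inputs) / len(inputs)

    # A digital computer reaches the same number only by expanding exp()
    # into many multiplications and additions, each of which is itself
    # built from thousands of Boolean logic gates.
    print(analogue_primitive([0.1, 0.4, 0.2]))

The gain parameter is my own placeholder; real neural circuits have response curves far richer than this.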

While fuzzy logic and neural networks try to simulate some of the effects of analogue computing digitally, several efforts aim to build artificial analogue computers in hardware. The best known is that of Carver Mead and his colleagues at the California Institute of Technology, who design analogue VLSI chips such as the silicon retina.
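
Fuzzy logic gives a flavour of the digital simulation: instead of hard true and false, it computes with graded truth values between 0 and 1. In this minimal Python sketch the membership function and its thresholds are invented purely for illustration:

    def warm(temp_c):
        """Fuzzy membership: how 'warm' a temperature is, from 0.0 to 1.0."""
        if temp_c <= 10:
            return 0.0
        if temp_c >= 30:
            return 1.0
        return (temp_c - 10) / 20  # graded truth in between

    def fuzzy_and(a, b):
        return min(a, b)  # a common fuzzy conjunction

    def fuzzy_or(a, b):
        return max(a, b)  # a common fuzzy disjunction

    print(warm(22))                       # 0.6 - partly warm, not simply true or false
    print(fuzzy_and(warm(22), warm(15)))  # 0.25

Note that this still runs on binary hardware; it mimics gradation in software, where analogue VLSI gets it from physics directly.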

As conventional chip manufacturing runs into physical limits on circuit density and signal speed, alternatives to semiconductor-based binary digital computers are emerging. Apart from analogue VLSI, these include bio-chips, based on materials found in living creatures; optical computers that live on pure light; and quantum computers, which depend on the laws of quantum mechanics to perform, in theory, tasks that ordinary computers cannot.

These technologies differ in many respects. But they are alike in one - unlike today's digital computers, they aim to harness at the lowest level some of the computational capability inherent in the basic, analogue laws of physics, creating machines that, like people, benefit from the potential intelligence that fills our universe.



