Today's computers use pulses of electricity and flipping magnets to manipulate and store data. But information can be processed in many other, weirder, ways…
1. Optical computing
There's nothing weird about encoding data in light – global communications depend on optical fibre. But using light signals to actually process data and carry out computations is still not practical.
Optical computers are a worthwhile goal because using light could increase a computer's speed and the quantity of data it can handle. But trapping, storing and manipulating light is difficult.
Research by people like Paul Braun at the University of Illinois, Urbana-Champaign, US, is bringing us closer to this goal. He has created 3D optical waveguides out of photonic crystals that should make it possible to trap light, slow it down and bend it around sharp corners without fear of it escaping.
Meanwhile Mikhail Lukin at Harvard University has developed what is essentially an optical version of the transistor that underlies all today's computing power. Lukin and colleagues have created a way to make a single photon from one light signal switch another light signal on and off.
2. Quantum computing
If you want to tear up all the rules of classical computing, look no further than quantum computers. Instead of using electronic bits of information that exist in either 1 or 0 states, they use quantum mechanical effects to create qubits that can be in both states at once.
Calculations show that this ability allows many parallel computations to be carried out. As the number of qubits in a quantum computer increases, the amount of data it can process increases exponentially.
That would make possible things that are unfeasible with today's computers – such as rapidly factoring extremely large numbers to crack cryptographic keys.
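To get a feel for that scaling, note that even writing down the state of n qubits on a classical machine takes 2^n numbers. The toy sketch below (an illustration of the bookkeeping only, not a real quantum algorithm) puts each qubit into an equal superposition and counts the amplitudes that have to be tracked; the count doubles with every qubit added.

```python
import numpy as np

# Toy illustration of why qubit counts matter: describing n qubits
# classically takes 2**n complex amplitudes, so the description doubles
# in size with every qubit added. (A sketch of the bookkeeping only,
# not a real quantum algorithm.)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)        # Hadamard gate
plus = H @ np.array([1, 0], dtype=complex)          # one qubit in superposition

def superposition_of(n_qubits):
    """Tensor together n qubits, each in an equal mix of 0 and 1."""
    state = np.array([1], dtype=complex)
    for _ in range(n_qubits):
        state = np.kron(state, plus)
    return state

for n in (1, 2, 4, 8, 16, 20):
    print(f"{n:2d} qubits -> {superposition_of(n).size:9,d} amplitudes to track")
```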
However, the quantum computers built so far have had only very small numbers of qubits, made using quantum dots, nuclear magnetic resonance, metal ions and, more recently, entangled pairs of photons.
3. DNA computing
DNA may be the perfect material for carrying out computations. In a sense that is precisely what it evolved to do: DNA processes data and runs programs stored in sequences of genomic base pairs, as well as coordinating proteins that process information themselves to keep organisms alive.
The first person to co-opt these processes for computational problems was Leonard Adleman at the University of Southern California. In 1994, he used DNA to solve a well-known mathematical problem called the 7-point Hamiltonian Path problem.
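To see what Adleman's test tubes were actually computing, here is what the same kind of problem looks like as a brute-force search on a conventional machine. The 7-node graph below is made up for illustration; it is not Adleman's instance, but the question is the same: is there a route that visits every node exactly once along the allowed edges?

```python
from itertools import permutations

# Illustrative brute force for a Hamiltonian path: find an ordering that
# visits every node exactly once, following the directed edges. The graph
# is invented for illustration; it is not Adleman's actual instance.
EDGES = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6),
         (0, 3), (2, 5), (1, 4)}
NODES = range(7)

def hamiltonian_paths(nodes, edges, start=0, end=6):
    for perm in permutations(nodes):
        if perm[0] == start and perm[-1] == end and \
           all(pair in edges for pair in zip(perm, perm[1:])):
            yield perm

print(next(hamiltonian_paths(NODES, EDGES)))   # e.g. (0, 1, 2, 3, 4, 5, 6)
```

Checking all 7! orderings is trivial here, but the number of candidates grows factorially with the number of nodes, which is exactly why a massively parallel chemical search looked attractive.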
Since then, DNA has also been used to create logic gates and an unbeatable tic-tac-toe opponent.
The basic principle is to use sequences of DNA to recognise shorter "input" strands, and to produce different "output" sequences. The results can then be read, for example, through the activation of fluorescent proteins.
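As a loose software analogy (not a model of any particular lab design), the sketch below treats a gate as a pair of recognition sequences that releases its "output", here standing in for a fluorescent readout, only when both matching input strands are present in the mixture.

```python
# Loose software analogy for DNA logic (not a model of any particular
# lab design): an AND gate "fires" only when the complements of both of
# its recognition sequences are present among the input strands.
PAIRING = str.maketrans("ACGT", "TGCA")

def reverse_complement(strand):
    return strand.translate(PAIRING)[::-1]

def and_gate(strands, site_a="ATTGC", site_b="GGACT"):
    present = set(strands)
    if reverse_complement(site_a) in present and reverse_complement(site_b) in present:
        return "output strand released (read out as fluorescence)"
    return "gate stays closed"

soup = ["GCAAT", "AGTCC", "TTTTT"]   # the first two bind the gate's sites
print(and_gate(soup))
```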
Recently DNA-computing enthusiasts have become interested in having their creations go to work inside biological systems like the human body. It makes sense, because that's where they fit in best – and where conventional computers fit in least.
4. Reversible computing
Some people think we should be recycling our bits as well as our trash.
Hardware companies have long tried to reduce the power consumption of computers. One unusual way to do this is by engineering chips that are "reversible".
Normally every computational operation that involves losing a bit of information also discards the energy used to represent it. Reversible computing aims to recover and reuse this energy.
One way to do this, which is being developed by Michael Frank at the University of Florida, US, involves making versions of logic gates that can run in reverse.
Every computing operation involves feeding inputs into logic gates, which produce output signals. Instead of discarding the energy of those signals, Frank's gates run in reverse after every operation. That returns the energy of the output signal to the start of the circuit where it is used to carry a new input signal.
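Reversibility here just means that no information is thrown away: every output pattern corresponds to exactly one input pattern, so the bits, and in principle the energy representing them, can always be recovered. The Toffoli gate is the textbook reversible gate; the toy check below (a sketch of the idea, not Frank's circuitry) confirms that running it twice restores the original bits, something an ordinary AND gate cannot do because three different inputs collapse onto the same output.

```python
from itertools import product

# The Toffoli (controlled-controlled-NOT) gate flips bit c only when a and
# b are both 1. Each input pattern maps to a unique output pattern, so the
# gate is its own inverse and no information is discarded. (A sketch of
# the idea of reversibility, not of Frank's actual circuitry.)
def toffoli(a, b, c):
    return a, b, c ^ (a & b)

for bits in product((0, 1), repeat=3):
    assert toffoli(*toffoli(*bits)) == bits      # running it twice undoes it

# An ordinary AND gate is irreversible: three inputs give the same output 0,
# so seeing a 0 tells you nothing about which bits went in.
print({(a, b): a & b for a, b in product((0, 1), repeat=2)})
```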
It may sound odd but, according to Frank, as computing power improves it won't be long before chips' wastefulness becomes a major limit on their performance.
5. Billiard ball computing
Computing today involves chain reactions of electrons passing from molecule to molecule inside a circuit. So it makes sense to try and harness other kinds of chain reaction for computing – even dominoes or marbles.
Logic gates have been made by carefully arranging dominoes or chutes for marbles to roll down (video).
Basic computing circuits like half-adders can also be made.
But making something as powerful as a microprocessor this way would require acres of space – unless your balls or dominoes are very small.
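Whatever the medium, dominoes, marbles or colliding atoms, the logic being reproduced is the same. A half-adder, for instance, is just an XOR gate for the sum bit alongside an AND gate for the carry bit, as this minimal sketch shows:

```python
# The logic a mechanical half-adder has to reproduce, whatever the medium:
# the sum bit is the XOR of the inputs, the carry bit is their AND.
def half_adder(a, b):
    return a ^ b, a & b          # (sum bit, carry bit)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```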
Researchers at IBM have experimented with logic circuits that use cascades of atoms bouncing off each other like billiard balls to pass information along their length.
Such gates can only be used once, but could be significantly smaller than even the tiniest existing transistors.
6. Neuronal computing
Why start from scratch when you can borrow already successful ideas? Some researchers hope to get ahead by copying nature's very own computers.
Ferdinando Mussa-Ivaldi of Northwestern University in Chicago has shown how a few brain cells from a lamprey, a primitive eel-like vertebrate, can be used to control a robot.
Output from light sensors on the robot was passed to the neurons, and their responses used to control the robot's movement. The brain cells normally used by the lamprey to orientate itself proved capable of making the robot follow a light source.
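In software terms this is a closed sense-and-act loop: light readings go in, the tissue's firing comes out, and the firing drives the wheels. The sketch below is only a caricature (the "neuron" is a made-up stand-in, not a model of lamprey tissue), but it shows how cross-wiring a simple response to light produces light-following behaviour.

```python
# Caricature of a neuron-in-the-loop robot. The "neuron" here is a made-up
# stand-in for the lamprey tissue, not a model of it: each wheel is driven
# by the light level on the opposite side, so the robot turns towards the
# brighter source.
def fake_neuron(stimulus):
    return max(0.0, min(1.0, stimulus))          # crude firing-rate response

def control_step(light_left, light_right):
    left_wheel = fake_neuron(light_right)        # cross-wired
    right_wheel = fake_neuron(light_left)
    return left_wheel, right_wheel

# Brighter light on the left drives the right wheel harder, steering left.
print(control_step(light_left=0.9, light_right=0.2))   # -> (0.2, 0.9)
```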
It's not the first time a critter's brain has been co-opted in this way.
Claire Rind, a neurobiologist at the University of Newcastle, UK, used recordings of the neuronal activity of locusts watching manoeuvring "TIE-fighter" spacecraft from the movie Star Wars to develop extremely accurate obstacle avoidance systems.
US defence research agency DARPA has recently created remote controlled cyborg moths using electrodes in their brains (see a video of the cyborg moths).
7. Magnetic (NMR) computing
Every glass of water contains a computer, if you just know how to operate it.
Susan Stepney and colleagues at the University of York, UK, use strong magnetic fields (nuclear magnetic resonance) to control and observe the way molecules interact. The method can represent information in three dimensions and exploit the molecules' natural dynamics.
If successful, it may prove possible to model something as complex as our atmosphere using just a thimbleful of water.
So far, however, the group have only carried out a proof of principle by, somewhat ironically, simulating the water-based computer on a classical computer.
8. Glooper computer
One of the weirdest computers ever built forsakes traditional hardware in favour of "gloopware". Andrew Adamatzky at the University of the West of England, UK, can make interfering waves of propagating ions in a chemical goo behave like logic gates, the building blocks of computers.
The waves are produced by a pulsing cyclic chemical reaction called the Belousov-Zhabotinsky reaction.
Adamatzky has shown that his chemical logic gates can be used to make a robotic hand stir the mixture in which they exist. As the robot's fingers stimulate the chemicals, further reactions are triggered that control the hand.
The result is a sort of robotic existential paradox – did the chemical brain make the robot's hand move, or did the hand tell the brain what to think? Eventually Adamatzky aims to couple these chemical computers to an electroactive gel-based "skin" to create a complete "blob-bot".
9. Mouldy computers
Even a primitive organism like slime mould can be used to solve problems that are tricky for classical computers.
Toshiyuki Nakagaki at the Institute of Physical and Chemical Research in Nagoya, Japan, has shown that slime mould can work out the shortest route through a maze.
In his experiments, the masses of independent amoeba-like cells that act as a single organism would initially spread out to explore all the possible paths of a maze.
But when one train of cells found the shortest path to some food hidden at the maze's exit, the rest of the mass stopped exploring. The slime mould then withdrew from the dead-end routes and followed the direct path to the food.
This is interesting for computer scientists because maze solving is similar to the travelling salesman problem, which asks for the shortest route between a number of points in space. The problem quickly scales in complexity as more points are added, making it a tough problem for classical computers.
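For comparison, the classical counterpart of the mould's maze feat is an ordinary shortest-path search, which a conventional computer handles easily at this scale; it is the combinatorial explosion of problems like the travelling salesman that really hurts. A minimal breadth-first search over a toy grid maze:

```python
from collections import deque

# Minimal breadth-first search over a toy grid maze ('#' marks a wall):
# the classical counterpart of the slime mould's trick, finding the
# shortest route from S to F.
MAZE = ["S.#....",
        ".#.##.#",
        ".#....#",
        "...##.F"]

def shortest_path(maze):
    cells = {(r, c): ch for r, row in enumerate(maze) for c, ch in enumerate(row)}
    start = next(p for p, ch in cells.items() if ch == "S")
    goal = next(p for p, ch in cells.items() if ch == "F")
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        (r, c), steps = queue.popleft()
        if (r, c) == goal:
            return steps
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if cells.get(nxt, "#") != "#" and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, steps + 1))

print(shortest_path(MAZE), "steps from S to F")   # -> 11 steps
```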
10. Water wave computing
Perhaps the most unlikely place to see computing power is in the ripples in a tank of water.
Using a ripple tank and an overhead camera, Chrisantha Fernando and Sampsa Sojakka at the University of Sussex used wave patterns to make a type of logic gate called an "exclusive OR gate", or XOR gate.
Perceptrons, a type of artificial neural network, can mimic some kinds of logic gate, but not an XOR gate. Only by encoding the gate's inputs as interfering ripples in the tank, and letting the perceptron read the resulting wave patterns, was it possible for it to learn how an XOR gate works.
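The point is easy to see in code. In the sketch below (made-up features, not the Sussex setup) a bare perceptron fails on XOR, but once the two inputs are combined into an extra "interference" feature, here simply their product, playing the role the colliding ripples played in the tank, a linear readout learns the gate easily.

```python
import numpy as np

# A bare perceptron cannot learn XOR from the raw inputs, but it can once
# they are expanded with an "interference" feature (here just the product
# a*b, standing in for the colliding ripples). Made-up features, not a
# model of the Sussex experiment.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])                       # XOR truth table

def train_perceptron(features, targets, epochs=50):
    w = np.zeros(features.shape[1] + 1)          # weights plus bias
    for _ in range(epochs):
        for x, t in zip(features, targets):
            pred = int(w[:-1] @ x + w[-1] > 0)
            w[:-1] += (t - pred) * x             # classic perceptron update
            w[-1] += t - pred
    preds = (features @ w[:-1] + w[-1] > 0).astype(int)
    return int((preds == targets).sum())

print("raw inputs:       ", train_perceptron(X, y), "of 4 correct")   # never reaches 4
ripple = np.column_stack([X, X[:, 0] * X[:, 1]])                      # add the a*b feature
print("with interference:", train_perceptron(ripple, y), "of 4 correct")  # 4 of 4
```

In the experiment, the water itself supplied that extra transformation: the interacting ripples recast a problem the bare perceptron could not learn into one it could.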