Another oldie from my withdrawal from MySpace…
Seen any of the Terminator movies lately? If so, you may want to wait a while before reading this, but this is cool. Trust me. Even if you spell it ‘evilution’, you are going to think this is cool. Really cool. Alan Bellows did a writeup on hardware evolution, a relatively new area of study with direct parallels to biological evolution. The main figure in the article is Dr. Adrian Thompson at the University of Sussex. The experiment that so wowed me used a field-programmable gate array (FPGA) to distinguish between two tones. The FPGA was small, only 10 x 10 cells, and Thompson cut off its access to the system clock (so that the evolving circuit could not simply time the incoming waveforms and turn into a trivial frequency counter). Dr. Thompson loaded in random sets of binary configuration data (the initial DNA, if you will) and scored each set of digital DNA on its ability to tell the tones apart. The programs best able to differentiate between the tones were kept for the next generation, with a bit of random mutation thrown in for good measure.
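That loop of evaluate, select the fittest, and mutate is a standard genetic algorithm. Here is a minimal sketch in Python of the same idea. Since we obviously can't measure a real FPGA's output here, the fitness function is a stand-in toy (score a bitstring by how many bits match an all-ones target); the population size, genome length, and mutation rate are all illustrative assumptions, not Thompson's actual parameters.

```python
import random

random.seed(42)  # for a repeatable run

GENOME_LEN = 100     # stand-in for the FPGA's configuration bitstring
POP_SIZE = 50        # illustrative population size
MUTATION_RATE = 0.01 # chance of flipping each bit

def fitness(genome):
    # Toy stand-in: fraction of bits set to 1. Thompson's real fitness
    # came from measuring how well the chip's output separated the tones.
    return sum(genome) / GENOME_LEN

def mutate(genome):
    # Flip each bit with small probability: the "random mutation".
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

# Generation 0: completely random "digital DNA".
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

generation = 0
while max(fitness(g) for g in population) < 1.0:
    generation += 1
    # Keep the fittest half, then refill with mutated copies of survivors.
    survivors = sorted(population, key=fitness, reverse=True)[:POP_SIZE // 2]
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(POP_SIZE - len(survivors))]

print(f"perfect genome found at generation {generation}")
```

Nothing in the loop "knows" what a good solution looks like; it only ranks candidates and copies the winners with noise, which is exactly why the circuits it produces can be so alien.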
“For the first hundred generations or so, there were few indications that the circuit-spawn were any improvement over their random-blob ancestors. But soon the chip began to show some encouraging twitches. By generation #220 the FPGA was essentially mimicking the input it received, a reaction which was a far cry from the desired result but evidence of progress nonetheless. The chip’s performance improved in minuscule increments as the non-stop electronic orgy produced a parade of increasingly competent offspring. Around generation #650, the chip had developed some sensitivity to the 1kHz waveform, and by generation #1,400 its success rate in identifying either tone had increased to more than 50%.
Finally, after just over 4,000 generations, [the] test system settled upon the best program. When Dr. Thompson played the 1kHz tone, the microchip unfailingly reacted by decreasing its power output to zero volts. When he played the 10kHz tone, the output jumped up to five volts. He pushed the chip even farther by requiring it to react to vocal “stop” and “go” commands, a task it met with a few hundred more generations of evolution. As predicted, the principle of natural selection could successfully produce specialized circuits using a fraction of the resources a human would have required. And no one had the foggiest notion how it worked.”
And that is what is so cool about this. Until the program was reverse-engineered, how it did what it did was a complete unknown, determined entirely by the selection process. A mere 37 of its logic gates were used, compared to hundreds of thousands in a sound processor designed specifically for the task. And even though so few gates were used, they were organized in a complex and completely unexpected way. “The plucky chip was utilizing only thirty-seven of its one hundred logic gates, and most of them were arranged in a curious collection of feedback loops. Five individual logic cells were functionally disconnected from the rest– with no pathways that would allow them to influence the output– yet when the researcher disabled any one of them the chip lost its ability to discriminate the tones. Furthermore, the final program did not work reliably when it was loaded onto other FPGAs of the same type.” The seemingly disconnected logic cells apparently influenced the rest of the circuit through electromagnetic coupling (magnetic flux), which the evolved program exploited in place of the system clock it had been denied.
“These evolutionary computer systems may almost appear to demonstrate a kind of sentience as they dispense graceful solutions to complex problems. But this apparent intelligence is an illusion caused by the fact that the overwhelming majority of design variations tested by the system– most of them appallingly unfit for the task– are never revealed.”
This concept is key to understanding why proponents of intelligent design (IDiots) see conscious design everywhere. Blind selection, which also accurately describes natural selection, eliminated the ‘designs’ that didn’t work, so we only see the ones that do! Everything around us is a surviving successful design. It’s no wonder engineers see god everywhere. But it’s all an illusion: complexity arising from a simple set of rules. What works moves on; what doesn’t is discarded. Random mutations keep the process from settling into falsely optimized configurations. Evolution is such an elegant process!
From Pharyngula, PZ Myers writes:
“That looks a lot like what we see in developmental networks in living organisms — unpredictable results when pieces are “disconnected”, or mutated, lots and lots of odd feedback loops everywhere, and sensitivity to specific conditions (although we also see selection for fidelity from generation to generation, more so than occurred in this exercise, I think). This is exactly what evolution does, producing a functional complexity from random input.”
I think there are limits to the analogy with biological evolution, but the parallels are immediately obvious. There will always be those who say that Thompson was the designer because he set up the initial conditions. But if the starting conditions are state functions (that is, the path taken to reach them makes no difference), this is irrelevant. Nor does it matter that he chose the selection criteria, so long as the selection itself was blind, exactly the way Nature operates. One thing this experiment makes abundantly clear is that, by following the rules set out by natural selection, apparent complexity can become manifest in a relatively short amount of time.
Science is so cool.