A recent post at telegraph.co.uk purported to offer updated "proof" for the theory of evolution. You know, the "utterly proven" theory about the origin of the species that millions of scientists have dedicated their lives to re-proving, much the way they do other "Laws" of science such as gravity, entropy, and displacement...er...oh, wait.
Be sure to read his article before continuing.
As proof, the author offered several contrived examples, from modern technology to insect and bacterial behavior, to show how the evidence is mounting in such a way that, soon, even the "most hardened skeptics" will become convinced of the truthfulness of evolution. Waiting...waiting...hm...still no.
The first absurd claim comes when the author writes that, "Computers have long been used to model biological evolution." Since I am a computer programmer, let me offer my opinion on this. The computer that could truly model even the splitting of a single cell into two cells has not yet been invented. No computer existing today has the memory or processing power it would take to even begin the insanely complex task of modeling biological reproduction, the foundation for "evolution." Though you may read about today's supercomputers being used to simulate "protein folding," keep in mind that protein folding is to single-cell division what memorizing the ten decimal digits is to quantum physics and doctorate-level calculus. The two can hardly be compared, even though the first is a building block of the second.

Secondly, all digital systems that simulate physical systems must do so at pre-determined "resolutions." In the world of 3D graphics, for instance, as a car speeds toward a wall, the picture of the car and wall may be drawn 10, 100, or 1,000 times, depending on how fast the "real-time" engine can satisfy the geometry engine's demand to keep the car in the right place for the amount of time elapsed. If the computer can re-draw the complex vertices and shaders that make up one frame only twice every second, then a very choppy presentation will show the car hitting the wall in 3 seconds. If the computer can draw 10 frames per second, the display will show a much smoother rendition of the car hitting the wall, but still in just 3 seconds. If the display can render 32 or more "fields" (half-frames, every other "scan line") per second, then it will actually fool the human eye into believing it is watching a "full-motion" rendering of the car hitting the wall. However, this isn't actually true. In fact, in those 3 seconds, only about 100 fields could be rendered at 32 fields per second.
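The fixed-resolution point above can be sketched in a few lines of code. This is purely illustrative; the function name, frame rates, and distances are my own invention, not anything from the article:

```python
def render_frames(duration_s, fps, distance_m=30.0):
    """Simulate a fixed-timestep render loop: the car covers
    distance_m in duration_s, and we record its position once
    per frame. Everything between frames simply never exists
    as far as the simulation is concerned."""
    n_frames = int(duration_s * fps) + 1  # include the frame at t = 0
    return [distance_m * (i / fps) / duration_s for i in range(n_frames)]

# Two frames per second: a choppy 7-frame crash.
choppy = render_frames(3.0, 2)
# Thirty frames per second: smooth to the eye, but the same 3 seconds.
smooth = render_frames(3.0, 30)
```

Both runs end with the car at the wall after 3 seconds; the only difference is how many in-between states were ever computed. That choice of resolution is imposed by the programmer, not discovered by the machine.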
But if you've ever seen high-speed photography of a crash dummy hitting an airbag or a bullet flying from the barrel of a gun, then you know that even at a million frames per second, small changes are reflected in EVERY FRAME! So how many states you choose to measure in a second determines how many states you capture and take into account. In biological and physical systems, these states are the moments in time that you are watching; between these moments, the vast majority of time passes without observation. If you're watching a tortoise crawl through the sand, then 32 frames per second will certainly suffice. You can pretty much guarantee that the tortoise didn't run off and get a burger and fries between frames 17 and 18. If, however, you're watching the blindingly fast reactions between the molecules that make up the proteins of a strand of DNA interacting with an mRNA strand, then a million frames per second doesn't do it justice at all. The only way to model such complexity in our day is to elongate time.
A former manager of mine used to work for Amdahl designing I/O bus architectures. He said they once ran a weeks-long software simulation of one of their chips, yet it covered only 6 seconds' worth of the actual I/O the chip performed once it was finally constructed. Software is much, much slower than hardware. Even more so, software is infinitely slower than the real world. No matter what resolution you choose to measure a physical system at, virtually all time passes between your measurements. Any attempt to circumvent this method of modeling and optimize the output is merely a model of one's assumptions, not of the real world. The more optimized the output, the less of a true model it is and the more a model of one's assumptions it becomes.
All that to say this: if someone today claims to use computers to simulate evolution, you must parse their words. They purport to have modeled unspeakable numbers of creatures over eons of time and arrived at a result in short order. To do that, the "model" would have to be nothing BUT assumptions. Thus, if you write a program to tell you that evolution is true, don't be surprised when it does!
The second absurd claim is that computer and telephony networks exercise "evolution" when passing data through them. Now, every network programmer knows what a "trace route" is. In Windows, the tracert command echoes back the "route" used for data to travel from your computer to some remote computer and back, along with the elapsed time for each hop. Most also know that if you tracert the same host on different days, it will often report a different route than before. If I'm getting data from a computer in India, the data might make 20-30 "hops" from one router to another along the internet backbone. The request for the data goes from my computer to my local router, usually in the same building. From there it "hops" to my internet service provider: Comcast, Earthlink, AOL, or some other provider. After two or three "hops" at Comcast, it goes out to the real internet and finds its way to the host computer via a series of hops that tend to move it geographically closer and closer to its destination. Finally it's received by that company's internet service provider, forwarded to the company's router, and finally to the specific computer that has the data I want. The data then takes the reverse route, usually, but not always, identical to the request route.

But all those "hops" hit routers in between Comcast's routers and those of, say, Bangalore Electronics. Those public routers are "learning" systems in that they are constantly measuring the response times of various other routers and re-routing packets of data around slower or over-burdened routers to keep the total "latency" (response time) low. When routers fail to auto-optimize routes, human administrators can go in and define "static routes" to override the router's artificial intelligence and assert a better plan.
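The "learning" behavior described above can be sketched in a few lines. This is a toy model of my own, not real router firmware: the router forwards toward whichever neighbor currently measures fastest, unless an administrator has pinned a static route.

```python
class Router:
    """Toy model of latency-based next-hop selection with an
    administrative static-route override."""

    def __init__(self):
        self.latency_ms = {}      # neighbor name -> last measured latency
        self.static_route = None  # admin override, if any

    def measure(self, neighbor, ms):
        """Record the latest response time for a neighboring router."""
        self.latency_ms[neighbor] = ms

    def next_hop(self):
        """Prefer the admin's static route; otherwise auto-optimize
        by routing around slower, over-burdened neighbors."""
        if self.static_route is not None:
            return self.static_route
        return min(self.latency_ms, key=self.latency_ms.get)

r = Router()
r.measure("router-A", 40)
r.measure("router-B", 12)
hop1 = r.next_hop()        # router-B, currently the faster path
r.measure("router-B", 90)  # router-B becomes congested
hop2 = r.next_hop()        # traffic re-routes to router-A
r.static_route = "router-B"
hop3 = r.next_hop()        # the human override wins regardless
```

Every behavior here, including the "adaptation," is a rule somebody wrote down in advance, which is exactly the point of this section.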
Breathe! Ok, now what Dr. Chimp is trying to foist on the public is that this auto-optimization, similar in all digital networks, is proof of evolution!!! These systems were designed from the ground up to do exactly what they do, and this is proof of evolution! Yikes!
The third absurd claim is that the behavior of colonies of ants proves evolution! Now admittedly, when he claims it proves evolution, he's talking about the tendency of randomness to produce an improved system. The problem is that the things he attributes to randomness could just as easily be described as "applied design." The scent trails, foraging specialists, carriers, and sudden, mass, hunger-driven movements of colonies that he observes are all programmed responses using intricate instruments that pre-date his experiment entirely. In this clever, misleading way, the bar for "proving evolution" gets set so low that a child cutting across a field to get home proves evolution!!!
The fourth example is no better. That bacteria in clusters suddenly change their behavior in a way that kills their host, and thus ultimately themselves, is by no means the production of a better system. I think he rather lost focus toward the end of his piece.
If this is what passes for proof of evolution amidst the echoes of learned professors in the higher halls of learning these days, then I'm not all that worried about a sudden shift towards the hopeless world view of evolution. And by the way, Steve, your article is misnamed.
P.S. - Keep an eye out for Ben Stein's new movie, Expelled. It is documented proof of the sleight of hand and scholastic intimidation that is commonplace in the scientific community today.