Unnatural Evolution
One of my favorite parts of being in London is sitting in a pub or café reading The Spectator with British accents in the background, as I’m doing right now.
Matt Ridley, who is deeply knowledgeable about the topic, has a really interesting article in the current edition that sees Darwinian evolution everywhere. This is a much more frequently observed theme in British publications than anywhere else. My guess is that this is partially cheering for the home team, but that it is also not random that Darwin was British. The idea of spontaneous order without authority-from-above is a classically British notion.
It is widely accepted that analysis of human society inspired Darwin more than Darwinian biology inspired Social Darwinism, and Ridley provides fascinating evidence from correspondence to demonstrate this. When we come to his analogy between biological evolution and technology development, however, we can see an important difference that Ridley doesn’t explicate in the essay:
Technology is a case in point. Although engineers are under the fond illusion that they design things, nearly all of what they do consists of nudging forward descent with modification. Every technology has traceable ancestry; ‘to create is to recombine’ said the geneticist François Jacob. The first motor car was once described by the historian L.T.C. Rolt as ‘sired by the bicycle out of the horse carriage’. Just like living systems, technologies experience mutation (such as the invention of the spinning jenny), reproduction (the rapid mechanisation of the cotton industry as manufacturers copied each others’ machines), sex (Samuel Crompton’s combination of water frame and jenny to make a ‘mule’), competition (different designs competing in the early cotton mills), extinction (the spinning jenny was obsolete by 1800), and increasing complexity (modern cotton mills are electrified and computerised).
I think that this is accurate but incomplete. Technology does advance through a process of trial and error, broadly conceived; but insight is often captured consciously, rather than merely through competitive survival. Engineers observe differences in performance and, through a combination of inductive and deductive processes, consciously retain advantageous differences for future generations. They also consciously manipulate future generations using insights (or, really, theories about insights) into the relationship between the structure of the object and its performance. This is Lamarck, not Darwin.
To take one of Ridley’s examples, had early technologists literally tried random recombination after recombination of a set of parts (even granting the huge amount of information embedded in our choice of the universe of parts and their methods of physical connection), it is highly unlikely that they would have gotten to the spinning jenny as fast as engineers and inventors actually did. You can get Shakespeare with enough monkeys on typewriters, but I don’t suggest that you put them into a writing contest against Shakespeare. Now, obviously, at some very high level of abstraction one could argue that this process is itself the outcome of evolution, but in operation it is very different from the natural selection we see in nature.
This difference becomes striking when we come to Ridley’s description of the evolution of software:
Software inventors have learnt to recognise the power of trial and error rather than deliberate design. Beginning with ‘genetic algorithms’ in the 1980s, they designed programmes that would experiment with changes in their sequence till they solved the problem set for them. Then gradually the open-source software movement emerged by which users themselves altered programmes and shared their improvements with each other. Linux and Apache are operating systems designed by such democratic methods, but the practice has long spread beyond programmers.
I’ve done and led a lot of software development of various kinds that uses both genetic algorithms (GAs) and open-source software. GAs are a highly specialized technique used to address a very narrow class of optimization problems – those in which we have very little information about the structure of the optimization space, and in which it is therefore more efficient to have a search process that assumes no knowledge of the space. If we do have such knowledge, we use linear programming, quadratic programming, or any of a huge number of alternative optimization methods. These other methods assume a structure to the optimization space that (if our assumptions are correct) allows us to home in on a solution much faster than a GA can. In situations to which those methods are applicable, using a GA to compete with them is like using monkeys and typewriters to compete with Shakespeare.
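To make the "no knowledge of the space" point concrete, here is a minimal GA sketch of my own (not taken from any system mentioned above; all names and parameters are illustrative). Notice that the algorithm never inspects the structure of the problem – it only compares fitness scores, which is exactly why it is the tool of last resort when structured methods like linear programming apply:

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=50, generations=100,
                      mutation_rate=0.02, seed=0):
    """Minimal GA: evolve bitstrings toward higher fitness using only
    fitness comparisons -- no knowledge of the search space's structure."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: keep the fitter of two random individuals.
        def select():
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) >= fitness(b) else b
        next_pop = []
        for _ in range(pop_size):
            p1, p2 = select(), select()
            cut = rng.randrange(1, n_bits)       # one-point crossover ("sex")
            child = p1[:cut] + p2[cut:]
            # Random mutation: flip each bit with small probability.
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy "one-max" problem: fitness is simply the count of 1-bits, so the GA
# must discover the all-ones string purely by variation and selection.
best = genetic_algorithm(fitness=sum)
```

On a toy problem like this the GA converges, but an optimizer that knew the space's structure would reach the answer in a single step – which is the monkeys-versus-Shakespeare point above.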
Open Source, on the other hand, is a great case of Lamarckian, rather than Darwinian, evolution. The various engineers and release committees that create and approve incremental improvements comprise a Lamarckian process. There is a direct analogy to GAs, termed Genetic Programming, in which explicitly Darwinian methods are used to develop software code improvements. While it has been found to be useful for some very specific tasks, such as circuit design, the Open Source movement doesn’t have a lot to worry about from this competition.
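For contrast, here is a stripped-down Genetic Programming sketch (again my own illustration, not a description of any production system): programs are represented as expression trees, and the only variation operator is random subtree mutation with simple truncation selection. Real GP systems also use crossover and far richer primitives; this is just enough to show software being evolved rather than designed:

```python
import operator
import random

# Primitive set: arithmetic operators, one variable, small integer constants.
OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

def rand_tree(rng, depth=3):
    """Grow a random expression tree over x and small constants."""
    if depth == 0 or rng.random() < 0.3:
        return rng.choice(['x', rng.randint(-2, 2)])
    op = rng.choice(list(OPS))
    return (op, rand_tree(rng, depth - 1), rand_tree(rng, depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, int):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def mutate(tree, rng):
    """Replace a randomly chosen subtree with a fresh random one."""
    if rng.random() < 0.3 or not isinstance(tree, tuple):
        return rand_tree(rng, depth=2)
    op, left, right = tree
    if rng.random() < 0.5:
        return (op, mutate(left, rng), right)
    return (op, left, mutate(right, rng))

def error(tree, target, xs):
    return sum(abs(evaluate(tree, x) - target(x)) for x in xs)

def genetic_programming(target, generations=200, pop_size=100, seed=1):
    rng = random.Random(seed)
    xs = range(-5, 6)
    pop = [rand_tree(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: error(t, target, xs))
        survivors = pop[:pop_size // 2]          # truncation selection
        pop = survivors + [mutate(rng.choice(survivors), rng)
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=lambda t: error(t, target, xs))

# Evolve a program approximating x^2 + 1 -- blind variation and selection,
# with no engineer reasoning about why any candidate works.
best = genetic_programming(lambda x: x * x + 1)
```

The contrast with Open Source development is the point: here no one reasons about why a candidate program is better, whereas a maintainer reviewing a patch retains it precisely because of a theory about why it improves the code.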
Thanks for this write-up. It elegantly lays out the differences between a few approaches that I’ve seen incorrectly conflated before. If you’ve encountered discussions on complexity in terms of software/hardware and have any thoughts I’d love to hear them. Such discussions I find tend to be a mix of interesting ideas and BS.
— Greg Sanders · Jan 10, 05:52 PM · #
Greg:
Thanks for the compliment.
I don’t know of any book specifically focused on Complexity in terms of hardware / software. But on the general topic, the essay on sociobiology and evolutionary psychology in Gould’s book “The Richness of Life” (all from memory) goes into this same misplaced comparison and describes what he calls the “Lamarckian juggernaut”.
— Jim Manzi · Jan 10, 07:22 PM · #
Thanks – very interesting.
I think there’s a fundamental difference between the way evolution designs things and the way engineers do.
Evolution is totally opportunistic, and produces designs of baroque splendor. Think of the way the genetic material actually produces proteins – the basic process is that DNA is transcribed to messenger RNA, which is translated to a protein – but then you start finding cases where the product of one gene is an enzyme that edits the messenger RNA of another gene before it gets translated, and so on. Or in the brain, some event happens, calcium floods into a neuron, and a lot of different signaling pathways all get going at once.
Engineers strive for clarity instead – one cause and one effect. A good engineering design has a totally different feel than a biological system.
— peter · Jan 11, 05:35 AM · #
Greg, if you have a little background with this stuff, Complexity, Entropy and the Physics of Information has a great collection of essays.
— JA · Jan 12, 02:21 AM · #
One big difference is that engineers can putter around off line in the lab with new versions that will be, for some time, worse than the current version being mass produced in the factory, but have more long term potential. For example, engineers spent years working on jet engines before they were good enough to power an airplane, much less replace existing propeller-driven planes.
But, with living things, it’s very unlikely that a radical change can be good enough to survive. Thus, legs don’t get replaced by wheels, or whatever.
— Steve Sailer · Jan 12, 10:57 AM · #
Jim and JA: Thanks for the recommendations, I’ll check them out.
— Greg Sanders · Jan 14, 03:42 AM · #