The Plausibility Of Life

Jan 13, 2006 11:18 GMT

A new book by two biologists addresses a 150-year-old mystery surrounding the mechanism of Darwinian evolution. Darwin stated that life evolves via a two-step process: the first step is variation, the second is natural selection. The second step has long been well understood, but controversy has always surrounded the concept of variation.

How does variation occur? One idea, supported by most scientists, is that it occurs by random mutations; thus evolution does not have a predetermined direction. Another idea, supported by many religious people, is that there is an external Intelligent Designer (God) who decides exactly which variations should occur - i.e. the mutations are not random, but designed to serve a certain divine cosmic purpose.

For obvious reasons, scientists are not eager to embrace the Intelligent Design scenario, which would introduce an external and totally arbitrary element into the theory. However, the dilemma of random mutations has been perceived as increasingly acute in recent years. Alexey V. Melkikh writes in the journal Entropy: "The problem is that the Darwin mechanism of the evolution (a random process) cannot explain the known rate of the species evolution. In accordance with the very first estimates, the total number of possible combinations of nucleotides in the DNA is about 4^(2 * 10^9) (because four types of nucleotides are available, while the number of nucleotides in the DNA of higher organisms is about 2*10^9). This figure is much larger than the number of all organisms, which have ever lived on Earth. Therefore, the evolution was not random. [...] From the viewpoint of the theory of random evolution, all genes are equal (no more or less important genes may exist), because all of them appeared by random mutations. In this case, an organism cannot know beforehand which genes it will need in the distant future. Thus, either a molecular or some other machine exists deciding on further evolution. However, such a machine will require certain reference samples for its operation establishing what is and what is not important for an organism. Or this machine [or Intelligent Designer] does not exist and, then, it is impossible to make a decision. There is no criterion to confirm that a set of nucleotides is the best one in a given situation."
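The sheer scale of the number quoted above is easy to verify with a few lines of arithmetic. A minimal sketch (the 10^40 figure for organisms that have ever lived is an illustrative upper-bound assumption for comparison, not a number from the article):

```python
import math

# Estimates from the quoted passage:
genome_length = 2 * 10**9   # nucleotides in the DNA of a higher organism
alphabet_size = 4           # nucleotide types: A, C, G, T

# 4^(2*10^9) is far too large to compute directly, so work with its
# base-10 logarithm instead.
log10_combinations = genome_length * math.log10(alphabet_size)
print(f"possible genomes ~ 10^{log10_combinations:.4g}")

# Illustrative assumption: a very generous bound on the number of
# organisms that have ever lived on Earth.
log10_organisms = 40
print(f"gap in orders of magnitude: {log10_combinations - log10_organisms:.4g}")
```

The logarithm of the combination count comes out near 1.2 billion, i.e. a number with over a billion digits, which makes the comparison with any count of actual organisms vivid regardless of the exact bound assumed.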

A solution was given in the late 1970s by Victor Turchin, a mathematician working in the field of cybernetics, in the form of the theory of meta-system transitions. Since then, this theory has become one of the most important parts of the Principia Cybernetica Project. However, Turchin's solution was strictly theoretical and abstract.

Two biologists, Marc Kirschner and John Gerhart, have now arrived at the same solution, apparently unaware of Turchin's insight. The difference is that they offer more than an abstract idea. Their argument is presented in a book called The Plausibility of Life: Resolving Darwin's Dilemma.

The bottom line of the argument is that life is organized in such a way that potentially useful variations are thrown up in every generation, and that this facility for variation has itself evolved. This is why, for example, life on Earth existed only in single-celled forms for billions of years before exploding into the variety around us today.

The authors explain the core of the theory in an American Scientist interview: "About half a century ago, we learned that heritable variation does not occur without mutation. Any place in the genome can suffer mutation, which is a change of the local DNA sequence. It appears to strike at random, and rarely. Our theory of 'facilitated variation' is meant to explain how rare and random mutation can lead to exquisite changes of form and function. [...]

The components and genes are largely the same in all animals. Almost every exquisite innovation that one examines in animals, such as an eye, hand or beak, is developed and operated by various of these conserved core processes and components. This is a profound realization for the question of variation, because it says that the different and seemingly novel features of animals are made and run by various of the same core processes, just used in different combinations, at different times and places in the animal, and used to different extents of their output. Variation is not as hard to get as one might initially think. A Lego analogy is applicable: The same Lego parts can be stuck together to give a model of the Eiffel Tower or of a soccer ball.

If the core processes remain the same, what changes in evolution? We suggest that it is the regulation of these processes. Regulatory components determine the combinations and amounts of core processes to be used in all the special traits of the animal. Whereas components of the core processes do not change in evolution, regulatory components do, and they are the targets of random mutational change. Genes for regulatory components comprise a minority of the genome (under a quarter, as a rough estimate), fewer than genes for core processes, but still a lot of genes and a lot of regulatory DNA. The thrust of our argument is that rather few mutational changes, affecting regulatory components, are needed to generate complex innovation."
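The combinatorial point in the quote above can be sketched as a toy model. All names and numbers here are illustrative inventions, not biology from the book: a fixed set of "core processes" never changes, while small edits to a "regulatory program" recombine them into different traits.

```python
# Toy model of "facilitated variation": conserved core processes,
# recombined by regulation. Names are purely illustrative.

core_processes = {
    "adhesion":  lambda level: f"adhesion x{level}",
    "signaling": lambda level: f"signaling x{level}",
    "growth":    lambda level: f"growth x{level}",
}

def develop(regulation):
    """A 'trait' is just which core processes run, and how much.

    regulation: list of (process_name, level) pairs - the only part
    that varies between traits or between generations.
    """
    return [core_processes[name](level) for name, level in regulation]

# Two different traits built from the same unchanged components:
eye_program  = [("signaling", 3), ("adhesion", 1)]
limb_program = [("growth", 2), ("adhesion", 2), ("signaling", 1)]
print(develop(eye_program))
print(develop(limb_program))

# A "mutation" touches only the regulatory program, never the core parts:
mutated_limb = [("growth", 4), ("adhesion", 2), ("signaling", 1)]
print(develop(mutated_limb))
```

The design choice mirrors the Lego analogy: the dictionary of processes plays the role of the conserved parts, and all the variety lives in the short, easily mutated lists that combine them.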

This idea, which the authors dub "facilitated variation", carries a bonus. Instead of making it hard to understand how complex structures like the eye or wing could have evolved, facilitated variation works precisely because the organisms on which evolution operates have great potential for variation. As the authors point out, this pulls the rug from under the argument for "intelligent design". The secret lies "in understanding the organism on its own terms", which are "nothing like a brass watch or a divine creation".

The authors also managed to explain the major transitions that occurred in evolution, i.e. to answer the question "How were the Lego blocks invented?"

"We envision four episodes, each separated from the next by a long interval during which life-forms diversified based on the varied use of those recently acquired processes, driven by regulatory change. First, as early bacteria-like cells evolved, the processes arose for synthetic and degradative (energy-producing) metabolism, for DNA synthesis and for gene expression, including protein synthesis. [...]

The second episode occurred roughly two billion years ago, as the first eukaryotic cells evolved, perhaps coincident with the initial accumulation of oxygen in the atmosphere. Eukaryotic cells are much more complicated in their organization and coordination of activities. Processes evolved for arranging components at different places and for moving them from place to place. Genomes got larger, a complex cell cycle arose culminating in mitosis and cell division, and the first sexual reproduction took place with meiosis and the fusion of two cells. [...]

A third episode occurred at the time of the earliest multicellular animals, perhaps one billion years ago. These processes of multicellularity involve the means of cell-to-cell communication, of cells adhering to each other and to a blanket of materials that cells deposit around themselves, and specialized junctions connecting cells. The means evolved for developing a multicellular animal from a single-celled egg. Specialized cell types evolved, such as nerve and muscle. [...]

A final episode, discussed in the book, occurred just before the Cambrian period and involved the innovation of the body plans of animals. This innovation compartmentalized the embryo and allowed a large increase in the complexity of development: namely, the independent use of different combinations and amounts of core processes in each of the domains of the developing animal's body plan, to give the many anatomical and physiological innovations of the Cambrian period to the present", the authors explain.
