In a century that worshipped at the shrine of science, who would have thought that science itself would finally return to beliefs once discarded as ancient superstition? Experiments with the DNA code--the language of life--have led, near the end of the 21st century, full circle--nearly back to the original Word that created life in the first place.
Our own century, the 21st, has indeed been the Biotech Century, just as Jeremy Rifkin predicted in a book with that title in 1998. In those early years, futurists had a field day predicting the surrealistic scenes to be ushered in by biotechnology. Most were predicting great benefits in medicine and agriculture, to be sure, but also grave threats to human dignity such as cloning, chimeras (human-animal hybrids), cyborgs (human-machine hybrids), genetically engineered subhuman slaves, and superhuman masters.
In his 1997 book Remaking Eden, Lee Silver of Princeton predicted that a century of genetic engineering might even create a new species of human, no longer able to mate with its "gene poor" relations. Humanity seemed poised to take control of its own evolution, a vision passionately promoted by eugenicists from the time of Charles Darwin.
Indeed, the very definition of humanity came into question. What happens to human dignity when test-tube babies are conceived in order to be tissue donors for other family members--a practice already underway at the turn of the millennium? What happens to our definition of human nature when researchers create human-animal hybrids--also underway in 2000? In one case, the nuclei of human cells were extracted and inserted into a pig's egg cells; the hybrids were allowed to grow to 32-cell embryos before being destroyed. Researchers looked forward to using such subhuman creatures for research--even for use as living meat-lockers for growing transplantable organs and tissues.
Back then, some argued that genetic engineering posed no moral questions--that it was merely an extension of the natural methods used by farmers and breeders for centuries with crops and domesticated animals.
Yet, nature always reaches a limit: Once a breeder isolates a particular trait in the gene pool, change stops. Continued shuffling of genes does not create new genes, any more than shuffling a deck of cards creates new cards. The famous 20th-century breeder Luther Burbank said a tendency to stay true to type "keeps all living things within some more or less fixed limitations."
But in the 21st century, science provided the tools to overcome the genetic barrier artificially--and thus to create a Darwinian world. Contrary to the ancient, universal view that life is divided into distinct, discernible natural kinds--that cows are cows, and crows are crows--Darwin had proposed that living things form a continuous chain with ever-shifting boundaries, each species melding into the next on the evolutionary ladder.
Darwin's theory was never actually confirmed in nature. Even in laboratory experiments, scientists were unable to cross Burbank's "more or less fixed limitations." The fossil record shows the same pattern of discontinuous groupings back to the beginning of life. But genetic technology gave scientists the artificial means to create a Darwinian world of fuzzy edges and blurred boundaries.
And when the boundaries of humanity had become plastic and malleable, who could say humans had any special moral status? Even non-Christians saw the force of the argument: Embryologist Brian Goodwin wrote that Darwinism eliminates the idea of species as natural kinds, with disastrous ethical consequences: "Human nature disappears as a concept from neo-Darwinism, and so life becomes a set of parts, commodities that can be shifted around."
This assault on human dignity went hand in hand with the spread of scientific reductionism, which downgraded humans to mere complex mechanisms. It began with René Descartes, the 17th-century French philosopher who proposed a dualistic view of the human being as a robotic body somehow connected to a mental soul--a "ghost in the machine." This tenuous ghost was easy to exorcise, and before long scientists were declaring that humans were nothing but complicated mechanisms. "Let us conclude boldly," said Julien Offray de La Mettrie in 1749, "that man is a machine." He was echoed by Pavlov, Watson, and Skinner, who treated human behavior as reflex actions based on mechanistic principles of stimulus and response.
Again Darwin's theory played a role, offering crucial theoretical support for the reductionistic view. Summed up by 20th-century historian Jacques Barzun, Darwinism implied that "the sum total of accidents of life acting upon the sum total of the accidents of variation provided a completely mechanistic and material system by which to account for the changes in living forms."
By the end of the 20th century, Darwinian reductionism was being extended into fields such as ethics, where philosophers like Michael Ruse were busy debunking traditional morality as "an illusion," preserved only for its survival value. "What is in our genes' interests is what seems 'right'--morally right," wrote Robert Wright in his 1994 book The Moral Animal.
In a reductio ad absurdum, consciousness itself was declared an illusion. A computational model of the mind became popular, with philosophers like Paul Churchland pronouncing that mental states do not exist. Human beings were reduced to mechanisms with no higher value than gadgets and gizmos--which cleared the decks for unbridled experimentation with human embryos and DNA.
Oddly, alongside this intellectual trend ran another that was completely opposite: the exaltation of the individual as an autonomous Self, free to make itself and create its own values. In the Romantic movement, the "ghost" in Descartes's machine was revitalized and elevated to center stage, leading to what philosopher Robert Solomon called "the modern philosophical obsession with self as the locus and arbiter of knowledge."
Thus Descartes set off a dizzying seesaw motion, with descendants of the Enlightenment (like the logical positivists) espousing mechanistic materialism, while descendants of Romanticism (like the existentialists) exalted the autonomous self making ungrounded choices. This became a "dualism," a "bifurcation," within the Western mind, according to intellectual historian Richard Tarnas, an antithesis between the poles of "nature" and "freedom," wrote Christian philosopher Herman Dooyeweerd. This dualism later paved the way to the Biotech Century: As materialism reduced humans to useful gadgets to be tinkered with, the autonomous self rejected all ethical limits except its own desires.
In his 1970 book Fabricated Man, Paul Ramsey diagnosed the way this dualism affected the life issues. Most arguments for abortion, he noted, defined the person as a choosing consciousness, while reducing the body to a possession or instrument under our control, like a car that takes us where we want to go. It was this dualism, some observed, that allowed many Americans to sanction both abortion and euthanasia. Unborn babies are not yet persons, for they haven't started making choices; neither are the elderly and the demented, for they have ceased making them.
This dualism also accounted for the willingness to create human embryos purely for research, and to cut and splice human DNA as though it were so much raw material available for bio-industry. An extreme example was the late 20th century's Joseph Fletcher (of situational ethics fame), who insisted that test-tube reproduction is actually "more human" than the ordinary method because it is then a matter of choice and conscious control: "To be men we must be in control." Thus Fletcher voted in favor of cyborgs, chimeras, and the entire Brave New World scenario.
A quarter century after Roe, America was heading toward a new era of eugenics--not the coercive, top-down, racist eugenics of the Nazi regime but a eugenics of consumer choice. Consumers ready to pay for any technology promising better medicine or genetic advantages for their children were creating a market.
But then something unexpected happened. From within genetics itself came new insights. We found that the structure of DNA suggests a new basis for human dignity--or rather, a return to an old one.
As every high-school student now knows, DNA functions like a language. The DNA molecule is made up of nucleotides that function as chemical "letters" in a code. The pattern or sequence of "letters" spells out a "message" that builds proteins within the cell. Where did this pattern come from?
It is not a random pattern, as we would expect from sheer chance. Nor is it a single pattern repeated over and over, as we would expect from a natural force (like using a macro on your computer). "DNA is like the magnetic letters your kid sticks on the refrigerator," a 20th-century college professor named Steve Meyer once said. "The magnetic force explains how the letters stick to the fridge, but it doesn't explain how the letters are sequenced to spell 'I love Daddy.'"
The sequence in DNA is completely arbitrary, in terms of physical forces--and that's exactly why DNA is capable of functioning as a code. In any code or language, an arbitrary pattern is endowed with meaning by a linguistic convention. For example, in English, the letters "s-e-e" form a word related to sight; in German, the word means ocean. Where did the linguistic convention come from that creates meaning for the arbitrary sequence of "letters" in DNA? How did rules of grammar and interpretation arise that give the molecule its symbolic attributes?
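The convention in question can be made concrete. In the cell, the "dictionary" is the genetic code: a lookup table in which each three-letter codon stands for an amino acid. The sketch below (an illustrative simplification, not a full bioinformatics tool; the function name and table excerpt are mine, though the codon assignments shown are the standard ones) shows how a sequence of chemical "letters" yields a protein "message" only once that mapping is supplied:

```python
# An excerpt of the standard genetic code: each three-letter codon
# maps by convention to an amino acid. The chemistry of base pairing
# explains how the letters link together, not why this table holds.
CODON_TABLE = {
    "ATG": "Met",  # also the usual "start" signal
    "TTT": "Phe", "GGC": "Gly", "GAA": "Glu", "TGG": "Trp",
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna):
    """Read the sequence three letters at a time, applying the convention."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino = CODON_TABLE.get(dna[i:i + 3], "???")
        if amino == "STOP":
            break  # a stop codon ends the "sentence"
        protein.append(amino)
    return protein

print(translate("ATGTTTGGCGAATAA"))  # ['Met', 'Phe', 'Gly', 'Glu']
```

The point of the sketch is that `translate` is useless without `CODON_TABLE`: the same string of letters is mere chemistry until an interpretive convention assigns it meaning.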
That was the great question raised by the genetic revolution. And the answer was unavoidable. Obviously, linguistic conventions do not emerge from chemical reactions; they are mental realities, products of intelligence. "DNA is like a computer program," said that historical figure Bill Gates, "but far, far more advanced than any software we've ever created."
It became clear that the key to interpreting the organic world was not natural selection but John 1:1, "In the beginning was the Word." "This passage uses the Greek word logos to say that in the beginning there was intelligence, wisdom, communication," explained Phillip Johnson, Berkeley law professor and the 20th century's great proponent of Intelligent Design theory.
We found that genetics itself implies that life is a grand narrative told by the divine Word--that there is an Author for the text of life. This means that (contrary to Darwinism) living things cannot be reduced to mere matter and energy.
Moreover, living things are divided into natural kinds (again contrary to Darwinism), each akin to a story or poem. Evolution from one into another by gradual steps is impossible, just as it would be impossible to create Hamlet from The Tempest by randomly changing one letter at a time. (Virtually all the transitional stages would be cluttered with nonsense and, on analogy to natural selection, would die out.)
Consequently, humans have an intrinsic nature after all, and an intrinsic dignity. There is no stage at the beginning or end of life when the human being is not a full person.
Astonishingly, it was a non-Christian philosopher, the well-known 20th-century postmodernist Richard Rorty, who drew together these themes most strikingly. "The very idea that the world or the self has an intrinsic nature," he wrote, "is a remnant of the idea that the world is a divine creation, the work of someone who had something in mind, who Himself spoke some language in which He described His own project." In other words, human beings have an intrinsic nature and dignity only if the world is an embodiment of the Word, the Logos, the language of a personal Creator. Amazingly, it was the genetic revolution that brought this truth home, transforming the entire American culture.
In the modern world, science is accorded authority to tell us what "really is." Philosophers and theologians had long sought to provide ethical guidance for genetic research, but the turning point came when science itself revealed that the language of life--the DNA code--supports the conclusion that the biotic world reflects the Logos of its Creator.