Tragic Optimism for a Millennial Dawning


Introduction
by Stephen Jay Gould

Stephen Jay Gould, Professor of Geology and Curator of Invertebrate Paleontology at Harvard University since 1973, has gained worldwide renown as a paleontologist, evolutionary biologist, and, not least, as a gifted writer on scientific subjects. With Niles Eldredge he developed in 1972 the theory of punctuated equilibria, a revision of Darwinian theory proposing that evolution occurs not at slow, constant rates over millions of years but rather in rapid bursts (over periods as short as thousands of years) followed by long periods of stability. His books, many of which deal with controversies in evolutionary biology, intelligence testing, and paleontology, include Ontogeny and Phylogeny (1977), The Mismeasure of Man (1981), Time's Arrow, Time's Cycle (1987), Dinosaur in a Haystack (1995), and Questioning the Millennium (1997). Since 1974 he has regularly contributed essays to Natural History magazine, and he has won many awards for his teaching and research.

Wallace's Paradox
      As 1998 unfolded in the homestretch of our millennial countdown, I remembered that, exactly 100 years ago, the leader in my profession of evolutionary biology, then a new science dedicated to explaining the causes and pathways of life's ancient history, wrote a book to mark the end of the last century. Charles Darwin died in 1882, so leadership had fallen to Alfred Russel Wallace, who also had recognized the principle of natural selection in an independent discovery made before Darwin's publication.

      In The Wonderful Century: Its Successes and Failures, published in 1898, Wallace presented a simple thesis combining both joy and despair: The 19th century had witnessed such a spectacular acceleration of technological progress that innovations made during this mere hundred years had surpassed the summation of change in all previous human history. This dizzying pace, however, might do more harm than good because human morality, at the same time, had stagnated or even retrogressed—thereby putting unprecedented power (for good or evil) into the hands of leaders inclined to the latter alternative. Wallace summarized his argument:

      "A comparative estimate of the number and importance of these [technological] achievements leads to the conclusion that not only is our century superior to any that have gone before it, but that it may be best compared with the whole preceding historical period. It must therefore be held to constitute the beginning of a new era in human progress. But this is only one side of the shield. Along with these marvelous Successes—perhaps in consequence of them—there have been equally striking Failures, some intellectual, but for the most part moral and social. No impartial appreciation of the century can omit a reference to them; and it is not improbable that, to the historian of the future, they will be considered to be its most striking characteristic."

      As the 20th century (and an entire millennium) draws to its close, we can only reaffirm Wallace's hopes and fears with increased intensity—for our century has witnessed even greater changes, with special acceleration provided in recent years by two great revolutions—in genetic understanding and the electronic technology of information processing. Our century has also, however, experienced the depths of two world wars, with their signatures of senseless death in the trenches of Belgium and France, in the Holocaust, and at Hiroshima. How dizzyingly fast we move, yet how stuck we remain.

      History will not record the following items as particularly memorable or defining features of 1998, but two pairs of remarkably similar films, released by two rival companies, epitomize Wallace's paradox as applied to our time. The summer of 1998 featured two disaster movies, one about a comet and the other about an asteroid on track to strike and destroy the Earth, in which courageous heroes divert the menace with nuclear weapons and thus save our planet: Deep Impact by DreamWorks and Armageddon by Disney. A few months later the same companies fought another round by releasing, nearly simultaneously, moral fables about insects (standing in for human values, of course) done entirely by computer animation: Antz and A Bug's Life, respectively.

      Consider the dizzying spiral of upward scientific and technological advance illustrated in these pairings. The intellectual basis for these disaster films—the theory that an extraterrestrial impact triggered the catastrophic mass extinctions that wiped out dinosaurs (along with half the species of marine organisms) 65 million years ago and gave mammals their lucky and golden opportunity—was first proposed (and dismissed as fanciful nonsense by most of my paleontological colleagues) in 1980. Late in 1998 a published report that a tiny fragment of the impacting asteroid had been recovered from strata deposited at the time of the hypothesized blast pretty much sealed the continually improving case for this revolutionary scenario.

      Few hypotheses that begin in such controversy can progress to accepted fact in a mere 20 years. Even fewer ideas ever pass from the professional world of science into hot themes for mass markets of our commercial culture. (The popular resonances are not hard to identify in this case: if extraterrestrial impact caused mass extinctions millions of years ago, why not again? And why not use our nuclear weapons, heretofore imbued with no conceivable positive utility in saving life, to fend off such a cosmic threat?) Consider also the equally accelerating spiral of technological advance illustrated by the manufacture of these films—60 years from Disney's first animated full-length feature, Snow White and the Seven Dwarfs, where each frame had to be drawn and painted by hand, to orders of magnitude more complexity based on orders of magnitude less handwork, as computers interpolate smooth action between end points of human design in A Bug's Life.
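      That interpolation is, at bottom, simple arithmetic. A minimal sketch of keyframe "in-betweening" (a toy with invented numbers, not anything resembling a production animation pipeline):

```python
# A toy "in-betweener": the animator pins values at a few keyframes, and the
# computer supplies every frame between them. Production systems use splines
# and full character rigs; plain linear interpolation stands in for them here.

def lerp(a, b, t):
    """Linearly interpolate between a and b for t in [0, 1]."""
    return a + (b - a) * t

def interpolate(keyframes, frame):
    """Value at `frame`, filled in between the surrounding keyframes."""
    frames = sorted(keyframes)
    if frame <= frames[0]:
        return keyframes[frames[0]]
    if frame >= frames[-1]:
        return keyframes[frames[-1]]
    for lo, hi in zip(frames, frames[1:]):
        if lo <= frame <= hi:
            t = (frame - lo) / (hi - lo)
            return lerp(keyframes[lo], keyframes[hi], t)

# An animator fixes a character's position at frames 0, 12, and 24;
# the machine fills in the other 22 frames.
keys = {0: 0.0, 12: 10.0, 24: 4.0}
print([interpolate(keys, f) for f in range(0, 25, 6)])  # [0.0, 5.0, 10.0, 7.0, 4.0]
```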

      And yet, to invoke the other side of Wallace's paradox, these films, for all their technical wizardry, remain mired in the same conventions, prejudices, and expectations that keep our social relations (and moral perceptions) so far behind our material accomplishments. Both Antz and A Bug's Life feature young male heroes who are reviled and misunderstood by a conformist multitude but who eventually save their colonies by their individualistic ingenuity—and, of course, then win the (anthropomorphic) hand of the young queen. But true ant societies are matriarchies. Males are rare and effectively useless, and all the so-called workers and soldiers (including the prototypes for the two male heroes of the recent movies) are sterile females. In A Bug's Life the worthy ants have four limbs and look human; only the villainous grasshoppers have—as all insects truly do—six legs (and a resulting sinister appearance in their two pairs of arms, at least to human observers; good guys must look like us).

      The transitions between centuries and millennia fall at precise, but entirely arbitrary, boundaries of human construction. No astronomical or biological cycle works at a repeat frequency of exact tens, hundreds, or thousands. Yet we imbue these purely conventional boundaries with our own decreed meaning, and parse time into decades (the Gay Nineties, the Roaring Twenties, the Complacent Fifties). We have even coined a phrase to mark our anxiety and stocktaking at major boundaries—the fin de siècle (or "end of century") phenomenon.

      We are now about to face, for the first time in the history of most nations and traceable family lines, the largest of all human calendric boundaries in a millennial transition. And who can possibly predict what the first years of the new millennium will bring? Wallace's paradox—the exponential growth of technology matched by the stagnation of morality—implies only more potential for instability and less capacity for reasonable prognostication. But at least we might find some solace in the sharply decreasing majesty of our fear. At the last millennial transition of year 1000, many European Christians awaited (either with fear or ecstasy) the full apocalyptic force of Christ's Second Coming to initiate his thousand-year reign of Earthly bliss. At the turning in 2000, we focus most dread upon the consequences of a technological glitch that may make our computers read a two-digit year code as 1900 rather than 2000.
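      The feared glitch is almost embarrassingly simple to state in code. A minimal sketch, illustrative only (real systems varied, and the pivot in the common "windowing" repair shown below is a hypothetical choice):

```python
# The "Y2K" glitch in miniature: software that stored years as two digits
# implicitly prepended "19", so the code "00" read as 1900 rather than 2000.

def naive_year(two_digits):
    """The buggy reading: every two-digit year belongs to the 1900s."""
    return 1900 + two_digits

def windowed_year(two_digits, pivot=50):
    """A common repair ("windowing"): codes below the pivot read as 20xx.
    The pivot of 50 is a hypothetical choice for illustration."""
    return (2000 if two_digits < pivot else 1900) + two_digits

print(naive_year(0))       # 1900 -- the feared misreading of "00"
print(windowed_year(0))    # 2000
print(windowed_year(98))   # 1998
```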

Tragic Optimism
      Human rationality, that oddest of all unique evolutionary inventions, does confer some advantages upon us. This most distinctively human trait does grant us the capacity to analyze the sources of current difficulty and to devise (when possible) workable solutions for their benign resolution. Unfortunately, as another expression of Wallace's paradox, other all-too-human traits of selfishness, sloth, lack of imagination, fear of innovation, moral venality, and old-fashioned prejudice often conspire to overwhelm rationality and to preclude a genuine resolution that good sense, combined with good will, could readily implement under more favourable circumstances.

      The lessons of history offer no guarantees but only illustrate the full range of potential outcomes. Occasionally, we have actually managed to band together and reach genuine solutions. Smallpox, once the greatest medical scourge of human civilization, has been completely eradicated throughout the world, thanks to coordinated efforts of advanced research in industrialized countries combined with laborious and effective public health practices in the developing world. On a smaller but still quite joyous note, during 1998 the bald eagle reached a sufficient level of recovery—thanks to substantial work by natural historians, amateur wildlife enthusiasts, and effective governmental programs—to become the first item ever deleted for positive reasons from the American Endangered Species List.

      Just as often, unfortunately, we have failed because human frailty or social circumstances precluded the application of workable solutions. (Cities become buried by volcanoes regarded as extinct only because they have not erupted within fallible human memory. Houses built on floodplains get swept away because people do not understand the nature of probability and suppose that, if the last "hundred-year flood" occurred in 1990, the next deluge of such intensity cannot happen until 2090—thus tragically failing to recognize the difference between a long-term average and a singular event. And in 1998, did India and Pakistan do anything but increase their expenditures, diminish their world standing, and endanger their countrymen by matching atomic tests, with both nations left at exactly the same strategic balance after their joint escalation?)
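      The arithmetic behind that floodplain fallacy takes only a line or two. A "hundred-year flood" has a 1 percent chance of striking in any given year, regardless of when the last one struck, so the odds of at least one such flood over a span of years compound as 1 - (1 - p)^n. A minimal sketch:

```python
# The floodplain fallacy in numbers: a "hundred-year flood" has a 1% chance
# in ANY year, no matter when the last one struck. The chance of at least
# one such flood over n years is 1 - (1 - p)**n.

p = 0.01  # yearly probability of a hundred-year flood

for n in (1, 10, 30, 100):
    print(n, round(1 - (1 - p) ** n, 3))
# 1 0.01, 10 0.096, 30 0.26, 100 0.634 -- about a 63% chance per century,
# identical whether the last flood came in 1990 or a hundred years earlier.
```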

      I do, however, think that one pattern—the phenomenon that engenders what I have called "tragic optimism" in setting a title for this essay—does emerge as our most common response, and therefore as the potential outcome that should usually attract our betting money in the lottery of human affairs. We do usually manage to muddle through, thanks to rationality spiced with an adequate dose of basic human decency. This capacity marks the "optimism" of my signature phrase. But we do not make our move toward a solution until a good measure of preventable tragedy has already occurred to spur us into action—the "tragic" component of my designation.

      To cite an example from the hit movie of 1998—James Cameron's gloriously faithful (and expensive) re-creation of the greatest maritime disaster in our civil history—we do not equip ships with enough lifeboats until the unsinkable Titanic founders and some 1,500 people drown who could have been saved. We do not develop the transportation networks to distribute available food, and we do not overcome the social barriers of xenophobia, until thousands have died needlessly of starvation. (As pointed out by Amartya Sen, winner of the 1998 Nobel Memorial Prize in Economic Science, no modern famine has ever been caused by a genuine absence of food; people die because adequate nourishment, available elsewhere, cannot reach them in time, if at all.) We do not learn the ultimate wisdom behind Benjamin Franklin's dictum that we must either hang together or hang separately, and we do not choose to see, or to vent our outrage upon, horrors in distant lands beyond our immediately personal concerns, until the sheer horror of millions of dead Jews in Europe or Tutsi in Africa finally presses upon our consciousness and belatedly awakens our dormant sense of human brotherhood.

      To cite a remarkable example from 1998 of the successes of tragic optimism, many people seem unaware of the enormously heartening worldwide good news about human population growth—a remarkable change forged by effective research; by extensive provision of information, debate, and political lobbying throughout the planet; and by enormous effort at the local level of village clinics and individual persuasion in almost all nations—mostly aimed at the previously neglected constituency of poor women, who may wish to control the sizes of their families but had heretofore lacked access to information or medical assistance.

      In the developing countries of Africa, Latin America, and Asia—the primary sources of our previous fears about uncontrollable population explosions that would plunge the world into permanent famine and divert all remaining natural environments to human agricultural or urban usages—the mean number of births per woman has already been halved from a previous average of about six to a figure close to three for the millennial transition. In most industrialized nations birthrates have already dropped below replacement levels to fewer than two per woman.

      But, as the dictates of tragic optimism suggest, we started too late once again. Today's human population stands at about 5.9 billion, arguably too high already for maximal human and planetary health. Moreover, before stabilization finally gains the upper hand, the momentum of current expansion should bring global levels to about 10.4 billion by 2100. Most of this increase will occur in maximally stressed nations of the developing world. We have probably turned the tide and gained the potential for extended (and even prosperous) existence on a stable planet, but we dithered and procrastinated far too long and must bear the burden of considerable, and once preventable, suffering (and danger) as a result.
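      Why should growth continue for a century after birthrates fall? Because a young-heavy population carries momentum: its largest cohorts have not yet reached childbearing age. A deliberately crude toy model (three equal age classes, single-sex accounting, invented numbers; not a real demographic projection) illustrates the effect:

```python
# A crude momentum toy: three equal-length age classes, only adults bear
# children, one child per adult (bare replacement in this single-sex
# accounting). All numbers are invented for illustration.

def step(young, adult, old, births_per_adult=1.0):
    return (adult * births_per_adult,  # newborns enter the young class
            young,                     # the young age into adulthood
            adult)                     # adults age; the oldest class dies off

y, a, o = 3.0, 2.0, 1.0  # a young-heavy starting structure, total 6.0
for generation in range(5):
    print(generation, y + a + o)
    y, a, o = step(y, a, o)
# Totals run 6, 7, 8, 7, 8: the crude model oscillates, but around a level
# well above the start. Growth was already baked into the age structure,
# even though fertility had fallen to bare replacement.
```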

Differing Scales of Time
      In most of my writings on evolutionary biology, I emphasize the unity of humans with other organisms by debunking the usual, and ultimately harmful, assumptions about our intrinsic self-importance and domination as the most advanced creatures ever evolved by a process predictably leading in our direction. All basic evidence from the history of life leads to an opposite interpretation of Homo sapiens as a tiny, effectively accidental, late-arising twig on an enormously arborescent bush of life.

      Fossil evidence for life on Earth dates back to bacterial cells more than 3.5 billion years old. For more than half this history, no other creatures existed except these simplest single-celled organisms of bacterial grade. These indestructible bacteria have always dominated, and still rule, life on Earth by criteria of numbers, diversity of biochemistry, range of inhabited environments, and prospects for continued prosperity. The number of E. coli cells (just one of many bacterial species that inhabit the human gut) carried by each person alive today exceeds the total number of humans who have ever existed.

      In 1998 fossil embryos of the most ancient animals of modern design were discovered in China in rocks more than 570 million years old. By contrast, the duration of human life on Earth represents only an eyeblink of cosmic time, a millisecond in the Earth's geological history. The entire human lineage began with the evolutionary split from our closest relatives (chimpanzees and gorillas) in Africa only six to eight million years ago. Homo sapiens, the modern species to which all humans belong, represents a truly new kid on the evolutionary block, having originated, presumably in Africa, only about 250,000 years ago.

      In the context of this essay, however, I need to emphasize the flip side of this chronology by pointing out the extraordinary impact of human existence during such an utterly insignificant amount of geological time. During the 3.5-billion-year tenure of life on Earth, no other species has left so strong an imprint upon our planet's surface in such a geological instant. We cannot attribute this influence to any novelty of merely physical form. (We are a large mammalian species, rather frail of body, and endowed with no special gift of brawn.) Our extraordinary achievements, for better or worse as only the future can tell, arise from an unparalleled increase and remodeling of neuronal tissue in our brains and from the attendant power of emerging consciousness to unleash an entirely novel force upon the history of this planet: the power of cultural transmission, a much stronger and more rapid process of change than Darwinian physical evolution.

      Only about 30,000 years have passed from the first European Paleolithic cave paintings at Chauvet (showing a mastery of style fully comparable with the skill of a Picasso) to the blockbuster art shows of America in December 1998 (Jackson Pollock in New York City, late Monet in Boston). Fewer than 10,000 years have elapsed since several human societies independently developed agriculture and unleashed the phenomena of accumulating wealth and dwellings in fixed places that serve as a prerequisite to the ever-growing social and material complexities called "civilization" (as opposed to the nomadic style of our previous lives as hunter-gatherers). The people who painted at Chauvet, and who first planted and reaped, belonged to our species and did not differ from us in any feature of bodily form, including size and structure of the brain. In other words, all the technological change that marks the full impact of human presence upon this planet has been forged by the power of cultural transmission among humans of unaltered evolutionary form and capacity.

      Cultural change gains both its extraordinary power and its quirky unpredictability by operating under different principles than those regulating the slower Darwinian history of physical evolution. To cite the two most important differences, human cultural change works by the Lamarckian mechanism of inheritance of acquired characteristics (while evolutionary change must follow the vastly slower Mendelian and Darwinian route of natural selection upon genetic variation). Whatever we learn or invent in one generation we pass directly to the next by writing and teaching. Change, therefore, can accumulate and accelerate with unparalleled rapidity, leading us either to dizzying and disruptive success or into the abyss of gargantuan failure. As a second difference, biological evolution yields permanent separation on the tree of life. Once a species branches off from an ancestral lineage, it must follow its own distinctive pathway forever. Nature cannot make a new all-purpose mammal by mixing 80% of a bat with 20% of a dolphin. (Genetic engineering may be on the verge of breaking these age-old rules, but such fracturing would only represent a feedback from human invention upon biological history.) By contrast, cultural change proceeds largely by amalgamation and imitation. One distant traveler, gaining one look at a wheel invented by other peoples, can return to transform his own society forever.

      Essential unpredictability, as a matter of principle (based on the unique complexity of most parts and the partial randomness of many processes, not on the limitations of our own ability to understand a genuinely deterministic universe), ruled the natural world long before humans made their boisterous and accidental entrance into the history of life on Earth. But the special principles of human cultural change only enhance the volatility and quirkiness of our own impact. At its own time scale, where a million years represents but a cosmic day, the Earth may wink at our hubris. Species come and species go, but the Earth endureth forever (or at least for many billion years more, until the Sun explodes).

      Yes, we may wipe out a large percentage of species (including ourselves), but Earth will recover, at its own time scale, several million years from now, as hardy survivors repopulate a temporarily battered planet. (After all, five major mass extinctions have occurred during the 600 million years of animal life on Earth. The biggest, 225 million years ago, wiped out about 95% of all marine species. Yet evolution always restores full diversity, though the process requires several million years.) Yes, we may unleash a powerful greenhouse effect, melt the polar ice caps, and raise sea levels sufficiently to drown most of our major cities (built at or near sea level for primary function as ports and harbours). But the Earth will prosper, though we may die. (At many past times during the history of continental drift, both poles lay over open oceans, no ice caps existed, sea level stood much higher, and life prospered.)

      These claims are surely correct, but we make a terrible and tragic mistake—the classic error of mixing time scales—if we argue that the Earth's ability, at its own time scale, to heal the effects of potential human malfeasance should give us any solace or lead us to a position of "why worry" about environmental deterioration or anthropogenic extinction. The Earth's time scale, however majestic, cannot be the appropriate ruler for our own legitimately parochial interest in our lives (measured in decades or, at most, a century), our nations and bloodlines (measured, at best, in millennia), our cultures with all their magnificent achievements (and their gruesome failures), and the immediate environments and fellow creatures that now share the planet with us at the only time scale we can know, at least in crucial moral and psychological senses.

      The Earth will survive if we unleash the dark side of Wallace's paradox, but our own glorious and tentative experiment in consciousness will fail, and we will (albeit temporarily) take much of the Earth's present splendour with us. We must care intensely, and at the appropriate scale of human existence—the scale now so palpably before us as we prepare for the first and only millennial transition (our longest measuring rod) in any living organism's memory (except for a few unconscious trees).

      With tragic optimism we may place our bets on survival. Consciousness does give us the capacity to prevail along with the ability to destroy. John Playfair, the great Scottish scientist who explicated deep time by writing a famous book in 1802 on the immensity of geological cycles, ended his Outlines of Natural Philosophy (1814) with a wonderfully succinct description of tragic optimism, and its moral implication that we must never abandon the struggle. He wrote, using the old subjunctive mood (where his "were" equals our "would be"): "About such ultimate attainments, it were unwise to be sanguine, and unphilosophical to despair."
