Mathematics and Physical Sciences
▪ 2003

Introduction

Mathematics
      Mathematics in 2002 was marked by two discoveries in number theory. The first may have practical implications; the second satisfied a 150-year-old curiosity.

      Computer scientist Manindra Agrawal of the Indian Institute of Technology in Kanpur, together with two of his students, Neeraj Kayal and Nitin Saxena, found a surprisingly efficient algorithm that will always determine whether a positive integer is a prime number.

      Since a prime is divisible only by 1 and itself, primality can, of course, be determined simply by dividing a candidate n in turn by successive primes 2, 3, 5, … up to √n (larger divisors would require a corresponding smaller divisor, which would already have been tested). As the size of a candidate increases, however—for example, contemporary cryptography utilizes numbers with hundreds of digits—such a brute-force method becomes impractical; the number of possible trial divisions increases exponentially with the number of digits in a candidate.
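
      The brute-force method is simple enough to state in a few lines of code. The following sketch (written in Python, chosen here purely for illustration; the function name is invented) tests divisibility by 2 and then by the odd numbers up to √n. Restricting the trial divisors to primes alone would be slightly faster but would require a supply of primes.

      import math

      def is_prime_by_trial_division(n):
          """Test primality by trial division up to the square root of n."""
          if n < 2:
              return False
          if n % 2 == 0:
              return n == 2
          # Any composite n must have a divisor no greater than sqrt(n).
          for d in range(3, math.isqrt(n) + 1, 2):
              if n % d == 0:
                  return False
          return True

      For a candidate of d digits, the loop above performs on the order of 10^(d/2) trial divisions, which is precisely the exponential growth described above.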

      For centuries mathematicians sought a primality test that executes in polynomial time—that is, such that the maximum number of necessary operations is a power of the number of digits of the candidate. Several primality tests start from the “little theorem” discovered in 1640 by the French mathematician Pierre de Fermat: “For every prime p and any smaller positive integer a, the quantity a^(p − 1) − 1 is divisible by p.” Hence, for a given number n, choose a and check whether the relation is satisfied. If not, then n is not prime (i.e., it is composite). While passing this test is a necessary condition for primality, it is not sufficient; some composites (called pseudoprimes) pass the test for at least one base a, and some (called Carmichael numbers, the smallest of which is 561) even pass the test for every a coprime to them.
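
      A minimal sketch of this check, relying on the fast modular exponentiation built into Python (the function name is invented for illustration); the Carmichael number 561 shows why passing the test is not conclusive:

      def passes_fermat_test(n, a):
          """True if a**(n - 1) - 1 is divisible by n, i.e., n passes for base a."""
          return pow(a, n - 1, n) == 1

      print(passes_fermat_test(221, 2))  # False: base 2 proves 221 = 13 * 17 composite
      print(passes_fermat_test(561, 2))  # True, yet 561 = 3 * 11 * 17 is composite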

      Two alternative approaches are conditional tests and probabilistic (or randomized) tests. Conditional tests require additional assumptions. In 1976 the American computer scientist Gary L. Miller obtained the first deterministic, polynomial-time algorithm by assuming the extended Riemann hypothesis about the distribution of primes. Later that year the Israeli computer scientist Michael O. Rabin modified this algorithm to obtain an unconditional, but randomized (rather than deterministic), polynomial-time test. Randomization refers to his method of randomly choosing a number a between 1 and n − 1 inclusive to test the primality of n. If n is composite, the probability that it passes is at most one-fourth. Tests with different values of a are independent, so the multiplication rule for probabilities applies (the product of the individual probabilities equals the overall probability). Hence, the test can be repeated until n fails a test or its probability of being composite is as small as desired.
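
      What such a randomized test looks like in practice is sketched below, assuming nothing beyond the Python standard library; this strengthened form of the Fermat test is commonly called the Miller-Rabin test. Because a composite n survives each independent round with probability at most one-fourth, the multiplication rule gives an error probability of at most (1/4)^20 after 20 rounds.

      import random

      def miller_rabin(n, rounds=20):
          """Randomized primality test; a composite n passes any single
          round with probability at most 1/4."""
          if n < 2:
              return False
          for p in (2, 3, 5, 7):
              if n % p == 0:
                  return n == p
          # Write n - 1 = 2**s * d with d odd.
          s, d = 0, n - 1
          while d % 2 == 0:
              s += 1
              d //= 2
          for _ in range(rounds):
              a = random.randrange(2, n - 1)  # a random base, as in Rabin's method
              x = pow(a, d, n)
              if x in (1, n - 1):
                  continue
              for _ in range(s - 1):
                  x = pow(x, 2, n)
                  if x == n - 1:
                      break
              else:
                  return False  # this base witnesses that n is composite
          return True  # n is prime with very high probability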

      Although such randomized tests suffice for practical purposes, Agrawal's algorithm excited theoreticians by showing that a deterministic, unconditional primality test can run in polynomial time. In particular, it runs in time proportional to slightly more than the 12th power of the number of digits, or to the 6th power if a certain conjecture about the distribution of primes is true. While the new algorithm is slower than the best randomized tests, its existence may spur the discovery of faster deterministic algorithms.

      While these primality tests can tell if an integer is composite, they often do not yield any factors. Still unknown—and a crucial question for cryptography—is whether a polynomial-time algorithm is possible for the companion problem of factoring integers.

      Another famous problem in number theory, without far-reaching consequences, was apparently solved in 2002. The Belgian mathematician Eugène Charles Catalan conjectured in 1844 that the only solution to x^m − y^n = 1 in which x, y, m, and n are integers all greater than or equal to 2 is 3^2 − 2^3 = 1. In 1976 the Dutch mathematician Robert Tijdeman showed that there could not be an infinite number of solutions. Then in 1999 the French mathematician Maurice Mignotte showed that m < 7.15 × 10^11 and n < 7.78 × 10^16. This still left too many numbers to check, but in 2002 the Romanian mathematician Preda Mihailescu announced a proof that narrowed the possible candidates to certain numbers, known as double Wieferich primes, that are extremely rare.
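
      The scarcity of solutions is easy to observe empirically, even though the proof itself required deep methods from algebraic number theory. A brute-force sweep over small values (purely illustrative) turns up nothing but Catalan's example:

      # Search x**m - y**n == 1 with x, y, m, and n all at least 2.
      solutions = [(x, m, y, n)
                   for x in range(2, 30) for y in range(2, 30)
                   for m in range(2, 10) for n in range(2, 10)
                   if x**m - y**n == 1]
      print(solutions)  # [(3, 2, 2, 3)], i.e., 3^2 - 2^3 = 1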

Paul J. Campbell

Chemistry

Inorganic Chemistry.
      In 2002 two groups of U.S. researchers working together reported the serendipitous synthesis of compounds of uranium and the noble gases argon, krypton, and xenon. Despite more than 40 years of effort, chemists had been able to make only a handful of compounds from the noble gases. These gases are the six elements helium, neon, argon, krypton, xenon, and radon. All have an oxidation number of 0 and the maximum possible number of electrons in their outer shell (2 for helium, 8 for the others). Those traits are hallmarks of chemical stability, which means that the noble gases resist combining with other elements to form compounds. Indeed, until the 1960s chemists had regarded these elements as completely inert, incapable of forming the bonds that link atoms together to make compounds.

      Lester Andrews and co-workers of the University of Virginia were studying reactions involving CUO, a molecule of carbon, uranium, and oxygen atoms bonded together in a linear fashion. In order to preserve the CUO, they protected it in frozen neon chilled to −270 °C (−450 °F). When they repeated the reactions by using argon as the protectant, however, the results were totally different, which suggested that new compounds had formed. Xenon and krypton also gave unanticipated results. Bruce Bursten and associates at Ohio State University then performed theoretical calculations on supercomputers to confirm the findings. Andrews and Bursten speculated that other metals also might bond to noble gases under the same ultracold conditions.

      For nearly 200 years chemists had tried to decipher the structure of the complex molecules in the solutions called molybdenum blues. Scientists knew that the elements molybdenum and oxygen form large molecules that impart a blue colour to the solutions. The first of these so-called polyoxomolybdate (POM) molecules were identified in 1826. No one, however, had been able to explain the compounds' molecular structure in solution. During the year Tianbo Liu, a physicist at Brookhaven National Laboratory, Upton, N.Y., reported the presence of giant clusterlike structures in molybdenum blue solutions that resemble the surface of a blackberry. Unlike other water-soluble inorganic compounds, POM molecules apparently do not exist as single ions in solution; rather, they cluster together by the hundreds into bunches. Liu said the “blackberry” structures in molybdenum blue may represent a heretofore unobserved stable state for solute molecules.

Carbon Chemistry.
      Scientists continued their search for commercial and industrial applications of the tiny elongated molecular structures known as carbon nanotubes. Discovered in 1991, nanotubes consist of carbon atoms bonded together into graphitelike sheets that are rolled into tubes 10,000 times thinner than a human hair. Their potential applications range from tiny wires in a new generation of ultrasmall computer chips to biological probes small enough to be implanted into individual cells. Many of those uses, however, require attaching other molecules to nanotubes to make nanotube derivatives. In general, methods for making small amounts of derivatives for laboratory experimentation have required high temperatures and other extreme conditions that would be too expensive for industrial-scale production.

      During the year chemists from Rice University, Houston, Texas, and associates from the Russian Academy of Sciences, Moscow, described groundbreaking work that could simplify the production of nanotube derivatives. Rice's John Margrave, who led the team, reported that the key procedure involved fluorination of the nanotubes—i.e., attaching atoms of fluorine, the most chemically reactive element—an approach developed at Rice over the previous several years. Fluorination made it easier for nanotubes to undergo subsequent chemical reactions essential for developing commercial and industrial products. Among the derivatives reported by the researchers were hexyl, methoxy, and amido nanotubes; nanotube polymers similar to nylon; and hydrogen-bonded nylon analogs.

Organic Chemistry.
      Antiaromatic molecules are organic chemistry's will-o'-the-wisps. Like aromatic molecules, they have atoms arranged in flat rings and joined by two different kinds of covalent bonds. Unlike aromatic molecules, however, they are highly unstable and reactive and do not remain long in existence. Chemistry textbooks have used the cyclopentadienyl cation—the pentagonal-ring hydrocarbon C5H5 stripped of one electron and thus bearing a positive charge—as the classic example of the antiaromatics' disappearing act.

      Joseph B. Lambert and graduate student Lijun Lin of Northwestern University, Evanston, Ill., reported a discovery that may rewrite the textbooks. While trying to synthesize other organic cations (molecules with one or more positive charges), they produced a cyclopentadienyl analog in which methyl (CH3) groups replace the hydrogen atoms and found that it did not behave like the elusive entity of textbook fame. Rather, it remained stable for weeks in the solid state at room temperature. Lambert proposed that the cyclopentadienyl cation be reclassified as a nonaromatic material.

Physical Chemistry.
      Gold has been treasured throughout history partly because of its great chemical stability. Resistant to attack by oxygen, which rusts or tarnishes other metals, gold remains bright and beautiful under ordinary environmental conditions for centuries. Gold, however, does oxidize, forming Au2O3, when exposed to environments containing a highly reactive form of oxygen—e.g., atomic oxygen or ozone. Hans-Gerd Boyen of the University of Ulm, Ger., led a German-Swiss team that announced the discovery of a more oxidation-resistant form of gold. The material, called Au55, consists of gold nanoparticles; each nanoparticle is a tiny cluster comprising exactly 55 gold atoms and measuring about 1.4 nm (nanometres). Boyen's group reported that Au55 resisted corrosion under conditions that corroded bulk gold and gold nanoparticles consisting of either larger or smaller numbers of atoms. The researchers speculated that the chemical stability is conferred by special properties of the cluster's 55-atom structure and that Au55 may be useful as a catalyst for reactions that convert carbon monoxide to carbon dioxide.

      Incandescent tungsten-filament lightbulbs, the world's main source of artificial light, are noted for inefficiency. About 95% of the electricity flowing through an incandescent bulb is transformed into unwanted heat rather than the desired product, light. In some homes and large offices illuminated by many lights, the energy waste multiplies when additional electricity must be used for air conditioning to remove the unwanted heat from electric lighting.

      Shawn Lin and Jim Fleming of Sandia National Laboratories, Albuquerque, N.M., developed a microscopic tungsten structure that, if it could be incorporated into a filament, might improve a lightbulb's efficiency. The new material consists of tungsten fabricated to have an artificial micrometre-scale crystalline pattern, called a photonic lattice, that traps infrared energy—radiant heat—emitted by the electrically excited tungsten atoms and converts it into frequencies of visible light, to which the lattice is transparent. The artificial lattice, in effect, repartitions the excitation energy between heat and visible light, favouring the latter. Lin and Fleming believed that the tungsten material could eventually raise the efficiency of incandescent bulbs to more than 60%.

Applied Chemistry.
      Zeolites are crystalline solid materials having a basic framework made typically from the elements silicon, aluminum, and oxygen. Their internal structure is riddled with microscopic interconnecting cavities that provide active sites for catalyzing desirable chemical reactions. Zeolites thus have become key industrial catalysts, selectively fostering reactions that otherwise would go slowly, especially in petroleum refining. About 40 zeolites occur naturally as minerals such as analcime, chabazite, and clinoptilolite. By 2002 chemists had synthesized more than 150 others, and they remained on a constant quest to make better zeolites.

      Avelino Corma and colleagues of the Polytechnic University of Valencia, Spain, and the Institute of Materials Science, Barcelona, reported synthesis of a new zeolite that allows molecules enhanced access to large internal cavities suitable for petroleum refining. Dubbed ITQ-21, it incorporates germanium atoms rather than aluminum atoms in its framework, and it possesses six “windows” that allow large molecules in crude oil to diffuse into the cavities to be broken down, or cracked, into smaller molecules. In contrast, the zeolite most widely used in petroleum refining has just four such windows, which limits its efficiency.

      Chemists at Oregon State University reported an advance that could reduce the costs of making crystalline oxide films. The films are widely used in flat-panel displays, semiconductor chips, and many other electronic products. They can conduct electricity or act as insulators, and they have desirable optical properties.

      To achieve the necessary crystallinity with current manufacturing processes, the films must be deposited under high-vacuum conditions and temperatures of about 1,000 °C (1,800 °F). Creating those conditions requires sophisticated and expensive processing equipment. Douglas Keszler, who headed the research group, reported that the new process can deposit and crystallize oxide films of such elements as zinc, silicon, and manganese with simple water-based chemistry at atmospheric pressure and at temperatures of about 120 °C (250 °F). The method involved a slow dehydration of the materials that compose crystalline oxide films. In addition to reducing manufacturing costs, the process could allow the deposition of electronic thin films on new materials. Among them were plastics, which would melt at the high temperatures needed in conventional deposition and crystallization processes.

Michael Woods

Physics

Particle Physics.
      In 2002 scientists took a step closer to explaining a major mystery—why the observed universe is made almost exclusively of matter rather than antimatter. The everyday world consists of atoms built up from a small number of stable elementary particles—protons, neutrons, and electrons. It has long been known that antiparticles also exist, with properties that are apparently identical mirror images of their “normal” matter counterparts—for example, the antiproton, which possesses a negative electric charge (rather than the positive charge of the proton). When matter and antimatter meet, as when a proton and an antiproton collide, both particles are annihilated. Antiparticles are very rare in nature. On Earth they can be produced only with great difficulty under high vacuum conditions, and, unless maintained in special magnetic traps, they survive for a very short time before colliding with normal matter.

      If matter and antimatter are mirror images, why does the vast majority of the universe appear to be made up of normal matter? In other words, what asymmetry manifested itself during the big bang to produce a universe of matter rather than of antimatter? The simplest suggestion is that matter and antimatter particles are not completely symmetrical. During the year physicists working at the Stanford Linear Accelerator Center (SLAC) in California confirmed the existence of such an asymmetry, although their experiments raised other questions. The huge research team, comprising scientists from more than 70 institutions around the world, studied very short-lived particles known as B mesons and their antiparticles, which were produced in collisions between electrons and positrons (the antimatter counterpart of electrons). A new detector dubbed BaBar enabled them to measure tiny differences in the decay rates of B mesons and anti-B mesons, a manifestation of a phenomenon known as charge-parity (CP) violation. From these measurements they calculated a parameter called sin 2β (sine two beta) to a precision of better than 10%, which confirmed the asymmetry. Although the BaBar results were consistent with the generally accepted standard model of fundamental particles and interactions, the size of the calculated asymmetry was not large enough to fit present cosmological models and account for the observed matter-antimatter imbalance in the universe. SLAC physicists planned to examine rare processes and more subtle effects, which they expected might give them further clues.

      Researchers from Brookhaven National Laboratory, Upton, N.Y., confirmed previous work showing a nagging discrepancy between the measured value and the theoretical prediction of the magnetic moment of particles known as muons, which are similar to electrons but heavier and unstable. The magnetic moment of a particle is a measure of its propensity to twist itself into alignment with an external magnetic field. The new value, measured to a precision of seven parts per million, remained inconsistent with values calculated by using the standard model and the results of experiments on other particles. It was unclear, however, whether the discrepancy was an experimental one or pointed to a flaw in the standard model.

Lasers and Light.
      One region of the electromagnetic spectrum that had been unavailable for exploitation until 2002 was the so-called terahertz (THz) region, between frequencies of 0.3 and 30 THz. (A terahertz is one trillion, or 10^12, hertz.) This gap lay between the high end of the microwave region, where radiation could be produced by high-frequency transistors, and the far-infrared region, where radiation could be supplied by lasers. In 2002 Rüdeger Köhler, working with an Italian-British team at the nanoelectronics-nanotechnology research centre NEST-INFM, Pisa, Italy, succeeded in producing a semiconductor laser that bridged the gap, emitting intense coherent pulses at 4.4 THz. The device used a so-called superlattice, a stack of periodic layers of different semiconductor materials, and produced the radiation by a process of quantum cascade.

      Claire Gmachl and co-workers of Lucent Technologies' Bell Laboratories, Murray Hill, N.J., fabricated a similar multilayered configuration of materials to produce a semiconductor laser that emitted light continuously at wavelengths of six to eight micrometres, in the infrared region of the spectrum. Unlike typical semiconductor lasers, which give off coherent radiation of a single wavelength, the new device represented a true broadband laser system having many possible applications, including atmospheric pollution detectors and medical diagnostic tools. In principle, the same approach could be used to fabricate devices with different wavelength ranges or much narrower or wider ranges.

Condensed-Matter Physics.
      Since 1995, when it was first made in the laboratory, the state of matter known as a Bose-Einstein condensate (BEC) has provided one of the most active fields of physical research. At first the mere production of such a state represented a triumph, garnering for the scientists who first achieved a BEC the 2001 Nobel Prize for Physics. By 2002 detailed investigations of the properties of such states and specific uses for them were coming to the fore. Bose-Einstein condensation involves the cooling of gaseous atoms whose nuclei have zero or integral-number spin states (and therefore are classified as bosons) so near to a temperature of absolute zero that they “condense”—rather than existing as independent particles, they become one “superatom” described by a single set of quantum state functions. In such a state the atoms can flow without friction, making the condensate a superfluid.

      During the year Markus Greiner and co-workers of the Max Planck Institute for Quantum Optics, Garching, Ger., and Ludwig Maximilian University, Munich, Ger., demonstrated the dynamics of a BEC experimentally. To manipulate the condensate, they formed an “optical lattice,” using a number of crisscrossed laser beams; the result was a standing-wave light field having a regular three-dimensional pattern of energy maxima and minima. When the researchers caught and held the BEC in this lattice, its constituent atoms were described not by a single quantum state function but by a superposition of states. Over time, this superposition carried the atoms between coherent and incoherent states in the lattice, an oscillating pattern that could be observed and that provided a clear demonstration of basic quantum theory. The researchers also showed that, by increasing the intensity of the laser beams, the gas could be forced out of its superfluid phase into an insulating phase, a behaviour that suggested a possible switching device for future quantum computers.

      BECs were also being used to produce atom lasers. In an optical laser the emitted light beam is coherent—the light is of a single frequency or colour, and all the components of the waves are in step with each other. In an atom laser the output is a beam of atoms that are in an analogous state of coherence, the condition that obtains in a BEC. The first atom beams could be achieved only by allowing bursts of atoms to escape from the trap of magnetic and optical fields that confined the BEC—the analogue of a pulsed laser. During the year Wolfgang Ketterle (one of the 2001 Nobel physics laureates) and co-workers at the Massachusetts Institute of Technology succeeded in producing a continuous source of coherent atoms for an atom laser. They employed a conceptually simple, though technically difficult, process of building up a BEC in a “production” trap and then moving it with the electric field of a focused laser beam into a second, “reservoir” trap while replenishing the first trap. The researchers likened the method to collecting drops of water in a bucket, from which the water could then be drawn in a steady stream. Making a hole in the bucket—i.e., allowing the BEC to flow as a beam from the reservoir—would produce a continuous atom laser. The work offered a foretaste of how the production, transfer, and manipulation of BECs could become an everyday technique in the laboratory.

Solid-State Physics.
      The study of systems containing only a few atoms not only gives new insights into the nature of matter but also points the way toward faster communications and computing devices. One approach has been the development and investigation of so-called quantum dots, tiny isolated clumps of semiconductor atoms with dimensions in the nanometre (billionth of a metre) range, sandwiched between nonconducting barrier layers. The small dimensions mean that charge carriers—electrons and holes (traveling electron vacancies)—in the dots are restricted to just a few energy states. Because of this, the dots can be thought of as artificial atoms, and they exhibit useful atomlike electronic and optical properties.

      Toshimasa Fujisawa and co-workers of the NTT Basic Research Laboratories, Atsugi, Japan, studied electron transitions in such dots involving just one or two electrons (which acted as artificial atoms analogous to hydrogen and helium, respectively). Their encouraging results gave support to the idea of using spin-based electron states in quantum dots for storage of information. Other researchers continued to investigate the potential of employing coupled electron-hole pairs (known as excitons) in quantum dots for information storage. Artur Zrenner and co-workers at the Technical University of Munich, Ger., demonstrated the possibility of making such a device. Although technological problems remained to be solved, it appeared that quantum dots were among the most promising devices to serve as the basis of storage in future quantum computers.

David G.C. Jones

Astronomy

Solar System.
      For information on Eclipses, Equinoxes and Solstices, and Earth Perihelion and Aphelion in 2003, see Table (Earth Perihelion and Aphelion, 2003).

      The question of whether Pluto should be regarded as a full-fledged planet was highlighted in late 2002 with the announcement of a discovery by astronomers from the California Institute of Technology. In October Michael Brown and Chad Trujillo reported an object beyond the orbits of Neptune and Pluto some 6.3 billion km (4 billion mi) from the Sun. Designated 2002 LM60 and tentatively named Quaoar by its discoverers, the object falls into the class of bodies called trans-Neptunian objects, whose count has grown into the hundreds since the first one was identified in 1992. Quaoar was first spotted in June with a telescope on Mt. Palomar and subsequently observed with the Earth-orbiting Hubble Space Telescope, which resolved its image. It appeared to be about 1,300 km (800 mi) in diameter, about half the size of Pluto.

      Quaoar was the largest object found in the solar system since the discovery of Pluto in 1930. Although it is about 100 million times more massive than a typical comet, the object—like Pluto and the other bodies orbiting beyond Neptune—was thought to be part of the Kuiper belt, a region in the outer solar system believed to contain myriad icy bodies and to be the source of most short-period comets. The latest discovery was certain to provoke further debate about the planetary nature of the larger trans-Neptunian objects and the inclusion of Pluto among them.

      After NASA's 2001 Mars Odyssey spacecraft reached the planet Mars in October 2001, it spent the next few months lowering and reshaping its orbit for its science mapping mission. Throughout 2002 the probe imaged the Martian surface and took a variety of measurements. Its instruments included a neutron detector designed to map the location of intermediate-energy neutrons knocked off the Martian surface by incoming cosmic rays. The maps revealed low neutron levels in the high latitudes, which was interpreted to indicate the presence of high levels of hydrogen. The hydrogen enrichment, in turn, suggested that the polar regions above latitude 60° contain huge subsurface reservoirs of frozen water ice. The total amount of water detected was estimated to be 10,000 cu km (2,400 cu mi), nearly the amount of water in Lake Superior. Odyssey's instruments, however, could not detect water lying at depths much greater than a metre (3.3 ft), so the total amount could be vastly larger. Such information would be vitally important if human exploration of Mars was ever to be undertaken in the future.

      In line with the accelerating rate of discoveries of new moons for the giant planets, astronomers reported finding still more moons for Jupiter. After combining the results of telescopic observations in December 2001 and May 2002 from Mauna Kea, Hawaii, a team led by Scott S. Sheppard and David C. Jewitt of the University of Hawaii announced the detection of 11 new Jovian moons, bringing the total number known to 39. In view of the latest discoveries, the team proposed that there might be as many as 100 Jovian moons. The new objects are tiny—no more than 2–4 km (1.25–2.5 mi) in diameter—and have large elliptical orbits inclined with respect to the orbits of the four large Galilean moons. They also revolve around Jupiter in a direction opposite to its rotation. Together these properties suggested that the small moons are objects captured by Jupiter's gravity early in its history.

Stars.
      The rate of discovery of planets orbiting other stars, like that of moons in the solar system, continued to accelerate. Extrasolar planets were first reported in 1995; by the end of 2002, more than 100 had been reported, roughly a third of them in that year alone. Among the latest discoveries was a planetary system somewhat similar to the Sun's own. In 1996, 55 Cancri—a star lying in the constellation Cancer—had been found to have a planet with about the mass of Jupiter orbiting it about every 14.6 days. That period placed the planet at about one-tenth the Earth-Sun distance from its central star. In 2002 Geoffrey Marcy and Debra A. Fischer of the University of California, Berkeley, R. Paul Butler of the Carnegie Institution of Washington, D.C., and co-workers announced their finding of a second planet with a mass of three to five times that of Jupiter revolving around 55 Cancri in an orbit comparable to Jupiter's orbit around the Sun. The Marcy team also described the likely presence of yet a third planet in the system having an orbital period of about 44 days. Although the known companions of 55 Cancri did not make the system an exact analogue of the Sun's, their discovery offered hope that more closely similar systems would be found.

      Pulsars—rapidly rotating, radio-emitting, highly magnetized neutron stars—were first detected in 1967. By 2002 more than 1,000 were known. Pulsars arise as the by-product of supernova explosions, which are the final event in the life cycle of massive stars. During the past millennium, only a half dozen supernova explosions in the Milky Way Galaxy have been preserved in historical records—in the years 1006, 1054, 1181, 1572, 1604, and 1680. The explosion leading to the famous Crab Nebula, for example, occurred on July 4, 1054. This supernova remnant has long been known to contain a pulsar.

      In 2002 the discovery of the youngest radio pulsar known to date was reported. It lies within an extended radio source known as 3C 58, the remnant of the supernova explosion of 1181. To detect it, radio astronomers began with the 2001 observation, made with NASA's Earth-orbiting Chandra X-Ray Observatory, of a point X-ray source at the centre of the nebula. Fernando Camilo of Columbia University, New York City, and collaborators then used the 100 × 110-m (328 × 361-ft) Robert C. Byrd Green Bank Telescope to detect radio pulses from the same source. The radio pulsar was found to be rotating about 15 times per second, in agreement with the previously reported X-ray observations. X-ray data from the Chandra Observatory, combined with the young age of the pulsar, implied that the pulsar might be cooler or smaller (or both) than it should be if it were made up mainly of neutrons. Some theoretical interpretations suggested that the pulsar may consist of quarks, pions, or some other exotic form of matter.

Galaxies and Cosmology.
      Although astronomers can study distant galaxies in great detail, it is very difficult to peer into the centre of Earth's own Galaxy by using optical telescopes. The plane of the Milky Way contains a great deal of dust, which strongly obscures what lies within it. Infrared radiation emitted by objects at the Galaxy's core, however, can penetrate the dust. Using near-infrared telescopes, an international team of astronomers led by Rainer Schödel of the Max Planck Institute for Extraterrestrial Physics, Garching, Ger., managed to penetrate to the heart of the Milky Way to track the motion of stars in the vicinity of the compact radio source—and black hole candidate—called Sagittarius (Sgr) A*. Over a period of 10 years, they watched the motion of a star (designated S2) that lies close to Sgr A*. They found that S2 orbits the galactic centre in about 15.2 years with a nearest approach to Sgr A* of only about 17 light-hours. This corresponds to such a small orbit that only a black hole having a mass equal to three million to five million Suns can fit within it. These observations provided the best evidence to date that black holes exist.

      The hot big-bang model proposes that the universe began with an explosive expansion of matter and energy that subsequently cooled, leading to its present state. As optical observations have revealed, the universe contains visible galaxies that are receding from one another. It also contains a nearly uniform background of microwave radiation, which currently has a temperature of about 3 K (three degrees above absolute zero). New studies in 2002 of distant galaxies and of the microwave background radiation continued to clarify and solidify the validity of the big-bang evolutionary picture.

      By year's end as many as 26 separate experiments had measured fluctuations in the intensity of the background radiation. Details of the measurements provided valuable information about the expansion of the universe some 400,000 years after its inception. The most startling conclusion from these studies was that the universe consists of about 5% ordinary matter (the luminous matter seen in galaxies) and about 25% dark (nonluminous) matter, which is probably cold but whose composition is unknown. The other 70% comprises a kind of repulsive force that was proposed originally by Albert Einstein, who called it the cosmological constant, and that more recently was being termed dark energy or quintessence, although it does not have the character of what is usually called energy. Together these constituents add up to just what is needed to make the spatial geometry of the universe “flat” on cosmic scales. One implication of this flatness is that the universe will continue to expand forever rather than eventually collapsing in a “big crunch.”

Kenneth Brecher

Space Exploration

Manned Spaceflight.
      Assembly of the International Space Station (ISS) continued to dominate manned space operations in 2002. (See Table (Launches in support of human spaceflight, 2002).) Construction was delayed several months, however, when in June a sharp-eyed ground inspector spotted tiny cracks in the metal liner of a main-engine propellant line of the space shuttle orbiter Atlantis. Similar cracks, which had the potential to destroy both vehicle and crew, turned up in the fuel or oxygen lines of the orbiter Discovery and subsequently Columbia and Endeavour. NASA halted shuttle missions until October while a welding fix was developed, tested, and implemented.

      On Feb. 1, 2003, a shocked world learned the news that the shuttle orbiter Columbia had broken up catastrophically over north-central Texas at an altitude of about 60 km (40 mi) as it was returning to Cape Canaveral, Florida, from a non-ISS mission. All seven crew members—five men and two women—died; among them was Ilan Ramon, the first Israeli astronaut to fly in space. One focus of the investigation into the cause of the disaster was on Columbia's left wing, which had been struck by a piece of insulation from the external tank during launch and which had been the first part of the orbiter to cease supplying sensor data during its descent.

      The ISS grew during 2002 with the attachment of the first three segments of the primary truss, the station's structural backbone. The central S0 segment, carried up by shuttle in April, was placed atop the Destiny laboratory module delivered the previous year. The rest of the truss would extend to port and starboard from the station. S1 (starboard) and P1 (port) segments, added in October and November, respectively, would hold radiators for eliminating waste heat generated by the crew and the station's systems. They would also support electrical cables supplying power to the ISS modules from the solar-panel arrays that would eventually be attached to the ends of the completed main truss. In addition, the truss segments had rails to allow the Canadian-built robot arm Canadarm2, delivered to the ISS in 2001, to travel the length of the truss and help attach new elements.

      On a separate shuttle mission in June, the reusable Leonardo Multi-Purpose Logistics Module carried supplies and gear to outfit the station. A significant piece of that cargo was the Microgravity Science Glovebox, which would allow astronauts to conduct a wide range of experiments in materials science, combustion, fluids, and other space-research fields. In September, NASA named biochemist-astronaut Peggy Whitson, then aboard the ISS, as the station's first science officer, a new position intended to emphasize the role of science on the ISS.

      Space tourism received a boost with the flight of South African businessman Mark Shuttleworth to the ISS aboard a Russian Soyuz TM in April. In contrast to the controversy surrounding Dennis Tito's similar flight in 2001, Shuttleworth's sortie received some support from NASA, and Shuttleworth carried experiments developed by South African students. Another Soyuz mission, launched to the station in October, served as a test flight for an improved version of the TM design, designated Soyuz TMA.

      A non-ISS shuttle mission in March was devoted to servicing the Hubble Space Telescope (HST) for the fourth time. The crew replaced the Faint Object Camera, the last of the HST's original science instruments, with a new Advanced Camera for Surveys, which soon provided stunning images of the universe. The crew also installed improved solar arrays and other equipment.

      China carried on in its methodical quest to place a human in space with the third and fourth unmanned test flights (launched March 25 and December 30, respectively) of its Shenzhou spacecraft, which was based on the Soviet-Russian Soyuz design. The latest flights incorporated tests of escape and life-support systems. The first human flight could come as early as 2003. China also began expressing interest in participating in the ISS program even as Russia was voicing doubts that it had the resources to continue meeting its commitments.

Space Probes.
      An important deep-space mission, NASA's Comet Nucleus Tour (CONTOUR), was lost as it was being boosted from Earth orbit on August 15. CONTOUR had been placed in a parking orbit on July 3 to await the proper moment to begin the planned trajectory that would take it within 100 km (60 mi) of comet nuclei in 2003 and 2006. After its upper stage fired, ground controllers were unable to regain contact, and tracking stations soon found debris near the planned trajectory. A preliminary investigation indicated that the stage failed and destroyed the craft.

      After reaching Mars in late 2001, NASA's 2001 Mars Odyssey spacecraft spent three months using atmospheric braking techniques to settle into the orbit selected for its science mapping mission, which began February 18. In addition to returning high-quality images of the Martian surface, Odyssey's instruments mapped the distribution of surface and near-surface elements. Some of these data suggested the presence of subsurface frozen water in large areas surrounding the poles. (See Astronomy.)

      The Galileo spacecraft's highly successful exploration of Jupiter and its moons, which began in 1995, completed its final full year in Jovian orbit. Low on propellant, Galileo made its last and closest (100-km) flyby of Jupiter's moon Io on January 17, followed by a flyby of another moon, Amalthea, on November 5. In early 2003 mission controllers were to place it on a trajectory for a fiery entry into Jupiter's atmosphere later in the year. This would eliminate the possibility of the spacecraft's crashing on, and contaminating, Europa or another moon that might harbour rudimentary life.

      Launched in February 1999, NASA's Stardust spacecraft opened its ultrapure collector arrays between August and December 2002 to capture interstellar dust particles. On November 2 it flew within 3,000 km (1,900 mi) of asteroid Annefrank, returning images and other data. This was a dress rehearsal of its planned Jan. 2, 2004, flight through the coma of Comet Wild 2, when, using separate collectors, it would gather comet dust particles. The spacecraft was to return to Earth with its collection of extraterrestrial materials in January 2006.

Unmanned Satellites.
      A unique Earth-mapping mission began on March 17 with the orbiting of the U.S.-German twin Gravity Recovery and Climate Experiment spacecraft (GRACE 1 and 2, nicknamed Tom and Jerry after the cartoon characters). By tracking the precise distance between the two spacecraft and their exact altitude and path over Earth, scientists could measure subtle variations in Earth's gravitational field and detect mass movements due to such natural activity as sea-level changes, glacial motions, and ice melting.

      Other advanced environmental research satellites sent into space during the year included the U.S. Aqua, launched May 4 as a complement to Terra (launched 1999), and the European Space Agency's Envisat 1, launched March 1. Aqua was designed to study the global water cycle in the oceans, ice caps, land masses, and atmosphere. Its six instruments were provided by the U.S., Japan, and Brazil. (See Earth Sciences: Meteorology and Climate (Earth Sciences ).) Europe's Envisat carried an array of 10 instruments to investigate global warming, the ozone hole, and desertification. China orbited its Fengyun (“Wind and Cloud”) 1D and Haiyang (“Marine”) 1 satellites on May 15. Fengyun employed a digital imager to observe clouds and monitor for floods and sandstorms. Haiyang had an ocean imager to observe chlorophyll concentration, temperatures, and other aspects of the seas. On May 4 France launched its SPOT 5 Earth-observation satellite, which carried cameras for producing high-resolution colour and black-and-white images in conventional and stereo versions. Applications of SPOT imagery ranged from specialized map products and agricultural management to defense and natural-hazard assessment.

      NASA's High Energy Solar Spectroscopic Imager (HESSI) was launched on February 5 in a successful bid to replace an earlier version lost during launch in 1999. HESSI monitored X-ray and gamma-ray energy released by solar flares. Its instruments measured the energy levels and intensity of flares across a map of the Sun's disk.

      In September NASA awarded a contract to TRW to design and build the Next Generation Space Telescope. The instrument would orbit the Sun at a gravitationally stable point about 1.5 million km (930,000 mi) from Earth on the planet's night side, and it would be named after James Webb, NASA's second administrator, who led the Apollo program and pursued a strong U.S. program of space science. Since its launch was not expected before 2010, Congress asked NASA to ensure that the HST operated as long as possible.

Launch Vehicles.
      The quest to develop safer, more cost-effective replacements for the space shuttle continued as the U.S. refocused efforts in its Space Launch Initiative. While a clear winner had yet to emerge, NASA turned its attention to multistage systems rather than the single-stage-to-orbit approach exemplified by the VentureStar project, which was canceled in 2001. Engine-design work was refined to concentrate on kerosene as a fuel rather than liquid hydrogen. Although liquid hydrogen is a more efficient source of energy than kerosene, it is also less dense and so requires larger vehicles. NASA also initiated programs to upgrade the space shuttle system and keep it flying through the year 2020 (almost 40 years after its first flight) and to develop a small Atlas- or Delta-launched spaceplane to ferry crews to and from the ISS and serve as a lifeboat for the station.

      Two new U.S. commercial launch systems made their debut. The Atlas 5, combining technologies evolved from U.S. and former Soviet ballistic missiles, made its first flight on August 21, with the Hot Bird 6 satellite as payload. The Delta IV, using the new RS-68 hydrogen-oxygen liquid-fueled engine derived from the space shuttle main engine, was delayed by a series of small problems but finally made a successful first flight November 20 carrying the Eutelsat W5 spacecraft. On September 10 Japan's H-2A rocket made its third flight, in which it placed a twin payload into orbit. The vehicle's first flight, in August 2001, went smoothly, but during the second launch on February 4, one of its two payloads, a $4.5 million reentry technology demonstrator, failed to separate and was lost. Continued success of the H-2A was deemed crucial to Japan's hopes of competing in the commercial launch market.

Dave Dooling

▪ 2002

Introduction

Mathematics
      The closeness of the 2000 U.S. presidential election highlighted the unusual characteristics of the American electoral system, such as the electoral college, in which all but a few states assign electoral votes on a winner-take-all basis, and simple plurality elections, in which the leading candidate wins without having a runoff election to establish a majority winner. Mathematicians and others had investigated voting systems in the past, and this contentious election inspired further research and discoveries in 2001. (See also Sidebar (Election Reform Debate in the U.S. ).)

      When there are only two candidates, the situation is very simple. In 1952 the American mathematician Kenneth May proved that there is only one voting system that treats all voters equally, that treats both candidates equally, and where the winning candidate would still win if he or she received more votes. That system is majority rule.

      When there are more than two candidates, as was the case in the 2000 presidential election, the situation is most unsatisfactory. Two notable voting systems have been proposed as better for multicandidate races. The first is commonly attributed to the 18th-century French mathematician Jean-Charles, chevalier de Borda. Borda's method requires each voter to rank the candidates, with the lowest candidate getting 1 point, the next lowest candidate 2 points, and so forth, up to the highest candidate, who gets as many points as there are candidates. The points from all voters are added, and the candidate with the most points wins. This system was actually first described in 1433 by Nicholas of Cusa, a German cardinal who was concerned with how to elect German kings. Today it is used in the United States to rank collegiate football and basketball teams.
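
      A Borda tally is straightforward to compute. In the sketch below, each ballot lists the candidates from most to least preferred; the candidate names and ballots are invented for illustration.

      from collections import defaultdict

      def borda_winner(ballots):
          """With k candidates, a voter's top choice earns k points,
          the next choice k - 1 points, and so on down to 1 point."""
          scores = defaultdict(int)
          for ballot in ballots:
              k = len(ballot)
              for rank, candidate in enumerate(ballot):
                  scores[candidate] += k - rank
          return max(scores, key=scores.get), dict(scores)

      ballots = [["A", "B", "C"], ["A", "C", "B"], ["B", "C", "A"],
                 ["C", "B", "A"], ["B", "A", "C"]]
      print(borda_winner(ballots))  # B wins with 11 points to A's 10 and C's 9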

      Borda believed that his system was better than the one devised by his French contemporary Marie-Jean-Antoine-Nicolas de Caritat, marquis de Condorcet. Condorcet felt that the winner should be able to defeat every other candidate in a one-on-one contest. Unfortunately, not every election has a Condorcet winner. In the 2000 presidential election, however, polls indicated that Al Gore would have been a Condorcet winner, since—with the help of supporters of Ralph Nader—he would have beaten George W. Bush in a one-on-one contest (or in a runoff election).
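
      Whether a Condorcet winner exists can be checked directly by simulating every head-to-head contest, as in this sketch, which reuses the invented ranked ballots from above; note that the Borda winner B also happens to be the Condorcet winner here.

      from itertools import combinations

      def condorcet_winner(ballots):
          """Return the candidate who beats every rival in a one-on-one
          majority contest, or None if no such candidate exists."""
          candidates = list(ballots[0])
          wins = {c: 0 for c in candidates}
          for a, b in combinations(candidates, 2):
              a_over_b = sum(ballot.index(a) < ballot.index(b)
                             for ballot in ballots)
              if 2 * a_over_b > len(ballots):
                  wins[a] += 1
              elif 2 * a_over_b < len(ballots):
                  wins[b] += 1
          for c in candidates:
              if wins[c] == len(candidates) - 1:
                  return c
          return None  # preferences form a cycle: no Condorcet winner

      ballots = [["A", "B", "C"], ["A", "C", "B"], ["B", "C", "A"],
                 ["C", "B", "A"], ["B", "A", "C"]]
      print(condorcet_winner(ballots))  # B beats both A and C head-to-head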

      Like the Borda system, the Condorcet system had already been proposed for ecclesiastical elections; it was first described in the 13th century by the Catalan philosopher and missionary Ramon Llull, who was interested in how to elect the abbess of a convent. Nicholas of Cusa made a copy of one of Llull's manuscripts before deciding he could do better by devising the Borda system. Another of Llull's manuscripts, with a more complete description of his voting system, was discovered and published in 2001 by Friedrich Pukelsheim and others at the University of Augsburg, Germany.

      Part of the reason for the great controversy between Borda and Condorcet was that neither of their systems was ideal. In fact, the American Nobel Prize-winning economist Kenneth Arrow showed in 1951 that no voting system for multicandidate elections can be both decisive (produce a Condorcet winner) and completely fair (candidates change position only with a change in their rankings). Nevertheless, after the 2000 presidential election, Americans Donald Saari and Steven Brams argued persuasively for modifying the U.S. system.

      Saari used geometry to reveal hidden assumptions in voting methods. He favoured the Borda system, believing that it more accurately reflects the true sentiment of voters and that it tends to produce more centrist winners than the plurality method. In practice, however, ranking all the candidates can be onerous, and the “broadly supported” winner may be merely everybody's third or fourth choice.

      Another criticism of the Borda system is that the electorate may vote strategically, rather than sincerely, in order to manipulate the election. Such strategic voting takes place under the current system; in the 2000 presidential election, many voters who preferred Nader voted for Gore instead, out of fear of giving the election to Bush.

      Brams favoured approval voting, which is used by some professional societies; Venetians first used it in the 13th century to help elect their magistrates. Under approval voting, voters cast one vote for every candidate they regard as acceptable; the winner is the candidate with the most votes. Approval voting has several attractive features, such as the winner always having the broadest approval and voters never having to choose between two favoured candidates.
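
      The corresponding tally is even simpler, since a ballot is just the set of candidates the voter finds acceptable. A minimal sketch, again with invented ballots:

      from collections import Counter

      def approval_winner(ballots):
          """Each ballot is a set of approved candidates; the candidate
          approved on the most ballots wins."""
          tally = Counter(c for ballot in ballots for c in ballot)
          return tally.most_common(1)[0][0], dict(tally)

      ballots = [{"A", "B"}, {"A"}, {"B", "C"}, {"A", "C"}]
      print(approval_winner(ballots))  # A wins with 3 approvals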

      Saari and Brams both agreed that the plurality method, together with the winner-take-all feature of the electoral college, has fundamentally flawed the American electoral process, preventing the election of candidates with broad support and frustrating the will of the electorate.

Paul J. Campbell

Chemistry

Carbon Chemistry.
      In 2001 Hendrik Schön and associates of Lucent Technologies' Bell Laboratories, Murray Hill, N.J., announced the production of buckminsterfullerene crystals that become superconducting at substantially warmer temperatures than previously possible. Superconductors conduct electric current without losses due to resistance when they are cooled below a certain critical temperature. In 1991 a Bell Labs team first showed that buckminsterfullerene molecules (C60), which are spherical hollow-cage structures made of 60 carbon atoms each, can act as superconductors at very low temperatures when doped with potassium atoms.

      Schön's group mixed C60 with chloroform (CHCl3) or its bromine analogue, bromoform, to create “stretched” C60 crystals. In the modified crystal structure, chloroform or bromoform molecules were wedged between C60 spheres, moving them farther apart. The altered spacing between neighbouring C60 molecules, coupled with the experimenters' use of a setup that took advantage of transistor-like effects, raised the critical temperature of the material. Tests showed that C60 mixed with bromoform became superconducting below 117 K (−249 °F), which is more than double the previous temperature record of 52 K (−366 °F) for a C60-based material set the previous year.

      Although still very cold, the record-breaking temperature was warm enough for the C60 superconductor to function while cooled by liquid nitrogen (boiling point 77 K [−321 °F]), instead of the lower-boiling and much more expensive liquid helium. The only other superconductors that operate at higher temperatures are copper oxide ceramic superconductors. These materials were used in powerful magnets, superconductive wires for power-transmission systems, and other applications, but they were expensive and had other drawbacks. Schön speculated that C60 superconductors could turn out to be cheaper. He also believed that increasing the spacing between C60 spheres in the crystal by just a small percentage could boost the critical temperature even more.

Physical Chemistry.
      Water can flow uphill, as chemical engineer Manoj K. Chaudhury demonstrated in a notable 1992 experiment that delighted and perplexed the public. Chaudhury, then at Dow Corning Corp., and George M. Whitesides of Harvard University coaxed microlitre-sized droplets of water to run uphill on the surface of a polished silicon wafer at a rate of about one millimetre per second. The secret involved the creation of a surface tension gradient—a swath of continually decreasing hydrophobicity, or tendency to repel water—across the silicon wafer. The wafer was then tilted from the horizontal so that the most hydrophobic end was lower than the least hydrophobic end. Water droplets deposited at the low end were propelled across the surface against gravity by the imbalance of surface tension forces between the uphill and downhill ends of the drop.

      In a report published during the year, Chaudhury and co-workers at Lehigh University, Bethlehem, Pa., described a technique for making water droplets move across a silicon surface hundreds of times faster than in the previous experiment, at rates of centimetres to a metre or more per second. The speeds were achieved by passing saturated steam over a relatively cool silicon surface possessing a surface tension gradient. In this case the gradient was applied radially, with the wafer's surface being most hydrophobic at the centre and least so at the circumference. As water droplets condensed on the surface from the steam, they first moved slowly outward but then rapidly accelerated as they merged with neighbouring drops. The energy that was released during drop coalescence and directionally channeled by the surface tension gradient accounted for the increased speed of the drops. Chaudhury suggested that the phenomenon could be put to practical use in heat exchangers and other heat-transfer applications and in microfabricated devices where tiny amounts of fluid need to be pumped from one component to another.

Analytic Chemistry.
      Nuclear magnetic resonance (NMR) spectroscopy was among the chemist's most important tools for studying the physical and chemical properties of plastics, glasses and ceramics, catalysts, DNA and proteins, and myriad other materials. Spectroscopy is the study of interactions between electromagnetic radiation and matter. NMR spectroscopy is based on a phenomenon that occurs when atoms of certain elements are immersed in a strong static magnetic field and exposed to radio-frequency waves. In response, the atomic nuclei emit their own radio signals that can be detected and used to understand a material's properties.

      Researchers from the U.S., France, and Denmark reported a technique for obtaining more precise NMR information about a material's atomic structure. The group, headed by Philip Grandinetti of Ohio State University at Columbus, found that spinning samples at speeds as high as 30,000 cycles per second can often boost the NMR signal strength by 10-fold or more. They termed the new technique FASTER (for “fast spinning gives transfer enhancement at rotary resonance”). Spinning materials during NMR was not new. A technique known as magic-angle spinning rotated materials at a certain angle in relation to the NMR's static magnetic field. Unfortunately, magic-angle spinning did not work well for about 70% of the chemical elements, including the common elements oxygen, aluminum, and sodium. Analysis required the averaging of weeks of test results and the use of expensive high-power amplifiers. FASTER could produce results in hours with a much less costly low-power amplifier, according to Grandinetti.

Organic Chemistry.
      French chemist Louis Pasteur, who established the basics of stereochemistry in the 1840s, tried unsuccessfully to influence biological and chemical processes toward a preference for molecules with a right-handed or a left-handed structure. For example, Pasteur rotated growing plants in an effort to change the handedness of their naturally produced chemical compounds, and he performed chemical reactions while spinning the reactants in centrifuges. Over the next century and a half, chemists tried other ways of producing an excess of either left- or right-handed chiral molecules from achiral precursors, a process termed absolute asymmetric synthesis. (Molecules that exist in right- and left-handed versions, like a pair of gloves, are said to be chiral. Molecules lacking such handedness are said to be achiral.) To date, the only acknowledged successes had come with sophisticated approaches such as the induction of reactions with circularly polarized light and chiral selection based on the electroweak force, a fundamental interaction of nature that has asymmetric characteristics. Scientists had uniformly dismissed reports of asymmetric synthesis by simple stirring—clockwise or counterclockwise rotation during the chemical conversion of an achiral compound.

      During the year Josep M. Ribó and associates of the University of Barcelona, Spain, reported convincing evidence that chiral assemblies of molecules can be produced by stirring. They used achiral porphyrins, large disk-shaped molecules made of connected organic rings. The porphyrins had a zwitterionic structure—each molecule contained both positively and negatively charged regions—which allowed them to aggregate through electrostatic interactions and hydrogen bonding. Individual porphyrin disks can assemble linearly into left-handed or right-handed helices, and when left undisturbed they formed equal amounts of each kind. Ribó showed that stirring caused the formation of chiral assemblies, with the chirality controlled by the direction of the stirring.

      The findings could shed light on the mystery of homochirality in biological systems on Earth—why the essential molecules in living things are single-handed. Natural sugars, for example, are almost exclusively right-handed; natural amino acids, left-handed. Ribó's work suggested that vortex action during early stages of chemical evolution could be the explanation.

Nuclear Chemistry.
      During the year scientists at Lawrence Berkeley National Laboratory (LBNL), Berkeley, Calif., retracted their two-year-old claim for the synthesis of the superheavy element 118. The original announcement in 1999 had gained worldwide attention because element 118 was considered to be the heaviest chemical element ever produced and was regarded as evidence for the existence of the so-called island of stability, a region of the periodic table consisting of superheavy elements with half-lives significantly longer than those of their slightly lighter superheavy neighbours on the table.

      The retraction came after confirmation experiments at LBNL and in Japan, Germany, and France had failed to reproduce the earlier results. In addition, after reviewing the original data using different analytic software, an LBNL committee of experts found no evidence for the decay chains that pointed to the existence of element 118. The LBNL researchers in 1999 had not directly observed the element. Rather, after bombarding a target of lead-208 with high-energy krypton-86 ions at LBNL's 224-cm (88-in) cyclotron, they inferred the production of three atoms of element 118 from data that they interpreted as characteristic of the way that the atoms decayed into a series of lighter elements. As part of a brief statement in Physical Review Letters, where the original results had been announced, the research team wrote: “Prompted by the absence of similar decay chains in subsequent experiments, we (along with independent experts) re-analyzed the primary data files from our 1999 experiments. Based on these re-analyses, we conclude that the three reported chains are not in the 1999 data.”

Michael Woods

Physics

Particle Physics.
      In the field of neutrino physics, years of work by large teams of researchers worldwide finally bore fruit in 2001. Of the fundamental particles that make up the standard model of particle physics, neutrinos are the most enigmatic. Their existence was postulated in 1930 to explain a mysterious loss of energy seen in the nuclear beta-decay process. Because neutrinos interact so weakly with matter, however, they are extraordinarily difficult to observe, and experimental confirmation of their existence came only a quarter century later. Three types of neutrinos were known—electron, muon, and tau neutrinos. They were generally assumed to be massless, but the question remained open until 1998, when a team at Super-Kamiokande, a mammoth neutrino detector located in a Japanese zinc mine, found the strongest evidence to that time that neutrinos indeed possess a tiny mass.

      During the year, this work was extended to solve a major puzzle in solar physics. The accepted physical model for the nuclear reactions taking place in the Sun required the emission of a large number of electron neutrinos, but decades of experimental measurement had shown only a third of the expected number arriving at Earth. Physicists working at the Sudbury Neutrino Observatory, a neutrino detector built in a Canadian nickel mine, combined their data with complementary data from Super-Kamiokande to produce direct evidence for the remaining two-thirds. Their results confirmed the theory that electron neutrinos oscillate, or transform, among the three types as they travel through space from the Sun, which explained why earlier work, sensitive chiefly to electron neutrinos, had detected only a third of the predicted number. For neutrinos to oscillate, they must have a finite mass, which was consistent with the 1998 finding from Super-Kamiokande.
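
      In the simplified two-flavour picture often used to introduce such experiments (a textbook approximation, not a result specific to SNO or Super-Kamiokande), a neutrino created as an electron neutrino with energy E still registers as one after travelling a distance L with probability

          P(\nu_e \to \nu_e) = 1 - \sin^2(2\theta)\,\sin^2\!\left(\frac{1.27\,\Delta m^2[\mathrm{eV}^2]\;L[\mathrm{km}]}{E[\mathrm{GeV}]}\right),

      where \theta is the mixing angle between the flavour states and \Delta m^2 is the difference of the squared masses. Oscillation requires \Delta m^2 \neq 0, which is why its observation implies that at least one neutrino type has nonzero mass.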

      The new results confirmed the theoretical model of the Sun's nuclear reactions with great accuracy. Because the number of emitted neutrinos depends very sensitively on the Sun's central temperature, the measured flux fixed that temperature at 15.7 million K to a precision of about 1%. At the same time, the oscillation between neutrino types would enable a better estimate for the neutrino mass, which had implications for cosmology. (See Astronomy (Mathematics and Physical Sciences ).)
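
      The sensitivity arises because the flux \Phi of the high-energy solar neutrinos used in these measurements scales as a steep power of the central temperature T; solar models put the exponent for the boron-8 branch at roughly n \approx 20-25 (an often-quoted estimate, adopted here only for illustration). Differentiating \Phi \propto T^n gives

          \frac{\delta\Phi}{\Phi} = n\,\frac{\delta T}{T}, \qquad \text{so} \qquad \frac{\delta T}{T} \approx \frac{1}{n}\,\frac{\delta\Phi}{\Phi},

      and a flux determined to a few tens of percent therefore pins down the temperature to about 1%.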

      Another result from particle physics that affected an understanding of the universe as a whole came from work on a phenomenon known as CP violation. In the standard model every matter particle has an antiparticle with the same mass but with properties such as electric charge reversed—for example, electrons and their positron counterparts. When a particle meets its antiparticle, mutual annihilation takes place with the release of energy. Conversely, a particle and its antiparticle can be created from energy. When the formation of particles and antiparticles in the hot early universe is modeled, a difficulty arises. If particles and antiparticles behave in perfectly symmetric ways, equal numbers of both sorts should now exist. Because particles vastly outnumber antiparticles in the observable universe, however, there must be some kind of asymmetry in properties between the two types of matter. In present theories a very small asymmetry would do the job, and CP violation appeared to be a possible explanation.

      Until the 1950s it was assumed that nature is symmetrical in a number of ways. One example is parity—any reaction involving particles should proceed identically when viewed in a mirror. In 1957 it was discovered that nuclear beta decay violated this symmetry. It was assumed, however, that symmetry was preserved in particle reactions involving both a reversal of parity (P) and a change of charge sign (C)—for example, the exchange of a negatively charged electron for a positively charged positron. This conservation of charge and parity considered together is called CP symmetry. In 1964 decays of K mesons were found to violate CP symmetry. During 2001 independent teams of physicists at the Stanford Linear Accelerator Center and the High Energy Accelerator Research Organization, Tsukuba, Japan, reported evidence for CP violation in the decay of another particle, the B meson. The experimental results also yielded a numerical value for the amount of CP violation, which turned out to be only about half the value that, within the standard model, would be required to produce the known universe. The work was preliminary, however, and further refinement was needed to determine whether the standard model as currently formulated was an accurate picture of nature.

      Another tantalizing suggestion of fundamental physics beyond the standard model came from a collaborative experiment at Brookhaven National Laboratory, Upton, N.Y., which made the most precise measurement yet—to one part in a billion—of the magnetic moment of the muon. (The magnetic moment of a particle is a measure of the torque it experiences in a magnetic field.) The results could give support to theories of supersymmetry, in which each fundamental particle possesses not only an antiparticle but also a heavier and as yet unobserved supersymmetric partner. Such particles might provide an explanation for the observation that most of the mass of the universe appears to be in the form of nonluminous, or dark, matter. Another hint of their existence came from results of the balloon-borne High Energy Antimatter Telescope (HEAT) experiment, which found an excess of high-energy positrons in cosmic rays. The excess positrons could be explained by collisions between superparticles.
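
      The quantity compared with theory in such experiments is conventionally the anomalous part of the moment (a standard definition, not peculiar to the Brookhaven measurement),

          a_\mu = \frac{g_\mu - 2}{2},

      where g_\mu is the muon's gyromagnetic factor. The Dirac equation alone gives exactly g = 2, so a_\mu is entirely a quantum correction, and any deviation from the standard-model prediction would signal additional particles, such as supersymmetric partners, contributing to that correction.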

Lasers and Light.
      Two achievements reported during the year could be said to span the speed range of research in optical physics. Harm Geert Muller of the FOM Institute for Atomic and Molecular Physics, Amsterdam, and collaborators produced the shortest light pulses ever measured—just 220 attoseconds (billionths of a billionth of a second; 1 attosecond = 10^−18 second) in duration. The investigators focused an intense pulse of infrared laser light on a jet of dilute argon gas, which converted some of the light into a collection of higher harmonics (multiples of the original frequency) in the ultraviolet range. The relative phases of the harmonics were such that the frequencies interfered in a special way, canceling each other except for very brief intervals during which they all added constructively. The result was a train of extremely short light spikes. Pulses this short could enable the study of a range of very fast phenomena, perhaps even the motion of electrons around atomic nuclei.
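
      The principle can be demonstrated numerically. The short Python sketch below (with illustrative harmonic orders and units, not the experiment's actual parameters) sums several phase-locked harmonics of a carrier wave; wherever the components share a common phase they add constructively, and elsewhere they largely cancel, leaving a train of narrow spikes:

          # Sum of phase-locked high harmonics -> a train of short spikes.
          # Harmonic orders and time units here are illustrative assumptions.
          import numpy as np

          t = np.linspace(0.0, 4.0 * np.pi, 4000)   # time, as phase of the drive field
          orders = [11, 13, 15, 17, 19]             # hypothetical odd harmonic orders
          field = sum(np.cos(n * t) for n in orders)
          intensity = field ** 2

          # The peak-to-mean intensity ratio grows as more phase-locked
          # harmonics are added, i.e., the spikes become sharper.
          print("peak/mean intensity:", intensity.max() / intensity.mean())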

      In 1999, working at the other end of the speed range, a group led by Lene Vestergaard Hau (see Biographies (Hau, Lene Vestergaard )) of Harvard University and the Rowland Institute for Science had demonstrated techniques for slowing a light pulse in a cloud of extremely cold gas from its normal speed of about 300,000 km (186,000 mi) per second to roughly the speed of urban automobile traffic. In 2001 Hau and her colleagues reported on a technique to halt a light pulse in a cold gas and release it at a later time. They first prepared a gas of ultracold sodium atoms and treated it with light from a so-called coupling laser, which altered the optical characteristics of the gas. They then fired a probe pulse from a second laser into the gas. Switching off the coupling beam while the probe pulse was traversing the gas brought the light to a stop and allowed all the information about it to be imprinted on the sodium atoms as a “quantum coherence pattern.” Switching on the coupling laser again regenerated a perfect copy of the original pulse. This technique could have applications for controlling and storing information in optical computers.

Condensed-Matter Physics.
      In 1995 researchers first produced a new state of matter in the laboratory—an achievement that was recognized with the 2001 Nobel Prize for Physics. (See Nobel Prizes .) Called a Bose-Einstein condensate, it comprises a collection of gaseous atoms at a temperature just above absolute zero (−273.15 °C, or −459.67 °F) locked together in a single quantum state—as uniform and coherent as a single atom. Until 2001 condensates of elements such as rubidium, lithium, and sodium had been prepared by cooling a dilute gas of atoms in their ground states. During the year separate research groups at the University of Paris XI, Orsay, and the École Normale Supérieure, Paris, succeeded in making a condensate from a gas of excited helium atoms. Because no existing lasers operated at the far-ultraviolet wavelength needed to excite helium from the ground state, the researchers used an electrical discharge to supply the excitation energy.

      Although each helium atom possessed an excitation energy of 20 eV (which was more than 100 billion times its thermal energy in the condensate), the atoms within the condensate were stabilized against release of this energy by polarization (alignment) of their spins, which greatly reduced the probability that excited atoms would collide. When the condensate came into contact with some other atom, however, all the excitation energy in its atoms was released together. This suggested the possibility of a new kind of laser that emits in the far ultraviolet.
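
      The quoted energy ratio is easy to check. Assuming a condensate temperature of order 1 microkelvin (a typical value for such experiments, not a figure from the report itself), the thermal energy per atom is

          k_B T \approx (8.6 \times 10^{-5}\ \mathrm{eV/K}) \times (10^{-6}\ \mathrm{K}) \approx 10^{-10}\ \mathrm{eV},

      so the 20 eV of stored excitation energy exceeds the thermal energy by a factor of roughly 2 \times 10^{11}, which is indeed more than 100 billion.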

      Practical devices based on such advanced techniques of atomic and optical physics were coming closer to realization. During the year a team led by Scott Diddams of the U.S. National Institute of Standards and Technology, Boulder, Colo., used the interaction between a single cooled mercury atom and a laser beam to produce the world's most stable clock, with a precision of about one second in 100 million years. Such precision could well be needed in future high-speed data transmission.
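
      Expressed as a fractional stability, one second in 100 million years corresponds to

          \frac{1\ \mathrm{s}}{10^{8}\ \mathrm{yr} \times 3.16 \times 10^{7}\ \mathrm{s/yr}} \approx 3 \times 10^{-16}.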

David G.C. Jones

Astronomy
      For information on Eclipses, Equinoxes and Solstices, and Earth Perihelion and Aphelion in 2002, see Table (Earth Perihelion and Aphelion, 2002).

Solar System.
      On Feb. 12, 2001, the unmanned spacecraft NEAR (Near Earth Asteroid Rendezvous) Shoemaker gently touched down on asteroid 433 Eros. NEAR had spent the previous 12 months in orbit about the potato-shaped object, photographing its surface features. After it landed, its onboard gamma-ray spectrometer showed that Eros has a low abundance of iron and aluminum relative to magnesium. Such proportions are found in the Sun and in meteorites called chondrites, thought to be among the oldest objects in the solar system. The observations suggested that Eros was formed some 4.5 billion years ago and did not undergo significant chemical changes after that time. In another postlanding study, a magnetometer aboard NEAR confirmed the lack of a detectable magnetic field on Eros. This finding suggested that magnetized meteorites (which constitute the majority of meteorites found on Earth) may be fragments knocked from other types of asteroids or that they acquired their magnetization on their journey to Earth.

      On September 22 another spacecraft, Deep Space 1, successfully navigated its way past Comet Borrelly, providing the best view ever of the ice particles, dust, and gas leaving comets. The spacecraft came within 2,200 km (1,360 mi) of the roughly 8 × 4-km (5 × 2.5-mi) cometary nucleus. It sent back images that showed a rough surface terrain, with rolling plains and deep fractures—a hint that the comet may have formed as a collection of icy and stony rubble rather than as a coherent solid object. From the amount of reflected light—only about 4%—the surface appeared to be composed of very dark material. Cosmochemists proposed that the surface was most likely covered with carbon and substances rich in organic compounds.

      In mid-2001 an international group of astronomers using 11 different telescopes around the world reported the discovery of 12 new moons of Saturn. This brought the total to 30, the largest number so far detected for any planet in the solar system. The moons range in diameter from 6 to 32 km (4 to 20 mi). Saturn previously had been known to have six large moons, Titan being the largest, and 12 small ones, all but one of which were classified as regular moons because they move in circular orbits in the planet's orbital plane. All of the new moons move in highly eccentric orbits, which suggested that they are remnants of larger objects that were captured into orbit around Saturn early in its history and subsequently broken up by collisions.

Stars.
      One of the most perplexing problems in modern astrophysics, an observed shortfall in the number of neutrinos arriving from the Sun relative to theoretical predictions, appeared to be finally resolved during the year. Detailed theoretical studies of nuclear reactions in the Sun's core had predicted that energy is released in the form of gamma rays, thermal energy, and neutrinos. The gamma rays and thermal energy slowly diffuse to the solar surface and are eventually observed as visible light and other electromagnetic radiation. Neutrinos are electrically neutral particles that travel almost unaffected through the Sun and interplanetary space on their way to Earth. Beginning in the late 1960s, scientists sought to detect these elusive particles directly. Because neutrinos interact so weakly with matter, detectors containing enormous quantities of mass were built to detect them. These were placed deep underground to allow neutrinos originating in the Sun to be distinguished from background galactic cosmic rays. Despite many experiments employing a variety of detectors, scientists consistently had observed only about a third of the predicted neutrino flux.

      Neutrinos come in three varieties, or flavours—electron, muon, and tau. Because nuclear fusion in the Sun's core should produce only electron neutrinos, most of the earlier experiments had been designed to detect only that flavour. The Sudbury Neutrino Observatory (SNO), sited deep inside a Canadian nickel mine, was built to have enhanced sensitivity to muon and tau neutrinos. It used as its detector a 1,000-ton sphere of extremely pure heavy water (water molecules in which the two hydrogen atoms are replaced with deuterium, one of hydrogen's heavier isotopes). A second facility, called Super-Kamiokande and located in a zinc mine in Japan, employed a tank of 50,000 tons of ultrapure ordinary water to detect electron and muon neutrinos. In 2001 the international collaboration running SNO, headed by Art McDonald of Queen's University at Kingston, Ont., reported evidence derived from SNO and Super-Kamiokande data for the detection of the missing two-thirds of the neutrino flux. The results confirmed the theory that electron neutrinos transform, or oscillate, among the three possible flavours on their journey to Earth. Oscillation also implied that neutrinos have a tiny but finite mass and thus make a contribution to the nonluminous, unobserved “dark matter” in the universe. (See Physics (Mathematics and Physical Sciences ).)

      The detection of planets orbiting other stars was first announced in 1995. By the beginning of 2001 about 50 extrasolar planets had been reported, and by year's end the number had risen to more than 70. Most of the planets found to date are quite different from those in Earth's solar system. Many are large (as much as 20 times the mass of Jupiter) and often move in elliptical orbits quite close to their parent stars.

      During the year, for the first time, a planetary system remarkably similar to Earth's solar system was detected. Geoffrey Marcy of the University of California, Berkeley, Paul Butler of the Carnegie Institution of Washington, D.C., and their collaborators reported that a star visible to the naked eye, 47 Ursae Majoris, is orbited by at least two planets. The presence of one planet had been known since 1996, but the new discovery changed astronomers' picture of the system in important ways. One planet has a mass at least three-fourths that of Jupiter, and the other has at least two and a half times Jupiter's mass. Interestingly, the ratio of their masses is close to the ratio of the masses of Saturn and Jupiter. Both extrasolar planets move in nearly circular orbits, a property that was thought to increase the odds that the system contains Earth-like planets as well.

Galaxies and Cosmology.
      Over the past 75 years, observations and theory have combined to produce a consistent model of the origin and evolution of the universe, beginning with a big-bang explosion some 10 billion to 20 billion years ago. Left behind and detectable today as a relic of this hot event is a highly uniform flux of cosmic microwave background radiation. Because the matter that is observed filling the universe attracts other matter gravitationally, the expansion rate of the universe should be slowing down. Nevertheless, observations in 1998 of the brightness of fairly distant exploding stars called Type Ia supernovas suggested that the expansion is currently accelerating. The findings were interpreted as evidence for the existence throughout space of a kind of cosmic repulsion force first hypothesized by Albert Einstein in 1917 and represented by a term, the cosmological constant, in his equations of general relativity. The supernovas observed in the studies were found to be dimmer than expected, which implied that they were farther away than a decelerating universe could account for.
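
      The role of the cosmological constant \Lambda can be seen in the textbook acceleration equation for the cosmic scale factor a(t) (quoted here for orientation; the supernova studies themselves involved detailed fits to the data),

          \frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right) + \frac{\Lambda c^{2}}{3},

      in which ordinary matter (positive density \rho and pressure p) decelerates the expansion, while a sufficiently large positive \Lambda makes \ddot{a} positive, that is, an accelerating expansion.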

      During the year Adam G. Riess of the Space Telescope Science Institute, Baltimore, Md., and collaborators reported new studies of the most distant supernova yet found, designated SN 1997ff. Their analysis of observations of the supernova, which were made with the Hubble Space Telescope, indicated that the expansion rate of the universe was slower at the time of the supernova explosion billions of years ago than it is now. Their results also refuted the possibility that intervening dust or other astrophysical effects could be an explanation for the unexpectedly dim supernovas seen in the earlier studies. SN 1997ff provided the best evidence to date that the expansion of the universe is indeed accelerating.

      The existence of galaxies and their current distribution in space to form clusters, filaments, and voids indicated that large-scale fluctuations in the density of matter were present in the very early universe, and theoretical studies indicated that the cosmic background radiation should also carry an imprint of those fluctuations in the form of slight variations in brightness across the sky. In 2001 the combined findings of three recent experiments designed to study the cosmic background radiation provided dramatic evidence for this prediction. First reported on in 2000, two of the experiments—Maxima (Millimeter Anisotropy Experiment Imaging Array) and Boomerang (Balloon Observations of Millimetric Extragalactic Radiation and Geophysics)—used balloons to carry detectors high above most of Earth's atmosphere. The third experiment—DASI (Degree Angular Scale Interferometer)—was a ground-based interferometer located at the South Pole. All three measured fluctuations in the intensity of the cosmic background radiation on various angular scales across the sky and with an accuracy of one part in 100,000. Taken together, their results implied that more than 95% of the material content of the universe is made up of at least two kinds of exotic dark matter that have gravitational effects on the observed matter. Furthermore, the studies reinforced the idea that about two-thirds of the energy content of the universe exists in the form of the repulsive gravitational force represented by the cosmological constant or some equivalent.

Kenneth Brecher

Space Exploration

Manned Spaceflight.
      Human activity in space faced an uncertain future as the International Space Station (ISS) encountered massive cost overruns and as cuts in general space spending were anticipated in response to the Sept. 11, 2001, terrorist attacks.

      Following the start of full-time manned operations in late 2000, the ISS underwent rapid expansion with the addition of several key elements. (See Table (Launches in Support of Human Spaceflight, 2001).) First to arrive was the U.S.-built Destiny laboratory module, taken into orbit February 7 by the space shuttle Atlantis. Destiny, about the size of a bus, was designed to hold 24 standard payload racks, about half of them housing equipment for research into human adaptation to space travel, materials fabrication, and the behaviour of fluids and fires in microgravity. Because of weight limitations on shuttle cargos, the module was only partially outfitted inside and out at launch. The next mission, conducted in March by the shuttle Discovery, took up the Leonardo Multi-Purpose Logistics Module. Contributed by the Italian Space Agency as a reusable cargo carrier, Leonardo carried supplies and equipment for the station and transported trash back to Earth. Astronauts also conducted space walks to prepare the ISS for attachment of the Canadian-built robot arm. Three of Discovery's crew stayed aboard the station as the Expedition Two crew, while the original Expedition One crew, which had occupied the ISS since Nov. 2, 2000, returned to Earth on the shuttle.

      A month later the shuttle Endeavour took up the Canadarm2 robot arm and Raffaello, another Italian-built logistics module. Addition of the arm (derived from the earlier Canadarm carried on the shuttle since 1981) would let the ISS crew position new modules as they arrived. Because Canadarm2 could relocate itself along rails on the ISS exterior, it could reach virtually any location where work had to be done. More capability was added in July when Atlantis took up the Joint Airlock (called Quest), which allowed the ISS crew to conduct space walks independent of the shuttle. Further outfitting was conducted in August by the crew of Discovery, which delivered Leonardo to the ISS a second time. The mission also took the Expedition Three crew to relieve the Expedition Two crew. In September, using an expendable launcher, Russia sent up a Docking Compartment; the module carried an additional docking port for Soyuz and Progress spacecraft and an airlock for space walks. Previously the ISS had only two Soyuz/Progress-style ports, which had necessitated some juggling when new craft arrived. On December 5, after a six-day delay caused by an ISS docking problem with a Progress cargo ferry, Endeavour lifted off for the space station to carry out another crew exchange and deliver cargo in Raffaello once again.

      The future of the ISS became clouded with the revelation in early 2001 that budget estimates were running $4 billion over plan. In response, NASA moved to cancel the U.S. habitat module and Crew Return Vehicle, or lifeboat, that would allow the station to house a crew of seven. With the crew restricted to three, virtually no crew time would be left for research, and the station would effectively be crippled as a science tool. At year's end NASA was negotiating with its European partners to have them pick up the responsibilities for finishing the habitat and lifeboat.

      Russia's aging space station, Mir, was deliberately destroyed when mission controllers remotely commanded a docked Progress tanker to fire rockets and lower the station into Earth's atmosphere, where it burned up on March 23. Mir, whose core module was launched in 1986 and served as the nucleus of an eventual six-module complex, had operated long beyond its planned five-year lifetime.

      China continued development of a human spaceflight capability with the second unmanned flight test of its Shenzhou (“Divine Ship” or “Magic Vessel”) spacecraft in early January. The Shenzhou design was derived from Russia's Soyuz craft. The descent module returned to Earth after a week in orbit, but the little news that was released afterward raised doubts about its success. Analysts disagreed on when China would conduct its first manned space mission but expected it to happen within a few years.

Space Probes.
      The high point of the year occurred on February 12 when the Near Earth Asteroid Rendezvous spacecraft (NEAR; officially, NEAR Shoemaker) touched down on asteroid 433 Eros, becoming the first spacecraft to land on a small body. NEAR had been orbiting Eros since Feb. 14, 2000, while taking thousands of video images and laser rangefinder readings to map the asteroid in detail. As the spacecraft ran low on fuel, controllers moved it into a lower orbit that let it collide gently with the surface of the rotating rock—a “soft” hard landing, a task for which it was not designed—and gather data on the surface. (See Astronomy (Mathematics and Physical Sciences ).)

      NASA launched the 2001 Mars Odyssey spacecraft on April 7 on a mission to study Mars from orbit and serve as a communications relay for U.S. and international landers scheduled to arrive in 2003 and 2004. On October 23 Mars Odyssey entered into a Mars orbit, where it spent the next several weeks using the Martian atmosphere as a brake to reshape its orbit for a 917-day mapping mission. Visible-light, infrared, and other instruments would collect data on the mineral content of the surface, including possible water locations, and the radiation hazards in the orbital environment.

      The Cassini mission to Saturn, which carried the European-built Huygens probe designed to explore Saturn's moon Titan, continued toward its goal following a trajectory-assist flyby of Jupiter in late 2000 and early 2001 and returned images in conjunction with the Galileo spacecraft orbiting Jupiter. Cassini was to arrive at Saturn in 2004. Although finished with its official primary and extended missions, Galileo continued to operate during the year with additional flybys of Jupiter's moons Callisto and Io.

      NASA's Deep Space 1, launched in October 1998, made a final plunge past a comet before ending its extended mission in December. The probe was designed to demonstrate several new technologies in the space environment, including an ion engine. After completing its primary mission in 1999, it was kept operational to allow it to fly within 2,200 km (1,400 mi) of the nucleus of Comet Borrelly, which it imaged in impressive detail.

      NASA's Microwave Anisotropy Probe (MAP) was launched on June 30 into a temporary Earth orbit and later moved to its permanent station in space about 1.5 million km (930,000 mi) from Earth, where it would use a pair of thermally isolated microwave telescopes to map small variations in the background radiation of the universe. These irregularities, discovered by the Cosmic Background Explorer (launched 1989), were believed to correspond to density differences in the early universe that gave rise to today's galaxies. NASA launched the Genesis probe on August 8 to gather 10–20 micrograms of particles of the solar wind. The material would be captured on ultrapure collector arrays exposed for more than two years in space and then returned to Earth for analysis in 2004. The collected particles could provide clues to the composition of the original nebula that formed the solar system.

Unmanned Satellites.
      On February 20 Russia launched Sweden's Odin satellite, which carried a 1.1-m (43-in) radio telescope as its main instrument. Using two separate operating modes, the dual-mission craft was designed to observe radiation from a variety of molecular species to elucidate ozone-depletion mechanisms in Earth's atmosphere and star-formation processes in deep space. The Ukrainian-built Coronas-F satellite, launched by Russia on July 31, carried X-ray, radio, and particle instruments to study solar activity.

      Other launches included the Geosynchronous Lightweight Technology Experiment (GeoLITE; May 18), an advanced technology demonstration satellite carrying experimental and operational communications equipment for the U.S. military, and a twin payload (December 7) comprising Jason-1, a French-U.S. ocean-surface topography satellite designed as a follow-on to the highly successful TOPEX/Poseidon satellite launched in 1992, and the Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite, which would study the effects of the Sun and human activity on Earth's middle and upper atmosphere.

Launch Vehicles.
      NASA's plans to reduce the cost of getting payloads to orbit were set back by the cancellation of two high-profile reusable launch vehicle (RLV) projects. The X-33 subscale test craft was to have been a technology demonstrator for a larger single-stage-to-orbit VentureStar RLV. The aircraft-launched X-34 RLV test rocket would have demonstrated technologies for low-cost orbiting of smaller payloads. Both projects ran into technical problems that led NASA to decide that further investment would not save either project. In their place NASA set up the Space Launch Initiative to focus on advancing individual technologies rather than complete systems while continuing to pursue a next-generation RLV.

      Boeing's new Delta IV launcher moved toward its first planned flight in 2002 with the delivery in 2001 of the first common booster core to Cape Canaveral, Florida, and successful ground firing tests of its new RS-68 hydrogen-oxygen liquid-fueled engine. The Delta IV family would be able to boost payloads of 8,000–23,000 kg (17,600–50,600 lb) into low Earth orbit. India carried out the first successful launch of its Geosynchronous Satellite Launch Vehicle on April 18 and thereby took an important step closer to entering the commercial space market. On August 29 Japan's National Space Development Agency launched its first H-2A rocket, a revamped version of the troubled H-2 that was intended to compete with Europe's Ariane launcher and support Japan's partnership in the ISS. The H-2 family used a liquid-hydrogen–fueled first stage and twin solid rocket boosters. On September 29 NASA and the state of Alaska inaugurated a new launch complex on Kodiak Island with the successful launch of the Kodiak Star payload (comprising four small satellites) by an Athena I launcher. The Kodiak location, which faced south across the open Pacific Ocean, was ideal for launching satellites into a variety of polar (north-south) orbits.

Dave Dooling

▪ 2001

Introduction

Mathematics
      In August 2000 the American Mathematical Society convoked a weeklong meeting in Los Angeles devoted to “Mathematical Challenges of the 21st Century.” The gathering featured 30 plenary speakers, including eight winners of the quadrennial Fields Medal, a distinction comparable to a Nobel Prize. In assembling at the start of the new century, the participants jointly undertook a task analogous to one accomplished by a single person 100 years earlier. At the Second International Congress of Mathematicians in Paris in August 1900, the leading mathematician of the day, David Hilbert of the University of Göttingen, Ger., had set out a list of 23 “future problems of mathematics.” The list included not only specific problems but also whole programs of research. Some of Hilbert's problems were completely solved in the 20th century, but others led to prolonged, intense effort and to the development of entire fields of mathematics.

      The talks in Los Angeles included topics of applied mathematics that could not have been imagined in Hilbert's day—for example, the physics of computation, the complexity of biology, computational molecular biology, models of perception and inference, quantum computing and quantum information theory, and the mathematical aspects of quantum fields and strings. Other topics, such as geometry and its relation to physics, partial differential equations, and fluid mechanics, were ones that Hilbert would have found familiar. Just as Hilbert could not have anticipated all the themes of mathematical progress for 100 years into the future, mathematicians at the 2000 conference expected that the emphases within their subject would be reshaped by society and the ways that it applied mathematics.

      The reputation and cachet of Hilbert, together with the compactness of his list, were enough to spur mathematical effort for most of the 20th century. On the other hand, major monetary rewards for the solution of specific problems in mathematics were few. The Wolfskehl Prize, offered in 1908 for the resolution of Fermat's last theorem, amounted to $50,000 when it was awarded in 1995 to Andrew Wiles of Princeton University. The Beal Prize of $50,000 was offered in 1998 for the proof of the Beal conjecture—that is, apart from the case of squares, no two powers of integers sum to another power, unless at least two of the integers have a common factor. Unlike Nobel Prizes, which include a monetary award of about $1 million each, the Fields Medal in mathematics carried only a small award—Can$15,000, or about U.S. $9,900.

      A major development in 2000 was the offer of $1 million each for the solution of some famous problems. In March, as a promotion for a fictional work about a mathematician, publishers Faber and Faber Ltd. and Bloomsbury Publishing offered $1 million for a proof of Goldbach's conjecture—that every even integer greater than 2 is the sum of two prime numbers. The limited time (the offer was to expire in March 2002) would likely be too short to stimulate the needed effort.
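
      The conjecture is simple to state and to test mechanically for small cases, although no finite check amounts to a proof. A minimal Python sketch:

          # Check Goldbach's conjecture (every even integer > 2 is the sum
          # of two primes) for small even numbers; a finite check like this
          # illustrates the statement but proves nothing.
          def is_prime(n):
              if n < 2:
                  return False
              d = 2
              while d * d <= n:
                  if n % d == 0:
                      return False
                  d += 1
              return True

          for even in range(4, 1000, 2):
              if not any(is_prime(p) and is_prime(even - p)
                         for p in range(2, even // 2 + 1)):
                  print("counterexample:", even)
                  break
          else:
              print("Goldbach holds for all even numbers up to 998")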

      More enduring prizes were offered in May by the Clay Mathematics Institute (CMI), Cambridge, Mass., which designated a $7 million prize fund for the solution of seven mathematical “Millennium Prize Problems” ($1 million each), with no time limit. The aim was to “increase the visibility of mathematics among the general public.” Three of the problems were widely known among mathematicians: P versus NP (are there more efficient algorithms for time-consuming computations?), the Poincaré conjecture (if every loop on a compact three-dimensional manifold can be shrunk to a point, is the manifold topologically equivalent to a sphere?), and the Riemann hypothesis (all zeros of the Riemann zeta function lie on a specific line). The other four were in narrower fields and involved specialized knowledge and terminology: the existence of solutions for the Navier-Stokes equations (descriptions of the motions of fluids), the Hodge conjecture (algebraic geometry), the existence of Yang-Mills fields (quantum field theory and particle physics), and the Birch and Swinnerton-Dyer conjecture (elliptic curves).

      Hilbert tried to steer mathematics in directions that he regarded as important. The new prizes concentrated on specific isolated problems in already-developed areas of mathematics. Nevertheless, as was noted at the May prize announcement by Wiles, a member of CMI's Scientific Advisory Board, “The mathematical future is by no means limited to these problems. There is a whole new world of mathematics out there, waiting to be discovered.”

Paul J. Campbell

Chemistry

Organic Chemistry.
      After more than a decade of effort, University of Chicago organic chemists in 2000 reported the synthesis of a compound that could prove to be the world's most powerful nonnuclear explosive. Octanitrocubane (C8[NO2]8) has a molecular structure once regarded as impossible to synthesize—eight carbon atoms tightly arranged in the shape of a cube, with a nitro group (NO2) projecting outward from each carbon.

      Philip Eaton and colleagues created octanitrocubane's nitro-less parent, cubane (C8H8), in 1964. Later, he and others began the daunting task of replacing each hydrogen atom with a nitro group. Octanitrocubane's highly strained 90° bonds, which store large amounts of energy, and its eight oxygen-rich nitro groups accounted for the expectations of its explosive power. Eaton's team had yet to synthesize enough octanitrocubane for an actual test, but its density (a measure of explosive power)—about 2 g/cc—suggested that it could be extraordinarily potent. Trinitrotoluene (TNT), in contrast, has a density of 1.53 g/cc; HMX, a powerful military explosive, has a density of 1.89 g/cc. Eaton pointed out that the research yielded many new insights into the processes underlying chemical bonding. His group also had indications that cubane derivatives interact with enzymes involved in Parkinson disease and so could have therapeutic applications.

      Oligosaccharides are carbohydrates made of a relatively small number of units of simple sugars, or monosaccharides. These large molecules play important roles in many health-related biological processes, including viral and bacterial infections, cancer, autoimmune diseases, and rejection of transplanted organs. Researchers wanted to use oligosaccharides in the diagnosis, treatment, and prevention of diseases, but, because of the great difficulty involved in synthesizing specific oligosaccharides in the laboratory, the potential for these compounds in medicine remained unfulfilled. Conventional synthesis techniques were labour-intensive, requiring specialized knowledge and great chemical skill.

      Peter H. Seeberger and associates at the Massachusetts Institute of Technology reported the development of an automated oligosaccharide synthesizer that could ease those difficulties. Their device was a modified version of the automated synthesizer that revolutionized the synthesis of peptides. Peptides are chains of amino acids—the building blocks of antibiotics, many hormones, and other medically important substances.

      The oligosaccharide synthesizer linked together monosaccharides. It fed monosaccharide units into a reaction chamber, added programmed amounts of solvents and reagents, and maintained the necessary chemical conditions for the synthesis. Seeberger described one experiment in which it took just 19 hours to synthesize a certain heptasaccharide (a seven-unit oligosaccharide), with an overall yield of 42%. Manual synthesis of the same heptasaccharide took 14 days and had an overall yield of just 9%. Seeberger emphasized, however, that additional developmental work would be needed to transform the machine into a commercial instrument widely available to chemists.
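
      The gap between those overall yields reflects how per-step efficiencies compound. Building a heptasaccharide requires six couplings, so if each coupling is assumed equally efficient (a simplification made here only for illustration), the average yield per coupling y satisfies y^6 = overall yield:

          y_{\text{machine}} \approx 0.42^{1/6} \approx 0.87, \qquad y_{\text{manual}} \approx 0.09^{1/6} \approx 0.67,

      that is, roughly 87% versus 67% per coupling step.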

Nuclear Chemistry.
      The periodic table of elements lays out the building blocks of matter into families based on the arrangement of electrons in each element's reactive outer electron shell. Although the table has been highly accurate in predicting the properties of new or as-yet-undiscovered elements from the properties of known family members, theorists believed that it might not work as well for extremely heavy elements that lie beyond uranium on the table. The heavier an element, the faster the movement of its electrons around the nucleus. According to Einstein's theory of relativity, the electrons in a very massive element may move fast enough to show effects that would give the element weird properties. Elements 105 and 106—dubnium and seaborgium, respectively—showed hints of such unusual behaviour, and many nuclear chemists suspected that element 107, bohrium, would exhibit a more pronounced strangeness.

      Andreas Türler of the Paul Scherrer Institute, Villigen, Switz., and co-workers reported that relativistic effects do not alter bohrium's predicted properties. Türler and associates synthesized a bohrium isotope, bohrium-267, that has a half-life of 17 seconds. That half-life was long enough for ultrafast chemical analysis to show that bohrium's reactivity and other properties are identical to those predicted by the periodic table. How heavy, then, must an element be for relativistic effects to appear? Türler cited the major difficulty in searching for answers—the short half-lives of many superheavy elements, which often are in the range of fractions of a second, do not allow enough time for chemical analysis.

Applied Chemistry.
      Polyolefins account for more than half of the 170 million metric tons of polymers or plastics produced around the world each year. Polyolefins, which include polyethylene and polypropylene, find use in food packaging, textiles, patio furniture, and a wide assortment of other everyday products. Demand for polyolefins was growing as new applications were found and as plastics replaced metal, glass, concrete, and other traditional materials.

      Robert H. Grubbs and associates of the California Institute of Technology (Caltech) reported the development of a new family of nickel-based catalysts that could simplify production of polyolefins. The catalysts also could permit synthesis of whole new kinds of “designer” plastics with desirable properties. Existing catalysts for making plastics were far from ideal. They demanded extremely clean starting materials as well as cocatalysts in order to grow polymers properly. In addition, they did not tolerate the presence of heteroatoms—that is, atoms such as oxygen, nitrogen, and sulfur within the ring structures of the starting materials. The Caltech team's catalysts, however, did not need a cocatalyst and tolerated less-pure starting materials and heteroatoms. They could polymerize ethylene in the presence of functional additives such as ethers, ketones, esters, alcohols, amines, and water. By altering the functional groups, chemists would be able to design polymers with a wide variety of desired mechanical, electrical, and optical properties.

      Radioactive nuclear waste from weapons, commercial power reactors, and other sources was accumulating in industrial countries around the world. The waste caused concern because of uncertainty over the best way of isolating it from the environment. Nuclear waste may have to be stored for centuries just for the most dangerous radioactive components to decay. The waste-storage containers used in the U.S. had a design life of about 100 years, rather than the thousands of years that were required of long-term storage media. Current research into long-term storage focused on first encapsulating the waste in a radiation-resistant solid material before putting it into a container for underground entombment in a geologically stable formation.

      A research team headed by Kurt E. Sickafus of Los Alamos (N.M.) National Laboratory reported a new family of ceramic materials that appeared virtually impervious to the damaging effects of radiation. The compounds, a class of complex oxides having the crystal structure of the mineral fluorite (CaF2), could be the ideal materials in which to encapsulate and store plutonium and other radioactive wastes for long periods. Radiation gradually knocks atoms out of their normal positions in the crystalline structure of materials, which causes them to deteriorate. Sickafus's group developed a fluorite-structured oxide of erbium, zirconium, and oxygen (Er2Zr2O7) that showed strong resistance to radiation-induced deterioration. They believed that related compounds that would be even more radiation-resistant could be developed by the use of Er2Zr2O7 as a model.

      Shortly after the first synthesis of plutonium in 1940, chemists realized that the new element, which eventually would be used in nuclear weapons, could exist in several oxidation states. Evidence suggested that plutonium dioxide (PuO2) was the most chemically stable oxide. It seemed to remain stable under a wide range of conditions, including temperatures approaching 2,000 °C (about 3,600 °F). Belief in the stability of PuO2 went unchallenged for more than 50 years and led to its use in commercial nuclear reactor fuels in Russia and Western Europe and to steps toward similar use in Japan and the U.S. In addition, PuO2 was the form in which plutonium from dismantled nuclear weapons would be stored.

      John M. Haschke and associates at Los Alamos National Laboratory reported during the year that PuO2 is less stable than previously believed. Their results showed that water can slowly oxidize solid crystalline PuO2 to a phase that can contain more than 25% of the plutonium atoms in a higher oxidation state, with gradual release of explosive hydrogen gas. This new phase, represented as PuO2+x, is stable only to 350 °C (about 660 °F). In addition, it is relatively water-soluble, which raised the possibility that plutonium that comes into contact with water in underground storage facilities could migrate into groundwater supplies.

“Green” Chemistry.
      Supercritical carbon dioxide (CO2) continued to receive attention as a possible “green solvent.” Green solvents are nontoxic compounds, environmentally friendly alternatives to the organic solvents used in many important industrial processes, including the manufacture of medicines, textiles, and plastics. Supercriticality occurs in gases such as CO2 when they are taken above specific conditions of temperature and pressure (the critical point). Supercritical CO2 has fluidlike properties somewhere between gases and liquids and a combination of desirable characteristics from both states. Although supercriticality was known to enhance the solvent capacity of CO2, supercritical CO2 remained a feeble solvent for many substances of interest. Special solubility-enhancing additives called CO2-philes and very high pressures were employed to make supercritical CO2 an industrially useful solvent, but the high cost of these measures was limiting its potential.

      Eric J. Beckman's group at the University of Pittsburgh (Pa.) reported synthesis of a series of CO2-phile compounds called poly(ether-carbonate)s that dissolve in CO2 at lower pressures and could make the use of supercritical CO2 a more economically feasible process. The compounds are co-polymers—chainlike molecules made from repeating units of two or more simpler compounds—and they can be prepared from inexpensive starting materials such as propylene oxide. Beckman found that the co-polymers performed substantially better than traditional CO2-philes, which contained expensive fluorocarbon compounds.

Michael Woods

Physics

Particle Physics.
      The standard model, the mathematical theory that describes all of the known elementary particles and their interactions, predicts the existence of 12 kinds of matter particles, or fermions. Until 2000 all but one had been observed, the exception being the tau neutrino. Neutrinos are the most enigmatic of the fermions, interacting so weakly with other matter that they are incredibly difficult to observe. Three kinds of neutrinos were believed to exist—the electron neutrino, the muon neutrino, and the tau neutrino—each named after the particle with which it interacts.

      Although indirect evidence for the existence of the tau neutrino had been found, only during the year did an international team of physicists working at the DONUT (Direct Observation of the Nu Tau) experiment at the Fermi National Accelerator Laboratory (Fermilab) near Chicago report the first direct evidence. The physicists' strategy was based on observations of the way the other two neutrinos interact with matter. Electron neutrinos striking a matter target were known to produce electrons, whereas muon neutrinos under the same conditions produced muons. In the DONUT experiment, a beam of highly accelerated protons bombarded a tungsten target, creating the anticipated tau neutrinos among the spray of particle debris from the collisions. The neutrinos were sent through thick iron plates, where on very rare occasions a tau neutrino interacted with an iron nucleus, producing a tau particle. The tau was detected, along with its decay products, in layers of photographic emulsion sandwiched between the plates. In all, four taus were found, enough for the DONUT team to be confident of the results.

      Six of the fermions in the standard model are particles known as quarks. Two of them, the up quark and the down quark, make up the protons and neutrons, or nucleons, that constitute the nuclei of familiar matter. Under the low-energy conditions prevalent in the universe today, quarks are confined within the nucleons, bound together by the exchange of particles called gluons. It was postulated that, in the first few microseconds after the big bang, however, quarks and gluons existed free as a hot jumble of particles called a quark-gluon plasma. As the plasma cooled, it condensed into the ordinary nucleons and other quark-containing particles presently observed.

      In February physicists at the European Laboratory for Particle Physics (CERN) near Geneva reported what they claimed was compelling evidence for the creation of a new state of matter having many of the expected features of a quark-gluon plasma. The observations were made in collisions between lead ions that had been accelerated to extremely high energies and lead atoms in a stationary target. It was expected that a pair of interacting lead nuclei, each containing more than 200 protons and neutrons, would become so hot and dense that the nucleons would melt fleetingly into a soup of their building blocks. The CERN results were the most recent in a long quest by laboratories in both Europe and the U.S. to achieve the conditions needed to create a true quark-gluon plasma. Some physicists contended that unambiguous confirmation of its production would have to await results from the Relativistic Heavy Ion Collider (RHIC), which went into operation in midyear at Brookhaven National Laboratory, Upton, N.Y. RHIC would collide two counterrotating beams of gold ions to achieve a total collision energy several times higher—and thus significantly higher temperatures and densities—than achieved at CERN.

Solid-State Physics.
      New frontiers in solid-state physics were being opened by the development of semiconductor quantum dots. These are isolated groups of atoms, numbering approximately 1,000 to 1,000,000, in the crystalline lattice of a semiconductor, with the dimensions of a single dot measured in nanometres (billionths of a metre). The atoms are coupled quantum mechanically so that electrons in the dot can exist only in a limited number of energy states, much as they do in association with single atoms. The dot can be thought of as a giant artificial atom having light-absorption and emission properties that can be tailored to various uses. Consequently, quantum dots were being investigated in applications ranging from the conversion of sunlight into electricity to new kinds of lasers. Researchers at Toshiba Research Europe Ltd., Cambridge, Eng., and the University of Cambridge, for example, announced the development of photodetectors based on quantum-dot construction that were capable of detecting single photons. Unlike present single-photon detectors, these did not rely on high voltages or electron avalanche effects and could be made small and robust. Applications could include astronomical spectroscopy, optical communication, and quantum computing.
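
      The size dependence of those energy states can be estimated with the elementary particle-in-a-box model (a textbook idealization, not a calculation for any particular device): an electron of mass m confined to a box of width L may take only the energies

          E_n = \frac{n^{2} h^{2}}{8 m L^{2}}, \qquad n = 1, 2, 3, \ldots,

      so shrinking the dot spreads the levels farther apart, which is why a dot's absorption and emission wavelengths can be tuned simply by changing its size.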

Lasers and Light.
      Lasers had become increasingly powerful since the first one was demonstrated in 1960. During the year independent groups of physicists at the Lawrence Livermore National Laboratory, Livermore, Calif., and the Rutherford Appleton Laboratory, Chilton, Eng., reported using two of the world's most powerful lasers to induce fission in uranium nuclei. Each laser, the Petawatt laser in the U.S. and the Vulcan laser in England, could deliver a light pulse with an intensity exceeding a quintillion (10^18) watts per square centimetre. In both experiments the powerful electric field associated with the laser pulse accelerated electrons nearly to the speed of light over a microscopic distance, whereupon they collided with the nuclei of heavy atoms. As they decelerated in these collisions, the electrons shed their excess energy in the form of energetic gamma rays, which then struck samples of uranium-238. In a process called photonuclear fission, the gamma rays destabilized some of the uranium nuclei, causing them to split. Although laser-induced fission would not seem to be a practical source of nuclear energy (more energy is needed to power the laser than is released in the fission process), the achievements improved the prospects of using lasers to induce and study a variety of nuclear processes.

      A development of definite practical significance was reported by scientists at Lucent Technologies' Bell Laboratories, Murray Hill, N.J., who devised the first electrically powered semiconductor laser based on an organic material. Their feat could open the way to the development of cheaper lasers that emit light over a wide range of frequencies, including visible colours. Conventional semiconductor lasers, which were used in a vast array of applications from compact-disc players to fibre-optic communications, were made of metallic elements that required handling in expensive facilities similar to those needed for silicon-chip manufacture and were somewhat limited in their range of colours.

      The Bell Labs organic laser employed a high-purity crystal of tetracene placed between two different kinds of field-effect transistors (FETs). When a voltage was applied to the FETs, one device sent negative charges (electrons) into the crystal, and the other created positive charges (holes, or electron vacancies). As electrons and holes combined, they emitted photons that triggered the lasing process, which resulted in a yellow-green light pulse. Despite the apparent requirement for high-purity organic crystals, refinements in manufacturing processes could eventually make organic lasers quite economical. Substitution of other organic materials for tetracene should allow a range of lasers of different colours.

      The propagation of light continued to be a topic of interest long after A.A. Michelson and E.W. Morley discovered in the 1880s that the speed of light is independent of Earth's motion through space. Their result ultimately led Albert Einstein to postulate in 1905 in his special theory of relativity that the speed of light in a vacuum is a fundamental constant. Astronomer Kenneth Brecher of Boston University carried out a rigorous test of that postulate during the year, confirming that any variation in the speed of light due to the velocity of the source, if it exists at all, must be smaller than one part in 10^20. Brecher studied cosmically distant violent explosions known as gamma-ray bursts, hundreds of which were detected every year by Earth-orbiting astronomical satellites as brief pulses of high-energy radiation. He reasoned that, if the matter that emits the gamma rays in such an explosion is flying at high speed in many different directions, then any effect imposed on the speed of the radiation by the different velocities of the source would create a speed dispersion in the observed radiation coming from a burst. This dispersion would be manifested in the burst's light curve, the way that the burst brightened and dimmed over time. Analyzing the light curves from a number of these phenomena, however, Brecher found no such effect.
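
      The power of the method lies in the enormous travel time. As an order-of-magnitude illustration (the figures here are round assumptions, not Brecher's published values), light from a burst a billion light-years away travels for about T \approx 3 \times 10^{16} seconds; if millisecond structure survives in the light curve, any dispersion in speed must satisfy

          \frac{\Delta c}{c} \lesssim \frac{\Delta t}{T} \approx \frac{10^{-3}\ \mathrm{s}}{3 \times 10^{16}\ \mathrm{s}} \approx 3 \times 10^{-20},

      which is the order of the quoted limit of one part in 10^20.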

      Reports of two experiments had physicists debating and carefully restating the meaning of the speed of light as a fundamental speed limit, a necessary part of the theory of relativity. Anedio Ranfagni and co-workers at the Electromagnetic Wave Research Institute of the Italian National Research Council, Florence, succeeded in sending microwave-frequency radiation through air at a speed somewhat faster than that of light by modulating a microwave pulse. At the NEC Research Institute, Princeton, N.J., Lijun Wang pushed the speed of a pulse of visible light much higher than the speed of light in a vacuum by propagating it through a chamber filled with optically excited cesium gas. Such results were not necessarily in contradiction with relativity theory, but they demanded a more careful consideration of what defines the transfer of information by a light beam. If information could travel faster than the speed of light in a way that allowed it to be interpreted and used, it would, in essence, be a preview of the future that could be used to alter the present. It would violate the principle of causality, in which an effect must follow its cause.

David G.C. Jones

Astronomy
      For information on Eclipses, Equinoxes and Solstices, and Earth Perihelion and Aphelion in 2001, see Table (Earth Perihelion and Aphelion, 2001).

Solar System.
      In 2000 the search for places in the solar system other than Earth with conditions hospitable enough for life gained support from recent studies of images taken by NASA's Mars Global Surveyor spacecraft, which went into orbit around the planet in 1997. High-resolution photographs of some of Mars's coldest regions revealed surface features suggesting that liquid water may have flowed just beneath the Martian surface, occasionally bursting through the walls of craters and valleys to run down and form gullies like those caused by water erosion on Earth. Michael Malin and Kenneth Edgett of Malin Space Science Systems, San Diego, Calif., who reported the results, found that, of more than 50,000 photographs taken by Surveyor, some 150 revealed the presence of as many as 120 such features. Remarkably, the features were found at high Martian latitudes, where the temperature is much colder than at the planet's equator. Furthermore, from the lack of visible subsequent erosion or small craters in the vicinity, the gullies appeared to be no more than a million years old. Because of the low atmospheric pressure on Mars, any liquid water appearing on the surface should have quickly evaporated. In addition, if subsurface water was present, the cold Martian crust should have kept it in the form of solid ice. Therefore, questions were raised concerning Malin and Edgett's interpretation of the Surveyor images. Nonetheless, they sparked renewed interest in looking for life on Mars even at high latitudes.

      After a four-year trip, the Near Earth Asteroid Rendezvous (NEAR) spacecraft reached its final destination. Its target was 433 Eros, the largest of the near-Earth asteroids—i.e., asteroids that can pass inside the orbit of Mars. Arriving at Eros on February 14 (appropriately, Valentine's Day), NEAR became the first spacecraft to be placed in a gravitationally bound orbit around an asteroid. It immediately began a yearlong survey that included taking photographic images, making X-ray and gamma-ray spectroscopic measurements, conducting magnetic-field studies, and collecting other data from the object. The earliest images showed Eros to be elongated, some 33 × 15 km (about 20 × 9 mi), and riddled with craters. With a density about that of Earth's crust, Eros appeared to be a solid object, not just a gravel pile. By year's end NEAR Shoemaker (the spacecraft had been renamed to honour the late planetary scientist Eugene Shoemaker) was maneuvered to within five kilometres (three miles) of Eros, where it revealed a wealth of surface detail, including boulders as small as 1.4 m (4.6 ft) across. Taken together, the pictures and other data showed Eros to be a primitive object, seemingly unchanged since the birth of the solar system except for its surface, which was cratered and crushed into rubble by billions of years of meteoritic impacts.

      The year included a host of discoveries of new solar system objects. Astronomers using the Spacewatch telescope on Kitt Peak, Arizona, concluded that a previously reported asteroid, which they had discovered, was actually a moon of Jupiter, the 17th known. The tiny object, which revolves in orbit some 24 million km (15 million mi) from Jupiter in about two Earth years, does so in a direction opposite that of the other Jovian moons. Astronomers thus concluded that it probably was an asteroid that had been captured by Jupiter's enormous gravitational pull, rather than an original moon formed along with the planet itself. Brett Gladman of the Centre National de la Recherche Scientifique in France and an international team of astronomers, using telescopes in Chile and Hawaii, discovered four new moons for Saturn. This brought the total number of known Saturnian moons to 22, surpassing the 21 moons discovered to date for the planet Uranus. Like the recently discovered moon of Jupiter, the new moons of Saturn are small—only some 10–50 km (6–30 mi) across—and appear to have been captured. Taken together, these new discoveries should help clarify the way in which planets capture asteroids. At year's end Charles Baltay of Yale University and collaborators announced the discovery of a minor planet that orbits the Sun between Neptune and Pluto in a period of 243 years. The object, designated 2000 EB₁₇₃, is about 650 km (400 mi) across, roughly a fourth the size of Pluto. Although there were at least 300 objects known to orbit in the trans-Neptunian region called the Kuiper belt, this was by far the largest other than Pluto itself.
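
      The quoted period pins down the new object's mean distance through Kepler's third law. The following minimal check is our own arithmetic, in Python, and is not part of the discovery report.

      # Kepler's third law in solar units: a**3 = P**2, with the semimajor
      # axis a in astronomical units and the period P in years. A 243-year
      # period implies a mean distance near 39 AU, between Neptune (~30 AU)
      # and Pluto's mean distance (~39.5 AU).

      P = 243.0               # orbital period in years, from the report
      a = P ** (2.0 / 3.0)    # semimajor axis in AU
      print(f"semimajor axis ~ {a:.0f} AU")   # ~39 AU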

Stars.
      The search for planets around stars other than the Sun had accelerated since they were first detected in 1995. Found by looking at the small changes that they induce in the motion of their parent stars, nine new extrasolar planets were reported in the latter part of 2000 by three independent groups of astronomers. This brought the total number discovered to date to about 50. One of the new objects, discovered by William Cochran of the University of Texas McDonald Observatory and collaborators, was the nearest extrasolar planet found to date. It revolves around the star Epsilon Eridani, which lies at a distance from Earth of only about 10.5 light-years, in an orbit that furnishes a wide angular separation from its parent star and so may provide the best opportunity for direct observation of an extrasolar planet in the future. Another exciting extrasolar planetary discovery was one announced by a team led by Michel Mayor of Geneva Observatory. The astronomers detected a planet having a mass that may be only about 0.15 that of Jupiter, or about 50 times the mass of Earth. Furthermore, they showed that the planet is one of at least two planets orbiting the star HD 83443—only the second star other than the Sun known to have two or more planets.
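
      The scale of the "small changes" being measured can be seen from a momentum-balance sketch. The Python below uses a hypothetical Jupiter-like planet around a Sun-like star, chosen for illustration rather than taken from any of the reported systems.

      import math

      # For circular orbits about the common centre of mass,
      # M_star * v_star = m_planet * v_planet. A Jupiter-mass planet in a
      # Jupiter-size orbit therefore tugs its star back and forth at only
      # about a dozen metres per second.

      G = 6.674e-11           # gravitational constant, SI units
      M_sun = 1.989e30        # kg
      M_jup = 1.898e27        # kg
      a = 5.2 * 1.496e11      # Jupiter-like orbital radius, m

      v_planet = math.sqrt(G * M_sun / a)          # planet's speed, ~13 km/s
      v_star = v_planet * (M_jup / M_sun)          # star's reflex speed
      print(f"stellar wobble ~ {v_star:.0f} m/s")  # ~12-13 m/s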

      Life on Earth depends on the existence of a wide variety of chemical elements. Hydrogen is thought to have originated in the big bang, and light elements such as carbon and oxygen can be synthesized in the normal course of stellar evolution. Heavy elements up to iron have been theorized to originate only in the centres of massive stars near the end of their evolution and then be spewed into space in supernova explosions at their death. (Elements heavier than iron can be formed only during a supernova explosion itself.) Following its launch into Earth orbit in July 1999, the Chandra X-ray Observatory (named in honour of the astrophysicist Subrahmanyan Chandrasekhar) was trained on a number of supernova remnants, including Cassiopeia A (Cas A), the remnant of a star that exploded in 1680. During the year the Chandra team, after studying the Cas A observations, reported the first unequivocal detection of newly formed iron in a supernova remnant. Much to the team's surprise, however, the iron was detected in gaseous knots rapidly expanding away in the outer regions of the remnant, far beyond the regions where lighter elements such as silicon were found. How the explosion managed to eject the iron (formed at the centre of the dying star) beyond the silicon (formed at shallower depths than the iron) remained a mystery.

Galaxies and Cosmology.
      During the year the Chandra observatory also made major contributions to studies of distant galaxies. For nearly 40 years, ever since the first X-ray detectors were flown above Earth's X-ray–absorbing atmosphere, astronomers had been puzzled by a uniform glow of X-rays coming from all directions. The radiation, with energies ranging from 1,000 to 100,000 times that of optical light, did not appear to arise from identifiable objects, and it was initially thought to be radiated by energetic particles filling space. Chandra's high-angular-resolution capability, however, allowed the radiation to be resolved into its sources. The team making the observations, headed by Richard Mushotzky of NASA Goddard Space Flight Center, Greenbelt, Md., reported that about 80% of this so-called X-ray background radiation was produced by roughly 70 million discrete sources uniformly spread over the sky. About one-third of the detected sources appeared to be galaxies lying at great distances from Earth and so were being observed as they existed in the very early universe. At the centre of each galaxy was thought to be a massive black hole accreting gas from its surroundings. As the gas fell in, it heated up and radiated X-rays. Many of these X-ray–emitting galaxies had not yet been detected at optical wavelengths, possibly because they were formed early enough in the history of the universe that their relative optical and X-ray emissions were quite different from those typically found in nearby (and, hence, older-appearing) galaxies.

      The universe is thought to have originated with a hot, explosive event—the big bang. As the universe expanded and cooled, a faint background radiation was left over, which can be detected today as microwave radiation filling the sky. Unlike the X-ray background discussed above, the microwave background radiation comes from the gas that occupied the universe before galaxies were formed. Nevertheless, at some later time that very gas coalesced to form the galaxies seen today. Therefore, the lumps or fluctuations in the density of the universe that gave rise to galaxies also should have caused fluctuations in the brightness of the cosmic microwave background. Two balloonborne experiments recently were flown high above most of Earth's obscuring atmosphere to look for these “ripples” from space. One, called Boomerang (Balloon Observations of Millimetric Extragalactic Radiation and Geophysics), was launched from the South Pole; the other, called Maxima (Millimeter Anisotropy Experiment Imaging Array), was launched from Texas. Both detected intensity fluctuations in the microwave background radiation that can be attributed to primordial sound waves, or density fluctuations throughout space. These variations appeared to fit well with a model of the universe that is topologically “flat” and will expand forever, although at year's end the correct cosmological model still remained very much an open question.

Kenneth Brecher

Space Exploration
      For information on launches in support of human space flight in 2000, see the Table (Launches in Support of Human Space Flight, 2000).

Manned Spaceflight.
      The ongoing assembly in orbit of the International Space Station (ISS) and the beginning of its permanent human occupancy constituted the dominant story of 2000 in space exploration. In July the Russian Space Agency, using a Proton rocket, finally launched the ISS's long-awaited Zvezda service module, which had been held up for two years by political and financial problems in Russia. Its docking with the first linked pair of modules already in orbit—Zarya and Unity—allowed the U.S. to start a series of space shuttle launches to add American-built elements, which would be followed by laboratory modules from Europe and Japan. Zvezda, based on the core module for Russia's Mir space station, would act as the control centre and living quarters for initial space station crews.

      NASA conducted four space shuttle missions in support of ISS operations during the year. Most carried cargoes and crews to outfit the station. Following the addition of Zvezda, the next crucial element for the ISS was NASA's Z1 truss, which was delivered by shuttle in mid-October. Mounted on Unity, Z1 was an exterior framework designed to allow the first set of giant solar arrays and batteries to be attached to the ISS for early power. At the end of October, the first three-man crew, an American and two Russians, was launched from Russia aboard a Soyuz-TM spacecraft. They would stay for four months and be relieved by a three-person crew carried up by shuttle. From that time forward, the ISS was to be continuously occupied throughout its service life. In early December, in a series of spacewalks, shuttle astronauts successfully mounted the solar arrays to the Z1 truss and connected them electrically to the growing station. They also performed a minor repair to one blanket of solar cells that had not properly deployed. Also during the year, NASA continued its flight tests of the X-38, a demonstrator for the Crew Return Vehicle, which would be the ISS lifeboat.

      One space shuttle flight was unrelated to the ISS. Launched in February, STS-99 carried out the Shuttle Radar Topography Mission cosponsored by NASA and the National Imagery and Mapping Agency. The payload comprised a large radar antenna in the payload bay and a smaller element deployed on a 60-m (197-ft) boom; together the two devices operated in the synthetic-aperture mode to produce the effect of a much larger antenna. The mission mapped the elevation of about 80% of the world's landmass—120 million sq km (46 million sq mi)—at resolutions of 10–20 m (33–66 ft).

      Reversing its actions of the previous year to shut down the aging Mir space station, Russia entered into a leasing agreement with the Dutch-based MirCorp to reopen the station for commercial operations, plans for which included a Mir version of the Survivor TV show. Between February and October, a Soyuz-TM crew and three Progress tanker loads of supplies were sent to refurbish the station and stabilize its orbit. By year's end, however, financial support for the private venture appeared to be drying up, and Mir was scheduled for reentry in early 2001 after its 15th anniversary (the first module had been launched in February 1986).

      China continued with plans to become the third country capable of launching humans into space. At year's end it made final preparations for a second unmanned flight test of Shenzhou, a spacecraft that appeared to be based on Russia's Soyuz, although the launcher used was China's Long March 2F rocket. The first test flight had been carried out in 1999. China also announced that it was considering human missions to the Moon.

Space Probes.
      The loss in late 1999 of the Mars Polar Lander and its two onboard miniprobes badly stung NASA and forced the agency to reassess its Mars exploration strategy. The Mars Polar Lander was to land December 3 near the Martian south pole, but contact was lost during atmospheric entry and never reestablished. In March 2000 investigators reported that, because of a software fault, the onboard computer probably interpreted the jolt from the extension of the landing legs as the landing signal itself and shut off the engines prematurely, when the craft was still more than 40 m (132 ft) above the surface. Following this debacle, NASA restructured its unmanned Mars exploration program and decided to fly simpler missions based on the air-bag lander and rover technology from the highly successful Mars Pathfinder and Sojourner mission of 1997.
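
      The investigators' description suggests a failure mode that can be sketched in a few lines. The Python below is hypothetical pseudocode illustrating that description, not the lander's actual flight software.

      # A jolt as the legs deploy sets a touchdown flag; because the flag is
      # latched rather than re-sampled, the engines are cut the first time
      # the shutdown check runs, while the craft is still ~40 m up.

      class DescentControl:
          def __init__(self):
              self.touchdown_latched = False

          def leg_sensor(self, reading):
              if reading:                          # spurious jolt at leg deployment
                  self.touchdown_latched = True    # latched, never cleared (the bug)

          def engine_command(self, altitude_m):
              # The shutdown check is enabled only below ~40 m; clearing any
              # indication latched earlier would avoid the premature cutoff.
              if altitude_m < 40 and self.touchdown_latched:
                  return "ENGINE CUTOFF"
              return "ENGINES ON"

      ctrl = DescentControl()
      ctrl.leg_sensor(True)             # transient during leg deployment, high up
      print(ctrl.engine_command(39))    # "ENGINE CUTOFF" -- well above the surface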

      Other probes in deep space fared better. The Near Earth Asteroid Rendezvous (NEAR) spacecraft settled into orbit around asteroid 433 Eros on February 14, following an opportunity missed the year before because of a software problem. This time all went well—NEAR returned a series of stunning close-up images, and ground controllers started tightening its orbit for an eventual impact with the tumbling, potato-shaped asteroid. (See Astronomy, above.)

      The Galileo spacecraft, in orbit around Jupiter since late 1995, completed its official extended mission to study Jupiter's large ice-covered moon Europa, but it continued operating. Galileo data hinted at the possibility that liquid water lies under the ice plates that cover Europa, making it a potential harbour for life. NASA planned to direct Galileo to burn up in Jupiter's atmosphere rather than risk the chance of its crashing on and contaminating Europa when the spacecraft's fuel ran out. Jupiter was visited on December 30 by the Cassini mission to Saturn when the spacecraft, which had been launched in October 1997, flew by for a gravity assist.

      During the year the Stardust spacecraft, launched in early 1999, completed the first part of its mission, exposing its ultrapure dust-collection panels to capture grains of interstellar dust. Another set of panels was to collect dust grains from Comet Wild-2 in 2004. The spacecraft was scheduled to return to Earth in 2006, when it would drop its samples for a soft landing. The Ulysses international solar polar mission probe, launched in 1990, began its second passage of the Sun's south polar region late in the year, at a time in the Sun's 11-year sunspot cycle when activity was at its highest. Between 1994 and 1996 Ulysses had observed the Sun during the relatively quiescent part of its cycle. NASA's Pluto-Kuiper Express, planned as the first flyby of the only planet in the solar system not yet explored by a spacecraft, was canceled owing to rising costs and emphasis on a new mission to explore Europa.

Unmanned Satellites.
      Scientists studying the plasmas (ionized gases) that fill space inside Earth's magnetic field received two significant new tools with the launches of four of the European Space Agency's Cluster spacecraft and of NASA's Imager for Magnetopause-to-Aurora Global Exploration (IMAGE) spacecraft. The original set of Cluster spacecraft was lost in the disastrous June 1996 first launch of the Ariane 5 rocket, which veered off course and had to be destroyed. European scientists developed a new set, partly from spare components, which was launched from Kazakhstan in pairs atop Soyuz launchers on July 16 and August 9. Each of the four satellites carried an identical set of instruments to measure changes in plasma across small distances as the spacecraft flew in formation. A different view of the magnetosphere was provided by IMAGE, launched March 25, which used radio probes and special ultraviolet imager instruments to map the otherwise invisible magnetosphere as it changed during solar activity.

      The astrophysics community lost one of its Great Observatories for Space Astrophysics on June 4 when the Compton Gamma Ray Observatory was deliberately guided by NASA into a controlled reentry. Although the science payload was working perfectly, the spacecraft's attitude control system was starting to fail. Rather than risk an uncontrolled reentry and despite protests that an alternative control method was available, NASA ordered the spacecraft destroyed. The year also saw the launch of an increased number of miniature satellites. Microsats, nanosats, and picosats—ranging in mass down to less than a kilogram (about two pounds)—employed advanced technologies in electronics and other disciplines. Quite often, they were built by university students to get them involved in space activities at a relatively low cost. Space engineers expected that large numbers of small, inexpensive satellites would play a larger role in space exploration and utilization.

Launch Vehicles.
      The future of the commercial single-stage-to-orbit VentureStar Reusable Launch Vehicle (RLV) grew uncertain as its X-33 subscale demonstrator craft was almost canceled during the year. Although most of the X-33's systems—including its revolutionary aerospike engine, which achieved a record 290-second firing—had done well in development and tests, the program as a whole continued to fall behind schedule. A serious failure in late 1999 was the rupture of a lightweight composite-structure liquid-hydrogen tank. After deciding that the technology was beyond its grasp, NASA's X-33 team elected to proceed with an aluminum tank. The first of 13 test flights of the X-33 was set for 2003, about three years late. NASA's other RLV test rocket, the smaller, aircraft-launched X-34, was rolled out in 1999 and prepared for its first flight tests. It would demonstrate a number of new technologies, including a Fastrac rocket engine partly based on commercial components.

      In August Boeing Co. finally achieved success with its Delta III launcher, which had failed to orbit commercial payloads in August 1998 and May 1999. The Delta III was based on the reliable Delta II but had a wider first stage and new solid boosters. Boeing conducted the third launch, which carried a dummy satellite, to restore user confidence. The company also prepared for the first launch, scheduled for 2001, of its Delta IV, which employed a low-cost engine derived from the space shuttle's main engine. In May Lockheed Martin Corp. launched its first Atlas III, which used Russian-built rocket engines. Both the Delta IV and Atlas III were developed under the U.S. Air Force's Evolved Expendable Launch Vehicle program, which aimed to reduce space launch costs by at least 25% over current systems.

Dave Dooling

▪ 2000

Introduction

Mathematics
      The major mathematical news in 1999 was the proof of the Taniyama-Shimura conjecture. In 1993 Andrew Wiles of Princeton University proved a special case of the conjecture that was broad enough to imply Fermat's Last Theorem. (About 1630 Pierre de Fermat had asserted that there are no solutions in positive integers to aⁿ + bⁿ = cⁿ for n > 2.) The full conjecture had now been proved by associates and former students of Wiles: Brian Conrad and Richard Taylor of Harvard University, Christophe Breuil of the Université de Paris–Sud, and Fred Diamond of Rutgers University, New Brunswick, N.J.

      In 1955 Yutaka Taniyama of the University of Tokyo first observed a remarkable relationship between certain mathematical entities from two previously unrelated branches of mathematics. Although Taniyama could not prove that this relationship existed for all cases, his conjecture, that every elliptic curve is modular, had profound implications for reformulating certain problems, such as Fermat's Last Theorem, from one branch of mathematics to another in which different tools and mathematical structures might provide new insights. Initially, most mathematicians were skeptical of the general case, but following Taniyama's suicide in 1958, his friend and colleague Goro Shimura (now at Princeton) continued to advance the case, and Shimura's name was added: the Taniyama-Shimura conjecture.

      Elliptic curves have equations of the form y² = ax³ + bx² + cx + d (the name elliptic curves derives from the study of the length, or perimeter, of ellipses). One major goal of algebraic geometry is to identify the rational solutions of elliptic curves—points (x, y) on the curve with both x and y as rational numbers. For elliptic curves with rational coefficients—that is, where a, b, c, and d are rational numbers—any tangent to the curve at a rational point, or any pair of rational points on the curve, can be used to generate another rational point.
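
      The chord construction is easy to carry out on a concrete example. The Python sketch below uses the sample curve y² = x³ + 17 and two of its rational points, (−2, 3) and (2, 5); the curve and points are our illustration and do not come from the proof itself.

      from fractions import Fraction

      # The line through two rational points meets the cubic in a third
      # point whose coordinates must also be rational: the x-coordinates
      # are the roots of a cubic with rational coefficients, two of whose
      # roots are already rational.

      def on_curve(x, y):
          return y * y == x ** 3 + 17

      x1, y1 = Fraction(-2), Fraction(3)
      x2, y2 = Fraction(2), Fraction(5)

      m = (y2 - y1) / (x2 - x1)    # slope of the chord
      x3 = m * m - x1 - x2         # roots sum to m**2 (no x**2 term in the curve)
      y3 = y1 + m * (x3 - x1)      # third intersection with the line

      print(x3, y3, on_curve(x3, y3))   # 1/4 33/8 True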

      A key question is how many generators are required for each curve in order to determine all rational solutions. One approach is to broaden the domain for x and y to include complex numbers a + bi, where a and b are real numbers and i = √(-1) , so that the curves for the equations become compact surfaces (loosely speaking, the surface contains only a finite number of pieces). Such surfaces can be classified by their topological genus, the number of holes through the surface. The equations for lines and conic sections (circles, ellipses, hyperbolas, and parabolas) have surfaces with genus 0, and such curves have either no rational points or an easy-to-describe infinite class of them. For elliptic curves, which have genus 1 (a torus, or doughnut shape), there is no easy way to tell whether there are infinitely many rational points, finitely many, or none at all.

      While direct classification of the generators of elliptic curves proved difficult, another branch of mathematics offered a promising new approach to the problem. Although difficult to visualize, the numerous symmetries of modular functions produce a rich structure that facilitates analysis. Shimura had observed that the series of numbers that fully characterize a particular modular function (a special complex-valued function) corresponded exactly to the series of numbers that fully characterize a certain elliptic curve. This is where the idea began of reformulating problems involving elliptic curves into problems involving modular functions, or curves.

      A solution to the Fermat equation aⁿ + bⁿ = cⁿ for n > 2 would correspond to a rational point on a certain kind of elliptic curve. Gerhard Frey of the University of Saarland, Ger., had conjectured in 1985, and Kenneth Ribet of the University of California, Berkeley, proved in 1986, that such a companion curve cannot be a modular curve. Wiles, however, showed that all semistable elliptic curves (involving certain technical restrictions) are modular curves, leading to a contradiction and hence the conclusion that Fermat's Last Theorem is true.

      Conrad and the others extended Wiles's result to prove the full Taniyama-Shimura conjecture. In particular, they showed that any elliptic curve y² = ax³ + bx² + cx + d can be parametrized by modular functions; this means that there are modular functions f and g with y = f(z) and x = g(z) so that the curve has the form [f(z)]² = a[g(z)]³ + b[g(z)]² + c[g(z)] + d. The elliptic curve is thus a projection of a modular curve; hence, rational points on the elliptic curve correspond to rational points on the modular curve. Results proved previously for modular elliptic curves—such as how to tell if all rational points come from a single generator—now are known to apply to all elliptic curves.

Paul J. Campbell

Chemistry

Nuclear Chemistry.
      Two research groups in 1999 reported strong new evidence that the so-called island of stability, one of the long-sought vistas of chemistry and physics, does exist. The island consists of a group of superheavy chemical elements whose internal nuclear structure gives them half-lives much longer than those of their lighter short-lived neighbours on the periodic table of elements.

      Chemists and nuclear physicists had dreamed of reaching the island of stability since the 1960s. Some theorists speculated that one or more superheavy elements may be stable enough to have commercial or industrial applications. Despite making successively heavier elements beyond the 94 known in nature—up to element 112 (reported in 1996)—researchers had found no indication of the kind of significantly longer half-life needed to verify the island's existence.

      The first important evidence for comparatively stable superheavy elements came in January when scientists from the Joint Institute for Nuclear Research, Dubna, Russia, and the Lawrence Livermore (Calif.) National Laboratory (LLNL) announced the synthesis of element 114. The work was done at a particle accelerator operated by Yury Oganesyan and his associates at Dubna. Oganesyan's group bombarded a film of plutonium-244, supplied by LLNL, with a beam of calcium-48 atoms for 40 days. Fusion of the two atoms resulted in a new element that packed an unprecedented 114 protons into its nucleus. Of importance was the fact that the element remained in existence for about 30 seconds before decaying into a series of lighter elements. Its half-life was a virtual eternity compared with those of other known superheavy elements, which have half-lives measured in milliseconds and microseconds. The new element lasted about 100,000 times longer than element 112.

      Adding to Oganesyan's confidence about reaching the island of stability was the behaviour of certain isotopes that appeared as element 114 underwent decay. Some isotopes in the decay chain had half-lives that were unprecedentedly long. One, for instance, remained in existence for 15 minutes, and another lasted 17 minutes.

      In June, Kenneth E. Gregorich and a group of associates at the Lawrence Berkeley (Calif.) National Laboratory (LBNL) added to evidence for the island of stability with the synthesis of two more new elements. If their existence was confirmed, they would occupy the places for element 116 and element 118 on the periodic table. In the experiment, which used LBNL's 224-cm (88-in) cyclotron, Gregorich's group bombarded a target of lead-208 with an intense beam of high-energy krypton-86 ions. Nuclei of the two elements fused, emitted a neutron, and produced a nucleus with 118 protons. After 120 microseconds the new nucleus emitted an alpha particle and decayed into a second new element, 116. This element underwent another alpha decay after 600 microseconds to form an isotope of element 114.

      Although the lifetimes of elements 118 and 116 were brief, their decay chains confirmed decades-old predictions that other unusually stable superheavy elements can exist. If there were no island of stability, the lifetimes of elements 118 and 116 would have been significantly shorter. According to Gregorich, the experiments also suggested an experimental pathway that scientists could pursue in the future to synthesize additional superheavy elements.

Carbon Chemistry.
      Ever since 1985, when the first representative of the all-carbon molecules, called fullerenes, was synthesized, researchers had speculated that these hollow, cage-shaped molecules may exist in nature. The first fullerene, C₆₀, comprising 60 carbon atoms, was made accidentally in the laboratory as scientists tried to simulate conditions in which stars form.

      In 1994 Luann Becker, then of the Scripps Institution of Oceanography, La Jolla, Calif., and associates provided evidence for natural fullerenes when they announced detection of C₆₀ in the Allende meteorite, which formed 4.6 billion years ago—around the time of the formation of the solar system—and which fell in Mexico in 1969. In 1999 Becker, currently of the University of Hawaii, and colleagues strengthened their case when they reported finding a range of fullerenes in a crushed sample of the meteorite, extracted with an organic solvent. Included were C₆₀, C₇₀, higher fullerenes in the C₇₆–C₉₆ range, and significant amounts of carbon-cluster molecules—possibly fullerenes—in the C₁₀₀–C₄₀₀ range. Becker's group speculated that fullerenes may have played a role in the origin of life on Earth. Fullerenes contained in meteorites and asteroids that bombarded the early Earth may have carried at least some of the carbon essential for life. In addition, atoms of gases contributing to the evolution of an atmosphere conducive to life may have been trapped inside the fullerenes' cagelike structure.

      Interest in fullerenes led to the 1991 discovery of elongated carbon molecules, termed carbon nanotubes, which form from the same kind of carbon vapour used to produce fullerenes. Nanotubes were named for their dimensions, which are on the nanometre scale. In the 1990s interest intensified in using nanotubes as electronic devices in ultrasmall computers, microscopic machines, and other applications.

      During the year Ray H. Baughman of AlliedSignal, Morristown, N.J., and associates reported development of nanotube assemblies that flex as their individual nanotube components expand or contract in response to electric voltages. The scientists regard the assemblies as prototype electromechanical actuators, devices that can convert electric energy into mechanical energy. The nanotube actuators have several attractive characteristics. For instance, they work well at low voltages and have high thermal stability and diamond-like stiffness. Baughman speculated that nanotubes may eventually prove superior to other known materials in their ability to accomplish mechanical work or generate mechanical stress in a single step.

Analytical Chemistry.
      The traditional optical microscope has a resolution of about one micrometre (a millionth of a metre). Electron microscopes and atomic force microscopes can achieve resolutions on the scale of nanometres (billionths of a metre). Nevertheless, researchers in cutting-edge fields such as surface science, biomaterials, thin films, and semiconductors need more than high resolutions. They have long desired a chemical microscope that not only provides good spatial resolution of samples but also allows identification of specific chemical substances present on the sample surface.

      Fritz Keilmann and Bernhard Knoll of the Max Planck Institute for Biochemistry, Martinsried, Ger., announced their successful analysis of local surface chemistry with a device that they were developing as a chemical microscope. The instrument incorporates a conventional atomic force microscope, which passes a minute probelike tip just over the surface of a sample to generate an image of its surface topography. Keilmann and Knoll, however, added a tunable carbon dioxide laser that focuses an infrared (IR) beam on the tip. As the tip moves over the sample, radiation scattered back from the sample is sent to an IR detector. By measuring changes in IR absorption, the detector can show chemical composition at specific points on the sample surface. In experiments the researchers used the device to identify chemical composition of local regions of films made from various materials, including gold on silicon and one kind of polymer imbedded in another.

Physical Chemistry.
      One of the more intriguing mysteries in materials science involves the nature of the chemical bonds in so-called high-temperature superconductors. These ceramic compounds, which conduct electricity without resistance at relatively high temperatures (below about –140° C [–220° F] for the compound with the highest known superconducting transition temperature), contain copper and oxygen bonded into planes and sometimes chains of atoms. If researchers could develop superconductors that operated at even higher temperatures, particularly near room temperature, the materials would have wide commercial and industrial applications in electrical and electronic devices. A key to their development may be an improved understanding of the details of chemical bonding in simpler copper- and oxygen-containing compounds such as copper oxides.

      An important step toward that goal was announced by John C.H. Spence and Jian Min Zuo of Arizona State University. They used a new imaging technique to obtain the clearest direct pictures ever taken of electronic bonds, or orbitals. Electronic bonds are the linkages that hold together atoms in most of the 20 million known chemical compounds. The researchers' technique used X-ray diffraction patterns from a copper oxide compound (Cu₂O) to produce a composite image of the atoms and the bonds holding them together. The images confirmed theoretical predictions of the picture of orbitals in this particular compound. They also revealed new details of bonding in copper oxides that could be used to develop better superconductors.

Applied Chemistry.
      Molecular-based computers, an as-yet-unrealized dream, would use molecules of chemical compounds, rather than silicon-based transistors, as switches. They would be smaller and more powerful and have other advantages over silicon-based computers. A group of chemists and other researchers at the University of California, Los Angeles (UCLA), and Hewlett-Packard Laboratories, Palo Alto, Calif., reported a major step toward such devices with development of the first molecular-based logic gate. A logic gate is a switchlike device that is a basic component of digital circuits. The researchers used a class of molecules termed rotaxanes as molecular switches. Rotaxanes are synthetic complexes sometimes known as molecular shuttles; they consist of a ring-shaped molecule threaded by a linear molecule. The ring portion can be made to move back and forth along the thread, in a switchlike fashion, in response to light or other stimuli. The research group linked rotaxanes and molecular wires into a configuration of logic gates and showed that the switches operate. Although many challenges remained, James R. Heath of UCLA, who led the team, predicted that a chemical computer would be in operation within 10 years.

      A wide range of important commercial products—including flame retardants, disinfectants, antiviral drugs, and antibacterial drugs—are produced with bromination reactions. These reactions involve the addition of atoms of bromine to a molecule to produce a bromine compound. They typically require use of elemental bromine, a dark reddish-brown liquid that is toxic and difficult to handle.

      Pierre Jacobs and associates of the Catholic University of Louvain, Belg., and the Free University of Brussels reported development of a new catalyst that permits an alternative and more benign bromination. Their tungstate-exchanged layered double hydroxide catalyst is highly efficient and inexpensive and works under mild reaction conditions. Most important, it uses bromides, rather than elemental bromine, and thereby eliminates the health and environmental hazards of traditional brominations. The catalyst also has important advantages over another alternative approach to bromination, which uses a bromoperoxidase enzyme.

Michael Woods

Physics

Atomic and Optical Physics.
      Since 1960, when the first laser was made, applications for these sources of highly intense, highly monochromatic light have grown tremendously. What gives a beam of laser light its intensity and purity of colour is its characteristic coherence—i.e., all its radiation, which has been emitted from a large number of atoms, shares the same phase (all the components of the radiation are in step). In 1997 physicists first created the matter equivalent of a laser, an atom laser, in which the output is a beam of atoms that exists in an analogous state of coherence, and in 1999 research groups reported significant progress in the development of atom lasers.

      The atom laser operates according to the principles of quantum mechanics. In this description of the behaviour of matter and radiation, the state of an atom is defined by a wave function, a solution of the equation developed by the Austrian quantum physicist Erwin Schrödinger to describe the wave behaviour of matter. The wavelength of this function, known as the de Broglie wavelength, defines the atom's momentum. In an atom laser the beam comprises atoms that are all described by the same wave function and have the same de Broglie wavelength. Consequently, the atoms are coherent in the same way that light is coherent in a conventional laser.
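
      The relation is the de Broglie formula λ = h/(mv). The quick computation below is illustrative; the 1-m/s atom speed is an assumed figure for a laser-cooled atom, not a value from the experiments described here.

      # de Broglie wavelength lambda = h / (m * v) for a slow sodium atom.

      h = 6.626e-34            # Planck's constant, J*s
      m_na = 23 * 1.66e-27     # approximate mass of a sodium atom, kg
      v = 1.0                  # assumed atom speed, m/s

      lam = h / (m_na * v)
      print(f"wavelength ~ {lam:.1e} m")   # ~1.7e-8 m, vs ~5e-7 m for visible light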

      The first step in making an atom laser is to prepare a gas of atoms in this coherent form. This was first achieved in 1995 by means of a technique for trapping atoms of rubidium and chilling them to temperatures just billionths of a degree above absolute zero (0 K, −273.15 °C, or −459.67 °F) to form a new kind of matter called a Bose-Einstein condensate (BEC). In a BEC the constituent atoms exist in the same quantum state and act as a single macroscopic “quantum blob,” having properties identical to those of a single atom.

      In the next step to an atom laser, a method is needed to allow a portion of the trapped BEC to emerge as a beam. In the case of a conventional laser, light is confined in a resonant cavity comprising two mirrors aligned face-to-face, and it is allowed to escape the cavity by making one of the mirrors partially transparent. In an atom laser, the problem of allowing atoms to leave the trap to form a beam is much more difficult because they are held in a very precisely controlled combination of magnetic and optical fields. In 1997 Wolfgang Ketterle and colleagues of the Massachusetts Institute of Technology (MIT) devised a way, based on the application of pulses of radio-frequency energy, to extract a controlled fraction of atoms from a trapped BEC of sodium atoms. The beam, which traveled downward under the influence of gravity, took the form of bursts of atoms that were all in the same quantum state.

      In 1999 two teams of physicists reported advances in techniques for extracting a beam of atoms from a trapped BEC. A U.S.–Japanese team led by William Phillips of the National Institute of Standards and Technology (NIST), Gaithersburg, Md., applied a technique known as stimulated Raman scattering to trapped sodium atoms. The coherent atoms were made to absorb a pulse of light from an external laser at one frequency and emit it at a slightly lower (less energetic) frequency. In the process the atoms gained a small amount of momentum, which gave them a “kick” out of the trap in the direction of the laser beam. By shifting the direction of the laser, the researchers were able to change the direction of the atom pulses that emerged from the trap. Theodor W. Hänsch and colleagues of the Max Planck Institute for Quantum Optics, Garching, Ger., and the University of Munich, Ger., used an augmentation of the MIT technique. They began with a BEC of rubidium atoms in a very stable magnetic trap and then “punched” a small hole in the trap with a constant weak radio-frequency field. Unlike previous atom lasers, which emitted pulsed beams, this one produced a continuous beam lasting 0.1 second, the duration limited only by the number of atoms in the trap.

      Although atom lasers were in their infancy, it was possible to speculate on their applications. Importantly, because the de Broglie wavelengths of the atoms are much shorter than the wavelengths of laser light, atom lasers offered the possibility for timekeeping, microscopy, and lithography techniques that are more precise than light-based methods. Perhaps even more exciting was the prospect of atom holography, by which interfering beams of atoms would be used to build tiny solid objects atom by atom (analogous to the use of interfering light beams in conventional holography to create images). Such structures, which could be as small as nanometres (billionths of a metre) in size, would have myriad uses in electronics, biomedicine, and other fields.

      Although atom lasers were attracting much scientific attention, conventional lasers were by no means at the end of their useful development. NIST physicists in Boulder, Colo., built a laser monochromatic to 0.6 Hz (a stability of one part in 10¹⁴). Todd Ditmire and colleagues of Lawrence Livermore (Calif.) National Laboratory employed a powerful laser to demonstrate “tabletop” hot nuclear fusion; using light pulses from a laser with a peak intensity of 2 × 10¹⁶ W per sq cm, they fused atoms of deuterium (a form of heavy hydrogen) to produce helium-3 and a burst of neutrons. In the same laboratory Thomas Cowan and colleagues used a device called the Petawatt laser to induce nuclear fission in uranium and, at the same time, create particles of antimatter called positrons—the first time laser energy was converted into antiparticles. At the other end of the energy range, a collaboration of physicists from the University of Tokyo, the Bavarian Julius Maximilian University of Würzburg, Ger., and the University of Lecce, Italy, fabricated the first room-temperature semiconductor laser to emit light in the blue region of the spectrum.

Particle Physics.
      The hunt continued for the elusive Higgs boson, the hypothetical subatomic particle proposed by theoretical physicists to explain why the elementary particles exhibit the rest masses that they do. The standard model, the current mathematical theory describing all of the known elementary particles and their interactions, does not account for the origin of the widely differing particle masses and requires an “invented” particle to be added into the mathematics. Confirmation of the existence of the Higgs boson would make the standard model a more complete description.

      During the year physicists working at the Large Electron-Positron (LEP) collider at CERN (European Laboratory for Particle Physics) in Geneva produced data containing tantalizing hints of the Higgs boson, but the evidence was too uncertain for a claim of discovery. In addition, theoretical calculations lowered the limits on the predicted mass of the particle such that its observation—if it exists—might be in reach of particle-collision energies achievable by the Tevatron accelerator at the Fermi National Accelerator Laboratory (Fermilab), Batavia, Ill.

      The adequacy of the standard model came under pressure as the result of data collected during the year. A number of experimental groups were searching for and measuring small asymmetries in particle properties associated with the behaviour of quantum mechanical systems under reversal of the direction of time (T) or, equivalently, under the combined operation of the replacement of each particle with its antiparticle (charge conjugation, or C) and reflection in space such that all three spatial directions are reversed (parity, or P). According to the standard model, particle interactions must be invariant—i.e., their symmetries must be conserved—under the combined operation of C, P, and T, taken in any order. This requirement, however, was coming under question as precise measurements were made of violations of the invariance of the combination of C and P (CP) or, equivalently, of T.

      Physicists working at the KTeV experiment at Fermilab measured the amount by which the decay of particles called neutral kaons (K mesons) violates CP invariance. Kaons usually decay by one of two routes—into two neutral pions or into two charged pions—and the difference in the amount of CP violation between the two decay routes can be precisely determined. Although the magnitude of the difference found by the KTeV researchers could be made to fit the standard model if appropriate parameters were chosen, the values of those parameters fell at the edge of the range allowed by other experiments. In a related development, physicists led by Carl Wieman of NIST in Boulder measured the so-called weak charge QW of the cesium nucleus and found the value to be slightly different from that predicted by the standard model. The Fermilab and NIST results may well be early signs of physical processes lying beyond the scope of the standard model.

David G.C. Jones

Astronomy
      For information on Eclipses, Equinoxes and Solstices, and Earth Perihelion and Aphelion in 2000, see Table (Earth Perihelion and Aphelion, 2000).

Solar System.
      Since the mid-1990s, the exploration of Mars had been revitalized with the launch of a veritable fleet of small spacecraft designed to collect a variety of atmospheric and geologic data and to search for evidence of life. Among the space missions scheduled to begin investigating Mars in 1999 was the Mars Climate Orbiter, which was slated to broadcast daily weather images and other data for an entire Martian year of 687 days. On September 23, however, the spacecraft burned up or tore apart in the Martian atmosphere as it attempted to enter orbit around the planet. The disaster appeared to have been caused by a conflict between the use of English and metric units by two different scientific teams responsible for setting the spacecraft's trajectory.

      Pictures taken during the year by the highly successful Mars Global Surveyor (MGS) spacecraft, which went into orbit around the planet in 1997, revealed a great deal about the history of Martian geology, weather, and magnetism. Most dramatically, some of its new pictures provided the first strong evidence that water had flowed on the Martian surface, perhaps for millions of years. J.E.P. Connerney of the NASA Goddard Space Flight Center, Greenbelt, Md., and his colleagues reported from magnetometer readings aboard the MGS spacecraft that a region of Mars called Terra Sirenum is cut by a series of magnetic stripes, each about 200 km (125 mi) wide and up to 2,000 km (1,250 mi) long, with the magnetic fields in adjacent stripes pointing in opposite directions. The stripes resemble patterns found on Earth, where they were thought to have resulted from a combination of plate tectonic activity and periodic reversals of Earth's magnetic field. Although the Martian magnetic field probably always was much weaker than Earth's, the new data pointed to the presence of a planetary liquid core and an active magnetic dynamo that lasted perhaps 500 million years during the early history of Mars. If the Martian dynamo also underwent magnetic field reversals, it could account for the reversed magnetic polarity stripes observed by the MGS. (For additional information on the exploration of the solar system, see Space Exploration: Space Probes, below.)

Stars.
      The rate of discovery of planets around stars other than the Sun increased dramatically after they were first reported in 1995. By the beginning of 1999, some 20 extrasolar planets had been reported; none of them, however, were found to share the same star. During the year two groups, one led by Geoffrey Marcy of San Francisco State University and R. Paul Butler of the Carnegie Institution of Washington, D.C., and the other by Robert Noyes of the Harvard-Smithsonian Center for Astrophysics, Cambridge, Mass., independently reported evidence that the nearby sunlike star Upsilon Andromedae has three planets in orbit about it; it was the only planetary system other than our own known to date. The star, visible to the naked eye, lies some 44 light-years from Earth and was estimated to be about three billion years old, about two-thirds the age of the Sun. It had been known since 1996 to have at least one planet, but further analysis of observed variations in the motion of the star revealed the presence of the two additional planets. With planetary masses of 0.72, 2, and 4 times that of Jupiter and with the lightest planet lying much closer to the star than Mercury does to the Sun, the Upsilon Andromedae system does not closely resemble our solar system. Some scientists theorized that it may have formed by astrophysical processes quite different from those that shaped the Sun's system. Nevertheless, the discovery, which was made during a survey of 107 stars, suggested that planetary systems may be more abundant than had been thought.

      In early November Marcy, Butler, and their colleagues discovered that the motion of the star HD 209458 exhibits a characteristic wobble indicative of the presence of an orbiting planet. They brought this observation to the attention of their collaborator Greg Henry of Tennessee State University. Together, using a telescope at the Fairborn Observatory in Arizona, the astronomers reported the first detection of the transit of an extrasolar planet across the face of the star that it orbits. Independently, David Charbonneau of Harvard University and Timothy M. Brown of the High Altitude Observatory, Boulder, Colo., also detected and measured the transit across HD 209458. A 1.7% dip was seen in the star's brightness precisely at the time predicted on the basis of the observed stellar wobble. The observations indicated that the planet has a radius about 60% greater than that of Jupiter. Furthermore, because its orbital plane was known, the planet's mass could be accurately measured; it was found to be only about 63% that of Jupiter. Taken together, the findings indicated that the planet's density is only about 20% that of water. Such a low-density object likely formed far from the star and then gradually migrated inward—an evolutionary scenario quite unlike that of the planets in our own solar system.
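
      Those figures fit together with straightforward arithmetic, as the Python sketch below shows; the stellar radius used (about 1.25 times the Sun's) is an assumption chosen for illustration, while the dip, mass, and radius are the figures quoted above.

      import math

      # The fractional dip in starlight equals (R_planet / R_star)**2, and
      # the wobble-derived mass divided by the planet's volume gives the
      # remarkably low density.

      R_sun, R_jup, M_jup = 6.96e8, 7.15e7, 1.90e27   # m, m, kg

      depth = 0.017                      # observed 1.7% dip in brightness
      R_star = 1.25 * R_sun              # assumed stellar radius, m
      R_p = math.sqrt(depth) * R_star    # from depth = (R_p / R_star)**2
      print(f"planet radius ~ {R_p / R_jup:.1f} Jupiter radii")   # ~1.6

      M_p = 0.63 * M_jup                 # mass from the measured wobble
      rho = M_p / ((4.0 / 3.0) * math.pi * R_p ** 3)
      print(f"density ~ {rho:.0f} kg per cu m")   # ~200, about 20% of water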

      The $1.5 billion Chandra X-ray Observatory was carried into orbit July 23 by the space shuttle Columbia. Capable of taking X-ray photographs of the sky with unprecedented angular resolution, Chandra proved to be an immediate success, revealing for the first time a stellar object—either a neutron star or a black hole—at the centre of Cassiopeia A, the remnant of the most recent supernova in the Milky Way Galaxy. (See Space Exploration: Unmanned Satellites, below.)

Galaxies and Cosmology.
      Since the first announcements of their detection in the early 1970s, brief, energetic bursts of gamma rays had been reported coming from all over the sky. By the end of 1999, more than 2,500 of these mysterious bursts, usually lasting some tens of seconds, had been detected. Early in the year astronomers for the first time managed to get an optical image of a burst event shortly after it began. Because the events occur randomly in space and are so brief, it previously had been impossible to point an optical telescope at their locations quickly enough. On January 23 an event (GRB 990123) was detected by the Burst and Transient Source Experiment (BATSE), an instrument on board the Earth-orbiting Compton Gamma Ray Observatory. Within four seconds of the flash, a rough position for the event was relayed to the Robotic Optical Transient Search Experiment (ROTSE) in Los Alamos, N.M., which was operated by a team led by Carl Akerlof of the University of Michigan. The team's optical observations showed that the burst continued to brighten for another five seconds and then faded away in the succeeding minutes and hours. A group of astronomers led by Sri R. Kulkarni and Garth Illingworth of the University of California, Santa Cruz, used the Keck II 10-m (394-in) telescope in Hawaii to measure a spectrum of the object. Their findings implied that the event occurred in a galaxy about nine billion light-years away. Subsequent observations by the orbiting Hubble Space Telescope (HST) revealed not only the burst's optical afterglow but also the galaxy in which it apparently occurred. If the burst radiated its energy uniformly in all directions, at its peak it was the brightest object in the universe, millions of times brighter than a typical supernova or an entire galaxy. It remained unclear what kind of event produces such bursts, although leading candidates were the merger of two objects—either neutron stars, black holes, or a combination of the two—and a hypothesized extreme version of a supernova called a hypernova.

      In the big-bang model of the universe, space expands at a rate that depends on the strength of the initial explosion, the total matter density of the universe, and the presence or absence of a quantity called the cosmological constant, a kind of energy of the vacuum. Ever since 1929, when the American astronomer Edwin Hubble presented the first detailed quantitative evidence for the expansion of the universe, scientists had tried to determine with increasing accuracy the current expansion rate, which is called Hubble's constant (H₀). To determine H₀, one must accurately determine the distances to galaxies (measured in units of megaparsecs [Mpc], in which a parsec is about 3.26 light-years) and their rate of recession (measured in kilometres per second). The larger the value of H₀ (in units of km/sec/Mpc), the younger the universe is at present. By 1990, at the time of the launch of the HST, astronomers had determined that H₀ probably was in the range of 50–100 km/sec/Mpc, corresponding to a universe 10 billion to 20 billion years old. They found this factor-of-two uncertainty to be unsatisfyingly large, however, especially in light of the independently determined age of the universe's oldest known stars—13 billion to 15 billion years. Scientists, therefore, set what was called a Key Project for the HST to determine H₀ with an accuracy of 10%.

      In May 1999 Wendy Freedman of the Carnegie Observatories, Pasadena, Calif., and her collaborators on the Key Project announced their result. On the basis of their determination of the distances of 18 galaxies, they concluded that H₀ has a value of 70 km/sec/Mpc with an uncertainty of 10%. If correct, this result would make the universe quite young, perhaps only about 14 billion years old. Almost immediately, however, another group employing ground-based radio observations and using purely geometric arguments determined the distance to a galaxy; the results led the group to conclude that H₀ is 15% larger (and, thus, the universe even younger) than that found by the Key Project researchers. Yet other groups reported smaller values of H₀—about 60 km/sec/Mpc—based on other distance determinations of nearby galaxies. At year's end the age of the universe remained an open question.
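
      The correspondence between H₀ and age is direct if the expansion rate is treated as constant over cosmic history. The Python sketch below is our own arithmetic under that simplification (real models must also account for the slowing or speeding of the expansion).

      # Hubble time 1 / H0, treating the expansion rate as constant.

      H0 = 70.0               # km/sec/Mpc, the Key Project value
      mpc_km = 3.086e19       # kilometres in one megaparsec
      yr_s = 3.156e7          # seconds in one year

      t_hubble = (mpc_km / H0) / yr_s         # 1/H0 in seconds, converted to years
      print(f"1/H0 ~ {t_hubble:.1e} years")   # ~1.4e10, about 14 billion years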

Kenneth Brecher

Space Exploration
      During 1999, assembly of the International Space Station was delayed, the loss of the Mars Climate Orbiter cast a shadow over the interplanetary capabilities of the U.S. National Aeronautics and Space Administration (NASA), and the new Chandra X-ray Observatory started producing striking images of the high-energy universe. Astronaut Charles (“Pete”) Conrad, commander of the second manned mission to the Moon, died of injuries sustained in a motorcycle accident on July 8. (See Obituaries (Conrad, Charles, Jr. ).)

Manned Spaceflight.
      Assembly of the International Space Station was stalled through much of the year as the U.S. space shuttles were grounded because of frayed wiring and other problems, and the Russian Space Agency consistently failed to keep to its production schedule for the Service Module needed to maintain the station's orbit and serve as crew quarters. The first two modules, Zarya (“Dawn”) from Russia and Unity from the U.S., had been orbited and joined in 1998. The station was visited once during the year by the U.S. space shuttle Discovery (May 27–June 6), which carried two metric tons of supplies.

      The only other shuttle mission of the year, that of Columbia (July 23–27), launched the Chandra X-ray Observatory. The mission experienced a rocky start when controllers for two of the three main engines failed just seconds after liftoff. Backup controllers took over. Columbia then went into an orbit lower than planned. Inspections after landing revealed a number of frayed wires beneath the liner of the payload bay. The wires, running from the crew compartment to the engines and other components, had been damaged by ground crews, perhaps years earlier, and gradually had deteriorated further. All four orbiters were grounded for several months of repairs. The engine problem was attributed to a small repair pin that was blown from the combustion chamber and then punctured several small hydrogen coolant lines. This allowed liquid hydrogen, also used as fuel, to leak from the engine during ascent.

      Russia flew two Soyuz missions at the aging Mir space station: Soyuz TM-28 (returned February 28) and Soyuz TM-29 (February 20–August 28). The latter crew was sent to finish closing the station and to prepare it for destruction during reentry into the Earth's atmosphere in early 2000.

      An interesting footnote to history was written when Liberty Bell 7 was located by a salvage team on May 1 and recovered on July 20. It was the only manned spacecraft to have been lost at the end of a successful mission, Virgil I. (“Gus”) Grissom's suborbital flight on July 21, 1961, when its hatch accidentally jettisoned after splashdown in the Atlantic Ocean.

Space Probes.
      The loss of NASA's Mars Climate Orbiter—launched Dec. 11, 1998—at the moment it was expected to settle into Mars orbit on September 23 stunned a planetary community that had become accustomed to near-perfect navigation to Mars by the Jet Propulsion Laboratory. A failure to convert English units to metric properly had resulted in a subtle accumulation of errors that caused the probe to be lower than estimated when it arrived at Mars. Consequently, the probe apparently entered the atmosphere at too deep a level and burned up, rather than entering gradually and using the atmosphere in a series of braking maneuvers.
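
      The units at issue were pound-force seconds and newton seconds, which differ by a factor of about 4.45. The two-line sketch below, with purely illustrative numbers, shows how quietly such a misreading understates a thruster impulse.

    LBF_S_TO_N_S = 4.448222       # newton seconds per pound-force second
    reported = 100.0              # an illustrative impulse figure in lbf·s
    as_read = reported            # the same number misread as N·s
    print(reported * LBF_S_TO_N_S, as_read)   # correct vs. assumed: off by 4.45x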

      The loss hampered but did not seriously degrade the mission of the Mars Polar Lander, launched Jan. 3, 1999. It was to land in Mars's south polar region on Dec. 3, 1999, after an 11-month cruise. The four-metre-wide, one-metre-tall (1 m = 3.3 ft) craft was to land on three legs after descending by aerobraking, parachute, and landing rockets. It was equipped with a two-metre-long robot arm to scoop up and analyze the chemistry of Martian soil. Water would be detected by heating samples and analyzing the volatile substances that boiled off. Two one-metre-long Deep Space 2 probes were to be fired into the surface, also to look for traces of water, at depths corresponding to deposits roughly 100,000 years old. The Mars Global Surveyor completed a series of aerobraking maneuvers into its planned orbit on Feb. 4, 1999, and started its primary mapping mission on March 8.

      The first U.S. spacecraft to touch the Moon since 1972 did so in a spectacular way when Lunar Prospector, launched in 1998, was deliberately crashed into a crater in the south polar region on July 31 by using the last of its propellant. Telescopes on and around Earth watched for spectral signatures unique to water but found none. Other data from Lunar Prospector, though, provided strong indications that water was present.

      Two probes embarked on missions to explore small planetary bodies. Deep Space 1 (launched Oct. 24, 1998) was propelled by an ion engine, which used electric fields to accelerate and expel ionized xenon propellant. The mission was primarily a demonstration of that and other advanced technologies, such as autonomous navigation, that were to be employed on future missions. Deep Space 1 flew past the asteroid Braille (formerly designated 1992 KD) on July 29, 1999. Although the probe was pointed in the wrong direction during the encounter and did not obtain the high-resolution images scientists wanted, the mission was an overall success; its primary phase ended on September 18.

      On Feb. 7, 1999, NASA launched Stardust, a mission to collect cometary dust from Comet Wild-2, a relatively fresh comet, in early 2004 and interstellar dust from within the solar system before and after the comet encounter (separate collectors would be used). It would return to Earth in 2006. The other small-body mission, the Near Earth Asteroid Rendezvous (NEAR) mission, continued toward a meeting with asteroid 433 Eros following a navigational problem that postponed the original rendezvous.

      Nearing the end of its life was the Galileo spacecraft, which had been orbiting Jupiter since 1995. Despite having a jammed high-gain antenna, Galileo returned dozens of stunning images of Jupiter and its larger moons, making at least 25 flybys of Europa, Callisto, Ganymede, and Io (seven in 1999). The extended Europa Mission formally ended Dec. 31, 1999.

Unmanned Science Satellites.
      The premier unmanned satellite launch of the year was the Chandra X-ray Observatory. Formerly called the Advanced X-ray Astrophysics Facility, it was renamed in honour of Indian-American astrophysicist Subrahmanyan Chandrasekhar. Chandra was equipped with a nested array of mirrors to focus X-rays on two cameras that could produce highly detailed images or high-resolution spectra of sources emitting X-rays. Soon after entering orbit, Chandra started returning stunning images of the pulsar in the Crab Nebula, the Cassiopeia A supernova remnant (and an apparent X-ray source that had previously eluded detection), and other bodies. Unexpected radiation degradation affected one instrument, but scientists devised a procedure to prevent further damage.

      Germany's ABRIXAS (A Broad-Band Imaging X-Ray All-Sky Survey; launched April 29) was designed to map up to 10,000 new X-ray sources with a cluster of seven X-ray telescopes. The American Far Ultraviolet Spectroscopic Explorer (June 24) was designed to study hydrogen–deuterium (heavy hydrogen) ratios in intergalactic clouds and interstellar clouds unaffected by star formation in an effort to determine the H–D ratio as it was shortly after the big bang.

      The commercial American Ikonos 2 satellite (September 24) opened the field of high-resolution (one-metre) imaging, previously available only to the military. Images of virtually any part of the Earth could be purchased; the U.S. government reserved the right to block views of sensitive areas, even though it could not control the images provided by non-U.S. firms.

      Low-cost electronics and other factors made possible a number of educational and amateur satellite opportunities. They included South Africa's Sunsat (February 23), Russia's Sputnik Jr. 3 (April 16), Britain's UOSAT 12 (April 21), and the U.S.'s Starshine (June 5), a 48-cm (18.7-in)-diameter sphere covered with 878 small mirrors polished by children from the U.S., Zimbabwe, Pakistan, and 15 other countries to enable tracking by 25,000 high-school students throughout the world.

Launch Vehicles.
      The launch industry was troubled by several expensive failures, including two U.S. military Titan 4B rockets, one carrying a missile early-warning satellite (April 9) and the other a communications satellite. Russia's Proton launcher also experienced two failures (July 5 and October 27), which cast doubt on its reliability in supporting the International Space Station. (The service module was to be launched on a Proton.)

      The Roton rotary rocket started limited flight tests on July 23, with a two-man crew piloting a test model in short, low-altitude flights. Roton was a single-stage-to-orbit craft with a unique recovery system. It deployed a four-blade helicopter rotor after reentry. Rocket exhaust ducted through the rotor tips rotated the blades and thus provided lift and control during approach and landing. The crew rode in a small escape capsule between the fuel and oxidizer tanks and next to a payload bay designed to accommodate midsize unmanned satellites.

      Another unique launch system making its debut was the international Sea Launch venture (its ownership was Russian, Ukrainian, American, and Norwegian). This employed Odyssey, a launch facility converted from a self-propelled offshore petroleum platform, and a control ship that doubled as the integration facility. The key advantage was that the ship could be positioned near the Equator, where the surface speed of the Earth's rotation is greatest and thus gives a rocket more of a running start. The Earth's geography makes few such land sites available. Sea Launch also eliminated the need for fuel-consuming maneuvers to align a satellite's orbit with the Equator, as is needed for communications satellites in geostationary orbit. Sea Launch performed well in its first two flights. On March 28 it launched a dummy spacecraft simulating a popular Hughes Aircraft model. Its first paying customer, DirecTV-1R, was launched October 9.
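
      The size of that advantage is easy to estimate. The sketch below, a rough calculation that treats the Earth as a sphere, compares the eastward speed contributed by the Earth's rotation at the Equator with that at Cape Canaveral's latitude of about 28.5° N.

    from math import cos, pi, radians

    EQUATORIAL_RADIUS_KM = 6378.0   # Earth's equatorial radius
    SIDEREAL_DAY_S = 86164.0        # one rotation relative to the stars

    def rotation_speed_m_s(latitude_deg):
        """Eastward surface speed due to the Earth's rotation."""
        circumference_km = 2 * pi * EQUATORIAL_RADIUS_KM * cos(radians(latitude_deg))
        return circumference_km * 1000 / SIDEREAL_DAY_S

    print(rotation_speed_m_s(0.0))    # about 465 m/s at the Equator
    print(rotation_speed_m_s(28.5))   # about 409 m/s at Cape Canaveral's latitude

      The difference, some 56 m/s of free velocity, is modest but translates directly into extra payload for a given rocket.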

Dave Dooling

▪ 1999

Introduction

Mathematics
      Major mathematical news in 1998 included the claim that a nearly 400-year-old conjecture finally had been proved. In 1611 the German astronomer and mathematician Johannes Kepler concluded that the manner in which grocers commonly stack oranges—in a square-based pyramid with each layer of oranges sitting in a square grid centred above the holes in the layer below—gives the densest way to pack spheres in infinite space. (Packing with oranges in each layer in a hexagonal grid is equally dense.) Thomas Hales of the University of Michigan, after 10 years of work, announced a proof of the conjecture. Nearly every aspect of the proof relied on computer support and verification, and supporting the 250-page written proof were three gigabytes of computer files. Mathematicians would need time to determine if the proof was complete and correct.

      Kepler was set on the sphere-packing problem by correspondence with Thomas Harriot, an English mathematician and astronomer and an assistant to Sir Walter Raleigh. Raleigh wanted a quick way to determine the number of cannonballs in a pile with a base of any shape. Harriot prepared tables for Raleigh and wrote to Kepler about the problem in connection with their discussion of atomism. In 1831 the German mathematician Carl Friedrich Gauss showed that face-centred cubic packing, as the orange packing is known to mathematicians, could not be less dense than other lattice packings, those in which the centres of the spheres lie on a regular grid. Some nonlattice packings, however, are almost as efficient, and in some higher dimensions the densest packings known are nonlattice packings. It was thus possible that a denser nonlattice packing might exist for three dimensions.
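
      For reference, the density at issue, the fraction of space filled by the spheres in face-centred cubic packing, has a simple closed form: π/√18 ≈ 0.74048, a little more than 74%. The Kepler conjecture asserts that no packing of equal spheres, lattice or not, can exceed this fraction.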

      Hales's work built on that of the Hungarian mathematician Laszlo Fejes-Toth, who in 1953 reduced the task of settling the conjecture to that of solving an enormous calculation. Hales formulated an equation in 150 variables that described every conceivable regular arrangement of spheres. This equation derived from a mathematical decomposition of the star-shaped spaces (decomposition stars) between the spheres. Hales had a computer classify the decomposition stars into 5,000 different types. Although each type required the solving of a separate optimization problem, linear programming methods allowed the 5,000 to be reduced to fewer than 100, which were then done individually by computer. The proof involved the solving of more than 100,000 linear programming problems that each included 100-200 variables and 1,000-2,000 constraints.
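
      Linear programming, the workhorse of Hales's computation, finds the extreme value of a linear objective subject to linear inequality constraints, and a solver's result serves as a certified bound. The toy problem below, with made-up numbers far smaller than the 100-200-variable problems in the proof, shows the pattern using the SciPy library; linprog minimizes, so the objective is negated to obtain a maximum.

    # A toy linear program: maximize x + 2y + 1.5z subject to
    # linear constraints (illustrative numbers, not Hales's data).
    from scipy.optimize import linprog

    c = [-1.0, -2.0, -1.5]            # negated: turns a maximum into a minimum
    A_ub = [[1, 1, 1],                # x + y + z <= 4
            [2, 1, 0],                # 2x + y    <= 5
            [0, 1, 3]]                # y + 3z    <= 6
    b_ub = [4.0, 5.0, 6.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
    print(-res.fun)                   # 8.0: the certified maximum of the toy objective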

      The analogue of the Kepler problem in two dimensions is the task of packing circular disks of equal radius as densely as possible. The hexagonal arrangement in which each disk is surrounded by six others—a lattice packing—was long known and eventually rigorously proved to be the densest packing; it fills π/√12, or about 91%, of the plane. For dimensions higher than three, it was not known if the densest lattice packings are the densest packings.

      The mathematics of sphere packing is directly related to issues of reliable data transmission, including data compression and error-correcting codes, in such applications as product bar coding, signals from spacecraft, and music encoded on compact discs. Code words can be considered to correspond to points in a space whose dimension is the common length of a code word. The "Hamming distance" (named for pioneer coding theorist Richard Hamming) between any two given words, which can be code words or words to which they can become distorted by errors in transmission, is the number of positions in which the words differ. Around each code-word point, a sphere of radius r includes all words that differ in at most r places from the code word; these words are the distortions of the code word that would be corrected to the code word by the error-correcting process. The error-detecting and error-correcting capabilities of a code depend on how large r can be without spheres of different code words becoming overlapped; in the case of an overlap, one would know that an error had occurred but not to which code word to correct it.
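
      The geometry is easy to make concrete. The short Python sketch below (function names are illustrative) computes the Hamming distance between two binary words and the number of words contained in a sphere of radius r about a code word of length n, the quantity that determines how many spheres fit without overlapping.

    from math import comb

    def hamming_distance(u, v):
        """Number of positions in which two equal-length words differ."""
        return sum(a != b for a, b in zip(u, v))

    def sphere_size(n, r):
        """Number of binary words within Hamming distance r of a code word."""
        return sum(comb(n, i) for i in range(r + 1))

    print(hamming_distance("10110", "11100"))  # 2
    print(sphere_size(5, 1))                   # 6: the word itself plus 5 neighbours

      Since there are 2^n binary words of length n in all, a code whose words carry nonoverlapping spheres of radius r can contain at most 2^n/sphere_size(n, r) code words, which is the trade-off between error protection and efficiency described in the next paragraph.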

      An analogy is the task of packing into a box of fixed size a fixed number of same-size glass ornaments (the total number of code words) wrapped in padding, with the requirement that each ornament be padded as thickly as possible. This, in turn, means that the padded ornaments must be packed as closely as possible. Thus, efficient codes and dense packings of spheres (the padded ornaments) go hand in hand. The longer the code words are, the greater is the dimension of the space and the farther apart code words can be, which makes for greater error-detection and error-correction capability. Longer code words, however, are less efficient to transmit. A longer code word corresponds to using a bigger box to ship the same number of ornaments.

      It remained to be seen whether Hales's result or the methods he used would lead to advances in coding theory. Mathematicians generally were skeptical of the value of proofs that relied heavily on computer verification of individual cases without offering new insights into the surrounding mathematical landscape. Nevertheless, Hales's proof, if recognized as correct, could inspire renewed efforts toward a simpler and more insightful proof.

PAUL J. CAMPBELL

Chemistry

Physical Chemistry.
      Hydrogen is the lightest, simplest, and most plentiful chemical element. Under ordinary conditions it behaves as an electrical insulator. Theory predicts that hydrogen will undergo a transition to a metal with superconducting properties if it is subjected to extreme pressures. Until 1998, attempts to create metallic hydrogen in the laboratory had failed. Those efforts included experiments making use of diamond anvil cells that compressed hydrogen to 340 GPa (gigapascals) at room temperature, about 3.4 million times atmospheric pressure. Some theorists predicted that such pressures, which approach those at Earth's centre, should be high enough for the insulator-metal transition to occur.

      Robert C. Cauble and associates of the Lawrence Livermore National Laboratory, Livermore, Calif., and the University of British Columbia reported the first experimental evidence for the long-awaited transition. They used a powerful laser beam to compress a sample of deuterium, an isotope of hydrogen, to 300 GPa. The laser simultaneously heated the deuterium to 40,000 K (about 70,000° F). In the experiments the sample began to show signs of becoming a metal at pressures as low as 50 GPa, as indicated by increases in its compressibility and reflectivity. Both characteristics are directly related to a substance's electrical conductivity. Cauble's group chose deuterium because it is easier to compress than hydrogen, but they expected that hydrogen would behave in the same way. Confirmation of the theory would do more than provide new insights into the fundamental nature of matter. It would lend support to an idea, proposed by astronomers, that giant gas planets like Saturn and Jupiter have cores composed of metallic hydrogen created under tremendous pressure.

      Chemists long had sought methods for glimpsing the intermediate products that form and disappear in a split second as ultrafast chemical reactions proceed. These elusive reaction intermediates can provide important insights for making reactions proceed in a more direct, efficient, or productive fashion. A. Welford Castleman, Jr., and associates of Pennsylvania State University reported development of a new method to "freeze" chemical reactions on a femtosecond (one quadrillionth of a second) time scale. Their technique involved use of a phenomenon termed a Coulomb explosion to arrest a reaction and detect intermediates. A Coulomb explosion occurs when a particle, such as a molecule, has acquired many positive or negative electric charges. The like charges produce tremendous repulsive forces that tear the particle apart. A Coulomb explosion that occurs during a chemical reaction instantly halts the reaction. Fragments left behind provide direct evidence of the intermediates that existed in the split second before the explosion.

      Castleman's group used a pulse from a powerful laser to ionize particles, and so trigger a Coulomb explosion, in a reaction involving the dimer of 7-azaindole. (A dimer is a molecule formed of two identical simpler molecules, called monomers.) When the dimer is excited by light energy, protons (hydrogen ions) transfer from one monomer to the other, converting the dimer into an alternative structural form called a tautomer. The explosion froze this reaction, which allowed Castleman's group to determine exactly how the proton transfer occurs.

      In the 1980s physicists developed laser and magnetic techniques for trapping individual atoms at ultracold temperatures, which allowed their properties to be studied in detail never before possible. At room temperature the atoms and molecules in air move at average speeds of roughly 1,700 km/h (about 1,050 mph), which makes observation difficult. Intense chilling, however, slows atomic and molecular motion enough for detailed study. Specially directed laser pulses reduce the motion of atoms, sapping their energy and creating a cooling effect. The slowed atoms then are confined in a magnetic field. Chemists had wondered for years whether laser cooling techniques could be extended to molecules and thus provide an opportunity to trap and study molecular characteristics in greater detail.
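
      Those speeds follow from the kinetic theory of gases. As a rough check on the figure above, the sketch below evaluates the root-mean-square speed √(3kT/m), which runs slightly higher than the average speed, for a nitrogen molecule near room temperature.

    from math import sqrt

    K_BOLTZMANN = 1.380649e-23   # joules per kelvin
    AMU = 1.66054e-27            # kilograms per atomic mass unit

    def v_rms(mass_amu, temp_kelvin):
        """Root-mean-square speed, sqrt(3kT/m), in metres per second."""
        return sqrt(3 * K_BOLTZMANN * temp_kelvin / (mass_amu * AMU))

    v = v_rms(28.0, 293.0)       # molecular nitrogen, N2
    print(v, v * 3.6)            # about 511 m/s, i.e., about 1,840 km/h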

      John M. Doyle and associates at Harvard University reported a new procedure for confining atoms and molecules without laser cooling. In their experiments the researchers focused a laser on solid calcium hydride, liberating calcium monohydride molecules. They chilled the molecules with cryogenically cooled helium, reducing their molecular motion, and then confined the molecules in a magnetic trap. The technique could have important implications for chemical science, leading to new insights into molecular interactions and other processes.

Inorganic Chemistry.
      Gold is somewhat unusual among its neighbours in the periodic table of elements. Whereas the transition metals platinum and palladium, for instance, have become important industrial catalysts, gold has long been regarded as much less active catalytically. In the past few years, however, researchers reported that gold has extraordinarily high catalytic activity when dispersed as extremely fine particles on supports such as titanium dioxide. In that form gold is active in such processes as low-temperature catalytic combustion, partial oxidation of hydrocarbons, hydrogenation of unsaturated hydrocarbons, and reduction of nitrogen oxides.

      During the year D.W. Goodman and associates at Texas A & M University at College Station reported a much-anticipated explanation for this unusual behaviour. They used scanning tunneling microscopy/spectroscopy and other techniques to study small clusters of gold atoms supported on a titanium dioxide surface. Gold's catalytic activity was found to be related to the thickness of the gold layers, with maximum activity for clusters consisting of about 300 atoms. The findings suggested that supported clusters of metal atoms, in general, may have unusual catalytic properties as cluster size becomes smaller.

      In past research Mika Pettersson and associates of the University of Helsinki, Fin., had synthesized a number of unusual compounds consisting of an atom of the rare gas xenon (Xe) or krypton (Kr), a hydrogen atom, and an atom or chemical group possessing enough affinity for electrons to allow it to bond with the rare-gas atom. The compounds included HXeH, HXeCl, HXeBr, HXeI, HXeCN, HXeNC, HKrCl, and HKrCN. During the year the chemists added to this list with their report of the synthesis of the first known compound containing a bond between xenon and sulfur (S). The compound, HXeSH, was produced by using ultraviolet light at specific wavelengths to dissociate hydrogen sulfide (H2S) in a low-temperature xenon matrix.

Organic and Applied Chemistry.
      Chemists have synthesized a wide variety of fullerene molecules since 1990, when the soccer-ball-shaped, 60-carbon molecule buckminsterfullerene (C60), the first member of this new family of carbon molecules, was produced in large quantities. All of the fullerene molecules structurally characterized during the period, however, have had a minimum of 60 carbon atoms. Some chemists argued that C60 was the smallest fullerene stable enough to be synthesized in bulk quantities. During the year Alex Zettl and colleagues of the University of California, Berkeley, overturned that notion with the synthesis of the "minifullerene" C36. They used the arc-discharge method, in which an electric arc across two graphite electrodes produces large quantities of fullerenes. The bonding in C36, like that in C60, comprises three-dimensional arrangements of hexagons and pentagons, with the minimum possible number of shared pentagon-pentagon bonds.

      Nuclear magnetic resonance measurements indicated that the adjacent pentagons are highly strained in the fullerene's tightly bound molecular structure. Theorists speculated that the bond strain is so severe that C36 would likely prove to be the smallest fullerene to be made in bulk quantities. The extreme strain may also turn out to enhance the molecule's superconducting properties. Like C60, C36 displays increased electrical conductivity when doped with alkali metals. Zettl speculated that C36 may prove to be a high-temperature superconductor with a higher transition temperature than that of C60.

      Polyethylene's great versatility makes it the single most popular plastic in the world. Although all polyethylene is made from repeating units of the same building-block molecule, the monomer ethylene, catalysts used in the polymerization process have dramatic effects on the physical properties of the plastic. Mixing ethylene with certain catalysts yields a polymer with long, straight, tough molecular chains termed high-density polyethylene (HDPE). HDPE is used to make plastic bottles, pipes, industrial drums, grocery bags, and other high-strength products. A different catalyst causes ethylene to polymerize into a more flexible but weaker material, low-density polyethylene (LDPE). LDPE is used for beverage-carton coatings, food packaging, cling wrap, trash bags, and other products.

      American and British chemists, working independently, reported discovery of a new group of iron- and cobalt-based catalysts for polymerizing ethylene. Experts described the discovery as one of the first fundamentally new advances in the field since the 1970s. The catalysts were as active as the organometallic catalysts called metallocenes in current use for HDPE production—in some instances more active. They also had potential for producing a wider range of polymer materials at lower cost. In addition, the iron-based catalysts were substantially more active than current materials for the production of LDPE. Maurice Brookhart of the University of North Carolina at Chapel Hill headed the U.S. research team. Vernon C. Gibson of Imperial College, London, led the British group.

      Adipic acid is the raw material needed for production of nylon, which is used in fabrics, carpets, tire reinforcements, automobile parts, and myriad other products. In the late 1990s about 2.2 million metric tons of adipic acid were produced worldwide each year, which made it one of the most important industrial chemicals. Conventional adipic acid manufacture involves the use of nitric acid to oxidize cyclohexanol or cyclohexanone. Growing interest in environmentally more benign chemical reactions, often called green chemistry, was making the traditional synthesis undesirable because it produces nitrous oxide as a by-product. Nitrous oxide was believed to contribute to depletion of stratospheric ozone and, as a greenhouse gas, to global warming. Despite the adoption of recovery and recycling technology for nitrous oxide, about 400,000 metric tons were released to the atmosphere annually. Adipic acid production accounted for 5-8% of nitrous oxide released into the atmosphere through human activity.

      Kazuhiko Sato and associates at Nagoya (Japan) University reported development of a new, "green" synthetic pathway to adipic acid. It eliminated production of nitrous oxide and the use of potentially harmful organic solvents. Their alternative synthesis used 30% hydrogen peroxide to oxidize cyclohexene directly to colourless crystalline adipic acid under solvent- and halide-free conditions. Sato reported that the process was suitable for use on an industrial scale and could be the answer to the worldwide quest for a "green" method of synthesizing adipic acid. The major barrier was cost—hydrogen peroxide was substantially more expensive than nitric acid—but stricter environmental regulations on nitrous oxide emission could make the new synthetic process more attractive.
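
      Assuming complete oxidation of the alkene to the diacid, the overall stoichiometry of such a route balances as C6H10 + 4 H2O2 → C6H10O4 + 4 H2O, with water the only by-product, the feature that makes the synthesis "green."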

MICHAEL WOODS

Physics

Particle Physics.
      Researchers in 1998 reported the most convincing evidence to date that the subatomic particle called the neutrino has mass. The standard model, science's central theory of the basic constituents of the universe, involves three families of observable particles: baryons (such as protons and neutrons), leptons (such as electrons and neutrinos), and mesons. Of those particles the neutrino has been the most enigmatic. Its existence was first postulated in 1930 by the Austrian physicist Wolfgang Pauli to explain the fact that energy appeared not to be conserved in nuclear beta decay (the decay of an atomic nucleus with the emission of an electron). Neutrinos interact so weakly with other matter that they are extraordinarily difficult to observe; confirmation of their existence did not come until a quarter century after Pauli's prediction. The assumption that neutrinos are massless particles is built into the standard model, but there is no theoretical reason for them not to have a tiny mass.

      Three types of neutrinos were known: electron neutrinos, emitted in beta decay; muon neutrinos, emitted in the decay of a particle known as a pion and first observed in 1962; and tau neutrinos, produced in the decay of an even more exotic particle, the tau. Although the existence of the tau neutrino had been supported by indirect evidence, it was only during 1998 that the particle was reported to have been observed for the first time. Physicists at the Fermi National Accelerator Laboratory (Fermilab), Batavia, Ill., carried out experiments in which they smashed a dense stream of protons into a tungsten target. Less than one collision in 10,000 produced a tau neutrino, but after months of taking data the Fermilab team claimed to have seen direct effects of at least three of these elusive particles.

      That finding was overshadowed, however, by results from Super-Kamiokande, an experimental effort involving an international collaboration of physicists from 23 institutions and headed by the University of Tokyo's Institute for Cosmic Ray Research. The mammoth Super-Kamiokande detector, which was situated 1,000 m (3,300 ft) below the surface in a Japanese zinc mine to minimize the effect of background radiation, comprised a 50,000-ton tank of ultrapure water that was surrounded by 13,000 individual detector elements. Super-Kamiokande was able to observe electron neutrinos and muon neutrinos (but not tau neutrinos) that are produced continually in Earth's atmosphere by cosmic ray bombardment from space. Even that huge detector, however, was able to detect only one or two such neutrinos per day and required months of operation to accumulate sufficient data.

      In 1998 Super-Kamiokande physicists reported a dramatic result. Whereas they found the rate of detection of electron neutrinos to be the same in all directions, they detected significantly fewer muon neutrinos coming upward through Earth than coming directly downward. Theory predicts that, if neutrinos have mass, muon neutrinos should transform, or oscillate, into tau neutrinos with a period depending on the mass difference between the two types. Those neutrinos traveling the longer distance through Earth to the detector had more time to oscillate. Results suggested a mass difference equal to one ten-millionth of the mass of the electron, giving positive evidence of the existence of neutrino mass and a lower bound for its value.
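
      In the standard two-flavour description, the probability that a muon neutrino of energy E has become a tau neutrino after traveling a distance L is P(νμ → ντ) = sin²2θ · sin²(1.27Δm²L/E), with Δm² in eV², L in km, and E in GeV, where θ is the mixing angle and Δm² the difference of the squared masses. Upward-going atmospheric neutrinos travel thousands of kilometres through the Earth rather than the tens of kilometres traveled by downward-going ones, which is why the deficit appears from below.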

      The result had two exciting consequences. First, because a nonzero mass for the neutrino is a phenomenon lying beyond the framework of the standard model, it may be the first glimpse of a possible new "grand unified" theory of particle physics that transcends the limitations of the current theory. Second, neutrinos with mass may be a solution to a major problem in cosmology. Present models of the universe require it to have a mass far in excess of the total mass of observable constituents. The presence in the cosmos of huge numbers of neutrinos, each carrying even a tiny mass, may make up this deficit.

Solid-State Physics.
      In 1998 investigations of the physics of systems using single atoms and small numbers of electrons were making possible electronic devices that had been inconceivable just a few years earlier. These studies were being aided by the development of methods to manipulate single atoms or molecules with unprecedented precision and investigate their properties. In one example Elke Scheer and co-workers of the University of Karlsruhe, Ger., measured the electrical properties of a single atom forming a bridge across two conducting leads. Their achievement suggested the possibility of making even smaller and faster electronic switching devices.

      In another development physicists from Yale University and Chalmers University of Technology, Göteborg, Swed., produced a variant of the field-effect transistor (FET)—a basic building block of modern computer systems—called a single-electron transistor (SET). In a FET a flow of electrons through a semiconducting channel is switched on and off by a voltage in a nearby "gate" electrode. In a SET the semiconducting channel is replaced by an insulator, except for a tiny island of semiconductor halfway along the channel. In the device's conducting mode a stream of electrons crosses the insulator by "hopping" one at a time on and off the island. Such devices were highly sensitive to switching voltages and extremely fast.

      The SET achievement was an example of the developing physics of quantum dots, "droplets" of electric charge that can be produced and confined in semiconductors. Such droplets, having sizes measured in nanometres (billionths of a metre), can contain electrons ranging in number from a single particle to a tailored system of several thousands. Physicists from Delft (Neth.) University of Technology, Stanford University, and Nippon Telegraph and Telephone in Japan used quantum dots to observe many quantum phenomena seen in real atoms and nuclei, from atomic energy level structures to quantum chaos. A typical quantum dot is produced in a piece of semiconductor a few hundred nanometres in diameter and 10 nanometres thick. The semiconductor is sandwiched between nonconducting barrier layers, which separate it from conductors above and below. In a process called quantum tunneling, electrons can pass through the barrier layers and enter and leave the semiconductor, forming the dot. Application of a voltage to a gate electrode around the semiconductor allows the number of electrons in the dot to be changed from none to as many as several hundred. By starting with one electron and adding one at a time, researchers can build up a "periodic table" of electron structures.

      Such developments were giving physicists the ability to construct synthetic structures at atomic-scale levels to produce revolutionary new electronic components. At the same time, research was being conducted to identify the atoms or molecules that give the most promising results. Delft physicist Sander J. Tans and co-workers, for example, constructed a FET made of a single large molecule—a carbon nanotube—i.e., a hollow nanometre-scale tubule of bonded carbon atoms. Unlike other nanoscale devices, the FET worked at room temperature. Future generations of electronics could well be based on carbon rather than silicon.

Condensed-Matter Physics.
      Whereas the properties of ordinary condensed gases were long familiar to physicists, quantum mechanics predicted the possibility of one type of condensate having dramatically different properties. Most condensed gases consist of a collection of atoms in different quantum states. If, however, it were possible to prepare a condensate in which all the atoms were in the same quantum state, the collection would behave as a single macroscopic quantum entity with properties identical to those of a single atom. This form of matter was dubbed a Bose-Einstein condensate after the physicists—Einstein and the Indian physicist Satyendra Nath Bose—who originally envisaged its possibility in the early 20th century. There was no theoretical difficulty about producing such a condensate, but the practical difficulties were enormous, since it was necessary to cool a dilute gas near absolute zero (−273.15° C, or −459.67° F) in order to remove practically all its kinetic energy without causing it to condense into an ordinary liquid or solid.
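
      Textbook statistical mechanics makes the difficulty quantitative: for an ideal gas of bosons of mass m and number density n, condensation sets in below a temperature Tc given by kTc = (2πℏ²/m)(n/2.612)^(2/3), where ℏ is Planck's constant divided by 2π and k is Boltzmann's constant. Because the gas must stay dilute to avoid ordinary liquefaction, n is tiny, and Tc therefore lies within millionths of a degree of absolute zero.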

      Bose-Einstein condensates were first produced in 1995, but the condensate's atoms were trapped in a magnetic "bottle," which had a distorting effect. The removal of such distortions was made possible by the development of laser cooling devices in which kinetic energy is "sucked away" from the atoms into the laser field. Using such a device, physicists at the Massachusetts Institute of Technology succeeded in 1998 in producing a condensate of 100 million hydrogen atoms at a temperature of 40 millionths of a degree above absolute zero. Such a condensate exhibited macroscopic quantum effects like those seen in superfluids, and the interactions between individual atoms could be "tuned" by means of a magnetic field.

General Relativity.
      Although Einstein's general theory of relativity is generally accepted, physicists have suggested other possible theories of gravitation. Two observations gave results in confirmation of predictions made by Einstein. One was the result of an experiment using two Lageos laser-ranging satellites and carried out by physicists from the University of Rome, the Laboratory of Spatial Astrophysics and Fundamental Physics, Madrid, and the University of Maryland. It investigated the Lense-Thirring effect, which predicts that time as measured by a clock traveling in orbit around a spinning object will vary, depending on whether the orbit is in the direction of the spin or against it. The parameter that measures the strength of the effect was found to have a value of 1.1 ± 0.2, compared with general relativity's prediction of 1.

      A second, more dramatic prediction of general relativity was observed by a team of astronomers from the U.S., the U.K., France, and The Netherlands. According to the theory, in the same way that light can be focused by a glass lens, light from a distant luminous object can be focused by the distortion of space by a massive foreground object such as a galaxy—a phenomenon called gravitational lensing. In a special case, called an Einstein ring, the image of the light source will smear out into the shape of a perfect ring around the foreground object. Using three radio telescopes, the group zeroed in on a possible Einstein ring, after which imaging with an infrared camera aboard the Earth-orbiting Hubble Space Telescope revealed the complete ring—the first unambiguous case seen in optical and infrared light and a dazzling demonstration of Einstein's theory.

DAVID G.C. JONES

Astronomy
      (For information on Eclipses, Equinoxes and Solstices, and Earth Perihelion and Aphelion, see Tables (Earth Perihelion and Aphelion, 1999).)

      The year 1998 brought new discoveries about astronomical objects as close as the Moon and as far away as the most distant galaxies ever detected. More planets were detected orbiting other stars, and the total number found to date reached an even dozen. Powerful bursts of gamma rays were recorded from stars within the Milky Way Galaxy and from the remotest regions of space. The universe itself appeared to be accelerating in its rate of expansion, contrary to a requirement of the most widely held theoretical model of the cosmos.

Solar System.
      Perhaps the most electrifying astronomical announcement of the year was a prediction of a close encounter of an asteroid with Earth. In early March Brian Marsden of the Harvard-Smithsonian Center for Astrophysics, Cambridge, Mass., and director of the International Astronomical Union's Central Bureau for Astronomical Telegrams announced his calculations that a 1.6-km (one-mile)-wide asteroid, 1997 XF11, discovered the previous December, would pass within 48,000 km (30,000 mi) of Earth on Oct. 26, 2028. This would be the closest known approach of a body of such size since the asteroid that was thought to have hit Earth 65 million years ago. The announcement made a powerful impression on the media, since it coincided with prerelease publicity for two major Hollywood movies, Deep Impact and Armageddon, both of which explored the consequences of the collision of a large body with modern Earth. Shortly after the original announcement, however, new orbital calculations based on 1990 "prediscovery" images of 1997 XF11 showed that Earth was not in imminent danger of a collision, with the asteroid expected to pass about 970,000 km (600,000 mi) from Earth.

      Although humans had first walked on the Moon nearly 30 years earlier, many unanswered questions remained in 1998 concerning the origin and evolution of Earth's nearest neighbour. In January NASA launched Lunar Prospector, a small orbiter that carried a bevy of instruments to measure lunar gravity, magnetism, and surface chemical composition. In March William C. Feldman of Los Alamos (N.M.) National Laboratory and his collaborators announced that the craft had detected evidence of large quantities of water lying in the sunless craters of the lunar polar regions. The water was believed to have been carried to the Moon by comet bombardments in past aeons and to have survived only because the polar craters are in permanent shadow and cold. It would be a great resource for any future human presence on the Moon.

      Ever since Galileo Galilei first saw the rings of Saturn in the early 1600s, scientists and public alike had been fascinated by these beautiful astronomical apparitions. Beginning in the late 1970s, ring systems were discovered around the other giant gas planets in the solar system—first Uranus and then Jupiter and Neptune. The rings of Jupiter, first seen in photographs returned by the two Voyager spacecraft, are quite thin. The outermost one was shown by the Jupiter-orbiting Galileo spacecraft in 1998 to comprise two rings, dubbed gossamer rings. All of Jupiter's rings consist of very fine dust, a kind of reddish soot. Because of radiation from the Sun, these small particles should be dragged into Jupiter in a time that is short compared with the age of the solar system. How then have the rings survived? The Galileo craft sent back data providing a likely answer: the dust is replenished with new material kicked off four of Jupiter's tiny inner moons by the continuing impacts of interplanetary meteoroids.

Stars.
      Since 1992, astronomers had been detecting the presence of planets around nearby stars by finding small periodic variations in the speeds of these stars caused by the gravitational tugs of their unseen planetary companions. By the end of 1998, the discovery of 12 planets around other stars had been reported, which made the number of known extrasolar planets greater than the number of planets within the solar system. In all cases the planets are very close to their parent stars, and most have masses measured to be several times that of Jupiter. These two factors combined to produce the relatively large tugs on the parent stars that made the gravitational effects of the planets detectable.

      One of the planets detected during the year orbits the low-mass star Gliese 876, which at a distance of 15 light-years is one of the Sun's nearest neighbours. Geoffrey W. Marcy of San Francisco State University and his collaborators reported that the planet has a 61-day orbital period, placing it closer to Gliese 876 than Mercury is to the Sun. In spite of this proximity, the surface temperature of the planet is an estimated −75° C (−103° F). Calculations suggested that water might exist beneath the planet's surface in the form of liquid drops, one of the necessary conditions for life as it is known on Earth. In a second finding, Susan Terebey of Extrasolar Research Corp., Pasadena, Calif., and her collaborators reported the first image of a possible extrasolar planet. Using the Hubble Space Telescope's Near Infrared Camera and Multi-Object Spectrometer, they detected a dim object in the constellation Taurus, about 450 light-years from Earth. Designated TMR-1C, the object appeared to be connected to two young stars by a gaseous bridge. At year's end its interpretation as a planet ejected by one of the stars was still being hotly debated.

      Since the early 1970s sudden bursts of celestial gamma rays had been detected by instruments aboard Earth-orbiting and interplanetary spacecraft. Without seeing obvious optical counterparts, however, astronomers had found it difficult to say with certainty where the bursts were coming from. In 1997, following the discovery of X-ray and optical counterparts for several of the events, it was at last possible to argue convincingly that most of the gamma-ray burst events come from cosmological distances rather than from within or near the Milky Way Galaxy. Nevertheless, some events, called soft gamma-ray repeaters, were known to be associated with objects within the galaxy.

      On August 27 a tremendous burst of gamma rays and X-rays lasting about five minutes pelted Earth. It was so powerful that it produced noticeable ionization in the Earth's upper atmosphere, comparable to that produced by the Sun in the daytime. The X-rays were found to vary with a 5.16-second period, exactly the same as that of an active X-ray source, SGR 1900+14, lying within the galaxy some 20,000 light-years from Earth in the constellation Aquila. Such X-ray sources were thought to be rotating, magnetized neutron stars, and it was suggested that events like the August 27 burst are caused by a "glitch," or starquake, on a neutron star with an extraordinarily high magnetic field, possibly a million billion times larger than that of Earth. Such stellar objects were dubbed magnetars. According to one idea, the magnetar's enormous magnetic field occasionally cracks open the crust of the star, which leads in some way to the production of energetic charged particles and gamma rays.

Galaxies and Cosmology.
      More than 2,000 celestial bursts of gamma rays, each typically lasting some tens of seconds, had been detected by late 1998. On Dec. 14, 1997, one such burst, designated GRB 971214, was accompanied by an X-ray afterglow observed by the Italian-Dutch BeppoSAX satellite, which led to the subsequent observation of a visible afterglow. In early 1998 S. George Djorgovski of the California Institute of Technology and his colleagues, using the giant Keck II Telescope in Hawaii, were able to identify the host galaxy and found that it lies at a distance of about 12 billion light-years. The burst in the gamma-ray portion of the spectrum alone represented roughly 100 times the total energy of a typical supernova explosion, comparable to all of the energy radiated by a typical galaxy in several centuries. The most widely held theory of gamma-ray bursts—that they arise from the merger of two neutron stars—was called into question for being unable to generate sufficient energy to explain the event. Alternatively it was proposed that GRB 971214 was the result of a "hypernova," a kind of super-supernova, or that it was produced by a rotating black hole.

      Astronomers continued scanning the skies for ever more distant galaxies. Their goal was not to add new entries to some "Guinness Book of Cosmic Records" but to determine how long after the big bang the first galaxies formed and how they evolved at that time. The farther out one looks in space, the earlier one is seeing back in time. Because of the expansion of the universe, the more distant a galaxy, the faster it is receding from Earth. The red shift of a galaxy, or shift in the wavelength of its light toward the red end of the spectrum, is the measure of its recession velocity and therefore its distance. In 1997 a galaxy with a red shift of 4.92 was found, the most distant object reported at the time. In 1998 the record fell several times. In March a galaxy with a red shift of 5.34 was reported by Arjun Dey of Johns Hopkins University, Baltimore, Md., and colleagues. In May a group headed by R.G. McMahon of the University of Cambridge extended the record to 5.64, and in November the same group reported studies of another distant galaxy, this one with a red shift of 5.74. It formed when the universe was only 7% of its present age. The object appeared to be creating new stars at a rate of about 10 per year at that time.
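
      Red shift translates directly into the relative size of the universe when the light was emitted: 1 + z = (observed wavelength)/(emitted wavelength), which also equals the factor by which the universe has stretched since emission. At z = 5.74 the universe's linear scale was 1/6.74, or about 15%, of its present value.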

      Studies of objects with high red shifts were also the key to understanding the ultimate fate of the universe as a whole. In the 1920s astronomers began measuring the distances and velocities of galaxies, and in 1929 the U.S. astronomer Edwin Hubble announced the discovery of a simple linear relationship between a galaxy's distance and its recession velocity. The relationship had been predicted (and even observed) earlier based on the idea that the universe had come into being in a violent explosion, leading to the expansion of space and the resultant recession of galaxies from one another. The future fate of the expansion depends on the competition between the initial expansion rate and the gravitational pull of the matter filling space, which should lead to a deceleration of the expansion. Whether the universe will expand forever or ultimately collapse depends on whether the mass density of the universe is greater or less than a critical value.
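
      That critical value follows from the expansion rate alone: equating the kinetic energy of expansion with the gravitational potential energy gives a critical density of 3H0²/(8πG), which for an H0 of about 70 km/sec/Mpc works out to roughly 9 × 10⁻²⁷ kg per cubic metre, equivalent to about five hydrogen atoms per cubic metre of space.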

      For decades astronomers had attempted to measure the expansion rate (called the Hubble constant) and the mean density of the universe (or, equivalently, its deceleration rate). In 1998 two teams of astronomers independently announced new results for those parameters. As their distance indicators, both teams used Type Ia supernovas, extremely bright exploding stars thought to have nearly identical intrinsic peak brightnesses, which makes them useful in comparing the distances to various galaxies. The Supernova Cosmology Project, headed by Saul Perlmutter of the Lawrence Berkeley National Laboratory in California, reported on measurements of the apparent brightnesses and red shifts of 42 Type Ia supernovas. The rival High-Z Supernova Search Team, headed by Brian Schmidt of the Mount Stromlo and Siding Spring Observatories in Australia, based their conclusions on a study of 16 Type Ia supernovas. Both teams came up with an astonishing result: not only is the rate of expansion of the universe not decelerating, but it also appears to be accelerating slightly.

      The version of cosmology favoured by many theoretical physicists, the so-called inflationary big-bang universe, required in its simplest form that the universe have a rather high mass density and that its expansion rate be slowing. An idea originally proposed by Albert Einstein in 1917, however, could account for the new observations. Having been told by observational astronomers at that time that the universe is static, Einstein reluctantly introduced a "cosmological constant," a kind of universal sea of repulsive mass and energy, into his general theory of relativity to counteract the attraction of gravity. After the discovery of the expansion of the universe, Einstein referred to the addition of this constant as his "greatest blunder." Nevertheless, if a new repulsive force turned out to exist, Einstein could be proved once again to have been the most prescient scientist of the 20th century.

KENNETH BRECHER

Space Exploration
      In sharp contrast to the previous year, Russia's orbiting space station Mir had a quiet 1998, whereas efforts to assemble the International Space Station (ISS) began under a cloud of management and budget problems. Exploration of the planets and Sun continued with new probes. The world also mourned the death of U.S. astronaut Alan Shepard, Jr. (see OBITUARIES (Shepard, Jr., Alan )), on July 21. Shepard was the first American in space (1961) and, as commander of Apollo 14 (1971), the fifth human to walk on the Moon.

Manned Spaceflight.
      The most watched space mission of the year was that of the space shuttle Discovery (STS-95, October 29-November 7), whose crew, in a controversial decision by NASA, included U.S. Sen. John Glenn. Glenn, who in 1962 was the first American to orbit Earth, had campaigned for a seat on a shuttle mission. (The Discovery flight was only Glenn's second trip into space; space-program observers generally believed that he had not been allowed to fly again in the 1960s out of concern that a national hero might be put at undue risk.) NASA officials asserted that Glenn's presence on the shuttle mission would contribute to research on the aging process—Glenn was 77 at the time—but critics contended that the benefits would be minimal and that comparable data could be obtained from astronauts whom NASA was removing from flight status because they were almost as old as Glenn. The primary mission of STS-95 was to carry the Spacehab module, which contained an array of materials-sciences and life-sciences experiments.

      The shuttle Columbia flew the last Spacelab mission, called Neurolab, during the year (STS-90, April 17-May 3). Spacelab, a reusable laboratory module, had been developed by the European Space Agency (ESA) as its first foray into manned spaceflight. The Neurolab mission performed a range of experiments on the way that nervous systems react and adapt to the effects of space travel. In addition to the human crew members, the experimental subjects included mice and rats (some pregnant), swordtail fish, snails, crickets, and cricket eggs. The results of the mission could have applications to neurological disorders such as Parkinson's disease.

      Two shuttle missions concluded U.S. activities aboard Mir. Endeavour (STS-89, January 22-31) made the eighth shuttle docking with the Russian space station, and Discovery (STS-91, June 2-12) made the ninth and last one. Endeavour replaced a U.S. astronaut who had been aboard Mir since the previous shuttle visit and carried experiments in protein crystal growth (for pharmaceutical studies) and low-stress soil mechanics (to understand how soil behaves when it liquefies during earthquakes). Discovery retrieved the American astronaut and delivered more supplies to the Russian crew staying aboard Mir. The shuttle crew also conducted microgravity-science and cosmic-ray experiments.

      Operations aboard Mir included several space walks by the crew to repair the facility. Russia launched two manned spacecraft to Mir, Soyuz TM-27 on January 29 and TM-28 on August 13. Soyuz TM-26 (launched in 1997) returned to Earth on February 19 carrying two cosmonauts who had been aboard Mir since 1997 and a third who had launched with TM-27. A similar pattern was followed when TM-27 returned with three cosmonauts on August 25. One more manned launch to Mir, Soyuz TM-29 in February 1999, was scheduled to wrap up experiments and start shutting down systems.

      Assembly of the long-delayed and trouble-plagued ISS started on November 20 with the launch by Russia of the station's first element, Zarya ("Dawn," formerly called the FGB module), into an initial 350 × 185-km (220 × 115-mi) elliptical orbit inclined 51.6° to the Equator. Engine firings over the next few days circularized the orbit and raised it to about 385 km (240 mi). Zarya was an unpiloted space "tugboat" providing early propulsion, steering, and communications for the station's first months in orbit. Eventually ISS was to comprise dozens of major elements, including pressure modules containing living and working spaces for a permanent crew of six persons and an open-latticework truss 108.6 m (356.4 ft) long supporting eight massive solar arrays for the station's electrical power.

      Zarya, although built by Russia, was counted as a U.S. launch because NASA paid $240 million for it. The module would provide some working space, altitude control, power, and other services while the U.S. and its major partners—Russia, ESA, Canada, and Japan—developed and attached additional elements.

      On December 4 Endeavour (STS-88) carried the second ISS element into orbit; this was the first connecting node, a U.S.-built element called Unity. After Endeavour rendezvoused with Zarya, astronauts grappled the Russian element with the shuttle's robot arm. They then joined it with Unity and completed various connections inside and outside the nascent ISS core. Barring setbacks in space or on Earth, a series of U.S. shuttle and Russian rocket launches in 1999 would continue carrying up additional elements and equipment and assembly crews.

      The program remained hobbled by a number of technical delays, mostly on the Russian side. U.S. officials claimed that Russia was not properly funding its commitments, and NASA was asked to bail out the Russian program with additional funds. In October NASA bought Russia's share of the research time aboard the station to provide a $60 million transfusion.

      A potential stumbling block was the Service Module, a Russian element rescheduled for launch in March 1999. In addition to its function as an early station living quarters, it carried rocket engines and propellants to restore the altitude that the station would steadily lose to atmospheric drag. In 1998 Russia was so far behind in the development of the module that NASA started preliminary plans for a backup Interim Control Module derived from a classified U.S. Navy satellite. Assuming that one or the other country kept the program on schedule, the first permanent three-person crew would be taken to the ISS by a Soyuz launch in the summer of 1999. As with Mir missions, the Soyuz was to stay attached as a lifeboat. By late 1999 attachment of the U.S. Laboratory Module would allow limited science research to start.

Space Probes.
      While scientists continued to absorb the data from the successful Mars Pathfinder mission of 1997, other efforts to explore the red planet continued, and NASA sent its first probe to the Moon since Apollo 17 in 1972.

      Mars Global Surveyor, which had achieved an initial elliptical orbit around Mars in September 1997, continued to work its way into a mapping orbit during the year, although progress was slowed by an incompletely locked solar array and other equipment problems. Scientists expected the satellite to be in its final mapping orbit by early 1999.

      With its July 4 launch of Nozomi ("Hope") from Kagoshima Space Center, Japan became only the third nation (after Russia and the U.S.) to launch a mission to Mars. Nozomi made two flybys of the Moon in September and December to reshape its trajectory for arrival in a highly elliptical Mars orbit in October 1999. Unfortunately, the second maneuver was off target, and Japan had to alter the spacecraft's trajectory for a 2003 arrival. Nozomi's mission was to measure the interaction between the solar wind and the Martian upper atmosphere.

      Of NASA's two new Mars missions, the Mars Climate Orbiter was launched on December 11 for a September 1999 arrival, whereas the Mars Polar Lander was expected to launch on Jan. 3, 1999, and land in the south polar region the following December. During its descent the lander would release two microprobes designed to penetrate the surface and send back data about internal conditions.

      NASA's Lunar Prospector was launched on January 6 by an Athena II vehicle. It entered lunar orbit on January 11 and achieved its final mapping orbit, 100 km (60 mi) high, four days later. It was equipped with a variety of radiation- and particle-measuring equipment to assay the chemistry of the lunar surface. Its major find, announced in March, was strong evidence for the presence of water in the Moon's south polar region—specifically, subsurface ice in areas protected from sunlight. If borne out by later low-level observations, the find would represent a major resource for future interplanetary missions. The water could be electrolyzed into oxygen (valuable as a rocket oxidizer and for crew air) and hydrogen (valuable as a rocket fuel).

      The Jupiter-orbiting Galileo spacecraft, which had completed its primary mission to the giant gas planet in December 1997, started an extended mission of flybys of Jupiter's moon Europa. Earlier Galileo observations had hinted at the presence of an ocean of liquid water—and thus possibly conditions conducive to life—beneath Europa's icy surface. The Cassini mission to put a spacecraft in orbit around Saturn and drop a probe into the atmosphere of Saturn's moon Titan continued smoothly after the craft's October 1997 launch. It flew past Venus for a gravity assist in April and was set to do the same with Earth in August 1999.

      The Near Earth Asteroid Rendezvous (NEAR) mission approached its goal following a January flyby of Earth that reshaped its trajectory toward the asteroid Eros. On Jan. 10, 1999, NEAR was to go into an orbit around Eros that controllers on Earth would then reshape into a variable one for optimal observations of the irregularly shaped body. A crucial mid-course correction burn was missed in December, however, and the rendezvous was postponed a year. NEAR was to image Eros, map its surface and weak gravity field, and study its composition and other properties.

      The Deep Space 1 probe, launched on October 24, was designed to test a dozen new space technologies, including a low-thrust, high-efficiency ion engine, autonomous navigation, and superminiature cameras and electronics. Part of its mission—flybys of an asteroid and a comet—was threatened when the ion engine shut down unexpectedly on November 11, only minutes after it was powered up for a test. Engineers soon determined the cause of the problem—apparently a common self-contamination effect—and started long-duration burns on November 24.

      In June NASA formed an Astrobiology Institute to investigate the possibilities of life beyond Earth. The institute was to study the extreme conditions under which life exists on Earth and compare them with conditions on Mars, ice-covered Europa, methane-shrouded Titan, and even asteroids and meteorites. It would also be concerned with planetary protection methods to ensure that alien life was not accidentally released on Earth.

Unmanned Satellites.
      Solar astronomy was given a powerful new tool with the launch on April 1 of the Transition Region and Coronal Explorer (TRACE) to study the mysterious region of the solar atmosphere where temperatures soar from 5,000 K (8,500° F) near the visible surface to about 10,000,000 K (18,000,000° F) higher in the corona. TRACE carried an extreme-ultraviolet telescope to monitor the plasma trapped by thin bundles of twisted magnetic force lines, which were presumed to contribute to coronal heating. TRACE soon provided a dazzling series of images of the transition region and corona.

      The field of solar studies was dealt a major, though temporary, blow on June 25 when contact was lost with the Solar and Heliospheric Observatory (SOHO), positioned in a "halo" orbit around L-1, a gravitational balance point between Earth and the Sun about 1.5 million km (930,000 mi) away from Earth. Contact was reestablished in September, and by mid-October scientists were reactivating the science instruments.

      The last spacecraft in the International Solar-Terrestrial Physics campaign was launched on Dec. 2, 1997, when Germany's Equator S spacecraft went into an equatorial orbit within the ring current of the Van Allen radiation belt. Data transmission failed in May 1998. The Advanced Composition Explorer, launched in 1997, reached its station in the L-1 halo orbit, where it was to sample the makeup of the solar wind before it struck the Earth's magnetosphere.

      A new chapter in space studies opened with the February 25 launch of the Student Nitric Oxide Explorer, the first of three NASA-funded, student-built and student-operated satellites. The mini-satellite carried instrumentation built by the faculty and students of the University of Colorado to measure how solar X-rays and auroral activity affect nitric oxide (a stratospheric-ozone-destroying gas) in the upper atmosphere. France launched the SPOT 4 remote-sensing and reconnaissance satellite on March 24. SPOT 4 carried instruments that could monitor vegetation at a one-kilometre (0.6-mi) resolution and other cameras that provided images at 10-20-m (33-66-ft) resolution.

Launch Vehicles.
      In October the U.S. Congress passed the Commercial Space Act to allow the Federal Aviation Administration to license firms to fly vehicles back from space. Since the 1980s private firms had been able to acquire licenses for commercial space launches, but until recently the return trip had been too expensive for any but government agencies. The Space Act also required the federal government to foster a stable business environment for space development.

      NASA's X-33 moved ahead with testing of its rocket engines and heat shield and assembly of its first flight hardware. The X-33 was a subscale demonstrator of Lockheed Martin's proposed VentureStar Reusable Launch Vehicle (RLV) that would ascend from ground to orbit as a single unit and then fly back to Earth. No boosters or tanks would be shed along the way. One of the innovative elements of the X-33 was its linear aerospike engine, which comprised two lines of burners firing along a wedge between them. The outer "wall" of the engine was formed by shock waves from the vehicle's high-speed flight. A 2.8-second firing in October at NASA's Stennis Space Center, Bay St. Louis, Miss., initiated tests that would lead to full-scale testing of the engines.

      NASA also moved to ensure complete testing of the X-34, a smaller RLV that was to be air-launched from a Lockheed L-1011 jetliner. NASA was buying parts to make a second vehicle in case the first was seriously damaged. The X-34 was a single-engine winged rocket, 17.8 m (58.4 ft) long and spanning 8.5 m (27.9 ft). It would fly as fast as eight times the speed of sound and reach altitudes as high as 76 km (250,000 ft) to demonstrate various RLV concepts, including low-cost reusability, autonomous landing, subsonic flights through inclement weather, safe abort conditions, and landing in strong crosswinds.

      Several launch failures dotted the calendar during the year, including the first attempt ever by amateurs to launch a satellite, in this case by "rockoon," a rocket carried to high altitude by a balloon. More spectacular failures came with the losses in August of a Titan 4 carrying a classified spy satellite and of a Delta III launcher, on its first flight, carrying a Galaxy X communications satellite. A novel style of launch succeeded on July 7 when Russia orbited Germany's Tubsat-N and Tubsat-N1 remote-sensing microsatellites atop a submarine-launched ballistic missile. Russia hoped to market launch services using missile submarines that it otherwise could not afford to keep operable.

DAVE DOOLING

▪ 1998

Introduction

Mathematics
      A major topic occupying mathematicians in 1997 was the nature of randomness. Popular notions often differ from mathematical concepts; reconciling the two in the case of randomness is important because of the use of randomization in many aspects of life, from gambling lotteries to the selection of subjects for scientific experiments.

      Although the result of a coin toss, i.e., heads or tails, is determined by physical laws, it can be regarded as random because it is not predictable, provided that the coin rotates many times. Similarly, numbers from a computer random-number generator are accepted as random, even though such numbers are usually produced by a purely mechanistic process of computer arithmetic.

      Since the two sides of a coin are quite similar, people agree that heads and tails are equally likely to turn up. Other methods of randomization, however, such as spinning the coin on a tabletop or standing it on edge and striking the table, may favour one outcome over the other if the coin is not absolutely symmetrical. One's perception of the probability of a random event may be based on physical principles such as symmetry (e.g., the six sides of a die are equally likely to come up), but it also may have a less-tangible basis, such as long experience (one rarely wins a big lottery) or subjective belief (some people are lucky).

      Statisticians regard a sequence of outcomes as random if each outcome is independent of the previous ones—that is, if its probability is not affected by previous outcomes. Most people agree that tosses of a coin are independent; the coin has no "memory" of previous tosses or cosmic duty to even out heads and tails in the long run. The belief that after a long sequence of heads, tails is more likely on the next toss is known as the "gambler's fallacy."

      With heads (H) and tails (T) equally likely, the three sequences HHHHHHHH, HTHTHTHT, and HTHHTHTT are all random, and the first two are as likely to occur as the third. If one of the first two occurs, however, the result does not appear random. Many people believe that a random sequence should have no "obvious" patterns; that is, later elements of the sequence should not be predictable from early ones. In the 1960s mathematicians suggested measuring randomness by the length of the computer program needed to reproduce the sequence. For a sequence in which tails always follows heads, the program instructions are simple—just write HT repeatedly. A sequence with no discernible pattern requires a longer program, which enumerates each outcome of the sequence. Requiring a long program is equivalent to having the sequence pass certain statistical tests for randomness.
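
      The spirit of the program-length measure can be illustrated with a short computation. In the sketch below (Python; the zlib compression library serves only as a crude stand-in for "shortest program," and the function name is ours), the perfectly patterned sequence compresses to a tiny description, while a patternless sequence of the same length does not:

        import random
        import zlib

        def description_length(seq):
            # Compressed size in bytes: a rough stand-in for the length
            # of the shortest program that reproduces the sequence.
            return len(zlib.compress(seq.encode("ascii"), 9))

        patterned = "HT" * 50000  # tails always follows heads
        random.seed(1)
        patternless = "".join(random.choice("HT") for _ in range(100000))

        print(description_length(patterned))    # small: "write HT repeatedly"
        print(description_length(patternless))  # far larger: no short description exists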

      According to this measure, however, the first million decimal digits of pi are not random, since very short computer programs exist that can reproduce them. That conclusion contradicts mathematicians' sense that the digits of pi have no discernible pattern. Nevertheless, the spirit of the approach does correspond to human intuition. Research published in 1997 by Ruma Falk of the Hebrew University of Jerusalem and Clifford Konold of the University of Massachusetts at Amherst concluded that people assess the randomness of a sequence by how hard it is to memorize or copy.

      In 1997 freelance mathematician Steve Pincus of Guilford, Conn., Burton Singer of Princeton University, and Rudolf E. Kalman of the Swiss Federal Institute of Technology, Zürich, proposed assessing randomness of a sequence in terms of its "approximate entropy," or disorder. To be random in this sense, a sequence of coin tosses must be as uniform as possible in its distribution of heads and tails, of pairs, of triples, and so forth. In other words, it must contain (as far as possible given its length) equal numbers of heads and tails, equal numbers of each of the possible adjacent pairs (HH, HT, TH, and TT), equal numbers of each of the eight kinds of adjacent triples, and so forth. This must hold for all "short" sequences of adjacent outcomes within the original sequence—ones that are significantly shorter than the original sequence (in technical terms, for all sequences of length less than log₂ log₂ n + 1, where n is the length of the original sequence).

      When this definition is applied to the 32 possible sequences of H and T having a length of five, the only random ones among them are HHTTH, HTTHH, TTHHT, and THHTT. In this case the short sequences under scrutiny have a length less than log₂ log₂ 5 + 1, or about 2.2. Thus, a random sequence with a length of five must have, as far as possible, equal numbers of heads and tails—hence, two of one and three of the other—and equal numbers of each pair—here, exactly one of each among the four successive adjacent pairs. Furthermore, when this definition is applied to the decimal digits of pi, they do form a random sequence. In the case of a nonrandom sequence, the approximate entropy measures how much the sequence deviates from the "ideal."
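
      The length-five example is small enough to verify by exhaustive search. The following sketch (Python; the function and variable names are ours) applies the two criteria just described (heads and tails as balanced as possible, and each of the four adjacent pairs occurring exactly once) to all 32 sequences and recovers exactly the four listed above:

        from itertools import product

        def passes_length5_test(seq):
            # Balanced singles: two of one outcome and three of the other.
            if seq.count("H") not in (2, 3):
                return False
            # Each adjacent pair (HH, HT, TH, TT) must occur exactly once
            # among the four successive pairs in the sequence.
            pairs = [seq[i:i + 2] for i in range(4)]
            return sorted(pairs) == ["HH", "HT", "TH", "TT"]

        sequences = ("".join(s) for s in product("HT", repeat=5))
        print([seq for seq in sequences if passes_length5_test(seq)])
        # -> ['HHTTH', 'HTTHH', 'THHTT', 'TTHHT']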

      Other investigators have used the concept of approximate entropy to investigate the possibility that symptoms anecdotally ascribed to "male menopause" may be sufficiently nonrandom to indicate the existence of such a condition and to assess how randomly the prices of financial stocks fluctuate.

PAUL J. CAMPBELL
      This article updates statistics.

Chemistry

Chemical Nomenclature.
      Decades of controversy over official names for a group of heavy elements ended in 1997 after the International Union of Pure and Applied Chemistry (IUPAC) adopted revised names substantially different from those that it had proposed in 1994. IUPAC is an association of national chemistry organizations formed in 1919 to set uniform standards for chemical names, symbols, constants, and other matters. The action cleared the way for the adoption of official names for elements 101-109 on the periodic table.

      The elements were synthesized between the 1950s and the 1980s by researchers in the U.S., Germany, and the Soviet Union, but official names were never adopted because of disagreements over priority of discovery. After an international scientific panel resolved the priority disputes in the early 1990s, IUPAC was free to consider names for the elements proposed by the discoverers. When, however, it rejected some of those proposals and substituted its own names, it received sharp criticism. Discoverers of new elements traditionally have had the right to pick names. IUPAC's rejection of the name seaborgium for element 106 caused special dismay in the U.S., where discoverers of the element had named it for Nobel laureate Glenn T. Seaborg, codiscoverer of plutonium and several other heavy elements.

      The dispute led the over-151,000-member American Chemical Society (ACS) to support a largely different group of names and to use them in its many publications. An IUPAC committee subsequently proposed a revised list of names, which were accepted by IUPAC's governing body and the ACS in mid-1997. The official names and symbols of the nine elements were: 101, mendelevium (Md); 102, nobelium (No); 103, lawrencium (Lr); 104, rutherfordium (Rf); 105, dubnium (Db); 106, seaborgium (Sg); 107, bohrium (Bh); 108, hassium (Hs); and 109, meitnerium (Mt). Resolution of the conflict cleared the way for naming the recently discovered elements 110, 111, and 112. The scientists who discovered them had decided not to propose names until the earlier controversy ended.

Inorganic Chemistry.
      The periodic table of elements graphically depicts the periodic law. This cornerstone of chemistry states that many physical and chemical properties of elements recur in a systematic fashion with increasing atomic number. Confidence in the law as it applies to very heavy elements was shaken, however, when previous studies concluded that rutherfordium and dubnium (elements 104 and 105, respectively) departed from periodicity. For instance, although dubnium is positioned under tantalum in the table, in water solutions it exhibited behaviour different from that of tantalum. During the year a research group headed by Matthias Schädel of the Institute for Heavy Ion Research, Darmstadt, Ger., restored confidence in the law with studies of the chemistry of seaborgium (element 106). Working with just seven atoms of the element, they concluded that seaborgium does behave like its lighter counterparts—including molybdenum and tungsten—in group 6 on the table, as periodic law predicts. Schädel used gas chromatography and liquid chromatography experiments to show that seaborgium forms the same kind of compounds as other group 6 elements.

      The first synthesis of mesoporous silica in 1992 led to many predictions that the material would have widespread commercial and industrial applications. Mesoporous silica is silicon dioxide, which occurs in nature as sand and quartz, but it differs from natural forms in that it is riddled with billions of pores, each only a few nanometres (nm), or billionths of a metre, in diameter. (Materials with pores 2-50 nm in diameter are usually called mesoporous; those with pores less than 2 nm in diameter are microporous.) The pores give the silica an amazingly large surface area; a single gram has about 1,500 sq m (16,000 sq ft) of surface. The large surface area seemed to make it ideal for adsorbing materials or perhaps as a catalyst in accelerating chemical reactions. Nevertheless, few such applications materialized.

      Jun Liu of the Pacific Northwest National Laboratory, Richland, Wash., and associates reported one of the first potential practical applications for the material. They found that mesoporous silica coated with monolayers (single molecular layers) of tris(methoxy)mercaptopropylsilane had a remarkable ability to bind and remove heavy metals from contaminated water and thus could have important applications in remediating environmental pollution. In laboratory tests on heavily contaminated water, the coated material reduced levels of mercury, silver, and lead to near zero. Liu said the coating could be modified such that the material selectively adsorbed some metals, but not others, to suit different specialized situations. It could be used as a powder packed into treatment columns or fabricated into filtration disks.

      Zeolites are microporous materials with many practical uses. They serve as catalysts in refining gasoline, water softeners in laundry detergents, and agents for separating gases. Zeolites work because their internal structure is riddled with highly uniform molecular-sized pores, which allow them to act as molecular sieves, controlling the entry and exit of molecules by size. Natural zeolites are minerals having a three-dimensional aluminosilicate framework, and for several decades scientists have developed synthetic zeolites and zeolite-like materials consisting, initially, of aluminosilicates like the natural minerals and, later, of aluminophosphates, substituted aluminophosphates, zincophosphates, and other combinations of elements. Efforts have also been made to synthesize such materials incorporating cobalt, since inclusion of that element would provide catalytic activity of potential use in many industrial processes. During the year Galen D. Stucky and colleagues of the University of California, Santa Barbara, announced the development of a general method for synthesizing cobalt phosphate zeolite-like materials. Their process yielded materials of new chemical types and structural configurations. The cobalt content could be tailored to fit specific intended applications by adjustment of the electrical charge and structure of amide molecules used in the synthesis.

Organic Chemistry.
      The buckminsterfullerene molecule (C60) comprises 60 carbon atoms bound together into a spherical cage having a bonding structure that resembles the seams on a soccer ball. In recent years chemists had synthesized a number of dimers of C60—that is, molecules made of two connected C60 units. They included such dimers as C121H2 and C120O2, in which two C60 molecules are connected with various linkages. The simplest C60 dimer, which is C120, had eluded synthesis, however.

      During the year Koichi Komatsu and associates at Kyoto (Japan) University and the Rigaku Corp., Tokyo, reported synthesis of the C120 dimer. It consists of two C60 cages linked by a single shared four-carbon ring. The configuration gives the dimer the distinctive shape of a dumbbell, with the shared ring forming a handle that connects the two C60 spheres. Komatsu developed a new solid-state mechanochemical technique for the synthesis that makes use of a vibrating mill. High-speed vibrations activate the reaction by bringing the reagents into very close contact and providing extra mechanical energy. The mill consisted of a stainless-steel capsule containing a stainless-steel ball and a solid mixture of C60 and potassium cyanide (used as a catalyst) under nitrogen gas. Researchers vibrated the mill forcefully for 30 minutes, producing 18% yields of C120. Komatsu reported that the vibrating-mill method could be used in the preparation of dimers of other fullerene molecules—e.g., C140 from C70.

      The framework of the cubane molecule (C8H8) consists of eight carbon atoms linked together in the shape of a cube, a structure that has challenged traditional concepts about chemical bonding. Cubane has properties, including highly strained 90° bonds storing enormous amounts of energy, that make it an ideal candidate for a new generation of powerful explosives, rocket propellants, and fuels. Substitution of nitro groups (−NO2) for the eight hydrogen atoms, for instance, would create an explosive expected to be twice as powerful as TNT. Furthermore, the rigid cubic structure appeared useful as the molecular core in the synthesis of antiviral agents and other drugs. Such applications lagged, however, in part because chemists knew little about cubane's basic chemistry and behaviour. Advances in 1997 added to knowledge about the compound, which was first synthesized in 1964.

      Scientists at the National Institute of Standards and Technology, Gaithersburg, Md., and the University of Chicago reported determination of cubane's crystal structure at high temperatures. They used X-ray crystallography to show that the basic unit of solid cubane remains a rhombohedron even at temperatures near its melting point. In a second report scientists from the University of Minnesota and the University of Chicago announced determination of several key properties of cubane in the gas phase, including the first experimental values for its bond dissociation energy, heat of hydration, heat of formation, and strain energy.

Applied Chemistry.
      Researchers in industrial settings were working to develop new ways of synthesizing chemical compounds by means of reactions that do not require toxic ingredients or generate toxic by-products. Such efforts, sometimes termed "green chemistry" or "waste reduction," promised to benefit both the environment and the economy in that they would reduce the use of toxic chemicals and the volume of hazardous waste that would need costly treatment or disposal. Walter V. Cicha and associates of the Du Pont Co., Wilmington, Del., reported a new method for making phosgene that substantially reduced formation of unwanted carbon tetrachloride (CCl4). Large quantities of phosgene are produced and used annually in the manufacture of polycarbonates and polyurethane plastics, pesticides, and other products. The traditional process for making phosgene involves the reaction of carbon monoxide and chlorine with carbon-based catalysts; it forms substantial amounts of CCl4, a known carcinogen. Phosgene producers use high-temperature incineration to eliminate the CCl4, but incineration produces hydrogen chloride, which has to be scrubbed from incinerator exhaust gases before their release into the environment. The Du Pont researchers worked out the mechanism of CCl4 formation in the phosgene reaction and examined dozens of alternative catalysts. They eventually identified one that produced high yields of phosgene but formed 90% less CCl4 than the traditional catalyst.

      Aldol condensation reactions have been a mainstay of organic chemistry, widely used to synthesize chemicals having important commercial and industrial applications. They couple two carbonyl-containing molecules to form a new molecule, called an aldol, that contains both aldehyde and alcohol groups. The first in a new generation of catalysts for accelerating hundreds of different aldol condensations became commercially available in 1997. It is a catalytic antibody, called 38C2, that was developed by researchers at the Scripps Research Institute, La Jolla, Calif., and the Sloan-Kettering Institute for Cancer Research, New York City, and marketed by the Aldrich Chemical Co., Milwaukee, Wis. Catalytic antibodies, or abzymes (a contraction of "antibody enzymes"), are substances derived from the immune systems of living organisms that selectively accelerate organic chemical reactions by attaching to and stabilizing intermediate structures produced as a reaction progresses. Researchers reported that 38C2 was very efficient in catalyzing an extremely broad range of chemical reactions and that a number of similar catalysts would be commercially available in the near future.

MICHAEL WOODS

      This article updates chemical element.

Physics
      The physics community worldwide acknowledged 1997 as the centenary of the discovery of the electron—the first identification of a subatomic particle—by the British physicist J.J. Thomson. Subatomic particles, and the particles of which they are constituted, also were at the centre of several interesting experimental results reported during the year, some of which had implications for both physics and cosmology. Evidence continued to underscore the dramatic differences between the reality of quantum physics and normal experience, and researchers reported developing the first atom laser.

Particle Physics.
      An atom consists of a cloud of electrons surrounding a tiny nucleus. The nucleus in turn is made up of particles called hadrons—specifically, protons and neutrons—which themselves are built up from more fundamental units called quarks. The standard model, the central theory of fundamental particles and their interactions, describes how the quarks are held together in hadrons via the strong force, which is mediated by field particles known as gluons. A proton or neutron comprises three quarks tied together by gluons. Other hadrons called mesons comprise a quark and an antiquark bound by gluons. Theorists had predicted, however, that "exotic" mesons could also exist. One type could consist of a quark-antiquark pair held together by distinctive, energetically excited gluons; another type could be made of four quarks bound by gluons in a more ordinary way.

      In 1997 experimenters at the Brookhaven National Laboratory, Upton, N.Y., claimed to have observed effects due to exotic mesons. The evidence was indirect, since the lifetime of the particles was expected to be about 10⁻²³ second. The Brookhaven team used a beam of high-energy pions, a type of meson, to bombard protons in a hydrogen target. The characteristics of a small fraction of the debris from the pion-proton collisions suggested that a new particle had formed briefly. The claim was supported by experimenters at CERN (European Laboratory for Particle Physics), near Geneva, who observed similar results by means of a different method involving the annihilation of antiprotons, the antimatter counterpart of protons. If confirmed, the results would be further validation of the standard model.

      The standard model considers quarks to be "point particles," with no spatial size, but evidence continued to accumulate that quarks themselves may have structure. At the DESY (German Electron Synchrotron) laboratory, Hamburg, experiments were being carried out in which positrons, the antimatter counterparts of electrons, were smashed into protons at very high energy and their scattering pattern was compared with theoretical calculations that assumed protons consist of pointlike quarks. For the vast majority of collisions, the results agreed well with theory. For the most violent collisions, however, the dependence of the scattering pattern on energy seemed to be different. This deviation was interpreted as possible evidence for structure within the quark itself or, alternatively, for the transient appearance of a previously unobserved particle.

      Of great significance for particle physicists, astrophysicists, and cosmologists is the question of whether another fundamental particle, the neutrino, has a small mass. Neutrinos are very common, but they very rarely interact with other matter and so are difficult to observe. The idea of massless neutrinos is an assumption built into the standard model, but there is no compelling theoretical reason for them to have exactly zero mass. Indeed, the existence of a small mass for neutrinos could help explain both the shortfall of neutrinos, compared with theoretical predictions, detected from the Sun and the fact that the universe behaves as if it has much more mass (so-called missing mass or dark matter) than the total amount of luminous matter currently known to exist.

      Evidence from three groups during the year added to previous data suggesting some small mass for the neutrino. Research groups at the Liquid Scintillator Neutrino Detector at Los Alamos (N.M.) National Laboratory (LANL), the Soudan 2 detector in the Soudan iron mine in Minnesota, and the Super-Kamiokande detector in Japan reported results from ongoing experiments that point to a finite mass. At least three other groups around the world were also carrying out experiments intended to give a definite upper boundary for the possible mass of the particle.

Quantum Theory.
      Several experiments confirmed predictions of quantum theory that had not been experimentally verified previously. Scientists were long familiar with the phenomenon of particle annihilation, in which a collision between a particle and its antiparticle converts both into a burst of electromagnetic radiation. Only during the year, however, did physicists at the Stanford (Calif.) Linear Accelerator Center (SLAC) demonstrate the reverse process. Photons (the particle-like energy packets that constitute light radiation) from a superpowerful short-pulse glass laser, producing a half trillion watts of power in a beam 6 micrometres (0.0002 in) across, were arranged to interact with a pulsed beam of high-energy electrons. Some of the photons collided with the electrons, gaining a huge energy boost, and recoiled back along the line of the laser beam. A number of those energetic photons collided with oncoming laser photons and, in so doing, sufficiently broke down the vacuum to produce pairs of electrons and positrons. The experiment marked the first time that the creation of matter from radiation had been directly observed.

      To some the SLAC experiment might seem almost mundane compared with that of Nicolas Gisin's group at the University of Geneva. One of the best-known debates within quantum physics has been that over the Einstein-Podolsky-Rosen paradox. In the 1930s, to express their dissatisfaction with quantum theory, Einstein and two colleagues proposed a thought experiment based on a part of the theory that allows the states of two particles to be quantum mechanically "entangled." For example, two particles with opposite spins could be created together in a combined state having zero spin. A measurement on one particle showing that it is spinning in a certain direction would automatically reveal that the spin of the other particle is in the other direction. According to quantum theory, however, the spin of a particle exists in all possible states simultaneously and is not even defined until a measurement has been made on it. Consequently, if a measurement is made on one of two entangled particles, only then, at that instant, would the state of the other be defined. If the two particles are separated by some distance before the measurement is made, then the definition of the state of the second particle by the measurement on the first would seem to require some faster-than-light "telepathy," as Einstein called it, or "spooky actions at a distance."

      For Einstein this conclusion demonstrated that quantum mechanics could not be a complete description of reality. Nevertheless, in 1982 the French physicist Alain Aspect and co-workers showed that such action at a distance indeed exists for photons a short distance apart. In 1997 Gisin and his co-workers extended the experiment for particles separated by large distances. They set up a source of pairs of entangled photons, separated them, and piped them over optical fibres to laboratories in two villages several kilometres apart. Measurements at the two sites showed that each photon "knew" its partner's state in less time than a signal traveling at light speed could have conveyed the information—a vindication of the theory of quantum mechanics but a problem, for some, for theories of causation.

      An even stranger experiment confirmed a prediction made in the late 1940s by Dutch physicist Hendrik Casimir. In acoustics the vibration of a violin string may be broken down into a combination of normal modes of oscillation, defined by the distance between the ends of the string. Oscillating electromagnetic fields can also be described in terms of such modes—for example, the different possible standing wave fields in a vacuum inside a metal box. According to classical physics, if there is no field in the box, no energy is present in any normal mode. Quantum theory, however, predicts that even when there is no field in the box, the vacuum still contains normal modes of vibration that each possess a tiny energy, called the zero-point energy. Casimir realized that the number of modes in a closed box with its walls very close together would be restricted by the space between the walls, which would make the number smaller than the number in the space outside. Hence, there would be a lower total zero-point energy in the box than outside. This difference would produce a tiny but finite inward force on the walls of the box. At the University of Washington, Steven Lamoreaux, now at LANL, measured this force for the first time—the bizarre effect produced by the difference between two nonexistent electromagnetic fields in a vacuum. The amount of the force, less than a billionth of a newton, agreed with theory to within 5%.
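
      The order of magnitude of the effect can be estimated from the textbook zero-point-energy result for idealized, perfectly conducting parallel plates, F/A = π²ℏc/(240d⁴). Because Lamoreaux's apparatus in fact paired a curved surface with a plate, the figures in this sketch (Python; names and the chosen separation are ours) are illustrative only:

        import math

        HBAR = 1.0545718e-34  # reduced Planck constant, J*s
        C = 2.99792458e8      # speed of light, m/s

        def casimir_pressure(d):
            # Attractive force per unit area (N/m^2) between idealized,
            # perfectly conducting parallel plates d metres apart.
            return math.pi ** 2 * HBAR * C / (240 * d ** 4)

        print(casimir_pressure(1e-6))         # ~1.3e-3 N/m^2 at a 1-micrometre gap
        print(casimir_pressure(1e-6) * 1e-4)  # ~1.3e-7 N on one square centimetre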

Atomic Physics.
      An optical laser emits photons of light all in the same quantum state. As a result, a beam of laser light is of a single pure colour and is coherent; i.e., all the components of the radiation are in step. During the year Wolfgang Ketterle and his co-workers at the Massachusetts Institute of Technology created an analogous quantum state of coherence in a collection of atoms and then released them as a beam, thus producing the first atom laser. The coherent state, created in a gas of sodium atoms, was achieved by means of a technique perfected two years earlier for trapping atoms and chilling them to temperatures just billionths of a degree above absolute zero (0 K, -273.15° C, or -459.67° F) to form a new kind of matter called a Bose-Einstein condensate (BEC). In a BEC the constituent atoms exist in the same quantum state and act coherently as a single entity. To make the atom laser, Ketterle's group devised a way to allow a portion of the trapped BEC to emerge as a beam. The beam behaved as a single "matter wave" that could be manipulated like laser light. Although much development was needed, in the future an atom laser might bear the same relation to an optical laser as an electron microscope does to an optical one. Researchers foresaw applications in precision measurement and the precise deposition of atoms on surfaces for the manufacture of submicroscopic structures and devices.

DAVID G.C. JONES

      This article updates subatomic particle.

Astronomy
       (For information on Eclipses, Equinoxes and Solstices, and Earth Perihelion and Aphelion in 1998, see Tables.)

      Throughout 1997 the universe revealed its secrets to astronomers equipped with a bevy of new telescopes, spacecraft, and novel scientific instruments. Optical astronomy received a major boost in February with an upgrade by space shuttle astronauts to the Earth-orbiting Hubble Space Telescope's (HST's) scientific instruments. Space astronomy missions included a flyby of asteroid Mathilde and the arrival of two spacecraft at Mars, and major astronomical payload launches concluded with the successful, though controversial, liftoff of the Cassini spacecraft, headed for a rendezvous with the giant planet Saturn in the year 2004 (see below). In early 1997 Comet Hale-Bopp put on a spectacular naked-eye celestial display for people everywhere. Late in the year astronomers using the 5-m (200-in) Hale telescope on Mt. Palomar, California, reported the discovery of two additional moons in orbit around Uranus, raising the number known to 17.

Solar System.
      The search for the origins of life and for signs of past or present life beyond Earth remained one of the most exciting challenges in science. During the year several space missions shed new light on these issues. On July 4 NASA's Pathfinder spacecraft arrived at Mars, providing the first close-up view of the "red planet" in 21 years. Embodying the new NASA creed of "cheaper, faster, better," Mars Pathfinder made use of a novel landing strategy employing air bags to cushion its final descent to the planetary surface. Two days later Sojourner, a kind of roving robot geologist, wheeled away from Pathfinder, becoming the first moving vehicle ever deployed on another planet. The landing site appeared to be a rock-strewn plain, once swept by water floods. Images from the two craft indicated that some of the rocks may be sedimentary material called conglomerate, which further supports the idea of free-flowing water on the Martian surface in the past. In addition, chemical evidence that the rocks had been repeatedly heated and cooled suggested that Mars had a geologic history somewhat like that of Earth. All told, during their 83 days of operation, Pathfinder and Sojourner collected 16,000 photographs and a vast array of other data on Mars's geology, geochemistry, and atmosphere, which researchers had only begun to analyze in detail by year's end. Overall, scientists already seemed to agree that the data supported the notion that early in its history Mars may have had the necessary conditions to support life.

      In September the Mars Global Surveyor orbiting spacecraft reached its destination. It was designed to monitor the Martian climate and to map the planet's surface with a resolution of about 1.4 m (5 ft). To prepare for the start of those activities in March 1999, the spacecraft began readjusting its highly elliptical orbit into a circular, low-altitude orbit by dipping repeatedly into the upper atmosphere, using it as a brake. At the same time, the craft allowed its onboard magnetometer to measure the Martian magnetic field. Early Surveyor results indicated that Mars has a weak global magnetic field, about 1% that of Earth, but later measurements showed the field to exist only as local patches each a few hundred kilometres across, with their magnetic axes pointing in different directions. The local field regions were thought to be remnants of an earlier, stronger global magnetic field, which could have protected the surface of Mars from incoming cosmic rays and enhanced the chances for past life.

      After arriving at Jupiter in late 1995, the Galileo spacecraft spent the next two years photographing the giant planet and its moons. In February 1997 Galileo came within 586 km (364 mi) of the fractured-ice surface of the Jovian moon Europa. Images taken during that flyby supported earlier speculation that Europa may have a thin icy surface overlying oceans of liquid water or slush that is being warmed by the tidal energy dissipation produced by Jupiter. In addition, some of the images showed surface areas that appeared to be comparatively smooth and crater-free, which stirred debate over whether part or all of Europa had been resurfaced by upwelling water in relatively recent times (within the past few million years) or whether the surface dates back to the early days of the formation of the solar system. If there is liquid water in Europa's interior—and if the moon possesses the kinds of organic compounds that Galileo detected during the year on two other Jovian satellites, Ganymede and Callisto—Europa could be one of the best candidate hosts in the solar system for extraterrestrial life.

      For many people 1997 was the year of the great Comet Hale-Bopp, which was witnessed by more individuals than any other comet in history. Surveys showed that by April more than 80% of the U.S. population had seen the comet. Scientifically, other than Halley's Comet, Hale-Bopp was the most photographed and best-studied comet in history. Following just a year after the naked-eye appearance of the bright Comet Hyakutake, Hale-Bopp put on a spectacular show lasting several months; at its brightest it was outshone only by the Moon and a handful of bright planets and stars. Gas and dust shells from the comet were recorded by many instruments, as was its elongated plasma tail. Spectrometers detected more than three dozen organic compounds present in the tail, including ones never before seen in comets. Since many of those molecules had been detected in dense interstellar molecular clouds, this observation strengthened the link between comets and primitive pre-stellar material. From their orbits above Earth, two astronomical observatory satellites, ROSAT and the Extreme Ultraviolet Explorer, detected X-rays from the comet, as they had from Hyakutake and several other comets. A variety of models for producing the X-rays had been proposed, but at year's end their origin remained unclear.

Stars.
      The distance to a star is one of the most important pieces of information used to determine its properties. It is also a link in the chain of reasoning employed to establish both the size and the age of the universe. The only direct way to measure stellar distances is to use the phenomenon of parallax. Each year, as the Earth orbits the Sun, nearby stars appear to swing back and forth slightly in their angular position with respect to the very distant stars. By measuring this angular shift and using their knowledge of the diameter of the Earth's orbit, scientists can triangulate the distance to nearby stars. Because Earth's atmosphere limits the precision with which stellar positions can be measured from its surface, the European Space Agency launched the Hipparcos satellite in 1989 to survey the sky and determine accurately the positions of nearby stars. Results of the Hipparcos survey were announced in early 1997. They included determinations of positions for more than 100,000 stars with a precision 100 times better than ever before achieved on Earth and of positions for an additional 1,000,000 stars with somewhat lower precision.
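
      The triangulation reduces to a simple rule: a star with a parallax angle of p seconds of arc lies at a distance of about 1/p parsecs, the parsec being defined as the distance at which p is exactly one second of arc. A brief sketch of the calculation (Python; the function name is ours, and Proxima Centauri's well-known parallax of roughly 0.77 arcsecond is used as a check):

        import math

        AU = 1.495978707e11              # Earth-Sun distance, metres
        ARCSEC = math.pi / (180 * 3600)  # one arcsecond, in radians

        def distance_parsecs(parallax_arcsec):
            # Triangulate on the radius of Earth's orbit: d = 1 AU / tan(p),
            # then express the result in parsecs (the distance at p = 1").
            d = AU / math.tan(parallax_arcsec * ARCSEC)
            parsec = AU / math.tan(ARCSEC)
            return d / parsec

        print(distance_parsecs(0.77))   # Proxima Centauri: ~1.3 parsecs
        print(distance_parsecs(0.001))  # a 1-milliarcsecond parallax: ~1,000 parsecs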

      Among the most important results from Hipparcos was a new determination of the distance to, and therefore the luminosity of, the Cepheid variable stars in the Milky Way. These stars, which pulse regularly in brightness, are used to calibrate the distances to remote galaxies. On the basis of Hipparcos's determinations, both Cepheids and galaxies appeared to be about 10% farther away than previously thought. The Hipparcos data also led to a revision of the distance and age determinations of the stars in globular clusters, thought to be the oldest stellar members of the Milky Way Galaxy. Because inferred luminosity rises with the square of distance, the greater distances implied intrinsically brighter, and therefore younger, cluster stars: they appeared to be 11 billion years old rather than the previously estimated 14 billion-16 billion years. Taken together, the results appeared to resolve the discrepancy between the age of the universe deduced from the ages of the oldest stars and the age found from the observed recession of distant galaxies. They suggested that the universe is about 12 billion years old.

      Stars have been observed in a wide variety of sizes and masses, from one-tenth to perhaps 20-50 times the mass of the Sun. Using a newly installed near-infrared camera and multiobject spectrometer on the HST, a team of astronomers headed by Donald F. Figer of the University of California, Los Angeles, announced the discovery of perhaps the brightest and most massive star ever seen. Although hidden from optical view within a region of gas and dust called the Pistol Nebula, it was detectable at infrared wavelengths. The object appeared to radiate 10 million times the luminosity of the Sun. If it is indeed a single star, its present mass is perhaps 60 times that of the Sun, and at birth it may have been as much as 200 solar masses.

Galaxies and Cosmology.
      Brief bursts of gamma rays coming from the sky were first detected in 1973 by satellites sent aloft to look for the gamma rays that would accompany surreptitious nuclear weapons testing. Since that time these burst events have been detected by a variety of civilian and military satellites and spacecraft. After its launch in 1991, the orbiting Compton Gamma Ray Observatory began detecting about one burst per day, bringing the total number of events observed to more than 2,000. The bursts appeared to arrive at Earth from random directions over the sky. Until 1997 no gamma-ray burst had ever been associated with any star, galaxy, or other known celestial object. The difficulty in accurately determining their locations was due to the poor angular resolution of current gamma-ray telescopes and the brief duration of the bursts—only seconds on average. In 1996, however, the Italian-Dutch BeppoSAX satellite was launched to search for X-rays from celestial objects, find their precise positions, and study their luminosity variations. It also had the ability to monitor the sky for X-rays that accompany gamma-ray bursts and the capability of being pointed to the region of a burst within hours of the event.

      In February 1997 BeppoSAX found an X-ray counterpart for a gamma-ray burst. Subsequent optical observations by the HST revealed two possible optical counterparts, one fuzzy and one starlike, but neither object was bright enough to identify. A gamma-ray burst in May, however, also was followed by the appearance of an X-ray source. Its detection by BeppoSAX quickly led to the discovery of an associated optical object. Pointing one of the twin Keck 10-m (400-in) telescopes in Hawaii to this dim optical counterpart only 56 hours after the initial gamma-ray burst, Mark Metzger and colleagues of the California Institute of Technology measured the spectrum of what turned out to be a distant galaxy. Its red shift of 0.835 placed the source of the burst at a distance of at least 10 billion light-years. The discovery made it clear that gamma-ray bursts arrive at the Earth from cosmological distances, rather than somewhere within or near the Milky Way Galaxy, and that they release more energy in a few seconds than the Sun radiates in its lifetime. The ultimate cause of the bursts remained to be determined, though many astronomers favoured a model involving the coalescence of two neutron stars in a binary system, resulting in a giant explosion and a rapidly expanding fireball.

      The brightest objects in the universe are the enigmatic quasars. Since their discovery in 1963, quasars, rather than the far more plentiful but far less luminous galaxies, had held the record for the most distant objects that had been seen in space. In 1997, however, a galaxy was discovered with a red shift of 4.92, displacing the previous record holder, the quasar PC1247+34. The discovery came about when Marijn Franx and collaborators of the Kapteyn Institute, Groningen, Neth., using the Hubble telescope, found a red arc of light near the centre of a relatively nearby cluster of galaxies. A spectrum of the arc taken with one of the Keck telescopes revealed that it was, in fact, a distant and quite young galaxy. It was observable only because the nearer cluster of galaxies acted as a gravitational lens, distorting but magnifying the light from the distant galaxy as it passed through the cluster.

KENNETH BRECHER


Space Exploration
      Tracks on the planet Mars and tribulations on Russia's space station Mir vied for centre stage in space exploration during 1997. Meanwhile, preparations continued apace for the first launch of parts of the International Space Station (ISS).

Manned Spaceflight.
      Ten manned space launches were made during the year, most in support of plans to assemble the ISS beginning in 1998. Three U.S. space shuttle missions and two Russian Soyuz missions went to Mir; four other shuttle flights carried science missions; and one shuttle flight visited the Hubble Space Telescope (HST) on a servicing mission.

      Atlantis made all of the U.S.'s shuttle trips to Mir. Although the flights had been meant to give U.S. astronauts experience on a space station, they became part of Mir's lifeline as the aging station (launched in 1986) experienced a series of major mishaps. On February 23, a month after Atlantis's first visit, the space station had a fire, one of the most serious accidents that can happen aboard a spacecraft. Six people were aboard, rather than the usual three, because Soyuz TM-25, which had been carrying a replacement crew, had recently docked. A solid-chemical oxygen canister burned for more than a minute, which forced the crew to don breathing equipment and seriously damaged the station's main electrolysis-based oxygen-generating system. In April an unmanned Progress resupply ferry delivered fresh oxygen canisters and fire extinguishers to Mir, and Atlantis's second mission in May included a replacement oxygen generator.

      On June 25 Mir suffered a near-fatal mishap when a Progress ferry being docked via remote control by Russian cosmonaut Vasily Tsibliyev accidentally rammed into the Spektr science module, putting a hole in the pressure vessel and damaging its solar arrays beyond use. To salvage the station, which consisted of a core, a connecting node, and five science modules, crew members severed electrical and data connections between Spektr and the rest of the station and then sealed off the module. They saved the station but lost about half of their electrical power.

      Problems subsequently cascaded as Mir's main computer shut down and had to be jury-rigged to keep working. A planned internal space walk in July to repair the station was postponed when Tsibliyev developed an irregular heartbeat and officials in Moscow decided that the crew was too fatigued to work safely. The toll on the crew became apparent when on July 17 one of them accidentally disconnected a computer cable, which caused the station to drift and its solar panels to point away from the Sun.

      With a Progress resupply visit in July, the Soyuz TM-26 crew-replacement mission in August, and the year's third visit by Atlantis in September-October, Mir had a fresh crew and all needed repair equipment, including a special hatch with electrical connectors to allow Spektr's lines to be reconnected. In activities inside and outside Mir between August and November, the crew restored most of the lost power and the main oxygen-generating system (which had experienced renewed problems after the June collision), replaced the onboard computer with a new unit, and installed new solar arrays, although they remained unable to locate the exact point of the hole in Spektr.

      Assembly of the ISS was delayed from a late-1997 start to mid-1998 after Russia ran into financial and technical problems with the space station's service module, which was built from what once had been planned as Mir 2. The first ISS element, dubbed the FGB, was to be launched in June, with a space shuttle carrying up the first U.S.-built components a month later.

      Two shuttle missions, which had to be accomplished with three flights, concentrated on microgravity materials sciences. Soon after launch of the Microgravity Science Laboratory (MSL-1) mission aboard Columbia in April, the malfunction of an electricity-generating fuel cell left the shuttle with no reserve and forced its return after only four days in space. Because of the importance of the mission's results to future ISS research, NASA exploited a gap in the shuttle flight schedule to refly the entire mission and crew, a first for the shuttle program. On July 1 MSL-1 was relaunched aboard Columbia, and all the experiments were conducted as planned.

      In November Columbia flew again, carrying the fourth U.S. Microgravity Payload (USMP-4) and Spartan 201, a deployable pair of solar instruments. After Columbia's robot arm put Spartan into space, it was unable to relock onto the craft for retrieval. NASA took advantage of a space walk already scheduled for testing ISS assembly techniques by having astronauts Winston E. Scott and Takao Doi catch Spartan by hand and pull it into the shuttle's open payload bay. A second, unscheduled space walk was held just before the end of the mission to make up some of the tests that had been skipped during the unplanned Spartan retrieval.

      The year's other science mission for the shuttle was flown in August by Discovery. Its major payload was Germany's CRISTA-SPAS-2, a collection of spectrometers and telescopes that the shuttle deployed in space for observations of the Earth's atmosphere.

      In February Discovery astronauts made the second service call on the orbiting HST since its launch in 1990. In five space walks, they installed more than two tons of equipment, including new spectrographic and imaging instruments, and patched insulation blankets that were found to have eroded under conditions in orbit.

Space Probes.
      Mars, quite simply, was the planet of the year as Mars Pathfinder and its deployed rover beamed back images from the surface and as Mars Global Surveyor started settling into its planned orbit.

      Launched the previous December, Pathfinder entered the Martian atmosphere on July 4, 1997. Its descent was braked by a heat shield, a parachute, and rockets and finally by air bags, on which it bounced to rest on the surface. Once down, the tetrahedral craft deployed solar arrays, a colour stereo camera, and instruments for atmospheric and meteorologic studies. Early images revealed the landing area to be a rock-strewn plain showing signs that liquid water once had run through the area. Pathfinder then deployed its six-wheeled rover, Sojourner, which carried colour cameras and a special spectrometer for geologic and geochemical studies of Martian rocks, soil, and dust. After thousands of images were returned from the lander and rover, the mission ended in November. During their operation Pathfinder and Sojourner demonstrated a number of new technologies for future Mars missions (see above).

      Launched a month earlier than Pathfinder, Mars Global Surveyor went into an elliptical orbit around Mars on September 11. It then dipped into the upper Martian atmosphere in a series of aerobraking maneuvers designed to take the satellite into a lower orbit better suited for mapping. A solar array that had not properly deployed after launch began to flex excessively, which prompted NASA to suspend the aerobraking for several weeks while engineers developed gentler maneuvers that would not endanger the craft.

      The Near Earth Asteroid Rendezvous (NEAR) spacecraft remained on course to the asteroid Eros, which it was to orbit in 1999 and study for approximately a year. On June 27 NEAR passed within 1,200 km (750 mi) of asteroid Mathilde and took many multispectral images.

      The Cassini mission to Saturn lifted off October 15 after a flurry of protests and lawsuits attempted to block the launch. Cassini drew its electric power from the heat generated by the decay of radioactive plutonium. Protesters had claimed that a launch accident could expose Earth's population to plutonium dust, but NASA countered that the casks encasing the plutonium were robust enough to survive any mishap. The ambitious mission was to be the first to orbit Saturn and the first to land on a moon of an outer planet. Cassini was scheduled to reach Saturn in 2004, after which it would send its Huygens probe parachuting into the methane-rich atmosphere of Titan.

      The Galileo spacecraft ended its primary mission to Jupiter on December 7, two years after reaching the planet. NASA and the U.S. Congress, however, approved a two-year mission extension during which Galileo would study Jupiter's moons Europa and Io.

Unmanned Satellites.
      The United States launched the Advanced Composition Explorer (ACE) on August 25 to study the makeup of the solar wind from a "halo orbit" centred on L-1, a gravitational balance point between Earth and the Sun about 1.5 million km (930,000 mi) away from Earth. ACE carried instruments to monitor the magnetic field, solar-wind electrons and ions, and cosmic-ray ions.

      Japan's HALCA radio-astronomy satellite was launched on an M-5 rocket from the Kagoshima Space Center on February 12. The 830-kg (1,830-lb) satellite carried an 8-m (26-ft) wire-mesh dish antenna that deployed in orbit. With an apogee of 21,400 km (13,300 mi), the satellite was being used in conjunction with ground-based radio telescopes for very long baseline interferometry to give the effect of a radio antenna more than twice Earth's diameter.
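
      As a rough illustration of why so long a baseline matters, an interferometer's angular resolution scales as the observing wavelength divided by the baseline. The sketch below assumes an 18-cm (1.6-GHz) observing band, a band commonly used in HALCA-era VLBI but not stated in the article:

```python
import math

# Diffraction-limited resolution of an interferometer: theta ~ wavelength / baseline.
# The 1.6-GHz (18-cm) band is an assumption; the article gives only the orbit.
c = 2.998e8                       # speed of light, m/s
wavelength = c / 1.6e9            # ~0.19 m

earth_diameter = 1.274e7          # m
baseline = 2.2 * earth_diameter   # "more than twice Earth's diameter"

theta = wavelength / baseline                      # radians
print(f"~{theta * 206265e3:.1f} milliarcseconds")  # roughly 1.4 mas,
# far sharper than any single ground-based radio dish at this wavelength
```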

      Launched Dec. 24, 1996, the U.S.-Russian-French Bion 11 mission, which had been opposed by animal rights groups, carried two monkeys and a variety of other organisms into orbit to study their physiological responses to weightlessness. After the Bion capsule returned to Earth on January 7, one of the monkeys died while under anesthesia for tissue biopsies. Scientists later decided that the whole process was too traumatic and suspended flight experiments with primates for an indefinite time.

      India launched its fourth remote-sensing satellite, IRS-1D, on its locally developed PSLV-C1 (Polar Satellite Launch Vehicle) rocket from Sriharikota Island on September 29. The 1,200-kg (2,650-lb) craft had a black-and-white camera with a resolution of 5 m (16.5 ft), a linear imaging colour scanner with a resolution of 23.5 m (78 ft), and a wide-field sensor.

      Losses of Japan's Midori (Advanced Earth Observation Satellite) and the U.S.'s Lewis satellites marred the year's activities. Midori, launched in August 1996 to monitor changes in the global environment, ceased operation in June when its solar array failed. Lewis was written off shortly after launch on August 23. The first of two Small Spacecraft Technology Initiative missions planned by NASA, the satellite carried visible and infrared Earth imagers and an ultraviolet cosmic background imager in a small 445-kg (980-lb) package. A few days after launch, its attitude control system failed, which caused it to reenter the atmosphere in late September. Launch of its companion craft, Clark, was delayed to March 1998 to ensure that the problem was not repeated.

Launch Vehicles.
      The U.S. Air Force surprised the aerospace industry when it decided to choose both finalists in the Evolved Expendable Launch Vehicle (EELV) competition. Selection of a single winner had been expected in June 1998, but the large backlog of planned communications-satellite launches and the Air Force's desire to negotiate the best possible launch prices led it to announce that it would buy launch services from both Lockheed Martin Corp. and the Boeing Co. Lockheed Martin was to develop a series of EELV launchers based on its Atlas II family; Boeing was to develop its Delta III and IV families.

      Europe remained competitive with its Ariane 4 family of launchers and the successful launch in October of its second Ariane 5 vehicle. Investigation of the failed first launch of the Ariane 5 in 1996 revealed that the rocket's guidance system had been adapted from the Ariane 4 design without proper modifications. A management shake-up and a rigorous review of the entire design followed.

DAVE DOOLING

      See also Television. (Media and Publishing )

      This article updates space exploration.

▪ 1997

Introduction

MATHEMATICS
      The year 1996 was notable for the successful application of recent advances in mathematics to such practical concerns as the coiling of wire and the manipulation of digital images. In one instance a team at the Spring Research and Manufacturers' Association in Sheffield, Eng., employed methods of data analysis derived from chaos theory, which studies apparently random or unpredictable behaviour in physical systems governed by deterministic laws, to develop a novel quality-control test for wire used in spring manufacture. For decades the spring industry had faced the problem of predicting whether a given sample of wire had good or bad coilability. The new test was carried out in a few minutes by a machine called a FRACMAT, which coils a long test spring, measures the spacing of successive coils with a laser micrometer, and analyzes the resulting numbers, using methods originally developed to find chaotic attractors—geometric descriptions of the behaviour of chaotic systems—in the behaviour of fluid flow.
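
      The article does not disclose FRACMAT's algorithm, but the general approach it alludes to, reconstructing attractor-like geometry from a single measured series, can be sketched with delay-coordinate embedding. Everything below, including the synthetic "coil spacing" data, is illustrative rather than a description of the actual machine:

```python
import numpy as np

def delay_embed(series, dim=3, lag=1):
    """Map a scalar series x[t] to vectors (x[t], x[t+lag], ..., x[t+(dim-1)*lag])."""
    n = len(series) - (dim - 1) * lag
    return np.column_stack([series[i * lag : i * lag + n] for i in range(dim)])

# Synthetic stand-in for successive coil spacings: a noisy nonlinear oscillation.
t = np.arange(2000)
spacings = np.sin(0.11 * t) * np.sin(0.031 * t) + 0.02 * np.random.randn(t.size)

cloud = delay_embed(spacings, dim=3, lag=8)
print(cloud.shape)  # (1984, 3): a point cloud whose geometry can be inspected
# Wire with good coilability would yield a thin, orderly cloud; erratic
# spacings spread the cloud out, flagging the sample as a bad coiler.
```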

      Other novel applications were based on a mathematical technique called wavelet analysis. The technique was introduced in the early 1980s and was established firmly in 1987 by Ingrid Daubechies, then at AT&T Bell Laboratories, Murray Hill, N.J. Wavelet analysis represents data in terms of localized bliplike waveforms called wavelets. The resultant, often greatly simplified representation of the original data is called a wavelet transform. Perhaps the best-known application of wavelet analysis to date derived from the U.S. FBI's decision in 1993 to use a wavelet transform for encoding digitized fingerprint records. A wavelet transform occupies less computer memory than conventional methods for image storage, and its use was predicted to reduce the amount of computer memory needed for fingerprint records by 93%.
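
      A minimal sketch of why a wavelet representation saves memory: for typical piecewise-smooth data, most wavelet coefficients are negligible and need not be stored. The sketch uses the PyWavelets package and an arbitrary threshold; the FBI's actual fingerprint codec involved quantization and entropy-coding steps not shown here:

```python
import numpy as np
import pywt  # PyWavelets; using this package is a tooling assumption

# A piecewise-constant signal with slight noise stands in for image data.
x = np.repeat([0.0, 1.0, 0.3, 0.8], 256) + 0.01 * np.random.randn(1024)

coeffs = pywt.wavedec(x, "db4", level=5)   # Daubechies-4 wavelet transform
flat = np.concatenate(coeffs)
kept = np.sum(np.abs(flat) > 0.05)         # coefficients worth storing
print(f"store {kept} of {flat.size} coefficients "
      f"({100 * (1 - kept / flat.size):.0f}% reduction)")
```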

      Some of the most recent applications of wavelets involved medical imaging. In the past two decades, medical centres had come to employ various kinds of scanner-based imaging systems, such as computed tomography and magnetic resonance imaging, that use computers to assemble the digitized data collected by the scanner into two- or three-dimensional pictures of the body's internal structures. Dennis Healy and his team at Dartmouth College, Hanover, N.H., demonstrated that a poor digitized image can be smoothed and cleaned up by taking a wavelet transform of it, removing unwanted components, and "detransforming" the wavelet representation to yield an image again. The method reduced the time of the patient's exposure to the radiation involved in the scanning process and thus made the imaging technique cheaper, quicker, and safer. His team also used wavelets to improve the strategies by which the scanners acquired their data at the start. Other researchers were applying the data-enhancement capabilities of wavelets to such tasks as improving the ability of military radar systems to distinguish objects and cleaning up noise from sound recordings. (IAN STEWART)
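
      The three-step pipeline described above (transform, remove unwanted components, detransform) can be sketched in a few lines. The wavelet choice and threshold below are illustrative assumptions, not the Dartmouth group's actual settings:

```python
import numpy as np
import pywt

# Denoising a 1-D "scan line" by soft-thresholding its wavelet coefficients.
rng = np.random.default_rng(0)
clean = np.sin(np.linspace(0, 6 * np.pi, 512))
noisy = clean + 0.3 * rng.standard_normal(512)

coeffs = pywt.wavedec(noisy, "sym8", level=4)                     # transform
coeffs = [pywt.threshold(c, value=0.25, mode="soft") for c in coeffs]  # clean
denoised = pywt.waverec(coeffs, "sym8")                           # detransform

print(f"noise power before: {np.mean((noisy - clean)**2):.3f}, "
      f"after: {np.mean((denoised - clean)**2):.3f}")
```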

      This article updates analysis; information processing.

CHEMISTRY

Nuclear Chemistry.
      In 1996 scientists at Germany's Institute for Heavy Ion Research (GSI) in Darmstadt added a new entry to the periodic table with the creation of element 112. The element, so far unnamed, was synthesized by a multinational team headed by Peter Armbruster (see BIOGRAPHIES (Nobel Prizes )) and Sigurd Hofmann. The researchers first accelerated a beam of zinc ions to high energies in GSI's heavy-ion accelerator UNILAC. They then shot the ions into a lead target, whereupon the zinc and lead nuclei fused. The team detected a single nucleus of the new element consisting of 112 protons and 165 neutrons, which gives it a mass number of 277. It was thus the heaviest nucleus ever created in the laboratory. GSI teams previously had discovered several other new chemical elements, including two—elements 110 and 111—in 1994 alone.
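
      The nuclear bookkeeping can be checked directly. The specific isotopes (zinc-70 fired at a lead-208 target, with one neutron evaporated in the fusion) are the reaction generally cited for this experiment; the article itself gives only the elements and the product's composition:

```python
# Proton and mass-number bookkeeping for the fusion that yielded element 112.
# The isotopes zinc-70 and lead-208 are an assumption; the article states
# only the elements and the product's 112 protons and 165 neutrons.
zn = {"Z": 30, "A": 70}    # zinc-70 projectile
pb = {"Z": 82, "A": 208}   # lead-208 target

Z = zn["Z"] + pb["Z"]      # protons add in fusion: 112
A = zn["A"] + pb["A"] - 1  # one neutron evaporated: 277
N = A - Z                  # neutrons in the product: 165

print(Z, A, N)             # -> 112 277 165, matching the reported nucleus
```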

      Like other superheavy elements created in the past, element 112 decays in a small fraction of a second, but its discovery provided encouragement that scientists would soon succeed in efforts to create element 114. Theoretical studies predicted that beginning at element 114, the periodic table contains an "island of stability"—a region of comparatively long-lived superheavy elements that would be easier for scientists to use in their studies of the composition and properties of matter.

      Francium is a short-lived radioactive element created naturally in trace amounts in uranium deposits; its longest-lived isotope, francium-223, has a half-life of 21 minutes. Francium's fleeting existence has made it difficult for scientists to study its properties. Luis A. Orozco, Gene D. Sprouse, and associates at the State University of New York at Stony Brook developed a way to create francium atoms and trap them in a glass bulb. They bombarded a gold target with oxygen-18 atoms, creating atoms of francium-210, which then were moved into a glass bulb having a reflective coating that kept the atoms from escaping. With the aid of laser beams and a magnetic field, the bulb held the francium atoms for only about 20 seconds before they decayed or escaped, but new atoms were continuously produced, so about 1,000 were constantly present inside. The apparatus set the stage for the first detailed studies of francium's atomic characteristics.

Organic Chemistry.
      The buckminsterfullerene molecule, nicknamed buckyball and symbolized C60, consists of 60 carbon atoms bound together into a three-dimensional spherical cage with a bonding structure that looks like the seams on a soccer ball. Named for its resemblance to the geodesic domes created by the late U.S. engineer and architect R. Buckminster Fuller, the molecule has fascinated scientists and the public since the 1980s, when it was first discovered. C60 recently was proclaimed "The Most Beautiful Molecule" in a popular book of that title, yet by the mid-1990s no major commercial or industrial application for the material had emerged.

      During the year Ben Z. Tang and Nai-Ten Yu of the Hong Kong University of Science and Technology, Kowloon, reported what they hailed as the first such application. They discovered that C60 has novel optical properties that allow it to block light of specific wavelengths over most of the ultraviolet and visible spectrum. Tang and Yu developed transparent materials incorporating C60 that filter out harmful ultraviolet wavelengths and block or limit transmission of other undesirable wavelengths. Traditional techniques for manufacturing glass and plastic light-filtering materials were complex and costly; making coloured glass filters, for instance, involved high-temperature processes that required multiple steps and consumed large amounts of energy. By contrast, the process for making filter materials with C60 was gel-based and was carried out at room temperature. Furthermore, changing the optical properties of the filter required adjusting only one variable, the quantity of C60 itself.

      Chemists long have recognized that the internal cavity of the buckminsterfullerene cage, which measures seven angstroms (Å) in diameter, could act as a container for atoms. (An angstrom is a ten-billionth of a metre.) The cavity is large enough to hold an atom of any element in the periodic table and thus could serve as the basis for the synthesis of a range of commercially valuable endohedral (inside-the-cage) chemical species. Among the most alluring were metal-atom-containing C60 complexes, or endohedral metallofullerenes, which could, for example, provide a new and useful family of superconductors. Getting large metal atoms inside the cavity by opening holes in the cage, however, was proving difficult. Yves Rubin and co-workers at the University of California, Los Angeles, reported their creation of the largest hole yet opened in buckminsterfullerene. Moreover, they succeeded in attaching a cobalt atom over the hole with a bridge of carbon atoms, although the hole was not large enough for the metal atom to slip inside. Rubin's group speculated that it might be possible to move the cobalt atom inside, a process they termed "stuffing the turkey," by thermally exciting the complex to stretch the hole.

      Researchers at Purdue University, West Lafayette, Ind., reported the first direct method for alkynylation of carbon-hydrogen bonds, an advance that other chemists described as "unique" and "unprecedented." The technique allowed chemists to attach alkyne groups to hydrocarbons, ethers, and other commercially important organic molecules faster, more easily, and in higher yields than previously possible. Alkynes are hydrocarbons like acetylene (ethyne; HC≡CH) that contain a carbon-carbon triple bond. Traditional alkynylation techniques were inefficient and difficult and involved multiple reactions. The single-step technique was discovered serendipitously by Philip L. Fuchs and Jianchun Gong.

Inorganic and Physical Chemistry.
      Ever since 1778, when the Swedish scientist Carl Wilhelm Scheele discovered molybdenum blue, chemists have been mystified about the structural features that give this material its unusual properties. The chemical is familiar to chemistry students studying qualitative analysis, who try to identify the chemical composition of unknown materials. If a reducing agent is added to a solution under analysis and causes a characteristic colour change due to the formation of molybdenum blue, the result confirms the presence of the molybdate ion. However, chemists have not been able to determine whether molybdenum blue is an amorphous or crystalline material, a colloid or a solution, or a distinct compound or a mixture.

      Achim Müller and co-workers at the University of Bielefeld, Ger., proposed a structure for molybdenum blue that explains some of its features. The structure suggests that molybdenum blue is the ammonium salt of a large doughnut-shaped anion (negatively charged ion) comprising a cluster of MoO3 units combined with hydroxyl (OH) groups and water molecules. Molybdenum blue's apparent amorphous nature may result from the large size of the anionic cluster, which would not fit easily into a crystalline structure. The water molecules and other hydrophilic (water-seeking) surface components of the cluster would explain the substance's high solubility in water, alcohol, and certain other solvents.

      Chemists at Pennsylvania State University reported synthesis of a compound of potassium and nickel that may open a new area of high-pressure chemistry. Because of differences in the electronic structures and sizes of their atoms, potassium, which is an alkali metal, and nickel, which is a transition element, normally do not combine. John V. Badding and co-workers found that potassium develops characteristics of a transition element when subjected to pressures of about 31 gigapascals, 310,000 times greater than normal atmospheric pressure. It then forms chemical bonds with nickel. Badding's group reported that other alkali metals, including rubidium and cesium, also assume traits of transition elements at high pressures. They used a diamond anvil cell and infrared laser heating to form the new compound. Evidence that potassium binds to nickel under pressure supported a theory that radioactive potassium exists in the Earth's core, perhaps bound to iron. The researchers planned to test that theory as they worked to make other exotic compounds from atoms that will not bond at milder pressures.

      The hydroxyl radical is the most important free radical in the lower atmosphere. It plays a major role in the photochemical reactions that remove the greenhouse gas methane and other natural and human-made atmospheric emissions. This scavenger has only a fleeting existence, and measuring hydroxyl levels has been difficult, requiring elaborate ground-based instruments that project a laser beam through many kilometres of air. Hans-Peter Dorn and co-workers at the Jülich (Ger.) Research Centre's Institute for Atmospheric Chemistry reported making accurate OH measurements with more compact instruments that can fit in aircraft and ships. The technique, called laser-induced fluorescence spectroscopy, bounces a laser beam between two sets of mirrors only 38.5 m (126 ft) apart. It could permit the first routine measurements of OH, including point measurements of OH at specific locations.

      In 1993 W. Ronald Gentry and associates at the University of Minnesota at Minneapolis reported detecting the first helium dimers, two-atom molecules of helium, at conditions of extremely low temperature. They concluded from theoretical calculations that the bond between the helium atoms in the dimer is the longest and weakest chemical bond in any molecule. They estimated that the bond is 55 Å long, a far cry from the 1-2 Å that separate atoms bonded together in most other molecules. During 1996 the group reported experimental verification of the dimer's record status. They measured the bond length at 62 Å with a possible error of +/-10 Å. Gentry said that helium dimers promised to be of considerable value in helping scientists understand the forces that operate among atoms bonded together into molecules. One such force, the Casimir force, comes into play when the distance between two atoms is very large, as it is in the helium dimer.

      Hydrogen bonding is one of the fundamental ways in which atoms link together. It is the attraction between a positively charged hydrogen atom in one molecule and a negatively charged atom or group in another molecule. Hydrogen bonding between molecules of water (H2O), where oxygen serves as the negatively charged atom, accounts for the unexpectedly high melting and boiling points of the compound. Robert Crabtree of Yale University and co-workers discovered a new kind of hydrogen bond that they termed the dihydrogen bond. Crabtree detected the bond between molecules of a compound with one hydrogen atom that is negatively charged and another that is positively charged. The positively charged hydrogen on one molecule attracts the negatively charged hydrogen on a second molecule. According to Crabtree, the strong dihydrogen bond explained the properties of some compounds. For example, dihydrogen bonding occurs in H3BNH3, which melts at 104° C (220° F). By contrast, the similar compound H3CCH3 does not exhibit dihydrogen bonding and melts at -181° C (-294° F).

Applied Chemistry and Materials.
      Zeolites are compounds of aluminum, silicon, and alkali and alkaline-earth metals like sodium and calcium. Their crystal structures are riddled with millions of tiny pores and channels that can absorb a variety of atoms and molecules. The pore walls of the aluminosilicate zeolites are strongly acidic, which gives them a catalytic effect widely exploited by the petroleum industry and elsewhere. Zeolites have other industrial applications, including use as molecular sieves for absorbing and separating materials. All previously known natural and synthetic zeolites contain pores that are ringed by no more than 12 aluminum or silicon atoms (each bonded to four oxygen atoms in an elegant tetrahedral arrangement). C.C. Freyhardt of the California Institute of Technology and co-workers reported making the first aluminosilicate zeolites with pores ringed by 14 atoms. The larger rings mean larger pores, which range from 7.5 to 10 Å. Freyhardt noted that large-pore zeolites were much in demand for containing and catalyzing reactions involving larger organic molecules. Although other researchers previously had synthesized large-pore zeolites, the materials had drawbacks that seriously limited practical applications.

      Ceramics are of major commercial interest for components of engines, tools, electrical devices, and other products that demand hardness, stiffness, and high-temperature stability. Two of the most appealing ceramics were those based on silicon nitride and silicon carbide. Silicon nitride, however, begins to decompose at about 1,400° C (2,550° F) and has an ultimate thermal stability limit of 1,500° C (2,730° F), which has limited its use in extremely hot environments such as engines and turbines.

      Ralf Riedel of the Technical University of Darmstadt, Ger., and co-workers reported synthesis of a new composite ceramic based on silicon nitride and silicon carbide that is stable to 2,000° C (3,630° F). The material, silicoboron carbonitride, can be processed into bulk ceramic materials or coatings or into spun fibres suitable for use as composite reinforcing material. The researchers did not yet understand the basis for silicoboron carbonitride's enhanced thermal stability. Riedel predicted that the new ceramic would have considerable potential in technologies such as energy-efficient power generation and mechanical and chemical engineering projects. (MICHAEL WOODS)

PHYSICS
      In 1996 scientists produced the first atoms of antimatter—specifically, antihydrogen atoms—in a long-awaited confirmation of fundamental theory. (See Sidebar (PHYSICS: The First Antiatoms ).) At the other end of the periodic table, an atom of element 112, heavier than any other known element, was synthesized for the first time. (See Chemistry (Mathematics and Physical Sciences ).) Experiments involving the trapping and observation of single atoms furthered investigations into the strange properties of the quantum world, while several results in particle-physics research raised questions about possible flaws in the standard model. In the continuing debate over the age of the universe, astrophysicists appeared to be converging on an agreed value, although one that posed considerable problems for theorists.

      Advances in ultrahigh-vacuum techniques coupled with high-precision laser spectroscopy allowed physicists to carry out some of the most fascinating experiments of the year. Single atoms and ions could be trapped and held for hours, even weeks, and their interactions with electromagnetic fields explored in minute detail with high-precision lasers. One of the greatest subjects of contention in the past few decades has been the precise nature of the electromagnetic field, which is most familiar as the propagating electromagnetic radiation called light. Since the invention of the laser, the concept of the photon as a fundamental "particle of light," or quantized packet of electromagnetic energy, has had to be revisited and refined. Strangely, although the quantum nature of light was postulated by Albert Einstein in 1905, not until quite recently did unambiguous evidence of this quantization exist. During the year two groups of researchers, using single atoms, carried out work that demonstrated the nature of these quantum effects and even pointed the way toward the possibility of quantum computers.

      An experiment conducted by David Wineland and colleagues of the National Institute of Standards and Technology, Boulder, Colo., made use of a single beryllium ion. The ion was held at ultrahigh vacuum in an ion-confinement device called a Paul trap by a radio-frequency field, cooled until nearly motionless, and observed as it executed simple harmonic oscillation in the field. At the extremely low energies involved, the oscillatory motion was quantized. The ion could possess only one particular energy out of a "staircase" of energies, for which the energy difference between two stairs was a quantized packet of oscillatory energy called a phonon. Energy is gained or lost by the absorption or emission of a phonon.
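
      In standard notation (a textbook result rather than anything specific to this experiment), the ladder of a quantized oscillator of frequency ω reads:

```latex
E_n = \hbar\omega\left(n + \tfrac{1}{2}\right), \qquad n = 0, 1, 2, \ldots
% Adjacent stairs differ by E_{n+1} - E_n = \hbar\omega: one phonon.
```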

      The situation was formally identical to that of a single vibrational mode of an electromagnetic field, except that for the field the energy steps are photons and the total field energy is defined by the total number of photons in the mode. The electromagnetic field may exist in different "states," which can be defined only by measuring the probability of detecting a number of photons. This probability distribution differs depending on whether the source of light is a laser or a conventional light source. In the ion-oscillation experiment, similar probability distributions of phonons could be produced, and the oscillation could be stimulated in classical and quantum states. Among other experiments, the group claimed to have prepared a beryllium ion in "Schrödinger's cat" states—states that are a superposition of two different possible results of a measurement. In the 1930s quantum theory pioneer Erwin Schrödinger proposed his famous thought experiment, in which a cat in a closed box appears to be both alive and dead at the same time until someone observes it, as a demonstration of the philosophical paradoxes involved in quantum theory. The possibility of the experiment's actually being done could have far-reaching effects on the heated philosophical debate about the meaning of quantum theory. On the practical side, atoms held in two different superposed quantum states could serve as logic elements in quantum computers, which might be able to make use of the superpositions to carry out many calculations simultaneously.

      In an experiment almost the reverse of the one discussed above, Serge Haroche and co-workers of the École Normale Supérieure, Paris, isolated and counted a small number of photons in the microwave region of the electromagnetic spectrum. Their "photon trap" was a cavity 3 cm (1.2 in) in length, bounded by two curved superconducting mirrors. To detect the trapped photons, the experimenters projected atoms of rubidium through the cavity, one at a time. Each atom had been carefully prepared in a single excited state that survived long enough to cross the cavity. As the atom crossed, it exchanged energy with the electromagnetic field inside. Counting the number of atoms that arrived in an excited or de-excited state gave the experimenters a direct picture of the interactions in the field. The picture confirmed directly that the energy states of the field in the cavity were quantized.

      In elementary-particle physics, confirmation of the existence of the top quark in 1995 appeared to complete the experimental evidence for the standard model, which describes all matter in terms of the interactions between six leptons (particles like the electron and its neutrino) and six quarks (which make up particles like protons and neutrons). On the other hand, the team at the Fermi National Accelerator Laboratory (Fermilab) near Chicago that discovered the top quark also found evidence suggesting that quarks may themselves consist of something even smaller. The evidence came from the results of extremely high-energy collisions between protons and antiprotons. Observations of particle jets produced by such collisions showed that, for jet energies above 350 GeV (billion electron volts), the experimental results appeared to diverge dramatically from those predicted by quantum chromodynamics, that part of the standard model that describes the interaction of quarks via the strong nuclear force.

      A theory of elementary particles that goes beyond the standard model is that of supersymmetry. The theory predicts that every known fundamental particle has a supersymmetric partner. If the particle is a carrier of one of the fundamental forces, like the photon (which carries the electromagnetic force) or the gluon (which carries the strong force), the partner is of the non-force-carrying kind, like quarks or leptons. Likewise, non-force-carrying particles have their force-carrying supersymmetric partners. Researchers working with the Karlsruhe Rutherford Medium Energy Neutrino (KARMEN) experiment at the Rutherford Appleton Laboratory, Chilton, Eng., claimed to have observed results that suggest the existence of a photino, the supersymmetric partner of the photon. Similarly, researchers at Fermilab identified particle-collision events that suggested the creation of selectrons, the supersymmetric partners of electrons. Other explanations, however, were possible for both results.

      Data from the Liquid Scintillator Neutrino Detector at the Los Alamos (N.M.) National Laboratory added to evidence, first reported from that facility in 1995, that neutrinos—the most elusive of common elementary particles—may have a small mass. Very difficult to detect because of their weak interaction with other particles of matter, neutrinos had been thought for decades to be entirely massless. Should they prove to have even a tiny mass, they could offer one possible solution to the problem of the "missing mass" of the universe—the idea, based on cosmological theory and observations of the gravitational behaviour of galaxies, that the universe contains much more mass than can be accounted for by adding up the masses of all of the observable objects.

      The ongoing debate over the age of the universe appeared to be approaching a consensus. The vital parameter defining the age is Hubble's constant (H0), which expresses the rate at which the universe is expanding. A high value for H0 implies a young universe, and vice versa. Wendy Freedman of the Carnegie Observatories, Pasadena, Calif., used the Earth-orbiting Hubble Space Telescope to observe the apparent brightness of pulsating stars known as Cepheid variables in distant galaxies. Her result for H0 of 73+/-11 km per second per megaparsec implied an age of about 11 billion years. On the other hand, Allan Sandage of the same institution, studying the apparent brightness of supernovas in distant galaxies, reported a value of 57+/-4 km per second per megaparsec, which suggested an age of about 14 billion years. Although the two values nearly overlapped at the extremes of their error ranges, the age range that they encompassed presented difficulties. First, the oldest globular star clusters in the Milky Way Galaxy appeared to be at least 12 billion years old and could be several billion years older. Second, galaxies with an apparent age almost as great as that of the universe were observed in 1996 by several groups. The presence of "old" stars and galaxies in a relatively "young" universe made it difficult for theorists to find the time needed for the universe to have formed galaxies and stars and for some of those objects to have become as old as they appeared to be.
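
      The reciprocal of H0 sets the basic timescale; the ages quoted above come out somewhat lower because they also fold in an assumed deceleration of the expansion. A minimal unit conversion:

```python
# Hubble time 1/H0 for the two measured values of Hubble's constant.
KM_PER_MPC = 3.0857e19     # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in a billion years

def hubble_time_gyr(h0_km_s_mpc):
    return KM_PER_MPC / h0_km_s_mpc / SECONDS_PER_GYR

for h0 in (73, 57):
    print(f"H0 = {h0} km/s/Mpc -> 1/H0 = {hubble_time_gyr(h0):.1f} billion years")
# -> about 13.4 and 17.2 billion years; the quoted ages of ~11 and ~14
#    billion years reflect a decelerating cosmological model as well.
```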

      One eagerly anticipated experiment did not take place. The European Space Agency's Cluster mission, in which four artificial satellites were to be placed in stationary orbits relative to one another to give a three-dimensional picture of the solar wind and its effect on Earth, was destroyed by the explosion of its Ariane 5 launch vehicle. (DAVID G.C. JONES)

ASTRONOMY
       (For information on eclipses and other standard astronomical events due to take place in 1997, see Table (Earth Perihelion and Aphelion, 1997).)

      For astronomy, 1996 would probably be remembered as the year in which scientists announced evidence for ancient life in a meteorite thought to have originated on Mars. It was also a year in which astronomers discovered a host of extrasolar planets, some perhaps with the physical and chemical conditions necessary to harbour life as it is known on Earth. Amateur astronomers and the public alike delighted in Comet Hyakutake, the most spectacular comet seen in two decades. Orbiting Earth, the Hubble Space Telescope produced a remarkable image of the most ancient galaxies in the universe found to date.

Solar System.
      The most exciting astronomical discovery of the year was made without the aid of telescopes, radio antennas, or spacecraft, the main tools of modern astronomical exploration. In August a team headed by David McKay (see BIOGRAPHIES (McKay, David Stewart )) of NASA's Johnson Space Center, Houston, Texas, and Richard Zare of Stanford University announced that it had found strongly suggestive evidence for Martian life's having existed more than 3.6 billion years ago. The claim was based on a wide variety of studies of a meteorite called ALH84001. This particular meteorite was found in 1984 in the Allan Hills ice field of Antarctica. It was recognized to be of possible Martian origin only in 1994 and was one of only about a dozen meteorites found to date on Earth whose chemistry matches the unique Martian chemistry found by the Viking spacecraft that landed on Mars in 1976.

      The softball-sized igneous rock weighs about 1.9 kg (4.2 lb) and has a complex history. Its origin was dated to about 4.5 billion years ago, when Mars and the other planets formed. It was thought to have originally formed beneath the Martian surface and then been fractured by a meteorite impact some 3.6 billion years ago. Penetrated by water and minerals, it then encapsulated and fossilized whatever matter was present at the time. The meteorite appeared to have been ejected from Mars about 16 million years ago following a large asteroid impact with the planet and subsequently to have reached Earth about 13,000 years ago.

      Using high-resolution scanning electron microscopy and laser mass spectrometry to study ALH84001, the NASA-funded research team reported finding the first organic molecules of Martian origin, several mineral features characteristic of biological activity, and what the team suggested were microscopic fossils of primitive, bacteria-like organisms. The organic molecules, called polycyclic aromatic hydrocarbons, are characteristic of the residue found after terrestrial microorganisms die and their initially more complex organic molecules subsequently degrade. Possibly the most suggestive evidence comprised tubular and egg-shaped structures that resembled, though on a much smaller scale, the fossils of ancient single-celled bacteria found on Earth. Many scientists commented that although the evidence from ALH84001 was compelling, it was not conclusive proof for the presence of ancient life on Mars. At year's end an independent group of British scientists reported evidence for ancient life in another presumed Martian meteorite, designated EETA79001, which formed only about 175 million-180 million years ago and was ejected from Mars only about 600,000 years ago. Although NASA was already involved with several missions to study Mars in the near future, the meteorite discoveries prompted an increased commitment to the search for extraterrestrial life with a series of unmanned Martian observers, explorers, and, ultimately, a mission to return rock samples to Earth. (See Space Exploration, below.)

      On Dec. 7, 1995, the Galileo spacecraft reached the giant planet Jupiter after a six-year, 3.7 billion-km (2.3 billion-mi) journey. Galileo consisted of two parts: a small probe designed to plunge into the Jovian atmosphere and a larger orbiter whose mission was to survey Jupiter and its four major (Galilean) moons over a two-year period by taking pictures and making magnetic, thermal, and other measurements of their properties. On the day that Galileo arrived, its probe descended into Jupiter's thick atmosphere, surviving a mere 57 minutes while it radioed its measurements back to the orbiter. Among the surprises that emerged in subsequent weeks as astronomers analyzed the probe's data were the discoveries that Jupiter's atmosphere contains less water than had been thought, that its outer atmosphere is 100 times denser and hotter than previously predicted, and that its atmospheric winds, with speeds up to 530 km (330 mi) per hour, are faster than had been suspected.

      Data collected by the orbiter over the ensuing year showed that Jupiter's moon Io, the first satellite in the solar system known to have active volcanoes, has a dense inner core, likely made of iron, and possibly its own magnetic field. Another moon, Ganymede, was found to be covered by grooves, faults, and fractures suggesting that it was considerably hotter and more active in the past than planetary scientists had thought. Perhaps most exciting of all were the images of the moon Europa suggesting that it may have had, and might still have, a watery interior. Some scientists proposed that because of tidal heating of the satellite by the strong gravitational pull of Jupiter, Europa's warm interior sea of water could have the conditions to harbour life.

      Comets and asteroids also made news in 1996. Comet Hyakutake, discovered in January by Japanese amateur astronomer Yuji Hyakutake, streaked across the sky in March, April, and May, the brightest comet visible from Earth since Comet West in 1976. Sky watchers all over the world were delighted by its long feathery tail and high brightness, which made it visible even against bright city lights. Professional astronomers found the comet to be a source of new insights into the nature of these icy wanderers of the solar system. A team of NASA scientists using the NASA Infrared Telescope Facility in Hawaii detected ethane and methane in the tail of Hyakutake, the first time those molecules had been seen in a comet. Because as much as 2% of the frozen gases of Hyakutake appeared to consist of ethane and methane, scientists speculated that the comet had a very different history from many other well-studied comets. Perhaps even more startling was the discovery by the Earth-orbiting German-U.S.-British ROSAT satellite that X-rays were coming from Hyakutake, the first time such high-energy radiation had been detected from any comet.

      Two puzzling solar system objects were discovered in late 1996. The first, designated Asteroid 1996 PW, has the photographic appearance of an ordinary asteroid, most of which were thought to be made of rocky material. But while most asteroids orbit the Sun in a region called the asteroid belt, which lies between the orbits of Mars and Jupiter, Asteroid 1996 PW moves in a highly elliptical orbit, traveling from the outer solar system toward the Sun in a path that resembles those of most comets. The second object, Comet 1996 N2, has the photographic appearance of a comet with a well-developed tail but moves in a circular orbit entirely within the asteroid belt. Taken together, the two objects left scientists with a new set of puzzles about the origin and evolution of comets and asteroids and the distinction between them.

Stars.
      The discovery of the first planet orbiting a Sun-like star, 51 Pegasi, was announced in late 1995. However, with a mass about half that of Jupiter and a surface temperature of 1,000° C (1,832° F), the planet appeared unlikely to harbour life as scientists understood it. Early in 1996 Geoffrey Marcy of San Francisco State University and Paul Butler of the University of California, Berkeley, announced the detection of the first extrasolar planets whose surface temperatures would allow the presence of surface or atmospheric water, considered to be a necessary prerequisite for life. So began a remarkable year in the ongoing search for planets outside the solar system.

      Since extrasolar planets are themselves too dim to photograph in the glare of their parent stars, their presence is detected by the effect they have on the observed motion of their stars. To find such planets, astronomers usually look either for small periodic wobbling motions of the star's position in space or for changes in the star's velocity as indicated by studies of its spectral lines. By late 1995 Marcy and Butler had been monitoring the spectra of 120 stars for eight years, using a spectrograph attached to the 3-m (120-in) telescope at Lick Observatory on Mt. Hamilton, California. Detailed analysis of the spectra of two of the stars indicated that they oscillate back and forth along the line of sight to Earth. The unseen body orbiting the star 47 Ursae Majoris, in the constellation Ursa Major (the Big Dipper), was determined to have a mass about three times that of Jupiter. It revolves around the star at about twice the Earth-Sun distance in roughly three years, and although its surface temperature was determined to be only about -90° C (-130° F), its atmosphere is warm enough to contain liquid water. A second star that they studied, 70 Virginis, in the constellation Virgo, is orbited by a planet several times the mass of Jupiter with a moderate surface temperature of about 84° C (183° F), which would allow any water present to exist as a liquid.
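
      An order-of-magnitude check of the wobble such a planet induces, using momentum balance (the star's mass times its velocity roughly equals the planet's mass times its orbital velocity) and the 47 Ursae Majoris figures from the paragraph above; the solar-mass star and circular orbit are assumptions:

```python
import math

M_SUN, M_JUP = 1.989e30, 1.898e27   # kg
AU, YEAR = 1.496e11, 3.156e7        # m, s

m_planet = 3 * M_JUP                # "about three times that of Jupiter"
a = 2 * AU                          # "about twice the Earth-Sun distance"
P = 3 * YEAR                        # "roughly three years"

v_planet = 2 * math.pi * a / P      # circular-orbit speed, ~20 km/s
v_star = (m_planet / M_SUN) * v_planet  # momentum balance, assuming 1 solar mass
print(f"stellar wobble ~ {v_star:.0f} m/s")  # a few tens of m/s, the kind of
# line-of-sight velocity shift precise spectral-line studies can detect
```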

      The fourth closest star to Earth, Lalande 21185, which lies about eight light-years away, was also reported to have a planet. George Gatewood of the Allegheny Observatory, Pittsburgh, Pa., observed periodic changes in the angular position of the star suggesting the presence of a planet with a mass 9/10 that of Jupiter orbiting the star every 5.8 years—and possibly a second planet with an orbital period of about 30 years. Report of yet another large planet by Christopher Burrows of the Space Telescope Science Institute (STScI), Baltimore, Md., was based on entirely different types of observations of the star Beta Pictoris. Its surrounding dusty disk has long been thought to be a nursery for planetary formation. The newly observed warping of the disk seemed to indicate the presence of a Jupiter-sized planet that is perturbing the disk. Although a truly Earth-like planet orbiting a Sun-like star remained to be found, by year's end at least nine planets revolving around relatively nearby normal stars had been reported. Within the space of one year, astronomers had begun to suspect that the existence of planets around other stars is the rule rather than the exception.

Galaxies and Cosmology.
      Ever since its launch, the Hubble Space Telescope (HST) had been pointed at specific visible objects to help uncover their secrets. In an exciting reversal of that approach, Robert Williams, director of the STScI, decided to use his director's discretionary time on the HST to do the opposite—to stare at a region of the sky not known to contain any bright objects. The instrument was trained on a small area, only about 1/30 the diameter of the Moon, in a dark region of Ursa Major. Almost 350 separate images were taken over a 10-day period, building up a mosaic of the region that was the deepest-seeing astronomical photograph ever taken. Lying within this Hubble Deep Field, as the image was called, are at least 1,500 galaxies, among which are the faintest and therefore probably the most distant galaxies ever seen. Scientists began to study the galaxies by combining data from the image with data gathered from Earth-based telescopes. One early finding was that many of the galaxies are irregular or distorted in appearance. Furthermore, the galaxies were formed when the universe was no more than a billion years old, less than 10% of its present age and much sooner after the initial big bang explosion than had been expected. The Deep Field image also revealed that the universe contains 50% more galaxies than had been previously estimated. (See Physics (Mathematics and Physical Sciences ).)

      (KENNETH BRECHER)

      This article updates Cosmos; galaxy (Milky Way Galaxy); astronomy; solar system; star.

SPACE EXPLORATION
      The world's space agencies moved closer in 1996 to realizing two major dreams: the assembly of an International Space Station (ISS) and the discovery of life elsewhere in the solar system. The United States and Russia continued to field joint missions to Russia's operating space station, Mir, and to develop hardware for the international station, scheduled to begin assembly in space in late 1997.

Manned Spaceflight.
       (For information on manned spaceflight in 1996, see Table (Manned Spaceflights, 1996).) During the year NASA launched seven space shuttle missions, which included two that docked with Mir. In January the shuttle Endeavour retrieved two satellites, Japan's Space Flyer Unit (SFU) and the OAST-Flyer developed by NASA. The SFU had been launched in March 1995 to test new technologies in orbit. The OAST-Flyer, on a similar mission, was put into space on the January Endeavour flight and retrieved two days later. Launched in late February, Columbia took back into space the Tethered Satellite System, which had jammed on its first flight in 1992. This time deployment went smoothly until 19.6 km (12.2 mi) of tether had been unwound, whereupon the line broke and the satellite package sailed away into its own orbit. Investigators later determined that small amounts of dust had collected on the tether during processing in the clean room. The dust caused a static electric charge to build up and then burn through the Kevlar tether. The satellite eventually entered Earth's atmosphere and burned up.

      The Life and Microgravity Spacelab mission was flown aboard Columbia in June and July. The science crew conducted a series of experiments on the way in which plants, humans, and nonhuman animals adapt to the weightlessness of space. Other microgravity experiments were conducted in May aboard Endeavour, which carried the Spacehab laboratory module and which also deployed the first inflatable antenna, a demonstration of technologies that could allow large structures to be built in orbit via the inflation of specially designed balloons.

      Two missions flown by Atlantis in March and September took astronauts and cargo to and from Mir. U.S. astronaut Shannon W. Lucid arrived on Mir in March for what was to have been a 115-day stay in space. It stretched to 188 days, however—a record for women and for Americans—when her ride home was delayed three times by a booster problem discovered during Columbia's July launch and by two hurricanes that swept the launch pad. Lucid was finally replaced by astronaut John E. Blaha in September.

      The year's shuttle missions ended in November with Columbia flying the Wake Shield Facility (WSF) a third time. Despite operating problems on two previous flights, the WSF functioned as planned, successfully growing semiconductor crystals in the ultrahigh vacuum that was created on the lee side of the facility as it temporarily orbited separately from the shuttle. A stuck hatch on Columbia forced cancellation of two planned space walks, while bad weather extended the mission to 17 days 15 hours 53 minutes, a record length for a shuttle flight.

      Among shuttle missions planned for 1997 was one in December to contribute to the initial assembly of the ISS. Shuttle astronauts were to attach the first of two U.S.-built nodes, which would serve as assembly points for the station, to the FGB (functional block) module that Russia was to launch the previous month. Additional modules were to be added in 1998 and beyond. To support the ISS program, NASA planned improvements to the shuttle system that would add 7,816 kg (17,231 lb) of payload to its lifting capability.

      Manned operations involving Russian and non-Russian crew members continued aboard Mir. Russia launched two replacement crews to the station on Soyuz TM-23 in February and TM-24 in August. In addition, Russia launched the Priroda science module in April to round out Mir's laboratory capabilities. On May 24 cosmonauts Yury Onufriyenko and Yury Usachev conducted a space walk to install solar panels that would boost the electrical power to Mir. The panels, delivered by space shuttle in November 1995, were built by Lockheed Martin Corp. and used the same basic designs as those planned for ISS.

Space Probes.
      Arguably the greatest excitement in planetary exploration came not from a probe but from an Antarctic meteorite, believed to be from Mars, that was reported to contain organic material and microfossil-like structures suggestive of primitive life. (See Astronomy, above; EARTH SCIENCES: Geology and Geochemistry; LIFE SCIENCES: Paleontology.) Exploration of Mars already had been revitalized by the planned launches of three missions in late 1996. Although the Mars rock announcement came too late to affect the year's launches, space scientists were rethinking strategies for later missions.

      The U.S. Mars Global Surveyor was the first mission to Mars since 1993, when the ill-fated Mars Observer lost contact with Earth just before it was to go into Mars orbit. Mars Global Surveyor carried instruments built from Mars Observer's spare parts. Launched on November 7, it was to arrive at Mars in September 1997. After establishing a circular orbit, the spacecraft would conduct a full Martian year (687 days) of observations starting January 1998. Instruments included a camera, a laser altimeter, and plasma and electric field sensors.

      The U.S. Mars Pathfinder, launched December 4, was the first landing attempt since the two Viking spacecraft in 1976. After descending to the Martian surface in July 1997 with the aid of parachutes, rockets, and air bags, the tetrahedral craft would deploy instruments to study Mars and a small, six-wheeled "microrover," dubbed Sojourner, to explore as far as 500 m (1,640 ft) from the lander.

      Mars 96 was Russia's first exploratory mission to Mars since the breakup of the U.S.S.R. Comprising a large orbiter with two 50-kg (110-lb) small landers and two 65-kg (145-lb) surface penetrators, it was launched November 16 and put into Earth orbit. However, its fourth-stage engine, which was to have directed it toward Mars, failed, which allowed the spacecraft to slip back into the atmosphere and then fall to Earth.

      NASA's Near Earth Asteroid Rendezvous (NEAR) spacecraft, the first designed to orbit an asteroid, was launched February 17 toward a June 1997 flyby of asteroid Mathilde and then a flyby of Earth in 1998 to boost its speed. In 1999 NEAR was to enter a loose orbit of asteroid Eros. Eventually the orbit would be tightened to 15 km (9 mi) above the surface as NEAR took pictures and measured the surface profile of Eros.

      The Galileo spacecraft, in orbit around Jupiter since December 1995, offered a separate set of hints that life might be found elsewhere in the solar system. Galileo continued to take pictures and make measurements of Jupiter and its moons, while its orbit was tweaked every few days or weeks to allow flybys as close as 250 km (155 mi) of Ganymede, Callisto, Europa, and Io on a grand tour of this miniature planetary system. Returned images showed that ice covering some areas of Europa has been cracked into large chunks and shifted by tidal effects of Jupiter's powerful gravitational pull. Planetary scientists interpreted this and other signs of activity as evidence that tidally heated "warm ice" or even liquid water might exist below the surface, harbouring conditions that could conceivably support life. (See Astronomy, above.)

Unmanned Satellites.
      Several satellites were launched to help provide an improved understanding of global environmental changes on Earth. Sent aloft August 17, Japan's Midori (originally, Advanced Earth Observation Satellite) carried several instruments to measure changes in the global environment, including a total-ozone mapping spectrometer and radar scatterometer from NASA and a greenhouse-gas monitor from Japan. By September the ozone spectrometer had produced the first global image of ozone in the upper atmosphere. Other launches of Earth-observing satellites included India's IRS-P3 on March 21, on an Indian rocket, and NASA's Total Ozone Mapping Spectrometer-Earth Probe on July 2.

      The astronomer's range of tools was expanded during the year with the U.S. X-Ray Timing Explorer, launched Dec. 30, 1995, and Italy's small X-ray Astronomy Satellite (SAX), launched April 30. One of the oldest space telescopes, the International Ultraviolet Explorer, was turned off September 30. It was launched in January 1978 on what was to have been a three-year mission to observe the stars in ultraviolet light. NASA started preliminary design of a Next Generation Space Telescope designed to deploy an 8-m (26-ft) primary mirror for observations in the infrared spectrum to look deeper into the recesses and the past of the universe. Launch was planned for 2005.

      The addition of three new geophysics satellites to the International Solar Terrestrial Physics program was muted by the loss of the European Space Agency's Cluster mission, a set of four satellites that were destroyed during launch when their Ariane 5 rocket failed. (See Launch Vehicles, below.) On February 24 the U.S. launched Polar, which carried visible-light and ultraviolet cameras to take pictures of the dayside and nightside auroras, and on August 21 it launched the Fast Auroral Snapshot Explorer (FAST), which had instruments to make high-time-resolution "snapshots" of electric fields, magnetic fields, and energetic electron and ion distributions at altitudes of 1,920-4,160 km (1,190-2,580 mi) near the Earth's magnetic poles. On August 29 Russia launched the Interbol-2 spacecraft, which released its complementary Czech-built Magion-5 subsatellite.

Launch Vehicles.
      Two unique launch-vehicle concepts, the Reusable Launch Vehicle (RLV) and Sea Launch, moved ahead. NASA selected Lockheed Martin Corp. to develop the company's wedge-shaped VentureStar concept, which would first be built as the X-33 RLV demonstrator. Like the current space shuttle, the RLV would launch vertically and land horizontally. Unlike the shuttle, it would be unmanned and would not drop boosters and fuel tanks; rather, it would use the single-stage-to-orbit (SSTO) concept, which promised to reduce the cost of launching satellites and probes. The RLV also would use a more robust metal heat shield in place of the shuttle's silica tiles. The X-33 was intended to demonstrate the feasibility of RLV technology in suborbital flights as fast as Mach 15 (15 times the speed of sound). Test flights were planned to start in early 1999 and last into 2000. Flight tests with the DC-XA, an advanced version of the DC-X vertical takeoff and landing rocket and a precursor to the X-33 project, ended on July 31 when a landing leg failed to extend, which caused the craft to topple on its side at the end of a test flight. Earlier flights, on May 18 and June 7, had been successful.

      In a more conventional vein, Sea Launch Co., LDC, a Boeing Co. multinational venture, began converting an offshore oil-drilling platform to serve as a launch pad that could be towed to the Equator (where the Earth's rotation gives a rocket the greatest running start). Sea Launch would use Zenit 3SL rockets, developed by the former U.S.S.R. and currently marketed by companies based in Russia and Ukraine. The launch platform and its assembly-and-control ship would operate out of Long Beach, Calif., and launch south of Hawaii.

      The debut of Europe's Ariane 5 rocket turned to disaster on June 4 when the vehicle veered off its course and was destroyed along with its payload of satellites. An investigation revealed that the guidance system, successfully used in the Ariane 4 series of rockets, had not been properly modified to account for subtle differences between the performances of the Ariane 4 and Ariane 5 models.

      The launch industry was surprised in August when the Boeing Co. announced that it would purchase the aerospace and defense sectors of Rockwell International. The purchase included Rockwell's Space Division, which built and maintained the space shuttle orbiters, and Rocketdyne, which built the shuttle main engines. The transaction put Boeing in a strong position as it bid for the U.S. Air Force's Evolved Expendable Launch Vehicle program.

      (DAVE DOOLING)

      See also Telecommunications (Business and Industry Review); Television (Media and Publishing).

      This article updates space exploration; telescope.

▪ 1996

Introduction

MATHEMATICS
      The long-running saga of Fermat's last theorem was finally concluded in 1995. The nearly 360-year-old conjecture states that the equation x^n + y^n = z^n has no solutions in positive integers x, y, and z when the exponent n is 3 or greater. In 1993 Andrew Wiles of Princeton University announced a proof, based on new results in algebraic number theory. By 1994, however, a gap in the proof had emerged. The gap was repaired—or, more accurately, circumvented—by Wiles and former student Richard Taylor of the University of Cambridge. The difficulty in Wiles's proof arose from an attempt to construct a so-called Euler system. The new approach involves making a detailed study of algebraic structures known as Hecke algebras, a task in which Taylor's contribution proved crucial. The complete proof was confirmed by experts and published in the Annals of Mathematics.
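
      No computation can substitute for Wiles's proof, which covers all exponents at once, but a brute-force search makes the content of the conjecture concrete. The following sketch, purely illustrative, verifies that no solutions exist within a small box of candidates.

# Illustrative only: exhaustively confirm that x^n + y^n = z^n has no
# solutions in a small search box for n >= 3. (Wiles's proof covers all
# positive integers at once; no finite search could.)
LIMIT = 60

for n in range(3, 8):
    # All n-th powers large enough to equal any sum x^n + y^n in the box.
    powers = {z ** n for z in range(1, 2 * LIMIT)}
    for x in range(1, LIMIT):
        for y in range(x, LIMIT):
            assert x ** n + y ** n not in powers, (x, y, n)
print("No solutions with x, y <", LIMIT, "and 3 <= n < 8 (as expected)")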

      Fruitful revisionism of a different kind took place in the important area of gauge field theory, in which ideas originating in mathematical physics for the purpose of describing subatomic particles and their interactions were being applied to topology—the study of the properties that a region of space retains under deformation—with spectacular consequences. Paramount among them was the discovery, made in 1983 by Simon Donaldson of the University of Oxford, that the properties of four-dimensional Euclidean space are exceptional compared with those of the spaces of all other dimensions. Donaldson's discovery was based on the Yang-Mills field equations in quantum mechanics, introduced in the 1950s by the physicists Chen Ning Yang and Robert L. Mills to describe the interactions between particles in the atomic nucleus. The equations possess special solutions known as instantons—particle-like wave packets that occupy a small region of space and exist for a tiny instant. Donaldson observed that instanton solutions of the Yang-Mills equations encode topological information about the space for which the equations are posed. But just as mathematics was adjusting to the powerful new techniques arising from that insight, Edward Witten of the Institute for Advanced Study, Princeton, N.J., developed an entirely new system of equations that can be substituted for those of Yang and Mills. Witten's ideas, far from supplanting the earlier approach, were shedding light on how the Yang-Mills equations work. Witten's equations replace instantons with magnetic monopoles, hypothetical particles possessing a single magnetic pole—mathematically a far more tractable setting. The early payoff included proofs of several long-standing conjectures in low-dimensional topology.

      A long-standing question in dynamical systems theory, namely whether the chaos observed in the Lorenz equations is genuine, was answered. The equations were developed by the meteorologist Edward Lorenz in 1963 in a model of atmospheric convection. Using a computer, he showed that the solutions were highly irregular: small changes in the input values produced large changes in the solutions, which led to apparently random behaviour of the system. In modern parlance such behaviour is called chaos. Computers, however, use finite precision arithmetic, which introduces round-off errors. Is the apparent chaos in the Lorenz equations an artifact of finite precision, or is it genuine? Konstantin Mischaikow and Marian Mrozek of the Georgia Institute of Technology showed that chaos really is present. Ironically, their proof was computer-assisted. Nevertheless, that fact did not render the proof "unrigorous," because the role of the computer was to perform certain lengthy but routine calculations that in principle could be done by hand. Indeed, Mischaikow and Mrozek justified using the computer by setting up a rigorous mathematical framework for finite precision arithmetic. Their main effort went into devising a theory to pass from finite precision to infinite precision. In short, they found a way to parlay the computer's approximations into an exact result.
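
      The sensitivity that Lorenz observed is easy to reproduce. The sketch below, an ordinary floating-point integration quite unlike Mischaikow and Mrozek's rigorous interval computations, follows two solutions of the Lorenz equations whose starting points differ by one part in a billion.

# Integrate Lorenz's 1963 convection model twice from nearly identical
# initial conditions and watch the trajectories separate. Illustrative only.
def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(state, dt=0.01):
    """One fourth-order Runge-Kutta step."""
    k1 = lorenz(state)
    k2 = lorenz([s + 0.5 * dt * k for s, k in zip(state, k1)])
    k3 = lorenz([s + 0.5 * dt * k for s, k in zip(state, k2)])
    k4 = lorenz([s + dt * k for s, k in zip(state, k3)])
    return [s + dt / 6 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

a, b = [1.0, 1.0, 1.0], [1.0 + 1e-9, 1.0, 1.0]   # differ by one part in 10^9
for step in range(1, 3001):
    a, b = rk4_step(a), rk4_step(b)
    if step % 500 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {step * 0.01:5.1f}  separation = {gap:.2e}")
# The separation grows roughly exponentially until it saturates at the
# size of the attractor -- the hallmark of chaos.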

      A famous problem in recreational mathematics was solved by political scientist Steven Brams of New York University and mathematician Alan Taylor of Union College, Schenectady, N.Y. The problem is to devise a proportional envy-free allocation protocol. An allocation protocol is a systematic method for dividing some desired object—traditionally a cake—among several people. It is proportional if each person is satisfied that he or she is receiving at least a fair share, and it is envy-free if each person is satisfied that no one is receiving more than a fair share. This area of mathematics was invented in 1944 by the mathematician Hugo Steinhaus. For two people the problem is solved by the "I cut, you choose" protocol; Steinhaus' contribution was a proportional but not envy-free protocol for three people. In the early 1960s John Selfridge and John Horton Conway independently found an envy-free protocol for three people, but the problem remained open for four or more people. Brams and Taylor discovered highly complex proportional envy-free protocols for any number of people. Because many areas of human conflict focus upon similar questions, their ideas had potential conflict-resolving applications in economics, politics, and social science. (IAN STEWART)
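
      For two people the protocol takes only a few lines to express. The sketch below, with invented valuation functions (Brams and Taylor's multiperson protocols are vastly more intricate), implements "I cut, you choose" for a cake modeled as the unit interval.

# "I cut, you choose" with the cake as the interval [0, 1] and each
# person's preferences given as a valuation density. Densities invented
# here purely for illustration.
def integrate(f, lo, hi, steps=10_000):
    """Simple midpoint-rule integral of f over [lo, hi]."""
    width = (hi - lo) / steps
    return sum(f(lo + (i + 0.5) * width) for i in range(steps)) * width

def cut_and_choose(value_a, value_b):
    # A cuts at the point that splits the cake into halves *she* values
    # equally, found by bisection...
    lo, hi = 0.0, 1.0
    total_a = integrate(value_a, 0, 1)
    for _ in range(60):
        mid = (lo + hi) / 2
        if integrate(value_a, 0, mid) < 0.5 * total_a:
            lo = mid
        else:
            hi = mid
    cut = (lo + hi) / 2
    # ...then B takes whichever piece B values more.
    left_to_b = integrate(value_b, 0, cut)
    b_piece = "left" if left_to_b >= integrate(value_b, cut, 1) else "right"
    return cut, b_piece

cut, b_piece = cut_and_choose(lambda t: 1.0,    # A values the cake uniformly
                              lambda t: 2 * t)  # B prefers the right end
print(f"A cuts at {cut:.3f}; B chooses the {b_piece} piece")
# Neither can envy the other: A values both pieces equally, and B picked
# the piece B liked best.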

      This updates the articles analysis; number theory; physical science, principles of; topology.

CHEMISTRY

Chemical Nomenclature.
      Responding to criticism from chemists around the world, the International Union of Pure and Applied Chemistry (IUPAC) in 1995 decided to reconsider the definitive names that it had announced the previous year for elements 101-109. The decision was unprecedented in the history of IUPAC, an association of national chemistry organizations formed in 1919 to set uniform standards for chemical names, symbols, constants, and other matters. IUPAC's Commission on Nomenclature of Inorganic Chemistry had recommended adoption of names for the elements that, in several cases, differed significantly from names selected by the elements' discoverers.

      The extremely heavy elements were synthesized between the 1950s and 1980s by researchers in the U.S., Germany, and the Soviet Union. Although the discoverers had exercised their traditional right to select names, the names never received IUPAC's stamp of approval because of disputes over priority of discovery. The conflicting claims were resolved by an international commission in 1993, and the discoverers submitted their chosen names to IUPAC. An international furor ensued after the IUPAC nomenclature panel ignored many of the submissions and made its own recommendations. IUPAC's rejection of the name seaborgium for element 106 caused particular dismay in the U.S. Discoverers of the element had named it for Glenn T. Seaborg, Nobel laureate and codiscoverer of plutonium and several other heavy elements. In response, IUPAC's General Assembly decided that names for elements 101-109 would revert to provisional status during a five-month review process scheduled to begin in January 1996. Chemists and member organizations were to submit comments on the names for IUPAC's reconsideration.

      The American Chemical Society (ACS) directed its publications to continue using the recommendations of its own nomenclature committee for the duration of IUPAC's review. All of the ACS's names for elements 104-108 differed from those on IUPAC's list.

Inorganic Chemistry.
      People treasure gold mainly because it resists tarnishing and discoloration better than any other metal. Iron rusts and silver tarnishes when in contact with oxygen in the air. Gold remains bright and glistening, however, even in the presence of acids and other highly corrosive chemicals. Scientists have never fully understood gold's inertness. It is not a simple matter of gold's inability to form chemical bonds, since it does form stable compounds with many elements. The real mystery is why gold does not react with atoms or molecules at its surface, at the interface with gases or liquids.

      Bjørk Hammer and Jens Nørskov of the Technical University of Denmark, Lyngby, used calculations run on a supercomputer to explain gold's stature as the noblest of the noble metals. Those elements, known for their inertness, are gold, silver, platinum, palladium, iridium, rhodium, mercury, ruthenium, and osmium. The Danish scientists found that gold's surface has electronic features that make reactions energetically unfavourable. Molecules form very weak attachments to gold's surface and quickly lose their tendency to break up into reactive chemical species. As a result, they simply slide away without forming long-lasting electronic or molecular attachments.

      Hammer and Nørskov studied a simple reaction involving the breakup, or dissociation, of molecular hydrogen (H2) into its constituent atoms on the surface of gold and other metals. Of all the metals studied, gold had the highest barrier for dissociation and the least-stable chemisorption state—i.e., the least tendency to take up and hold atoms or molecules by chemical bonds. The properties result, in part, from the way the electron orbitals (the clouds of electrons that surround atoms) of gold overlap with those of the adsorbed molecule. The overlapping orbitals oscillate out of phase with each other, a situation that makes bond formation unlikely.

Physical Chemistry.
      Chemists long have sought better techniques for studying individual reactions between molecules in solutions. Such information about reaction dynamics can contribute to a basic understanding of chemical reactions and to the search for ways of improving the yield of industrial processes. Molecules in solution tend to move around rapidly, making it difficult to observe how the molecules react to yield a product. In contrast, molecules in solids undergo relatively little movement, and well-established techniques exist for studying interactions between molecules in gases. Recent efforts at improving the picture for molecules in solutions involved focusing on extremely small volumes of solution, thus reducing the number of molecules to be observed.

      R. Mark Wightman of the University of North Carolina at Chapel Hill and Maryanne M. Collinson of Kansas State University reported a new technique for confining and observing molecules in solution that combines spectroscopy and electrochemistry. Wightman and Collinson studied reactions of oppositely charged ions of 9,10-diphenylanthracene (DPA) in an electrochemical cell containing a gold electrode. By rapidly reversing the electrical potential in the cell, the researchers produced batches of DPA cations and then anions—DPA ions with, respectively, positive and negative electrical charges. When a pair of oppositely charged ions interact, one of them emits a photon of light that can be detected with a photomultiplier tube. The researchers restricted the motion of DPA molecules by making the electrode only 10 micrometres (0.0004 in) in diameter, which produced small quantities of ions. They also observed the reactions in 50-microsecond time steps, which gave the DPA ions little time for movement.
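
      A rough random-walk estimate shows how little room the ions have to wander in each 50-microsecond window; the diffusion coefficient below is a typical value for a small molecule in an organic solvent, not a measured figure for DPA.

# How far can an ion diffuse in one observation window? A back-of-the-
# envelope estimate with an assumed, typical diffusion coefficient.
import math

D = 1e-9      # diffusion coefficient, m^2/s (assumed typical value)
t = 50e-6     # observation window, s

distance = math.sqrt(2 * D * t)   # one-dimensional rms displacement
print(f"rms displacement ~ {distance * 1e6:.2f} micrometres")
# ~0.3 micrometre -- small compared with the 10-micrometre electrode, so
# the ions are effectively confined during each measurement step.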

Materials.
      Fibre-reinforced composite materials are a fixture in modern society. Tiny fibres of glass or silicon carbide, for instance, can be mixed into batches of plastic, ceramics, or other material. The combination yields lightweight, superstrong composites used in aircraft, automobiles, sports equipment, and many other products. Generally, the thinner the fibre, the stronger the material. Thin fibres provide a greater surface area to bond with the plastic or ceramic matrix and are less likely to have weakening defects in their crystal structure. Tensile strength increases as the size of the fibres decreases.
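
      The geometric reason is simple: for a fixed volume of reinforcing material, the lateral surface area of cylindrical fibres grows in inverse proportion to their radius, as the short sketch below illustrates with representative, not measured, dimensions.

# Why thinner fibres bond better: surface area per unit volume of a long
# cylinder is 2 / r, so halving the radius doubles the bonding area for
# the same amount of material. Dimensions below are illustrative only.
def area_per_volume(radius_m):
    """Lateral surface area per unit volume of a long cylinder: 2 / r."""
    return 2.0 / radius_m

for label, radius in [("conventional whisker, 30 um diameter", 15e-6),
                      ("nanorod, 30 nm diameter", 15e-9)]:
    print(f"{label:38s} {area_per_volume(radius):.1e} m^2 per m^3")
# A 1,000-fold reduction in diameter buys a 1,000-fold increase in
# bonding area for the same volume of reinforcement.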

      Charles M. Lieber and his associates of Harvard University reported synthesizing carbide whiskers 1,000 nm (nanometres; billionths of a metre) long and less than 30 nm in diameter—one-thousandth the size of those used in today's superstrong composites. Their ultrafine whiskers, or "nanorods," of silicon carbide—and carbides of boron, titanium, niobium, and iron—could lead to a new generation of superstrong composites. Lieber's carbide nanorods have the same properties as the bulk materials. Nanorods of silicon carbide, for instance, are semiconductors, those of niobium carbide are superconducting, and those of iron carbide are ferromagnetic. Nanorods thus could have additional practical applications in electronics. Lieber's group synthesized the carbide nanorods from carbon nanotubes, which are hollow, nanometre-diameter tubes of graphitic carbon. They used the nanotubes as templates, heating the tubes with volatile oxides such as silicon monoxide (SiO) or halides such as silicon tetraiodide (SiI4) in sealed quartz tubes at temperatures above 1,000° C (1,800° F).

      Charles R. Martin and co-workers of Colorado State University reported the synthesis of metal membranes that are spanned by nanometre-sized pores and that can selectively pass, or transport, ions, an ability similar to that possessed by ion-exchange polymers. The electrical charge on the membranes can be varied such that they reject ions of the same charge and transport ions of the opposite charge. Existing porous membranes can transport either anions or cations, but they are fixed in terms of ion selectivity and pore size. Martin suggested that the new membranes could serve as a model for studying biological membranes, which exhibit the same ion selectivity. They also could be used in commercial separation processes—for example, for separating small anions from a solution containing both large and small anions and cations.

      Martin's group made the membranes by gold-plating commercially available polymer filtration membranes, which have cylindrical pores about 50 nm in diameter. The researchers originally planned to plate the pores full of gold to make gold nanofibres. Serendipitously they discovered that the membrane became ion selective when its pores were lined with gold but not completely filled.

      Researchers at the University of Bath, England, reported a method for synthesizing hollow porous shells of crystalline calcium carbonate, or aragonite, from a self-organizing reaction mixture. The shells resemble the so-called coccospheres synthesized by certain marine algae and could have important applications as lightweight ceramics, catalyst supports, biomedical implants, and chemical separations material. Stephen Mann and his associates made the complex, three-dimensional structures from emulsions consisting of microscopic droplets of oil, water, and surfactants (detergents) and supersaturated with calcium bicarbonate. The pore size of the resulting material was determined by the relative concentrations of water and oil in the emulsion, with micrometre-sized polystyrene beads serving as the substrate.

Organic Chemistry.
      Polyethylene is the world's most popular plastic, widely used in packaging, bags, disposable diapers, bottles, coatings, films, and innumerable other products. Chemical companies make polyethylene by means of a polymerization reaction that links together thousands of molecular units of ethylene (C2H4) into enormous chains.

      Researchers at BP Chemicals, a division of British Petroleum, London, reported development of a simple modification in their widely used polyethylene process that can more than double output from each reactor. During conventional polymerization, reactor temperatures rise, and heat removal becomes a bottleneck that limits production capacity. The new reactor design overcomes the problem by using gases given off during polymerization to cool the reactor. Gases are collected, cooled, liquefied, and injected back into the reactor. The liquids immediately vaporize and, in so doing, absorb enough heat to permit a doubling of polyethylene output.

      Chemists have grown adept at enclosing single atoms of different elements inside molecular cages like the 60-carbon molecules known as buckminsterfullerenes, or buckyballs. The spaces inside those soccer-ball-shaped molecules are relatively small, however, which has spurred researchers to develop bigger molecular cages that can accommodate larger molecules or groups of molecules. Held together in close quarters, such confined molecules might undergo commercially important reactions.

      Richard Robson and his associates at the University of Melbourne, Australia, reported their development of a crystalline lattice containing a regular array of comparatively huge cagelike compartments. Each cage is about 2.3 nm in diameter, large enough to house as many as 20 large molecules. Robson and co-workers developed the cages by accident while trying to make new types of zeolites, highly porous minerals used as catalysts and molecular filters. Into an organic solvent they mixed nitrate, cyanide, and zinc ions together with molecules of tri(pyridyl)-1,3,5-triazine, hoping to create a new zeolite. Instead, the components self-assembled into two interlocking structures that formed a lattice of large cagelike cells.

      Light-emitting diodes (LEDs) have become a ubiquitous part of modern life, widely used as small indicator lights on electronic devices and other consumer products. LEDs are semiconductors that convert electricity directly into light. The most common commercial LEDs are made from gallium arsenide phosphide and emit red light. Chemists and materials scientists have also developed LEDs that emit light of other colours, but one goal had remained elusive: true, bright white light.

      Junji Kido's group at Yamagata (Japan) University reported progress in making such an LED, which could have major commercial applications—for example, as a backlight source for extremely thin, flat television screens, computer displays, and other devices. Kido made the LED by stacking layers of three different light-emitting organic compounds between two electrodes. The bottom layer, made from triphenyldiamine, emits blue light. The middle layer is made from tris(8-quinolinolato)aluminum(III) and emits green light. The top layer is a red emitter made from tris(8-quinolinolato)aluminum(III) combined with small amounts of the organic dye nile red. Kido added a layer of another material between the blue- and green-emitting layers to enhance production of blue light. The combination of red, green, and blue emission results in a bright white light. Kido's device shone with a record intensity for an LED, 2,000 candelas per square metre, which is about half the intensity of an ordinary fluorescent room light. (MICHAEL WOODS)

      This updates the articles chemical compound; chemical element; chemical reaction; electronics; chemistry.

PHYSICS
      Confirmation of the discovery of a long-sought elementary particle delighted physicists in 1995, while the possible identification of another, unexpected type of particle gave them pause for thought. Cosmologists and astronomers were pleased with the finding of strong evidence for dim, small, starlike objects called brown dwarfs, which represent some of the so-called dark matter that is believed to make up perhaps 90% of the universe, but were baffled by conflicting determinations of the age of the universe. In the strange world of quantum physics, an intriguing proposal was made for an experiment using DNA, the molecule of life, in a modern version of a famous thought experiment outlined 60 years earlier.

      The biggest development of the year was the confirmation of a claim tentatively put forward in 1994 that the top quark had been detected in particle-collision experiments at the Fermi National Accelerator Laboratory (Fermilab) near Chicago. Data in 1995 from two separate detectors at Fermilab's Tevatron proton-antiproton collider provided what appeared to be unequivocal evidence for this last piece in the jigsaw puzzle of the so-called standard model of particle physics. The standard model explains the composition of all matter in terms of six leptons (particles like the electron and its neutrino) and six quarks (constituents of particles like protons and neutrons), five of which had already been detected. Results from one detector indicated a mass for the top quark of 176 GeV (billion electron volts), with an uncertainty of 13 GeV; results from the other detector gave a mass of 199 GeV, with an uncertainty of 30 GeV. The two values were consistent with each other, given the overlap in their uncertainties.
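
      The claim of consistency can be checked with the standard inverse-variance recipe for combining independent measurements, a textbook method rather than necessarily the exact procedure the Fermilab collaborations used.

# Check the two Tevatron results for consistency and combine them by
# inverse-variance weighting (the textbook recipe, not necessarily what
# the Fermilab groups themselves did).
m1, s1 = 176.0, 13.0   # GeV, first detector
m2, s2 = 199.0, 30.0   # GeV, second detector

# Consistency: the difference in units of its own combined uncertainty.
sigma_diff = (s1**2 + s2**2) ** 0.5
print(f"difference = {m2 - m1:.0f} GeV = {(m2 - m1) / sigma_diff:.1f} sigma")

# Weighted mean: each result counts in proportion to 1/sigma^2.
w1, w2 = 1 / s1**2, 1 / s2**2
mean = (w1 * m1 + w2 * m2) / (w1 + w2)
sigma = (w1 + w2) ** -0.5
print(f"combined mass = {mean:.0f} +/- {sigma:.0f} GeV")
# Prints roughly 0.7 sigma and 180 +/- 12 GeV -- comfortably consistent,
# and close to the value later pinned down near 173 GeV.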

      Further experiments were expected to pin down the mass of the top quark more precisely, which in turn would provide insight into the nature of a theoretical entity called the Higgs field. The Higgs field is thought to pervade all of space and, through its interaction with all the matter particles, to give the particles their masses. A major shortcoming of the standard model is that it does not account for the way in which the quarks and leptons come to have the masses that they do.

      Confirmation of the existence of the top quark by no means closed the book on the mysteries of particle physics. In mid-1995 researchers working with the HERA accelerator at DESY, the German national accelerator laboratory in Hamburg, announced that they had found something completely different. Their work built on earlier evidence that mysterious showers of particles are sometimes produced in so-called soft collisions, wherein a proton and an electron, or a pair of protons, strike each other a glancing blow rather than colliding head-on. Almost tongue in cheek, physicists had suggested that one of the colliding particles might emit a new kind of particle, dubbed a pomeron, that is actually responsible for the effects observed in a soft collision. The problem has been that the standard model, which relies on the theory of quantum chromodynamics (QCD) to explain the strong force that binds the quarks in the protons and neutrons of the atomic nucleus, is inaccurate for low energies. QCD is much less useful for calculating what happens in soft collisions than in the more energetic collisions like those used to search for the top quark. Nevertheless, the results from HERA did suggest that pomerons are involved in soft collisions. When, for example, an electron and a proton approach one another, the proton emits a pomeron, which then interacts with the electron to produce a shower of other particles, while the proton itself proceeds unscathed. The questions to be answered were whether the pomeron indeed does exist, what it is made of, and what its properties are.

      Physicists found the possibility of a particle like the pomeron exciting because it was something not predicted by theory. On the other hand, two teams of researchers were no less excited by their success in obtaining a new form of matter that had actually been predicted 70 years earlier, as a result of theoretical work by Albert Einstein and the Indian physicist Satyendra Bose. The old calculations had predicted that if atoms in the form of a dilute gas could be made cold enough, they would merge and become, in a quantum sense, a single entity much larger than any individual atom. The challenge was to produce the phenomenal cooling required for achieving this state, called the Bose-Einstein condensate. The atoms must be chilled to less than 200 billionths of a degree above absolute zero, -273.15° C (-459.67° F). The trick was at last achieved during the year, first by scientists from the National Institute of Standards and Technology, Boulder, Colo., and the University of Colorado and then by a team at Rice University, Houston, Texas. Both used similar techniques of slowing the atoms down with laser beams, trapping them in a magnetic field, and allowing the hottest, fastest individuals to escape. The resulting Bose-Einstein condensates were made up of several thousand atoms in a ball about 30 micrometres (0.001 in) across, behaving as a single quantum entity thousands of times bigger than an atom. The first experiment to achieve this state cost only about $50,000 for the hardware, plus months of intense and skillful effort, and opened up a whole new area of investigation of the predictions of quantum theory.
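
      A rough calculation shows why such extraordinary cold is needed. Condensation sets in when each atom's thermal de Broglie wavelength becomes comparable to the spacing between neighbouring atoms; the sketch below evaluates that wavelength for rubidium-87, the atom used in the Colorado experiment, at the temperature scale quoted above.

# Thermal de Broglie wavelength of a rubidium-87 atom at several
# temperatures. Order-of-magnitude check only, with standard constants.
import math

H = 6.626e-34             # Planck's constant, J*s
KB = 1.381e-23            # Boltzmann's constant, J/K
M_RB87 = 87 * 1.661e-27   # mass of a rubidium-87 atom, kg

def de_broglie_m(temp_k):
    """Thermal de Broglie wavelength h / sqrt(2*pi*m*kB*T), in metres."""
    return H / math.sqrt(2 * math.pi * M_RB87 * KB * temp_k)

for t in (300.0, 1e-6, 200e-9):
    print(f"T = {t:8.2e} K  ->  wavelength = {de_broglie_m(t):.2e} m")
# At room temperature the wavelength is about a hundredth of a nanometre,
# far smaller than an atom. At 200 nK it has grown to ~0.4 micrometre,
# large enough for the atoms' wave packets to overlap and merge.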

      Investigations of quantum phenomena like Bose-Einstein condensation gained new importance from recent work highlighting the baffling nature of quantum physics. Sixty years after the quantum theory pioneer Erwin Schrödinger devised his famous cat paradox to illustrate his dissatisfaction with the more absurd aspects of the standard interpretation of quantum theory, two Indian researchers went one better. They conceived a version of this thought experiment using DNA, which is particularly apposite since Schrödinger's book What Is Life?, written in the 1940s as an attempt to use quantum physics to explain the stability of genetic structure, was instrumental in setting Francis Crick on the trail that led to his identification of the structure of DNA with James Watson in 1953.

      The absurdity that Schrödinger wished to emphasize was the part of quantum theory that says that the outcome of any quantum experiment is not real until it has been observed, or measured by an intelligent observer. He scaled an imaginary experiment up from the quantum world of particles and atoms to a situation in which a cat exists in a 50:50 "superposition of states," both dead and alive at the same time, and definitely takes on one or the other state only when somebody looks to see if it is dead or alive. Whereas carrying out such an experiment with a real cat would present tremendous difficulties, the experiment proposed by Dipankar Home and Rajagopal Chattapadhyay of the Bose Institute, Calcutta, really could be done.

      To bring out the quantum measurement paradox in sharp relief, they picked up on a comment made by Alastair Rae in his book Quantum Physics (1986) that a single particle is all that is required for producing a mutation in a DNA molecule. In the proposed experiment a gamma-ray photon (a particle-like packet of electromagnetic energy) is directed into a cesium iodide crystal, producing a shower of photons with wavelengths in the ultraviolet (UV) range around 250 nanometres (billionths of a metre). The photon shower then passes through a solution containing DNA and an enzyme known as photolyase. Any DNA molecule that is damaged by absorption of a UV photon changes its shape in such a way that molecules of photolyase bind to it. In principle, an observer could then measure the enzyme binding.
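
      A quick calculation confirms that a single photon at that wavelength carries enough energy to do chemical damage.

# Energy of one 250-nm photon, using standard physical constants.
H = 6.626e-34      # Planck's constant, J*s
C = 2.998e8        # speed of light, m/s
EV = 1.602e-19     # joules per electron volt

wavelength = 250e-9                  # metres
energy_ev = H * C / wavelength / EV
print(f"one 250-nm photon carries {energy_ev:.1f} eV")
# ~5 eV, comparable to the strength of covalent bonds in DNA, so a single
# absorbed photon can indeed produce the shape-changing damage that the
# photolyase enzyme then recognizes.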

      The point of the experiment is that absorption of a single UV photon, a quantum event, causes a microscopic displacement in the molecular structure of the DNA, which in turn produces a macroscopically measurable (i.e., a nonquantum, or classical) effect through its chemical interaction with the enzyme. The standard interpretation of quantum theory says that each DNA molecule should exist in a superposition of states, a mixture of being damaged and not damaged, until an intelligent observer looks at it. On the other hand, common sense says that each molecule is either damaged or not damaged, and that the enzyme is perfectly capable of telling the state of the DNA without assistance from a human observer. In a Bose Institute preprint, the two researchers came down on the side of common sense, arguing that an individual DNA molecule could be regarded as definitely either damaged or not damaged "regardless of whether or when an experimenter chooses to find this out." Thus, in their view some other interpretation of quantum physics was required. Sixty years on, Schrödinger would be delighted to see which way the quantum wind was blowing.

      One of the more eagerly anticipated discoveries of relevance to cosmology was made by researchers using the William Herschel Telescope on La Palma, one of the Canary Islands. They found the best evidence yet for a brown dwarf, a small, extremely faint substellar object, in the Pleiades star cluster. It has only a small percentage of the mass of the Sun and less than 100 times as much as the planet Jupiter. Because they are so small, brown dwarfs could exist in the Milky Way Galaxy in huge numbers without contributing much to its overall mass. The new discovery suggested that about 1% of the mass of the Milky Way (and, by extension, other galaxies) is in the form of brown dwarfs. That value still leaves plenty of scope for other, as yet unidentified, entities to make up the rest of the "missing mass" of the universe, the dark, or nonluminous, matter whose presence is suggested through its gravitational effects on the observed rotation of galaxies and their movement in clusters. (See Astronomy (Earth and Space Sciences).)

      In another tour-de-force Earth-based observation, astronomers at the Cerro Tololo Inter-American Observatory in Chile discovered the most distant supernova—an explosion of a dying star—yet seen. It lies in a galaxy about six billion light-years from the Earth. Because supernovas of a given type have much the same absolute brightness (they are "standard candles," in astronomical terms), if more can be found at such great distances, it may be possible to use them to measure how quickly the rate at which galaxies are moving apart is decreasing—i.e., how fast the expansion of the universe is decelerating. If the absolute brightness of a supernova is known, then its apparent brightness can be used to calculate its true distance. This value then can be combined with the red shift of the supernova's parent galaxy, which is a measure of how fast the galaxy is receding from the Earth.
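
      The standard-candle arithmetic is compact. In the sketch below the numbers are illustrative, not the Cerro Tololo measurements; the relation m − M = 5 log10(d) − 5, with d in parsecs, converts a known absolute magnitude M and an observed apparent magnitude m into a distance.

# Distance from the distance-modulus relation m - M = 5*log10(d_pc) - 5.
# Illustrative numbers only.
def distance_light_years(m, M):
    parsecs = 10 ** ((m - M + 5) / 5)
    return parsecs * 3.26        # 1 parsec = 3.26 light-years

# A type Ia supernova peaks near absolute magnitude M = -19.3; suppose
# one is observed at apparent magnitude m = 22.
print(f"{distance_light_years(22.0, -19.3):.2e} light-years")
# ~6e9 light-years: faint apparent brightness plus a known absolute
# brightness places the explosion billions of light-years away.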

      This ability would be a great boon because it is one way to determine the time that has elapsed since the big bang—i.e., the age of the universe. The age is calculated in terms of a number called the Hubble parameter, or Hubble constant (H0), a constant of proportionality between the recessional velocities of the galaxies and their distances from the Earth. H0 is the rate at which the velocity of the galaxies increases with distance and is conventionally expressed in kilometres per second per megaparsec (a parsec is 3.26 light-years). The reciprocal of H0, 1/H0, yields the time that has elapsed since the galaxies started receding. Various techniques for making the galaxy-distance measurements that were needed to calculate H0 had seemed for some years to be converging on a value for H0 that yielded an age for the universe of 15 billion to 20 billion years, and it had been anticipated that measurements for distant galaxies made with the Hubble Space Telescope (HST) would give a definitive value. To the surprise of many, measurements with the HST in late 1994 determined a value for H0 that implied an age of 8 billion to 12 billion years. In 1994 and 1995 other determinations made with the HST or ground-based telescopes gave a range of values for H0, some indicating a relatively young universe and others an old one. The new measurements put clear water between two sets of numbers that were, at face value, impossible to reconcile.
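
      Converting a measured H0 into an age is elementary, as the sketch below shows for two values bracketing the disputed range; note that 1/H0 slightly overstates the age if the expansion has been decelerating.

# Convert a Hubble constant in km/s/Mpc into the Hubble time 1/H0.
KM_PER_MPC = 3.086e19          # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16     # seconds in one billion years

def hubble_time_gyr(h0_km_s_mpc):
    return KM_PER_MPC / h0_km_s_mpc / SECONDS_PER_GYR

for h0 in (50.0, 80.0):
    print(f"H0 = {h0:.0f} km/s/Mpc  ->  1/H0 = {hubble_time_gyr(h0):.0f} billion years")
# H0 near 50 gives ~20 billion years; H0 near 80 gives ~12 billion --
# the essence of the conflict described above.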

      Apart from the embarrassment of the disagreement itself, some of the measurements implied that the age of the universe is less than the accepted ages of the oldest stars, which are at least 15 billion years old. Clearly something was wrong. A major consolation, however, was that some of the most significant progress in science eventually comes from investigations in areas where theory and observation are in conflict rather than in agreement. (JOHN GRIBBIN)

* * *

