United States

United States
a republic in the N Western Hemisphere comprising 48 conterminous states, the District of Columbia, and Alaska in North America, and Hawaii in the N Pacific. 267,954,767; conterminous United States, 3,022,387 sq. mi. (7,827,982 sq. km); with Alaska and Hawaii, 3,615,122 sq. mi. (9,363,166 sq. km). Cap.: Washington, D.C. Abbr.: U.S., US Also called United States of America, America.

* * *

United States

Introduction United States
Background: Britain's American colonies broke with the mother country in 1776 and were recognized as the new nation of the United States of America following the Treaty of Paris in 1783. During the 19th and 20th centuries, 37 new states were added to the original 13 as the nation expanded across the North American continent and acquired a number of overseas possessions. The two most traumatic experiences in the nation's history were the Civil War (1861-65) and the Great Depression of the 1930s. Buoyed by victories in World Wars I and II and the end of the Cold War in 1991, the US remains the world's most powerful nation-state. The economy is marked by steady growth, low unemployment and inflation, and rapid advances in technology.
Geography United States
Location: North America, bordering both the North Atlantic Ocean and the North Pacific Ocean, between Canada and Mexico
Geographic coordinates: 38 00 N, 97 00 W
Map references: North America
Area: total: 9,629,091 sq km land: 9,158,960 sq km water: 470,131 sq km note: includes only the 50 states and District of Columbia
Area - comparative: about half the size of Russia; about three-tenths the size of Africa; about half the size of South America (or slightly larger than Brazil); slightly larger than China; about two and a half times the size of Western Europe
Land boundaries: total: 12,034 km border countries: Canada 8,893 km (including 2,477 km with Alaska), Mexico 3,141 km note: US Naval Base at Guantanamo Bay, Cuba, is leased by the US and thus remains part of Cuba; the base boundary is 29 km
Coastline: 19,924 km
Maritime claims: contiguous zone: 24 NM continental shelf: not specified exclusive economic zone: 200 NM territorial sea: 12 NM
Climate: mostly temperate, but tropical in Hawaii and Florida, arctic in Alaska, semiarid in the great plains west of the Mississippi River, and arid in the Great Basin of the southwest; low winter temperatures in the northwest are ameliorated occasionally in January and February by warm chinook winds from the eastern slopes of the Rocky Mountains
Terrain: vast central plain, mountains in west, hills and low mountains in east; rugged mountains and broad river valleys in Alaska; rugged, volcanic topography in Hawaii
Elevation extremes: lowest point: Death Valley -86 m highest point: Mount McKinley 6,194 m
Natural resources: coal, copper, lead, molybdenum, phosphates, uranium, bauxite, gold, iron, mercury, nickel, potash, silver, tungsten, zinc, petroleum, natural gas, timber
Land use: arable land: 19.32% permanent crops: 0.22% other: 80.46% (1998 est.)
Irrigated land: 214,000 sq km (1998 est.)
Natural hazards: tsunamis, volcanoes, and earthquake activity around Pacific Basin; hurricanes along the Atlantic and Gulf of Mexico coasts; tornadoes in the midwest and southeast; mud slides in California; forest fires in the west; flooding; permafrost in northern Alaska, a major impediment to development
Environment - current issues: air pollution resulting in acid rain in both the US and Canada; the US is the largest single emitter of carbon dioxide from the burning of fossil fuels; water pollution from runoff of pesticides and fertilizers; very limited natural fresh water resources in much of the western part of the country require careful management; desertification
Environment - international agreements: party to: Air Pollution, Air Pollution-Nitrogen Oxides, Antarctic-Environmental Protocol, Antarctic-Marine Living Resources, Antarctic Seals, Antarctic Treaty, Climate Change, Desertification, Endangered Species, Environmental Modification, Marine Dumping, Marine Life Conservation, Nuclear Test Ban, Ozone Layer Protection, Ship Pollution, Tropical Timber 83, Tropical Timber 94, Wetlands, Whaling signed, but not ratified: Air Pollution-Persistent Organic Pollutants, Air Pollution-Volatile Organic Compounds, Biodiversity, Climate Change-Kyoto Protocol, Hazardous Wastes
Geography - note: world's third-largest country by size (after Russia and Canada) and by population (after China and India); Mt. McKinley is highest point in North America and Death Valley the lowest point on the continent
People United States
Population: 280,562,489 (July 2002 est.)
Age structure: 0-14 years: 21% (male 30,116,782; female 28,765,183) 15-64 years: 66.4% (male 92,391,120; female 93,986,468) 65 years and over: 12.6% (male 14,748,522; female 20,554,414) (2002 est.)
Population growth rate: 0.89% (2002 est.)
Birth rate: 14.1 births/1,000 population (2002 est.)
Death rate: 8.7 deaths/1,000 population (2002 est.)
Net migration rate: 3.5 migrant(s)/1,000 population (2002 est.)
Sex ratio: at birth: 1.05 male(s)/female under 15 years: 1.05 male(s)/female 15-64 years: 0.98 male(s)/female 65 years and over: 0.72 male(s)/female total population: 0.96 male(s)/female (2002 est.)
Infant mortality rate: 6.69 deaths/1,000 live births (2002 est.)
Life expectancy at birth: total population: 77.4 years male: 74.5 years female: 80.2 years (2002 est.)
Total fertility rate: 2.07 children born/woman (2002 est.)
HIV/AIDS - adult prevalence rate: 0.61% (1999 est.)
HIV/AIDS - people living with HIV/AIDS: 850,000 (1999 est.)
HIV/AIDS - deaths: 20,000 (1999 est.)
Nationality: noun: American(s) adjective: American
Ethnic groups: white 77.1%, black 12.9%, Asian 4.2%, Amerindian and Alaska native 1.5%, native Hawaiian and other Pacific islander 0.3%, other 4% (2000) note: a separate listing for Hispanic is not included because the US Census Bureau considers Hispanic to mean a person of Latin American descent (especially of Cuban, Mexican, or Puerto Rican origin) living in the US who may be of any race or ethnic group (white, black, Asian, etc.)
Religions: Protestant 56%, Roman Catholic 28%, Jewish 2%, other 4%, none 10% (1989)
Languages: English, Spanish (spoken by a sizable minority)
Literacy: definition: age 15 and over can read and write total population: 97% male: 97% female: 97% (1979 est.)
People - note: data for the US are based on projections that do not take into consideration the results of the 2000 census
Government United States
Country name: conventional long form: United States of America conventional short form: United States abbreviation: US or USA
Government type: federal republic; strong democratic tradition
Capital: Washington, DC
Administrative divisions: 50 states and 1 district*; Alabama, Alaska, Arizona, Arkansas, California, Colorado, Connecticut, Delaware, District of Columbia*, Florida, Georgia, Hawaii, Idaho, Illinois, Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Jersey, New Mexico, New York, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, Washington, West Virginia, Wisconsin, Wyoming
Dependent areas: American Samoa, Baker Island, Guam, Howland Island, Jarvis Island, Johnston Atoll, Kingman Reef, Midway Islands, Navassa Island, Northern Mariana Islands, Palmyra Atoll, Puerto Rico, Virgin Islands, Wake Island note: from 18 July 1947 until 1 October 1994, the US administered the Trust Territory of the Pacific Islands, but recently entered into a new political relationship with all four political units: the Northern Mariana Islands is a commonwealth in political union with the US (effective 3 November 1986); Palau concluded a Compact of Free Association with the US (effective 1 October 1994); the Federated States of Micronesia signed a Compact of Free Association with the US (effective 3 November 1986); the Republic of the Marshall Islands signed a Compact of Free Association with the US (effective 21 October 1986)
Independence: 4 July 1776 (from Great Britain)
National holiday: Independence Day, 4 July (1776)
Constitution: 17 September 1787, effective 4 March 1789
Legal system: based on English common law; judicial review of legislative acts; accepts compulsory ICJ jurisdiction, with reservations
Suffrage: 18 years of age; universal
Executive branch: chief of state: President George W. BUSH (since 20 January 2001) and Vice President Richard B. CHENEY (since 20 January 2001); note - the president is both the chief of state and head of government head of government: President George W. BUSH (since 20 January 2001) and Vice President Richard B. CHENEY (since 20 January 2001); note - the president is both the chief of state and head of government cabinet: Cabinet appointed by the president with Senate approval elections: president and vice president elected on the same ticket by a college of representatives who are elected directly from each state; president and vice president serve four-year terms; election last held 7 November 2000 (next to be held 2 November 2004) election results: George W. BUSH elected president; percent of popular vote - George W. BUSH (Republican Party) 48%, Albert A. GORE, Jr. (Democratic Party) 48%, Ralph NADER (Green Party) 3%, other 1%
Legislative branch: bicameral Congress consists of the Senate (100 seats, one-third are renewed every two years; two members are elected from each state by popular vote to serve six-year terms) and the House of Representatives (435 seats; members are directly elected by popular vote to serve two-year terms) election results: Senate - percent of vote by party - NA%; seats by party - Democratic Party 50, Republican Party 49, independent 1; House of Representatives - percent of vote by party - NA%; seats by party - Republican Party 221, Democratic Party 211, independent 2, vacant 1 elections: Senate - last held 7 November 2000 (next to be held 4 November 2002); House of Representatives - last held 7 November 2000 (next to be held 4 November 2002)
Judicial branch: Supreme Court (its nine justices are appointed for life by the president with confirmation by the Senate); United States Courts of Appeals; United States District Courts; State and County Courts
Political parties and leaders: Democratic Party [Terence McAULIFFE, national committee chairman]; Green Party [leader NA]; Republican Party [Governor Marc RACICOT, national committee chairman]
Political pressure groups and leaders: NA
International organization participation: AfDB, ANZUS, APEC, ARF (dialogue partner)
Flag description: thirteen equal horizontal stripes of red (top and bottom) alternating with white; there is a blue rectangle in the upper hoist-side corner bearing 50 small, white, five-pointed stars arranged in nine offset horizontal rows of six stars (top and bottom) alternating with rows of five stars; the 50 stars represent the 50 states, the 13 stripes represent the 13 original colonies; known as Old Glory; the design and colors have been the basis for a number of other flags, including Chile, Liberia, Malaysia, and Puerto Rico
Economy United States
Economy - overview: The US has the largest and most technologically powerful economy in the world, with a per capita GDP of $36,300. In this market-oriented economy, private individuals and business firms make most of the decisions, and the federal and state governments buy needed goods and services predominantly in the private marketplace. US business firms enjoy considerably greater flexibility than their counterparts in Western Europe and Japan in decisions to expand capital plant, lay off surplus workers, and develop new products. At the same time, they face higher barriers to entry in their rivals' home markets than the barriers to entry of foreign firms in US markets. US firms are at or near the forefront in technological advances, especially in computers and in medical, aerospace, and military equipment, although their advantage has narrowed since the end of World War II. The onrush of technology largely explains the gradual development of a "two-tier labor market" in which those at the bottom lack the education and the professional/technical skills of those at the top and, more and more, fail to get comparable pay raises, health insurance coverage, and other benefits. Since 1975, practically all the gains in household income have gone to the top 20% of households. The years 1994-2000 witnessed solid increases in real output, low inflation rates, and a drop in unemployment to below 5%. The year 2001 witnessed the end of the boom psychology and performance, with output increasing only 0.3% and unemployment and business failures rising substantially. The response to the terrorist attacks of September 11 showed the remarkable resilience of the economy. Moderate recovery is expected in 2002, with the GDP growth rate rising to 2.5% or more. A major short-term problem in first half 2002 was a sharp decline in the stock market, fueled in part by the exposure of dubious accounting practices in some major corporations. 
Long-term problems include inadequate investment in economic infrastructure, rapidly rising medical and pension costs of an aging population, sizable trade deficits, and stagnation of family income in the lower economic groups.
GDP: purchasing power parity - $10.082 trillion (2001 est.)
GDP - real growth rate: 0.3% (2001 est.)
GDP - per capita: purchasing power parity - $36,300 (2001 est.)
GDP - composition by sector: agriculture: 2% industry: 18% services: 80% (2001 est.)
Population below poverty line: 12.7% (2001 est.)
Household income or consumption by percentage share: lowest 10%: 1.8% highest 10%: 30.5% (1997)
Distribution of family income - Gini index: 40.8 (1997)
Inflation rate (consumer prices): 2.8% (2001)
Labor force: 141.8 million (includes unemployed) (2001)
Labor force - by occupation: managerial and professional 31%, technical, sales and administrative support 28.9%, services 13.6%, manufacturing, mining, transportation, and crafts 24.1%, farming, forestry, and fishing 2.4% (2001) note: figures exclude the unemployed
Unemployment rate: 5% (2001)
Budget: revenues: $1.828 trillion expenditures: $1.703 trillion, including capital expenditures of $NA (1999)
Industries: leading industrial power in the world, highly diversified and technologically advanced; petroleum, steel, motor vehicles, aerospace, telecommunications, chemicals, electronics, food processing, consumer goods, lumber, mining
Industrial production growth rate: -3.7% (2001 est.)
Electricity - production: 3,799.944 billion kWh (2000)
Electricity - production by source: fossil fuel: 70.76% hydro: 7.19% nuclear: 19.84% other: 2.21% (2000)
Electricity - consumption: 3.613 trillion kWh (2000)
Electricity - exports: 14.829 billion kWh (2000)
Electricity - imports: 48.879 billion kWh (2000)
Agriculture - products: wheat, other grains, corn, fruits, vegetables, cotton; beef, pork, poultry, dairy products; forest products; fish
Exports: $723 billion (f.o.b., 2001 est.)
Exports - commodities: capital goods, automobiles, industrial supplies and raw materials, consumer goods, agricultural products
Exports - partners: Canada 22.4%, Mexico 13.9%, Japan 7.9%, UK 5.6%, Germany 4.1%, France, Netherlands (2001)
Imports: $1.148 trillion (f.o.b., 2001 est.)
Imports - commodities: crude oil and refined petroleum products, machinery, automobiles, consumer goods, industrial raw materials, food and beverages
Imports - partners: Canada 19%, Mexico 11.5%, Japan 11.1%, China 8.9%, Germany 5.2%, UK, Taiwan (2001)
Debt - external: $862 billion (1995 est.)
Economic aid - donor: ODA, $6.9 billion (1997)
Currency: US dollar (USD)
Currency code: USD
Exchange rates: British pounds per US dollar - 0.6981 (January 2002), 0.6944 (2001), 0.6596 (2000), 0.6180 (1999), 0.6037 (1998), 0.6106 (1997); Canadian dollars per US dollar - 1.6003 (January 2002), 1.5488 (2001), 1.4851 (2000), 1.4857 (1999), 1.4835 (1998), 1.3846 (1997); French francs per US dollar - 5.65 (January 1999), 5.8995 (1998), 5.8367 (1997); Italian lire per US dollar - 1,668.7 (January 1999), 1,763.2 (1998), 1,703.1 (1997); Japanese yen per US dollar - 132.66 (January 2002), 121.53 (2001), 107.77 (2000), 113.91 (1999), 130.91 (1998), 120.99 (1997); German deutsche marks per US dollar - 1.69 (January 1999), 1.9692 (1998), 1.7341 (1997); euros per US dollar - 1.1324 (January 2002), 1.1175 (2001), 1.08540 (2000), 0.93863 (1999) note: financial institutions in France, Italy, and Germany and eight other European countries started using the euro on 1 January 1999 with the euro replacing the local currency in consenting countries for all transactions in 2002
Fiscal year: 1 October - 30 September
Communications United States
Telephones - main lines in use: 194 million (1997)
Telephones - mobile cellular: 69.209 million (1998)
Telephone system: general assessment: a very large, technologically advanced, multipurpose communications system domestic: a large system of fiber-optic cable, microwave radio relay, coaxial cable, and domestic satellites carries every form of telephone traffic; a rapidly growing cellular system carries mobile telephone traffic throughout the country international: 24 ocean cable systems in use; satellite earth stations - 61 Intelsat (45 Atlantic Ocean and 16 Pacific Ocean), 5 Intersputnik (Atlantic Ocean region), and 4 Inmarsat (Pacific and Atlantic Ocean regions) (2000)
Radio broadcast stations: AM 4,762, FM 5,542, shortwave 18 (1998)
Radios: 575 million (1997)
Television broadcast stations: more than 1,500 (including nearly 1,000 stations affiliated with the five major networks - NBC, ABC, CBS, FOX, and PBS; in addition, there are about 9,000 cable TV systems) (1997)
Televisions: 219 million (1997)
Internet country code: .us
Internet Service Providers (ISPs): 7,800 (2000 est.)
Internet users: 166 million (2001)
Transportation United States
Railways: total: 212,433 km mainline routes standard gauge: 212,433 km 1.435-m gauge note: represents the aggregate length of roadway of all line-haul railroads including an estimate for Class II and III railroads (1998)
Highways: total: 6,370,031 km paved: 5,733,028 km (including 74,091 km of expressways) unpaved: 637,003 km (1997)
Waterways: 41,009 km note: navigable inland channels, exclusive of the Great Lakes
Pipelines: petroleum products 276,000 km; natural gas 331,000 km (1991)
Ports and harbors: Anchorage, Baltimore, Boston, Charleston, Chicago, Duluth, Hampton Roads, Honolulu, Houston, Jacksonville, Los Angeles, New Orleans, New York, Philadelphia, Port Canaveral, Portland (Oregon), Prudhoe Bay, San Francisco, Savannah, Seattle, Tampa, Toledo
Merchant marine: total: 264 ships (1,000 GRT or over) totaling 6,911,641 GRT/9,985,660 DWT ships by type: barge carrier 1, bulk 11, cargo 14, chemical tanker 16, collier 1, combination bulk 4, combination tanker 11, container 86, multi-functional large-load carrier 4, passenger/cargo 2, petroleum tanker 81, roll on/roll off 28, specialized tanker 3, vehicle carrier 2 note: includes some foreign-owned ships registered here as a flag of convenience: Australia 1, Canada 4, Denmark 15, France 1, Germany 1, Netherlands 3, Norway 7, Puerto Rico 4, Singapore 11, Sweden 1, United Kingdom 3 (2002 est.)
Airports: 14,695 (2001)
Airports - with paved runways: total: 5,127 over 3,047 m: 183 2,438 to 3,047 m: 222 1,524 to 2,437 m: 1,342 914 to 1,523 m: 2,413 under 914 m: 967 (2001)
Airports - with unpaved runways: total: 9,568 over 3,047 m: 1 2,438 to 3,047 m: 7 1,524 to 2,437 m: 165 914 to 1,523 m: 1,679 under 914 m: 7,716 (2001)
Heliports: 132 (2001)
Military United States
Military branches: Department of the Army, Department of the Navy (includes Marine Corps), Department of the Air Force note: the Coast Guard is normally subordinate to the Department of Transportation, but in wartime reports to the Department of the Navy
Military manpower - military age: 18 years of age (2002 est.)
Military manpower - availability: males age 15-49: 70,819,436 (2001 est.)
Military manpower - fit for military service: NA (2002 est.)
Military manpower - reaching military age annually: males: 2,053,179 (2002 est.)
Military expenditures - dollar figure: $276.7 billion (FY99 est.)
Military expenditures - percent of GDP: 3.2% (FY99 est.)
Military - note: 2002 estimates for military manpower are based on projections that do not take into consideration the results of the 2000 census
Transnational Issues United States
Disputes - international: maritime boundary disputes with Canada (Dixon Entrance, Beaufort Sea, Strait of Juan de Fuca, Machias Seal Island); US Naval Base at Guantanamo Bay is leased from Cuba and only mutual agreement or US abandonment of the area can terminate the lease; Haiti claims Navassa Island; US has made no territorial claim in Antarctica (but has reserved the right to do so) and does not recognize the claims of any other state; Marshall Islands claims Wake Island
Illicit drugs: consumer of cocaine shipped from Colombia through Mexico and the Caribbean; consumer of heroin, marijuana, and increasingly methamphetamine from Mexico; consumer of high-quality Southeast Asian heroin; illicit producer of cannabis, marijuana, depressants, stimulants, hallucinogens, and methamphetamine; money-laundering center

* * *

Federal republic, North America.

It comprises 48 contiguous states occupying the mid-continent, Alaska at the northwestern extreme of North America, and the island state of Hawaii in the mid-Pacific Ocean. Area, including the U.S. share of the Great Lakes: 3,675,031 sq mi (9,518,287 sq km). Population (2002 est.): 287,602,000. Capital: Washington, D.C. The population includes people of European and Middle Eastern ancestry, African Americans, Hispanics, Asians, Pacific Islanders, American Indians (Native Americans), and Alaska Natives. Languages: English (predominant), Spanish. Religions: Protestantism, Roman Catholicism, Judaism, Islam. Currency: U.S. dollar. The country's regions encompass mountains, plains, lowlands, and deserts. Mountain ranges include the Appalachians, Ozarks, Rockies, Cascades, and Sierra Nevada. The lowest point is Death Valley, Calif. The highest point is Alaska's Mount McKinley; within the coterminous U.S. it is Mount Whitney, Calif. Chief rivers are the Mississippi system, the Colorado, the Columbia, and the Rio Grande. The Great Lakes, the Great Salt Lake, and Lake Okeechobee are the largest lakes. The U.S. is among the world's leading producers of several minerals, including copper, silver, zinc, gold, coal, petroleum, and natural gas; it is the chief exporter of food. Its manufactures include iron and steel, chemicals, electronic equipment, and textiles. Other important industries are tourism, dairying, livestock raising, fishing, and lumbering. The U.S. is a republic with two legislative houses; its head of state and government is the president. The territory was originally inhabited for several thousand years by numerous American Indian peoples who had probably emigrated from Asia. European exploration and settlement from the 16th century began displacement of the Indians. The first permanent European settlement, by the Spanish, was at Saint Augustine, Fla., in 1565; the British settled Jamestown, Va. (1607); Plymouth, Mass. 
(1620); Maryland (1634); and Pennsylvania (1681). The British took New York, New Jersey, and Delaware from the Dutch in 1664, a year after the Carolinas had been granted to British noblemen. The British defeat of the French in 1763 (see French and Indian War) assured British political control over its 13 colonies. Political unrest caused by British colonial policy culminated in the American Revolution (1775–83) and the Declaration of Independence (1776). The U.S. was first organized under the Articles of Confederation (1781), then finally under the Constitution (1787) as a federal republic. Boundaries extended west to the Mississippi River, excluding Spanish Florida. Land acquired from France by the Louisiana Purchase (1803) nearly doubled the country's territory. The U.S. fought the War of 1812 against the British and acquired Florida from Spain in 1819. In 1830 it legalized removal of American Indians to lands west of the Mississippi River. Settlement expanded into the Far West in the mid-19th century, especially after the discovery of gold in California in 1848 (see gold rush). Victory in the Mexican War (1846–48) brought the territory of seven more future states (including California and Texas) into U.S. hands. The northwestern boundary was established by treaty with Great Britain in 1846. The U.S. acquired southern Arizona by the Gadsden Purchase (1853). It suffered disunity during the conflict between the slavery-based plantation economy in the South and the free industrial and agricultural economy in the North, culminating in the American Civil War and the abolition of slavery under the 13th Amendment. After Reconstruction (1865–77) the U.S. experienced rapid growth, urbanization, industrial development, and European immigration. In 1887 it authorized allotment of American Indian reservation land to individual tribesmen, resulting in widespread loss of land to whites.
By the end of the 19th century, it had developed foreign trade and acquired outlying territories, including Alaska, Midway Island, the Hawaiian Islands, the Philippines, Puerto Rico, Guam, Wake Island, American Samoa, the Panama Canal Zone, and the Virgin Islands. The U.S. participated in World War I in 1917–18. It granted suffrage to women in 1920 and citizenship to American Indians in 1924. The stock market crash of 1929 led to the Great Depression. The U.S. entered World War II after the Japanese bombing of Pearl Harbor (Dec. 7, 1941). The U.S. dropping of atomic bombs on Hiroshima (Aug. 6, 1945) and Nagasaki (Aug. 9, 1945) brought about Japan's surrender. Thereafter the U.S. was the military and economic leader of the Western world. In the first decade after the war, it aided the reconstruction of Europe and Japan and became embroiled in a rivalry with the Soviet Union known as the Cold War. It participated in the Korean War from 1950 to 1953. In 1952 it granted autonomous commonwealth status to Puerto Rico. Racial segregation in schools was declared unconstitutional in 1954. Alaska and Hawaii were made states in 1959. In 1964 Congress passed the Civil Rights Act and authorized U.S. entry into the Vietnam War. The mid- to late 1960s were marked by widespread civil disorder, including race riots and antiwar demonstrations. The U.S. accomplished the first manned lunar landing in 1969. All U.S. troops were withdrawn from Vietnam in 1973. The U.S. led a coalition of forces against Iraq in the First Persian Gulf War (1991), sent troops to Somalia (1992) to aid starving populations, and participated in NATO air strikes against Serbian forces in the former Yugoslavia in 1995 and 1999. In 1998 Pres. Bill Clinton became only the second president to be impeached by the House of Representatives; he was acquitted by the Senate in 1999. Administration of the Panama Canal was turned over to Panama in 1999. In 2000 George W.
Bush became the first person since 1888 to be elected president by the electoral college despite having won fewer popular votes than his opponent, Al Gore. After the September 11 attacks on the U.S. in 2001 destroyed the World Trade Center and part of the Pentagon, the U.S. attacked Afghanistan's Taliban government for harbouring and refusing to extradite the mastermind of the terrorism, Osama bin Laden. In 2003 the U.S. and the United Kingdom attacked Iraq and overthrew the government of Saddām Ḥussein, which they had accused of aiding terrorists and possessing and developing biological, chemical, and nuclear weapons. As the U.S. attempted to help reconstruct and bring democracy to Iraq, it faced an escalating Iraqi insurgency. In 2004 Bush narrowly defeated Democratic challenger John Kerry to win a second presidential term.

* * *

▪ 2009

Area: 9,522,055 sq km (3,676,486 sq mi), including 204,083 sq km of inland water and 156,049 sq km of the Great Lakes that lie within U.S. boundaries but excluding 109,362 sq km of coastal water
Population (2008 est.): 305,146,000
Capital: Washington, D.C.
Head of state and government: President George W. Bush

      With the long-developing subprime-mortgage crisis as the proximate cause, the United States led the world into a historic economic recession in late 2008. The downturn was marked by the collapse of financial firms, a dramatic decline in equity prices, and a subsequent falloff in lending and economic activity. By September the malaise had spread to developed economies in Europe, Asia, and elsewhere, prompting Western governments to undertake extraordinary rescue measures, often by nationalizing private banks. The U.S. government abandoned traditional free-market boundaries as it struggled to fashion an effective response, providing billions in assistance to save some firms, lowering interest rates, injecting capital to encourage lending, and taking an unprecedented equity position in private companies. By year's end the heroic measures had stabilized the economy at least temporarily, but the U.S. was clearly deeply mired in a global economic slump of uncertain duration.

      The economic turmoil occurred against the backdrop of a national election, and the Republican administration's controversial response to the crisis, accompanied by a public demand for policy change, helped Democrats take full control in Washington. The deteriorating economy and an overextended military also helped to ensure that the U.S. enjoyed few diplomatic successes during the year. In the ongoing war on terrorism, one bright spot for the administration was the continued firming up of the security situation in Iraq and the completion of a road map for ending U.S. combat operations there. The progress in Iraq, however, was at least partially offset by deteriorating conditions in Afghanistan that would require an increased Western troop presence.

Economic Crisis.
      Waning confidence in the value of securitized home mortgages and derivatives finally caught up with the U.S. economy during the year, prompting a disastrous chain reaction that eventually infected financial markets worldwide. The mortgages were packaged together and sold in bundles, backed by intricate and highly leveraged financial contracts, in a scheme designed to mitigate risk. The instruments, designed by Wall Street lawyers outside government regulatory oversight, were complicated and lacked transparency. When cracks appeared, instead of spreading and minimizing risk, the system acted to amplify unease and created a domino effect that spread across the financial system, from housing to mortgage lending, to investment banks, to securities firms, and beyond.

      In January, amid gloomy news of plummeting home sales and the first annual decline in home prices in at least four decades, equity prices fell sharply. In response, Congress approved an economic stimulus package that provided a $600 cash rebate for most persons filing income-tax returns. The measure quickly put $168 billion into the economy but served only to delay more serious consequences. Institutions exposed to securitized mortgages and associated instruments saw their positions continue to deteriorate. In March, Bear Stearns, a venerable New York City investment bank, neared collapse and was sold in a fire sale backed by $30 billion in Federal Reserve funds. In July, Indymac Bancorp, the largest thrift institution in the Los Angeles area, was placed in receivership.

      On July 30, Pres. George W. Bush signed a bill designed to shore up mortgage lenders by guaranteeing up to $300 billion in new fixed-rate mortgages. The measure was ineffectual, however, and on September 7 the federal government essentially nationalized both Fannie Mae and Freddie Mac, which together owned or guaranteed half of the country's $12 trillion mortgage market. Instead of providing reassurance, the move only heightened investor worries about the economy and financial markets.

      In mid-September the dam broke. Merrill Lynch, the country's largest brokerage house, was sold to Bank of America under duress. Investment bank Lehman Brothers filed for bankruptcy, and federal regulators said that the firm owned so many toxic assets that a bailout attempt would be futile. A major money-market mutual fund, Reserve Primary, said that losses threatened its solvency; the Federal Reserve (Fed) offered $105 billion to shore up money funds, and the U.S. Treasury offered temporary insurance to money-fund investors. The Fed also pumped $85 billion into insurance giant AIG, which had provided backing for mortgage instruments, with the government taking a major equity position in return. Washington Mutual, the country's largest thrift institution, was seized as insolvent and sold for a fraction of its former value. By this time the contagion had spread to Europe and Asia, throwing the developed world economy into turmoil. (See Special Report: Financial Crisis of 2008.)

      U.S. Treasury Secretary Henry Paulson proposed a $700 billion rescue bill—initially written on only three pages—that was eventually approved by Congress on October 3. The Troubled Asset Relief Program (TARP) allowed federal authorities to purchase assets of failing banks and eased rules requiring strict valuation of distressed securities. The week of October 6–10, however, proved to be the worst on Wall Street in at least 75 years, with the Dow Jones Industrial Average (DJIA) down 18%. Under pressure to prevent a complete financial collapse, the Fed during October extended more than $2.5 trillion in emergency loans to banks and nonfinancial firms, lowered interest rates, and worked with European central banks to contain the damage.

      In November, as confidence continued to erode, Paulson abandoned plans to buy troubled assets under TARP and instead launched a plan to recapitalize financial firms, mostly by purchasing preferred shares of banks. The Fed also pledged another $800 billion to shore up distressed mortgages, provided $45 billion in assistance to Citigroup, and vowed further cuts to already-low interest rates. Those actions, in addition to similar moves by European and Asian governments, appeared to stabilize investor confidence. The stock market hit bottom for the year on November 20, with the DJIA settling at just over half of its record level of a year earlier. Even so, all indicators were showing that the underlying U.S. economy—technically in recession since the previous December—would continue to suffer from the crisis for months to come.

      Other distressed U.S. industries began petitioning Washington for assistance. After Congress refused a request from Detroit automakers for a $14 billion package, in December the Bush administration awarded up to $17.4 billion in loans to General Motors and Chrysler. That effectively postponed a reckoning for the automakers until 2009 and handed the problem to a new administration. Aides to President-elect Barack Obama publicly contemplated another federal stimulus package of $850 billion or more, including money for government infrastructure projects, as an early 2009 priority.

      The wild economic year devastated the country's balance sheet. The federal deficit for the fiscal year that ended September 30 almost tripled, to $454.8 billion, and analysts predicted that it would top $1 trillion in 2009. Investors lost an estimated $7.3 trillion in value from the decline in the 5,000 largest stocks alone. Overall, the year produced a 13% drop in the median home resale price, and an estimated 1 in 10 homeowners was in financial distress. Unemployment started the year at a modest 5% but stood at 7.2% in December and was climbing. The accelerating recession at least temporarily erased fears over rising inflation, with the consumer price index up little more than 1% in 2008. At midyear, as international demand peaked, oil touched $147 a barrel, producing gasoline prices of more than $4 per gallon and widespread distress in American households. By year's end demand was down, crude was under $40 per barrel, and gasoline had dropped to around $1.60 a gallon.

      The economy took a final blow in December with the arrest of Bernard Madoff, a major New York City hedge-fund operator. Madoff was accused of having run a giant Ponzi scheme, bilking his investors of up to $50 billion in what could be the largest financial scandal in history.

War on Terrorism.
      Five years after leading the invasion that toppled Saddam Hussein, the U.S. negotiated with the new Iraqi democratic government for an eventual end to allied combat operations. The agreement capped a year of declining violence and increased government control in Iraq and represented a dramatic turnaround for U.S. policy, which had seemed destined for a humiliating defeat only two years earlier. It also cleared the way for redeployment of U.S. troops elsewhere, particularly into resurgent terrorist areas of Afghanistan. Outgoing president George W. Bush hailed the Iraqi developments as a major step forward for democracy and credited the 2007 U.S. military surge, but his year-end visit to Iraq was ironically marred by a dramatic political protest.

      Under U.S. pressure the Iraqi parliament took several steps to accommodate Iraq's Sunni minority and achieve ethnic reconciliation. In March the Shiʿite-dominated government deployed 30,000 Iraqi troops, accompanied by U.S. air support, into Basra in a successful thrust to dislodge the Mahdi Army, a radical Shiʿite militia that had long controlled the port city. Iraqi troops later entered and occupied Sadr City, a renegade Shiʿite section of Baghdad, without significant resistance.

      As violence ebbed markedly during the year, the Iraqi government took over increasing responsibility for its domestic security. In September Anbar province, once the cradle of the Sunni insurgency against the central government, was turned over to full Iraqi control. The following month Iraq assumed responsibility for some 100,000 (mostly Sunni) fighters; these Awakening Council forces had previously been paid and supervised by the U.S. military.

      At year's end Iraq and the U.S. signed a status-of-forces agreement that called for the removal of allied troops from Iraqi cities by mid-2009 and complete withdrawal of U.S. combat troops by the end of 2011. The agreement also gave Iraqi civilian authorities criminal jurisdiction over off-duty U.S. troops who committed infractions while away from their bases. Incoming U.S. president Barack Obama had campaigned for earlier withdrawal of U.S. forces within 16 months—or by May 2010. Obama later signaled, however, that he would listen to military advice and remain flexible on his timetable.

      By year's end allied forces were withdrawing from Iraq, and the U.S. military presence was diminishing toward presurge levels of 135,000. According to the Associated Press, U.S. troop deaths in 2008 stood at 314, down from more than 900 in 2007. (A total of 4,221 U.S. soldiers had died in the conflict since it began in 2003.) Some Middle East experts suggested that the security improvements were largely the result of internal Iraqi political reconciliation. In a final visit to Baghdad on December 14, however, President Bush declared that his administration's policies deserved credit and called the surge “one of the greatest successes in the history of the United States military.” At a press conference that same day with Iraqi Prime Minister Nuri al-Maliki, in a highly publicized incident, an Iraqi journalist threw two shoes at Bush as a sign of disrespect. Bush ducked the shoes; the journalist was temporarily jailed; and critics noted that such political protest would have been inconceivable in Saddam's Iraq.

      The military progress in Iraq was offset by renewed violence in Afghanistan as Sunni-dominated militant groups, including the Taliban and al-Qaeda, infiltrated more than half of the country and challenged NATO forces. In tacit recognition of the threat, U.S. Army Gen. David Petraeus, architect of the Iraq surge strategy, was elevated in October to head the U.S. Central Command, effectively taking control of allied military strategy in the war on terrorism, including the fighting in Afghanistan.

      As Afghan terrorist violence increased during the year, several NATO countries augmented troop deployments. At year's end the U.S. had about 32,000 of 62,000 NATO combat troops in Afghanistan, including an additional 1,000 sent by President Bush in November as part of what he termed a “quiet surge.” U.S. forces were concentrated in the east, on the dangerous border with Pakistan; the U.S. pursued an active counterinsurgency program on both sides of the border that involved the use of missile-equipped unmanned drone aircraft.

      The Bush administration's legal strategy toward suspected terrorists suffered setbacks during 2008. In June the U.S. Supreme Court ruled, in a 5–4 decision, that even enemy combatants held outside the U.S.—at the U.S. detention facility at Guantánamo Bay, Cuba—had a right to a review of their cases in U.S. civilian courts. The ruling declared unconstitutional parts of two laws approved by Congress after 9/11 that were designed to allow indefinite detention of suspects and their eventual trial by military commissions. It further complicated dozens of pending combatant cases that were already burdened with charges of torture, withholding of evidence, and violations of international law by the U.S. military.

      Two war crimes trials were concluded during the year, the first in the U.S. since World War II. Salim Hamdan, a former driver for Osama bin Laden, was convicted in August on reduced charges of having provided “material support for terrorism.” He received a modest sentence of five and a half years and was released at year's end. A second defendant, Ali Hamza al-Bahlul, a Yemeni accused of having produced propaganda for al-Qaeda, including videos, was convicted by a military commission at Guantánamo Bay in October and given a life sentence. Neither Bahlul nor his attorney participated in his defense.

      In U.S. civilian courts, federal prosecutors won convictions in two antiterrorism criminal cases. In November, after a previous trial ended in a hung jury, the Holy Land Foundation and five former organizers were found guilty in Dallas of having funneled $12 million to the terrorist group Hamas. One observer alleged that the Muslim foundation's practice of supplying cash payments to Palestinian terrorists' families was the moral equivalent of car bombing. In December five foreign-born Muslims were convicted in New Jersey on charges that included having planned to kill U.S. soldiers at Ft. Dix. Defense attorneys claimed that the men were only talking and had planned no real violence, but prosecutors said that the convictions proved the effectiveness of the U.S. post-9/11 strategy of infiltrating violence-prone groups.

Domestic Policy.
      As lawmakers awaited a new administration following the historic win of Barack Obama in the presidential contest, election-year political considerations dramatically slowed the U.S. legislative process. (See Special Report (U.S. Election of 2008 ).) Despite record farm and food prices, Congress approved a $289 billion farm bill renewal that expanded agriculture subsidies and food-assistance programs. Congress also postponed a scheduled 10.6% reduction in physician reimbursements for Medicare, paying for the measure by trimming payments to insurance companies that provided supplemental health care programs. Bush vetoed both measures, but his vetoes were overridden both times. Two bills augmenting veterans' benefits—for housing, health care, life insurance, and family allowances—were signed into law. Another law dramatically expanded G.I. Bill education awards, essentially providing a full college education to veterans who had at least three years of service and allowing benefits to be transferred to family members.

      Preparing to leave office, the Bush administration at year's end proposed several dozen regulatory changes. Among them were provisions for expanding federal land eligible for shale oil development, increasing allowable on-road hours for truck drivers, allowing health care workers to refuse to participate in procedures that violated their moral or religious beliefs, permitting the possession of licensed firearms in national parks, reducing access to Medicaid vision and dental benefits, eliminating consideration of factors such as greenhouse gas emissions in Endangered Species Act reviews, and slowing federal protection for workers exposed to toxic chemicals. Obama transition officials promised to review the entire list in 2009.

Foreign Policy.
      U.S. relations with a resurgent and energy-rich Russia deteriorated further in 2008. Effects of heightened tensions could be seen worldwide as the two countries sparred over missile defense, Latin America, Iraq, Iran, and Russia's invasion of Georgia. In one example, Russia almost single-handedly blocked U.S. efforts to ratchet up UN sanction pressure on Iran over its refusal to allow nuclear inspections. By year's end some commentators were saying that U.S.-Russia relations were at their lowest ebb since the end of the Cold War nearly two decades earlier.

      In April, under U.S. prodding, NATO agreed that it would eventually accept Georgia, Russia's southern neighbour, as a member—even though Russia opposed NATO's eastward expansion and viewed it as a security threat. Four months later, Russian troops invaded two rebellious Georgian provinces, South Ossetia and Abkhazia, and recognized them as independent states. NATO stepped up its military presence in the region, with U.S. warships delivering relief supplies to Georgia via the Black Sea. In what was widely viewed as a response, Russia dispatched a military flotilla to Venezuela in November in a show of support for Pres. Hugo Chávez, a critic of the U.S., and at year's end Moscow also staged a rare Russian navy visit to Cuba.

      With Chávez and Cuba's Raúl Castro in the lead, Latin American leaders formed a South American union (Unasur) and took other steps aimed at reducing U.S. influence in the region. A group of 33 countries staged a summit meeting in Brazil in December, pledging internal cooperation and welcoming Cuba, having declined to invite U.S. representatives.

      Efforts to prevent nuclear weapons proliferation suffered setbacks during the year. No progress was made in stopping nuclear development in either Iran or North Korea or in numerous Middle Eastern countries that were nervous about a potential threat from Iran; a number of Middle Eastern countries had initiated steps toward starting their own nuclear programs. Iran, continuing to insist that its nuclear development was solely for civilian energy purposes, persisted in stonewalling international watchdogs, even while Russia supplied Iran with uranium for enrichment and processing that could be diverted to weapons purposes. At midyear, in Geneva, U.S. authorities engaged in direct talks with Iranian nuclear negotiators for the first time and also joined major powers in offering yet another package of incentives for Iranian abandonment of its nuclear ambitions. Iran continued to obfuscate, however, and Congress tightened U.S. economic sanctions on Iran in September.

      After agreeing in 2005 to scrap its nuclear weapons program in return for normalized world relations, North Korea accepted promised food and fuel assistance from the U.S. and allies. As a show of good faith, Pres. George W. Bush removed Pyongyang from an international blacklist as a state sponsor of terrorism. In December five countries met to persuade North Korea to accept a verification regime written by its ally, China. The talks collapsed, however, when the North Koreans refused to sign the agreement, with analysts speculating that they were waiting for more favourable terms from the new U.S. administration. Prior to the breakdown, the U.S., Russia, China, and South Korea had already delivered 500,000 tons of fuel oil promised to North Korea for its cooperation.

      The U.S. continued to push for rapprochement between India and Pakistan, both to facilitate critical support for antiterrorism efforts and to counter growing Chinese influence in Asia. In October the U.S. signed an agreement to supply technological aid for India's nuclear program, even though India had tested nuclear weapons and refused to sign the Nuclear Non-proliferation Treaty. In November, after Pakistan-based terrorists staged a bloody raid on Mumbai (Bombay), U.S. Secretary of State Condoleezza Rice visited the subcontinent to pressure both countries to continue normalizing relations. (See Special Report: Terror in Mumbai.)

David C. Beckwith

Developments in the States 2008
      The national economic recession hit U.S. states with a vengeance in late 2008, throwing budgets deeply into the red and prompting forecasts of even more financial trouble ahead. Forced to balance their books, a few states raised taxes or fees to generate new revenue. Most states, however, tightened their belts—postponing or canceling new programs, laying off state employees, and trimming spending across the board to weather the fiscal storm. The action came as state capitals continued to wrestle with a host of issues left unresolved on the federal level, including immigration, global warming, children's health insurance, and education reform. Regular legislative sessions were held in 44 states during the year, and 22 states staged one or more special sessions, often to deal with financial issues.

      Eleven states held gubernatorial elections, and Democrats took over the Missouri governorship previously held by Republicans; this left the prospective 2009 governorship lineup at 29 Democrats and 21 Republicans. Legislative elections were staged in 44 states and resulted in modest gains for Democrats. Republicans won control of the Montana Senate and the Tennessee House and Senate, all previously tied or held by the other party. Democrats, however, took charge in the Delaware House, New York Senate, Nevada Senate, Ohio House, and Wisconsin Assembly. The Alaska Senate, previously Republican, and the Montana House, previously Democratic, were tied. That meant that Democrats had two-chamber control of 27 state legislatures, Republicans dominated in 14 states, and control was split or tied in 8 others. Nebraska had a nonpartisan unicameral legislature.

Structures, Powers.
      Voters in three states—Connecticut, Hawaii, and Illinois—rejected ballot measures authorizing conventions to write new state constitutions. Opponents said that the conventions could be hijacked by special interests—including opponents of same-sex marriage—and were an inefficient way to resolve local governmental concerns. California and New York became the first states to create a cabinet-level position to oversee volunteer and charitable activity.

      Arkansas became the 45th state to authorize annual legislative sessions. South Dakota voters decided to keep the state's term limits for legislators. By a narrow margin, California voters endorsed a proposal to have state legislative districts drawn every 10 years by a citizen panel instead of by the legislature itself.

      As a mid-decade housing boom turned to bust, state revenue projections declined early in 2008, sending state authorities scrambling for cost savings. The outlook turned even bleaker in the fall as the financial crisis accelerated the U.S. descent into recession and pushed most state budgets into deficit. States were hit particularly hard by the slowdown because sales taxes and property-transfer levies fell, while state spending on unemployment assistance, Medicaid, and other benefits rose quickly. Among the hardest-hit states were California, which was forced to lay off thousands of state workers, and New York, which depended on Wall Street transactions for one-fifth of state revenue. (See Special Report: Financial Crisis of 2008.)

      Most states were required to balance their budgets every year. Spending restrictions were enacted in some 40 states, often targeting health care and even education, the biggest items in most state budgets. The National Conference of State Legislatures reported that states found $40 billion in cost savings or additional revenue during the year but still faced an additional $97 billion in deficits for the 2009 and 2010 fiscal years. At year's end, governors petitioned President-elect Barack Obama for federal infrastructure assistance and for increased federal funds to help defray fast-rising Medicaid, unemployment insurance, and food-stamp costs.

      In November balloting, Colorado voters refused to repeal the state's strict limits on increased spending. Voters in North Dakota turned down a proposal to halve the state's income tax, and Massachusetts voters rejected the abolition of the state income tax. Maine voters voided a legislative plan to increase taxes on beer, wine, and soft drinks.

      Several states took steps to mitigate the mortgage crisis. North Carolina approved a foreclosure-prevention law offering state mediation assistance for borrowers. Twenty-nine states tightened laws covering mortgage licensing, and four—Kentucky, Maryland, Utah, and Washington—established mortgage fraud as a crime. Seven other states moved to curb foreclosure-rescue scams.

      At midyear, with energy prices at record levels, the country's governors sought a doubling of the federal government's low-income heating-assistance program. Energy prices dropped markedly in the fall, however, and the anticipated crisis disappeared. New York became the first state to force online retailers to collect sales taxes; e-commerce company Amazon quickly filed suit in an attempt to void the law.

Social Issues.
      The Supreme Courts of California and Connecticut established same-sex marriage as a state constitutional right during the year, making those states the first to legalize same-sex unions since the top Massachusetts court authorized full marriage rights for homosexuals in 2003. The California decision, handed down in May, was quickly challenged, however, and in the November election was overridden (52–48%) by state voters. The ballot result sorely disappointed gay rights advocates who had hoped for the first major voter ratification of same-sex marriage, and it also called into question the legality of the 18,000 marriages performed in California in the months following the court decision. New York's governor announced that the state would recognize gay marriages performed elsewhere. Even so, voters in Arizona and Florida banned same-sex marriage in their states, and in a related measure Arkansas voters required that foster parents be a married couple. At year's end 40 states had specifically outlawed same-sex marriage, through either state law or constitutional amendment, while 11 states and the District of Columbia legally recognized some form of domestic partnership, civil union, or gay marriage.

      Nebraska became the fourth state to ban race-based preferences in state hiring, contracting, and educational admissions decisions. A similar referendum, however, failed on a close vote in Colorado, the first defeat for such an anti-affirmative-action measure.

      Right-to-life advocates suffered reverses during the year. Washington voters joined Oregon in approving “death with dignity” acts allowing physician-assisted suicide. Michigan voters terminated a long-standing ban on embryonic stem cell research. South Dakota voters turned down a highly restrictive proposition banning abortion except in cases of rape, incest, or danger to the mother's health. For the second time, California voted down a ballot measure requiring parental notification before a minor could obtain an abortion.

Law, Ethics.
      Deadlock within the federal government on immigration reform led to state legislative action, but no consistent pattern developed. The administration of Pres. George W. Bush moved to head off a growing revolt over Real ID, a 2005 federal law requiring states to verify the identity of all drivers and issue tamper-proof licenses, a measure that states said was too costly and infringed on privacy rights. Facing widespread foot-dragging and noncompliance, the administration gave all states two additional years to conform. Oregon and Texas banned illegal immigrants from obtaining driver's licenses, and California's governor vetoed a legislative measure allowing them to be licensed. Seeking to combat accidents involving undocumented immigrants, Georgia upgraded to felony status a repeat conviction of driving without a license. Georgia and Mississippi increased mandatory use of the federal E-verify system to combat the hiring of illegal immigrants, but a U.S. judge blocked a similar Oklahoma law. Arizona voters refused to amend a controversial law that cracked down on employers who knowingly hired illegal immigrants. Oregon voters defeated a ballot measure restricting bilingual education.

      Arkansas voters, seeking to fund college scholarships, approved the 43rd state lottery. Maryland legalized slot machines at racetracks. Ohio and Maine voters rejected new casinos, but Colorado and Missouri voters expanded casino games and hours of operation. Voters in Massachusetts decriminalized the possession of one ounce or less of marijuana, and Michigan became the 13th state to allow marijuana for medical use. California voters rejected a major drug-law rewrite that would have decriminalized possession of small amounts of marijuana.

      Six states increased penalties for dog and other animal fighting. Massachusetts banned greyhound racing. Concealed-carry gun laws continued to expand: Florida allowed permit holders to take weapons to work (if they were left in a parked vehicle), and Georgia allowed guns in restaurants, parks, and public transit. Alaska, Indiana, Georgia, and Tennessee toughened laws against Internet predators.

      The year produced numerous ethics investigations, one involving Alaska Gov. Sarah Palin, whom state legislators accused of having improperly fired the state public-safety commissioner. A special counsel exonerated her one day before the November election, in which she ran as the Republican vice presidential candidate. New York Gov. Eliot Spitzer was forced to resign after he admitted having engaged a prostitute. Ohio Attorney General Marc Dann also resigned in a sexual-harassment scandal. In December, Illinois Gov. Rod Blagojevich was arrested by federal agents and charged with conspiracy to commit fraud and solicitation of bribery, including an alleged attempt to sell Barack Obama's vacated U.S. Senate seat.

      State use of the death penalty was suspended early in the year while the U.S. Supreme Court reviewed the constitutionality of lethal injections. After executions resumed in May, the use of capital punishment continued to decline. During 2008, 37 inmates were executed, down from 42 in 2007. Florida enacted a statute setting compensation for wrongful criminal convictions; the amount was $50,000 for every year served in prison.

Health, Welfare.
      Iowa became the 28th state to ban smoking in any public place, including bars and restaurants. Six additional states (for a total of 28) required cigarettes to be wrapped in self-extinguishing paper to prevent fires; this effectively made the requirement national.

      Health-conscious California became the first state to ban trans fats and also the first to require posting of calorie and nutritional content on fast-food menus. In another antiobesity move, five states boosted the mandatory time that schoolchildren must spend at recess or gym classes.

      Budget problems forced several states (including California, Illinois, Missouri, New Mexico, and Pennsylvania) to postpone expansion of state health insurance coverage. Iowa, Colorado, and Montana expanded children's health care, and Florida and Maine increased funding for their novel health insurance assistance programs. New Jersey became the first state to require that all children have health insurance, though the measure contained no enforcement clause.

      New Jersey joined California and Washington in mandating that employers provide up to six weeks of paid leave annually to care for family members, but funding for Washington's law never materialized during the year. Seeking to curb infant deaths, Nebraska on July 1 became the last state to provide a “safe haven” for unwanted children. The new law failed to specify any age parameters, however, and within a few months nearly three dozen older children, several from other states and some as old as 17, had been legally handed over to state care. At year's end Nebraska legislators amended the law with a 30-day age limit.

Environment, Education.
      California became the first state to enact a law encouraging home building in areas near workplaces and public transportation; the measure was designed to curb suburban sprawl and air pollution. Connecticut joined four other states in capping greenhouse gas emissions, and Delaware, Florida, and New Hampshire also approved measures to reduce emissions blamed for global warming. Delaware approved a major offshore wind energy project.

      Massachusetts became the first state to exempt non-food-based biofuels from state gasoline taxes and also approved a unique plan to manage its waters as a wind, wave, and tidal energy resource. Alaska issued a license for a $20 billion natural gas pipeline. Meanwhile, California voters approved nearly $10 billion in bonds for high-speed-rail construction between Los Angeles and San Francisco.

      Minnesota voters set aside a percentage of state sales-tax revenue for wetland protection. Alaska voters, seeking to protect moose, approved a game-management program that allowed the shooting of wolves from airplanes. Missouri voters approved a measure that required utilities to produce 15% of energy through renewable sources by 2021, but Californians rejected a more drastic requirement of 50% by 2025. Hawaii became the first state to require solar-powered water heaters in new homes.

      Some 32 states increased funding for prekindergarten education programs, but some plans were trimmed in late-year budget cutting. A shortage of funds torpedoed Arizona Gov. Janet Napolitano's plan to grant free public college tuition to all high-school graduates who had at least a B average.

      Reacting to high accident rates, several states tightened restrictions on new drivers, particularly teenagers. Virginia established a “baby DUI” law for teen drivers, setting a strict .02 blood-alcohol standard, one-quarter the allowable level for adults. California banned teens from using cell phones while driving and also outlawed text messaging for all drivers. California and Washington joined three states that banned motorist use of handheld cellular phones. Arizona and Ohio voters rejected proposals to tighten restrictions on “payday lenders” accused of predatory business practices.

David C. Beckwith

▪ 2008

Area: 9,366,006 sq km (3,616,235 sq mi), including 204,083 sq km of inland water but excluding the 156,049 sq km of the Great Lakes that lie within U.S. boundaries
Population (2007 est.): 302,633,000
Capital: Washington, D.C.
Head of state and government: President George W. Bush

      For four years the United States economy had expanded robustly and virtually without incident, shrugging off concerns about potential overextension in a costly and deteriorating military expedition in Iraq, but 2007 brought abrupt change. A long-shot plan to temporarily increase the U.S. military presence seemed to work, reestablishing hope for a stable Iraq and easing pressure on an unpopular president—even as a fast-appearing disaster in the U.S. housing and financial sectors disrupted world markets and threatened to plunge the U.S. economy into recession. (See Sidebar: Subprime Mortgages: A Catalyst for Global Chaos.)

War on Terrorism.
      As 2007 began, the U.S.-led international coalition in Iraq was fraying noticeably, American casualties were rising, and the newly elected Democratic congressional majority was demanding a prompt U.S. exit. Facing a humiliating forced withdrawal and likely defeat in his quest to establish a stable Middle East democracy, Pres. George W. Bush decided instead to replace his military leadership and escalate the U.S. military presence in the conflict. The bold plan attracted comparisons to the disastrous U.S. experience in Vietnam and ran counter to majority opinion—from Congress, the Iraq Study Group, and even U.S. public opinion polls. The conflict also led to the bloodiest year yet for U.S. troops fighting the war on terrorism.

      After a spike in violence at midyear, reinforced coalition forces in Iraq were able to forge cooperation pacts with numerous factions and root out terrorists in Baghdad and elsewhere. Security began improving dramatically across Iraq. As 2007 ended, Bush appeared to have won his last-minute gamble, at least temporarily, and bought more time for his policies.

 Bush vowed in January to augment the 132,000 Iraq-based U.S. forces with 30,000 reinforcements. Fresh troops began arriving within weeks, taking on both Sunni and Shiʿite militias for control of Baghdad neighbourhoods and creating alliances with tribal chiefs to combat suspected al-Qaeda fighters in outlying provinces. The military also provided financial support to Awakening Councils, formed by Sunni sheikhs and designed to turn Iraqi neighbourhoods against foreign terrorist fighters by appealing to the residents' nationalist sentiment. The efforts were particularly successful in the unruly western Al-Anbar province, where previously hostile tribes began turning against al-Qaeda.

      By fall, as the U.S. surge reached its peak, some 160,000 U.S. troops were on Iraqi deployment, and congressional opposition to the plan grew ferocious. The new U.S. military commander, Army Gen. David Petraeus, was summoned to Washington in September to answer skeptics and defend his cautious claims of progress. One prominent antiwar group, MoveOn.org, ran a controversial full-page newspaper ad questioning General Petraeus's credibility and patriotism. Throughout the year Congress held more than 80 votes designed to reduce funding or force U.S. withdrawal from Iraq, but President Bush was able to obtain $200 billion for the war in three emergency spending bills that were eventually approved without strings attached.

      By October it had become obvious that the insurgency—bombings, attacks, and both civilian and military deaths—was losing momentum rapidly. At year's end violent incidents were down by two-thirds, Iraqis had taken over security in many areas, and officials were able to announce initial U.S. troop withdrawals. Even so, U.S. military deaths in Iraq reached 899 for the year, the highest number since the 2003 U.S.-led incursion. The U.S. monthly death toll peaked in May at 126, but it dropped to 37 in November and 23 in December. Although fighting between Islamic factions was also reduced during the year, critics pointed out that the Iraqi government had failed to make substantial progress in achieving national reconciliation.

      In Afghanistan Islamic radicals continued their resurgence following the 2001 invasion by a U.S.-led coalition that toppled the Taliban from power. Sheltered in sanctuaries in lawless tribal areas of western Pakistan and financed in part by opium production, Taliban fighters escalated armed clashes in remote areas, at times retaking effective control of up to half the country. For the first time U.S. troop deaths topped 100, and the overall 232 coalition deaths for the year were almost evenly divided between U.S. and other NATO forces. For much of the year, the U.S. encouraged its allies to substantially increase their troop commitments in Afghanistan but with little success, and some observers suggested that the U.S. would soon be compelled to increase its own presence in Afghanistan even as it drew forces down in Iraq.

Domestic Policy.
       Democrats took control of Congress from scandal-plagued Republicans in January, promising major changes in national priorities, ethics, entitlements, health care, fiscal policy, and the war in Iraq. However, President Bush and the Republican minority, utilizing veto threats and procedural rules, managed to alter many Democratic initiatives and halt others altogether.

      The tone was set early in the year when the House of Representatives quickly approved virtually all of the Democrats' “Six in '06” campaign promises—including stepped-up stem-cell research, a minimum-wage increase, reduced-cost student loans, and mandatory negotiation on Medicare drug prices. All of the initiatives bogged down in the Senate, where a 60-vote supermajority was often necessary to move legislation. By May none of the Democratic legislation had reached Bush's desk.

      In the spring, antiwar Democrats attached an amendment to a $90 billion supplemental Iraq War appropriation requested by President Bush, setting a timeline for withdrawal of U.S. forces. Bush rejected the measure—only his second veto in more than six years in office. Amid GOP warnings that U.S. troops needed resupply, Democrats were forced to pass the supplemental without timeline amendments.

 Relations between Congress and the administration were contentious. The new congressional majority launched numerous investigations of past administration actions; one inquiry into the firing of eight U.S. attorneys in 2006 led to the resignation of Attorney General Alberto Gonzales, who was replaced in November by Michael Mukasey. Bush was also weakened by the conviction in March of former vice presidential aide I. Lewis (“Scooter”) Libby on charges of having lied to a special counsel about his involvement in the leak of a covert CIA officer's identity. Bush commuted Libby's two-and-a-half-year prison sentence on the eve of Libby's incarceration.

      The pace of congressional legislation was glacial. Immigration reform, frustrated by the U.S. House in 2006, appeared headed for passage early in the year when the Bush administration started negotiations with Republican and Democratic Senate leaders on a compromise bill. Initially, in a test vote, 69 Senators signaled willingness to consider the measure, but the bill rapidly lost support on the Senate floor. The measure allowed most illegal aliens to stay in the U.S. and earn permanent status by paying taxes, learning English, and avoiding criminal activity, but it was quickly denounced by opponents as an amnesty that would encourage future disrespect for border laws. After three weeks of contentious debate, supporters agreed to add a “touchback” provision requiring undocumented aliens to return to their home countries at least briefly before receiving legal status. In the critical procedural test, however, only 46 Senators (of 60 required) voted to pursue the legislation, and comprehensive reform died again.

      President Bush vetoed five additional bills during the year and threatened rejection of some 50 more while seeking to force legislative changes. Bush vetoed expansion of federal funding for embryonic-stem-cell research and thus killed the measure again. Republicans stymied the plan to force pharmaceutical companies to negotiate with the government over Medicare drug prices. An ambitious bipartisan bill to double expenditures on the State Children's Health Insurance Program was vetoed twice; at year's end Congress extended the existing bill for 18 months, removing the issue from the 2008 election cycle. Congress was able to override only one Bush veto—an appropriations measure containing numerous local infrastructure projects, including funding for rebuilding the hurricane-devastated Gulf coast.

      The new Congress was ultimately successful at midyear in raising the minimum wage for the first time in a decade; the law increased the rate from $5.15 to $7.25 per hour by 2009. A Democrat-led effort to provide additional assistance to students by expanding Pell Grants and reducing interest rates on student loans also became law.

      Congress provided a record funding increase for veterans' health care programs and significantly tightened Washington lobbying and ethics rules. Critics noted that the new rules did not directly address concerns over rapidly expanding congressional earmarks—projects inserted in appropriations bills by individual lawmakers—and President Bush complained that a massive spending bill at year's end contained more than 9,800 such additions, with an estimated cost of $10 billion. Congress also provided a one-year fix for the Alternative Minimum Tax (AMT), a 1978 law originally written to ensure that the wealthy paid at least minimal taxes. For 2007, owing to inflation, the AMT threatened some 23 million taxpayers. After extended negotiations, the $50 billion AMT expansion was suspended, and the lost government revenue was to be added to the federal deficit.

      As oil prices moved close to $100 per barrel during the year, Congress passed new energy legislation to expand alternative energy sources, increase vehicle mileage standards by 40% (to an average of 35 mi per gal by 2020), and phase out incandescent light bulbs in favour of fluorescent lighting. Yet another threatened veto forced removal of provisions rolling back tax breaks to oil and gas companies, and Bush thus had successfully stopped any major tax increase during the year.

The Economy.
      Excesses in the domestic housing sector finally caught up with the U.S. economy during 2007, causing major disruption among financial firms and marring an otherwise solid sixth consecutive year of economic growth. The turmoil had worldwide ramifications. In most recent years the U.S. economic engine had pulled the global economy forward. In 2007, however, with the U.S. overextended and struggling, less-developed economies in India, China, Russia, and elsewhere shared the mantle of world economic leadership.

      Spurred by near 5% growth in the third quarter, the U.S. economy expanded by almost 3% for the year, close to its long-range potential. Employment increased every month, setting a national record of 52 consecutive months of net job growth. Some 1.5 million new payroll jobs were created, and although the unemployment rate moved upward from 4.4% to 5.0% during the year, employers continued to report worker shortages in many areas.

      The positive statistics, however, masked tumult caused by an urban housing bubble earlier in the decade. Brokers had helped fuel a boom in home construction and resales by offering adjustable-rate mortgages at low initial interest rates. The easy cash drove home prices up markedly, producing an overheated real-estate market that peaked in 2005. The new mortgages were typically packaged together and resold as securities to banks and other investors in the U.S. and worldwide. By mid-2007, however, it had become clear that a substantial minority of homeowners could not make their payments when their interest rates were adjusted upward. That led to rising delinquency rates and foreclosures, and an estimated $500 billion worth of “subprime” mortgage securities were devalued, which reduced the lending capacity of financial institutions.

      With equity markets in turmoil, federal officials in August changed signals and began easing short-term interest rates, which had remained largely unchanged for a year. In an effort to avoid an economic slowdown, the U.S. Federal Reserve Board (Fed) lowered the key federal funds rate by one-half percentage point in August and followed with two additional quarter-point reductions in the fall. The administration also sought voluntary private-sector cooperation to ease the crisis, including a controversial plan to freeze interest rates temporarily. The action came too late, however, to forestall multibillion-dollar losses reported in the fall by holders of subprime paper. The chairmen of Citibank and Merrill Lynch resigned under pressure, and several major financial institutions were forced to seek infusions of foreign funds to bolster their books.

      Other economic news was mixed. With energy prices again rising, the threat of inflation reappeared, and the consumer price index topped 3% for the year, well above the Fed's guidelines. National workplace productivity, a key measure of economic efficiency, resumed substantial growth after a brief slowdown. As expanding economic activity bolstered revenues, the federal budget deficit declined again in 2007, to $163 billion. As the U.S. trade deficit continued at a historic peak, the U.S. dollar suffered, losing 10% of its value to the euro during the year.

      Overall, despite turmoil among financial firms, Wall Street ended a turbulent year with solid, if unspectacular, gains. Equity markets rallied following the August interest-rate cut but gave back most of the year's gains later in the year. The broad Dow Jones Wilshire 5000 index finished up by 3.9% for the year, while the Dow Jones Industrial Average gained 6.4%. At year's end, however, consumer and investor confidence was dropping, and economists were divided on whether the national economic expansion would continue into 2008.

Foreign Policy.
      With its attention and resources concentrated on Iraq and Afghanistan, the U.S. was unable to focus sustained diplomatic attention on overseas issues and recorded little real progress during 2007. One apparent exception involved North Korea, which President Bush in 2002 had named as one of three “axis of evil” countries because of weapons exports and support for terrorism. For four years Japan, Russia, China, the U.S., and South Korea had negotiated with North Korea to dismantle its fledgling nuclear weapons capacity. On September 3, however, diplomatic negotiators announced that North Korea had agreed to catalog and dismantle its nuclear testing sites and would in turn receive a $300 million aid package. At year's end North Korea failed to honour yet another disclosure deadline, but diplomats remained optimistic that a breakthrough had been achieved.

      Efforts to prevent Iran from achieving nuclear capability were largely unavailing. After Iran denied UN inspectors access to suspected weapons sites, the Security Council in March unanimously approved a resolution tightening international economic sanctions—again, with few ascertainable results. In early December U.S. intelligence agencies released a surprise consensus National Intelligence Estimate (NIE) declaring with “high confidence” that Iran had abandoned its pursuit of nuclear weapons capacity in 2003—reversing a 2005 “high confidence” estimate by the same agencies that Iran was rapidly developing that weaponry. The new NIE undermined the international consensus seeking to stop the Iranian nuclear development, and critics charged that U.S. intelligence officials were effectively overturning Bush administration policy.

       Russia maintained substantial trade with Iran, including its first delivery of uranium fuel, and U.S. relations with Russia continued to slowly deteriorate during the year. At midyear President Bush invited Russian Pres. Vladimir Putin to Kennebunkport, Maine, in an unsuccessful attempt to warm up bilateral relations. U.S. officials were openly critical of Putin's centralization of control over the Russian government, suggesting democracy was being undermined. After the U.S. pushed toward installing missile defense shields in Poland and the Czech Republic, Putin announced that Russia would suspend its participation in the 1990 Conventional Forces in Europe (CFE) Treaty, a key arms-control agreement.

      U.S. policy toward Asia was dominated by the growing influence of China, a current trading partner viewed as a future economic or even military rival. The U.S. filed three World Trade Organization complaints during the year against China, which nonetheless continued to enjoy a huge export advantage in the bilateral trade balance. In the spring China was forced into massive recalls of substandard products shipped to the U.S., including defective tires, tainted pet food, and toys with lead paint. (See Special Report: Perils of China's Explosive Growth.)

      The U.S. concentrated on tightening relations with India and an increasingly unstable Pakistan in an effort to counter China's growing influence. The U.S. signed a controversial agreement with India to facilitate production of domestic nuclear power, even though the deal arguably infringed on international nuclear nonproliferation agreements. Relations with India's rival, Pakistan, were rockier. The U.S. pushed the Pakistani military regime for democratic reform even while seeking from Pakistan additional military action against Taliban fighters attempting to destabilize Afghanistan. The U.S. openly criticized Pakistani Pres. Pervez Musharraf's declaration of a short-lived state of emergency in November but was less outspoken when President Musharraf's chief rival, former prime minister Benazir Bhutto, was assassinated in late December.

      The U.S. was able to claim closer ties with one of its traditional major allies following the presidential elections in France. The warming was especially noteworthy because for years France had been openly critical of U.S. policies in Europe and the Middle East.

      With the U.S. Senate bogged down in partisan gridlock, international treaties received scant attention. Over opposition from trade unions, the Senate finally approved a free-trade agreement with Peru, but similar proposed pacts with Colombia, Panama, and South Korea languished at year's end. A Senate committee voted 17–4 in late October to ratify the decades-old Law of the Sea treaty, which had previously been signed by virtually every other country, but conservatives argued that the treaty would grant the UN powers that rightfully belonged under exclusive U.S. sovereignty. The full Senate did not take up the treaty by year's end.

 President Bush attempted to counter a distinct regional movement toward socialism and the growing influence of Venezuelan Pres. Hugo Chávez by visiting five Latin American countries in the spring. Chávez, the most visible manifestation of a discernible leftward shift in Latin American politics, stepped up his anti-American rhetoric during the year and established a close relationship with U.S. adversaries such as Iran. U.S. influence was bolstered when Chávez appeared to overreach and narrowly lost a December referendum that would have allowed him to rule the oil-rich country indefinitely.

      U.S. policy was severely tested in two international conferences at year's end. Responding to complaints about the lack of leadership toward Middle East peace, U.S. Secretary of State Condoleezza Rice convened a 40-country summit in Annapolis, Md., in late fall. The conference, which included the Israeli and Palestinian leaders, ended amiably with mutual vows to draft another framework for peace in 2008. The U.S. found itself isolated at a UN-sponsored conference on global warming held in Bali, Indon., in December. Criticized for its failure to ratify the 1997 Kyoto Protocol and largely abandoned in public sessions by other major industrialized countries, the U.S. delegation reversed itself in mid-conference and agreed to a new process that promised involvement of less-developed countries, speedier antipollution technology transfers to Third World countries, and development of a worldwide plan to combat global warming by the end of 2009.

David C. Beckwith

Developments in the States
      Reacting to perceived failures by the federal government, U.S. states moved forward in 2007 on several lawmaking fronts, including health care, immigration, security, climate change, and other areas heretofore considered national issues. The tension with Washington, D.C., enlivened an active year for state governments and resulted in a marked deterioration of fiscal balances by year's end. All 50 states staged legislative sessions during the year, and 22 states held one or more special sessions.

Party Strengths.
       Democrats recorded gains in limited state elections during the year. In governorships Republicans took over a Democratic seat in Louisiana but were ousted in Kentucky, thereby maintaining the Democratic advantage at 28–22. Democrats also added to their majority control in legislative balloting, taking over the state Senates in Virginia and Mississippi and adding seats elsewhere. In 2008 Democrats would have control of both legislative chambers in 23 states, and the GOP would dominate in 14, with 12 split or tied. (Nebraska has a nonpartisan, unicameral legislature.)

Structures, Powers.
      Several states took action to safeguard ballot procedures. Montana and South Dakota moved to curb abuses in citizen initiatives by prohibiting signature gatherers from being paid by the signature. After enacting a model election-reform law in 2001 in response to the presidential election debacle the previous year, Florida was forced to revisit the subject; this time the state required a paper trail in all electronic voting machines. Iowa, Maryland, and Virginia approved similar mandates; as a result, 27 states required a paper record for auditing purposes.

Government Relations.
      A rare revolt over mandates in a federal law broke into the open during 2007 when several state legislatures—including New Hampshire, Montana, Oklahoma, and Washington—refused to comply with the Real ID Act. Many others also took preliminary steps in the same direction. The federal law required states to verify the identity of all 245 million licensed drivers and to impose other security features, at an estimated cost of $14 billion.

      Real ID had been under fire since its passage as an antiterrorism measure in 2005. States objected to its cost; civil libertarians raised privacy concerns; and immigrant rights groups objected to provisions impairing states' ability to grant driver's licenses to noncitizens. Under the act, licenses that did not comply could not be used as identification for entering airports or federal buildings. Tennessee followed North Carolina in denying driver's licenses to illegal immigrants during 2007, and a proposal by New York's governor to issue licenses to illegal residents was abandoned after widespread criticism.

 Tension over funding and control of state National Guard troops continued to simmer. After Louisiana's governor turned down a federal National Guard takeover in the wake of Hurricane Katrina in 2005, Congress permitted its federalization in future disasters, which prompted objections from numerous governors. Kansas Gov. Kathleen Sebelius implied that Iraq deployment had reduced the ability of her state's National Guard to assist when tornadoes devastated Greensburg in May. Under pressure from the White House, however, Sebelius said that her real worry was preparedness for possible future needs.

      After two years of bustling revenue and spending growth, states tightened their belts in 2007 as national economic growth slowed at year's end. Tightening mortgage standards helped depress the housing market in many areas, and sales and real-estate tax collections slowed markedly. Arizona, California, Florida, Illinois, Maryland, Michigan, Virginia, and Wisconsin were among the states facing major budget deficits. (See Sidebar: Subprime Mortgages: A Catalyst for Global Chaos.) With its auto industry slumping, Michigan continued in what became known as a “one-state recession.”

      State general fund expenditures rose by 9%, paced by increases in Medicaid and pension spending. For the first time, owing to rising health care costs, state spending on Medicaid programs for low-income individuals in 2007 surpassed state expenditures on K–12 education. Numerous states adjusted tax rates, but overall changes in revenue collections were minor by historical standards. Twenty-four states reduced personal income taxes and four raised them, saving taxpayers $1 billion, while 22 states lowered sales taxes and two increased them. Nine states, led by Michigan and New York, increased corporate income taxes. The biggest revenue increases were assessed against tobacco products, with eight states raising cigarette taxes by a combined $761 million. No state adjusted alcohol taxes during the year. Two states raised motor fuel taxes, and 13 boosted motor vehicle and other user fees.

      As the real-estate slump deepened late in the year, states began tapping their “rainy day” funds, carryover balances, and other reserves to combat looming budget deficits. More than a dozen states raised alarms over pending budget deficits, increasing the prospect of further belt tightening or tax increases in 2008.

Law and Justice.
      Opposition to capital punishment gained significant ground during 2007, fed by concerns over wrongful convictions and the humaneness of executions. In late September the U.S. Supreme Court announced that it would review whether the lethal-injection method used in virtually all executions constituted “cruel and unusual punishment.” States immediately suspended executions for the rest of the year. Only 42 inmates were executed during 2007, the fewest since 1994. At year's end New Jersey became the first state in 42 years to abolish capital punishment; death penalty statutes remained on the books in 36 states.

      Though Congress failed to enact immigration reform, 46 states enacted immigration-related legislation. Several moved to increase employer responsibility for ensuring that their workers were in the U.S. legally. Arizona, Nevada, Oklahoma, Tennessee, and West Virginia joined Colorado and Georgia in restricting immigrant services or increasing enforcement penalties against illegal aliens. The Arizona law suspended the license of any business convicted of having hired illegal aliens; a second offense was grounds for permanent revocation. Arizona's governor also deployed National Guard forces to assist in border enforcement. Oklahoma prohibited the hiring or the transporting of illegal workers and banned undocumented aliens from receiving public benefits. A dozen other states approved a variety of measures that cracked down on identity fraud or required proof of legal status to receive public benefits. California specifically extended public benefits to migrant workers, and Illinois became the first state to prohibit officials from checking identities by using a federal database.

      After National Football League star Michael Vick was arrested on federal dogfighting charges, several states moved to bolster animal cruelty laws. New Mexico and Louisiana became the last two states to ban cockfighting, although the Louisiana law would not take effect until August 2008.

      States continued to expand legalized gaming during the year. Seeking a greater share of profits, Kansas became the first to authorize large-scale casino resorts owned and operated by the state. Indiana joined 11 other states that allowed slot machines at horse tracks, and legislation was pending in Maryland and Michigan; West Virginia added table games at casino racetracks. Maine voters rejected a harness-racing track with slot machines, but Florida and California were among the states that allowed expanded gaming in Native American casinos.

Health and Welfare.
  Health care issues—including access, cost, and delivery—dominated legislative agendas during 2007. The debate came as the federal government moved slowly to expand and reform its State Children's Health Insurance Program (SCHIP), which some states had used to cover parents, single adults, and middle-class families. New York and New Jersey helped to fuel the controversy by seeking federal matching SCHIP funds for families earning up to 400% of the poverty income level.

      Illinois became the first state to guarantee health insurance to all children. Florida and Indiana initiated closely watched experiments in Medicaid reform, expanding coverage while trying to hold down costs through insurer competition and requiring recipients to contribute to personal health savings accounts.

      More states moved toward universal health insurance coverage. California's governor proposed a $12 billion plan to cover all state residents—learning from universal coverage experiments under way in Maine, Massachusetts, Vermont, and Hawaii—but the initiative bogged down in the state legislature.

      Eight additional states—Illinois, Maryland, Minnesota, New Hampshire, New Mexico, Oklahoma, Oregon, and Tennessee—banned smoking in public areas and places of employment, including restaurants and bars. By 2008 a total of 31 states would mandate smoke-free environments.

      Bucking a national trend, Oregon voters turned down a proposal that would raise tobacco taxes to finance increased health insurance for children. State stem-cell research had a mixed year; voters in New Jersey rejected a major bond issue related to such research, but New York budgeted $600 million over 10 years. Texas voters approved $3 billion in bonds for cancer research. Texas and Florida joined New Jersey in testing high-school athletes for steroid use. Nearly half the states considered requiring schoolgirls to be vaccinated against the human papillomavirus (HPV), which causes cervical cancer, but only one—Virginia—enacted a statewide program (parents were allowed to opt out, however).

      Programs that would allow for choice in K–12 education made minimal progress during 2007; in the past, such programs had been on the rise. Utah became the first state to enact a universal voucher law that allowed any child to receive public funds to attend private school, but Utah voters repealed the measure in November. Three states expanded voucher programs but only for students with disabilities.

      Washington joined California in requiring employers to grant paid leave of up to $250 per week for parents with newborn children. Illinois became the 12th state to require a mandatory daily moment of silence in public schools. Maryland became the first to enact a “living wage” law that required state contractors to pay their employees at least $11.30 per hour. Alabama, Maryland, North Carolina, and Virginia legislatures expressed remorse for their states' past support of slavery.

      Advocates of equal rights for homosexuals made progress during the year. New Hampshire became the fourth state to approve civil unions, giving same-sex couples all rights granted under traditional marriage laws. Oregon and Washington joined California, Maine, and Hawaii in enacting domestic-partnership laws, with many of the same benefits. Rhode Island's attorney general declared that his state would recognize marriages performed in Massachusetts, at the time the only state that allowed same-sex marriage. Iowa and Colorado banned discrimination in the workplace on the basis of sexual orientation, and Colorado specified that homosexuals could adopt children.

      Oregon voters rolled back a controversial 2004 initiative that required that the government compensate property owners for land-use restrictions; the measure had produced demands for $19 billion in little more than two years. Florida and Maryland restored voting rights for convicted felons who had served their time.

       Global-warming fears, augmented by a perception that the federal government was foot-dragging on environmental protection, spurred significant state legislation during the year. Hawaii, New Jersey, Minnesota, and Washington endorsed a 2006 California law that limited smokestack emissions from power plants and industrial sources. After President Bush signed a law boosting automobile fuel economy standards over 12 years, the administration formally rejected a tougher 2002 California law requiring an even faster reduction in auto carbon-dioxide emissions. The state initiative had been endorsed by a dozen additional states, including Maryland in 2007, and at year's end California announced new plans to sue the federal government.

      States continued to boost goals for producing electricity from renewable sources, with Minnesota, New Hampshire, and Oregon officially aiming at a goal of 25% clean production by 2025. A total of 23 states had renewable energy standards.

Consumer Protection.
      State laws requiring that cigarettes be self-extinguishing gained rapidly in popularity. A total of 15 states approved new “fire-safe” measures, bringing the number to 21 states that required that manufacturers add bands of paper that snuffed the flame quickly if a cigarette was not being smoked.

      Following the collapse of a Minnesota I-35W highway bridge on August 1, states nationwide moved to reinspect similar structures and propose infrastructure-repair plans. Even though studies showed that more than one-quarter of the country's bridges were rated either structurally deficient or obsolete, minimal additional funding was allocated during the year.

      A battle continued in state legislatures between telephone and cable companies over regulation and taxation of multichannel television; more than a dozen states moved from local to statewide control. Telephone firms wanted to bypass complicated local requirements as they attempted to compete with cable on Internet access as well as telephone and television delivery.

      With $40 billion in insurance claims from Hurricane Katrina, insurers moved to raise rates or reduce coverage, creating a serious backlash in several states, particularly along the Gulf Coast. Louisiana and South Carolina offered tax breaks to insurers, and in a controversial move, Florida dramatically expanded its state-run “insurer of last resort” to cover more than one million residents. Critics warned that the state was taking on excessive risk. Nevada, New Mexico, and Oregon increased regulation of short-term-interest “payday” lenders. Washington state voters approved a measure that allowed triple-damage lawsuits against insurers who wrongfully rejected claims.

David C. Beckwith

▪ 2007

Area: 9,366,008 sq km (3,616,236 sq mi), including 204,083 sq km of inland water but excluding the 156,049 sq km of the Great Lakes that lie within U.S. boundaries
Population (2006 est.): 299,330,000; on October 17 the population passed 300,000,000
Capital: Washington, D.C.
Head of state and government:
President George W. Bush

 Following the terrorist attacks in the U.S. on Sept. 11, 2001, the administration of Pres. George W. Bush mounted an aggressive international response, organizing a military coalition of willing industrialized countries to root out international terrorism. By 2006, however, the effort had been bloodied by religious-inspired violence, and even though there was again no terrorist attack on U.S. soil, U.S. defeat on the central battlefield appeared possible. For the third consecutive year, more than 800 U.S. soldiers died in Iraq (more than 3,000 had died since the conflict began in 2003), and the patience of the American public was exhausted. Following electoral reverses in November, President Bush faced the real prospect that his legacy in the war on terrorism would be one of overreach and failure. (See Sidebar (U.S. 2006 Midterm Elections ).)

      As the year began amid signs of easing tensions, top U.S. commander Gen. George Casey was planning a yearlong gradual reduction in the 155,000 U.S. troops on the ground in Iraq. At that time minority Sunni Muslims, aided by non-Iraqi terrorists, were carrying out repeated bombing and kidnapping attacks against majority Shiʿite members, but U.S. and Iraqi government forces were handling the violence. On February 21, however, seven al-Qaeda terrorists staged a predawn attack on the revered al-Askari shrine in Samarraʾ, north of Baghdad, blowing apart the mosque's famed golden dome. The sacrilege at one of Shiʿism's most holy sites started an outburst of retaliatory violence that escalated steadily during the year, drawing much of Iraq into a sectarian civil war. As thousands of Iraqis were murdered and U.S. troop losses mounted, military reinforcements were ordered, and domestic support for President Bush and his Iraq policy contracted rapidly. Antiwar sentiment rose, and as elections approached, even loyal Republicans began to break with the president, many decrying the absence of a strategy to win in Iraq.

 After voters emphatically repudiated Republican leadership in congressional elections, Bush accepted the resignation of Secretary of Defense Donald Rumsfeld, a clear indication that the administration's Iraq policy had failed. Bush appointed as Rumsfeld's replacement former director of central intelligence Robert M. Gates (Gates, Robert ). (See Biographies.) A bipartisan Iraq Study Group of government elders cochaired by former secretary of state James A. Baker III and former congressman Lee H. Hamilton issued a report calling for increased regional diplomacy and phased withdrawal of the overstretched U.S. military from Iraq. The report was designed to provide political cover for disengagement. At year's end, however, Bush appeared to be pondering instead a “surge,” or temporary escalation in U.S. forces, led by aggressive new field commanders, in a last-ditch effort to win the peace in Iraq and establish democracy in the Arab world.

      Amid increasing pessimism, there were occasional signs of progress in the war on terrorism. A U.S. air strike in June killed Abu Musab al-Zarqawi (Zarqawi, Abu Musab al- ) (see Obituaries), the leader of al-Qaeda in Iraq. British authorities in August broke up a London-based plot by Islamic terrorists to carry small liquid bombs, disguised as sports drinks, on up to 12 U.S.-bound jumbo jetliners. The episode was a stark reminder of the stakes faced by the West in the terrorism struggle. In December a U.S. aircraft-carrier task force supported Ethiopian army forces that routed Islamic fighters from Somalia.

      The war on terrorism also exposed deep internal divisions over constitutional protections during wartime. News stories detailed government surveillance techniques against terrorism, including wiretapping of overseas calls and monitoring of international bank records—measures that alarmed civil libertarians. After persistent rumours, Bush announced that CIA officials had been holding high-value terror detainees in secret prisons around the world and subjecting them to aggressive interrogation methods that some critics labeled torture. In September, following weeks of contentious debate, Congress approved new legislation allowing the president latitude in approving interrogation techniques and granting detainees only a well-regulated military-commission prosecution, not a full hearing in federal courts.

Domestic Policy.
      With public-opinion polls showing popular disdain for the political leadership in Washington, Congress accomplished little during 2006 and earned comparison with “do-nothing” legislatures of earlier eras. Republicans blamed unfinished business on opposition obstructionism, but Democrats replied that congressional leaders needed to work harder and listen more closely to public opinion. In November voters opted for new congressional management.

      With the Bush administration distracted by Iraq and investigations, the White House was unable to furnish strong leadership for much of the year. In June the president's top strategist, Karl Rove, was finally cleared by a special prosecutor following an investigation into the 2003 publication of a CIA employee's identity. It was revealed later that a top Department of State official, Richard Armitage, had inadvertently leaked the name—a fact known by special prosecutor Patrick Fitzgerald before he started his three-year investigation.

      The legislative year got off to a fast start. In late January, by a 58–42 vote, New Jersey federal judge Samuel Alito (see Biographies) was confirmed as a U.S. Supreme Court justice, replacing retired justice Sandra Day O'Connor. The Bush nomination produced the most contentious high-court confirmation battle in 15 years, with opponents suggesting that Alito would expand presidential powers at the expense of Congress and curb abortion rights.

      A month later Congress overwhelmingly approved reauthorization of the Patriot Act, initially enacted after the 2001 terrorism attacks. The law, which had attracted widespread criticism, was changed only modestly to provide subpoena targets with additional procedural rights as information was gathered in terrorism investigations.

      The U.S. Senate conducted a tumultuous debate on immigration policy during the spring. Initially, the Judiciary Committee approved a bill that would allow most undocumented persons then residing in the country to stay in the U.S. and “earn citizenship” by paying $2,000 in fines, working for six years, learning English, undergoing a background check, and paying any back taxes owed. That bill was widely criticized as an amnesty that rewarded illegal conduct. Republicans soon substituted a new version on the Senate floor that also contained a guest-worker program but stiffened requirements for obtaining legal status and excluded the most recent arrivals from eligibility altogether.

      During consideration of more than 40 amendments to the bill, senators barred immigrants from obtaining legal status if they had committed a felony or violated a court order; this provision alone eliminated an estimated 500,000 of the 12,000,000 undocumented persons then in the country. In a close vote the Senate allowed even workers who had used false identities to claim Social Security benefits on their earnings. The compromise bill, which had President Bush's tacit support, increased border and workplace enforcement and expanded visa authorization; it was approved 62–36. The debate prompted strong public reaction. On April 10, Hispanics and their sympathizers staged massive protest marches in 102 cities across the country. Some marchers carried Mexican flags, which generated a backlash in public opinion, and flags disappeared from later demonstrations. The protests led to a demand for stronger border enforcement. On May 15, President Bush announced that he was sending National Guard troops to the Mexican border to assist the U.S. Border Patrol.

      The U.S. House had previously approved an enforcement-only border-security plan, with stiffened penalties for immigration violations and no provision for guest workers. Instead of negotiating with the Senate, House leaders refused to appoint conferees, staging instead a series of 40 public hearings across the country designed to highlight perceived deficiencies in the Senate bill. As the election approached, Congress reconfirmed support only for 1,125 km (700 mi) of fencing along the U.S.-Mexico border, leaving comprehensive immigration reform for another year. (See Special Report (Immigration's Economic Impact ).)

      The year saw no progress on reforming Social Security, even as baby boomers began to retire and receive benefits. Despite persistent headlines from the corruption scandal involving lobbyist Jack Abramoff, lobbying reform was not seriously considered. Legislation to address global warming died, as did the bills to combat identity theft and to increase the minimum wage. Congress approved one controversial measure, which authorized dramatically expanded funding for embryonic- stem-cell research, but right-to-life groups vigorously opposed the bill, and President Bush vetoed it in July. It was his first veto in more than five years in office.

      Congress renewed the 1965 Voting Rights Act for 25 additional years. It also overhauled national pension legislation and allowed travelers to bring back up to three months' supply of prescription drugs from low-price retailers in Canada. Only 2 of 11 appropriations bills were approved—in large part because taxpayer groups found more than 10,000 congressional earmarks in the drafts. This meant that government would be largely funded into 2007 via a resolution that would continue existing programs. At year's end, Americans were mourning the loss of former U.S. president Gerald R. Ford (Ford, Gerald Rudolph, Jr. ), who died in Rancho Mirage, Calif., on December 26. (See Obituaries.)

The Economy.
      Despite a slowdown in key auto and housing sectors and renewed turmoil in corporate executive suites, the U.S. economy expanded for the fifth consecutive year. Some 1.8 million new payroll jobs were created, and equity markets headed substantially higher. The jobless rate fell further, from 5% to 4.5%, creating virtual full employment in most areas of the country, even as tens of thousands of new undocumented workers from abroad joined the workforce.

      Bustling economic activity continued to stir inflationary fears at the country's central bank. The U.S. Federal Reserve continued a two-year policy of nudging up interest rates and boosted the federal funds rate by 0.25% on four occasions early in the year. With higher rates pinching housing sales and gasoline prices again heading over $3 per gallon because of political unrest and summer demand, the Fed halted further increases for the year. Even so, the expansion was slowed, and inflation dropped to the lowest rate in three years. GDP jumped by 5.6% in the first quarter before settling down to a healthy 2–3% growth range for the remainder of the year. The spring energy price rise proved short-lived, and by mid-October the price of gasoline was dropping rapidly back down to $2.20. With consumer prices rising at 2.5% for the year and energy prices again under control, national financial markets staged a substantial late-year rally. The S&P 500 finished the year up 13.6%, and the closely watched Dow Jones Industrial Average did even better, gaining 16.3%.

      Some critics asserted that the year's rosy financial news concealed deep underlying problems in the U.S. economy. Productivity growth, a key measurement of economic efficiency, slowed during 2006 after five years of major gains. Rapid growth in government revenues slashed the U.S. budget deficit for fiscal 2006 to $248 billion, well under early estimates, though the persistent shortfall remained a worry to some economists.

      After overheating in 2005, the nation's housing market continued to cool dramatically, with average prices in some areas falling by up to 10%. That, plus higher interest rates, reduced the ability of consumers to borrow against their home equity, a major source of economic liquidity in recent years. Even so, American consumers continued to spend heavily on foreign goods, including automobiles, and the three major U.S. auto companies joined housing on the short list of industries that failed to join in the year's economic good news. Consumer demand for foreign goods coupled with a decline of U.S. energy reserves meant that the country's trade deficit set another record during the year. The U.S. dollar rose slightly against the Japanese yen and dropped more than 10% against the euro.

      Corporate America was hit by another major internal scandal in 2006, this one centring on executive stock options. Under federal investigation, nearly 200 publicly held companies reported irregularities in granting of options on corporate stock, often involving backdating of options to maximize their worth. By mid-December more than 55 executives and directors had been forced out and others implicated in questionable transactions, some in major firms. In one particularly dramatic case, the CEO of Comverse Technology fled to Namibia to avoid extradition and questioning about option grants.

      The country's most successful retailer, Wal-Mart, sustained a rare tumultuous year capped by disappointing year-end holiday sales. The company was the target of a major public-relations attack by union activists objecting to the company's pay and benefits policies, its reliance on Chinese merchandise, and its adverse effect on local small retailers.

Foreign Policy.
      With American power tied down by sectarian conflict in Iraq, diverting both focus and resources, U.S. diplomacy suffered through a forgettable year in 2006. Although relations with some allies, including India, improved during the year, the globe's sole remaining superpower appeared impotent at times, captive to events in most areas, and unable to exert accustomed will on world events.

      In an attempt to relieve stress on the U.S. military, complete control of security in Afghanistan was turned over to NATO during the year. Taliban rebels continued, however, to stage a fierce resurgence punctuated by bombings and suicide attacks, making 2006 the country's bloodiest year since the Taliban was ousted from power in 2001. The Afghan drug trade, technically illegal but tolerated by the government, flourished as security deteriorated. About half of the 40,000 troops in the country were American, and despite repeated calls for assistance, most NATO countries were unable or unwilling to step up their commitment. Efforts to capture Osama bin Laden, believed to be hiding in a lawless area of northern Pakistan near the Afghan border, went nowhere during the year.

      Two rogue states with nuclear ambitions and unstable leadership, Iran and North Korea, took particular advantage of the overstretched U.S. military and preoccupation with Iraq. For most of the year, North Korea declined to participate in six-nation diplomatic efforts designed to stop development of its nuclear weapons program. Instead, on July 5, North Korea test-fired seven missiles, including a Taepodong-2 long-range version that some analysts said was capable of hitting the western United States. The missile failed after 40 seconds, however, landing in the East Sea (Sea of Japan), but not before the U.S. had activated still-unproven interceptor missile systems in Alaska and California. North Korea then shocked even its closest ally, China, by detonating its first confirmed nuclear device inside a Korean mountain tunnel on October 9. Following universal criticism, North Korea agreed to resume international talks, but the outlook was unpromising. North Korean negotiators initially declined to discuss the nuclear program and instead limited discussion to economic sanctions previously imposed on the regime for counterfeiting and illegal technology transfers. (See Korea, Democratic People's Republic of , above.)

      International attempts to persuade Iran to curb its nuclear program, which the oil-rich country insisted was necessary for civilian energy production, were met with continued stalling by Tehran. With Russia and China competing actively for Iranian contracts and trade, Iran was able to play world powers against each other and elude international sanctions. As the year began, a Russian offer to defuse the crisis by enriching uranium for Iran was rejected. The UN Security Council then gave Tehran until August 31 to stop enrichment or prove its program was peaceful; member countries offered a package of economic and political concessions as encouragement. When International Atomic Energy Agency inspectors, previously ejected from Iran, returned in mid-August, Iranians refused to release a key 15-page report on possible uranium shaping for weapons uses. Amid arguments over what should be done, the Security Council unanimously voted in December to impose economic sanctions on Iran—but only after watering down, at the behest of China and Russia, provisions for freezing assets and restricting travel of Iranian officials. At year's end, despite internal political problems, Iran's nuclear program remained intact, and its influence in the Middle East appeared to be growing rapidly. (See Iran: Special Report (Iran's Power Dilemma ), above.)

      U.S. efforts to stop a civil war that had claimed more than 200,000 lives in Darfur, the westernmost region in The Sudan, also proved largely ineffectual. President Bush signed a law in October imposing economic sanctions on The Sudan following the central government's refusal to admit 17,000 United Nations peacekeeping troops. During 2006 the U.S. contributed more than $400 million in humanitarian aid to Darfur, but some aid was intercepted, and the Sudanese government continued to ignore international protests.

 With attention focused on Iraq, U.S. diplomats were unable to apply significant new influence on Russia or China during the year. In a May speech that proved controversial, Vice Pres. Dick Cheney accused the government of Russian Pres. Vladimir Putin of rolling back human rights and using its energy reserves as “tools of intimidation or blackmail” against its neighbours. The Putin government rejected U.S. suggestions that authoritarianism was returning to Russia and threatening democracy in Eastern Europe.

      The year produced a record U.S. trade deficit with China of more than $215 billion, severely hampering U.S. efforts to influence perceived human rights, currency, and environmental issues in the world's most populous country. The Chinese economy continued its rapid growth, challenging U.S. economic power in Asia, and the U.S. moved notably closer to India during the year. A treaty granting technological assistance to India's fledgling civilian nuclear-power program was approved by the U.S. Senate in December.

       Venezuelan Pres. Hugo Chávez won a substantial reelection victory in December, increasing his prestige in the Western Hemisphere. Chávez repeatedly referred to President Bush as “the devil” in public speeches and led an effort to reduce U.S. economic and political influence in the region. Following his reelection, Chávez moved to nationalize key industries and shut down opposition media, ignoring U.S. criticism in the process.

David C. Beckwith

Developments in the States
      In 2006 U.S. states enjoyed a relatively tranquil year, marked by strengthening fiscal conditions, improved intergovernmental relations, and a respite from major natural disasters. Responding to perceived inaction by the federal government, states took action on numerous issues previously considered outside their province, including illegal immigration, climate change, the minimum wage, stem cell research, and health care. In the November midterm elections, states followed a national trend in opting for a major shift in partisan control of state capitals. (See Sidebar (U.S. 2006 Midterm Elections ).) Regular legislative sessions were staged by 44 states during the year, and 20 states held one or more special sessions.

Party Strengths.
       Democrats made substantial gains in 2006 state elections, winning 20 of 36 governorships at stake and establishing a clear advantage in state legislatures. Going into the election, Republicans enjoyed a 28–22 advantage among governors, but the lineup for 2007 would be reversed, with 28 Democratic governors. In legislative elections Democrats reestablished majority control after several years of virtual parity between the parties. Prior to the election, Republicans had a two-house majority in 20 legislatures, Democrats were dominant in 19, and the remaining states were split or tied. The new breakdown included two-chamber Democratic majorities in 23 states, GOP control in 15, and 11 split or tied. (Nebraska has a nonpartisan unicameral legislature.)

Structures, Powers.
      By a 58–42% margin, voters in Michigan approved a constitutional amendment banning race-based affirmative action in college admissions and government hiring. The initiative, opposed overwhelmingly by educators and opinion leaders, was similar to measures approved by voters in California in 1996 and Washington in 1998. Support for antitax and state spending-cap measures waned during 2006, continuing a recent trend. Voters in Maine, Nebraska, and Oregon rejected Taxpayer Bill of Rights measures that would limit spending increases to population growth plus inflation and mandate a legislative supermajority to increase taxes. As comprehensive immigration reform languished in Congress, numerous states grappled with rules for illegal aliens. Arizona voters approved a series of get-tough measures that included denial of day-care or tuition benefits. Colorado and Georgia targeted employers, prohibiting tax deductions for payments to illegal workers and authorizing a state lawsuit against the federal government for lack of immigration enforcement. A gubernatorial veto voided the California legislature's attempt to allow licenses for undocumented drivers. (See Special Report (Immigration's Economic Impact ).) Arizona voters rejected an innovative attempt to encourage voting; the proposal would have entered voters in an annual million-dollar lottery.

      With the national economy continuing to expand, state revenues across the country exceeded expectations and led to the first overall net tax decrease in six years. Some 40 states posted significant surpluses, and officials responded by restoring funding for critical programs, such as education, and by rebuilding “rainy day” funds and addressing perceived inequities in taxation.

      Overall, 24 states cut taxes and 15 raised rates, for a modest net reduction of $2.1 billion in tax revenue. The largest tax increase occurred in New Jersey, where a budget dispute between newly elected Gov. Jon Corzine and the legislature led to a six-day shutdown of state government at midyear that idled 45,000 state employees, closed state parks, and even shuttered Atlantic City casinos. The standoff ended with an agreement to raise the state sales tax immediately from 6% to 7%, followed by a promised reduction later in state property taxes, the highest in the nation.

       Ohio, pursuing a five-year reduction plan, led 18 states that decreased personal income taxes; only 2 states raised income-tax rates during 2006. Five other states raised sales taxes, and 15 reduced them, usually by cutting levies on food, clothing, or other essentials. Six states raised cigarette and tobacco taxes, and Idaho boosted alcohol revenues. As energy prices spiked at midyear, most states left motor-fuel taxes unchanged.

 Seventeen states made adjustments to corporate taxes, most of them modest. Texas, however, enacted a new business tax as part of a school-finance-reform plan, boosting revenues by more than $400 million. Alaska enacted a major increase in oil-company taxes. Although soaring home prices leveled out or even declined nationwide, house appraisals often rose, and major protests occurred over increased property taxes in numerous states. Texas joined Arizona, New York, New Jersey, Pennsylvania, Georgia, Indiana, Rhode Island, Idaho, and South Carolina in promising relief during the year, often by transferring funds to local governments to offset property-tax revenue.

      With memories of the 2001–03 state budget crisis still fresh, legislators were often wary of enacting measures associated with new spending. As unemployment declined and welfare rolls stabilized, mandatory programs, such as Medicaid, grew at a modest rate during 2006. Numerous states replenished reserves and restored spending on K–12 education, higher education, highways, and other general expenditures that had been trimmed earlier.

Social Issues.
      Although supporters of traditional marriage won most ballot contests during the year, advocates of legal rights for same-sex couples claimed progress in establishing protection for homosexual unions. New Jersey became the third state, after Vermont and Connecticut, to establish civil unions as a formal alternative to traditional marriage. The change came after New Jersey's highest court unanimously declared that homosexual couples deserved all rights and privileges of marriage and ordered the state to either legalize same-sex marriage or provide equal statutory treatment for gay couples. During the year voters in 7 additional states, for a total of 27, amended their constitutions to define marriage as being between one man and one woman. Arizona, however, became the first state to reject a gay-marriage ban after opponents stressed that the measure would bar governments from recognizing domestic-partnership arrangements between heterosexual couples as well. Massachusetts remained the only state to have legalized same-sex marriage. California, Hawaii, and Maine maintained domestic-partnership registries that conferred specific benefits on any couples who registered, same sex or opposite sex. (Colorado voters turned down a “domestic-partnership” proposal in November balloting.) Lawsuits seeking the legalization of same-sex marriage were pending in several states, and high-court decisions were being awaited in Maryland, Connecticut, and California.

      The South Dakota legislature approved a total ban on abortion, seeking to set up a new challenge in the U.S. Supreme Court to Roe v. Wade. In the November election, however, state voters voided the measure by a 56–44% margin. Louisiana enacted a near-total halt to abortions except those required for saving the life of the mother. Countering a national trend, voters in California and Oregon rejected measures requiring parental notification when minors sought an abortion. A total of 35 states required either notification or consent by parents in such cases.

Law, Ethics.
       Gun rights advocates flexed their political muscle during 2006, and 14 states joined Florida in approving measures specifying that crime victims need not retreat before using deadly force against attackers. Supporters of the measures called the bills “stand your ground” legislation, but critics labeled them “shoot first” laws. Keying off reports from the Hurricane Katrina disaster, 10 states prohibited authorities from confiscating personal weapons during natural disaster recovery efforts. Two additional states, Nebraska and Kansas, joined the 46 that allowed “concealed-carry” gun permits to be issued to qualified applicants. Only Illinois and Wisconsin prohibited the carrying of a hidden weapon.

      Responding to disturbances staged by anti-homosexual-rights activists, 27 states banned picketing and demonstrations at funeral and memorial services for U.S. servicemen and women.

 Several states grappled with ethics issues. After state legislators became embroiled in financial scandals, North Carolina and Tennessee enacted sweeping ethics-reform legislation. Kentucky's governor was indicted on misdemeanor charges of having hired workers on the basis of their political loyalties, but the charges were later dropped. Former Illinois governor George Ryan was sentenced to six and a half years in a federal prison after his conviction on 18 federal felony corruption charges dating from his tenure as secretary of state. Outgoing Ohio Gov. Robert Taft was reprimanded by the state's Supreme Court for failing to report gifts.

       Tennessee became the first state to require retailers to check identification of all beer purchasers, regardless of how old they looked. Alaska's legislature attempted to recriminalize possession of small amounts of marijuana, but the attempt was largely voided by an Alaskan court. Following an accidental death, Florida prohibited military-style juvenile detention camps.

      Numerous states approved measures cracking down on sexual predators, including those using the Internet. California and New York joined Florida in enacting “Jessica's Law,” which imposed harsher prison sentences on convicted sex offenders and mandated that they be electronically monitored during their lifetime. The California version prohibited offenders from living within 600 m (2,000 ft) of a school or park, but a federal court declared that the law could not be applied retroactively; it would affect future moves of residence by registered sex offenders but not pertain to existing addresses.

      Imposition of the death penalty continued to decline across the country. California and Florida suspended capital punishment in December, after officials mishandled executions employing lethal injections. Federal courts ruled injection methods in Missouri and California to be unconstitutional. During the year, 53 convicts (including 24 in Texas) were executed, down from 98 in 1999. Countering a trend, Wisconsin voters approved a nonbinding referendum to restore the death penalty.

Health, Welfare.
      Massachusetts and Vermont approved innovative strategies for achieving universal health care coverage. The Massachusetts law provided subsidies for purchase of health insurance and levied fines on employers who failed to provide insurance to employees. Vermont's plan required private insurers to offer coverage to all, overseen by a new state board. Though California's legislature approved what would have been the nation's first publicly financed universal health care system, the measure was vetoed by Gov. Arnold Schwarzenegger, who signed a bill that pressured drug makers to negotiate discounts or risk losing contracts under the state's medical system.

      Following a contentious campaign, voters in Missouri, by 51–49%, granted legal protections to researchers studying embryonic stem cells. Antiabortion groups opposed destruction of embryos, and the referendum was closely watched nationally as an important political test on a divisive subject. Seven states endorsed stem cell research measures beyond limits established by the federal government.

      New statewide bans on smoking in public places were approved in Arkansas, Arizona, Colorado, Hawaii, Louisiana, Nevada, New Jersey, Ohio, and Pennsylvania, bringing to 21 the number of states prohibiting tobacco use in public places. Illinois, Massachusetts, and New Hampshire joined New York, Vermont, and California in requiring all cigarettes to be “fire safe”—to extinguish themselves if left unattended. The laws in Illinois and Massachusetts would become effective in 2008.

Environment, Education.
      Amid further complaints about federal inaction, California enacted the nation's first significant measure designed to combat global warming. The controversial measure ordered that greenhouse-gas emissions in the state be cut by 25% by 2020 through a cap-and-trade system. Washington became the first state to ban phosphates in residential dishwashing detergent.

As Congress prepared to reauthorize the landmark 2001 No Child Left Behind education act, states continued to wrestle with federal mandates, including testing and accountability requirements. A series of federal waivers to state officials markedly reduced intergovernmental conflicts over the act. Illinois effectively created the nation's first statewide preschool program, which would include children three and four years old. In a controversial move that raised the spectre of resegregating classrooms, Nebraska divided Omaha into three racially distinct school districts for the purpose of restoring local control of education.

Reacting to inaction in Washington on the federal minimum wage, frozen since 1997, legislators in 11 states and voters in 6 more approved increases in state minimum-wage rates, and 4 states provided for automatic increases indexed to inflation. By year's end 29 states would mandate rates above the $5.15 federal minimum.

      A nationwide campaign headed by union activists against Wal-Mart, the largest American retailer, created turmoil and legislative proposals in numerous states. Maryland's legislature overrode a gubernatorial veto and mandated that Wal-Mart increase employee health care benefits, but a federal court later overturned the law.

      States stepped up protections for private property in the wake of the 2005 Supreme Court Kelo v. City of New London decision, which allowed the government to condemn property, arguably for private purposes. Two dozen additional states limited local eminent domain powers, bringing to 27 the number of states curbing property appropriation over the past two years. In a November referendum Arizona joined Oregon in allowing compensation for property owners subject to government land-use restrictions. Similar initiatives in California, Idaho, and Washington failed by substantial margins, however.

      In an effort to aid victims of identity theft, 26 states allowed such individuals to put a security freeze on their credit reports to inhibit thieves from opening new accounts under their names. West Virginia approved a tough underground coal-mine-safety law following an accident in 2005 that killed nine miners. The measure was a model for a U.S. statute signed into law at midyear. Ohio, Oregon, Rhode Island, and Tennessee outlawed predatory practices by mortgage and payday lenders.

David C. Beckwith

▪ 2006

Area: 9,366,008 sq km (3,616,236 sq mi), including 204,083 sq km of inland water but excluding the 156,049 sq km of the Great Lakes that lie within U.S. boundaries
Population (2005 est.): 296,748,000
Capital: Washington, D.C.
Head of state and government: President George W. Bush

In 2005, amid world skepticism and domestic opposition, the administration of U.S. Pres. George W. Bush forged ahead with its bold and aggressive response to international terrorism. Progress in pacifying a determined Iraqi insurgency and in establishing capable Iraqi security forces proved far more difficult than expected, however. American deaths in Iraq continued at a rate of nearly three per day. A drumbeat of criticism from a unified Democratic opposition helped tax American patience and weaken Bush's base of support. Even a purring U.S. economy failed to assuage doubters. By the fall of 2005, with more than 60% of Americans disapproving of his job performance and his conduct of the Iraq war, President Bush appeared to be in serious political danger, perhaps lacking the support necessary to continue pursuing his agenda.

War on Terrorism.
The American-led effort to establish a functioning democracy in Iraq again dominated world news during 2005. A determined resistance, including both Iraqi and foreign fighters, continued incessant bombing, small-arms, and suicide attacks, and U.S. military deaths—846—were only slightly fewer than the 848 recorded in 2004.

Iraq showed unmistakable signs of progress during the year, starting with a historic election on January 30, in which 57% of the voters turned out for elections to the National Assembly. Voter turnout in an October election to ratify the constitution was even higher (63%), and a third “purple-finger” election, held on December 15, produced a voter turnout of 70%. (See Iraq, above.)

      Allegations of widespread illegality in the UN's Iraq oil-for-food program in the months leading up to the U.S.-led 2003 invasion produced an independent investigation led by former U.S. Federal Reserve chairman Paul Volcker. The inquiry found “corrosive corruption” at the UN and blamed UN Secretary-General Kofi Annan for mismanagement. The report stated that Saddam Hussein had collected at least $229 million in bribes from a majority of companies involved in the program and that $10 billion in Iraqi oil had been illegally smuggled into adjacent countries. The report showed that French and Russian companies received $23.7 billion in Iraqi contracts from 1996 to 2003, during the period when both countries were strong critics of Iraqi sanctions and ultimately opposed the U.S.-led invasion.

      Even while violence continued in Iraq and Afghanistan, a potent political battle was being waged in the U.S. over the war on terrorism. Democrats continued to hammer at President Bush's decision to invade Iraq, suggesting that his stated fear of Iraq's harbouring weapons of mass destruction had been concocted. The controversy eroded Bush's polling numbers, and by October surveys were finding that the majority of the public believed that the decision to invade Iraq was a mistake.

After Rep. John Murtha, a Democrat from Pennsylvania, called for the U.S. withdrawal from Iraq in November, the public focus turned from President Bush's decision in 2003 and his credibility to the future. Murtha's remarks delighted antiwar activists. Polling soon showed, however, that many Americans disagreed with that assessment and believed that the U.S. should stay the course in the war on terrorism. Defense Secretary Donald Rumsfeld announced plans to reduce U.S. troop strength from 160,000 to below 138,000 in early 2006, saying that trained Iraqi security forces would make up the difference. At year's end Bush's approval rating stood at 40%, up five percentage points from a month earlier.

Domestic Policy.
President Bush laid out an unusually ambitious agenda following his second inauguration. He announced plans to regularize the national system of immigration and border control, which had fallen into disrepair. He promised a revamping of the nation's tax code and offered proposals to reform controversial legal liability procedures covering medical malpractice, class-action lawsuits, and asbestos cases. Finally, as the centrepiece of his 2005 agenda, Bush tackled the “third rail” of American politics, the Social Security retirement system, by suggesting an alteration of the current scheme, in which wage earners effectively fund benefits paid to retired Americans. Instead, Bush proposed that workers be given the opportunity to fund their own private retirement accounts, which they would own.

      Little of Bush's agenda became law. Instead of receding after their 2004 election defeat, congressional Democrats showed unusual unity and organized to stop the administration agenda; they were occasionally joined by key Republicans. Ethics problems sapped the majority party. When the U.S. House's GOP leader, Tom DeLay, was forced to step down after a Texas grand jury indicted him on election-law charges, Republican effectiveness frayed noticeably. The result was the worst political and legislative season of Bush's presidency.

In early 2005 Bush traveled the country extensively, touting his Social Security proposals to enthusiastic, carefully selected crowds. He claimed that reform was needed to avoid the system's bankruptcy as baby boomers retired and laid claim to system payments. Democratic critics, however, rallied opposition by suggesting that Bush was attempting to “privatize” the system, throwing guaranteed benefits into doubt, and by pointing out that the transition period in Bush's plan would actually require more funding than the current plan. Political support for Bush's program was so anemic that the president never offered specific legislation, and the issue had died by year's end.

Bush's immigration proposals also met with a storm of criticism from both the left and the right, with the most-heated comments coming from his own party. Instead of amnesty for the estimated 12 million illegal immigrants living within the U.S., Bush proposed establishing a “guest worker” program that would grant them legal status and the opportunity for eventual citizenship. Outraged conservatives said that the Bush plan rewarded illegality and called instead for tighter border security and enforcement of often-ignored immigration statutes. The U.S. House, in a largely symbolic vote before adjourning, approved the establishment of a 1,100-km (700-mi) fence along key portions of the U.S.–Mexico border, and Bush was forced to add border-security language to his proposal for congressional consideration in 2006.

Congress approved a limited portion of Bush's legal reform, moving many class-action lawsuits from state to federal courts, which had historically been less receptive to innovative claims from plaintiffs' lawyers. No progress was made, however, on administration proposals to reform the tax system, asbestos litigation, or medical malpractice lawsuits.

      Some significant legislation passed the Congress, but little of it met with Bush's full approval. After nearly a decade under consideration, a bankruptcy-reform bill was signed into law; supporters claimed that by requiring more overextended debtors to adopt a long-term repayment plan instead of having their debts discharged, the measure would reduce credit abuse. Another long-stalled measure, a national energy bill, was approved amid claims that it mostly benefited highly profitable energy companies. Moderate Republicans joined most Democrats to strip from the bill an administration-backed provision allowing energy exploration in the Arctic National Wildlife Refuge.

After promising to veto any highway-construction legislation that exceeded $256 billion over five years, the president in August signed a $286 billion measure that contained a record 6,371 congressional “earmarks”—special provisions that individual senators and representatives had inserted for pet projects. One earmark inserted by powerful Alaska legislators was funding for a $223 million bridge from Ketchikan (pop. 8,000) to Gravina Island (pop. 50), currently served by an efficient ferry. After a nationwide protest, the bridge spending was rescinded, but Alaska authorities were allowed to take control of the funds for use on any project—including a Gravina bridge. Despite taxpayer group complaints over excessive spending by Congress, Bush completed his fifth consecutive year in office without casting his first veto.

 Legislative setbacks were almost directly tied to public antipathy over Bush's handling of the Iraq war. As violence continued and U.S. casualties mounted, Democrats concentrated on Bush's credibility, suggesting that he had deliberately misled the country about the threat of Iraqi weapons of mass destruction, never found following the Iraq invasion. When Bush spent his usual August recess month at his ranch in Crawford, Texas, he was dogged by Cindy Sheehan, the antiwar mother of a slain U.S. serviceman, who attracted daily news media attention as she demanded to meet with Bush. He declined.

Bush's poll ratings, adversely affected by growing public impatience over Iraq, declined even further when government authorities proved incapable of dealing promptly with the fallout from Hurricane Katrina, a major disaster that devastated parts of Louisiana, Alabama, Mississippi, and Florida. Bush eventually took responsibility for the failed federal effort and promised a broad rebuilding package that some experts thought would reach $200 billion. Louisiana's congressional delegation proposed federal aid for that state alone that exceeded $250 billion. By year's end Congress had set aside about $64 billion for storm relief. (See Economic Affairs: Special Report: Preparing for Emergencies.)

Republicans were hard hit by a series of scandals. Shortly after DeLay was indicted, Senate Majority Leader Bill Frist revealed that he was being investigated by two federal agencies for having sold stock in a hospital company controlled by his family, shortly before bad news drove its stock price down. A long-running special counsel investigation into the 2003 naming of an undercover CIA operative by Washington columnist Robert Novak culminated in the indictment of a top White House aide. Lewis (“Scooter”) Libby, chief of staff to Vice Pres. Dick Cheney, was indicted for lying to Special Prosecutor Patrick J. Fitzgerald (see Biographies) before a grand jury; Libby immediately resigned. Fitzgerald's probe continued into 2006. In a development that threatened to expose corrupt fund-raising and trading of favours on Capitol Hill, federal investigators in November obtained a guilty plea on a conspiracy charge and $19.7 million in restitution from Michael Scanlon, a former DeLay aide. Scanlon promised to testify against another grand jury target, lobbyist Jack Abramoff, over alleged bilking of Indian tribe clients whom they represented on gambling issues.

A long-running dispute over confirmation of federal appellate judges was at least partially resolved during the summer, with a “Gang of 14” centrist senators, 7 from each party, agreeing to a compromise that seated eight contested Bush nominees. The agreement came just before two seats opened on the U.S. Supreme Court, one caused by the death of Chief Justice William Rehnquist (see Obituaries) and the other by the retirement of Justice Sandra Day O'Connor. Under terms of the agreement forbidding filibusters except in “exceptional circumstances,” a Washington, D.C., judge, John Roberts (see Biographies), was quickly confirmed as chief justice. Bush suffered another setback when his choice to replace O'Connor—Bush confidant and White House counsel Harriet Miers—was judged unacceptable by conservative activists and withdrew. Bush then nominated New Jersey appellate judge Samuel Alito, whose confirmation was being opposed at year's end by an alliance of liberal interest groups.

      The administration suffered a final setback in December when Congress attempted to renew expiring portions of the 2001 USA PATRIOT Act designed to update law-enforcement tools against terrorism. After House and Senate conferees approved a compromise extension, a bipartisan coalition of senators refused to sign off, with four key Republicans claiming that the renewal potentially infringed on civil liberties. As the vote approached, the New York Times published details of a National Security Agency eavesdropping program on international calls; although technically unrelated, the article reinforced fears about the PATRIOT Act's reach. After applying political pressure by threatening to veto any temporary extension, President Bush in late December signed a mere five-week extension.

The Economy.
      On paper the U.S. economy enjoyed a banner 2005, shaking off natural disasters and spiking energy prices and growing at a robust 3.5% rate for the third consecutive year. Nearly two million new jobs were created, and the nation's unemployment rate fell from 5.4% to 4.9%. Interest rates and inflation, while rising modestly, remained at historically low levels. Labour productivity rose for a fifth consecutive year.

The economic performance was particularly impressive in the third quarter, even as Hurricanes Katrina and Rita devastated the Gulf Coast region. The storms eliminated 600,000 jobs, disrupted shipping traffic, and shut down refining and energy infrastructure, sending gasoline prices nationwide temporarily over $3 per gallon. Relief from the federal government and from private insurers helped to jump-start rebuilding efforts, and the national economy grew by a healthy 4.1% during the quarter.

As the U.S. again provided its traditional economic leadership among industrialized nations, however, there were disquieting signs of excess. The U.S. trade deficit, which had hit a record $618 billion in 2004, topped $700 billion in 2005.

As the U.S. economy expanded, the Federal Reserve pursued its 18-month policy of nudging short-term interest rates higher to combat anticipated inflation. The key federal funds rate was boosted by 0.25 percentage point on eight occasions during the year, to 4.25%, up from 1% in early 2004. U.S. consumer price inflation, pushed by rising fossil-fuel prices, rose more than 4% for the year, but core inflation (excluding food and energy) remained at modest levels, just over 2%. The gradual interest-rate rise finally contributed at year's end to a cooling of an extended boom in housing construction, sales, and refinancing. Meanwhile, property values in some major urban areas had doubled over the previous five years.

In another cautionary sign, the solid economic growth failed to impress major equity markets. Stock averages dipped during the spring, recovered later in the year, but ended 2005 with only slight gains. Overall, smaller companies outperformed major firms. Most broad market gauges rose less than 5%, and the Dow Jones Industrial Average actually dropped by nearly 0.5% for the year.

Foreign Policy.
      As the year began, the U.S., Japan, India, and Australia led the world's humanitarian response to the December 2004 tsunami disaster in the Indian Ocean, which claimed an estimated 212,000 lives. U.S. Navy helicopter carriers arrived off Aceh, Indon., only five days after the devastation and were particularly effective in preventing additional disease and hardship by delivering fresh water, medical care and supplies, food, and other relief. The U.S. allocated about $1 billion in official aid, and private U.S. citizens donated another $700 million to the relief effort. The U.S. also provided significant aid when a cataclysmic earthquake struck Kashmir on Oct. 8, 2005, killing more than 87,000 people. (See Pakistan: Sidebar, above.)

      In his second inaugural address, President Bush ambitiously pledged to end tyranny around the globe and spread liberty and freedom “to the darkest corners of the world.” As he spoke, the U.S. was fully extended, financially and militarily, in Iraq and Afghanistan, arguably doing what Bush promised, but the strenuous effort seriously hampered U.S. ability to deliver further on Bush's goal.

      Even so, the administration could point to numerous advances in self-government, human rights, and democracy worldwide, all encouraged by U.S. policy. The breakthroughs included Syria's withdrawal from Lebanon, political progress by women in Muslim countries such as Kuwait and Saudi Arabia, advances toward free elections in Egypt and Liberia, and the historic seating of the first democratic national parliament in Afghanistan. The scheduled Palestinian vote, in addition to Israel's unilateral withdrawal from the occupied Gaza Strip, provided a glimmer of hope for that region.

International efforts to stop persistent rogue nuclear-weapons-development programs in Iran and North Korea went nowhere during 2005. President Bush had dubbed both countries, along with Iraq, the “axis of evil” in 2002, in part because of their nuclear ambitions. With allied military efforts overextended in Iraq and Afghanistan, the U.S. was forced to rely on diplomacy to bring pressure on North Korea and Iran.

      When six-nation talks were belatedly resumed in Beijing in July, North Korea agreed to curb its nuclear program and return to international safeguards provided that it received trade concessions, economic assistance, and security guarantees. Within days, however, the apparent deal broke down as the North Koreans demanded renewed assistance on two substitute light-water reactors, and the U.S. publicly accused North Korea of counterfeiting currency and assisting illegal nuclear proliferation. Pyongyang repudiated its concessions and claimed openly that it had already manufactured several atomic weapons in apparent violation of international law.

      Iran successfully stalled ongoing efforts by France, Great Britain, and Germany to negotiate an end to an illegal enrichment plan. The U.S. favoured a hard-line approach, threatening to seek economic sanctions against Iran at the UN Security Council, but did not press the issue because Russia and China, both with veto power over UN sanctions, opposed the move. At year's end, in an effort to break the impasse, Russia offered to host Iran's enrichment efforts and ensure that the uranium would be used only for energy production.

U.S. relations with the United Nations, never smooth, suffered through an especially tumultuous year. As details of bribery and corruption in the UN's Iraq oil-for-food program came to light, the Bush administration appointed a vocal UN critic, conservative John Bolton (see Biographies), as U.S. ambassador, over substantial U.S. Senate opposition. Bolton arrived at UN headquarters in August and immediately began pushing for significant reforms in transparency and efficiency. At one point Bolton unsuccessfully sought postponement of the UN budget until the management, finance, and appointment changes enacted at a September UN summit had been approved by the General Assembly.

      With China rapidly emerging as a world economic and military power, U.S. policy makers attempted to find a delicate balance in bilateral relations that were superficially correct but laden with serious tensions just below the surface. As the country's trade deficit with China topped a record $200 billion, its options were narrow in pursuing complaints about Chinese currency manipulation, political suppression, DVD and computer software piracy, and arms exports. The U.S. forged historically strong ties with Japan, Pakistan, and especially India in an attempt to counter steadily increasing Chinese influence all over Asia.

As a wave of populism swept across Latin America, U.S. policy suffered several setbacks. President Bush's attempt to expand a free-trade zone was rejected by major South American countries at a November Western Hemisphere summit in Buenos Aires. A vocal critic of the U.S., Pres. Hugo Chávez (see Biographies) of oil-rich Venezuela, continued to taunt the U.S.; to highlight U.S. internal problems, he sent subsidized heating oil to low-income families in Boston and New York City. A Chávez admirer, Evo Morales, was elected president of Bolivia after promising to defy U.S. antidrug objections and facilitate coca-leaf production.

David C. Beckwith

Developments in the States

Party Strengths.
An often-difficult relationship with the federal government marked 2005 for the 50 U.S. states; differences over funding, power, and responsibility frequently roiled the federalism partnership. State officials stepped up complaints over unfunded federal mandates and U.S. preemption of authority over traditional state powers. Uneven state and federal responses to major natural disasters made headlines, and the differences extended to numerous additional areas, including education, health care, and economic development. Meanwhile, the national economic recovery allowed states to restore some services that had been cut in previous years and prompted setbacks for antitax activists. All 50 states held regular legislative sessions during the year, and 24 of them staged special sessions on matters ranging from hurricane relief to school finance.

Democrats fared well in limited 2005 state elections, capturing a handful of legislative seats and retaining governorships in Virginia and New Jersey. The partisan gubernatorial lineup across the country was therefore maintained at 28 Republicans and 22 Democrats. State legislatures remained at virtual parity between the parties nationwide. Republicans would enter 2006 with two-house control in 20 states and Democrats in 19, while the two parties split legislative authority in 10 states, all unchanged from 2005. Nebraska had a nonpartisan unicameral legislature.

Structures, Powers.
      Voters decided a record 18 citizen initiatives during off-year elections and rejected 16 of them. A recent trend toward limiting state spending, pushed by low-tax advocates, stalled during the year as states recovered from a national economic downturn.

      Voters in California and Ohio decisively rejected proposals to shift contentious legislative redistricting authority away from the state legislature. The California initiative would have turned redistricting over to a panel of retired judges, while Ohio's measure would have substituted a nonpartisan citizen commission.

      New Jersey became the 43rd state to establish the office of lieutenant governor, with power to succeed when the governorship became vacant. In 2004 when that state's governor resigned, the job had devolved to the state Senate president, who simultaneously served as acting governor and as a legislator. New York voters rejected a proposal to overhaul the state's chronically tardy budget process; the measure would have shifted significant budget responsibility from the governor to the legislature. Washington voters approved an initiative requiring periodic audits of local governments.

      In a late-night July vote, the Pennsylvania legislature approved a pay raise for legislators and judges without public notice or comment. Although no legislative elections were scheduled, the resulting public furor resulted in one state Supreme Court justice's losing his position in November balloting—the first judicial rejection in state history. The pay raise was rescinded later that month.

Alabama, Delaware, and Texas approved new laws restricting eminent domain powers of local officials. The laws were approved after a divided U.S. Supreme Court, in the controversial Kelo v. City of New London (Conn.) decision, affirmed that local governments could condemn and seize private property to make way for commercial development that paid higher taxes. (See Law, Crime, and Law Enforcement: Court Decisions.)

Government Relations.
Arguments over allocation of power between state and federal governments were front-page news during most of 2005. With fallout from Hurricane Katrina the most glaring example, state officials struggled to maintain productive relationships—and their traditional lines of authority in the U.S. system of federalism—during often-contentious dealings with Washington. Some state officials claimed that the federal government was neglecting its responsibility in vital areas, such as curbing global warming, lowering the prices of costly drugs, and funding stem-cell research. In other instances states asserted that federal authorities were not providing resources to pay for mandates that they imposed on the states. The National Conference of State Legislatures claimed that over a two-year period it had identified $51 billion in largely uncompensated annual costs that states incurred as a result of federal mandates, not including the additional mandates that were on the drawing board. The officials also complained about increased federal preemption of state power to regulate health care, land use, technology, and other programs.

In May Congress approved the REAL ID Act, which set rigorous national standards for documents needed in order to obtain a driver's license. The new law effectively prohibited licenses for undocumented aliens, which a dozen states allowed. The law mandated costly new documentation requirements without providing any funding for state compliance.

After Hurricane Katrina swept over Louisiana, Mississippi, Florida, and Alabama in late August, the devastation was exacerbated by arguments over responsibility for rescue, relief, and rebuilding. Disaster planning had traditionally been the purview of states, but the federal government had taken a steadily expanding role in recent years, blurring lines of authority and responsibility. With news media accounts blaming FEMA (the Federal Emergency Management Agency) for delays in providing relief services and supplies, federal officials made ill-disguised attempts to take control. Officials in Louisiana, Florida, and other affected states pushed back—even while demanding that the U.S. government pay for virtually all rebuilding efforts. The year ended in an uneasy truce, with lines of authority and responsibility remaining largely undefined. (See Economic Affairs: Special Report: Preparing for Emergencies.)

      Pennsylvania, Connecticut, and Illinois sued the U.S. government in an attempt to save Air National Guard aircraft from being transferred to other states during the federal government's periodic Base Realignment and Closure procedure. National Guard units were controlled by state governors during peacetime but were susceptible to federal call-up in time of war. State officials also threatened lawsuits over provisions of the 2005 national energy bill that granted the federal government broad authority over the siting of liquefied natural gas ports and power lines.

States completed their recovery from the 2001–03 economic downturn during the year. An expanding economy generated revenue beyond projections, outpacing increased outlays for programs such as Medicaid and allowing states to replenish “rainy day” reserve funds that had been tapped in previous years. Legislatures avoided significant tax changes. Several states produced large surpluses, notably California, which boasted $3.4 billion of black ink and its first surplus since 2000. The year saw only a modest overall increase in state taxes, and a majority of the states were preparing for tax reductions in 2006.

      As fiscal restrictions eased, many states increased spending on both K–12 and higher education, which had been targeted for unpopular reductions in previous years. Often by tightening eligibility and reducing some benefits, states managed to slow the growth of Medicaid spending from the nearly 15% increase of 2004. Tennessee, for example, started trimming 190,000 recipients from its generous TennCare program. State expenditures on correctional facilities also increased, but at a slower rate, as a 10-year prison expansion stalled. Hurricane-battered Louisiana was forced to make major across-the-board reductions in state expenditures.

      Ohio was the only state to increase overall taxes significantly, enacting a new commercial-activities tax and boosting both sales and tobacco taxes. Idaho, Iowa, and Virginia approved modest tax reductions. Seven states increased cigarette taxes, and most states increased fees for motor vehicles, driver's licenses, court costs, and other state services.

      Efforts to curb state spending suffered setbacks in several state elections. In a significant defeat for antitax enthusiasts, Colorado voters approved a suspension of the landmark 1992 Taxpayer Bill of Rights law, which limited revenue increases to population growth plus inflation. Though the law had returned more than $3 billion in refunds to state taxpayers, it had also shrunk state government relative to the state's economy and crimped state education and highway funding. Even so, the Colorado plan was being eyed as a model by several other state legislatures.

 California voters rejected an initiative backed by Gov. Arnold Schwarzenegger that would have capped state spending and given additional budget authority to the governor. Washington voters turned down a spending limit and refused to overturn a 9.5-cent gasoline-tax increase approved by the state legislature.

Marriage, Gay Rights.
      Activists seeking equal marital and other rights for homosexuals made additional progress during the year in the aftermath of a 2003 Massachusetts high-court decision that legalized gay marriage. Maneuvering to exploit or blunt the ruling's effect accelerated in courts, in legislatures, and at the ballot box across the country. Voters in two additional states, Kansas and Texas, overwhelmingly approved state constitutional amendments banning recognition of same-sex unions, bringing to 19 the number of states that rejected gay marriage in their basic state document.

      Equal-rights advocates also made breakthroughs, however. Connecticut's legislature voluntarily joined Vermont in recognizing same-sex civil unions. A similar measure, approved by the Maryland legislature, was vetoed by the state's governor. The Alaska Supreme Court ordered state and local governments to grant the same benefits to employees' same-sex partners as those offered to spouses. A federal judge in Nebraska added a new wrinkle to the debate in striking down that state's prohibition of same-sex marriage; the ruling said that the state law went impermissibly beyond regulating marriage and denied gay couples fundamental rights guaranteed by the U.S. Constitution.

      California lawmakers approved a bill recognizing same-sex marriage, the first such measure passed by a state legislature without a court order, even though California voters had rejected the concept in a 2000 statewide referendum. Governor Schwarzenegger vetoed the bill, however, saying that he preferred that the state Supreme Court decide the matter. Maine voters rejected a measure that would have overturned a legislature-approved state law banning discrimination against homosexuals in housing, employment, and education.

Ethics.
      Ohio Gov. Robert Taft pleaded no contest to four misdemeanour counts of violating state ethics laws by failing to report golf outings and other gifts. Taft, a Republican, was found guilty and fined $4,000. The ethics probe began after it was discovered that an Ohio Republican fund-raiser had lost more than $10 million of the $50 million of state money that he had invested in rare coins.

Law, Justice.
      Continuing a recent trend, states including California, Montana, and New Hampshire toughened laws governing sex crimes against children. Iowa's new law was particularly dramatic, mandating life imprisonment for a second serious offense.

       Arkansas, Nevada, North Dakota, and Texas joined California in prohibiting government use of data from recording devices contained in most new cars. South Dakota authorities had used information from such a device—which recorded speed, brake and seat-belt use, and other data recoverable after a crash—to help convict former governor William Janklow in a 2003 vehicular-homicide case. The new state laws required an owner's permission or a court order before insurers or law-enforcement personnel could access the data.

Health, Welfare.
      Continuing a recent trend, 11 states approved new laws that further restricted abortion. Mississippi, a state with only one abortion clinic, required that an abortion be performed in a hospital or surgical centre in cases in which the pregnancy had exceeded three months. Arkansas, Florida, and Idaho approved new laws requiring consent of a parent or guardian before a minor could receive an abortion; California voters, however, rejected a similar law. Though 35 states now required parental involvement for abortions obtained by minors, courts had struck down such laws in 9 additional states. Georgia mandated a 24-hour waiting period for most abortions; Indiana required doctors to offer ultrasound images to prospective abortion seekers; and Arkansas ordered that women seeking abortions after the 20th week of pregnancy receive counseling on the possibility of fetal pain during the procedure.

      States were badly split on their approach to the “morning-after” pill to prevent pregnancies. New Hampshire and Massachusetts became the seventh and eighth states to allow purchase of the pill without a prescription, with Bay State legislators overriding a gubernatorial veto to do so. New York Gov. George Pataki vetoed a similar bill. Some pharmacists balked at dispensing the drug, but Illinois Gov. Rod Blagojevich and the California legislature enacted measures requiring pharmacies that sold birth-control pills to stock the morning-after pill as well. Mississippi joined Arkansas, Georgia, and South Dakota in giving pharmacists the right to refuse to dispense the pill.

      State relationships with federal authorities on health care were uneven at best. The federal government's 2003 reform of Medicare included a new prescription-drug benefit that was initially expected to save significant state funds. Congress imposed a last-minute “clawback” provision, however, that required offsetting state payments, and nearly 30 states instead projected increased costs from the program. The U.S. Supreme Court, in a major blow to states' rights, declared that laws in California and 10 other states that allowed the medical use of marijuana had to give way to federal antidrug enforcement laws.

      A grassroots rebellion over federal mandates for K–12 schools simmered in numerous states throughout the year, despite Washington's efforts to accommodate complaints. Critics charged that the No Child Left Behind Act (NCLB) and the Individuals with Disabilities Education Act (IDEA) were excessively costly and underfunded and usurped traditional local and state control of public schools. Connecticut and Michigan filed unsuccessful legal challenges to require full NCLB reimbursement from the federal government; states estimated that the unfunded mandates would cost $18 billion annually. Utah's legislature allowed school districts to ignore NCLB requirements that necessitated state financing or conflicted with state test guidelines. The Texas education commissioner declared that the state would ignore NCLB guidelines on testing special-education students.

      Federal officials attempted to mollify state critics by granting increased flexibility. The U.S. Department of Education announced that up to 10 states would be allowed to use a “growth-based” NCLB assessment scheme similar to Utah's testing regimen.

      Texas became the first state to require public schools to spend 65% of funding on classroom expenses. The gubernatorial mandate came after the state legislature had turned down the proposal. Legislatures in Kansas and Louisiana also approved measures that encouraged the “65-cent solution.” The proposal, which was aimed at reducing administrative spending, also affected funding for school buses, counselors, libraries, and ancillary educational services. Support for another reform idea, school vouchers, remained sluggish. Utah joined Florida in enacting a statewide voucher program but limited its application to special-education students.

Consumer Protection.
 Georgia and Washington approved tough statewide smoking bans, bringing to 13 the number of states that prohibited smoking in most public areas. The Washington ballot initiative was particularly sweeping; it outlawed smoking in all public buildings and workplaces, including private clubs, and even lighting up within 7.6 m (25 ft) of doorways, windows, and air vents of public buildings. New York, in an attempt to protect students, prohibited the “unrestricted marketing” of credit cards on college campuses. Georgia declared the sending of multiple unsolicited “ spam” e-mails—10,000 in a month or 1,000,000 in a year—to be a felony punishable by up to five years in prison.

David C. Beckwith

▪ 2005

Area: 9,366,008 sq km (3,616,236 sq mi), including 204,083 sq km of inland water but excluding the 156,049 sq km of the Great Lakes that lie within U.S. boundaries
Population (2004 est.): 293,850,000
Capital: Washington, D.C.
Head of state and government: President George W. Bush

      For a third consecutive year, the strategic response to the Sept. 11, 2001, terrorist attacks by the administration of Pres. George W. Bush (see Biographies (Bush, George W. )) dominated world affairs. The U.S. plan included two highly controversial initiatives—a proclaimed right of preemptive attack, to forestall perceived threats against U.S. interests, and a long-term objective of exporting democracy worldwide, to bring human rights to such areas as Afghanistan and Iraq, which had previously known mainly tyranny and despotism.

      The administration's initiatives caused deep divisions abroad. Support came from the U.K., Australia, and emerging Eastern Europe, but other nations voiced strong opposition and resentment. At home the body politic was also split, and President Bush's foreign policies, particularly the occupation and rehabilitation of Iraq, became the central issue in the 2004 national elections.

      Costs of the Iraq intervention continued to mount during the year. At times the U.S.-led effort appeared greatly overextended, putting unsustainable strain on U.S. resources, even the well-functioning U.S. economy. Domestic critics were unable to put forward an attractive alternative path, and the November election became, in one sense, a referendum on the Bush terrorism strategy. In a high turnout of more than 60% of U.S. voters, Bush won reelection by a relatively narrow margin, 51–48%. (See Special Report (U.S. Election of 2004 ).)

War on Terrorism.
      The Bush administration could point to substantial progress in Iraq, from construction and infrastructure rebuilding to election preparations, but the U.S. was again on the defensive for most of 2004. Pentagon officials reported that 848 Americans died in Iraq during the year and another 6,000 were wounded, a casualty rate nearly twice that of 2003, a year that included the military invasion that toppled Saddam Hussein.

      Early in 2004, in an assessment that cast a pall over the administration's rationale for the war, former U.S. arms inspector David Kay reported that allied prewar intelligence on Iraqi weapons of mass destruction was “almost all wrong.” Under pressure, President Bush had reluctantly agreed to the creation of a bipartisan commission to study the 9/11 attacks and their aftermath. The commission, headed by Republican former New Jersey governor Thomas Kean, proved activist and highly critical, and its periodic public hearings and reports roiled the domestic political landscape throughout the year.

      In late March, as the U.S.-dominated occupation attempted to prepare Iraq for elections and a handover to Iraqi control, authorities in Baghdad closed down a newspaper controlled by Muqtada al-Sadr, a militant Shiʿite cleric. A few days later four U.S. security contractors were ambushed and killed while driving in Fallujah, a city controlled by Islamic militants, and their bodies were publicly defiled. Militia forces loyal to Sadr then launched coordinated attacks in five Iraqi cities. The rebellion was particularly disheartening because Shiʿites, who had long been suppressed, were seen as the major beneficiaries of the transition to democracy.

      Allied forces eventually decimated the militia, retook several cities, and, with tacit backing of a more senior Shiʿite cleric, Ayatollah Ali al-Sistani (see Biographies (Sistani, Ali al- )), arranged a cease-fire with Sadr. Allied plans to pacify Fallujah, the apparent heart of the opposition, proved highly divisive, however, prompting the resignation of two Iraqi Governing Council members. In a controversial step, the U.S. then postponed a planned major offensive on Fallujah.

      In late April photographs showing apparent U.S. military abuse of detainees at the notorious Abu Ghraib prison in Baghdad began circulating on the Internet, setting off a firestorm of criticism around the world against the U.S. occupation. The photos, taken by fellow soldiers, became key to a dozen investigations, including inquiries by both houses of Congress. Seven U.S. military personnel, most of them low-ranking, were prosecuted on abuse charges. One report called the Abu Ghraib abuse the result of “fundamental failures” in military oversight, but claims by some critics that the abuse stemmed from official U.S. policy, approved by Bush appointees, were never proved. (See Military Affairs: Special Report (POWs and the Global War on Terrorism ).)

      Coalition authorities handed over nominal control of Iraq on June 28, two days ahead of schedule, to an Iraqi interim government headed by Prime Minister Ayad Allawi (see Biographies (Allawi, Ayad )), a neurosurgeon allied with the U.S. Under the unusual arrangement, U.S. forces continued to lead security operations but operated technically under Iraqi supervision. The arrangement proved workable but did little to slow a continuing, apparently growing guerrilla insurgency, especially in Sunni areas.

      In early September, in a tacit acknowledgement of ongoing problems, the Bush administration asked Congress to reprogram funds designated for Iraqi reconstruction and shift $3.5 billion to law-enforcement and security accounts. At that point, largely owing to dangerous conditions, only 6% of the $18.4 billion appropriated in 2003 for rebuilding had been actually spent.

      Less than a week after the U.S. election, some 10,000 U.S. troops surrounded Fallujah and began a house-to-house campaign to uproot heavily armed insurgents. The assault took little more than a week to overrun the rebel area, and authorities announced that some 1,600 suspected insurgents had been killed, but most resistance leaders escaped the allied dragnet.

      Bombings, surprise attacks, and even frontal military assaults continued at a high level through the end of 2004. U.S. authorities, under continuing criticism for failing to supply adequate troop strength and supplies, including body and vehicle armour, announced plans to increase the U.S. presence to 150,000 in early 2005.

Domestic Policy.
      In 2004 numerous bills bogged down in partisan wrangling as both political parties maneuvered for electoral advantage, and congressional productivity was light.

       Democrats continued to throw up roadblocks to Bush appellate court nominees deemed excessively conservative, preventing 10 of 34 named by Bush during his first term from gaining an up-or-down vote on the Senate floor. The gridlock became an issue in the fall elections, with Senate Majority Leader Bill Frist, in a break from tradition, traveling in May to South Dakota, the home state of Sen. Tom Daschle, his Democratic counterpart, to campaign for Daschle's GOP opponent. Daschle was defeated. Following the election, Republican Sen. Arlen Specter of Pennsylvania, slated to become chairman of the Senate Judiciary Committee, seemed to warn President Bush in an interview against nominating antiabortion judges; following a storm of protest that reached his Senate colleagues, Specter withdrew his statement.

      With few exceptions, only relatively minor legislation was approved prior to November. One significant election-eve law awarded $140 billion in tax relief to U.S. business, including a $10 billion buyout for tobacco growers. Another bill temporarily extended four middle-class tax cuts previously won by the Bush administration but scheduled to expire, including the $1,000-per-child tax credit, expansion of the lowest (10%) tax bracket, exceptions for the alternative minimum tax, and relief from the so-called marriage penalty for two-income families.

      Reacting to increased abuse in the computer age, Congress increased penalties for identity theft, a growing source of fraud. At the urging of the Bush administration, and over objections of abortion rights advocates, Congress also specified that an individual alleged to have committed a violent crime against a pregnant woman could also be charged with a second offense, against the unborn child.

      Four hurricanes—Charley, Frances, Ivan, and Jeanne—rolled over Florida, a hotly contested presidential battleground state, during a six-week period in the fall, causing an estimated $50 billion in property damage. Congress responded with a $2 billion disaster-relief appropriation for the Federal Emergency Management Agency, followed later by another $11 billion in hurricane aid.

      As Massachusetts became the first state to legalize same-sex marriage, Congress struggled to fashion a federal legislative response. A proposed U.S. constitutional amendment defining marriage as only between a man and a woman went nowhere; the House approved the measure by only 227–186, less than the two-thirds required, and the Senate, by a 48–50 vote, failed even to gain sufficient support to stop debate on the measure. The House pursued an alternative idea, approving a measure to prohibit federal courts from hearing challenges to the 1996 Defense of Marriage Act. The Senate, however, never took up the bill. (See Law, Crime, and Law Enforcement: Special Report (Legal Debate over Same-Sex Marriages ).)

      Numerous congressional bills died or were postponed, including ones regarding bankruptcy reform, the banning of assault weapons, welfare reform, asbestos lawsuits, class-action and medical-malpractice legislation, and increased funding for federal highway construction.

      Congress adjourned in early October without having made major changes to the highly decentralized U.S. intelligence structure. Pressure generated by the 9/11 commission, however, helped prompt a congressional lame-duck session in early December. The result was a bipartisan reorganization of national intelligence operations under a single director, along with new surveillance and antiterrorism powers for the new agency.

The Economy.
      World turmoil affected the nation's domestic business climate but failed to stop the continued expansion of the resilient U.S. economy. Dramatically higher oil prices put a damper on otherwise strong U.S. economic growth. The U.S., spending heavily at home and abroad, resumed its place as the world's main economic engine in 2004, at least temporarily shrugging off heavy costs associated with homeland security and the war on terrorism and finally reversing a decline in employment that had started with the 2001 recession.

      As the year began, the economy was growing at a robust pace. Expansion was stimulated by federal tax cuts and outlays from a record federal budget deficit and aided by low interest rates, modest inflation, and oil selling for $32.50 per barrel. Energy supplies, however, tightened under demand pressure from growing economies worldwide, especially in China. The growing insurgency in Iraq threatened supplies, as did less-violent uncertainty during the year in other major petroleum-producing countries, including Saudi Arabia, Russia, Nigeria, and Venezuela. By late October oil topped $55 per barrel, which acted as a major drain on the U.S. economy and helped turn what might have been an extraordinary economic year into a mere solid one.

      The U.S. GDP grew by 4.5% in the first quarter and readily topped 3.5% for the remainder of the year. The Federal Reserve Board increased historically low short-term interest rates by a modest 0.25% on five separate occasions, ending the year at 2.25%. The consumer price index rose by more than 3.5% for the year, higher than in recent years, but nearly half of that increase was attributable to higher energy prices.

      The national prosperity was fueled in part by unprecedented and disquieting red ink. The 2004 federal budget deficit, swollen by war, homeland security, and tax-cut measures, was $422 billion, less than forecast early in the year but easily topping the previous record 2003 deficit of $377 billion. U.S. imports of petroleum and Asian consumer goods drove trade deficits to a record pace, exceeding $50 billion a month through the year. The weight of both deficits helped drive down the value of the U.S. dollar, a drop that accelerated after the November elections. The dollar finished the year at a historic low against the euro.

       Unemployment drifted lower during 2004, from 5.7% to 5.4%. About two million new jobs were created in the U.S. during the year, a creditable performance but not sufficient to offset fully the jobs lost during the recession. In addition, jobs continued to be “offshored” to countries with lower labour costs. (See Economic Affairs: Special Report (Offshoring ).)

      The nation's equity markets followed a major bounceback in 2003 with a solid, if unspectacular, upward move in 2004. Broad indicators showed overall share prices rising nearly 10% during the year, though some indexes were lower. The Dow Jones Industrial Average started the year above 10,400, but energy-price increases and election uncertainty caused a sell-off to 9,750 in late October. With election jitters settled, the Dow started a year-end rally and finished at 10,783, a gain of 3% for the year.

      Business news was dominated by continued fallout from 2001–02 corporate scandals. Two onetime business titans, Kenneth Lay of Enron and Bernie Ebbers of WorldCom, were indicted for their roles in accounting irregularities that afflicted their companies. John Rigas, CEO of Adelphia, a major cable company, was convicted on 18 felony counts for misappropriation of corporate funds. Martha Stewart, head of a successful marketing and publishing company carrying her name, was convicted of having lied about stock trades and sentenced to five months' imprisonment. Stewart appealed the decision but began serving the sentence in October at a West Virginia penal facility in hopes of limiting damage to her firm.

      New York Attorney General Eliot Spitzer (see Biographies (Spitzer, Eliot )), who had rocked the mutual-fund industry in 2003 with allegations of late trading and other improprieties, turned his attention to insurance in 2004. In a wide-ranging investigation affecting almost all types of insurance, Spitzer charged two companies with civil fraud for alleged bid rigging and steering of business. At year-end several insurers, while acknowledging problems in their industry, called for Congress to replace state regulation of insurance companies with federal oversight.

Foreign Policy.
      With maneuvering ability almost nonexistent, owing to the war in Iraq, and constricted by domestic political considerations, U.S. diplomacy struggled through a dark 2004. Resentment toward perceived U.S. unilateralism coloured relationships with several countries, and despite earnest efforts, only marginal progress was recorded in expanding international participation in Iraq's security and reconstruction. The year saw some bright moments, particularly in nurturing democracy in Afghanistan, Indonesia, and Ukraine, but overall the year was replete with frustrations.

      U.S. attempts to stop Iran's and North Korea's development of nuclear weapons capability met little success. Early in the year Iran reneged on 2003 promises to cease uranium enrichment, a process that can produce either low-grade nuclear fuel or raw material for nuclear weapons. The U.S. pressed the International Atomic Energy Agency for punitive sanctions; the U.K., France, and Germany, however, offered Iran a trade pact with the European Union instead. Iran eventually agreed to a temporary halt in enrichment activities, one that critics said would be meaningless in the country's drive for weapons capability.

      A long-running effort to dismantle North Korean nuclear designs made even less progress during 2004. The U.S. again refused North Korean demands for bilateral negotiations, insisting instead on six-party talks that included Japan, Russia, China, South Korea, North Korea, and the U.S. A June meeting produced no notable result, and North Korea then refused further negotiations, openly suggesting that the U.S. election might produce a new U.S. administration. The talks remained stalled at year's end.

      The brightest chapter in international cooperation came in Afghanistan, which had lacked a democratic tradition. With the assistance of numerous countries, however, Afghans set up a voter-registration system and attracted nearly eight million voters, with substantial participation by previously disenfranchised women. The Afghan success, along with democratic electoral progress in Indonesia and Ukraine, was considered a major accomplishment in the Bush administration's campaign to spread democracy worldwide.

      U.S. relations with Russia deteriorated amid charges that Russian Pres. Vladimir Putin was eroding democratic reforms, confiscating private property, and interfering in the internal affairs of European neighbours. In the Middle East, Russia was also suspected of providing assistance to Iran in its nuclear ambitions. U.S. authorities maintained a public facade of cooperation with the Putin regime but expressed private dismay over a variety of Russian actions, including nationalization of the giant Yukos oil company and heavy-handed—and ultimately unsuccessful—attempts to influence the election in Ukraine. (See Ukraine .)

      Bush administration relations with the UN were also superficially correct but deteriorated significantly. The international organization was rocked by scandal, ranging from harassment allegations against ranking officials at the UN headquarters in New York to sexual mistreatment of women and girls by UN peacekeepers in the Democratic Republic of the Congo to culpability in having allowed Saddam Hussein to divert an estimated $21 billion from the “oil for food” program. A Republican-led congressional inquiry into oil for food was largely stonewalled by UN officials, and prominent U.S. legislators publicly called for the resignation of UN Secretary-General Kofi Annan.

      The U.S. also fumed over lack of UN support for Iraq. UN relief officials had largely departed from Iraq in 2003 following a bombing attack on their headquarters and, citing ongoing security concerns, failed to return in 2004. In a notable interview in mid-September, only weeks before U.S. elections, Annan declared the 2003 U.S.-led invasion of Iraq to have been an illegal act, a declaration that Bush officials judged excessively political.

      The UN's largely ineffectual response to humanitarian concerns in the Darfur region of The Sudan was yet another issue. More than 100,000 Darfur residents, mostly non-Arab Muslims, were driven from their homes by government-backed Arab militias, and thousands died. U.S. Secretary of State Colin Powell called the situation “genocide” and facilitated U.S. aid, but UN efforts to stop the ethnic disruption were minimal.

      The tsunami disaster that followed the December 26 earthquake near Sumatra, Indon., also strained U.S.-UN relations. As the magnitude of the disaster began to unfold, the U.S. pledged an initial $15 million to the relief effort, and a ranking UN official labeled donations by wealthy countries as “stingy.” Within hours of the disaster, however, the U.S. began deploying military resources and mounted a major humanitarian-relief campaign to affected areas in conjunction with Australia and Japan, often bypassing the UN relief bureaucracy. The U.S. contributed $350 million to the relief effort, and Americans gave more than $200 million in private funds; donations were rising at year's end. (See Disasters: Sidebar (Deadliest Tsunami ).)

      The long-stalled Middle East peace process appeared close to renewal in November with the death of Palestinian leader Yasir Arafat (see Obituaries (Arafat, Yasir )), whose intransigence and encouragement of violence against Israel were widely blamed for the breakdown of a key 2000 U.S.-sponsored peace accord.

David C. Beckwith

Developments in the states
      A long-awaited economic expansion finally ended a serious budget crisis in U.S. state governments in 2004. The recovery was modest, however; it allowed the replenishment of exhausted accounts but little expansion of services. States continued to wrestle with the federal government over education, health care, and prescription-drug reimbursement, among other problems.

Party Strengths.
       Democrats made notable gains in 2004 in state legislative elections, and Republicans appeared to increase their control of governorships. The results left the two parties at virtual parity in state governments nationwide at year's end. In 2005 Republicans would control both state legislative chambers in 20 states, down from 21 in 2004, and Democrats would dominate both bodies in 19 states, up from 17 in 2004. Ten states were split, with neither party organizing both chambers, and Nebraska had a nonpartisan legislature.

      Republicans enjoyed a 28–22 edge in governorships for most of the year. In the November balloting Democrats took away GOP seats in Montana and New Hampshire, but Republicans captured previously Democratic governorships in Indiana and Missouri. In Washington, after the closest gubernatorial election in state history, it appeared after the first recount of 2.9 million ballots that Republican Dino Rossi had bested Democrat Christine Gregoire by 42 votes, but the Democrats challenged the results. Following a second recount, Gregoire was declared the winner by 129 votes in December. That left the prospective Republican advantage for 2005 at 28–22.

Structures, Powers.
      An attempt to divide Colorado's presidential votes in the electoral college proportionately, abandoning the winner-take-all system, was soundly defeated in November voting. Citizens in Arkansas and Montana rejected November ballot proposals to relax term-limit laws for state officials. Wyoming's Supreme Court invalidated that state's term-limit law just as it began to take effect. Of the 21 states that had approved term-limit laws in recent years, 6 had seen those laws thrown out or repealed.

      Numerous states expanded early-voting opportunities, and Missouri, North Dakota, and Utah allowed overseas military personnel to vote by e-mail. South Dakota established a constitutional review commission.

Government Relations.
      State relationships with the federal government, which had always been strained, were tumultuous during 2004, particularly on public-education policy. Congress again extended a ban on state taxation of Internet services, this time until 2008. In another controversial action, a federal ban on the manufacture and sale of certain semiautomatic weapons was allowed to expire; only five states had enacted curbs on so-called assault rifles.

      The U.S. Supreme Court, in a 5–4 decision affecting 13 states, prohibited judges from using aggravating factors not found by a jury to lengthen sentences beyond those supported by the jury's verdict. In a bow to seven states that impose no personal income tax, Congress approved a two-year measure to allow optional deduction of sales taxes on federal income-tax returns.

      Pressure on state budgets eased markedly in 2004 as the national economy recovered, and this led to an uneventful year for tax legislation. States still faced substantial budget shortfalls, but most were able to balance their books without raising taxes or substantially cutting state spending. With budgets tight, few states expanded social services.

      Only nine states raised taxes during the year. Arkansas and Virginia increased their sales taxes. Alabama, Colorado, Michigan, Oklahoma, and Rhode Island raised their tobacco tax. Two states boosted personal-income levies on their highest-income taxpayers; California dedicated the added revenue to expanding mental health programs, and New Jersey funded a property-tax-rebate plan. Oregon voters repealed substantial personal and corporate tax increases approved by the 2003 legislature, and legislators in Iowa and New Hampshire reduced state sales taxes.

      Overall, states began rebuilding “rainy-day” funds and repaying accounts that had been used to steer state budgets through the 2001–03 down cycle. In recent years California, which was particularly hard-hit by the bursting of the dot-com bubble, had accounted for nearly 40% of state budget shortfalls. At the urging of California Gov. Arnold Schwarzenegger, voters endorsed resolution of the crisis via a $15 billion bond issue early in the year. The state worked through the down period by reducing spending (particularly on education), raiding other state funds, and increasing revenue incrementally via a tax-amnesty plan.

      Some 35 legislatures considered bills designed to curb outsourcing of jobs abroad, usually by banning out-of-state or foreign companies from doing state work. Only Tennessee enacted an antioutsourcing law, however, while governors in Maryland and Massachusetts vetoed similar measures. (See Economic Affairs: Special Report (Economic Affairs ).)

      Fallout from the November 2003 Massachusetts Supreme Judicial Court decision making same-sex marriage a state constitutional right created turmoil nationwide throughout the year. Backers of traditional marriage took vigorous steps to overturn the decision and to limit its effect to Massachusetts, with only partial success.

      When the decision became effective on May 17, state officials forestalled a nationwide influx by declaring that only Massachusetts residents were eligible for marriage licenses. The state legislature took initial steps toward placing the issue on the 2006 statewide ballot, obtaining 105 votes (with 101 required) for a constitutional amendment permitting civil unions but not same-sex marriage. Another legislative vote in 2005 was required before the ballot measure would be scheduled.

      Reaction in some states was sympathetic. New Jersey, anticipating a similar court decision in an ongoing lawsuit, joined Vermont in recognizing same-sex civil unions. Two lower court decisions in Washington state also declared the state ban on same-sex marriage to be unconstitutional, but the cases were appealed. Local authorities in several jurisdictions, including San Francisco and Portland, Ore., began issuing same-sex marriage licenses before state authorities intervened; the San Francisco action was voided by the state Supreme Court.

      Other states began taking legal steps to prevent the Massachusetts decision from being recognized under the U.S. Constitution's “full faith and credit” clause. Louisiana and Missouri voters and state legislators in Wisconsin joined four other states in amending their state constitutions to ban same-sex marriages. On November 2, voters in 11 additional states overwhelmingly approved constitutional amendments: Oregon, Mississippi, and Montana barred same-sex marriages; Arkansas, Georgia, Kentucky, Michigan, North Dakota, Oklahoma, and Utah banned civil unions as well as domestic partnerships; and Ohio outlawed any benefits to same-sex couples. (See Crime and Law Enforcement: Law: Special Report (Legal Debate over Same-Sex Marriages ).)

      Two governors, John Rowland of Connecticut and James McGreevey of New Jersey, were forced to resign under a cloud of scandal during the year. Rowland, a Republican, quit June 21 as a federal grand jury probed multiple charges that he had steered state contracts to favoured firms and received free remodeling services from state contractors. His resignation halted impeachment proceedings initiated by the state legislature. In December Rowland pleaded guilty to a single federal felony count of conspiracy to steal honest services.

      McGreevey, a Democrat, became the first governor in history to be forced out over a sex scandal. On August 12, after a male former aide threatened him with sexual-harassment litigation, McGreevey announced that “I am a gay American” and declared that he would quit three months later. He was succeeded by the state Senate president, a Democrat, who would serve until January 2006; if McGreevey had left immediately, a special election in November would have filled the vacancy.

Law and Justice.
      States moved aggressively to combat escalating medical-malpractice insurance premiums, which were widely blamed on personal-injury lawsuits. Thirteen legislatures approved malpractice-relief bills, but governors in three states (Connecticut, Iowa, and Missouri) vetoed them. Florida voters approved a far-reaching plan to curb lawsuits and place a ceiling on noneconomic damage awards, and Nevada voters embraced a cap on noneconomic damages, but similar measures in Oregon and Wyoming were rejected in November balloting.

      Ohio became the first jurisdiction to reform asbestos-exposure litigation, which in recent years had led to the bankruptcy of more than 70 corporations. The new law required that plaintiffs prove that they were actually ill before they could receive compensation; up to two-thirds of current asbestos claimants had not been diagnosed with cancer or other diseases.

      Voters in Alaska rejected a proposal to effectively legalize and regulate marijuana use. Montana became the 11th state, most of them in the West, to allow the use of marijuana for medicinal purposes, but Oregon voters rejected an expansion of the state's similar program. Voters in Alaska and Maine turned down proposals to stop using baited traps in the hunting of bears.

      State-sponsored gambling enjoyed mixed luck during the year. Oklahoma and Pennsylvania allowed slot machines or video lottery terminals at horse-racing tracks. Oklahoma voters approved a new state lottery, with proceeds dedicated to education. Michigan voters, however, demanded veto power over any further expansion of gambling. Nebraska voters rejected a casino gambling plan approved by the state legislature, and California and Washington voters turned down revenue plans funded by expansion of Native American casinos.

      Loopholes exposed in the highly publicized case involving basketball player Kobe Bryant of the Los Angeles Lakers prompted California and Colorado to strengthen their shield laws protecting the identity of rape victims. Wisconsin barred police from requiring that rape victims submit to a lie-detector test.

      California became the first state to order suspects to submit DNA samples for testing after a felony arrest. Voters also narrowly defeated a proposal to relax the state's “three strikes” law, which mandated life imprisonment on a third felony conviction. A downward trend in application of the death penalty continued during 2004. During the year only 59 convicts were executed nationwide, down from 98 in 1999.

Health and Welfare.
      Conflict between state and federal approaches to health care policy was high during 2004, particularly over prescription drugs. A growing number of states—including Illinois, Minnesota, North Dakota, New Hampshire, and Wisconsin—actively defied a Food and Drug Administration (FDA) ban on the importation of drugs from abroad, particularly Canada, by setting up Internet sites to assist with such purchases. Oregon floated a plan to license foreign pharmacies; Minnesota waived co-payments for state employees and ordered Canadian drugs; and Vermont filed a lawsuit against the U.S. government seeking permission to import drugs directly. At year's end the FDA was continuing to battle the state action, asserting that uninspected imported drugs were not safe.

      The limits imposed by the administration of Pres. George W. Bush on federal stem-cell research were challenged in several states. New Jersey expanded funding for a state stem-cell institute, and in November California voters approved $3 billion in state bonds to support embryonic stem-cell research over 10 years. Delaware established a novel $10 million anticancer research program, which would guarantee health benefits for uninsured patients.

      States reacted warily as initial benefits began flowing from the federal government's 2003 reform of Medicare. Prescription-drug discount cards were offered to seniors nationwide, which created some confusion in 22 states that assisted with drugs via discount or subsidy programs. Twelve states approved new legislation to help transition seniors into expanded federal drug benefits expected in early 2006. (See Social Protection: Sidebar (Medicare's New Prescription-Drug Program ).)

      Georgia and Wisconsin became the first states to grant a major tax credit to encourage organ donation. Illinois allowed organ transplants from HIV-positive donors to HIV-infected patients. Colorado, Tennessee, and Washington joined four other states that restricted student access to candy-, snack-, and soda-vending machines in public schools.

      State spending on Medicaid low-income health assistance—the states' fastest-growing program—continued to strain budgets, with a fourth consecutive year of double-digit increases. States continued to react by trimming benefits and eligibility, and Tennessee contemplated a wholesale revamping of its signature TennCare plan.

      California issued regulations aimed at fighting global warming by mandating reduced greenhouse-gas emissions, including carbon dioxide, in automobiles. Seven northeastern states tied their emission standards to California's. Arizona voters approved a law barring undocumented aliens from voting or applying for social services.

      State officials chafed under increasing pressure of the 2002 federal No Child Left Behind Act, which mandated gradually increasing standards for teachers and students. One-quarter of public schools failed initial testing requirements, and states sought exemptions from requirements for stepped-up teacher certification and achievement for at-risk and minority students. Measures protesting the estimated $9 billion in annual costs, the penalties, and the unprecedented federal oversight were introduced in more than 20 legislatures. Only Maine and Utah, however, enacted legislation promising critical review of the Bush administration initiative.

Consumer Protection.
      Utah became the first state to ban “spyware,” software installed on a computer without the owner's consent. New Jersey joined New York in banning the use of handheld cellular phones while driving. Massachusetts became the sixth state to outlaw smoking in virtually all public places, and Idaho also approved public-smoking curbs, with the exception of bars.

David C. Beckwith

▪ 2004

Area: 9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries
Population (2003 est.): 291,587,000; based on 2000 unadjusted census results
Capital: Washington, D.C.
Head of state and government:
President George W. Bush

      Even as the U.S. struggled for months during 2003 with a sluggish economy and the multiple burdens of an unprecedented war on terrorism, overextension of unrivaled U.S. military and economic power seemed a remote prospect. In March, however, the United States initiated its second major military incursion in a Muslim country in 18 months when it led an invasion into Iraq. (U.S. troops were still committed to Afghanistan.) While major combat was over quickly, an untidy aftermath in Iraq seriously strained both American resources and the national will. The aggressive U.S. action, grounded in a new assertion of the right to wage “preemptive war” against terrorists, badly divided the country's traditional allies and energized a long-dormant antiwar faction in the domestic American body politic. By year's end, although there were signs of stabilization in Iraq, the U.S. was scaling back ambitious plans to transform Iraq into a Western-style democracy, and the ultimate outcome of the U.S. commitment was very much in doubt.

      Backed by a handful of major countries, dubbed the “coalition of the willing” by Pres. George W. Bush, the U.S. in early spring overran Iraq in a little over three weeks. The invasion was at least partially justified on the basis of fears, fueled by reports compiled by Western intelligence agencies, about Iraq's possession of weapons of mass destruction, which by year's end had not been found. The liberation of Iraq was a clear humanitarian triumph, however, and a tonic for the U.S. economy as well. By coincidence or not, U.S. business expansion resumed with a vengeance in the weeks following the war, emphatically ending a 30-month economic malaise.

War in Iraq.
      During January and February some 300,000 U.S. and British troops and 1,150 coalition aircraft were deployed near Iraq—even while 200 newly admitted United Nations inspectors under Hans Blix scoured suspected Iraqi sites, looking for evidence of nuclear, chemical, and biological weaponry and banned missile systems. (See Military Affairs: Sidebar (Defining Weapons of Mass Destruction ).) The inspection team had limited success; it located and began arranging destruction of 120 al-Samoud 2 missiles but found no evidence of an active nuclear-weapons program. Additionally, Iraq could not account for chemical and biological agents, including anthrax, that had been in its possession in the late 1990s.

      Several influential countries, including France, Germany, and Russia, viewed the inspections as a major step forward in disarming Iraq; they counseled patience and additional diplomacy. Bush and British Prime Minister Tony Blair, however—their armed forces poised at combat readiness—declared the Iraqis to be stalling and continued the allied military buildup. The coalition suffered a major setback on March 1 when the Turkish parliament narrowly rejected a plan to allow U.S. troops to use Turkey, on Iraq's northern border, as a staging area.

      On March 17 President Bush gave Iraqi Pres. Saddam Hussein (see Biographies (Hussein, Saddam )) and his family 48 hours to leave the country so that all UN weapons-disarmament decrees could be fully enforced. Two days later, with no explicit UN approval, the U.S. began launching Tomahawk missile strikes on suspected Iraqi leadership sites. Coalition troops began crossing the Iraqi border from Kuwait on March 20. The attack moved quickly toward Baghdad from the west and southwest, covering 300 km (186 mi) in less than a week. Direct resistance was light, although guerrilla attacks behind supply lines inflicted some casualties on coalition forces. A week later U.S. airborne forces opened a third front from the north.

      By April 4 the U.S. expeditionary force had captured Saddam International Airport near Baghdad. Threats of block-by-block Iraqi resistance in crowded urban areas of Baghdad proved illusory. Repeated armoured probes of the capital failed to encounter major resistance, and the city was largely under coalition control by April 9. By the middle of the month, the final remnants of Iraqi military forces had been dispersed. Statues of Saddam were toppled all over Iraq even as looters ravaged government offices and cultural centres that occupation troops had left unprotected. Fewer than 200 allied service personnel, including 138 Americans, died from hostile action during the invasion period.

      Almost immediately, however, hit-and-run attacks began on coalition forces even as the allies appointed an Iraqi Governing Council to oversee the transition to Iraqi civilian rule. The death of Saddam's sons Uday and Qusay on July 22 did little to stop sabotage and resistance. (See Obituaries (Hussein, Uday, and Hussein, Qusay ).) The attacks reached a crescendo in November when a series of bombing and missile attacks on helicopters, planes, and military vehicles left 81 Americans dead. On December 13, however, U.S. forces discovered Saddam hiding in an underground “spider hole” near his hometown, Tikrit. He was captured and held for trial. Even so, by year's end U.S. casualties had reached 480, and attacks on U.S. troops were continuing daily.

      The war on terrorism, including the Iraq invasion, dominated both U.S. domestic politics and foreign policy throughout the year. The war split Democrats and roiled the Democratic presidential campaign, leading directly to the emergence of former Vermont governor Howard Dean as the front-runner for the 2004 nomination. Traditional allies of the U.S., led by France, declined to share in the costs of putting Iraq back on its feet. In September the Bush administration acknowledged reluctantly that reconstruction costs in Iraq and Afghanistan would require $86 billion in additional U.S. funds. After extended controversy, Congress eventually approved the outlay.

Domestic Issues.
      Although Bush had vowed to bring a bipartisan civility to Washington, partisan divisions in Congress deepened during the year as the country prepared for the 2004 elections. In the U.S. Senate, Democrats expanded a campaign to block administration judicial nominees they considered excessively conservative from being confirmed to circuit courts of appeal. By the end of the year, an unprecedented six nominations were being stalled by threat of filibuster. The Congress also again failed, owing to regional and partisan differences, to approve long-considered legislation to expand U.S. energy supplies.

      After 15 years of discussion, however, legislators approved reform of the national Medicare system for older Americans, adding a controversial prescription-drug benefit and introducing private-sector competition to the plan. The price tag for the new drug-benefit entitlement was $400 billion over 10 years, an amount deemed inadequate by liberals and excessive by fiscal conservatives. Republicans were not keen to expand Medicare, but with medical costs rising rapidly and shifting heavily toward drug therapy, public support for a drug benefit was rising. In his 2000 campaign, candidate Bush had promised action on the measure, and in 2003 he pushed aggressively for its passage prior to the 2004 election year. The bill was approved only after it was endorsed by AARP (formerly the American Association of Retired Persons), an influential lobbying group for seniors, which vowed to seek improvements, including expansion of benefits, in future years.

      The Federal Trade Commission established a national “do not call” registry for persons wishing to avoid unsolicited sales calls over the telephone. More than 60 million telephone numbers were quickly registered, and Congress endorsed the registry in later legislation. Adopting an idea pioneered by state governments, Congress also approved a bill that permitted consumers free access to their credit reports. After a series of forest fires in California killed 22, destroyed 4,800 buildings, and burned nearly 400,000 ha (1 million ac) of land, Congress approved the Bush administration's Healthy Forests initiative. The measure, which was signed into law in December, provided for active federal land management, including thinning of undergrowth and planned burns, to reduce fire damage.

      Following an adverse ruling from the World Trade Organization, the Bush administration moved to rescind protective tariffs on steel imports first imposed in 2002. The tariffs had fulfilled another Bush campaign promise, this one to steel-manufacturing areas, but they were wildly unpopular among consumers of steel, including automobile manufacturers.

      The U.S. Supreme Court, in a 5–4 decision, upheld most of the 2002 McCain-Feingold campaign-finance law designed to reduce the influence of special-interest money in federal elections. The high court approved the law's ban on national “soft money” donations by corporations and labour unions and endorsed curbs on advertising by third-party groups that benefited individual candidates. In another landmark decision, also by a 5–4 vote, the high court approved limited use of affirmative-action policies to benefit minority candidates for admission to institutions of higher learning. (See Law, Crime, and Law Enforcement: Court Decisions.)

      The struggling national economy and controversy over President Bush's handling of Iraq drew a large field for the 2004 Democratic presidential nomination—10 candidates at one point, before their ranks were reduced to 9. The 10 were Dean; Rep. Richard Gephardt of Missouri, the former U.S. House leader; Rep. Dennis Kucinich of Ohio; North Carolina Sen. John Edwards; Florida Sen. Bob Graham (who dropped out after five months); Massachusetts Sen. John Kerry; Connecticut Sen. Joseph Lieberman; former Illinois senator Carol Moseley Braun; U.S. Army Gen. Wesley Clark; and African American leader Al Sharpton. Washington outsiders soon established themselves as the front-runners, however. Dean distinguished himself with a strong antiwar stance and savvy use of the Internet for fund-raising and organization; by year's end he was ahead in public-opinion polls but under assault from other contenders as excessively liberal and unelectable against Bush. Late entry Clark was viewed by some Democrats as best positioned to challenge Bush; he received backing from party moderates and aides to former president Bill Clinton.

      The U.S. space program sustained a catastrophic loss on February 1 when the space shuttle Columbia orbiter disintegrated on reentry over Texas, killing all seven astronauts on board. (For Obituaries of Columbia astronauts, see Michael P. Anderson (Anderson, Michael P. ), David M. Brown (Brown, David M. ), Kalpana Chawla (Chawla, Kalpana ), Laurel Blair Salton Clark (Clark, Laurel Blair Salton ), Rick D. Husband (Husband, Rick D. ), William C. McCool (McCool, William C. ), and Ilan Ramon (Ramon, Ilan ).) The tragedy was eventually traced to a 680-g (24-oz) section of foam insulation that had broken away from an external fuel tank on liftoff, damaging Columbia's left wing and dooming the mission. A commission of inquiry later criticized NASA for having a culture that allowed schedule requirements to dominate safety concerns. (See Physical Sciences: Space Exploration (Physical Sciences ).)

The Economy.
      With the world watching its main economic engine nervously, the U.S. economy finally shrugged off a lingering hangover from dot-com overexuberance and resumed serious growth during 2003. The revival ended two agonizing years of national economic drift and arrived even as the business community was wrestling with new allegations of wrongdoing in financial markets.

      The year started sluggishly, with the economy technically expanding but at such an anemic rate that jobs continued to disappear overall. Economic growth averaged only about 2% for the first six months, and unemployment rose from 5.7% in January to 6.4% by midyear. Government officials appeared to have exhausted their ability to recharge the economy. The Federal Reserve System reduced the already-low federal funds interest rate by one-quarter point to 1% in June.

      A coalition victory in Iraq, however, appeared to inspire an early spring revival in the equity markets, and by the third quarter the national economy was growing at an 8.2% rate, the fastest clip in two decades. The brisk expansion was fully under way by summer, spurred by low interest rates and inflation and the stimulus of major tax cuts and government spending flowing through the economy. Growth was also aided by a sizable jump in worker productivity, reflecting business economizing and efficiencies.

      Another stimulative factor was a record federal budget deficit. The shortfall was estimated at $455 billion in July, but rapid second-half growth lowered the actual deficit to $374 billion by October. Reversing their historic role, Democrats criticized the Bush administration for fiscal irresponsibility, alleging that Republican tax cuts mostly benefited the wealthy and were creating debt to be paid by future generations. Republicans attributed most of the shortfall to temporary costs associated with the moribund economy and the war on terrorism. Even so, the deficit and the rapid growth failed to produce any revival of inflation, with consumer prices growing less than 2% for the year. Though unemployment fell back to 5.7% in December, only 1,000 jobs were created that month, and about 309,000 people stopped looking for work.

      By year's end the equity markets had posted substantial gains, their first in three years. The Dow Jones Industrial Average finished the year at 10,453.92, more than 3,000 points higher than the March low, and the technology-heavy Nasdaq average rose from 1253.22 in March to above 2000 at year's end. Even so, both averages remained well under their record highs, established in 2000.

      Incidence of corporate wrongdoing and accounting irregularities, widespread during 2002, subsided during the year, but the national system of market regulation sustained major strains. Richard Grasso, chairman of the New York Stock Exchange, was forced to step down after his $187.5 million compensation package was revealed. The nation's $7 trillion mutual-fund industry was rocked by allegations of misconduct, including after-hours and insider trading. The mutual-fund investigation was spearheaded not by the federal Securities and Exchange Commission, which normally took the lead role in market regulation, but by Eliot Spitzer, the controversial and aggressive New York state attorney general.

Foreign Policy.
      The new U.S. preemptive-war policy, and particularly U.S. action in Iraq, threatened to fracture U.S. relations with several European powers. In March, France, Germany, and Russia blocked passage of a United Nations resolution authorizing the Iraq incursion. After the U.S.-led coalition victory, France was among the countries refusing to contribute security forces to restore law and order in Iraq and declining to assist in that country's economic reconstruction.

      With costs rising, including financial outlays and U.S. troop casualties, diplomacy came close to a breakdown. At one point U.S. Defense Secretary Donald Rumsfeld derisively dismissed recalcitrant major powers as “Old Europe,” contrasting their foot dragging with the actions of new democracies such as Poland, Romania, and Bulgaria, as well as other countries that wholeheartedly supported the coalition effort. The Pentagon then explicitly refused to consider corporate construction and supply bids for Iraq from countries that had failed to support the war effort, which further angered French, German, and Russian interests. At year's end, however, President Bush dispatched former secretary of state James Baker to negotiate a reduction of the $120 billion external debt left by the Saddam regime. Baker was largely successful, and U.S. diplomatic relations with its estranged allies improved.

      Another major effort to resolve the long-standing Israeli-Palestinian standoff foundered during the year. The European Union, Russia, the United States, and the United Nations devised a “road map to peace” and obtained nominal agreement to it from both sides. To aid in breaking the deadlock, Palestinian leader Yasir Arafat was forced to share power by appointing a prime minister. The new official, Mahmoud Abbas (see Biographies (Abbas, Mahmoud )), was not able to assert his authority, however, and he resigned his post, leaving the Middle East peace process with no significant progress for the year.

      Concerns over nuclear proliferation in Third World countries continued to preoccupy U.S. diplomats. As the year began, North Korea withdrew from the Nuclear Non-proliferation Treaty, the first signatory ever to do so, and threatened concerted efforts toward building up its nuclear-weapons program. North Korea insisted on direct negotiations with the U.S., preceded by a U.S. nonaggression guarantee. Six-country talks, including North Korea's ally China, were held at midyear, without apparent progress, but after the U.S. offered limited security promises, negotiations were again resumed at year's end.

      Iran and Libya, under international pressure, promised to open their long-running and secretive nuclear programs to inspection during the year. Iran revealed that its efforts had been under way for 18 years, which prompted U.S. calls for punitive measures, but UN authorities elected instead to push only for more effective future inspections. Libya, struggling to escape UN economic sanctions, agreed to pay $2.7 billion to families of victims of the 1988 airline tragedy in Lockerbie, Scot. Later in the year a shipment of centrifuge equipment heading to Libya was intercepted at an Italian port, the first action under a U.S.-led 11-nation Proliferation Security Initiative. Within weeks the Libyan regime publicly disclosed its own nuclear-weapons-development program and promised to dismantle it. Bush administration backers attributed progress on nuclear nonproliferation to the U.S. hard line on Iraq.

      U.S. relations with China continued to warm despite concerns over a major trade imbalance and Taiwan. As the Chinese economy expanded rapidly, creating a massive trade surplus with the U.S., the Bush administration suggested that China was manipulating its currency to make the trade imbalance even more one-sided. Later, however, as Taiwan politicians talked of independence, the U.S. forcefully reminded them that the U.S. “one China” policy opposed any complete and permanent Taiwan-China break.

David C. Beckwith

Developments in the States
      A roller-coaster national economy and unsettled relations with the federal government made 2003 a turbulent year for U.S. state governments. Severe budget problems worsened early in the year, which prompted a variety of measures to balance revenue and spending. The national economy leveled off and began growing rapidly at midyear, which eased financial pressures on state governments—but not before the tumult helped produce a rare event, the recall of a state governor.

Party Strengths.
      Democrats made modest gains overall in limited state legislative balloting in 2003; though they lost seats in Mississippi, they made gains in New Jersey and Virginia. Those results left the two major parties at virtually equal strength across the country, with Republicans holding a slight advantage of fewer than 1% of overall legislative seats.

      For 2004, Republicans would continue to control both state legislative chambers in 21 states. Democrats would dominate both bodies in 17 states, up from 16 in 2003. Eleven states were split, with neither party organizing both chambers. Nebraska has a nonpartisan legislature.

      For most of the year, Republicans had a 26–24 advantage in governorships. In October voters in California recalled Democratic Gov. Gray Davis and replaced him with Austrian-born actor Arnold Schwarzenegger (see Biographies (Schwarzenegger, Arnold )), a Republican. The next month Republicans won two of three gubernatorial elections, prevailing in Kentucky and Mississippi but losing in Louisiana. The gubernatorial lineup for 2004 would thus include 28 Republicans and 22 Democrats.

Structure, Powers.
      The chief justice of Alabama, Roy Moore, was removed from office after a judicial ethics panel determined that he had defied a federal court order. In 2001 Moore had installed a 2,400-kg (5,300-lb) granite monument to the Ten Commandments in the lobby of the state judicial building, and he later refused to comply with a federal court order to remove it.

      The Colorado Supreme Court, in a controversial ruling, declared that the state constitution prohibits mid-decade redistricting. Following the 2000 census, the legislature had deadlocked over a new district map, and the task fell to the courts. With Republicans in full control in 2003, the legislature approved a new map designed to secure seven of the state's nine congressional seats, but the state high court ruled that no further redistricting could occur until after the 2010 census.

      Republicans were more successful in Texas. After having taken majority control of the legislature in 2002 elections, Republicans started redrawing U.S. House of Representatives district lines. Democratic House members fled to Ardmore, Okla., for four days and thereby prevented a quorum from assembling during the regular session. During a subsequent special session, Senate Democrats flew to Albuquerque, N.M., and stayed out of state for more than a month, which also prevented a quorum. In a third special session, however, a new map was approved that promised to add at least five new Republicans to a delegation previously controlled 17–15 by Democrats.

Government Relations.
      Relations between states and the federal government, always contentious, were uneven during 2003. After having mandated improvements in public education, homeland security, election procedures, and other local concerns, the U.S. government made only partial reimbursement for costs, further straining deteriorating state budgets. With some state taxes tied to federal levies, administration-backed tax cuts eroded state revenue collections. Congress extended a ban on taxation of some Internet service providers, depriving states of a needed, growing revenue source.

      The administration of U.S. Pres. George W. Bush at midyear proposed converting six existing federal programs—Medicaid, low-income housing, workforce development, child protection, transportation, and Head Start—into block grants administered by the states. Backers suggested that local control would eliminate overhead and provide needed flexibility in the administration of social programs. No action was taken on the proposal during 2003.

      Citing excessive expense, legislators in Colorado, Kansas, Maine, New Mexico, North Dakota, Utah, and Washington canceled their states' presidential primaries, which had been scheduled for 2004. Governors in Arizona and Missouri vetoed similar bills and restored primary-election funding.

      An underperforming national economy continued to limit state revenue growth and increase social-service costs, and many relatively painless budget adjustments were quickly exhausted. At one point 45 states faced budget shortfalls, and the cumulative state deficit nationwide was estimated at a record $70 billion. More than half of that, $38.2 billion, was the responsibility of California.

      States responded with a wide variety of measures. Nearly 30 states raised taxes, 8 of them by more than 5%. Alabama attempted to raise taxes by nearly 10%, but the measure was rejected by voters. Although 20 states were able to avoid significant tax increases, only Hawaii was able to reduce overall taxation levels during the year. Most states increased user fees on everything from health care and motor-vehicle licensing to court costs. Many states showed creativity in finding new revenue sources; Massachusetts, for example, increased fees for skating-rink licenses and for taking the bar exam. Fifteen states increased tuition at public colleges. Eight states raised revenue by expanding state-sponsored gambling, but Maine voters rejected a referendum to allow Indian-owned casinos. Other revenue measures included exhausting rainy-day savings, diverting other appropriated money, and enacting tax-amnesty or stepped-up tax-enforcement programs.

      Some 35 states slashed spending, usually by a reduction in workforce. The cuts even extended to previously sacrosanct areas such as public-school funding and safety-net expenditures. With health costs rising rapidly, many states trimmed Medicaid and children's health insurance, usually eliminating some coverage, reducing benefits, or establishing waiting periods.

      For months the Bush administration opposed federal assistance to hard-pressed state treasuries, urging states instead to reduce spending. In May, however, as part of a tax-cut compromise, Washington agreed to send $20 billion to state governments, roughly half in flexible grants and half in additional Medicaid funding. Those payments coincided with a midyear economic pickup that dramatically improved the outlook for state budgets. By year's end a majority of states were running ahead of budget projections, most states were recovering, and only California among major states was still projecting a significant deficit.

California Recall.
      Prior to 2003, citizens of only a single U.S. state—North Dakota in 1921—had ever recalled their governor by popular election. A “perfect storm” of economic and political troubles engulfed California Governor Davis only months after his November 2002 reelection, however, and it prompted his recall and replacement by a political newcomer.

      Davis, faulted for a relatively colourless personal style, was weakened by his handling of California's electricity crisis in 2001 and his perceived failure to rein in state spending after the “dot-com boom” ended and government revenues plunged. After his reelection Davis boosted state-deficit estimates and then encountered gridlock in budget negotiations—Republicans refused to raise taxes, and Democrats resisted major cuts in spending. By midyear, after the state had tripled an unpopular automobile tax, opinion polls showed Davis's approval ratings hitting record lows.

      Recall advocates needed 897,000 voter signatures to force a recall election. Aided by funding from a wealthy Republican gubernatorial hopeful who later dropped out, anti-Davis forces gathered more than 1.3 million valid signatures. The election was eventually set for October 7 to decide two questions: should Davis be recalled, and, if so, who should replace him?

      During the campaign, Democrats were badly split; some concentrated on retaining Davis, but others backed Lieut. Gov. Cruz Bustamante in case Davis was recalled. On October 7 Davis was ousted by a margin of 55.4% to 44.6%. On the second question, voters chose from among 135 candidates of wildly varying backgrounds. The winner was Schwarzenegger, with a plurality of 48.7%; Bustamante was second with 31.6%. Schwarzenegger was sworn in after the results were certified on November 14.

Laws and Justice.
      With business groups warning of potential job losses, Washington voters overturned a legislature-approved ergonomics law that provided workers with strong protection against repetitive-motion injuries. Maryland joined 13 states providing protection to users of marijuana for medical purposes.

      Budget pressure spurred review of state corrections policies, and a recent prison-construction boom slowed. States executed 65 death-row inmates during the year, 24 of them in Texas. Illinois Gov. George Ryan, two days before leaving office in January, granted blanket clemency to all 167 inmates on the state's death row. Ryan had suspended executions in 2000, saying capital punishment was applied arbitrarily. At year's end, Ryan was indicted on federal corruption charges, which were unrelated to his death-penalty actions.

Health and Welfare.
      States struggled to contain medical costs, particularly for expensive prescription drugs. Some states attempted to negotiate prices directly with pharmaceutical companies on behalf of low-income or elderly users, and the U.S. Supreme Court approved a closely watched Maine plan that drug companies alleged was coercive. Other states formed pools to facilitate bulk purchases of popular medications. Officials in several states moved to reimport American drugs from Canada, where prices were often lower, but the federal Food and Drug Administration rejected the idea. (See Canada: Sidebar (Filling Prescriptions for Americans-Big Business in Canada ).)

      New York and Massachusetts joined California, Connecticut, Delaware, and Maine in banning smoking in virtually all workplaces, including taverns and restaurants.

      A trend toward more competition in K–12 education expanded during 2003. Colorado's legislature approved a school-voucher plan, although a federal judge later struck it down as an unconstitutional interference in the local control of education. Officials in Arkansas, California, and Texas banned the sale of candy, gum, and soft drinks in public elementary and secondary schools. Tuition savings plans that guaranteed future state-university enrollment at current fees were a budget casualty in several states; Kentucky, Ohio, Texas, and West Virginia suspended new enrollments, and Colorado terminated its plan.

      States struggled with mandates of the federal No Child Left Behind Act, which required “high stakes” testing, upgraded teacher-qualification requirements, and prescribed penalties for lagging schools. The Bush administration, however, called the tumult an expected product of significant reform of public education, and decisions on state requests for waivers from or amendments to the act were postponed until after the 2004 election.

Equal Rights.
      At the urging of embattled Governor Davis, the California legislature approved a law allowing illegal aliens to obtain state driver's licenses. The measure was widely viewed as having contributed to Davis's recall, and at year's end legislators repealed it by a near-unanimous vote. In a widely anticipated ruling based on two cases from the University of Michigan, the U.S. Supreme Court permitted affirmative action benefiting minorities in university admissions. The ruling had no effect in California and Washington, where voters had banned race-conscious state policies, but it allowed the resumption of affirmative action in Texas, Louisiana, and Alabama, where lower federal courts had ruled it unconstitutional.

      Supporters of homosexual rights made major gains during the year. The U.S. Supreme Court, in a Texas case, invalidated state sodomy statutes on privacy grounds. Critics charged that the ruling would inevitably lead to judicial sanction of same-sex marriage. Later in the year, in a 4–3 decision, the Massachusetts Supreme Judicial Court ruled that the state constitution forbade denying homosexual couples the right to marry. Similar rulings had been overturned by state constitutional amendments in Hawaii and Alaska and by a “civil unions” law in Vermont that granted only marriagelike rights. Though amendments to the Massachusetts constitution required at least two years for passage, the state high court gave the legislature only six months to comply. Supporters cheered the ruling as providing equality for homosexuals in hospital visits, inheritance rights, and even Social Security entitlements.

      The decision also created uncertainty nationwide on both state and federal levels. Reacting to the Hawaii decision, 37 states had approved laws defining marriage as a union between a man and a woman. Because the U.S. Constitution requires states to give “full faith and credit” to the laws of other states, however, some inferred that a homosexual marriage performed in Massachusetts might have to be recognized elsewhere, casting the validity of those 37 state laws into doubt. At year's end, traditional-family proponents vowed support for a U.S. constitutional amendment that would overturn the Massachusetts ruling, which they predicted would undermine traditional marriage, harm children, and threaten social stability.

David C. Beckwith

▪ 2003

Area: 9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries
Population (2002 est.): 287,602,000; based on 2000 unadjusted census results
Capital: Washington, D.C.
Head of state and government: President George W. Bush

      In the decade following the collapse of the Soviet Union, the reign of the United States as the world's sole superpower was largely positive, with little apparent downside. The U.S. military created a Pax Americana, its might virtually unchallenged, complementing a dependable U.S. economic engine that seemed to pull the global economy through good times and bad. In 2002, however, Americans came to understand that leadership was costly and often involved disquieting risk.

      The year started with the U.S. determinedly addressing fallout from the Sept. 11, 2001, terrorist attacks and apparently emerging from a mild economic recession. By year-end, however, both external and internal problems appeared far more complicated. Confrontation with the al-Qaeda terrorist network produced modest progress, but the overall terrorism conflict actually expanded; the U.S. was preparing for a potential military assault on Iraq and attempting to defuse a nuclear crisis with North Korea. The national economy, plagued by war jitters and corporate accounting irregularities, stalled in midrecovery, with stock prices plunging and unemployment edging upward, which threw the federal budget back into long-term deficit.

      Contributing to the national malaise were a series of crises suffered by major American institutions. Virtually unprecedented revelations of dishonesty in corporate executive suites, accompanied by a wave of major business bankruptcies, shook confidence in the foundations of U.S. economic prosperity. A sexual-abuse scandal rocked the Roman Catholic Church. (See Religion: Sidebar (Roman Catholic Church Scandal ).) In addition, the competency of the CIA and the FBI was questioned during inquiries into intelligence lapses before September 11.

      Nevertheless, Pres. George W. Bush managed to solidify his position with the American people, in large part owing to his purposeful handling of the “war on terrorism.” He announced a new policy favouring preemptive strikes against increased terrorist threats, expanding the national right of self-defense, and his allies steered several measures through Congress that increased U.S. preparedness. The U.S. Senate, however, controlled by Democrats, delayed approval of several administration initiatives, including terrorism-related bills. Bush took the issue into the midterm election in November, and his party regained total control of Congress. (See Sidebar (U.S. 2002 Midterm Elections ).)

War on Terrorism.
      In his January State of the Union address, President Bush effectively broadened the antiterrorist struggle by declaring that nations attempting to produce “weapons of mass destruction” were part of the world terrorist threat. He specifically named Iraq, Iran, and North Korea as “an axis of evil” developing nuclear, chemical, or biological weaponry, and he challenged other governments to confront these states as well. The speech set the tone for a year in which the new terrorist threat dominated foreign relations as well as U.S. domestic politics.

      Dramatic developments in the war on terrorism were rare during 2002. U.S. forces led a successful March coalition military effort in Afghanistan, dubbed Operation Anaconda, that claimed an estimated 500 Taliban and al-Qaeda dead. The top al-Qaeda and Taliban leaders, Osama bin Laden and Mullah Mohammad Omar, remained at large throughout the year, however, and rumours of bin Laden's death were never confirmed. Despite plentiful warnings and alarms, there were no new terrorist attacks on American soil. The perpetrator of anthrax attacks through U.S. postal facilities, which killed five Americans in late 2001, was never identified, nor was any connection with the September 11 events established. Nonetheless, a political consensus developed behind the main elements of the president's drive to increase domestic precautions against terrorist attacks—to beef up military preparedness and to lead the world response to the threat.

      Bush proposed a 14% increase—to $379 billion annually—for defense spending, the largest increase in two decades, and he sought a doubling of expenditures for homeland security, to $37.7 billion. Some proposals became entangled in politics. Numerous U.S. allies, including top officials of the European Union and France, faulted Bush's approach as excessively unilateral and jingoistic. Two key parts of Bush's antiterrorism legislative package—establishment of a new federal Department of Homeland Security and the provision of federal terrorism reinsurance—became stalled in the U.S. Senate owing to objections from labour unions and trial lawyers. They were belatedly approved only after the November election, along with a measure creating a bipartisan commission to study intelligence failures prior to the September 11 attacks. Most administration initiatives, however, including a major bioterrorism defense bill that increased vaccine stockpiles and protected water and food supplies, were swiftly put into place.

      Congress also accepted Bush's expanded definition of the war on terrorism, including his call for a “regime change” in Iraq. In October, only days before national elections, both chambers overwhelmingly approved a resolution authorizing the use of force against Saddam Hussein and Iraq. After an extended delay led by Russia, France, and other countries, the United Nations also agreed to demand Iraqi compliance with inspections to ensure that weapons prohibited in the 1991 peace agreement were not being developed. The inspectors were not scheduled to report their findings until early 2003, but by year's end a U.S.-dominated coalition had more than 100,000 troops deployed or en route to the region.

Domestic Issues.
      Election-year maneuvering had always had an impact on U.S. federal legislation, but the close division in the U.S. House and Senate made 2002 notable for bills that failed to become law. Only 2 of 13 final appropriation bills were cleared by year's end, for example, and partisan gridlock became a major issue in November balloting.

      Both chambers approved separate energy bills during the year, but conference negotiators failed to agree on a compromise; the Republican-controlled House insisted on oil exploration in Alaska's Arctic National Wildlife Refuge, a measure opposed by environmentalists. A major bankruptcy reform measure, approved by both the House and the Senate in 2001, also died over a partisan argument on the treatment of bankrupt abortion protesters. Congress also failed to agree on prescription drug benefits for Medicare recipients, on denying tax benefits to companies incorporating in offshore tax havens, on reforming medical malpractice liability, and on reauthorizing a successful 1996 welfare-reform law.

      Political considerations were apparent in legislation affecting corporate fraud and farm subsidies. Early in the year, amid early indications that Republicans would suffer from the 2001 Enron bankruptcy and other corporate malfeasance, Democrats pressed for punitive measures to address business accounting problems, corporate governance, and securities-law fraud. Public opinion polls showed, however, that neither political party had an advantage on the issue of corporate dishonesty; Congress easily approved a compromise bill tightening securities regulation and establishing an oversight board for the accounting industry. In renewing farm legislation, Republicans initially resisted a proposal to increase agricultural subsidies dramatically. A $248 billion, six-year bill was approved, however, after party strategists noted that most federal payments would go to states that had voted for Bush in 2000.

      Two measures regulating elections also became law, but their impact was in doubt. A campaign finance reform bill was approved that banned unrestricted “soft-money” donations from corporations and labour unions to national political parties and regulated campaign advertising by outside groups. The bill was quickly challenged in federal court, however, as a violation of First Amendment free-speech protections. Critics noted that the law continued to allow soft-money donations to other groups, including state political parties, and reform supporters complained that Federal Election Commission members had begun watering down the reform via regulations. Congress later approved a long-delayed reform law, inspired by year 2000 problems in Florida and elsewhere, setting national standards for voting rules and equipment. The law envisioned $3.9 billion in federal aid to states to meet the standards, but Congress failed to appropriate those funds.

      After Republicans made unexpected gains in November, Democratic House Minority Leader Richard Gephardt of Missouri, a moderate who had sided with the president on national security issues, resigned his leadership post. Gephardt later announced his candidacy for president in 2004. He was replaced by Democratic Rep. Nancy Pelosi of California. The Senate Republican leader, Trent Lott of Mississippi, was forced to resign his post in a bizarre controversy that started at a 100th birthday party in December for Republican Sen. Strom Thurmond of South Carolina. Lott implied to the crowd that the U.S. might have been better off if Thurmond, who had run as an archsegregationist, had been elected president in 1948 instead of Harry S. Truman. Criticism of Lott's remarks started slowly but snowballed, and he resigned as presumptive Senate majority leader two weeks later.

      FBI statistics indicated that the incidence of serious crime in the U.S. began inching up again in 2002 following nine years of decline. The figures showed that while violent crimes dropped during the first six months of the year, crimes against property rose significantly, and the result was an overall 1.3% increase in seven index crimes. The body of former intern Chandra Levy, victim of the most notorious crime of 2001, was found in a Washington, D.C., park in May. She had apparently been strangled, but authorities brought no charges in the case. The U.S. congressman from her Modesto, Calif., district, Gary Condit, who had admitted to a relationship with Levy, was defeated in his reelection bid in the Democratic primary.

      The national capital area was again traumatized during 2002 by apparently random sniper shooting attacks that killed 10 people and wounded 3 in Maryland, Virginia, and Washington, D.C., over a 20-day period. The crime spree ended on October 24 with the arrest of John Allen Muhammad, a former army infantryman, and his teenage companion, John Lee Malvo. The pair, later named suspects in other crimes in Alabama, Louisiana, Arizona, and Georgia, apparently operated out of a 1990 Chevrolet Caprice that had been modified to allow rifle shots from a hiding place in the car's trunk. (See Law, Crime, and Law Enforcement: Crime.)

The Economy.
      For most of the previous decade, while other countries were suffering economic hard times, the U.S. economy had continued to expand, providing a market and needed economic activity that benefited global economic health. In 2002, however, the U.S. economic beacon flickered markedly, the strain aggravated by a declining stock market, fears over war and terrorism, government uncertainty, a historic wave of corporate dishonesty, and a near breakdown in the system of regulation that framed American economic success.

      The economic landscape was littered with casualties. Technically, the U.S. economy continued to expand during 2002, although anemically, but in little more than a year, 6 of the 10 largest corporate bankruptcies in U.S. history were recorded. Widespread accounting irregularities were reported, and Arthur Andersen LLP, one of the “Big Five” accounting firms, went out of business after its criminal conviction on obstruction of justice charges regarding the Enron investigation. (See Economic Affairs: Business Overview: Sidebar (Enron-What Happened? ).) Some 250 companies, a record by far, were forced to restate their earnings. Prominent businessmen were arrested, and some were led off in handcuffs, doing the “perp walk” for news cameras. The nation's stock markets declined for the third consecutive demoralizing year. At year's end, as problems mounted, President Bush replaced his economic team leadership, including the chairman of the Securities and Exchange Commission, Harvey Pitt (see Biographies (Pitt, Harvey )), in search of a fresh start.

      Some analysts blamed the debacle on a hangover from the 10-year expansion, the longest in U.S. history, that ended in March 2001 shortly after the technology-dominated dot-com bubble was deflated. Alan Greenspan, chairman of the Board of Governors of the Federal Reserve System, however, attributed the stock decline to “infectious greed” that corrupted even those who should police it: analysts, credit-rating agencies, and auditors. Others placed the blame on the rise of incentives for managers, especially stock options, which prompted a focus on short-term results rather than long-range strategy.

      As the year began, the national economy appeared to be rebounding smartly from a short-lived recession and adverse consequences of the September 2001 terrorist assault. Both interest rates and inflation remained low, and the economy expanded at a healthy 5% rate in the first quarter. Although business investment contracted, consumer spending, especially for homes and automobiles, remained vigorous, spurred by low interest rates. In April, however, the continuing wave of devastating corporate business news sent equity markets reeling. The Dow Jones Industrial Average dropped from 10,600 to 7,200 over the next six months.

      Because the U.S. economy had proved so resilient in the past, government response was muted. The recession helped produce a federal deficit for fiscal 2002 of $159 billion, the first government red ink in four years. Federal Reserve officials had little room to maneuver: they had lowered interest rates 11 times in 2001, and they dropped the key federal funds rate another one-half point, to 1.25%, as markets deteriorated. After extensive discussion, Congress approved a corporate fraud reform law, known as the Sarbanes-Oxley bill, that provided for accounting standard oversight, banned auditors from supplying other services, and required audit committee board members to be independent company directors. The law also required corporate chief executive and financial officers to attest personally, with their signature, to the accuracy of their financial reports. For his part President Bush replaced his treasury secretary and his top economic adviser.

Foreign Policy.
      U.S. allies overwhelmingly supported the 2001 incursion into Afghanistan, but the Bush administration's stepped-up aggressiveness toward perceived terrorist threats in 2002, targeted initially at Iraq, attracted numerous skeptics. Especially in Europe, critics complained about U.S. arrogance and unilateralism. The new U.S. line was formalized in September in a document, “National Security Strategy of the United States—2002,” that promised U.S. preemptive removal of weapons of mass destruction from those deemed to be a national enemy. “The gravest danger our nation faces lies at the crossroads of radicalism and technology.…In the new world we have entered, the only path to peace and security is the path of action,” the Bush administration declared.

      Only a handful of countries, including Britain and Australia, endorsed the preemption policy openly. Reaction in France and Germany was hostile. German Chancellor Gerhard Schröder, running for reelection, repeatedly promised that his administration would never join any U.S. war effort against Iraq. President Bush early on demanded “regime change” in Iraq, but following domestic and international criticism, he appeared before the United Nations in September to urge multilateral support for merely disarming Iraq in accordance with agreements made following the 1991 Persian Gulf War. After an uncomfortable delay, the UN Security Council unanimously approved a strong resolution demanding that Saddam Hussein admit UN weapons inspectors with intrusive authority. Both France and Russia made it clear, however, that their involvement in any potential military action against Iraq would require specific UN approval.

      Hussein's government eventually agreed to—and did—provide a catalog of facilities, products, and scientists and submit to an inspection regime. At year's end the U.S.-Iraqi face-off intensified as inspectors examined Iraqi sites. Meanwhile, both sides waged a clamorous public relations campaign, with U.S. authorities proclaiming that Iraqis were violating their obligations by resisting enforcement of U.S.-led no-fly zones and Iraqis insisting that inspections had found nothing incriminating.

      A decade-old border conflict between India and Pakistan, two nuclear powers, threatened to escalate into open combat at midyear. At one point the two populous countries had one million troops massed on their common border. Top Bush administration officials, including Secretary of Defense Donald Rumsfeld (see Biographies (Rumsfeld, Donald )), led an international mediation effort that defused the immediate crisis.

      The Bush administration's tilt toward Israel in its half-century conflict with Palestinian interests—another issue dividing the U.S. from much of Europe—became more pronounced during the year. After a particularly bloody series of terrorist bombings that killed more than 30 Israelis in three days, the government of Ariel Sharon mounted a determined incursion into Palestinian territory. President Bush urged moderation on Israel but pointedly continued to refuse to meet with Palestinian leader Yasir Arafat or to intervene decisively to stop the Israeli action.

      U.S. relations with Russia under Pres. Vladimir Putin continued to improve. The two countries finally signed a delayed nuclear arms treaty reducing warheads on both sides. Nevertheless, U.S. exhortations failed to dissuade Russia from assisting Iran in weapons-capable nuclear-power projects.

      In early fall, even as the U.S. was focusing diplomatic and military efforts on Iraq, the third “axis of evil” country lurched again into world headlines. Confronted with evidence that its scientists had been working on a uranium-enrichment program in apparent violation of a 1994 promise, North Korean officials freely admitted the violation and implied that they were working on nuclear weapons as well. Under the 1994 pact, negotiated in part by former U.S. president Jimmy Carter, North Korea had agreed to accept two light-water reactors and 500,000 tons of heavy fuel oil annually in exchange for a freeze on weapons-capable nuclear power. North Korean officials followed the admission with further breaches, expelling International Atomic Energy Agency inspectors, removing surveillance cameras and seals from key sites, and restarting a nuclear plant using plutonium-generating spent fuel rods.

      Some analysts suggested that North Korean strongman Kim Jong Il was using a renewed nuclear threat to extort additional concessions from the West. North Korea, a land of scant resources, had in recent years devoted most of them to military purposes and depended on outside assistance to stave off famine, power shortages, and hardship for its 22 million citizens. Other analysts suggested that Kim, sensing that North Korea would be the next target of President Bush's campaign against the axis of evil, was arming himself with a nuclear deterrent. In any event, the Bush administration refused to negotiate with the North Koreans, and Rumsfeld pointedly warned that the Pentagon was prepared to fight a second war if Kim felt “emboldened” because of the world's preoccupation with Iraq.

      At year-end the threat of immediate conflict was receding. North Korea had 500 Scud missiles, plus additional Nodong and Taepodong-2 ballistic missiles capable of reaching Japan, Alaska, and eastern Russia. Since signing the 1994 agreement, according to Western intelligence reports, North Korea had gained the capability of producing both chemical and biological weapons. In December former president Carter was awarded the Nobel Prize for Peace, in part for his work on the North Korea situation. (See Nobel Prizes .)

David C. Beckwith

Developments in the States
      A decade-long revenue boom for state governments came to an abrupt halt in 2002 after events conspired to produce the most drastic state fiscal crisis in a half century. Having freely expanded spending programs and cut taxes in sunny economic times, officials were forced to reverse course sharply during the year, raising revenues and cutting services across the board, even in essential programs.

      The hard economic times were exacerbated by continuing state struggles with the federal government, usually over which level of government should fund expensive initiatives such as health coverage for low-income persons, election reform, education mandates, homeland security, and prescription drug costs. Although public education traditionally had been the purview of states, the year saw enactment of a significant new federal law addressing K–12 education, and federal courts approved state tax support for private schools. Those courts also banned state execution of the mentally impaired.

      Forty-four states held regular legislative sessions during the year, and more than two dozen held special sessions, often to deal with budget problems.

Party Strengths.
      Republicans made notable gains in state legislative elections and edged ahead of Democrats in total state legislative seats for the first time in five decades. Democrats, however, continued to erode a recent GOP advantage in governorships, particularly in larger states. The net result was that the two major parties were at virtual parity nationwide at year's end.

      When newly elected state legislators assumed office in January 2003, Republicans would hold both legislative chambers in 21 states, up from 17 before the election. Democrats would have control in 16 states, down from 18 in 2002. Twelve states were split, with neither party organizing both chambers. Nebraska had a unicameral, nonpartisan legislature.

      The incumbent party was turned out in half of the 36 gubernatorial elections nationwide, and Democrats made modest gains overall. Republicans had a 27–21 advantage (with two independents) prior to November balloting. In 2003 the party lineup would be 26 Republicans and 24 Democrats. (See Sidebar (U.S. 2002 Midterm Elections ).)

Government Structures, Powers.
      Efforts to limit the service of state officials, a popular cause in the 1990s, suffered setbacks during the year. Idaho voters endorsed the legislature's repeal of the state's term-limit law, making Idaho the first state to rescind one. Oregon voters declined to overturn a late 2001 court decision invalidating that state's term limits.

      Rhode Island and North Dakota reduced the size of their legislatures. In Rhode Island the House saw a reduction of 25% (from 100 to 75), while the Senate was reduced from 50 members to 38. The reduction in North Dakota was smaller, the number in the House moving from 98 to 94 and that in the Senate from 49 to 47.

Government Relations.
      Controversy over the appropriate balance of responsibilities between states and the federal government, always fluid in the U.S. federalist system, escalated during 2002. States continued to protest unfunded mandates from Washington and complained that promises of added federal funding had been broken. State officials also campaigned specifically for additional U.S. funds to combat the state fiscal crisis. They noted that, in the absence of federal help, state budget-cutting efforts—raising taxes and cutting spending—would actually aggravate problems caused by the lagging national economy. A measure to provide temporary assistance to states was approved by the U.S. Senate but died owing to opposition from the administration of Pres. George W. Bush.

      States continued to complain about federal foot-dragging in homeland security reimbursement. President Bush proposed spending $3.5 billion to train local first responders, but Congress failed to appropriate the funds. Though Congress approved a law to clean up election procedures nationwide, it sent states no money for new election machinery or for training poll workers to comply with the law.

      Nevada Gov. Kenny C. Guinn became the first state chief executive to veto a U.S. presidential decision when he rejected an executive order to establish a nuclear-waste repository at Yucca Mountain, near Las Vegas, Nev. After Congress reversed the state action and reinstated the executive order, the repository battle moved to federal courts. In another federalism struggle, a federal judge enjoined efforts by the U.S. Department of Justice to overturn Oregon's unique assisted-suicide law, which had been approved by state voters twice in the 1990s.

Finances.
      Long-term trends and cyclic events combined to thrust states into their most dramatic budget crisis since World War II. Revenues plunged as the national economy remained sluggish, and structural problems with state tax sources belatedly surfaced. Even as officials rushed to trim outlays, expenditures continued to rise owing to the escalating costs of medical, security, education, and other programs. The situation was aggravated by actions taken during the 1990s, when an economic bonanza allowed states to reduce tax rates and increase spending extensively.

      Several tax sources continued to deteriorate during the year. States, having found it difficult to tax services—which played a growing role in the modern U.S. economy—experienced a lag in sales-tax revenue. In addition, corporate income taxes dropped owing to the increasingly sophisticated measures used by corporations to move profits to low-tax jurisdictions. Income taxes garnered from capital gains and the exercise of stock options dried up as capital markets edged lower. A survey by the National Governors Association in late 2002 declared that “nearly every state is in fiscal crisis,” with a cumulative budget shortfall for the year of more than $40 billion.

      With 49 states required to balance their budgets, officials moved to stanch the red ink mainly by cutting costs—freezing employee salaries, laying off employees, and cutting Medicaid. Twelve states increased higher-education tuition. Though some 30 states had created a “rainy-day fund” to weather economic hard times, those savings were used to cushion the immediate impact of the downturn. States also tapped funds from the 1998 settlement with tobacco companies to shore up revenues.

      Most states resisted significant politically unpopular tax increases during the election year, although 19 states boosted cigarette levies. At year's end, however, budget shortfalls in many states continued to grow, and fiscal experts predicted that tax-increase legislation would be inevitable in many jurisdictions. In late December California Gov. Gray Davis announced that the state's two-year deficit projection had risen to just under $35 billion. State workers vowed to resist pay cuts and job losses, while conservative legislators declared that the shortfall was the result of profligate spending and promised to block tax increases.

Health and Welfare.
      As state finances deteriorated, officials increasingly looked to Medicaid for savings. Outlays for the program, targeted at low-income individuals, rose more than 13% owing to rising medical costs and additional enrollees, even as increasing numbers of middle-class Americans lost their health coverage. Many states responded by reducing medical reimbursements and tightening eligibility, measures that further roiled an embattled health care system.

      Proposals for major reform were debated in several states, but progress was slow as states awaited relief from the federal government. Oregon voters turned down a referendum that would have established the nation's first universal health care program.

      A historic federal education law, titled No Child Left Behind, dramatically increased accountability requirements for states and their local school districts. The U.S. Education Department did not issue implementing regulations until late in the year, however, and some states complained about inadequate direction and funding from Washington. At year's end several states were seeking temporary waivers from federal requirements, though supporters viewed the law as a major step forward in improving public education. (See Education .)

      In a landmark 5–4 decision, the U.S. Supreme Court declared that it was constitutional to utilize public funds to assist elementary and secondary students in private and even parochial schools. The ruling upheld a pilot “voucher” program in Cleveland, Ohio, and appeared to settle a key issue in providing additional choice in education. No new states joined Florida, Ohio, and Wisconsin in allowing private school assistance during the year, but the high-court ruling ensured that the idea would be widely considered in 2003.

      Massachusetts voters joined California and Arizona in banning bilingual education, but Colorado rejected a similar measure. In Florida voters approved a measure limiting the number of pupils in a classroom. California endorsed funding for a new after-school enrichment program.

Law and Justice.
      Responding to perceived abuses, West Virginia, Pennsylvania, Mississippi, and Nevada approved new measures to reform their civil liability systems. Critics claimed that sizable jury awards in lawsuits brought by plaintiffs' trial lawyers were creating “jackpot justice” that distorted the economy, caused bankruptcies, and drove some lawsuit targets, including physicians, out of business. The new laws brought to 17 the number of states that had capped punitive or noneconomic damage awards and raised standards of proof in order to stabilize lawsuit risks.

      Voters in Arizona, Ohio, and Nevada rejected marijuana-liberalization proposals. The Nevada measure would have allowed possession of three ounces of the substance for personal use. A federal judge endorsed a settlement of the antitrust suit between the Bush administration's Department of Justice and Microsoft Corp., but intervening attorneys general in Massachusetts and West Virginia vowed to appeal, saying that the deal did not adequately address the software giant's alleged monopolistic practices.

      In a controversial ruling, the U.S. Supreme Court told 20 states that they could no longer execute mentally retarded convicts. The court cited changing public standards, including action by several state legislatures to eliminate the death penalty for those with low IQs.

      The high court decision did not quiet controversy over capital punishment. Maryland joined Illinois in imposing a moratorium on all executions pending a review of procedures. (See Law, Crime, and Law Enforcement: Special Report (Death Penalty on Trial ).)

Energy and Environment.
      The bankruptcy of energy giant Enron Corp., a politically active backer of deregulatory policies, helped stall the spread of electricity deregulation in state legislatures. No new states were added in 2002 to the 26 that had initiated a free market for electricity in previous years. (See Economic Affairs: Sidebar (Enron-What Happened? ).) Oregon voters rejected a proposal to require labeling of genetically modified foods.

David C. Beckwith

▪ 2002

Area: 9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries
Population (2001 est.): 286,067,000; significant revision based on the 2000 census
Capital: Washington, D.C.
Head of state and government: Presidents Bill Clinton and, from January 20, George W. Bush

      Resilience had been a fundamental element of the American character since colonial times, but in 2001 the United States' ability to recover from adversity was severely tested. Its national economy, weary from years as the engine of world growth, finally slipped into recession. An energy crisis threatened further disruption, producing major bankruptcies. Terrorist attacks on September 11, coupled with a subsequent public health scare, sent shock waves across the nation; the blow to American morale further slowed economic activity, and the U.S. was soon plunged into a distant Asian war against an implacable fundamentalist regime.

      Within weeks, however, the country had righted its listing self-confidence. Security measures gradually began restoring trust in public institutions. A series of government economic measures, including 11 interest-rate reductions and substantial emergency spending, established a foundation under the rocky economy. The Taliban regime in Afghanistan was rapidly uprooted and dispersed by a devastating show of American military technology. By year's end the United States was on the road to recovery, its position as the world's economic, cultural, and military leader not only restored but burnished in a year of challenges.

Security Crisis.
      Authorities responded immediately to the September 11 events, bolstering safety measures at public buildings, upgrading screening at airports, freezing assets of groups with suspected terrorist ties, and detaining more than 1,000 noncitizens for questioning. The measures included such extraordinary steps as the granting of authority to air force generals to shoot down hijacked civilian airliners and a provision for wartime military tribunals to try suspected alien terrorists. Some measures prompted criticism from civil liberties groups, but public opinion polls showed that the measures were widely supported.

      In a September 20 address to Congress, Pres. George W. Bush announced the creation of an Office of Homeland Security under former Pennsylvania governor Tom Ridge to coordinate the antiterrorism efforts of 40 federal agencies. (See Biographies (Ridge, Tom ).) Within days the new agency confronted a new threat when several employees of a tabloid newspaper publisher in Florida contracted anthrax, an infectious disease ordinarily confined to farm animals, via suspicious mail. Additional anthrax spores were soon discovered in a variety of places, including the offices of Senate Majority Leader Tom Daschle (Daschle, Tom ) (see Biographies), post offices, and various news organizations. Most of the spores were traced to mail originating near Trenton, N.J., but a connection to the September 11 terrorism was never established. By year's end two forms of anthrax had killed 5 persons and sickened 14, prompting authorities to extend precautionary drug treatment to 32,000 persons and to update inadequate public health emergency-preparedness laws.

      Congress approved a variety of measures to counter economic and security concerns following the terrorist attacks; $15 billion was appropriated to assist U.S. airline firms, including $5 billion in grants; lawmakers appropriated an immediate $40 billion in additional spending for a variety of causes, including stepped-up military activity and assistance to affected areas, such as New York City; and President Bush received authority to expend half of the funds at his discretion. Congress also authorized the use of force to respond to the attacks, provided for federal takeover of some 28,000 airport security workers, and approved an antiterrorism law that allowed expanded law-enforcement powers over money laundering, electronic and telephone eavesdropping, and detention of suspected terrorists.

      By year's end the death toll from the attacks had been revised sharply downward. At one point unofficial estimates had projected up to 10,000 deaths in New York and 500 or more at the Pentagon near Washington. Authorities in December, while cautioning that the precise number of deaths might never be known, put the toll at nearly 2,900 in New York City, with an additional 189 at the Pentagon and 44 in Pennsylvania, where another hijacked plane crashed after passengers attempted to overpower the terrorists.

Domestic Issues.
      The September 11 events proved to be a critical turning point for President Bush and his administration. Bush was inaugurated in January after losing the popular vote, and he carried the weakest mandate of any recent U.S. president. (See Sidebar (Election Reform Debate in the U.S. ).) Congress was nominally in Republican hands but was almost evenly divided. Bush surprised many observers by pushing an aggressively conservative agenda, including a 10-year, $1.6 trillion tax cut, expanded energy exploration, a faith-based social assistance initiative, and withdrawal from several international treaties.

      Following compromise with congressional Democrats, Bush signed an 11-year, $1.35 trillion tax-reduction bill on June 7 that provided instant $300–$600 rebates to most taxpayers, reduced the four major marginal rates, repealed the estate tax, increased the child-care credit, and provided relief for married couples and incentives for savings.

      In late May veteran Republican lawmaker Sen. James M. Jeffords of Vermont announced that he would leave the GOP and become an Independent caucusing with Senate Democrats. That turned the Senate, previously divided 50–50 but under Republican organization, over to a 50–49–1 configuration under Democratic control. Jeffords cited disappointment with conservative GOP policies, including inadequate spending for education, and allies noted that the White House had slighted him by failing to invite him to a ceremony honouring a Vermont schoolteacher. With Congress now officially divided along partisan lines, Bush's agenda bogged down over the summer, and the president, while still enjoying general popular support, was widely viewed as tentative and ineffective in his public appearances.

      Within days of September 11, however, Bush had shed that image. He delivered a thoughtful eulogy to victims at the National Cathedral service in Washington, D.C., and won praise for his presence in an early visit to the World Trade Center site. Bush's September 20 speech to a special joint session of Congress received widespread acclaim for its eloquence and delivery and helped launch Bush's personal approval ratings in public opinion polls to record levels through the remainder of the year.

      During the fall, measures responding to the terrorist assault were approved by Congress with only modest opposition, particularly legislation covering military preparedness and disaster relief. In the realm of ongoing domestic policy, however, entrenched partisan arguments stopped passage of numerous bills, including several that had been debated for years. Among legislation failing to pass Congress during 2001 were the president's energy security bill (which included oil exploration in an Alaskan wilderness area), campaign finance reform, fast-track trade-negotiation authority, Bush's faith-based social initiative, an agriculture subsidy bill, a federal patients' bill of rights, and a fiscal stimulus bill that administration partisans said was vital to the national economic recovery.

      At year's end Congress did approve a compromise education-reform act cosponsored by Democratic Sen. Edward Kennedy of Massachusetts. The bill required for the first time annual reading and mathematics testing for students in grades three through eight nationwide. It also required school districts to close the gap between poor and middle-class achievement and mandated that consistently underperforming schools allot part of their federal financial assistance to tutoring or providing transportation to other schools. (See Special Report. (Does Testing Deserve a Passing Grade? ))

      Debate over the wisdom and ethics of advanced scientific research grew in intensity during the year. The U.S. House of Representatives approved a bill banning cloning of humans from embryos and prohibiting creation of cloned embryos for research, but the Senate delayed the measure. Under pressure to take a position, President Bush announced in August that he would allow federal funding only for research on the approximately 60 colonies of embryo cells that had already been created, saying he did not believe taxpayer dollars should support further destruction of human embryos. A National Academy of Sciences panel quickly published a report detailing problems with the Bush position, and little was settled on the subject.

      Recent FBI figures revealed that the incidence of serious crime had remained virtually unchanged following eight years of significant decline. The figures showed a modest 0.3% reduction in seven index crimes during the first half of 2001. On June 11 Timothy McVeigh (see Obituaries (McVeigh, Timothy James ))—the main perpetrator of the 1995 Oklahoma City, Okla., bombing of the Alfred P. Murrah Federal Building that killed 168 persons—was executed at a U.S. prison in Terre Haute, Ind. It was the first federal execution since 1963. A second federal prisoner, Juan Raul Garza, convicted of three 1993 drug-related murders, was put to death eight days later in the same prison.

      Republican businessman Michael Bloomberg (see Biographies (Bloomberg, Michael )) prevailed in the highest-profile election of 2001, the race to succeed Rudolph Giuliani as mayor of New York City. Bloomberg spent a record $69 million of personal funds on the campaign. The year's most bizarre political story involved the disappearance from Washington, D.C., of a 24-year-old government intern, Chandra Levy, shortly before she was to return home to Modesto, Calif. Her parents hired lawyers and investigators and turned a glaring media spotlight on their hometown congressman, Democratic Rep. Gary Condit, who eventually admitted to a “close relationship” with the missing woman. Levy remained missing at year's end, and Condit announced that he would launch an uphill bid for reelection.

The Economy.
      The national economic expansion ended with a whimper during 2001. A panel of the National Bureau of Economic Research (NBER) declared in November that the nation's economic growth had ended the previous March, exactly 10 years after it had started, which made it the longest-running expansion since the organization began keeping records in 1854. Government figures showed that gross domestic product had increased by a modest 1.2% in the first quarter and an anemic 0.3% in the second, followed by a 1.3% contraction in the third quarter. Though recessions had traditionally been declared after two consecutive quarters of negative growth, NBER economists, noting continued economic deterioration, cited other factors in their assessment.

      The trauma of September 11 effectively kicked the national economy while it was down. The events further shook consumer confidence, which had been declining, and markedly reduced personal and business travel, entertainment expenditures, and other economic activity. The national jobless rate, which had bottomed at 3.9% in 2000, had started to climb early in the year; it jumped from 4.9% to 5.4% in October, the biggest one-month jump in two decades. By December unemployment had soared to 5.8%, the highest level in six years. Another victim of the terrorist-exacerbated recession was the short-lived federal budget surplus: after a record $237 billion in black ink during fiscal 2000, the U.S. ended fiscal 2001 on September 30 with a fast-diminishing $127 billion surplus, with many fiscal 2002 projections anticipating a return to deficit spending.

      Even so, the recession's impact was cushioned by several events. Fearing an overexuberant stock market and inflation, the nation's Federal Reserve System had nudged up interest rates six times in 1999–2000. In 2001, however, the Fed sharply reversed field and lowered its key federal funds rate on 11 occasions, from 6.5% to 1.75%, in a desperate attempt to revive the failing national economy. The actions provided a ripple effect that lowered borrowing costs across the board for credit cards, mortgages, and businesses. Additionally, as the recession reduced energy demand, oil prices began dropping worldwide, providing further relief to consumers. The nation's major automobile manufacturers began offering no-interest loans in a successful effort to maintain high demand, and new auto sales continued through the last months of 2001 at record levels. The federal government further contributed with cash tax rebates and at least $60 billion in emergency spending following the terrorist attacks.

      By year's end some economists were predicting imminent resumption of national economic expansion. Two major measurements of consumer confidence were rising sharply in December. The Dow Jones Industrial Average, which had dipped as low as 8,235 in the wake of September 11, finished the year over 10,000 and rising. The national inflation rate dropped back to a modest 2.6%, and productivity gains remained strong, which led several economists to predict an end to the recession as the country put memories of the attacks behind it.

      The recession helped avoid a widely predicted energy disaster in California and neighbouring states. As the year began, California was suffering under a mishandled deregulation of electricity that led to severe power shortages and the bankruptcy of a major state public utility. Rolling blackouts plagued the state during January, and many analysts predicted further outages and economic disruption during the summer, when air-conditioner use would be high. A combination of state government assistance to the utilities, a cool summer, upgrading of electrical distribution line efficiency, reduced usage due to recession and conservation, and the worldwide energy surplus largely prevented serious incidents.

      During the height of the crisis, California Gov. Gray Davis denounced out-of-state energy companies for taking advantage of the state and its consumers, and he specifically named the Houston, Texas-based Enron Corp. Late in 2001 Enron—the seventh largest American corporation, with over $100 billion in revenue in 2000—filed for Chapter 11 bankruptcy protection. The company, listing $49.5 billion in assets, became the largest company in U.S. history to go under. The failure was only tangentially related to its long-running exploitation of deregulated markets for wholesale natural gas and electricity. Analysts discovered that key company officials, while operating largely unregulated marketplaces trading derivative energy contracts, were simultaneously running private off-book partnerships and profiting personally, even as they overstated Enron profits. The company's failure was particularly hard on employees, many of whom had retirement funds tied up in near-worthless company stock.

      The world's leading software company, Microsoft Corp., avoided a court-ordered breakup by settling its antitrust case with the Bush administration Justice Department. The company had been found guilty of monopolistic practices in a case brought by the Bill Clinton administration and ordered divided into at least two parts. An appeals court panel in June confirmed that Microsoft had monopoly power but disqualified the original trial judge for injudicious comments outside the courtroom. After Microsoft allowed computer makers to disable some parts of its Windows operating system and replace them with software from other firms, the replacement judge approved a settlement allowing the company to stay intact.

Foreign Policy.
      Within hours of the September 11 attacks, the Bush administration began preparations for a military assault on the al-Qaeda network in Afghanistan and started assembling international support for the mission. The U.S. received immediate and strong support from British Prime Minister Tony Blair, who helped the U.S. rally world opinion. The partners took pains to make clear that the targets were international terrorists and their protectors, not Islam. In the end some 60 countries offered tangible assistance, including Muslim Pakistan as well as Russia, which provided access to military bases in nearby Tajikistan. Within a month of the start of hostilities, the U.S. had doubled its military presence in the region to 50,000 troops.

      Demands that the Afghan Taliban regime locate and turn over Osama bin Laden (see Biographies (bin Laden, Osama )) to international forces were met by evasion, then refusal. U.S.-dominated military action started with cruise missile, bomber, and fighter jet attacks throughout Afghanistan on October 7, followed by continued military operations in support of the Northern Alliance Afghan resistance fighters. At the outset U.S. preparations met a storm of criticism and doubt; critics suggested that Americans would repeat Russian mistakes in Afghanistan or become bogged down in a Vietnam-style Asian conflict. Instead, the operation was largely completed in 11 weeks as the Taliban was driven from power and replaced by a UN-brokered coalition; Bin Laden's fighting forces, which included Arabs, Pakistanis, and Chechens, were killed or dispersed.

      In his September 20 congressional speech, President Bush declared, “From this day forward, any nation that continues to harbor or support terrorism will be regarded by the U.S. as a hostile regime.” By year's end neither Bin Laden nor Taliban leader Mullah Mohammad Omar had been located. While continuing to search for Taliban and al-Qaeda leadership in the area, the U.S. turned its attention toward other countries facilitating terrorist activity. The ongoing confrontation with rogue organizations and states, especially those believed to be developing chemical, biological, or nuclear weapons, continued to dominate world affairs.

      Cooperation on Afghanistan was a highlight of improved U.S. relations with Russia. In mid-November, Russian Pres. Vladimir Putin visited Washington and Bush's ranch in Crawford, Texas. Talks appeared promising when Putin said he would consider allowing the U.S. to test a missile defense system even though the test would be an apparent violation of the 1972 Anti-Ballistic Missile (ABM) Treaty, provided the two countries could agree on nuclear weapons reductions. Bush announced that the U.S. would slash nuclear warheads from 7,000 to the 1,700–2,200 range over the next decade, and Putin hinted at similar reductions in the Russian 5,800-warhead arsenal. The two were never able to hammer out an agreement on the antimissile test, however, and in December Bush announced that the U.S. would withdraw from the treaty and thereby leave the way open for missile defense testing.

      The long-running U.S. effort to broker a lasting peace in the Middle East appeared to collapse during the year. Talks between Israeli and Palestinian leaders sponsored by former president Clinton had fallen apart in late 2000, producing violence that escalated during 2001. After a period of inaction, the Bush administration attempted to revive talks but without success, and after September 11 Israeli advocates successfully likened Palestinian bombing and assaults to the terrorist attacks in the U.S. President Bush pointedly declined to condemn Israeli military responses against the Palestinian population and refused to meet with Palestinian leader Yasir Arafat.

      Free-trade advocates scored a major advance at an international meeting in Doha, Qatar, when major countries agreed to begin a new three-year round of trade negotiations. The talks would be aimed at reducing agricultural trade barriers and industrial tariffs. The U.S. made concessions, putting its antidumping law under review in spite of opposition from American steel interests and agreeing that less-developed nations could override drug patents in the interests of public health. Most analysts declared that no new trade agreement could be negotiated, however, unless the U.S. Senate voted fast-track negotiating authority to President Bush. The U.S. normalized trade relations with China during the year after having cleared the way for China's membership in the World Trade Organization.

      During 2001 the focus of war concerns shifted to Asia, including Afghanistan. U.S. military efforts aggravated the decades-long conflict between India and Pakistan, and the two countries, both possessing nuclear weapons, were at the brink of war at year's end. Ironically, in an effort to encourage cooperation in the Afghan operation, the U.S. had lifted sanctions imposed on both countries following their 1998 nuclear tests. Problems with North Korea, one of the world's last communist regimes, continued to fester and led to periodic threats of war against the U.S. and its allies, including Japan.

      The Bush administration's efforts to build a coalition to support military measures in Afghanistan reversed what critics had labeled U.S. rejection of international solutions to world problems, including its plans to abandon the ABM Treaty and its refusal to ratify the Comprehensive Test Ban Treaty. Earlier in the year the Bush administration had officially rejected the Kyoto Protocol, suggesting that the anti-global-warming treaty would affect the global economy disproportionately. In late summer the U.S. sent a delegation to the UN World Conference Against Racism in Durban, S.Af., but walked out in protest against proposed conference resolutions calling for reparations to blacks for slavery and for condemnation of Israel for alleged racism against Palestinians.

David C. Beckwith

▪ 2001

Area: 9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries
Population (2000 est.): 275,372,000
Capital: Washington, D.C.
Head of state and government:
President Bill Clinton

      The United States stormed into 2000 full of energy and confidence, its economy purring, its world leadership role unchallenged, and its two-century-old democratic experiment still vigorous. Incidence of crime, welfare dependency, and joblessness were down, and the stock market was soaring.

      In February economic expansion surged through its 108th straight month, surpassing the nation's consecutive growth record set in the 1960s. A month later national capital markets hit all-time highs. A spirited battle was under way as both major political parties eagerly vied to supply the successor to Pres. Bill Clinton, whose legacy of economic prosperity and centrist-policy successes had been diminished only by personal scandal. Optimism was soaring, and the U.S. was the envy of the world in the realms of democracy, economic strength, cultural offerings, and military might.

      By year's end, however, the national mood had markedly changed. The new tone was one of bewilderment, even creeping pessimism. The national election, far from confirming a clear new path, had ended in a puzzling stalemate capped by an unprecedented and dispiriting legal challenge. The stock market was slumping badly; consumer confidence was shaken; and economic statistics had suddenly turned ominous. The effectiveness of American world leadership was under challenge. Americans seemed badly divided, even rudderless, and commentators had difficulty pinning down a precise cause.

Politics and the Election.
      Ever since the end of the Cold War a decade earlier, the U.S. had struggled to find a sense of national direction. Secure in the dominance of its economy and national security apparatus, the country internally split into two relatively equal political camps. The Democratic Party favoured the government's moving more actively to assist those citizens left behind in the general prosperity, whereas the Republican Party (GOP) believed that government should step back and allow the ingenuity of the American people to produce without interference. The 2000 election, if anything, muddled the debate further—the most equivocal result in U.S. history, a near 50–50 split on virtually every level of government, with no clear call for any political party or ideology.

      If any trend emerged from the national balloting, it was that the incumbent party lost. The Republican ticket of George W. Bush (Bush, George W. ) and Richard B. Cheney (Cheney, Richard B. ) narrowly defeated Democratic challengers Albert A. Gore, Jr. (Gore, Albert A., Jr. ) and Joseph I. Lieberman (Lieberman, Joseph I. ). (See Biographies.) Democrats, however, narrowed their deficit in the U.S. House of Representatives for the third election in a row, leaving the Republicans with less than a 10-seat advantage. Democrats also erased the Republican lead among U.S. senators, creating a 50–50 tie in the upper chamber. It was much the same on the state level—Republicans made gains in state legislatures (where Democrats had enjoyed a slight advantage), creating a virtual tie in party control nationwide, and the GOP lost part of its sizable lead in governorships. (See Special Report (U.S. Election of 2000 ).)

      For the first time, a presidential spouse entered elective politics. Hillary Rodham Clinton, though outspent by her Republican opponent, Rick Lazio, won the open U.S. Senate seat in her adopted state of New York. Only six weeks prior to the election, independent counsel Robert Ray had concluded a six-year investigation of the Clintons, pointing out untruthful testimony by Hillary Clinton but concluding that there was insufficient evidence to prove indictable criminal wrongdoing. In another unusual congressional contest, a plane crash claimed the life of Missouri Gov. Mel Carnahan, Democratic challenger for the Senate seat held by John Ashcroft, only days before the election. Nonetheless, Carnahan won a narrow victory after the new governor promised to appoint Carnahan's widow, Jean, to the Senate seat.

      Awaiting a signal from voters, Congress approved almost no major legislation during the year. With both parties contesting for support from the technology-driven “new economy,” two bills sought by Silicon Valley were easily approved. They expanded H-1B visas for highly skilled foreign workers and settled the legality of electronic signatures for commercial transactions.

      Other legislative accomplishments in a divided government were scarce. Neither Congress nor President Clinton made any serious attempt to reform Social Security or Medicare. For the third year, legislators could not establish a “patient's bill of rights” in dealing with health maintenance organizations, provide prescription-drug coverage for seniors, or enact more than nominal campaign-finance-reform legislation. Late in December, however, Clinton unveiled sweeping new rules to guard the privacy of patients' medical records; doctors and hospitals would be required to secure a patient's consent before disclosing health information to a third party. Congress also was unable to undo a June U.S. Supreme Court decision that voided state attempts to outlaw “partial-birth” abortions.

      Amid charges of election-year posturing, the Republican Congress approved bills eliminating the national estate and gift tax and ending the penalty imposed on two-income married families. Both were vetoed by President Clinton, who claimed the measures disproportionately favoured the wealthy. Clinton also vetoed a measure establishing a long-sought repository for nuclear waste at Yucca Mountain, Nev., 160 km (100 mi) northwest of Las Vegas. None of the vetoes was overridden.

The Economy.
      The country's historic economic expansion finally ran out of steam late in the year. The slowdown arrived abruptly, without overt warning, and economists later blamed a combination of causes, including higher oil prices, violence in the Middle East, the uncertain election, delayed effects of multiple interest-rate increases, a stock market decline, the bursting of the Internet bubble, and the simple age of the up cycle. By year's end, with economic activity slowing and consumer confidence dropping, most analysts were predicting a period of reduced growth or even an economic recession.

      Fueled by world leadership in telecommunications and high tech, the U.S. economic engine actually accelerated early in the year. Effects of a widely feared year 2000 computer problem proved minimal, thanks to expensive remedial preparations. After gross domestic product (GDP) posted a robust 4.2% gain in 1999, the economy expanded by an extraordinary 5.6% in the first half of 2000. This led economists to worry anew over the potential revival of inflation, which crept steadily upward after several years in the nominal 2% range.

      Under Chairman of the Board of Governors of the U.S. Federal Reserve System (Fed) Alan Greenspan (see Biographies (Greenspan, Alan )), the inflation-fighting Fed had enacted three small interest-rate increases in 1999 and followed that with three more in early 2000, including a full 0.5% boost in May. That left interest rates 1.75% higher than in 1999, driving up costs for corporations and individuals alike. Even as those increases flowed through the system, the economy was further shocked by rapid increases in oil prices worldwide, the result of a 1999 cutback in production by OPEC cartel countries. At one point oil prices topped $35 per barrel, three times the price level in December 1998. Increased energy costs affected everyone, particularly in the Midwest, where supply and refinery problems sent gasoline prices spiraling above $2 per gallon.

      The twin blows from interest and energy increases produced a marked effect on financial markets. The National Association of Securities Dealers automated quotations (Nasdaq) stock market index, heavy with high-flying technology companies and overbought dot-coms, topped out above 5000 in March and then began an erratic and prolonged descent. By year's end the average had been halved—the worst performance in Nasdaq's nearly 30-year history. (See Economic Affairs .) The year's biggest economic story was the long-anticipated shakeout in dot-coms, companies attempting to capitalize commercially on surging use of the Internet. Dozens of once-high-flying firms exhausted their start-up funds without showing a profit during the year, and their bankruptcies or mergers contributed to the darkening mood by year's end.

      By the third quarter, GDP growth was down to 2.2% and slowing. Joblessness stayed near the 30-year low rate of 3.9% established during the year, but many companies were announcing layoffs and cutbacks at year's end. Inflation rose a modest 3.5%, but it too was trending upward.

Domestic Issues.
      Serious crime, which had declined in the U.S. for eight consecutive years, leveled out during 2000. Incidence of eight major personal and property offenses reported to local law-enforcement authorities dropped 0.3% during the first half of the year, compared with a 9.5% decrease in 1999. Analysts noted that demographic trends spurring the decrease over the previous decade—including a reduction in the crime-prone 15–25-year-old male population—would be reversed in coming years.

      Researchers funded by the federal government announced in June that they had virtually completed deciphering the entire human genetic code, well ahead of schedule. (See Life Sciences: Special Report (Human Genome Project:Road Map for Science and Medicine ).) A jury in Florida awarded a record $145 billion in punitive damages in a class-action suit brought against major tobacco companies by smokers afflicted with tobacco-related illnesses. Tobacco company officials warned that the verdict could prompt bankruptcies in the industry and adversely affect the 25-year, $246 billion settlement negotiated with states in 1998.

      In Washington, D.C., Judge Thomas P. Jackson, who had earlier declared software giant Microsoft Corp. guilty of antitrust violations, ordered the company broken up. The decision was immediately appealed by Microsoft, which started the year as the world's most valuable enterprise. If sustained on appeal, the ruling would produce the country's largest government-mandated breakup since AT&T was restructured in 1984.

      Reports of numerous deaths and injuries—eventually totaling 148 and 500, respectively—involving Ford Motor Co. products, particularly the popular Explorer sport utility vehicle, prompted a historic recall of 6.5 million Firestone tires. As federal officials investigated, Ford and Firestone parent Bridgestone Corp. each blamed the other firm's manufacturing or design process for the problems. Lawmakers criticized both companies at separate House and Senate hearings. Firestone's chief executive officer made a public apology, while Ford said it would not rest until every faulty tire had been replaced. By year's end, Ford had settled at least seven lawsuits and planned to settle more cases stemming from accidents involving Explorer vehicles and Firestone tires.

      Another major economic setback occurred when forest fires swept a dozen western states in the summer, charring more than 400,000 ha (1,000,000 ac). One particularly virulent fire, which caused an estimated $300 million in damage to the federal Los Alamos (N.M.) National Laboratory alone, started as a “controlled burn” set by the U.S. Forest Service in New Mexico.

      The booming economy and a landmark 1996 federal law helped spur a continued reduction in public-assistance rolls during the year. President Clinton noted that welfare caseloads nationwide had dropped by eight million, or 60%, during his presidency, most after passage of a bipartisan welfare-reform act just prior to the 1996 election. Apparently ending a decade-long controversy, the Food and Drug Administration (FDA) approved the U.S. sale of the European-developed abortion drug RU-486. The drug, also known as mifepristone, allows women to terminate a pregnancy during its first several weeks. Republican presidential candidate George W. Bush said that he opposed the FDA move, but he stopped short of promising to reverse it.

      Preliminary results of the April census, released on December 28, showed that the U.S. population had swelled to 281,421,906.

Foreign Policy.
      No challenger emerged during the year to the U.S.'s claim as the sole world superpower. Russia, Japan, and China continued to struggle with internal economic weakness, and European attempts to consolidate were hampered by an underperforming currency and intramural political difficulties. Throughout the year the U.S. military was deployed around the world to keep the peace, and its superiority in any pitched engagement was unquestioned.

      U.S. vulnerability to terrorism was underscored anew on October 12, however, when an explosives-laden rubber boat rammed a U.S. destroyer, the USS Cole, docked in Yemen for refueling. The blast tore a major hole amidships, killing 17 American sailors and wounding 39. American investigative authorities rushed to the scene but received only desultory cooperation from sovereignty-minded Yemeni officials. U.S. forces were placed on alert worldwide, and, fearing sabotage, American authorities temporarily stopped military vessels from using Egypt's vulnerable Suez Canal.

      No credible group claimed responsibility for the assault. U.S. investigators soon focused suspicion on Osama bin Laden, a Saudi dissident operating a terrorist-training organization under protection of Taliban authorities in Afghanistan. Bin Laden had reportedly planned coordinated terrorist assaults on U.S. interests worldwide on Jan. 1, 2000, including an attack on a U.S. ship visiting Yemen, but most plans had been at least temporarily thwarted.

      The probe of a mysterious October 1999 EgyptAir plane crash off the coast of Nantucket, Mass., stalled as American and Egyptian investigators produced conflicting interpretations of available evidence. U.S. officials attributed the crash to suicide by an off-duty co-pilot, Gamil al-Batouti, who was at the controls as the jumbo jet stalled and went into a fatal dive. Egyptians suggested that equipment failure prompted the disaster.

      U.S. foreign-policy makers could claim a major victory when Yugoslav Pres. Slobodan Milosevic resigned on October 6. U.S.-led NATO forces had conducted a major bombing campaign against the Milosevic regime in early 1999 to stop mistreatment of ethnic Albanians in Kosovo (a province of Serbia) and had maintained economic sanctions against his regime following cessation of military action. Milosevic lost an election in late September but was holding out for a runoff when Serbian citizens stormed government buildings in Belgrade, prompting an immediate change in government.

      Two other peace initiatives championed by President Clinton suffered setbacks during the year. The Northern Ireland peace plan, which Clinton had helped negotiate in 1998, stalled as the Irish Republican Army refused to decommission (surrender or destroy) its weapons.

      The long-running Middle East peace process, on the verge of a major breakthrough at midyear, virtually collapsed despite major efforts by Clinton and his administration. Clinton summoned Israeli Prime Minister Ehud Barak and Palestinian leader Yasir Arafat to Camp David, Maryland, on July 11–25 for intensive discussions. With Clinton shuttling between the two and exerting maximum pressure, the principals edged close to an agreement before an impasse was ultimately declared. The major sticking point was the legal status of Jerusalem, which both Arabs and Jews claimed as their capital.

      In ensuing weeks the process broke down completely. Palestinian rioting began after conservative former general Ariel Sharon visited the Temple Mount, technically in a neutral zone but traditionally off-limits for prominent Jewish visitors. Barak, suffering political criticism for excessive accommodation at Camp David, responded with force. As violence escalated, Israel suspended participation in the peace process, and Barak announced new national elections in 2001.

      Fidel Castro, the target of U.S. economic sanctions since shortly after he took over Cuba in 1959, enjoyed propaganda victories at the U.S.'s expense. Castro mobilized Cuban public opinion to demand the return of Elián González, a six-year-old boy whose mother had died at sea while fleeing Cuba for the U.S. in late 1999. The administration announced in early January that it would comply, but the boy's Miami, Fla.-based relatives sued, tying his fate up in legal wrangling for months. In April Elián's father, Juan Miguel González, traveled to the U.S. to escort his son home.

      Following a legal ruling, armed agents of the Immigration and Naturalization Service stormed the Miami home of the boy's relatives in the early hours of April 22, seizing the child at gunpoint and reuniting him with his father in the Washington, D.C., area. U.S. authorities, however, prohibited the Cubans (now joined by several of Elián's Cuban classmates) from leaving until court appeals had been exhausted. Finally, on June 28, after the U.S. Supreme Court refused to issue a stay, Elián and his father returned to Havana and a highly publicized Castro welcome. Castro later poked fun at election difficulties in Florida, offering to send election assistance to ensure that democracy prevailed.

      U.S. relations with China continued on an uneven path. Trade relations between the two countries were finally normalized in October, overcoming U.S. concerns over Chinese human rights problems, China's militant attitude toward Taiwan, and the exclusion of U.S. investment. China lodged vigorous objections to U.S. prosecution of Wen Ho Lee, a scientist at the Los Alamos (N.M.) National Laboratory accused of having sent U.S. nuclear secrets to China. After having publicly proclaimed overwhelming evidence against Lee, the U.S. abruptly allowed the Taiwan native to plead guilty to reduced charges and thus seemingly confirmed China's reservations.

      The 1997 Kyoto global warming treaty, which would require the U.S. and other industrial countries markedly to reduce greenhouse gas emissions, suffered a major setback in a conference at The Hague. Complications over higher oil prices, the collapse of the Russian economy, and a plan to allow wealthy nations to buy “credits” for excessive emissions from less-developed countries prompted a near collapse of ongoing negotiations.

David C. Beckwith

▪ 2000

Area: 9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries
Population (1999 est.): 273,131,000
Capital: Washington, D.C.
Head of state and government:
President Bill Clinton

      The 20th century had become widely known as the American Century, and the United States ended it by putting an exclamation point on that description. Even while its national government was effectively mired in gridlock—perhaps because of it—the U.S. economy in 1999 roared ahead in a ninth consecutive year of vibrant expansion, its most enduring ever. U.S. leadership was recognized worldwide. Seldom in history had a country so dominated the globe in so many ways—militarily, culturally, economically, scientifically. Commentator Alan Murray, writing in The Wall Street Journal, encapsulated the country's enviable position by saying, “The U.S. enters the 21st century in a position of unrivaled dominance that surpasses anything it experienced in the 20th. Coming out of World War II, the U.S. may have controlled a larger share of world output; but, it also faced threats to its security and its ideology. Today, those threats are gone, and the nation far outstrips its nearest rivals in economic and military power and cultural influence. America's free-market ideology is now the world's ideology; and the nation's Internet and biotechnology businesses are pioneering the technologies of tomorrow.”

      The national economic prosperity, however, masked an internal disquiet and raised difficult, perhaps unanswerable questions about the country's direction. Some concerned the U.S. responsibility as unquestioned world leader to act as a global policeman and confront human rights abuses abroad. Other questions addressed perceived inequity and deterioration in American society. The gap between rich and poor continued to widen, and evidence of breakdown in the traditional American family mounted as well. A series of mass shooting incidents across the country, highlighted by a major tragedy in Littleton, Colo., shocked a nation that still cherished its frontier heritage. At century's end the United States was still seeking internal harmony to accompany its economic might.

The Economy.
      The purring U.S. economy seemed to defy gravity during the year. Economic expansion continued at an average 3.5% rate for the eighth consecutive year. Such old-line measures as housing starts and vehicle production recorded unprecedented results; unemployment drifted down to 4.1%, its lowest level since 1970; and consumer confidence again hit a record high. These figures, which historically would have aroused inflationary fears, were now accompanied by low interest rates and a slow 2% growth in the consumer price index, a combination that economists said was unprecedented.

      To a great extent the economic boom in the U.S. was powered by its unquestioned premier position in technology, which allowed major gains in productivity and made near-instant millionaires out of thousands of entrepreneurs and investors. Some 570 new companies—half of them Internet-related enterprises—sold stock in initial public offerings during the year, raising a record $69.2 billion in new capital. Technology also powered the stock market; while the Standard & Poor's index of 500 stocks rose 20% during 1999, the tech-dominated Nasdaq index climbed 85%, and some Internet mutual funds rose 300% or more. To some observers the soaring values confirmed the market's faith in future technology profits. Critics, however, viewed the run-up as a mania, the triumph of greed over common sense, a speculative bubble that would eventually have to burst.

      Unlike past years, the U.S. economic leadership enjoyed favourable global tail winds, with most major economies in Europe and Asia also posting positive growth. Fine-tuning of the U.S. economy was again supervised by the U.S. Federal Reserve System, which nudged federal fund interest rates upward three times during the year—from 4.75% to 5%, to 5.25%, and to 5.5%. These minihikes precisely reversed three identical stimulative cuts in 1998 and this time served to quench inflationary fears inflamed by the country's torrid economic performance. The irrepressible economy virtually ignored several major natural disasters. These included a record five category-four hurricanes in the Atlantic, including Floyd, which inflicted $6 billion in damage in North Carolina, where 30,000 homes were inundated and 42 people were killed.

      The economy also shrugged off two major challenges: the Year 2000 computer problem and a major antitrust ruling against Microsoft Corp., the country's largest company in terms of market value. Any adversity for Microsoft was serious, since the Redmond, Wash.-based firm, along with chip maker Intel Corp., had largely created the standards by which the U.S. dominated world technology. In findings of fact issued by a U.S. district judge in Washington, D.C., on November 5, after a 13-month trial, Microsoft was determined to have abused its near monopoly on personal computer operating systems, the software that runs computers. The judge also found that Microsoft had misused its dominant position to try for a similar advantage in marketing “browser” software used to explore the Internet. Remedies, which could include the breakup of the company, were to be decided in 2000. In the financial markets, however, the ruling was a nonevent. Prices of technology issues, including Microsoft, rose by more than 20% in the six weeks following the ruling.

      Concern over Y2K problems was intense in early 1999, with some experts predicting disasters ranging from a breakdown of the international banking system to a near-complete shutdown of world power supplies. The worry abated during the year, however, as companies and governments spent billions correcting computer programming and reassuring consumers about their efforts. Still, the year ended with some apprehension. Major airlines canceled one-third of their December 31 schedule, and numerous commercial parties celebrating the new millennium (see Mathematics and Physical Sciences: Sidebar (New Millennium-Just When Is It Anyway? ).) were canceled as potential patrons decided to stay safely at home.

Domestic Affairs.
      Antipathy between the Democratic president and the Republican Congress led to a virtual legislative stalemate during the year. The list of major measures either defeated or deferred was far longer than the number of significant legislative accomplishments. “This was a session that was postimpeachment and preelection,” observed U.S. Sen. Joseph Lieberman, a Democrat from Connecticut. Both sides entered 1999 in a weak position; Clinton faced an impeachment trial, and the GOP control over the U.S. House had been reduced to only five seats in late 1998 elections. As 1999 ended in near gridlock, both Clinton and the GOP Congress were, if anything, even weaker; Clinton battled oncoming lame-duck status and declining support in the polls, and many commentators predicted that Democrats would regain control of the U.S. House of Representatives in 2000 elections.

      A long-delayed reform of the nation's banking laws was signed into law, largely breaking down barriers to entry between the banking, financial, and insurance industries. Congress also gave flexibility to states in using federal education dollars and, following years of contentious debate, committed to development of a ballistic missile defense system for U.S. territory and armed forces.

      For the fourth consecutive year, Senate Republicans killed an overhaul of the nation's campaign finance laws. A bill approved by the House trimmed back “soft money” contributions to major political parties but was judged by GOP senators as restricting free-speech rights of their supporters, including corporations. Congress also turned down major legislative initiatives to restrict sales of handguns and to reform the nation's bankruptcy laws.

      President Clinton vetoed a 10-year, $792 billion tax-cut measure approved by Congress, calling the measure inequitable and excessive. The Senate rejected a Comprehensive Test Ban Treaty submitted by the administration; Republican-led opponents maintained that the treaty would hinder U.S. defense efforts without providing any real benefits. Owing to preelection maneuvering on both sides, no serious attempt was made to address badly needed reform of both Social Security retirement and Medicare systems, which were financially endangered by an imminent influx of baby-boomer recipients. Election-year considerations also delayed deliberation of two other popular ideas—a proposal to add a prescription drug benefit to the Medicare program for seniors and various bills regulating health maintenance organizations (HMOs), including the establishment of a “bill of rights” for health-plan patients. In both cases Democrats advocating the measures decided that debate on the proposals would be more useful during election year 2000.

      Two unexplained airplane crashes received overwhelming news coverage. On July 16 a small plane piloted by John F. Kennedy, Jr. (see Obituaries (Kennedy, John Fitzgerald, Jr. )), took off from New Jersey and crashed into the sea near Martha's Vineyard, Mass., killing Kennedy, his wife, Carolyn, and her sister. On October 31 EgyptAir Flight 990 fell into the sea only minutes after takeoff from New York's Kennedy Airport en route to Cairo, killing all 217 aboard. U.S. investigators found no evidence of explosion or mechanical failure aboard the Boeing 767-300 and initially pointed to a suicide attempt by a relief co-pilot on board as the possible cause. When angry Egyptians blamed anti-Arab bias for this theory, however, U.S. officials backed away. The mystery remained unsolved at year's end.

      In a tragedy that dramatically affected the national mood, two heavily armed students terrorized Columbine High School in Littleton, Colo., on April 20, killing 12 other students and a teacher before turning their weapons on themselves. Although Eric Harris, 18, and Dylan Klebold, 17, were apparently reacting to personal social rejection, their actions highlighted the long-running U.S. struggle to reconcile its constitutional protection of gun ownership with the realities of modern urban violence.

      Mass shootings also erupted during the year all across the country—at a high school in Conyers, Ga., at two Atlanta, Ga., brokerage firms, on city and suburban streets in Indiana and Illinois, at a Jewish community centre in Los Angeles, at a Baptist church in Fort Worth, Texas, and at a Xerox warehouse in Honolulu. In most cases hatred of minorities appeared to fuel the attacks. The incidents renewed the national debate over gun control, reversed a trend toward liberalized gun-possession laws nationwide, and prompted searching examination, even as the U.S. enjoyed unprecedented prosperity, of the direction the country would take in the future.

      Even so, preliminary FBI figures indicated that the incidence of serious crime in the U.S. dropped by 10% during 1999, the seventh consecutive year of declining crime rates. Analysts attributed the trend to a healthy economy, tougher laws, longer sentences, and added prison capacity.

Clinton and Politics.
      As the year began, the second presidential impeachment trial in U.S. history began with pomp and ceremony in the U.S. Senate chamber. Conviction of the president by the required two-thirds vote was never a serious possibility in the sharply partisan upper chamber, particularly with public opinion polls showing nearly two-thirds of Americans opposed to removing Clinton from office on the two impeachment counts submitted by the U.S. House of Representatives. At one point the Senate came close to postponing the proceedings indefinitely by majority vote. A compromise, approved largely along party lines, however, allowed the trial to proceed but permitted new testimony from only three witnesses.

      Most of the five-week trial was rhetorical, with 13 impeachment managers from the House summarizing previously recorded evidence against the president, followed by rebuttal from Clinton's personal and White House attorneys. The final vote was not close, with only 45 of 100 senators supporting conviction on Article One, the perjury count, and 50 voting guilty on Article Two, obstruction of justice. Following acquittal, senators of both parties nonetheless condemned Clinton's conduct, but a Democrat-led effort to issue a resolution of censure against the president was blocked by Sen. Phil Gramm, a Republican from Texas. (See Sidebar (Prosecuting the President ).)

      Although intensity diminished, Clinton's image troubles continued during the year. In February the NBC television network broadcast a detailed interview with an Arkansas woman, Juanita Broaddrick, who alleged that in 1978 Clinton had sexually assaulted her in a Little Rock, Ark., motel room. On April 12 U.S. Judge Susan Webber Wright held Clinton in contempt of court for having provided “intentionally false” testimony in the Paula Jones sexual harassment case. Clinton's sworn statement denying “sexual relations” with a White House intern, Monica Lewinsky, had helped persuade Wright to dismiss the Jones case, but Clinton later admitted to “inappropriate intimate contact” with Lewinsky. Clinton, who had paid Jones $850,000 to settle the case in 1998, was ordered to pay nearly $89,000 more in legal expenses as compensation for the errant testimony.

      Clinton's reputation hung heavily over early maneuvering for the 2000 U.S. presidential election. Texas Gov. George W. Bush, son of former president George Bush, sprinted to a commanding early lead for the Republican nomination, in part by pledging to restore dignity to the Oval Office. Bush raised nearly $70 million in contributions during the year, double the previous record, and announced he would forgo matching federal funds in order to increase his flexibility in campaign spending. After some hopefuls dropped out, Bush was being contested at year-end by five other GOP candidates, notably maverick Sen. John McCain of Arizona, a former prisoner of war in Vietnam.

      Former senator Bill Bradley of New Jersey, a onetime professional basketball player, mounted an unexpectedly serious challenge to Vice Pres. Al Gore for the Democratic nomination. Gore was endorsed by Clinton and enjoyed backing from many party regulars, but Bradley's campaign was lifted by “Clinton fatigue,” a feeling that the Clinton-Gore administration had worn out its welcome.

Foreign Affairs.
      With no real challenger for world economic leadership, the U.S. nonetheless struggled to find its proper role in post-Cold War political affairs. Internally, there was no clear direction on when the United States should use its power and influence to intervene in conflicts abroad. Beset by internal politics, the U.S. also suffered setbacks in its efforts at building international consensus.

      The West's long-running problem with Yugoslav Pres. Slobodan Milosevic (see Biographies (Milosevic, Slobodan )) over ethnic persecution in his country boiled over into major violence. Milosevic repeatedly stalled efforts to enforce United Nations resolutions seeking exit of Serbian forces from the province of Kosovo, where Serbs dominated a population consisting of 90% ethnic Albanians. A stream of Albanian refugees into neighbouring areas turned into a flood when native Serbs, apparently with military backing, stepped up a campaign of terror, property destruction, and killings early in the year. Up to one million Albanians were displaced.

      On March 24 U.S.-led NATO forces began devastating bombing and missile attacks on Yugoslav positions. The strikes continued for 78 days and finally caused Milosevic to agree to the withdrawal of Serbian forces, the safe return of Albanian refugees, and the introduction of armed UN peacekeepers to ensure an end to violence. In the process, however, NATO forces made numerous mistakes, bombing media buildings, civilian residential areas, and bridges. In one notable miscue, a U.S. B-2 stealth bomber destroyed the embassy of China in Belgrade, Yugos., killing three Chinese civilians and wounding 20 others. It was later revealed that the CIA had obtained the correct street address for a Serbian-controlled building but assigned it on a map to the wrong building, the embassy. President Clinton expressed “regrets and profound condolences,” but the incident had an impact on the U.S.'s rocky relations with China. In mid-December the U.S. promised to pay China $28 million in compensation for the May bombing.

      For 10 years, since the fall of the Berlin Wall and the breakup of the Soviet Union, the U.S. had made special efforts to influence Russia, in large part to encourage dismantlement of Russia's 3,000 nuclear warheads. Relations deteriorated markedly during 1999, however, as suspicions grew that individuals in Russia had embezzled and squandered billions of dollars in foreign assistance. In October three Russian immigrants as well as their companies were indicted on charges that stemmed from an investigation of money laundering at the Bank of New York. At year-end, over U.S. objections, Russian military forces engaged in another attempt to subdue the breakaway republic of Chechnya, which further strained U.S.-Russian relations.

      China, seen by many as the eventual challenger to U.S. world domination, continued to provide major headaches for U.S. policy makers. (See World Affairs: China: Special Report (China: Asia's Emerging Superpower ).) Despite U.S. entreaties, Chinese leaders provided no substantive satisfaction on charges that they had supervised the theft of U.S. nuclear lab secrets, improperly financed the 1996 U.S. election, threatened Taiwan militarily, violated human rights by suppressing political dissent and imposing population control measures, and unfairly barred U.S. businessmen from operating in China. Chinese Premier Zhu Rongji visited Washington in April, but President Clinton refused to sign a trade pact with China because he feared political backlash in the U.S. The Chinese embassy bombing a month later caused relations to deteriorate further and prompted virulent anti-American demonstrations all over China. By November, however, Clinton had agreed to a wide-ranging trade agreement that promised Chinese membership in the World Trade Organization (WTO) and increased access for American business in Chinese markets. The agreement required U.S. Senate approval in 2000 of “normal trade relations,” or permanent most-favoured-nation status, however, and prospects for Senate passage appeared anything but assured.

      With China and Russia reestablishing a close relationship after 30 years of estrangement—motivated in part by mutual antipathy toward the U.S.—world leaders looked to a November WTO meeting in Seattle to provide evidence of comity in the world community. Organizers had hoped for the international ratification of a major agreement reducing trade barriers worldwide, a document that had been prepared over years by trade officials. Instead, sometimes violent protests by masked demonstrators rocked the city, at times confining delegates to their hotels. President Clinton had long championed the trade-agreement process despite opposition from his supporters in labour unions and environmental groups; in Seattle, Clinton abruptly acknowledged the protests and temporarily scuttled signing the pact. His decision was widely derided as caving to domestic political pressure and was denounced by numerous world leaders.

David C. Beckwith

▪ 1999

      Area: 9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries

      Population (1998 est.): 270,262,000

      Capital: Washington, D.C.

      Head of state and government: President Bill Clinton

      In 1998 the United States experienced the best of times and the worst of times. On one level the national economy moved steadily forward through its eighth consecutive year of vigorous expansion, accompanied by remarkably low and declining inflation, interest rates, and unemployment. On an individual basis it was a great time to be an American, with the economy producing record real personal income, hundreds of thousands of new jobs, and lofty financial market prices for a prosperous and satisfied public. On another level, however, the national body politic was in turmoil. Years of investigations into various charges against Pres. Bill Clinton (see BIOGRAPHIES (Clinton, Bill )) coalesced during 1998 into a focused probe of his efforts to evade a sexual harassment lawsuit, which led to his impeachment at the year's end by a partisan and divided U.S. House of Representatives. The disconnection between the sunny economic conditions and the stormy wrangling in the capital split the country into two camps: a larger one happy with its lot under Clinton and bored by Washington's seeming obsession with scandal and a smaller group outraged by Clinton's conduct and determined to see him removed.

      The fragmented national mood confounded public opinion pollsters and helped produce an inconclusive national midterm election in November. With his personal popularity incongruously bolstered by the assaults on his character, President Clinton won almost every important battle with the Republican Congress when final tax and spending measures were enacted in October. The setbacks seemed to demoralize Republicans and energize the president's core supporters. Although most commentators predicted that Republicans would add to their majorities among governors and in the U.S. House and Senate, the election produced no change in the Senate and a reduction in the slim Republican advantage in the House. That result in turn prompted another major surprise: Clinton's chief Republican nemesis, controversial Speaker of the House Newt Gingrich, resigned his post and thereby effectively became the first major victim of the Clinton scandal.

The Economy.
      Although Clinton's problems captured more headlines, the most significant news of 1998 was the awesome, enduring strength of the American economy, which loomed like a colossus over a troubled world. Two million new American jobs were created, many of them high-paying positions in technology, pharmaceuticals, finance, and health care. The national economy shook off a spate of bad external news and grew at a 3.5% rate for a third consecutive year, nearly double the rate at which economists begin to fear overheating. Yet inflation remained well below 2%, even while unemployment sank to a 29-year low of 4.5%, real incomes rose at near-historic rates, housing construction was booming, and consumer confidence reached a record high.

      If the U.S. had merely been leading worldwide economic growth, its business performance would have been impressive enough. More remarkably, the muscular American economic engine surged forward even as trouble enveloped much of the world, refusing to slow significantly as other major economies faltered and stalled. When the year started, Japan was mired in a long-term economic malaise, and other vibrant Asian economies, especially Thailand, South Korea, Indonesia, and Malaysia, were still reeling from a 1997 currency crisis. By midyear the small but emerging Russian economy had begun to unravel as the ruble began to lose value, and rumours of trouble in Brazil and Argentina had reached Wall Street.

      In July the U.S. stock market seemed to lose heart under the accumulated weight of world adversity and began a long, steady price drop, erasing in two short months a 20% gain posted early in the year. The Dow Jones industrial average fell from 9200 to 7400 by September, and a major New York hedge fund, Long Term Capital Management, heavily invested in Russia, was threatened with bankruptcy. Once again, as it had through much of the record American peacetime expansion, the U.S. Federal Reserve System rode to the rescue. Under Chairman Alan Greenspan's supervision, a consortium of private lenders poured liquidity into the fund, effectively taking it over. Greenspan also orchestrated three small but rapid reductions in short-term interest rates over a seven-week period from September to November. The cuts in the federal funds rate—from 5.5% to 5.25% to 5% to 4.75%—helped Wall Street recover its footing and reverse the downturn. By year's end the Dow had made its most spectacular comeback in history, eliminating the entire summer decline and again threatening record territory, even as the country's political leadership seemed to be disintegrating.

      The hearty American economic performance produced a political side benefit: elimination of the federal government's chronic budget deficit. The red ink had hit a high of $290 billion in 1992, and administration economists had projected permanent $200 billion deficits as recently as 1996. The relentless surge of the national economy, however, reduced social expenditures (especially as the 1996 national welfare-reform law took full effect) and increased tax revenues far more rapidly than any economist could have predicted. When President Clinton and Congress agreed on a balanced-budget deal in August 1997, the 1998 deficit was estimated at $90 billion. By January 1998 the projected deficit had shrunk to $22 billion, but when the fiscal year ended in October, authorities announced a surplus of $70 billion and forecast future black ink as far as the eye could see.

Domestic Affairs.
      In Washington, D.C., the vibrant economy did little to temper the increasing partisanship that infected the nation's capital. Lawmakers were distracted during the year by the Clinton inquiry and worried about upcoming elections. For his part, the president was unable to provide effective and consistent leadership throughout the year. As a result, the year was more notable for legislation defeated than for initiatives approved. Congress did approve a massive $216 billion highway and transit reauthorization that provided many new public works projects in every congressional district during the next six years. Also, when polls showed overwhelming public support, Clinton signed a Republican-backed reform bill ordering the Internal Revenue Service to become more responsive to taxpayer concerns.

      Most other major legislative initiatives died in partisan crossfire. Going down to defeat were plans to fund a massive national missile defense effort and to cap punitive damages in product-liability cases (both blocked by Democrats) and efforts to hike the minimum wage, to overturn the president's partial-birth-abortion veto, to reform bankruptcy laws, and to expand patients' rights in their dealings with health care providers and employers (all opposed by Republicans).

      An effort to reform the nation's easily evaded campaign finance laws also died on the U.S. Senate floor. In August the House approved a measure limiting both "soft money" and "independent expenditures," two major loopholes in campaign laws. Unlimited soft money dollars flowed from corporations, labour unions, and other interested groups directly to major political party coffers; independent expenditures allowed groups to spend without limit as long as they purported to advocate issue positions rather than individual candidates. Republicans feared the plan would not curb Democratic-oriented donors such as labour unions and environmental activists as much as it would inhibit GOP-leaning contributors such as businesses. Consequently, a month later a substantial minority of 48 Republican senators killed the Senate version by filibuster, talking until the measure was abandoned.

      An attempt to fashion a comprehensive national settlement with tobacco companies over costs of smoking-related health problems met a similar fate. A bill implementing a $368 billion proposed settlement in 1997 was shepherded easily through the Senate Commerce Committee by Chairman John McCain. It would have raised federal cigarette taxes by $1.10 per pack, restricted tobacco advertising, ordered Food and Drug Administration regulation of tobacco, and established fines if the incidence of teenage smoking failed to drop. Antismoking senators then raised the industry cost to $516 billion and dropped company protections against future litigation. At that point the four major tobacco firms ceased negotiations and started a $40 million advertising campaign, attacking the Senate bill as a large tax increase to fund new government spending programs. Even though a majority of senators continued to favour the bill, they could not break another filibuster, and the measure died.

      The incidence of crime, particularly violent offenses, dropped in the U.S. for the sixth consecutive year, but several high-profile acts nonetheless raised fears about trends in American society. In July a deranged gunman attempting to enter the U.S. Capitol building in Washington killed two policemen, the first deaths ever recorded at the Capitol. The offender was critically wounded and was later found mentally incompetent to stand trial. Young students with firearms were involved in two highly publicized tragedies. In Jonesboro, Ark., two boys, 12 and 13, killed 4 students and a teacher and wounded 10 others in a shooting spree at their junior high school in March. Two months later a 15-year-old Springfield, Ore., boy shot and killed his parents and then took their guns to his high-school cafeteria, where he shot 24 students, 2 of them fatally. Three white men, two wearing tattoos identifying them as members of a white racist prison gang, were charged with murder after an African-American man was dragged to death behind a pickup truck near Jasper, Texas, in June. In Laramie, Wyo., an openly gay University of Wyoming student was kidnapped from a bar, tied to a fence in a rural area, and beaten to death by two men. The incidents prompted renewed calls for new laws to combat so-called hate crimes and illegal possession of firearms.

Investigating the President.
      Throughout his long political career, President Clinton had benefited from his extraordinary communication skills, a talent that enabled him to demonstrate empathy with his audience and to turn close arguments decisively in his favour. This skill often infuriated his rivals, who complained that his verbal adroitness hid a lack of character and appreciation for the truth. For years Clinton sailed serenely over those criticisms. In 1998, however, through a series of wildly improbable and unexpected events, he became enmeshed in a quagmire over credibility that culminated at the year's end with his becoming only the second president in history to be impeached by the U.S. House of Representatives.

      Clinton was compelled to give a sworn deposition as defendant in a sexual harassment civil rights lawsuit filed by Paula Jones, a former Arkansas state employee. Jones, having alleged that then governor Clinton had improperly propositioned her in a Little Rock hotel room in 1991, sought evidence of other extramarital adventures by Clinton. At the January 17 deposition, Clinton generally denied all but one such affair (he belatedly acknowledged a previously denied liaison with Gennifer Flowers), but he specifically rejected suggestions he had dallied with a former White House intern named Monica Lewinsky. An Arkansas federal judge, relying in part on Clinton's denials, subsequently dismissed Jones's case.

      Unknown to Clinton, however, Linda Tripp, a former White House employee, had secretly recorded some 20 hours of conversations with her friend Lewinsky in which the young woman detailed an intimate relationship with the president. Tripp took her evidence to Kenneth Starr (see BIOGRAPHIES (Starr, Kenneth W. )), the court-appointed independent counsel who had been investigating Clinton and associates for three years. Starr received court approval to expand his investigation to the Lewinsky matter. (See Sidebar (Limits of Power of the Independent Counsel ).)

      Through the spring, as Starr's grand juries gathered evidence on Lewinsky, the administration attempted to block testimony of various Clinton aides and security officers by asserting privilege claims, but most were rejected by the courts. After months of delay Lewinsky hired new lawyers and eventually began cooperating with Starr. On August 17 Clinton gave extraordinary testimony via video to the Starr grand jury, admitting an "improper relationship" with Lewinsky but invoking precise word definitions to deny that he had lied or committed perjury in his previous sworn statements. Far from falling, Clinton's high public-approval ratings actually rose following these concessions.

      In the fall a series of unexpected and virtually unprecedented events rocked Washington. Starr sent the U.S. House two truckloads of documents with a message alleging that Clinton may have committed at least five impeachable offenses in his handling of the Jones suit and subsequent investigation. Most commentators predicted the charges would aid Republicans in November elections, but the GOP actually lost five House seats. House Speaker Gingrich, a severe Clinton critic, then announced his resignation. At that point pundits declared any impeachment inquiry dead, killed, in effect, by the will of the voters. Instead, however, the House Judiciary Committee under Chairman Henry Hyde conducted a hearing and voted along strict party lines to recommend impeachment of the president on four counts.

      In December the matter moved to the full Republican-controlled House, with Democrats continuing to complain that Clinton was being charged with personal offenses in his private life that had nothing to do with public conduct of his office. Only a few minutes before the actual voting, Gingrich's designated successor as speaker of the House, Louisiana Rep. Robert Livingston, announced that he would resign from the House after acknowledging he had engaged in extramarital affairs. The House then impeached Clinton on two counts, perjury and obstruction of justice, again largely along partisan lines, with only a half dozen representatives from each party straying from the party-line vote.

Foreign Affairs.
      The cease-fire agreement ending the 1991 Persian Gulf War specified that international sanctions against Iraq should remain in place until United Nations inspectors could verify that Saddam Hussein's missile, biological, chemical, and nuclear weapons programs had been completely dismantled. As the year began, the U.S. and Great Britain sent a major military task force to the Persian Gulf to force compliance with inspection demands. In February UN Secretary-General Kofi Annan flew to Baghdad and hammered out an 11th-hour agreement calling for "unconditional and unrestricted" access for inspections. That pact began to fray almost immediately, as Iraq demanded a firm date for the conclusion of weapons monitoring. In August Saddam Hussein suspended cooperation with inspectors, which precipitated yet another slowly evolving crisis. Meanwhile, the top American inspector, Scott Ritter, resigned his post, accusing the Clinton administration of deliberately canceling aggressive inspections to avoid provoking Hussein. On November 13, following another allied military buildup in the region, President Clinton ordered U.S. forces to attack Iraq. After B-52 bombers were airborne, however, Clinton announced that Iraq had "backed down" and had promised full cooperation with inspectors.

      That agreement lasted little more than a month. UN inspectors, whose visits the Iraqis called deliberately provocative, were turned away from political and military sites. In mid-December, just one day before the House was to vote on his impeachment, Clinton again issued orders for joint U.S.-British air strikes on Iraq. The resulting 70-hour bombardment produced uncertain damage to Iraqi installations but an apparently decisive political result; at year's end Hussein declared he would no longer allow UN inspectors to operate within his country.

      Within minutes on August 7, U.S. embassies in Nairobi, Kenya, and Dar es Salaam, Tanz., were hit by terrorist bombs that killed 262, including 12 American citizens. Within days, authorities in a dozen countries developed information linking the attacks to Al-Qaeda, a militant Islamic organization run by Saudi-born millionaire Osama bin Laden. (See BIOGRAPHIES (Bin Laden, Osama ).) On August 20 the U.S. military fired 75 Tomahawk cruise missiles at a bin Laden military training compound in eastern Afghanistan and at a pharmaceuticals factory in Khartoum, Sudan, that U.S. authorities claimed manufactured the "precursors" of chemical weapons.

      The U.S. policy of stopping nuclear weapons proliferation suffered several setbacks during the year. Catching U.S. intelligence largely unaware, India conducted a series of underground nuclear tests on May 11 and 13. President Clinton announced economic sanctions against India and dispatched a high-level delegation to dissuade rival Pakistan from duplicating the feat. Pakistan, however, performed six of its own weapons tests on May 28 and 30, again prompting U.S.-led world economic sanctions. Critics charged the U.S. with hypocrisy for partnering with China, another declared nuclear power. Both countries declared a moratorium on future tests, and by early November Clinton had canceled the short-lived sanctions.

      President Clinton, who had criticized his predecessor George Bush for preoccupation with foreign policy, spent an unprecedented 86 days traveling abroad during the year. He was periodically accused of taking action on a variety of international issues as a means of distracting attention from his personal legal problems. Even so, Clinton achieved a number of unqualified foreign policy successes during 1998. He was universally credited with a vital role in the brokering of a historic agreement to end 80 years of sectarian strife in Northern Ireland. He achieved a similar apparent breakthrough in the Middle East peace process in late October when Israeli Prime Minister Benjamin Netanyahu and Palestinian leader Yasir Arafat signed an interim agreement for Israeli withdrawal from part of the occupied West Bank. Clinton brought the two sides together and laid plans for what turned out to be an intense nine-day series of closed meetings at the Wye River Conference Center on Maryland's Eastern Shore. Although skeptics questioned the Wye Memorandum's durability, it appeared to provide a basis for historic cooperation between adversaries.

      As the Asian currency crisis appeared to bottom out during the year, the U.S. Congress reluctantly endorsed International Monetary Fund efforts to shore up foundering world economies. Republican congressmen faulted the IMF's secrecy and claimed that the organization's stringent lending requirements helped compound troubles faced by some countries. The U.S.'s regular $3.5 billion dues were eventually authorized, however, and another $17.9 billion special contribution was approved as part of the year-end budget deal in October.


▪ 1998

      Area: 9,363,364 sq km (3,615,215 sq mi), including 204,446 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries

      Population (1997 est.): 267,839,000

      Capital: Washington, D.C.

      Head of state and government: President Bill Clinton

      In 1997 the United States experienced a truly vintage year: a time of peace, prosperity, relative harmony, and rising prospects—favourable indicators that had not been seen for at least 25 years. On the world stage the U.S. stood unchallenged as the globe's sole superpower, and at home a business expansion already some seven years old continued. The U.S. was also at the centre of a global reorganization of production—the so-called new economy of computers and the Internet. As financial storms battered other parts of the world, U.S. stock markets were at an all-time high, and unemployment was at a 25-year low and shrinking. Inflation, the bane of fiscal conservatives during any economic surge, was virtually nonexistent, even though wages, for years stagnant as the economy endured painful restructuring, were finally on the rise. Unlike 25 years earlier, no great social or political conflicts shook the nation. Crime, the blight of the urban U.S., was on a sustained decline, and welfare rolls were shrinking dramatically.

Domestic Affairs.
      In Washington, D.C., Pres. Bill Clinton showed himself to be less of a master bridge builder than a shrewd fence straddler. In the wake of his resounding 1996 election victory, Clinton, the first Democrat to have won reelection since Franklin D. Roosevelt, continued to follow his "triangulation" strategy—placing himself to the right of most Democrats and to the left of most Republicans. His popularity ratings stayed consistently above 50% through much of the year, despite a variety of alleged and interminable scandals and investigations that had become a hallmark of his presidency. Even with a Republican-dominated Congress, Clinton achieved a goal that had eluded presidents since 1969—an extraordinary bipartisan agreement to balance the federal budget by the year 2002. In the process he presided over the largest U.S. tax cut since 1981, including reductions in capital gains (the maximum rate would drop from 28% to 20%) and estate taxes (the basic $600,000 exemption would double over time). In all, the tax reductions were estimated to be worth $96 billion over five years and $282 billion over a decade. In addition, Clinton doled out billions in additional subsidies for middle-class college education and health insurance for children.

      The main parts of the deal included a $58 billion reduction in nonmilitary spending, about $12 billion more than Clinton had originally proposed. More than $115 billion was also anticipated in savings from Medicare programs. Despite the austerity, the agreement provided $34 billion for important presidential priorities, including health insurance for up to 10 million children not covered by private or public plans. It also allowed for restoration of welfare benefits to legal immigrants who had been dropped during the budgetary wars of 1996, expansion of student loan programs, and new funding for early childhood assistance through Head Start programs. The $135 billion tax cuts were offset somewhat by the $50 billion saved by raising tax revenues on airline tickets and by closing alleged tax loopholes. Both the spending and the tax portions of the budget passed the two houses of Congress by wide margins.

      The sudden breakthrough in fiscal probity was attributed to economic growth, which changed government projections for social outlays and tax inflows and reduced the estimated budget deficit in 1997 to a comparatively paltry $22.6 billion. The agreement on such a sweeping deal between Clinton and Congress was a tribute to the president's political skills as well as a sign that the nation had retreated from a confrontational mood and expected politicians to do the same.

      The tangible decentralization of power showed itself in a multitude of ways, but one of the most obvious was welfare reform. Since 1996, when Congress passed the welfare-reform law, state and local governments had used their new authority to change their systems of social protection dramatically. Revised work and eligibility rules for welfare had cut rolls in Wisconsin by 55% since the start of the decade. Oregon, Indiana, West Virginia, Rhode Island, and Connecticut all experienced decreases of 40% or more. Throughout the Midwest and most of the old South, welfare rolls fell anywhere from 20% to 40%. Only California registered an increase.

      Americans endorsed mayors who followed federal and state trends toward spinning off government services to private contractors, balancing budgets, and reshaping old-fashioned labour-management relations while dealing briskly with crime. As a result, such urban areas as Philadelphia and Cleveland, Ohio, cities that had been fiscal sinkholes in the 1980s, were reporting substantial surpluses, better services for residents, and renewed optimism.

      In Los Angeles low-key Republican Richard Riordan soundly defeated Democratic state Sen. Tom Hayden to win reelection to a second term as mayor in a city where Democrats outnumbered Republicans by two to one. In New York City Republican Rudolph Giuliani coasted to a similar victory in an even more stalwart Democratic stronghold.

The Economy.
      The reinvigoration, however, would not have been possible without the phenomenal performance of the economy, which entered the year growing at nearly a 4% rate, with unemployment hovering around 5.3% and the Dow Jones industrial average heading toward 7000. Debate grew over whether the pace could be sustained without a revival in inflation, which had hit a meagre 2.4% in 1996. As growth surged at 5.9% in the first quarter of the year, Federal Reserve Board Chairman Alan Greenspan fired a warning shot by raising the federal funds rate by 25 basis points, to 5.5%, the first interest-rate rise in two years. During the first half of the year, the economy continued to boom at a 4.1% rate—roughly double the pace at which economists generally feared a reignition of inflation. Yet Greenspan took no further action.

      The most striking economic phenomenon of 1997 was an enormous surge in jobs that did not bring about a corresponding rise in prices, even as real U.S. wages began to climb. By November the unemployment rate had fallen to 4.7%, the lowest since 1973. Meanwhile, over the 12-month period ended in November, Americans' incomes rose 4.1%, a real gain of 2% when adjusted for inflation—the highest rate recorded since the mid-1970s.

      The combined effect on the U.S. stock market of high growth, low unemployment, rising wages, and low inflation was galvanic. The Dow Jones industrial average broke through 8000 in July, and economists predicted that it might reach 10000 or even 12000 without a significant retrenchment. As a major financial crash in Southeast Asia cast clouds on the horizon, American optimism continued undiminished—until a minicrash came on October 27 that knocked 554 points off the Dow Jones in a single day. Yet 24 hours later the bull market regained momentum as the market climbed 337 points in a single session, the biggest rise in a decade.

      Organized labour, however, showed its resentment at Clinton's perceived bias in favour of conservative economic policies and corporate globalism. During his first term Clinton had strongly supported passage of the North American Free Trade Agreement between the U.S., Canada, and Mexico. During the deal making, he had traded away renewal of the administration's "fast-track" authority to negotiate trade agreements that Congress could approve or reject but not amend. Without such authority the president was in a weaker position to reach agreements on trade issues with other nations. In November, however, the White House was forced to announce that it would not seek renewal of the fast-track authorization, chiefly because of opposition from Democrats, heavily supported by organized labour, who opposed free trade because they believed it resulted in job losses for Americans. Clinton vowed to seek the authorization again in early 1998, but the setback was a blow to his international prestige.

Ethics in Government.
      The president and his administration continued to be troubled by a number of scandals. The most personal accusation was the charge of sexual harassment made by Paula Corbin Jones, who had been an Arkansas state employee when Clinton was governor. The White House argued before the Supreme Court that a president in office should be allowed to postpone until the end of his term civil suits derived from past actions. The court, however, did not agree, and by the end of the year, the country was facing the prospect of the president's being forced to give testimony in court.

      First lady Hillary Rodham Clinton also sought, and failed, to create a Supreme Court precedent in the Whitewater affair. Her attorneys argued, unsuccessfully, that notes taken by White House lawyers during conversations with her were protected by lawyer-client confidentiality and could be withheld from Kenneth Starr, the special prosecutor investigating the case. The court gave Starr access to the documents, but they did not lead to any startling changes in the three-year, $30 million probe of various real-estate deals conducted while Clinton was in Arkansas.

      All paled, however, before the outcry that arose, both in Republican circles and in the press, over the financing of the 1996 election campaign. At no time in U.S. history had more money been spent on electoral politics—$2.2 billion at all levels. A substantial amount of Democratic campaign funds, it appeared, had come from questionable sources, especially from businessmen with Asian backgrounds and often, it seemed, with interests in China. Revelations about Democratic Party fund-raising, which had trickled out even during the campaign, caused the Democratic National Committee (DNC) eventually to return $2.8 million in donations. The accusations became even more serious as various members of the U.S. national security establishment questioned the appropriateness of visits by some of the donors to the White House.

      Much was made of the activities of Charles Yah Lin Trie, a Taiwanese-born entrepreneur who ran a Little Rock, Ark., restaurant and who eventually became a top Democratic fund-raiser; he had visited the White House 23 times. Clinton admitted that it was "clearly inappropriate" for Trie, who had helped raise a substantial amount of money for the Democrats and for the Clintons' legal defense, to have escorted a known Chinese weapons dealer through the White House.

      Another figure in the fund-raising effort was California businessman Johnny Chung, who had donated a total of $366,000 to the DNC, all of which was later returned because the source of the money could not be verified. Among other indiscretions, Chung had managed to pass on a $50,000 check to the DNC through Hillary Clinton's then chief of staff, Margaret Williams. Two days later he escorted a number of Chinese business associates to a taping of Clinton's weekly radio address. The donation raised the issue of possible impropriety on the part of Williams for having accepted a campaign contribution on government property.

      The Republican-led furor over these and other revelations took on a shriller tone after it was discovered that Clinton and Vice Pres. Al Gore had made a number of fund-raising calls from their executive offices. The actions raised the spectre of a possible violation of the Pendleton Act, which forbids federal employees to solicit contributions on federal property. Although both denied wrongdoing, Gore said that he made calls on only "a few occasions," and the president claimed little recollection.

      Eventually, the campaign fund-raising issue came before a Senate investigating committee, chaired by Fred Thompson of Tennessee, who charged that the alleged scandal involved a plot on the part of China's government to influence U.S. politics. His committee issued 52 subpoenas, and fund-raiser Trie, for one, fled to China rather than testify. The hearings aired secret communications intercepts that indicated that Chinese officials in Beijing and Washington at least discussed how to increase their government's influence with U.S. local, state, and federal officials. In addition, the committee heard testimony that the Republican National Committee (RNC) had also received questionable support from abroad, dating back to 1994. The major donor was Hong Kong businessman Ambrous Tung Young, who had introduced Haley Barbour, then chairman of the RNC, to top Chinese officials. The RNC ultimately returned a $100,000 Young donation.

      Serious strains developed between Attorney General Janet Reno and FBI Director Louis Freeh over the fund-raising controversy. The dispute involved different interpretations of the 1978 Independent Counsel Act. Freeh believed that the act could be read broadly to ensure an impartial investigation; he urged Reno to turn the entire fund-raising matter over to an independent prosecutor because she, as a Cabinet official, faced a conflict of interest in investigating her own boss. Reno, however, took a narrower view of the legal grounds for appointing a special prosecutor. She insisted, with the backing of departmental attorneys, that only clear evidence of wrongdoing could trigger an independent investigation. Reno was shaken, however, when soon after she had made one of her clearest assertions of the lack of need for outside investigation, the White House began releasing videotapes of Clinton's meetings with various campaign donors, including controversial figures. Although none revealed anything illicit, Reno had not been informed of the existence of the tapes. In the end she remained firm—an independent counsel would not be appointed.

      The entire fund-raising issue clearly established that U.S. campaign-financing laws were a quagmire of bewildering distinctions between "hard" and "soft" campaign donations. As Clinton declared, reform of some kind was highly desirable, and several proposals were aired in Congress.

Foreign Affairs.
      Relations with China illustrated how difficult it had become, in a globalized economy, to distinguish foreign from domestic affairs, as greater numbers of Asians immigrated to the U.S. and more business was done with China. Greater commercial dealing with Asia's authoritarian regimes also raised larger questions of how to impress upon them the need for increased observance of human rights. All of these issues came to a head in late October and early November when Chinese Pres. Jiang Zemin made his first trip to the U.S., the first by a Chinese head of state in 12 years. His visit, coming only months after the return of Hong Kong to Chinese sovereignty, raised the issues of democracy and trade to a special level of sensitivity. In more than two hours of conversations with Clinton at the White House, and again in public, Clinton took unusual pains to stress that on the issue of democracy China's leadership was "on the wrong side of history." Jiang seemed unfazed by the admonition. On a more practical level, China pledged to cut off nuclear aid to Iran in exchange for future sales of American nuclear reactors to China.

      Late in the year the Clinton administration orchestrated a series of multibillion-dollar bailouts to shore up the short-circuited economies of Thailand, Malaysia, Indonesia, and South Korea, among others, which were caught up in a dominoes-style financial collapse. The International Monetary Fund was called in to provide what could prove to be upwards of $100 billion in interim financing, and the U.S. was embarrassed as a recalcitrant Congress refused to approve $3.5 billion in IMF contributions.

      In Europe U.S. foreign policy was on surer ground. In July the U.S.-led NATO security alliance invited three new members—Poland, Hungary, and the Czech Republic—all of which would formally join in 1999. The enlargement of NATO had been preceded by a lively debate within the administration over its advisability and had encountered vociferous Russian opposition. Nonetheless, the move proceeded as planned, with the alliance promising that it would deploy no combat troops or nuclear weapons in its new regions.

      Although most of the world's nations gathered in Ottawa in December to sign a treaty banning the use of antipersonnel land mines, the United States was not among the signatories. Clinton explained that treaty negotiators would not allow an interim exemption for the U.S., which had requested the continued use of antipersonnel mines to protect antitank defenses of vital importance in the Korean peninsula, where 40,000 U.S. troops and their South Korean allies were vastly outnumbered by the forces of North Korea.

      Clinton's most difficult foreign-policy challenge involved an old nemesis—Iraqi dictator Saddam Hussein, who had repeatedly shown an uncanny ability to win political advantage while still enduring the military and economic straitjacket imposed by a U.S.-led alliance after the Persian Gulf War. When teams of UN weapons inspectors apparently closed in on secret stocks of biological and chemical weapons, Hussein declared that American inspectors would no longer be allowed on the UN team hunting for Iraqi weapons of mass destruction. He eventually forced UN personnel to leave the country. The UN Security Council unanimously condemned Iraq but initially refused to follow the American lead toward further sanctions against Baghdad; it later imposed additional sanctions because of Hussein's continued unwillingness to cooperate and his threats to U.S. reconnaissance aircraft. When Hussein threatened to shoot down American U-2 spy planes overflying sensitive Iraqi areas, Clinton ordered three carrier groups to operate within striking range and massed aircraft in Saudi Arabia and Turkey. Hussein shrewdly backed down and invited UN inspectors back into the country but refused to grant them access to some 47 rebuilt presidential compounds. Although the U.S. sought to balance threats of continued economic embargo against incentives for further Iraqi cooperation, the consensus was that U.S. dependence on coalition building had perhaps resulted in a shift of political momentum toward its most dangerous regional adversary.

      See also Dependent States.

▪ 1997

      The United States of America is a federal republic composed of 50 states. Area: 9,362,753 sq km (3,614,979 sq mi), including 203,679 sq km of inland water but excluding the 155,534 sq km of the Great Lakes that lie within U.S. boundaries. Pop. (1996 est.): 265,455,000. Cap.: Washington, D.C. Monetary unit: U.S. dollar, with (Oct. 11, 1996) a free rate of U.S. $1.58 to £1 sterling. President in 1996, Bill Clinton.

      In 1996 Bill Clinton (see BIOGRAPHIES: Clinton, Bill) showed that he was a master at gauging shifts in national mood, and indeed at helping to create them, as he maneuvered in Washington, D.C., and campaigned across the country to become the first two-term U.S. president from the Democratic Party since Franklin D. Roosevelt 60 years earlier. Clinton's victory over his Republican opponent, former senator Bob Dole (see BIOGRAPHIES: Dole, Robert Joseph), was all the more remarkable in that voters, in the lowest turnout since 1924, also returned a Republican-majority Congress for the first time since 1930. Never before had a Democrat won the nation's highest office with the Congress controlled by his opponents. Once again, however, the people had opted for the U.S. equivalent of minority government. (See Special Report: The U.S. Presidential Election.)

      Nonetheless, Clinton could claim a clear victory. He won 49.2% of the popular vote, compared with 40.8% for his Republican rival; the remainder went to maverick populist Ross Perot, who ran as the Reform Party candidate. According to exit polls, Clinton was particularly favoured by women, who endorsed him 54% to 38%; by African-Americans, who voted for him 83% to 12%; and by the elderly, who voted Democratic 50% to 43%. The Republican majority, by contrast, was shaved marginally in the House of Representatives and expanded slightly in the Senate. More than half of the Republican casualties came from among the representatives who had first been elected in 1994.

      Clinton won his victory by moving with agility to the right, a talent he had demonstrated throughout his national political career but never against such odds as those he faced in 1996. In the process he managed to emerge once again in the public eye as a moderate. To many he seemed more moderate than Dole and his fellow Republicans, especially the aggressive speaker of the House of Representatives, Newt Gingrich, whom Clinton brilliantly demonized in the presidential campaign as an avatar of mean-minded radical conservatism, threatening the poor, the elderly, and the middle class with cuts in federally mandated entitlements. The net result was that Clinton, who began the year almost passively, with the government paralyzed through a budget deadlock, emerged as a mediating chief executive who could urge his defeated adversaries to join him in seeking a "common ground" during his upcoming term.

      Clinton, moreover, achieved this feat despite a continuing rain of scandals great and small upon his administration. They covered everything from the continuing investigation into the decade-old Whitewater land deal to more sinister questions about the abuse of confidential FBI files on political opponents and the improper raising of campaign funds from non-U.S. sources. As the year closed, the U.S. Supreme Court was prepared to hear arguments on whether the president should be allowed, on account of his office, to postpone a civil suit leveled against him by Paula Corbin Jones, a former Arkansas state employee who alleged that Clinton had made sexual advances to her while he was governor. It was one sign of the administration's political skills that, although none of the scandals had gone away by the end of 1996 and some might return to hurt the president in his second term, none proved fatal to Clinton's reelection.

The Economy.
      The fact was that, however many questions were raised about the president's character or that of his administration, other, more fundamental factors weighed heavily in favour of his reelection. The nation was at peace, and, above all, it was prosperous. The monetary manipulations of the Federal Reserve System (Fed) chairman, Alan Greenspan, and his Open Market Committee ensured that economic growth continued. The Fed cut short-term rates just before the new year began, with the aim of keeping growth in the range of 2.5% for 1996. Any fears of flat growth or recession were thus dispelled, and the president signaled his approval for this course by renominating Greenspan, a Republican, for his third four-year term as Fed chairman and naming two other economic moderates to the seven-member board.

      The steady growth put further downward pressure on the U.S. jobless rate, which was only 5.6% when the year began. By the time the year ended, it was 5.3%, not much changed but nonetheless at the lowest level since the 1970s. Inflation, too, was contained, staying at roughly 2.5%. Blue-collar workers registered a real, if marginal, rise in income, as wage increases averaged 2.8%, and white-collar workers saw a 3.1% increase in pay. Overall economic productivity rose at a 1.2% rate, while productivity in manufacturing rose 3.2%. Thus, the nation's economic progress was steady, if not muscular. One of the more negative signals, however, was the steady rise in personal bankruptcies, which reached more than one million during the year. There was also continued volatility in sectoral employment as large-scale corporate downsizing continued.

      The most dynamic sector of the economy was the high-tech, particularly the computer-oriented, firms that continued to drive the stock market to new heights. In the first half of 1996, the sale of new public stock offerings continued to be one of the fastest avenues of growth for new companies, which went public at a rate of 70 or more per month. In the process many suddenly became worth 200 or 300 times their previous value, creating a steady procession of new millionaires. The same frothy optimism continued to affect more traditional stocks, as the Dow Jones industrial average continued its steady rise past 6,000. Among other things, the rise reflected a steady flow of money into equities from members of the baby-boomer generation, who were skeptical of the value of Social Security and were replacing it with contributions to such vehicles as 401(k) accounts. In midyear, however, there was a sudden correction in the upward rise of stocks, and the high-tech over-the-counter market, in particular, swooned. Nonetheless, by year's end the market had recovered, albeit selectively.

Developments in Government.
      If the November elections underlined anything, it was that the American people were eager to pull back from extremes that might erode their sense of stability, however transitory that might prove to be. The nation had been badly shocked in 1995 by signs that the social and political consensus was fraying in ways not seen since the Vietnam War. In Washington the tension was symbolized by the trench warfare between the White House and Congress over the 1996 budget, which had left the government essentially inoperative. Some 280,000 government workers were laid off, and another half million were working but not being paid. At issue were the differing ways in which the two sides proposed to close the budget deficit over seven years, chiefly in terms of taxes and in slowing the growth of such huge entitlement programs as Medicare and Medicaid. The Republicans wanted to cut $270 billion from Medicare growth, for example, while the president wanted to pare only $124 billion. Clinton had also rejected Republican efforts to give the military more than the $256 billion he had originally proposed.

      The standoff, which had begun in mid-December 1995, continued for 18 days before the Republican wall began to crack. It was Clinton's soon-to-be presidential rival, Senator Dole, who first urged his party to begin providing funds on a continuing basis so that the government could get back to work. He was then joined by Speaker Gingrich, who broke with more radical members of his party to do so. Both men realized that the American people, while sympathetic to the goal of cutting the size and scope of the government, were profoundly uneasy at its paralysis. After 21 days the funding cutoff ended on January 6, with both sides having submitted their proposals for seven-year reductions in spending. The squabbling over the actual 1996 budget continued until late April, however, with 13 separate temporary spending bills required to keep the government functioning while the horse trading went on.

      In general, the outcome of the exhausting battle confirmed the thinking that had propelled the congressional Republicans to power in 1994. In the final budget more than 200 federal programs were abolished, mostly in the Labor and the Health and Human Services departments. Funding for the Corporation for Public Broadcasting, a longtime target of conservative ire, was slashed, though the corporation survived. So did such Clinton programs as the subsidized national service for youth, funding to put 100,000 extra police on the streets, and extra money to improve the quality of education, viewed by conservatives as a federal prop to a pillar of the Democratic Party, the National Education Association.

      The president was quick to turn the situation to political advantage and to articulate the theme that was to dominate the electoral politics of the year. "The era of big government is over," he told Congress and the nation in his annual state of the union address. He added, however, that "we cannot go back to the time when our citizens were left to fend for themselves." Clearly positioning himself as a moderate, he called for such achievements as bipartisan welfare reform, an increase in the minimum wage, and portability of health insurance so that workers would not lose coverage if they changed jobs. He also asked for a line-item veto of the kind already wielded by 43 of 50 state governors and endorsed by the Republican congressional majority.

      The limited nature of Clinton's 1996 goals stood in sharp contrast to the grandiose first-term proposals he had outlined for health care reform, which died ignominiously in 1994. The fate of the new proposals was also different. In May Congress endorsed the first hike in the federally mandated minimum wage in five years, from $4.25 to $4.75 an hour, with another rise to $5.15 a year later. Some 3.7 million Americans were affected by the measure, most of them women. The change was fiercely opposed by small business lobbies, but in the end Republicans split over the issue.

      At virtually the same time, Clinton won approval of the line-item veto, which allowed the president to strike a limited number of items from an appropriation bill rather than veto the entire document. The veto was highly limited, however. It applied to tax concessions only if they affected no more than 100 taxpayers and specifically could not be used on entitlement programs like Medicare and Social Security. Nor could it be used to block major tax reductions, and it could be overturned by a two-thirds congressional majority. Nonetheless, the veto was decried by Sen. Mark Hatfield, chairman of the Senate Appropriations Committee and one of only three Senate Republicans to vote against it, as "the greatest effort to shift the balance of power to the White House since Franklin Roosevelt attempted to pack the Supreme Court." The veto was immediately promised a constitutional challenge.

      If the president was able to win incremental victories that gave solace to the liberal constituencies within his party, he also made moves that set him apart from them. None was more symbolic, or fraught with more sweeping potential to affect American society, than his decision to sign the welfare reform act passed by Congress, the first comprehensive overhaul of the system in over 60 years. Momentum for some sort of change was clearly unstoppable. In polls the American people had frequently showed their unhappiness with welfare, particularly the $16 billion program known as Aid to Families with Dependent Children (AFDC). Clinton had already declared his willingness to accept a two-year limit on recipients in the program, but liberal members of his party had long argued that a welfare cutoff was meaningless, and perhaps dangerous, unless it was matched with expensive job-creation measures, probably in the public sector. The Republican Congress would have none of that. In the long wrangle over the bill, the White House was able to add a number of palliatives to the notion of a welfare cutoff: child nutrition programs, extra aid for recession-hit states, and money for child care and foster care. The overall direction of reform, however, was to take the federal government out of the social welfare business where possible and to hand its administration over to the states.

      Under the provisions of the measure, states were to receive block grants for all welfare expenditures, set in relation to 1996 levels, with added money to take account of recessions or unusual population growth. The act abolished the AFDC program entirely and gave the states until July 1, 1997, to come up with plans that required welfare recipients to go to work within two years, while setting a total limit on welfare assistance of five years per family. After six years states that failed to put welfare families in work of some kind would lose their federal funds, although 20% of a state's caseload could be exempted. The law contained a number of clauses aimed at reinforcing the work ethic. Administrators could cut payments to teenage mothers who did not finish high school, for example, or who did not live with an adult (a response to frequent criticisms that the AFDC program encouraged broken families and illegitimacy). State legislatures would need to provide a waiver to add payments for children born while their mothers were on welfare. On the other side, the measure set aside $400 million in bonuses for states that reduced or contained rates of illegitimate birth, including $250 million for education in abstinence as a form of birth control. The bill also barred legal immigrants who had not applied for citizenship from receiving food stamps and other forms of assistance. The law recognized that many states had long been trying to find more workable formulas, and it gave 44 states a year to wind down various experiments already under way.

      Some questioned whether this welfare reform was actually an answer to the problem or merely a means of shuffling the issue onto lower levels of government. Most experts agreed that without substantial levels of job training and placement, the two-year limit to federal funding might merely shift an immense burden onto state budgets. Many child-care advocates warned that the reforms would strike hardest at the children of those on welfare, perhaps adding millions to the rolls of a permanent underclass. Of course, the full impact of the welfare changes was not likely to be felt for several years, a point often made by the law's opponents, some of whom were closely aligned with the president's wife, Hillary Rodham Clinton. That, however, did not deter the president from signing the measure.

      Clinton also took a variety of conservative postures on other social and so-called family-values issues, especially those related to crime and drugs. He appointed a four-star army general, Barry R. McCaffrey, previously commander of the Pentagon's Southern Command in Panama, as the nation's drug czar. He raised the possibility of a mandatory drug test for teenagers seeking to obtain a driver's license. The president caused a fierce storm of protest among homosexuals when he announced his support for legislation that would ban the provision of federal benefits to the partners in a same-sex marriage. When the Defense of Marriage Act passed, Clinton signed it.

      The issue of same-sex partnerships proved a heated one across the country in an election year. The immediate reason for the furor was a series of court decisions in Hawaii, reaching the state's Supreme Court, that ruled the prohibition of same-gender unions to be in violation of the state constitution's equal protection clause. The decisions led to conservative warnings that the ruling would usher in homosexual marriages across the nation as states were forced to recognize their legality under the "full faith and credit" provisions of the U.S. Constitution. In fact, the likelihood of such legitimacy was small, for 15 states had laws explicitly banning such marriages, and others were considering them.

Law Enforcement.
      While a looming election raised temperatures on some divisive social issues, the country clearly was in no mood to countenance a radicalism that threatened social war. The 1995 bombing of the Alfred P. Murrah Federal Building in Oklahoma City, Okla., which killed 169 people, had savagely underlined the horrors of extremism, and the nation clearly wanted no part of it. The two men charged with the crime, allegedly fringe members of a heavily armed antigovernment militia, awaited trial in 1996. There were no similar bombings during the year, but in July the federal Bureau of Alcohol, Tobacco and Firearms arrested 10 men and 2 women—members of a little-known Phoenix, Ariz., splinter group called the Viper Militia—who apparently had similar plans. The authorities confiscated two machine guns, six rifles, hundreds of rounds of ammunition, and hundreds of kilograms of chemicals similar to those used in the Oklahoma City bombing. They also impounded videotape of sundry Vipers giving guided tours of nearby federal buildings, with detailed instructions on how to blow them up.

      Federal authorities pulled off an even bigger coup when they staged a raid on a remote Montana mountain cabin and announced that they had arrested Theodore J. Kaczynski, thought to be the anonymous bomber who had eluded them for 18 years. Intermittently since 1978, the so-called Unabomber had mailed handmade explosive devices to a number of academics and business executives, killing 3 people and injuring 23. In the wake of the Oklahoma City bombing, he sent a bomb to the president of the California Forestry Association and threatened to blow up an aircraft leaving the Los Angeles airport unless the New York Times and Washington Post published his manifesto against industrialized society. The publication proved Kaczynski's undoing when his brother recognized the rhetoric and notified authorities. Kaczynski had no link to any organized causes.

      The arrest won back some lustre for federal law-enforcement agencies, which had suffered a great loss of prestige as a result of their handling of the 1993 siege near Waco, Texas, of the headquarters of the Branch Davidian sect, in which 82 members had died, and for the bungled 1992 arrest of a white separatist in Idaho, in which his wife and 14-year-old son had been killed. The FBI used different tactics in 1996 in outwaiting a group of self-described libertarian Freemen holed up on a ranch outside Jordan, Mont. The Freemen faced federal charges of writing millions of dollars' worth of bad checks and money orders and of threatening to kidnap and kill a federal judge involved in foreclosure on the ranch. Mindful of the innocent women and children in the beleaguered camp, the FBI simply waited until the defenders surrendered.

      The FBI's prestige was once again tarnished, however, this time in the midst of the year's most festive occasion, the Centennial Olympic Games in Atlanta, Ga. The Games had just finished their seventh day when, early in the morning, a homemade pipe bomb exploded in Centennial Olympic Park, killing one person and wounding 111. It was the first violence to occur at the Olympics since the massacre that had taken place in Munich, Ger., in 1972, and it happened despite unprecedented security. The bomb was contained in a knapsack left against a television broadcast tower in the park, a central meeting place. About 18 minutes before the explosion, an anonymous caller had phoned in a warning, and security personnel were trying to clear the area when the bomb went off. Official suspicion soon focused on Richard Jewell, an Olympics security guard, who was detained, interrogated, and investigated for months before being told that he was no longer a suspect. Jewell sued not only the authorities but also news media that had publicized suspicions of his guilt. No other suspect was named in the bombing, despite a $500,000 FBI reward.

      The Olympics bombing came on the heels of a much greater disaster. On July 17 a TWA flight from New York City to Paris suddenly exploded over the Atlantic Ocean near Long Island, with 230 passengers and crew aboard. All perished. A massive underwater search across 620 sq km (240 sq mi) of ocean eventually recovered most of the bodies and about 95% of the Boeing 747 aircraft. Authorities worked to determine whether a bomb or a mechanical problem had caused the calamity aboard Flight 800. By the end of the year, the investigation was far from over, but some authorities were venturing that the cause was a buildup of explosive vapour in a fuel tank.

Foreign Affairs.
      Terrorism, nonetheless, remained very much on Americans' minds. A month before the TWA disaster, a small group of men wheeled a large tanker truck up against a chain-link fence in front of an apartment building in Dhahran, Saudi Arabia, and then fled before an enormous explosion tore the face off the building. The edifice housed U.S. Air Force personnel involved in interdicting flights in southern Iraq in the wake of the 1990-91 Persian Gulf War. A total of 19 airmen were killed and 50 hospitalized by the blast. The explosion was believed to be the work of Saudi Islamic militants.

      The Saudi attack was no doubt on President Clinton's mind two months later when he declared terrorism to be "the enemy of our generation" while signing a new law ordering sanctions against any nation investing in Iran and Libya, both considered terrorist states by the U.S. In fact, Clinton's action did nothing to lessen terrorist dangers, while it infuriated some of the closest U.S. allies. The law specifically penalized foreign firms that made investments in oil in the two countries, which were major petroleum suppliers to Europe. Clinton declared that the lesson for U.S. allies was "You cannot do business with countries that practice commerce with you by day while funding or protecting the terrorists who kill you and your innocent civilians by night." The allies said that this was posturing and an attempt to limit their sovereignty, and they filed a protest at The Hague.

      In fact, when it came to actual outrages perpetrated by tyrants, the administration's policy seemed singularly feckless. In a test of U.S. will, Iraq's Saddam Hussein sent some 40,000 armoured troops north from Baghdad on an incursion into ethnic Kurdish territories specifically declared a "no-go" zone by the victors in the Gulf War. Hussein effectively installed a puppet regime beholden to himself, wiped out bases where the CIA had launched covert actions against his government, and then withdrew. In retaliation, Clinton ordered two strikes of a total of 44 cruise missiles against replaceable Iraqi air defenses far to the south and increased the no-go zone in the same region. The symbolic action did nothing to restore the status quo.

      Clinton had irked allies earlier in the year with his posturing toward another old enemy, Fidel Castro. The U.S. was shocked when Cuba shot down two small, unarmed civilian planes as they flew over Cuban territorial waters from airfields in Miami, Fla. The aircraft were flown by members of the so-called Brothers to the Rescue, who had earlier goaded Castro by dropping anticommunist leaflets on Havana. In the wake of the shoot-down, Clinton threw his support behind the so-called Helms-Burton law, which allowed Cuban Americans whose businesses had been taken over during the 1959 revolution to file suit against foreign companies that bought or leased the assets from the Castro government. The law also mandated that the U.S. government deny a visa to any foreigner with a stake in such property. Clinton waived the more onerous sections of the law, but businesspeople from Canada and other countries were warned that they could face such sanctions. Their irate governments created countervailing sanctions in case the law was applied, and they filed suit against the U.S. before the World Trade Organization.

      In a further bow to conservative sentiment that irked many U.S. allies, not to mention many in the Third World, the Clinton administration cast a veto against the reelection of UN Secretary-General Boutros Boutros-Ghali. The U.S. was vexed at his secretive style, slowness to implement financial reforms, and ill-advised efforts to make the UN into a peacemaker in areas such as Bosnia and Herzegovina where peace might not be had without force. Boutros-Ghali's successor, Kofi Annan of Ghana, was applauded in the U.S. as a more open and reform-minded choice, but the move was resented, particularly by France.

      Such actions discomfited friends of the U.S., but in general the country's foreign policy during 1996 was aimed at avoiding political harm. Clinton endured criticism for his administration's continued support for the government of Russian Pres. Boris Yeltsin, but it seemed justified after Yeltsin had won elections against the resurgent Communist candidate, Gennady Zyuganov. (See BIOGRAPHIES (Zyuganov, Gennady Andreyevich ).) Yeltsin's health, however, continued to make the Clinton policy an open issue after the Russian president later underwent quintuple bypass surgery. Clinton's 1995 gamble to send U.S. troops to Bosnia in the aftermath of the Dayton Accords that ended the slaughter in former Yugoslavia likewise paid off as peaceful elections were carried out. The results followed predictable ethnic lines, and virtually no action was taken before world courts against the authors of acknowledged genocide. Growing public protests against the Serbian president, Slobodan Milosevic, whose irredentist ambitions were a prime cause of the Bosnian catastrophe, further seemed to vindicate the Clinton approach. The major loss to the U.S. in the Balkans during the year was the death of Commerce Secretary Ron Brown (see OBITUARIES (Brown, Ronald Harmon )), who died in an airplane crash near Dubrovnik, Croatia, as he led a group of business executives exploring the possibilities of economic reconstruction in the shattered area.

      The Middle East peace process, which Clinton had proudly midwifed, suffered a severe setback with the election in Israel of the conservative Likud government of Benjamin Netanyahu. The West Bank became embroiled in the worst Israeli-Palestinian violence in years. Nonetheless, by the end of the year, an uneasy peace had returned, and it seemed that progress was being made. Late in the year, Clinton also shuffled his foreign policy team, among other changes replacing Warren Christopher with the first woman to serve as secretary of state, former UN ambassador Madeleine Albright, and naming Bill Richardson as chief delegate to the UN.

      The area where U.S. foreign policy seemed to grow the most convoluted was in Asia, and once again election considerations lay at the bottom of it. The U.S. launched no major initiatives across the Pacific, where Asia was the focus of an immense industrial boom. The administration, however, had not come to a clear view of how to deal with this rising economic power, much of it the result of investments by U.S. businesses, or with an increasingly assertive China. In 1996 China replaced Japan as the largest single source of the U.S. trade deficit, and the U.S. frequently locked horns with China over that country's alleged violation of copyright laws, software piracy, and other economic issues. Despite allegations that the Chinese had sold magnets to Pakistan that could be used in developing nuclear weapons and the charge that U.S. businesses lost more than $2 billion annually to factories that illicitly copied software, films, and other intellectual property, the administration backed the extension of most-favoured-nation trading status for China.

      If Asian wealth was complicating foreign policy, it was also making a mockery of U.S. election law. As the election drew near, attention focused on the activities of John Huang, an Asian-American with connections to a wealthy Indonesian family that had business connections with China. Huang had raised more than $4 million for the Democratic Party during 1996. Possessed of a top security clearance, he had taken in, among other things, an illegal $250,000 from a South Korean firm and $450,000 from an Indonesian couple. Another Asian-American fund-raiser and Clinton acquaintance, Taiwan-born Charles Yah Lin Trie, was revealed to have once taken a major Chinese arms dealer to the White House. Trie had also raised funds for the Clintons' steep legal bills in the Whitewater affair, some in the form of cash and checks in plain brown envelopes. Much of the money was returned, and there was no evidence of favours having been granted in return for the funds. Nonetheless, at year's end the Department of Justice had issued subpoenas to the White House for records on as many as 20 Democratic Party fund-raisers. (GEORGE RUSSELL)

      See also Dependent States .

▪ 1996

      The United States of America is a federal republic composed of 50 states. Area: 9,372,571 sq km (3,618,770 sq mi), including 205,856 sq km of inland water but excluding the 156,492 sq km of the Great Lakes that lie within U.S. boundaries. Pop. (1995 est.): 263,057,000. Cap.: Washington, D.C. Monetary unit: U.S. dollar, with (Oct. 6, 1995) a free rate of U.S. $1.58 to £1 sterling. President in 1995, Bill Clinton.

      By all rights 1995 should have marked a political nadir for U.S. Pres. Bill Clinton. As a result of the 1994 congressional elections, he had become chief executive in what amounted, in U.S. terms, to a minority government. Control of the legislative agenda shifted to Congress, dominated, for the first time in 40 years, by Republicans, and especially to the combative speaker of the House of Representatives, Newt Gingrich. (See BIOGRAPHIES (Gingrich, Newt ).) A massive rollback of welfare legislation and federal dominance was set in motion as the Republicans moved to fulfill the conservative "Contract with America" within their first 100 days in office. (See Special Report .) The president seemed reduced to the role of a bystander. Defections from the Democratic Party continued apace; in all, five Democrats switched parties after the elections. Nonetheless, by the end of the year, the president, while giving considerable ground, had managed to achieve more of a stalemate with Congress than many had believed possible.

      In November the president's veto of the Republican budget led to a standoff that idled 800,000 employees and shut down so-called nonessential functions of the federal government for six days. Treasury Secretary Robert Rubin nimbly raided selected federal pension funds in the interim to forestall default on the government's obligations, while the two sides reached accommodation on such issues as the target of balancing the budget in seven years, as Republicans demanded.

      The president and the Congress remained far apart on the specifics of how to achieve that aim, however, with Republicans looking for more than $1 trillion in spending cuts, largely from social welfare programs, along with $245 billion in tax relief, spearheaded by a $500-per-child tax credit. Along with the tax issue, one of the central disagreements was over controlling Medicare and Medicaid costs. The Republicans wanted to save $270 billion over seven years by cutting back increases in Medicare spending from 10% to 7% annually. Clinton deemed that unacceptable and proposed savings of $124 billion. On Medicaid, Congress was determined to make cutbacks in spending, convert the remainder into block grants to the states, and allow each state to set eligibility requirements. The president was determined to keep Medicaid as an entitlement. When agreement was not reached by mid-December, those parts of the government not yet funded were again forced to shut down while the president and congressional leaders attempted to work out a compromise. This time some 280,000 government employees were furloughed, and thousands who did government work on a contract basis also were not paid. In spite of a series of meetings between Clinton and top congressional leaders, no solution to the impasse had been reached by the time the year ended. Bipartisan attempts by Senate leaders to reach a compromise failed to gain backing from hard-line Republicans in the House of Representatives.

American Disaffection.
      While the budget dominated headlines, the forces swirling in the American political cauldron in 1995 were more dramatically epitomized in an event far from Washington, D.C. The country was stunned on April 19 when a rented truck parked outside the Alfred P. Murrah Federal Building in Oklahoma City, Okla., exploded shortly after 9 AM, tearing the front off the nine-story structure and leaving 168 people dead, including 19 children. In addition, a nurse was killed during rescue efforts. The truck had contained homemade explosives, a mixture of diesel fuel and ammonium nitrate. The man allegedly responsible for the bomb, Timothy J. McVeigh, was a former member of the U.S. Army, a veteran of the Persian Gulf War, and a fringe member of a heavily armed American subculture of "militia" that espoused antigovernment views. His alleged coconspirator was Terry Lynn Nichols, a farmer from Herington, Kan. Both men were charged with offenses that carried the death penalty.

      The Oklahoma bombing drew attention to a radical degree of disaffection with the government in general and a number of federal agencies in particular. In its most extreme form, the disaffected militia movement claimed about 100,000 members who expressed hostility to the federal government, believed in foreign conspiracies to erode the sovereignty or even the territory of the nation, and often stored food and arms and practiced military training in anticipation of either invasion or some form of federal police state. All such groups disclaimed anything to do with the Oklahoma City bombing.

      Like McVeigh, however, almost all militia members were virulently opposed to gun-control laws, like the 1994 federal assault weapons ban, and many saw the antigun actions of the FBI and the Bureau of Alcohol, Tobacco and Firearms as being, in the words of a National Rifle Association official, the work of "jack-booted government thugs" intent on tearing down what they saw as the Second Amendment's guarantee of the right to bear arms. In particular, they saw the 1993 siege in Waco, Texas, of the Branch Davidian compound, in which 82 cult members died, as being evidence of a sinister and cold-blooded federal government attitude toward like-minded dissidents. Authorities investigating the Oklahoma tragedy were convinced that the date of the crime—the anniversary of the federal raid at Waco—was no coincidence.

      At a series of congressional hearings, Attorney General Janet Reno justified her endorsement of the assault on Waco, but she did not convince many skeptics. The FBI, however, took a more self-critical view in another case that had aroused a similar furor: the 1992 attempt to arrest a heavily armed Idaho man named Randall Weaver, a believer in white racial separatism, at his mountain cabin. After Weaver's 14-year-old son was killed in the clash, an FBI sharpshooter killed Weaver's wife as she stood behind a door with their 10-month-old daughter in her arms. Three years after the firefight, the agency paid Weaver and his surviving children $3.1 million in a civil settlement. FBI Director Louis Freeh also suspended his close friend and the number two man at the FBI, Larry Potts, while probing Potts's involvement in a change of the rules of engagement at the shoot-out.

      The militias were only the most highly charged manifestation of a deep-rooted anger with the encroachments of the federal government that also showed itself in hostility to those wearing its civil uniforms, from the FBI to the Bureau of Land Management and the Forest Service. The anger led to a sense of siege among many members of the federal bureaucracy. In some parts of the country—notably the West, where feelings ran high against federal control of as much as 80% of the land in certain jurisdictions—some federal officials refused to be seen in their work clothes for fear of attracting sniper fire. Others faced lawsuits and even disobedience from state officials, who claimed that they, rather than federal authorities, should claim ownership of such public property.

      Much like the fringe anti-Vietnam War radicalism of the 1970s, the antigovernment terrorism and civil disobedience of 1995 represented the overheated froth of a much broader and more moderate consensus—that government, particularly the federal government, had taken more than its share of resources and political space and had to be reduced. The consensus, however, was coupled with a continuing sense of disquiet and uncertainty about the future that gave a sharp edge to the national debate in many arenas, including the jostling leading up to the 1996 elections. Anti-Washington sentiment and a desire for leadership outside the traditional mold powered a deep groundswell of support for the idea of a presidential candidacy by Gen. Colin Powell, a black man who had retired as chairman of the Joint Chiefs of Staff. Powell, who declared himself a Republican, eventually declined to run, leaving Senate Majority Leader Robert Dole as the Republican front-runner. The same sentiment also fueled renewed candidacies by Texas billionaire H. Ross Perot, who announced the Independence Party as his political vehicle, and by the nativist conservative Patrick Buchanan, a combative orator with a strong anti-immigrant and anti-free-trade platform. Both of the dissident candidates reflected an isolationist uncertainty about the U.S. political and economic role in the world that paralleled the domestic uncertainty.

The Economy.
      There was considerable uncertainty on the economic front. In July, for the first time since 1992, the Federal Reserve Board (Fed) announced a cut in short-term interest rates, from 6% to 5.75%. Chairman Alan Greenspan and the Fed's Open Market Committee then made another, year-end rate cut, to 5.5%. The Fed actions signaled that the economy, in Greenspan's view, had achieved the so-called soft landing that he had tried to manage through seven previous interest-rate hikes. Growth for 1995 appeared to be headed for the 2.5% level that Greenspan deemed optimal. The unemployment rate was hovering in the range of 5.4%, and inflation seemed likely to be no more than 2.5% for the year. Flat retail sales and weakness in a number of leading indicators, however, gave some warning of slightly lower growth in early 1996.

      Meanwhile, in the midst of the budget battle, the Dow Jones industrial average rose past 5,000 after having pushed through 4,000 early in the year. Low interest rates, the prospect of reduced government spending, and a welter of high-performing high-tech issues had a lot to do with the performance, as did a continuing wave of mergers and acquisitions. Hikes in stock prices and merger mania went hand in hand with economies of scale, however, and the continuing globalization of the U.S. economy produced pink slips and fear alongside the bullishness. Typical of the paradox was the behaviour of AT&T, a profitable $75 billion megalith, which announced that it would break itself into three separate companies and shed 78,000 jobs.

      In the atmosphere of uncertainty amid fast-changing economic forces, many Americans found it easy to believe that stability was indeed eroding and that their government was not doing enough to stem the advantages wielded by foreign countries that "gained" the jobs lost at home. Mindful of the sentiment, the Clinton administration used the threat of 100% tariffs on luxury-car imports to pressure the Japanese into expanding their North American auto production and buying more U.S.-made parts and also threatened China with $1 billion in tariffs to force the government into policing the rights of U.S. manufacturers of such often-pirated goods as computer software.

      One of Clinton's earlier international economic initiatives came back to haunt him, however. When the Mexican peso collapsed in December 1994, the U.S. had rushed to bail out its partner in the hard-won North American Free Trade Agreement (NAFTA). The administration helped to cobble together a $50 billion international credit arrangement that included $20 billion worth of U.S. guarantees, and Congress grudgingly went along with the fiscal legerdemain. By international standards the bailout was a considerable success in stemming a financial hemorrhage from Mexico and in restoring investment confidence. The country's living standards, currency values, and labour costs swooned, however. Purchases of foreign-made goods, especially from the U.S., collapsed, while exports, boosted by a cheap peso, took off. The result was that after years of enjoying trade surpluses with Mexico, the U.S. suddenly found itself running a deficit, and a number of U.S. companies announced that they would forsake the U.S. for the cheaper labour available there. At the same time, the number of Mexicans entering the U.S. illegally in search of work took a strong upward hike.

      One effect of the Mexican crisis was a likely halt to further expansion of NAFTA. A more dramatic effect was the boost that Mexico's plight gave to opponents of immigration to the U.S., both nationally and in states like California that were particularly hard hit by the influx. In the 1994 elections California residents had already given approval to Proposition 187, a measure that would deny schooling and other benefits to the children of illegal immigrants. The proposition was endorsed by Gov. Pete Wilson, but parts of the measure, notably the schooling ban, were declared unconstitutional by a federal judge. Meanwhile, the U.S. Congress also seemed intent on cutting back benefits to legal immigrants as part of its budget tightening. In a bow to the same anti-immigrant sentiments, the Clinton administration announced that it would end the policy of giving Cuban boat people special status as political refugees and would instead return them to their homeland.

Developments in Government.
      President Clinton had long been notorious in his critics' eyes for trimming sails to suit whatever political breezes were blowing, but the new Republican majority in Congress made that tendency a sometimes helpful tool of statecraft. While it caused considerable anguish in left-wing Democratic circles, the president, a native of a region where states' rights were still a shibboleth, found it easier to accept many of the decentralizing initiatives of the Republican legislators. On the other hand, the president also seemed capable of taking advantage of splits in his opponents' ranks. He was able, for example, to head off some cutbacks in the Environmental Protection Agency, long a demon of many Republicans, after a number of more moderate Senate Republicans reconsidered the measure.

      In the midst of the new federal diffidence toward expanding or defending its reach, more initiatives emerged from the states. Some were nothing less than reactionary, like the decision of Alabama to restore prison road gangs and bring back leg irons (though other states concurred with the notion of a tougher prison regimen less concerned with prisoner comfort). On issues of broader import, however, many states had shown the way in endorsing programs of voucher-driven education and "workfare" for welfare recipients, and many also began to tackle other areas. One of the touchiest and most explosive issues was race-based preferment. In California, Governor Wilson signed an executive order that abolished almost all affirmative action policies. (President Clinton ordered a review of federal affirmative action policies but then declared that most should continue.)

      The issue of race, perhaps the most sensitive tissue in the body politic, seemed to be undergoing a different kind of examination on each side of the black-white divide. While whites debated affirmative action, the largest black demonstration in Washington, D.C.'s history—larger than the 1963 march led by Martin Luther King, Jr.—took place under the auspices of the black separatist Louis Farrakhan, head of the Nation of Islam. The "Million Man March" was a powerful demonstration of the concerns of black males about family disintegration and personal responsibility and endorsed personal and spiritual, rather than governmental, solutions to such ills. The demonstration also gave a powerful boost to the standing of Farrakhan, hitherto considered a mesmerizing but marginal racial demagogue.

      Race also played an underlying role in the trial of O.J. Simpson, a black television pitchman and former football star, for the slaying of his white former wife, Nicole Brown Simpson, and her acquaintance Ronald Goldman. Simpson was acquitted after less than four hours of jury deliberation. The trial's turning points were the fiery, racially tinged address of Simpson defense counsel Johnnie Cochran and the discrediting of the Los Angeles police detective Mark Fuhrman, an investigator of the slaying who had, long before the trial, boasted to an interviewer of his racial prejudice and his planting of evidence to convict other alleged criminals. Enthusiasm or dismay at the trial outcome seemed to split largely along racial lines, which reinforced the notion that blacks and whites had entirely different views about the nature of the justice system.

      In looking anew at affirmative action, both federal and state governments were following the lead of the Supreme Court. In 1995 the court agreed that affirmative action programs had to meet tests of strict judicial scrutiny to be constitutional. By a 5-4 vote the justices also struck down a Georgia statute that allowed the gerrymandering of electoral districts to compensate for past racial segregation. In a setback for homosexual activists, the court ruled that the organizers of private parades such as Boston's St. Patrick's Day celebration could exclude participants they did not want.

      In a decision that could prove to be one of the more far-reaching of its term, the court set a limit on the federal government's ability to use the interstate commerce clause of the U.S. Constitution to impinge on matters otherwise outside its jurisdiction. The clause, which became a cornerstone of federal activism in the era of Franklin D. Roosevelt, had been used to justify everything from food standards to civil rights investigations. In overturning the federal Gun-Free School Zones Act of 1990, which used the clause to declare the possession of firearms around education sites to be a federal crime, the justices ended its infinite elasticity. On the other hand, the court agreed that no limits could be set on reelection to Congress without a constitutional amendment, a blow to the term-limits movement.

      In another development relating to interstate commerce, the Interstate Commerce Commission (ICC), once the most powerful bureaucracy in Washington, closed its doors at the end of the year. As of the first day of 1996, it would no longer exist. Established in 1887 to curb the power of the railroad "robber barons," the commission at one time had the power to regulate almost everything that moved across state lines. The deregulation of transportation in the 1980s had deprived the ICC of most of its reason for existing, but it had survived several attempts to close it. The remaining employees and commissioners were transferred to the Department of Transportation.

      The House of Representatives passed a nearly total ban on gifts from lobbyists, following in the wake of a less stringent Senate ban. The measure did little, however, to stem the most questionable source of money for influence, donations to political action committees, and other devices that congressmen used to finance their political survival. House Speaker Gingrich, who had earlier given up a multimillion-dollar book advance from communications mogul Rupert Murdoch, whose vast holdings were much affected by federal oversight, drew a House ethics investigation after questions were raised about his alleged use of GOPAC, a not-for-profit organization, to funnel money to Republican causes. Congress proved itself tough on matters of legislators' sexual behaviour. Sen. Robert Packwood, a Republican who headed the Finance Committee, resigned after the Senate Ethics Committee voted for his expulsion. Packwood had been charged with sexual harassment by 19 women, including a 17-year-old.

      On the most high-profile ethics issue, the turgid Whitewater scandal, little insight was gleaned. Much of the focus of congressional concern had long since shifted from the original property deal, which took place long before the Clintons reached Washington, to the behaviour of administration officials after the July 1993 suicide of Vincent Foster, the deputy White House counsel and overseer of the Clintons' personal finances. Deputy Attorney General Philip Heymann told a Senate investigating committee that his department had been forced to stand by while White House counsel Bernard Nussbaum entered Foster's office and took files related to the Clinton family's personal affairs. The senators were intrigued by telephone logs that showed long conversations between Hillary Rodham Clinton and two of the intruders immediately after the entry. After initially balking, President Clinton agreed at the end of the year to turn over to Senate investigators notes from meetings on the matter.

Foreign Affairs.
      Nothing a president does is likely to affect the feelings of the American people as much as his decision to send U.S. troops into harm's way. In this, Clinton crossed the Rubicon with his Bosnian policy. The war in the Balkans between Serbs, Croats, and Muslims had been a frustration and a challenge to U.S. diplomacy since its inception. A Vietnam-era protester who had not served in the military, Clinton was sensitive to the difficulty, frequently underlined by his military advisers, in becoming involved in a civil war in a country where American high-tech superiority might count for little and the possibility of casualties was high. The scale of the Balkan atrocities—perhaps 250,000 killed and 3 million displaced in "ethnic cleansing"—and the inability of European allies in NATO to find a solution prompted Clinton to act, however.

      At first Clinton did so rhetorically, urging a relatively safe bombing campaign against the Bosnian Serbs—considered the chief aggressors—as a way of halting the war. This did not suit American allies, who pointed out that the U.S. had no UN peacekeeping troops on the ground to worry about. Eventually, however, when the Bosnian Serbs began overrunning protected "safe areas" and killing or expelling Muslim inhabitants, Clinton acted, with unhappy results. As NATO aircraft bombed Bosnian Serb artillery positions, the Serbs took more than 300 UN peacekeepers hostage and threatened to kill them if the bombing did not stop.

      In August a sudden Croatian military offensive regained territory previously taken by the Serbs. The offensive, it turned out, was the result of a covert U.S. retraining and reorganizing of the army of Croatian Pres. Franjo Tudjman, part of a policy advocated by Assistant Secretary of State Richard Holbrooke, who had emerged as the maestro of Balkan realpolitik. The next important stage was to bring together Tudjman with Serbian Pres. Slobodan Milosevic and Bosnian Pres. Alija Izetbegovic at Wright-Patterson Air Force Base near Dayton, Ohio, for talks in November that ended after three weeks with a fragile treaty. The agreement was to be overseen by a 60,000-member NATO force that would keep the enemies apart along 4-km (2.5-mi) cease-fire zones. In the long run, the U.S. would train the weaker Muslim army to underpin the peace with a credible balance of power.

      The peace accord was a dramatic vindication of the U.S.'s role as the only remaining superpower and a huge political risk for Clinton as he entered an election year. Despite assurances that the troops would depart from Bosnia and Herzegovina within a year and would be able to respond with maximum force if attacked, the likelihood of at least some U.S. casualties seemed high, and no vital U.S. interest appeared to be served. Public opinion polls registered a great deal of opposition, but Clinton received support for his initiative from his likely presidential rival, Senator Dole. Other prominent Republicans attacked him for the risky venture.

      Twenty years after the end of the Vietnam War, Clinton extended diplomatic recognition to Hanoi. The action was greeted with protest by disaffected U.S. military veterans, but it was hailed by American business, which rushed in to make deals long available to European and Asian competitors. Skeptics also growled as the U.S. and North Korea signed a deal in which the U.S. provided two nuclear reactors in exchange for an agreement by the economically battered regime of Kim Jong Il that it would dismantle its nuclear program, widely seen as a prelude to acquiring nuclear weapons.

      Under congressional pressure, Clinton reversed a decade-old policy that had kept Taiwan's head of state, Pres. Lee Teng-hui, from setting foot on U.S. soil, a bow to China's claim to be the sole legitimate government. The administration decided to allow Lee to visit his alma mater, Cornell University, Ithaca, N.Y., to receive an honorary degree. The action led to strong statements from China about subversive American intentions, the punitive awarding of lucrative automotive contracts to non-American firms, and a tougher stance toward selected dissidents. China's continuing desire to gain entry to the world trading community, however, made it unlikely that the U.S. gesture would permanently mar relations with the world's most populous nation. (GEORGE RUSSELL)

      See also Dependent States .

▪ 1995

      The United States of America is a federal republic composed of 50 states. Area: 9,372,571 sq km (3,618,770 sq mi), including 205,856 sq km of inland water but excluding the 156,492 sq km of the Great Lakes that lie within U.S. boundaries. Pop. (1994 est.): 260,967,000. Cap.: Washington, D.C. Monetary unit: U.S. dollar, with (Oct. 7, 1994) a free rate of U.S. $1.59 to £ 1 sterling. President in 1994, Bill Clinton.

      U.S. Pres. Bill Clinton must have been hard pressed to discern much cause for cheer by the time 1994 wore to a close. Battered by allegations of sexual and financial misconduct—the latter focused as well on first lady Hillary Rodham Clinton—the president also saw the centrepiece of his legislative program, health care reform, die in Congress. Within the White House, a new chief of staff failed to bring much-needed discipline or prevent a steady string of resignations by top aides under attack for alleged improprieties or conflicts of interest. By the end of the year, the president was deemed anathema even by considerable numbers of fellow Democrats, who declined his campaign support during the November elections. Paradoxically enough, the man elected in 1992 to solve the nation's festering domestic problems could take solace as 1994 ended chiefly in a string of foreign policy successes and a hard-won victory in expanding the global free-market system.

      For the first time since 1954, the Democrats lost control of both houses of Congress. (See Sidebar (UNITED STATES: The 1994 Midterm Elections ).) Newt Gingrich of Georgia, who would become the new speaker of the House, was hailed as the chief architect of the Republican triumph. The trend continued among the states, where Republicans had a net gain of 11 governorships, boosting their total to 30 and ousting such powerful figures as Mario Cuomo of New York and Ann Richards of Texas.

The Economy.
      The sentiment that seemed to motivate voters was not, on the surface, inspired by dire economic facts. The economic outlook in 1994 generally appeared to be good. The unemployment rate in December, 5.4%, was at a four-year low, down dramatically from a high of 7.8% two years earlier, and the economy was generating an average of some 275,000 new jobs every month, some 3.5 million for the year. The U.S. share of world manufactured exports, a time-honoured measure of national economic strength, was rising toward 16%, while those of Japan and Germany were in decline. Per capita disposable income was rising steadily, and so were corporate profits. General Motors, for example, the world's biggest industrial company, which had reported a titanic $4.9 billion loss in 1991, was showing a $2.8 billion profit by mid-1994, more than for all of 1993. A new wave of mergers and acquisitions gripped a number of U.S. business sectors, notably the telecommunications and health care industries. Inflation remained under control—the consumer price index rose 2.7% during the year—and price stability seemed more or less assured, at least for the short term.

      There was, however, a steady ratcheting up of interest rates by the Federal Reserve (Fed), from a short-term figure of 3% at the beginning of the year to 5.5% at year-end. Between February and November the Fed raised rates six times, and at one point it hiked its key interest rate twice in little more than a month. The main reason for the Fed's action was the feeling on the part of its chairman, Alan Greenspan, and a majority of the members of the Open Market Committee that the continuing economic expansion might lead to eventual overheating and supply bottlenecks, which would, in turn, refuel inflation. By making money more expensive and thereby slowing the rate of expansion, the Fed aimed to keep the underlying potential for inflation under control. The moves, however, spread turmoil in the financial markets, which are always sensitive to interest-rate hikes, and early in the year the Dow Jones industrial average suffered its biggest single-day drop since 1991.

      The effects were even more parlous in the bond markets, which had become highly dependent on mathematically complicated forms of futures contracts, known as derivatives, that offered substantial gains—and equally severe losses—depending on how successfully investors bet on the prevailing financial bellwethers. With the change in Fed policy, large numbers of institutional investors—from corporate treasurers to managers of college endowment funds—bet spectacularly wrong. In a move that rocked the municipal bond market, Orange County, in southern California, filed for bankruptcy protection after highly leveraged investments went sour and cost the county $2 billion. Lesser shocks were felt by millions of individual investors who had moved money out of traditional, low-interest forms of insured savings into mutual funds that held derivatives. The effect was to dispel some of the feeling of security and well-being that might have been inspired by the economic performance of goods, services, and jobs.

      As is common in economic recoveries, U.S. productivity and profitability increased in important measure because workers stayed on the job for more overtime hours—more so in 1994 than in previous business cycles. In the third quarter of the year, for example, the factory workweek reached a near-record 42 hours, including almost 5 hours of overtime. Among debt-laden consumers, however, the resulting income gains were offset by hikes in the interest costs for credit card purchases, mortgages, and car payments. Moreover, despite the swelling number of available jobs, many corporations continued to cut payrolls to maintain their competitive advantage. Consumer spending remained strong through most of the year, with an annual rise of 7.6% in 1994, but retail sales unexpectedly slumped in December. Overall the improved economic picture was marred by a continuing, deep-rooted sense among individuals that all was not as well as it should have been or as secure as it had been in the past.

Health Care.
      It was just such a feeling of insecurity that Clinton had addressed during his successful 1992 election campaign and that his proposal for universal health care seemed designed to allay. At first the nation seemed willing to make the changes required for providing health coverage for the 37 million or more Americans said to be uninsured. At the same time, there was a strong feeling that the patchwork U.S. health care system—with its welter of private insurers, employer-sponsored insurance plans, private doctors and hospitals, plus a government subsidy system for the poor and elderly—was far too expensive. Nonetheless, the plan the Clintons had unveiled in September 1993—with Mrs. Clinton as the overseer—ran into a minefield of opposition after it was presented to Congress. Its sheer complexity—the original document weighed in at 1,368 pages and included radical innovations such as national price controls, huge mandatory health care alliances, and government-mandated coverage by employers—brought together a broad array of opposition forces.

      In fact, a number of dramatic changes had been occurring in the health care system. Spurred by the prospect of widespread government intervention, private care providers had begun to rein in spiraling costs. More and more employers had enrolled workers in health maintenance organizations (HMOs)—networks of doctors and hospitals that closely monitored costs and rewarded caregivers for keeping them under control. The HMOs were sometimes bureaucratic and unwieldy, but their rapid expansion through start-ups, mergers, and acquisitions was one of the salient features of economic activity during the year. Further, the more large-scale employers began to get their costs under control, the less enthusiastic they became about endorsing enhanced government control. For example, the Business Roundtable, a group of 200 of the largest U.S. corporations, endorsed a rival congressional scheme that did not place emphasis on controlling prices or on universal coverage. There was also opposition from other groups, including small businesses, insurance companies, and the elderly.

      As the president faced an increasing number of opponents to the proposal, he frequently tried to be conciliatory to all sides at once, even while trying to talk Congress into doing his bidding on the issue. At various times he declared almost every aspect of the Clinton health care plan to be negotiable. Universal coverage itself, however, the president declared to be inviolable—until he eventually gave a nod to a competing proposal that would settle for 95% coverage over several years' time. Opponents came up with even more alternative schemes to bleed momentum from the reform movement, and at one point more than 150 different health care bills clogged the congressional system. Eventually none of the proposals picked up the legislative support necessary to force a bill through Congress.

Welfare Reform and Crime.
      In his state of the union address, Clinton also turned his attention to two other social issues of long-standing concern, welfare reform and crime. Welfare reform in particular was a notion that stirred enthusiasm across the country, where it was assumed to mean a cutback in support payments to the poor and near poor, including such programs as Medicaid and food stamps. Of particular concern in the public mind was Aid to Families with Dependent Children, a program that cost $16 billion annually—not much in the overall budget but symbolic to many of the culture of welfare dependency, involving unwed mothers, neglected children, and unemployed teenagers. Various states were already experimenting with "workfare" programs involving mandated employment when Clinton announced in his address that he would propose a similar scheme, including a welfare payment cutoff after two years coupled with aggressive programs of job training and retraining. Traditional constituencies within his party objected, however, and Congress took no action.

      Even though various violent crime rates were still declining, Americans continued to see a growing threat to their way of life and to demand ever more draconian punishments. By 1994 the number of people sentenced to federal, state, and local prisons had far outstripped the nation's capacity to jail them. Federal and state prisons held some 925,000 inmates, about double the population of a decade earlier. Local jails held another 450,000, or triple the figure of 10 years earlier. The average cost of holding that population was $23,500 per inmate, yet the public demanded more: more police, more prisons, and more mandatory sentences.

      Clinton's 1994 crime bill attempted to ride the law-and-order wave by endorsing the controversial proposal of mandatory life sentences for violent offenders found guilty of three separate violent offenses. It also included $28 billion for additional prisons and police, which Congress speedily bid up to $33.5 billion—and, after a series of horrifying massacres around the country, a proposal for the first time to ban outright 19 different so-called assault weapons, semiautomatic firearms capable of rapid fire. The ban was virulently opposed by the National Rifle Association but was supported by law-enforcement agencies, and it narrowly passed the House 216 to 214. It eventually became law separate from the crime bill. The overall bill, however, went down to defeat when Republicans attacked it for containing excessive amounts of pork-barrel funding. After lobbying by the White House, a slightly trimmed version, calling for expenditures of $30.2 billion, became law.

Personnel and Personal Problems.
      Such near disasters only contributed to the Clinton White House's reputation for ill discipline, fecklessness, and lack of attention to the minutiae of pushing a program through Congress. The Clintons, loyal to the team of Arkansans and other friends they had brought to Washington, resolutely rejected the idea of a major administrative shake-up until the clamour grew too strong to ignore. The president in effect fired his boyhood chum, White House Chief of Staff Thomas ("Mac") McLarty, and replaced him with the head of the Office of Management and Budget, Leon Panetta. The anticipated broader shake-up failed to take place, however. Instead, the heads of top administration officials began to roll in connection with a variety of alleged scandals—none involving much hard evidence of wrongdoing—that had mostly been over long before the Clintons went to Washington and that were collectively known as the Whitewater affair.

      The details of Whitewater rivaled, in their numbing complexity, the details of the Iran-contra scandal of the Reagan era but without the grave implications for the institution of the presidency, since most of the Whitewater action had taken place during 1978-91, while Clinton mainly occupied the attorney general's office and the governor's mansion in Little Rock, Ark. The finger-pointing mostly revolved around the Clintons' failed investment in a small-scale rural land development north of Little Rock in partnership with James McDougal, owner of the Madison Guaranty Savings and Loan. Madison Guaranty eventually went bankrupt, costing taxpayers $45 million, and McDougal was charged with, but eventually acquitted of, bank fraud. There was no evidence that the Clintons, who claimed to have lost almost $69,000 in the land deal, were aware of any wrongdoing, but critics made much of their association with McDougal at a time when Clinton was ultimately responsible for banking oversight in the state and when his wife, then an attorney with the Rose Law Firm in Little Rock, at one point performed minor legal work for Madison Guaranty.

      The accusations of scandal had percolated without much result in 1993 until the apparent suicide that July of Vincent Foster, a Rose Law Firm partner who had gone to Washington as deputy White House counsel and the Clinton family's personal lawyer. It was discovered that in the suicide's wake a number of top Clinton aides, including White House counsel Bernard Nussbaum, had entered Foster's office and taken files related to the Clinton family's personal affairs. As critics cried cover-up, the Clintons spent much of 1994 in a determined effort to protect the privacy of their past dealings—which only convinced many, particularly in the press, that they had something to hide. The situation became even more difficult when a number of White House officials were subpoenaed to appear before Congress to explain their attempts to ride herd on the Whitewater scandal. Many of the officials suffered lapses of memory during their testimony, and one of them, Deputy Secretary of the Treasury Roger Altman, resigned after being accused of intentionally misleading Congress about his reports to the White House while serving as the acting head of the Resolution Trust Corporation, which was investigating the Madison Guaranty failure. An independent prosecutor continued to investigate Whitewater throughout the year.

      Another matter that continued in the news was a series of investments in 1978 and 1979 by Mrs. Clinton in cattle futures, which netted a profit of about $100,000 on an investment of $1,000, less than the usual minimum for such high-risk trading. She had been advised in her moves by an attorney associated with the Tyson food-processing empire, Arkansas's largest private company and one regulated by both state and federal governments. Critics charged that the investments represented an apparent conflict of interest, and eventually the stain spread to include Secretary of Agriculture Mike Espy, who resigned after it was revealed that he had accepted favours from Tyson while in office.

      On December 28 a federal district court judge ruled that a sexual harassment lawsuit filed against Clinton by a former Arkansas state employee should not proceed to trial until after the president left office.

Other Developments.
      One domestic triumph that stood out was the president's choice to replace Supreme Court Justice Harry Blackmun, who stepped down from the bench at age 85. In seeking a successor, Clinton first looked to Senate Majority Leader George Mitchell, who had decided to retire, but Mitchell declined. A month later Clinton named Boston federal appeals court judge Stephen Breyer (see BIOGRAPHIES (Breyer, Stephen )) to the post. Breyer, a onetime chief counsel to the Senate Judiciary Committee, an antitrust specialist, and an expert on administrative law, was almost universally applauded for his intellect and his consensus-making skills.

      On three occasions during 1994, the White House was the object of physical attacks. In September a small plane crash-landed on the grounds, killing the pilot. A month later a man, subsequently charged with several felonies, fired on the residence with a semiautomatic weapon. Near the end of the year, in December, shots were fired that reached the grounds and the White House itself, one bullet piercing a window in the State Dining Room. In none of these incidents was the president injured or in immediate danger.

Foreign Affairs.
      In his first year in office, Clinton had gone to great lengths to avoid involvement in foreign affairs while pursuing his domestic agenda. In 1994, however, the sense of priorities was gradually reversed. The president began the year at a foreign policy summit, meeting with Russian Pres. Boris Yeltsin in Moscow in January and scoring a major national security triumph when the U.S. and Russia formally ended their mutual nuclear terror by agreeing to point their strategic missiles at empty oceans rather than at any country's territory. Ukrainian Pres. Leonid Kravchuk added further lustre to the trip when he agreed to dismantle about 175 former Soviet intercontinental ballistic missiles on his territory, along with their attendant 1,800 nuclear warheads, in exchange for $1 billion in aid. Soon thereafter, Clinton ended another decades-old enmity when he formally dropped the 19-year U.S. trade (and investment) embargo against Vietnam, citing the Hanoi government's cooperation in the search for U.S. servicemen still missing in action in Southeast Asia. Clinton then cauterized the embarrassment of the intervention in Somalia, undertaken by his predecessor, George Bush, by ordering U.S. troops out of the warlord-riddled country.

      As much as possible, Clinton installed trade and economics rather than military and ideological considerations at the centre of his foreign policy. Among other things he scrapped almost all export controls on previously sensitive telecommunications devices and computers to Russia, Eastern Europe, and China. In the case of China, he ended the linkage between human rights and most-favoured-nation trading status. Later in the year he met again with the other leaders of the 18-nation Asia-Pacific Economic Cooperation forum, and he agreed to join in the creation of an enormous trans-Pacific free-trade zone by 2020. Similar action for the Western Hemisphere was taken at the 34-nation Summit of the Americas held in December. In the wake of the punishing midterm election results, the president successfully lobbied for passage by Congress of the General Agreement on Tariffs and Trade.

      Throughout the year the administration kept up arduous and often frustrating negotiations with North Korea. (See East Asia and the Transition in North Korea (Spotlight: East Asia and the Transition in North Korea ).) The U.S. tried a wide variety of blandishments and threats to persuade the North Koreans to once again allow international inspections of their nuclear facilities. After Pres. Kim Il Sung died (see OBITUARIES (Kim Il Sung )) and was replaced by his son Kim Jong Il (see BIOGRAPHIES (Kim Jong Il )), former U.S. president Jimmy Carter resumed talks he had begun in June and successfully brokered an arrangement whereby North Korea would shut down its outmoded reactors in exchange for less dangerous light-water power reactors and agree to inspections in 10 years' time. In December, however, another crisis developed when a U.S. helicopter was downed on North Korean territory. One crew member was killed in the crash, while the other was released unharmed after 13 days of tense negotiations.

      In the Middle East, long a focus of U.S. preoccupation, Clinton did not have a major role to play in 1994, yet for the second year in a row, he witnessed the signing of a historic peace accord. This time the pact was between Jordan and Israel, and it left the issue of the Golan Heights and peace between Israel and Syria as the major unmet goal of diplomacy in the region. Clinton himself made a bid to move the process along at a meeting with Syrian Pres. Hafez al-Assad, but to little effect. Yet when it seemed appropriate to draw the sword in the Middle East, Clinton reacted with energy and dispatch. After Iraqi Pres. Saddam Hussein ordered 50,000 heavily armed troops toward the frontier with Kuwait in October, Clinton airlifted thousands of U.S. troops to the region, and the Iraqi dictator quickly backed away.

      The same could not be said for the warring sides in the Balkans, who scoffed at half-hearted efforts by NATO forces to impose limits on the long-running war in Bosnia and Herzegovina through ineffectual air strikes at nearly valueless targets. The NATO effort reflected a deep split between the U.S. and its chief European allies, notably Britain and France, which had peacekeeping forces on the ground in Bosnia, as the U.S. did not. The rift deepened and even threatened the foundations of the North Atlantic alliance as the year wore on, and the U.S., prompted by sentiment in Congress, tried to redress the military balance between the beleaguered Bosnian Muslim forces and the Bosnian Serbs, who had essentially won the genocidal war. The U.S. unilaterally ended its own arms embargo against both sides (which meant effectively against the Muslims) and said that it would not help its allies to enforce their ban. Later, the U.S. pressed for NATO air strikes. Finally, however, Washington acknowledged that NATO solidarity was more important than the integrity of Bosnia and backed down amid admissions from Secretary of State Warren Christopher that the entire crisis had been bungled. At the invitation of the Bosnian Serbs, Carter went to the area in December to broker a tentative cease-fire.

      The president was faced with equally thorny choices in defending U.S. borders from a flood of Cuban and Haitian refugees who took to the Caribbean in virtually anything that would float in order to escape conditions at home. In the case of the Cubans, Clinton at first hesitated and then reversed decades of U.S. policy that embraced such escapees automatically as legitimate seekers of political asylum. Some 30,000 were interned at U.S. bases at Guantánamo Bay and in Panama while the White House negotiated with the regime of Fidel Castro (see BIOGRAPHIES (Castro, Fidel )) to stanch the flow, to which the Cuban government had turned a blind eye. The two sides eventually agreed to an increase of 20,000 per year in the quota of Cubans allowed into the U.S. through proper channels.

      The Haitian tide was harder to stem. Throughout much of the year, the Clinton administration hoped that an effective economic embargo of Haiti would cause the regime of Gen. Raoul Cédras, the Haitian army commander, to accept the return of ousted Pres. Jean-Bertrand Aristide (see BIOGRAPHIES (Aristide, Jean-Bertrand )). For his part, Aristide fumed that the U.S. did not object to Cédras' remaining in control. As thousands of boat people washed up on the coast of Florida, however, the administration came to the view that only military intervention would work. In September the U.S. assembled a fleet of 23 warships and 20,000 troops and set out for Port-au-Prince. Once again a last-minute intercession by Carter proved to be decisive. With U.S. warships in sight, Cédras and his cohorts agreed to allow the troops ashore. The U.S. soldiers quickly took control, ferried the top military leadership into exile, reinstalled Aristide, and began the longer-term, and more difficult, task of helping to rebuild the poorest country in the Western Hemisphere from the ground up.


      See also Dependent States .

▪ 1994

      The United States of America is a federal republic composed of 50 states. Area: 9,372,571 sq km (3,618,770 sq mi), including 205,856 sq km of inland water but excluding the 156,492 sq km of the Great Lakes that lie within U.S. boundaries. Pop. (1993 est.): 258,233,000. Cap.: Washington, D.C. Monetary unit: U.S. dollar, with (Oct. 4, 1993) a free rate of U.S. $1.52 to £ 1 sterling. Presidents in 1993, George Bush and, from January 20, Bill Clinton.

      William Jefferson ("Bill") Clinton (see BIOGRAPHIES (Clinton, William Jefferson )) swept into the White House in 1993 on a wave of high expectations. As the candidate of "change," a word he used often during his presidential campaign against incumbent George Bush, President Clinton was committed to a dramatic reversal of the economic and political stagnation he had blamed on 12 years of conservative Republican rule. Within weeks of the January 1993 inauguration, however, Clinton's new administration was wobbling badly, the victim of ineptitude, bad judgment, and a knack for needless controversy. Fortunately for Clinton, the freshman jitters were eventually dispelled, and the 42nd president of the United States finished the year with an impressive record of accomplishment. According to Congressional Quarterly, for instance, he succeeded in moving more legislation through Congress in his first year than any other president since Dwight Eisenhower in 1953. And he did not have to use his veto power even once, a feat not seen since Richard Nixon's first year in 1969.

The Presidency.
      President Clinton's start was one of the shakiest in recent history. Among his first acts was his declaration that he would seek an end to the U.S. military's long-standing ban on homosexuals in the ranks. Though the move was popular among gays and many other Americans and Clinton had promised it during the election campaign, few Washington analysts thought he would move on such a potentially explosive issue so quickly. Indeed, Clinton's declaration put him at odds with top military leaders and with a number of key civilians who had oversight responsibilities for the armed forces. Chief among the latter was Sen. Sam Nunn, the Georgia Democrat who headed the Senate Armed Services Committee. After heated debate, Clinton managed to gain support for a compromise measure under which homosexual servicemen and servicewomen could remain in the military if they did not openly declare their sexual preference, a policy that quickly became known as "don't ask, don't tell." Yet military officers were overwhelmingly opposed to that approach, fearing that the mere presence of homosexuals in the armed forces would undermine morale. The policy was further undermined by court rulings in discrimination suits upholding the right of gays to serve in the military. The controversy helped send Clinton's approval ratings plunging to the lowest levels ever recorded for a first-year president and distracted the administration as it struggled to assemble its initial legislative agenda.

      The White House also encountered exasperating difficulty in filling a number of high-level positions in the new government. Two successive nominations for the job of attorney general, the nation's top law-enforcement officer, were derailed by disclosures involving the hiring of domestic help. Zoë Baird, a Connecticut insurance lawyer, was accused by Republicans of not having paid proper payroll taxes for a child-care worker; though the offense was minor and the taxes were eventually paid, she withdrew after being accused of impropriety. Kimba Wood, a federal judge in New York, was reported to have hired an undocumented foreigner for her household; though the practice was not illegal at the time and Wood had paid the required payroll taxes, she too was forced to withdraw. The job eventually went to Janet Reno, the state's attorney for Dade County, Fla. (See BIOGRAPHIES (Reno, Janet ).) The Baird and Wood incidents angered many women, who felt that such accusations would not have been brought up in connection with a male candidate. Indeed, Ron Brown, the former Democratic National Committee chairman whose nomination as commerce secretary sailed through Congress, admitted later—to no ill effect on his appointment—that he, too, had been less than punctilious in hiring domestic help.

      One other female nominee was sidelined by Republican opposition, though in this case ostensibly for ideological reasons. Lani Guinier, a law professor at the University of Pennsylvania, withdrew from consideration as the Justice Department's top civil rights official after conservatives objected to what they described as Guinier's radical positions on voting rights and related issues. Though Guinier's supporters protested that her views had been distorted and were hardly controversial, Clinton chose not to stand by her.

      And so it went throughout the early months of the administration. The White House would announce a nomination, Republican opposition would coalesce, and the candidate would withdraw. The failure rate was remarkable for a Democratic president whose party controlled both houses of Congress. Clinton was widely criticized for his timidity in confronting the Republicans. One crucial problem for him was an unusual degree of cohesion among the opposition. Far from being in disarray after losing the White House, the Republicans were lining up en bloc against administration initiatives. Conservative Republicans hinted that they were simply giving Clinton appointees the same sort of harassment they felt that three Republican nominees for the Supreme Court, Robert Bork, Douglas Ginsburg, and Clarence Thomas, had suffered at the hands of Democrats during the Reagan-Bush years (Bork was rejected by the Senate and Ginsburg withdrew; Thomas eventually won confirmation, but only after televised hearings into allegations that he had sexually harassed a colleague, Anita Hill). The Democrats, meanwhile, were just as independent-minded as ever. Under long-standing House and Senate rules designed to limit abuses by the majority, a determined minority could prevent appointments and legislation from even coming to a vote. The Republicans acted cohesively enough to take advantage of those rules; the Democrats were too fractious to stop them. As a consequence, Clinton began to look ineffective.

      One of the president's first major pieces of legislation, an economic stimulus plan, was killed by a Republican filibuster. Clinton's next big initiative, a deficit-reduction package, ran into an early blitz of opposition from Republicans and from various special interests. That was not surprising, given its content: substantial tax increases and modest spending cuts that would affect many industries and individuals adversely. After months of wrangling, a watered-down version of the measure passed with the narrowest of margins; Vice Pres. Al Gore, in his role as president of the Senate, cast the tie-breaking vote.

      The package was expected to cut $500 billion from the federal budget deficit over five years. It included stiff tax increases for upper-income Americans, a slight boost in the corporate tax rate, and a 4.3-cent-a-gallon (1 gal = 3.8 litres) increase in the federal excise tax on gasoline. Americans barely noticed the latter, since a softness in global petroleum prices and notoriously low U.S. petroleum taxes had helped keep U.S. gasoline among the world's cheapest—about 25-30 cents a litre. The spending cuts ranged widely across the federal budget, though no serious reductions were made in such major and sacrosanct items as social security and Medicare.

      As the months wore on, Clinton began to gain expertise at wooing and arm-twisting. He succeeded in gaining adoption of his $1.5 billion national service plan, under which 100,000 young Americans would earn cash and credits toward college tuition by working in public service jobs. By autumn, when he faced one of the biggest tests of his administration, he was ready to wheel and deal. The issue was congressional approval of the North American Free Trade Agreement. NAFTA had been painstakingly negotiated by the administrations of Ronald Reagan and George Bush, and Clinton had declared his support for it during the 1992 election campaign. The treaty would reduce tariffs between the U.S., Canada, and Mexico on a wide array of products and, in effect, create the world's largest free-trade zone. Business executives and economists supported the measure by a wide margin, confident that it would spur trade and thus prosperity in all three countries. Trade union leaders, environmentalists, and a variety of other interest groups opposed the measure, fearing, among other things, that it would prompt U.S. companies to move their operations to Mexico, where wages were lower than in the U.S. and Canada and environmental standards less rigorous.

      Prominent among NAFTA's opponents was H. Ross Perot, the Texas billionaire who a year earlier had made a run for the presidency. Perot's prediction that the measure would produce "a giant sucking sound" as U.S. jobs were lost to Mexico became a rallying cry of the treaty's critics. As the congressional vote on the agreement approached, chances of passage seemed dim. In apparent desperation, the White House accepted Perot's proposal that he and Vice President Gore debate the issue on national television. They appeared together on interviewer Larry King's Cable News Network talk show, and Gore was credited by many pundits and pollsters with having got the better of his challenger. In any case, public opinion began to swing toward the treaty. Meanwhile, Clinton was wooing legislators with intimate dinners at the White House and promises of federal largesse for their home districts. In the end Clinton prevailed, and NAFTA was passed by both houses.

      The victory provided the president with a measure of momentum that had previously eluded him. Capitalizing on it, he successfully pressed for the passage of a major anticrime bill that included a controversial waiting period on handgun purchases. He also intervened decisively a month later to end a strike by American Airlines flight attendants that threatened to disrupt travel over the Thanksgiving holiday weekend. By year's end it appeared that Clinton, a newcomer to Washington whose previous job had been governor of Arkansas, had figured out how to do business in the nation's capital.

Health Care.
      Perhaps the most important initiative of the new administration, health care, had not yet been formally debated by Congress by the end of 1993, but it nonetheless carried the potential for dramatically changing the way many Americans lived. Unlike most industrial countries, the individualistic, free-enterprise U.S. did not have a comprehensive government health care system. Instead, Americans made do with a patchwork of private insurers, employer-paid insurance, private doctors, private and tax-supported hospitals, and government subsidies for the poor and the elderly. For years the system worked satisfactorily. Though infant mortality rates were relatively high, Americans were generally healthy, and U.S. medical technology was the envy of the world. Yet the system was not without its critics. As Clinton noted during the election campaign, an estimated 37 million Americans had no health insurance coverage at all, and costs were rising sharply throughout the health care industry. In recent years costs had far outpaced the overall rate of inflation. By 1992 the U.S. was spending more than 14% of its gross domestic product (GDP) on health care, up from less than 6% in 1965 and double the percentages in Britain and Japan.

      The reasons for that explosive growth in spending were clear enough; insurance plans provided for virtually unlimited coverage, so hardly anyone in the health care system had an incentive to control costs, and Americans found it difficult to deny themselves access to the most expensive medical technology. Patients wanted the best care possible, and doctors gave it to them without regard to price because someone else, either an insurance company or the government's programs of Medicare (for the elderly) or Medicaid (for the poor), would pay a share of the bill. Yet costs were rising so steeply that the share that individuals had to pay was soaring. Opinion polls showed that while Americans were generally satisfied with the quality of care they were receiving, the costs worried them deeply.

      In a dramatic move to address those concerns, Clinton unveiled a thorough overhaul of the U.S. health care system. The plan, which had been formulated under the supervision of first lady Hillary Rodham Clinton (see BIOGRAPHIES (Clinton, Hillary Rodham )), had three basic elements: universal coverage for all Americans; employer mandates, under which companies would pay 80% of their workers' health insurance premiums; and a system of controls on medical costs. The plan had other details certain to be altered in the expected give-and-take with Congress and interest groups. For instance, the proposal would cover mental health costs, which could prove unacceptably expensive. Likewise, the plan called for a national health board that would enforce price controls, a notion that doctors and hospitals opposed and that economists called unworkable. Another feature of the plan, the creation of giant health alliances that would purchase coverage from private insurers on behalf of nearly all people in a particular region, was so radical that it faced months of study and debate, as well as a likelihood of being dropped.

The Economy.
      The pall of gloom that had hung over the U.S. economy for years was lifting. The recovery had actually begun during the Bush administration, but public perception did not catch up with reality until late in 1993. Most measures of business and consumer confidence were rising, and the stock markets hit new highs several times during the year.

      Signs of renewed vigour were almost everywhere. Consumer spending in the third quarter was up 4.2% from a year earlier. Investment in plant and equipment hit levels not seen since 1984. Unemployment dropped from 7% at the beginning of the year to 6.4% in December, and an average of 150,000 new jobs were created every month (despite a number of highly publicized mass layoffs announced by leading companies). GDP, the total value of goods and services produced in the country, rose at an inflation-adjusted rate of more than 3%, about the same pace as in 1992. In September, sales of new single-family homes hit their highest monthly level since December 1986. The average price of those homes was up 7.9% from a year earlier, a clear sign of increased demand in a sector of the economy that had long been depressed. Even the American auto industry, battered for years by declining profits and rising imports from Japan, turned in its best year since 1989.

      All this activity raised fears that inflation might return, though prices remained remarkably stable throughout the year. The annual rate of increase of the Consumer Price Index hovered around 3%, one of the lowest levels in two decades. Partly as a consequence, interest rates remained extraordinarily low (lenders were willing to charge lower rates because they expected that the loans would be repaid in dollars that retained their value). Fixed-rate home mortgages, for instance, were carrying annual interest rates under 7%, a situation that had not prevailed in the adult lives of many home buyers.

      The economic picture might have been even brighter were it not for two major natural disasters. In the summer, heavy rains sent the Missouri, Mississippi, and other midwestern rivers surging over their banks. More than two million hectares (five million acres) of farmland were inundated; hundreds of cities and towns were flooded; and thousands of homes and factories were swept away. In the fall, wildfires devastated southern California, burning at least 61,500 ha (152,000 ac) and forcing 25,000 people from their homes. Damage from the two disasters totaled in the billions of dollars, and economists figured that the resulting dislocation may have shaved half a percentage point off the increase in GDP. Some of that was expected to be regained in 1994 as money spent to restore the damage flowed into the economy.

Social Issues.
      The year brought some major advances for women in the U.S. as they continued to gain important posts in business and government. The Senate confirmed President Clinton's appointment of Ruth Bader Ginsburg (see BIOGRAPHIES (Ginsburg, Ruth Bader )), a New York law professor, to the Supreme Court, where she became the second woman on the nine-member panel. In addition, Congress enacted 30 major bills related to women and family issues, compared with 5 in 1989, according to the Congressional Caucus for Women's Issues. Prominent among the new laws was the Family and Medical Leave Act, which provided up to 12 weeks of job-guaranteed leave for workers to care for themselves or sick family members or to have or adopt a baby. The caucus, which at year's end comprised the 7 female members of the Senate and the 47 congresswomen (both numbers were up sharply from the previous legislative session), nonetheless failed in an effort to repeal a measure that banned Medicaid funds from being used for abortions.

      Despite the inauguration of a Democratic president committed to reproductive rights, foes of abortion continued their campaign of disruption and intimidation against clinics where the procedure was performed. The administration loosened some federal restrictions on terminating pregnancies, but abortion foes hoped to make it difficult for women to obtain them. In response, abortion rights advocates sought the intervention of local authorities and the courts. In one closely watched case, advocates sought to have clinic blockaders prosecuted under the 1970 Racketeer Influenced and Corrupt Organizations (RICO) Act, which was normally used against organized crime. The Supreme Court was expected to rule on the matter in 1994. The court had previously upheld the broadening of RICO to prosecute commodity traders and gang members, although in 1993 the justices ruled that federal courts may not stop abortion clinic blockades by invoking an 1871 civil rights law.

      A number of well-publicized incidents had the effect of polarizing popular opinion along gender lines. One was the disclosure that Sen. Robert Packwood, a veteran Oregon Republican, may have made sexual advances toward more than two dozen women over 20 years and tried to intimidate some of his alleged victims into silence. Women in Congress demanded Packwood's resignation, and the Senate launched an investigation. At year's end Packwood hinted that he might resign.

      In December, Energy Secretary Hazel O'Leary announced that the Department of Energy would investigate reports that a number of major medical institutions and U.S. government research laboratories had exposed civilians to radiation without having fully informed them of the nature of the experiments. More than 1,000 subjects were involved in various programs dating from the late 1940s to the early 1970s.

      A perennial concern among Americans, crime became almost a national obsession in 1993. Highly publicized reports of gang- and drug-related violence, carjackings that ended in death, innocent bystanders killed in gun battles, children bringing guns to school for protection, and foreign tourists killed during robberies in Florida all fanned the flames of public concern. In one typical survey nearly 90% of those polled said they believed that the country's crime problem was growing, and nearly half reported that there was more crime in their neighbourhoods than a year earlier.

      That fear of crime was seemingly at odds with reality. FBI statistics indicated a 4% drop in overall reported crime in 1992, and major cities reported declines in several categories of violent crime, including murder, rape, and robbery. Yet many Americans did not believe such reports, and their concerns led to a number of dramatic steps toward making their localities safer. Sharon Pratt Kelly, the mayor of Washington, D.C., asked the Clinton administration to provide National Guard troops to help police the city's more crime-ridden precincts (the request was denied). Voters in several states approved stiffer sentences for many crimes, as well as money to build more prisons.

      Criminal justice and public safety had long been a matter of state and local responsibility in the U.S., with only a modest federal role. As the clamour for relief from crime rose, however, Washington was listening. Congress passed the Clinton administration's crime bill, which went far beyond previous measures. It lengthened the list of offenses that could be prosecuted by federal authorities, including, as critics of the measure noted with derision, the murder of a federal chicken inspector. On a more positive note, the bill also provided funds to hire 100,000 new police. The most remarkable feature was the bill's inclusion of a long-standing proposal to require a five-day waiting period for the purchase of a handgun. That measure was known as the Brady bill, after James Brady, the White House press secretary who was seriously injured in the 1981 attack on Ronald Reagan. Brady, confined to a wheelchair and unable to resume his duties, campaigned hard for the bill, but it was fiercely opposed by the National Rifle Association (NRA), one of Washington's most formidable interest groups. Even supporters of the Brady bill conceded that it was unlikely to have a major effect on crime, but they welcomed its passage as a step toward more limits on the easy availability of handguns in the U.S. and as a major setback for the NRA.

      The agency responsible for federal criminal enforcement, the U.S. Justice Department, was widely criticized for the way it handled a standoff near Waco, Texas, between federal agents and heavily armed members of a religious cult known as the Branch Davidians and their charismatic leader, David Koresh. Four agents of the Bureau of Alcohol, Tobacco and Firearms were killed in an ill-planned attempt to storm the cult's 31-ha (77-acre) compound. That raid led to a nationally televised 51-day siege that ended in a conflagration, after which it was discovered that some 75 people inside the compound, including at least 17 children, had died; a number had been shot. Nearly all the deaths appeared to have been caused by the Branch Davidians themselves, but a subsequent government review concluded that federal officials had handled the situation ineptly.

Foreign Affairs.
      With an administration focused on its domestic policy agenda, international matters receded into the background of public attention. One reason was that since the fall of the Berlin Wall in 1989 and the general collapse of communism around the world, the Cold War no longer served as a framework for U.S. foreign policy and as a focus for public anxiety about the possibility of superpower confrontation. Another reason was that the international conflicts that did occupy the year's headlines in Somalia, the Balkans, Haiti, and the Middle East were mostly protracted regional affairs and were maddeningly resistant to the application of U.S. power.

      In Somalia, for instance, the U.S. began pulling out the more than 25,000 troops it had sent a year earlier to help ensure the distribution of relief supplies to a populace suffering from starvation and from the depredations of feuding warlords. U.S. forces were surprised to encounter hostility from the very people they had been sent to save. When an angry crowd of Somalis attacked a United Nations convoy, American helicopters fired into the crowd, killing and wounding more than 100 people. Then troops under the control of a leading warlord, Muhammad Farah Aydid (see BIOGRAPHIES (Aydid, Gen. Muhammad Farah )), whom the U.S. had been trying to capture, killed 18 Americans in a gun battle. President Clinton quickly announced a pullout of all remaining U.S. forces by March 1994. In an ironic twist to the unhappy American experience in Somalia, the U.S. not only dropped its attempt to seize Aydid but gave him preferential treatment and passage on a U.S. plane to attend peace talks in neighbouring Ethiopia.

      In the Balkans, President Clinton indicated his willingness to send U.S. troops to help maintain order if the warring factions in Bosnia could settle their differences. The offer was not taken up, in part because the conflict dragged on and European countries could not agree on a role for themselves and the U.S. On other matters Europe and the U.S. did appear to be in agreement. Among them was an American proposal to expand the membership of NATO, possibly at some future date, to include former members of the Warsaw Pact, the now-defunct alliance of Soviet-bloc states. That was an astonishing development, given the four decades of enmity between the two blocs. In addition, after years of sometimes desultory talks, the U.S. and Europe resolved most of their differences on trade and in December concluded an agreement under the General Agreement on Tariffs and Trade. (See Economic Affairs .)

      In Haiti the U.S. found itself in the position of supporting exiled Pres. Jean-Bertrand Aristide but unable to arrange his return. Haitian army commander Raoul Cédras, who had ousted Aristide in 1991 after the former Roman Catholic priest was democratically elected in 1990, refused to yield power. Cédras did participate in a UN-brokered agreement that would allow Aristide to take office, and the U.S. and Canada promised to send a small contingent of lightly armed troops to help police the arrangement. Yet when the U.S. troop ship arrived in Haiti, a violent mob of army-backed civilians refused to let it dock, and the troops returned home. Clinton ordered six American ships into the region to enforce a UN arms and oil embargo against Haiti. Meanwhile, forces loyal to Cédras continued to intimidate and even murder their opponents with impunity.

      In the Middle East, where the U.S. had long played a major role, Clinton presided over the historic meeting in Washington of Palestine Liberation Organization chief Yasir Arafat and Israeli Prime Minister Yitzhak Rabin. The two leaders met for the signing of an agreement allowing an unprecedented measure of Palestinian autonomy in the Israeli-occupied West Bank and Gaza Strip. The U.S. had little directly to do with arranging the agreement, and at one point late in the year, Rabin asked the U.S. to refrain from direct involvement in Israel's talks with the Palestinians.

      As the year came to a close, President Clinton shifted U.S. attention to North Korea. That country, ruled by the reclusive Kim Il Sung and dedicated to a brand of highly regimented Stalinist communism, was refusing to allow international inspections of its nuclear energy facilities. American policy makers, concerned that Kim was developing a nuclear weapons program, indicated that the U.S. might take military action if Kim's government did not comply with the inspections. North Korea declared that it was prepared to endure war or economic sanctions; in response, the U.S. said it would increase its military activities in South Korea, Kim's neighbour and bitter foe. Tensions eased somewhat when North Korea said that it might allow some inspections and that it would turn over the remains of U.S. soldiers killed four decades earlier in the Korean War.

      The book was finally closed on one of the country's most enduring political scandals: the Iran-contra affair. The final report of the special prosecutor investigating the matter indicated that former presidents Reagan and Bush were far more deeply involved than they had acknowledged. The scandal involved the sale of arms to Iran and the diversion of the resulting profits to provide arms for the contra rebels fighting the leftist government of Nicaragua in the 1980s. Though the Reagan and Bush administrations publicly favoured the contras, Congress had banned military support for them. The report, by prosecutor Lawrence Walsh, concluded that Reagan had set the stage for the illegal activities and that Bush was less than truthful when he declared that he was "out of the loop" and not kept informed about the matter. Neither man, however, was said to be guilty of a crime. (DONALD MORRISON)

      See also Dependent States, below.

* * *

officially  United States of America , abbreviations  U.S.  or  U.S.A. , byname  America 
United States of America, flag of the   country of North America, a federal republic of 50 states. Besides the 48 contiguous states that occupy the middle latitudes of the continent, the United States includes the state of Alaska, at the northwestern extreme of North America, and the island state of Hawaii, in the mid-Pacific Ocean. The coterminous states are bounded on the north by Canada, on the east by the Atlantic Ocean, on the south by the Gulf of Mexico and Mexico, and on the west by the Pacific Ocean. The United States is the fourth largest country in the world in area (after Russia, Canada, and China). The national capital is Washington, which is coextensive with the District of Columbia, the federal capital region created in 1790.

      The major characteristic of the United States is probably its great variety. Its physical environment ranges from the Arctic to the subtropical, from the moist rain forest to the arid desert, from the rugged mountain peak to the flat prairie. Although the total population of the United States is large by world standards, its overall population density is relatively low; the country embraces some of the world's largest urban concentrations as well as some of the most extensive areas that are almost devoid of habitation.

      The United States contains a highly diverse population; but, unlike a country such as China that largely incorporated indigenous peoples, its diversity has to a great degree come from an immense and sustained global immigration. Probably no other country has a wider range of racial, ethnic, and cultural types than does the United States. In addition to the presence of surviving native Americans (including American Indians, Aleuts, and Eskimo) and the descendants of Africans taken as slaves to America, the national character has been enriched, tested, and constantly redefined by the tens of millions of immigrants who by and large have gone to America hoping for greater social, political, and economic opportunities than they had in the places they left.

      The United States is the world's greatest economic power, measured in terms of gross national product (GNP). The nation's wealth is partly a reflection of its rich natural resources and its enormous agricultural output, but it owes more to the country's highly developed industry. Despite its relative economic self-sufficiency in many areas, the United States is the most important single factor in world trade by virtue of the sheer size of its economy. Its exports and imports represent major proportions of the world total. The United States also impinges on the global economy as a source of and as a destination for investment capital. The country continues to sustain an economic life that is more diversified than any other on Earth, providing the majority of its people with one of the world's highest standards of living.

      The United States is relatively young by world standards, being barely more than 200 years old; it achieved its current size only in the mid-20th century. America was the first of the European colonies to separate successfully from its motherland, and it was the first nation to be established on the premise that sovereignty rests with its citizens and not with the government. In its first century and a half, the country was mainly preoccupied with its own territorial expansion and economic growth and with social debates that ultimately led to civil war and a healing period that is still not complete. In the 20th century the United States emerged as a world power, and since World War II it has been one of the preeminent powers. It has not accepted this mantle easily nor always carried it willingly; the principles and ideals of its founders have been tested by the pressures and exigencies of its dominant status. Although the United States still offers its residents opportunities for unparalleled personal advancement and wealth, the depletion of its resources, contamination of its environment, and continuing social and economic inequality that perpetuates areas of poverty and blight all threaten the fabric of the country.

      The District of Columbia is discussed in the article Washington. For discussion of other major U.S. cities, see the articles Boston, Chicago, Los Angeles, New Orleans, New York City, Philadelphia, and San Francisco. Political units in association with the United States include Puerto Rico, discussed in the article Puerto Rico, and several Pacific islands, discussed in Guam, Northern Mariana Islands, and American Samoa.

The land (United States)
  The two great sets of elements that mold the physical environment of the United States are, first, the geologic, which determines the main patterns of landforms, drainage, and mineral resources and influences soils to a lesser degree, and, second, the atmospheric, which dictates not only climate and weather but also in large part the distribution of soils, plants, and animals. Although these elements are not entirely independent of one another, each produces on a map patterns that are so profoundly different that essentially they remain two separate geographies. (Since this article covers only the coterminous United States, see also the articles Alaska and Hawaii.)

      The centre of the coterminous United States is a great sprawling interior lowland, reaching from the ancient shield of central Canada on the north to the Gulf of Mexico on the south. To east and west this lowland rises, first gradually and then abruptly, to mountain ranges that divide it from the sea on both sides. The two mountain systems differ drastically. The Appalachian Mountains on the east are low, almost unbroken, and in the main set well back from the Atlantic. From New York to the Mexican border stretches the low Coastal Plain, which faces the ocean along a swampy, convoluted coast. The gently sloping surface of the plain extends out beneath the sea, where it forms the continental shelf, which, although submerged beneath shallow ocean water, is geologically identical to the Coastal Plain. Southward the plain grows wider, swinging westward in Georgia and Alabama to truncate the Appalachians along their southern extremity and separate the interior lowland from the Gulf.

      West of the Central Lowland is the mighty Cordillera, part of a global mountain system that rings the Pacific Basin. The Cordillera encompasses fully one-third of the United States, with an internal variety commensurate with its size. At its eastern margin lie the Rocky Mountains, a high, diverse, and discontinuous chain that stretches all the way from New Mexico to the Canadian border. The Cordillera's western edge is a Pacific coastal chain of rugged mountains and inland valleys, the whole rising spectacularly from the sea without benefit of a coastal plain. Pent between the Rockies and the Pacific chain is a vast intermontane complex of basins, plateaus, and isolated ranges so large and remarkable that they merit recognition as a region separate from the Cordillera itself.

      These regions—the Interior Lowlands and their upland fringes, the Appalachian Mountain system, the Atlantic Plain, the Western Cordillera, and the Western Intermontane Region—are so various that they require further division into 24 major subregions, or provinces (see map).

The Interior Lowlands and their upland fringes
      Andrew Jackson is supposed to have remarked that the United States begins at the Alleghenies, implying that only west of the mountains, in the isolation and freedom of the great Interior Lowlands, could people finally escape Old World influences. Whether or not the lowlands constitute the country's cultural core is debatable, but there can be no doubt that they comprise its geologic core and in many ways its geographic core as well.

      This enormous region rests upon an ancient, much-eroded platform of complex crystalline rocks that have for the most part lain undisturbed by major orogenic (mountain-building) activity for more than 600,000,000 years. Over much of central Canada, these Precambrian rocks are exposed at the surface and form the continent's single largest topographical region, the formidable and ice-scoured Canadian Shield.

      In the United States most of the crystalline platform is concealed under a deep blanket of sedimentary rocks. In the far north, however, the naked Canadian Shield extends into the United States far enough to form two small but distinctive landform regions: the rugged and occasionally spectacular Adirondack Mountains of northern New York; and the more subdued but austere Superior Uplands of northern Minnesota, Wisconsin, and Michigan. As in the rest of the shield, glaciers have stripped soils away, strewn the surface with boulders and other debris, and obliterated preglacial drainage systems. Most attempts at farming in these areas have been abandoned, but the combination of a comparative wilderness in a northern climate, clear lakes, and white-water streams has fostered the development of both regions as year-round outdoor recreation areas.

      Mineral wealth in the Superior Uplands is legendary. Iron lies near the surface and close to the deepwater ports of the upper Great Lakes. Iron is mined both north and south of Lake Superior, but best known are the colossal deposits of Minnesota's Mesabi Range, for more than a century one of the world's richest and a vital element in America's rise to industrial power. In spite of depletion, the Minnesota and Michigan mines still yield a major proportion of the country's iron and a significant percentage of the world's supply.

      South of the Adirondack Mountains and Superior Uplands lies the boundary between crystalline and sedimentary rocks; abruptly, everything is different. The core of this sedimentary region—the heartland of the United States—is the great Central Lowland, which stretches for 1,500 miles (2,400 kilometres) from New York to central Texas and north another 1,000 miles to the Canadian province of Saskatchewan. To some, the landscape may seem dull, for heights of more than 2,000 feet (600 metres) are unusual, and truly rough terrain is almost lacking. Landscapes are varied, however, largely as the result of glaciation that directly or indirectly affected most of the subregion. North of the Missouri–Ohio river line, the advance and readvance of continental ice left an intricate mosaic of boulders, sand, gravel, silt, and clay and a complex pattern of lakes and drainage channels, some abandoned, some still in use. The southern part of the Central Lowland is quite different, covered mostly with loess (wind-deposited silt) that has further subdued the already low relief of the surface. Elsewhere, especially near major rivers, postglacial streams carved the loess into rounded hills, and visitors have aptly compared their billowing shapes to the waves of the sea. Above all, the loess produces soil of extraordinary fertility. Just as Mesabi iron was a major source of America's industrial wealth, the nation's agricultural prosperity has been rooted in Midwestern loess.

      The Central Lowland resembles a vast saucer, rising gradually to higher lands on all sides. Southward and eastward, the land rises gradually to three major plateaus. Beyond the reach of glaciation to the south, the sedimentary rocks have been raised into two broad upwarps, separated from one another by the great valley of the Mississippi River. The Ozark Plateau (Ozark Mountains) lies west of the river and occupies most of southern Missouri and northern Arkansas; on the east the Interior Low Plateaus dominate central Kentucky and Tennessee. Except for two nearly circular patches of rich limestone country—the Nashville Basin of Tennessee and the Kentucky Bluegrass region—most of both plateau regions consists of sandstone uplands, intricately dissected by streams. Local relief runs to several hundreds of feet in most places, and visitors to the region must travel winding roads along narrow stream valleys. The soils there are poor, and mineral resources are scanty.

      Eastward from the Central Lowland the Appalachian Plateau—a narrow band of dissected uplands that strongly resembles the Ozark Plateau and Interior Low Plateaus in steep slopes, wretched soils, and endemic poverty—forms a transition between the interior plains and the Appalachian Mountains. Usually, however, the Appalachian Plateau is considered a subregion of the Appalachian Mountains, partly on grounds of location, partly because of geologic structure. Unlike the other plateaus, where rocks are warped upward, the rocks there form an elongated basin, wherein bituminous coal has been preserved from erosion. This Appalachian coal, like the Mesabi iron that it complements in U.S. industry, is extraordinary. Extensive, thick, and close to the surface, it has stoked the furnaces of northeastern steel mills for decades and helps explain the huge concentration of heavy industry along the lower Great Lakes.

      The western flanks of the Interior Lowlands are the Great Plains, a territory of awesome bulk that spans the full distance between Canada and Mexico in a swath nearly 500 miles wide. The Great Plains were built by successive layers of poorly cemented sand, silt, and gravel—debris laid down by parallel east-flowing streams from the Rocky Mountains. Seen from the east, the surface of the Great Plains rises inexorably from about 2,000 feet near Omaha, Neb., to more than 6,000 feet at Cheyenne, Wyo., but the climb is so gradual that popular legend holds the Great Plains to be flat. True flatness is rare, although the High Plains of western Texas, Oklahoma, Kansas, and eastern Colorado come close. More commonly, the land is broadly rolling, and parts of the northern plains are sharply dissected into badlands.

      The main mineral wealth of the Interior Lowlands derives from fossil fuels. Coal occurs in structural basins protected from erosion—high-quality bituminous in the Appalachian, Illinois, and western Kentucky basins; and subbituminous and lignite in the eastern and northwestern Great Plains. Petroleum and natural gas have been found in nearly every state between the Appalachians and the Rockies, but the Midcontinent Fields of western Texas and the Texas Panhandle, Oklahoma, and Kansas surpass all others. Aside from small deposits of lead and zinc, metallic minerals are of little importance.

The Appalachian Mountain system
 The Appalachians dominate the eastern United States and separate the Eastern Seaboard from the interior with a belt of subdued uplands that extends nearly 1,500 miles from northeastern Alabama to the Canadian border. They are old, complex mountains, the eroded stumps of much greater ranges. Present topography results from erosion that has carved weak rocks away, leaving a skeleton of resistant rocks behind as highlands. Geologic differences are thus faithfully reflected in topography. In the Appalachians these differences are sharply demarcated and neatly arranged, so that all the major subdivisions except New England lie in strips parallel to the Atlantic and to one another.

      The core of the Appalachians is a belt of complex metamorphic and igneous rocks that stretches all the way from Alabama to New Hampshire. The western side of this belt forms the long slender rampart of the Blue Ridge Mountains, containing the highest elevations in the Appalachians (Mount Mitchell, N.C., 6,684 feet [2,037 metres]) and some of its most handsome mountain scenery. On its eastern, or seaward, side the Blue Ridge descends in an abrupt and sometimes spectacular escarpment to the Piedmont, a well-drained, rolling land—never quite hills, but never quite a plain. Before the settlement of the Midwest the Piedmont was the most productive agricultural region in the United States, and several Pennsylvania counties still consistently report some of the highest farm yields per acre in the entire country.

      West of the crystalline zone, away from the axis of primary geologic deformation, sedimentary rocks have escaped metamorphism but are compressed into tight folds. Erosion has carved the upturned edges of these folded rocks into the remarkable Ridge and Valley country of the western Appalachians. Long linear ridges characteristically stand about 1,000 feet from base to crest and run for tens of miles, paralleled by broad open valleys of comparable length. In Pennsylvania, ridges run unbroken for great distances, occasionally turning abruptly in a zigzag pattern; by contrast, the southern ridges are broken by faults and form short, parallel segments that are lined up like magnetized iron filings. By far the largest valley—and one of the most important routes in North America—is the Great Valley (Great Appalachian Valley), an extraordinary trench of shale and limestone that runs nearly the entire length of the Appalachians. It provides a lowland passage from the middle Hudson valley to Harrisburg, Pa., and on southward, where it forms the Shenandoah and Cumberland valleys, and has been one of the main paths through the Appalachians since pioneer times. In New England it is floored with slates and marbles and forms the Valley of Vermont, one of the few fertile areas in an otherwise mountainous region.

      Topography much like that of the Ridge and Valley is found in the Ouachita Mountains of western Arkansas and eastern Oklahoma, an area generally thought to be a detached continuation of Appalachian geologic structure, the intervening section buried beneath the sediments of the lower Mississippi valley.

      The once-glaciated New England section of the Appalachians is divided from the rest of the chain by an indentation of the Atlantic. Although almost completely underlain by crystalline rocks, New England is laid out in north–south bands, reminiscent of the southern Appalachians. The rolling, rocky hills of southeastern New England are not dissimilar to the Piedmont, while, farther northwest, the rugged and lofty White Mountains are a New England analogue to the Blue Ridge. (Mount Washington, N.H., at 6,288 feet [1,917 metres], is the highest peak in the northeastern United States.) The westernmost ranges—the Taconics (Taconic Range), Berkshires (Berkshire Hills), and Green Mountains—show a strong north–south lineation like the Ridge and Valley. Unlike in the rest of the Appalachians, however, glaciation has scoured New England's crystalline rocks much as it did those of the Canadian Shield, so that the region is best known for its picturesque landscape, not for its fertile soil.

      Typical of diverse geologic regions, the Appalachians contain a great variety of minerals. Only a few occur in quantities large enough for sustained exploitation, notably iron in Pennsylvania's Blue Ridge and Piedmont and the famous granites, marbles, and slates of northern New England. In Pennsylvania the Ridge and Valley region contains one of the world's largest deposits of anthracite coal, once the basis of a thriving mining economy; many of the mines are now shut, oil and gas having replaced coal as the major fuel used to heat homes.

The Atlantic Plain
      The eastern and southeastern fringes of the United States are part of the outermost margins of the continental platform, repeatedly invaded by the sea and veneered with layer after layer of young, poorly consolidated sediments. Part of this platform now lies slightly above sea level and forms a nearly flat and often swampy coastal plain, which stretches from Cape Cod, Mass., to beyond the Mexican border. Most of the platform, however, is still submerged, so that a band of shallow water, the continental shelf, parallels the Atlantic and Gulf coasts, in some places reaching 250 miles out to sea.

      The Atlantic Plain slopes so gently that even slight crustal upwarping can shift the coastline far out to sea at the expense of the continental shelf. The peninsula of Florida is just such an upwarp; nowhere in its 400-mile length does the land rise more than 350 feet above sea level, and much of the southern and coastal area lies less than 10 feet above sea level, poorly drained and dangerously exposed to Atlantic storms. Downwarps can result in extensive flooding. North of New York City, for example, the weight of glacial ice depressed most of the Coastal Plain beneath the sea, and the Atlantic now beats directly against New England's rock-ribbed coasts. Cape Cod, Long Island (N.Y.), and a few offshore islands are all that remain of New England's drowned Coastal Plain. Another downwarp lies perpendicular to the Gulf coast and guides the course of the lower Mississippi. The river, however, has filled with alluvium what otherwise would be an arm of the Gulf, forming a great inland salient of the Coastal Plain called the Mississippi Embayment.

      South of New York the Coastal Plain gradually widens, but ocean water has invaded the lower valleys of most of the coastal rivers and has turned them into estuaries. The greatest of these is Chesapeake Bay, merely the flooded lower valley of the Susquehanna River and its tributaries, but there are hundreds of others. Offshore a line of sandbars and barrier beaches stretches intermittently the length of the Coastal Plain, hampering entry of shipping into the estuaries but providing the eastern United States with a playground that is more than 1,000 miles long.

      Poor soils are the rule on the Coastal Plain, though rare exceptions have formed some of America's most famous agricultural regions—for example, the citrus country of central Florida's limestone uplands and the Cotton Belt of the Old South, once centred on the alluvial plain of the Mississippi and on belts of chalky black soils in eastern Texas, Alabama, and Mississippi. The Atlantic Plain's greatest natural wealth derives from petroleum and natural gas trapped in domal structures that dot the Gulf Coast of eastern Texas and Louisiana. Onshore and offshore drilling have revealed colossal reserves of oil and natural gas.

The Western Cordillera
      West of the Great Plains the United States seems to become a craggy land whose skyline is rarely without mountains—totally different from the open plains and rounded hills of the East. On a map the alignment of the two main chains—the Rocky Mountains on the east, the Pacific ranges on the west—tempts one to assume a geologic and hence topographic homogeneity. Nothing could be farther from the truth, for each chain is divided into widely disparate sections.

      The Rockies are typically diverse. The Southern Rockies are composed of a disconnected series of lofty elongated upwarps, their cores made of granitic basement rocks, stripped of sediments, and heavily glaciated at high elevations. In New Mexico and along the western flanks of the Colorado ranges, widespread volcanism and deformation of colourful sedimentary rocks have produced rugged and picturesque country, but the characteristic central Colorado or southern Wyoming range is impressively austere rather than spectacular. The Front Range west of Denver is prototypical, rising abruptly from its base at about 6,000 feet to rolling alpine meadows between 11,000 and 12,000 feet. Peaks appear as low hills perched on this high-level surface, so that Colorado, for example, boasts 53 mountains over 14,000 feet but not one over 14,500 feet.

      The Middle Rockies cover most of west central Wyoming. Most of the ranges resemble the granitic upwarps of Colorado, but thrust faulting and volcanism have produced varied and spectacular country to the west, some of which is included in Grand Teton and Yellowstone national parks. Much of the subregion, however, is not mountainous at all but consists of extensive intermontane basins and plains—largely floored with enormous volumes of sedimentary waste eroded from the mountains themselves. Whole ranges have been buried, producing the greatest gap in the Cordilleran system, the Wyoming Basin—resembling in geologic structure and topography an intermontane peninsula of the Great Plains. As a result, the Rockies have never posed an important barrier to east–west transportation in the United States; all major routes, from the Oregon Trail to interstate highways, funnel through the basin, essentially circumventing the main ranges of the Rockies.

      The Northern Rockies contain the most varied mountain landscapes of the Cordillera, reflecting a corresponding geologic complexity. The region's backbone is a mighty series of batholiths—huge masses of molten rock that slowly cooled below the surface and were later uplifted. The batholiths are eroded into rugged granitic ranges, which, in central Idaho, compose the most extensive wilderness country in the coterminous United States. East of the batholiths and opposite the Great Plains, sediments have been folded and thrust-faulted into a series of linear north–south ranges, a southern extension of the spectacular Canadian Rockies. Although elevations run 2,000 to 3,000 feet lower than the Colorado Rockies (most of the Idaho Rockies lie well below 10,000 feet), increased rainfall and northern latitude have encouraged glaciation—there as elsewhere a sculptor of handsome alpine landscape.

      The western branch of the Cordillera directly abuts the Pacific Ocean. This coastal chain, like its Rocky Mountain cousins on the eastern flank of the Cordillera, conceals bewildering complexity behind a facade of apparent simplicity. At first glance the chain consists merely of two lines of mountains with a discontinuous trough between them. Immediately behind the coast is a line of hills and low mountains—the Pacific Coast Ranges. Farther inland, averaging 150 miles from the coast, the line of the Sierra Nevada and the Cascade Range includes the highest elevations in the coterminous United States. Between these two unequal mountain lines is a discontinuous trench, the Troughs of the Coastal Margin.

 The apparent simplicity disappears under the most cursory examination. The Pacific Coast Ranges actually contain five distinct sections, each of different geologic origin and each with its own distinctive topography. The Transverse Ranges of southern California are a crowded assemblage of islandlike faulted ranges, with peak elevations of more than 10,000 feet but sufficiently separated by plains and low passes so that travel through them is easy. From Point Conception to the Oregon border, however, the main California Coast Ranges are entirely different, resembling the Appalachian Ridge and Valley region, with low linear ranges that result from erosion of faulted and folded rocks. Major faults run parallel to the low ridges, and the greatest—the notorious San Andreas Fault—was responsible for the earthquake that all but destroyed San Francisco in 1906. Along the California–Oregon border, everything changes again. In this region, the wildly rugged Klamath Mountains represent a western salient of interior structure reminiscent of the Idaho Rockies and the northern Sierra Nevada. In western Oregon and southwestern Washington the Coast Ranges are also different—a gentle, hilly land carved by streams from a broad arch of marine deposits interbedded with tabular lavas. In the northernmost part of the Coast Ranges and the remote northwest, a domal upwarp has produced the Olympic Mountains; its serrated peaks tower nearly 8,000 feet above Puget Sound and the Pacific, and the heavy precipitation on its upper slopes supports the largest active glaciers in the United States outside of Alaska.

      East of these Pacific Coast Ranges the Troughs of the Coastal Margin contain the only extensive lowland plains of the Pacific margin—California's Central Valley, Oregon's Willamette River valley, and the half-drowned basin of Puget Sound in Washington. These lowlands are parts of an inland trench that extends for great distances along the eastern rim of the Pacific; similar valleys occur in areas as diverse as Chile and the Alaska panhandle. These valleys are blessed with superior soils, easily irrigated, and very accessible from the Pacific. They have enticed settlers for more than a century and have become the main centres of population and economic activity for much of the U.S. West Coast.

      Still farther east rise the two highest mountain chains in the coterminous United States—the Cascades and the Sierra Nevada. Aside from elevation, geographic continuity, and spectacular scenery, however, the two ranges differ in almost every important respect. Except for its northern section, where sedimentary and metamorphic rocks occur, the Sierra Nevada is largely made of granite, part of the same batholithic chain that creates the Idaho Rockies. The range is grossly asymmetrical, the result of massive faulting that has gently tilted the western slopes toward the Central Valley but has uplifted the eastern side to confront the interior with an escarpment nearly two miles high. At high elevation glaciers have scoured the granites to a gleaming white, while on the west the ice has carved spectacular valleys such as the Yosemite. The loftiest peak in the Sierra Nevada is Mount Whitney, which at 14,494 feet (4,418 metres) is the highest mountain in the coterminous states. The upfaulting that produced Mount Whitney was accompanied by downfaulting that formed nearby Death Valley, at 282 feet (86 metres) below sea level the lowest point in North America.
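The elevation figures throughout this article pair feet with rounded metres. As a quick arithmetic check, a short script (a minimal sketch using the exact international factor of 0.3048 metres per foot; the sample figures are the two extremes just cited) reproduces the metric values:

```python
# Verify the foot-to-metre conversions quoted in the text.
# 1 foot = 0.3048 metres exactly (international definition).
FT_TO_M = 0.3048

elevations_ft = {
    "Mount Whitney": 14_494,   # highest point in the coterminous U.S.
    "Death Valley": -282,      # lowest point in North America
}

for name, feet in elevations_ft.items():
    metres = round(feet * FT_TO_M)
    print(f"{name}: {feet:,} ft = {metres:,} m")
```

Both results agree with the rounded metric figures given in parentheses above: 4,418 metres for Mount Whitney and 86 metres below sea level for Death Valley.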

      The Cascades (Cascade Range) are made largely of volcanic rock; those in northern Washington contain granite like the Sierras, but the rest are formed from relatively recent lava outpourings of dun-coloured basalt and andesite. The Cascades are in effect two ranges. The lower, older range is a long belt of upwarped lava, rising unspectacularly to elevations between 6,000 and 8,000 feet. Perched above the “low Cascades” is a chain of lofty volcanoes that punctuate the horizon with magnificent glacier-clad peaks. The highest is Mount Rainier, which at 14,410 feet (4,392 metres) is all the more dramatic for rising from near sea level. Most of these volcanoes are quiescent, but they are far from extinct. Mount Lassen (Lassen Peak) in northern California erupted violently in 1914, as did Mount St. Helens in the state of Washington in 1980. Most of the other high Cascade volcanoes exhibit some sign of seismic activity.

The Western Intermontane Region
 The Cordillera's two main chains enclose a vast intermontane region of arid basins, plateaus, and isolated mountain ranges that stretches from the Mexican border nearly to Canada and extends 600 miles from east to west. This enormous territory contains three huge subregions, each with a distinctive geologic history and its own striking topography.

      The Colorado Plateau, nestled against the western flanks of the Southern Rockies, is an extraordinary island of geologic stability set in the turbulent sea of Cordilleran tectonic activity. Stability was not absolute, of course, so that parts of the plateau are warped and injected with volcanics, but in general the landscape results from the erosion by streams of nearly flat-lying sedimentary rocks. The result is a mosaic of angular mesas, buttes, and steplike canyons intricately cut from rocks that often are vividly coloured. Large areas of the plateau are so improbably picturesque that they have been set aside as national preserves. The Grand Canyon of the Colorado River is the most famous of several dozen such areas.

      West of the plateau and abutting the Sierra Nevada's eastern escarpment lies the arid Basin and Range subregion, among the most remarkable topographic provinces of the United States. The Basin and Range extends from southern Oregon and Idaho into northern Mexico. Rocks of great complexity have been broken by faulting, and the resulting blocks have tumbled, eroded, and been partly buried by lava and alluvial debris accumulating in the desert basins. The eroded blocks form mountain ranges that are characteristically dozens of miles long and several thousand feet from base to crest, with peaks that rarely rise above 10,000 feet, and they are almost always aligned roughly north–south. The basin floors are typically alluvium and sometimes salt marshes or alkali flats.

      The third intermontane region, the Columbia Basin, is geologically the youngest, for in some parts its rocks are still being formed. Its entire area is underlain by innumerable tabular lava flows that have flooded the basin between the Cascades and Northern Rockies to undetermined depths. The volume of lava must be measured in thousands of cubic miles, for the flows blanket large parts of Washington, Oregon, and Idaho and in southern Idaho have drowned the flanks of the Northern Rocky Mountains in a basaltic sea. Where the lavas are fresh, as in southern Idaho, the surface is often nearly flat, but more often the floors have been trenched by rivers—conspicuously the Columbia and the Snake—or by glacial floodwaters that have carved an intricate system of braided canyons in the remarkable Channeled Scablands of eastern Washington. In surface form the eroded lava often resembles the topography of the Colorado Plateau, but the gaudy colours of the Colorado are replaced here by the sombre black and rusty brown of weathered basalt.

      Most large mountain systems are sources of varied mineral wealth, and the American Cordillera is no exception. Metallic minerals have been taken from most crystalline regions and have furnished the United States with both romance and wealth—the Sierra Nevada gold that provoked the 1849 gold rush, the fabulous silver lodes of western Nevada's Basin and Range, and gold strikes all along the Rocky Mountain chain. Industrial metals, however, are now far more important; copper and lead are among the base metals, and the more exotic molybdenum, vanadium, and cadmium are mainly useful in alloys.

      In the Cordillera, as elsewhere, the greatest wealth stems from fuels. Most major basins contain oil and natural gas, conspicuously the Wyoming Basin, the Central Valley of California, and the Los Angeles Basin. The Colorado Plateau, however, has yielded some of the most interesting discoveries—considerable deposits of uranium and colossal occurrences of oil shale. Oil from the shale, however, probably cannot be economically removed without widespread strip-mining and correspondingly large-scale damage to the environment. Wide exploitation of low-sulfur bituminous coal has been initiated in the Four Corners area of the Colorado Plateau, and open-pit mining has already devastated parts of this once-pristine country as completely as it has West Virginia.

      As befits a nation of continental proportions, the United States has an extraordinary network of rivers and lakes, including some of the largest and most useful in the world. In the humid East they provide an enormous mileage of cheap inland transportation; westward, most rivers and streams are unnavigable but are heavily used for irrigation and power generation. Both East and West, however, traditionally have used lakes and streams as public sewers, and despite efforts to clean them up, most large waterways are laden with vast, poisonous volumes of industrial, agricultural, and human wastes.

The Eastern systems
      Chief among U.S. rivers is the Mississippi, which, with its great tributaries, the Ohio and the Missouri, drains most of the midcontinent. The Mississippi is navigable to Minneapolis, nearly 1,200 miles by air from the Gulf of Mexico, and along with the Great Lakes–St. Lawrence system it forms the world's greatest network of inland waterways. The Mississippi's eastern branches, chiefly the Ohio and the Tennessee, are also navigable for great distances. From the west, however, many of its numerous Great Plains tributaries are too seasonal and choked with sandbars to be used for shipping. The Missouri, for example, though longer than the Mississippi itself, was essentially without navigation until the mid-20th century, when a combination of dams, locks, and dredging opened the river to barge traffic.

      The Great Lakes–St. Lawrence system, the other half of the midcontinental inland waterway, is connected to the Mississippi–Ohio via Chicago by canals and the Illinois River. The five Great Lakes (four of which are shared with Canada) constitute by far the largest freshwater lake group in the world and carry a larger tonnage of shipping than any other. The three main barriers to navigation—the St. Marys Rapids, at Sault Sainte Marie; Niagara Falls; and the rapids of the St. Lawrence—are all bypassed by locks, whose 27-foot draft lets ocean vessels penetrate 1,300 miles into the continent, as far as Duluth, Minnesota, and Chicago.

      The third group of Eastern rivers drains the coastal strip along the Atlantic Ocean and the Gulf of Mexico. Except for the Rio Grande, which rises west of the Rockies and flows about 1,900 circuitous miles to the Gulf, few of these coastal rivers measure more than 300 miles, and most flow in an almost straight line to the sea. Except in glaciated New England and in arid southwestern Texas, most of the larger coastal streams are navigable for some distance.

The Pacific systems
      West of the Rockies, nearly all of the rivers are strongly influenced by aridity. In the deserts and steppes of the intermontane basins, most of the scanty runoff disappears into interior basins, only one of which, the Great Salt Lake, holds any substantial volume of water. Aside from a few minor coastal streams, only three large river systems manage to reach the sea—the Columbia, the Colorado, and the San Joaquin–Sacramento system of California's Central Valley. All three of these river systems are exotic: that is, they flow for considerable distances across dry lands from which they receive little water. Both the Columbia and the Colorado have carved awesome gorges, the former through the sombre lavas of the Cascades and the Columbia Basin, the latter through the brilliantly coloured rocks of the Colorado Plateau. These gorges lend themselves to easy damming, and the once-wild Columbia has been turned into a stairway of placid lakes whose waters irrigate the arid plateaus of eastern Washington and power one of the world's largest hydroelectric networks. The Colorado is less extensively developed, and proposals for new dam construction have met fierce opposition from those who want to preserve the spectacular natural beauty of the river's canyon lands.

      Climate affects human habitats both directly and indirectly through its influence on vegetation, soils, and wildlife. In the United States, however, the natural environment has been altered drastically by nearly four centuries of European settlement, as well as thousands of years of Indian occupancy.

      Wherever land is abandoned, however, “wild” conditions return rapidly, achieving over the long run a dynamic equilibrium among soils, vegetation, and the inexorable strictures of climate. Thus, though Americans have created an artificial environment of continental proportions, the United States still can be divided into a mosaic of bioclimatic regions, each of them distinguished by peculiar climatic conditions and each with a potential vegetation and soil that eventually would return in the absence of humans. The main exception to this generalization applies to fauna, so drastically altered that it is almost impossible to know what sort of animal geography would redevelop in the areas of the United States if humans were removed from the scene.

Climatic controls
      The pattern of U.S. climates is largely set by the location of the coterminous United States almost entirely in the middle latitudes, by its position with respect to the continental landmass and its fringing oceans, and by the nation's gross pattern of mountains and lowlands. Each of these geographic controls operates to determine the character of air masses and their changing behaviour from season to season.

      The coterminous United States lies entirely between the tropic of Cancer and 50° N latitude, a position that confines Arctic climates to the high mountaintops and genuine tropics to a small part of southern Florida. By no means, however, is the climate literally temperate, for the middle latitudes are notorious for extreme variations of temperature and precipitation.

      The great size of the North American landmass tends to reinforce these extremes. Since land heats and cools more rapidly than bodies of water, places distant from an ocean tend to have continental climates; that is, they alternate between extremes of hot summers and cold winters, in contrast to the marine climates, which are more equable. Most U.S. climates are markedly continental, the more so because the Cordillera effectively confines the moderating Pacific influence to a narrow strip along the West Coast. Extremes of continentality occur near the centre of the country, and in North Dakota temperatures have ranged between a summer high record of 121 °F (49 °C) and a winter low of −60 °F (−51 °C). Moreover, the general eastward drift of air over the United States carries continental temperatures all the way to the Atlantic coast. Bismarck, N.D., for example, has a great annual temperature range. Boston, on the Atlantic but largely exempt from its influence, has a lesser but still-continental range, while San Francisco, which is under strong Pacific influence, has only a small summer–winter differential.

      In addition to confining Pacific temperatures to the coastal margin, the Pacific Coast Ranges are high enough to make a local rain shadow in their lee, although the main barrier is the great rampart formed by the Sierra Nevada and Cascade ranges. Rainy on their western slopes and barren on the east, this mountain crest forms one of the sharpest climatic divides in the United States.

      The rain shadow continues east to the Rockies, leaving the entire Intermontane Region either arid or semiarid, except where isolated ranges manage to capture leftover moisture at high altitudes. East of the Rockies the westerly drift brings mainly dry air, and as a result, the Great Plains are semiarid. Still farther east, humidity increases owing to the frequent incursion from the south of warm, moist, and unstable air from the Gulf of Mexico, which produces more precipitation in the United States than the Pacific and Atlantic oceans combined.

      Although the landforms of the Interior Lowlands have been termed dull, there is nothing dull about their weather conditions. Air from the Gulf of Mexico can flow northward across the Great Plains, uninterrupted by topographical barriers, but continental Canadian air flows south by the same route, and, since these two air masses differ in every important respect, the collisions often produce disturbances of monumental violence. Plainsmen and Midwesterners are accustomed to sudden displays of furious weather—tornadoes, blizzards, hailstorms, precipitous drops and rises in temperature, and a host of other spectacular meteorological displays, sometimes dangerous but seldom boring.

The change of seasons
      Most of the United States is marked by sharp differences between winter and summer. In winter, when temperature contrasts between land and water are greatest, huge masses of frigid, dry Canadian air periodically spread far south over the midcontinent, bringing cold, sparkling weather to the interior and generating great cyclonic storms where their leading edges confront the shrunken mass of warm Gulf air to the south. Although such cyclonic activity occurs throughout the year, it is most frequent and intense during the winter, parading eastward out of the Great Plains to bring the Eastern states practically all their winter precipitation. Winter temperatures differ widely, depending largely on latitude. Thus, New Orleans, La., at 30° N latitude, and International Falls, Minn., at 49° N, have respective January temperature averages of 55 °F (13 °C) and 3 °F (−16 °C). In the north, therefore, precipitation often comes as snow, often driven by furious winds; farther south, cold rain alternates with sleet and occasional snow. Southern Florida is the only dependably warm part of the East, though “polar outbursts” have been known to bring temperatures below 0 °F (−18 °C) as far south as Tallahassee. The main uniformity of Eastern weather in wintertime is the expectation of frequent change.

      Winter climate on the West Coast is very different. A great spiraling mass of relatively warm, moist air spreads south from the Aleutian Islands of Alaska, its semipermanent front producing gloomy overcast and drizzles that hang over the Pacific Northwest all winter long, occasionally reaching southern California, which receives nearly all of its rain at this time of year. This Pacific air brings mild temperatures along the length of the coast; the average January day in Seattle, Wash., ranges between 33 and 44 °F (1 and 7 °C) and in Los Angeles between 45 and 64 °F (7 and 18 °C). In southern California, however, rains are separated by long spells of fair weather, and the whole region is a winter haven for those seeking refuge from less agreeable weather in other parts of the country. The Intermontane Region is similar to the Pacific Coast, but with much less rainfall and a considerably wider range of temperatures.

      During the summer there is a reversal of the air masses, and east of the Rockies the change resembles the summer monsoon of Southeast Asia. As the midcontinent heats up, the cold Canadian air mass weakens and retreats, pushed north by an aggressive mass of warm, moist air from the Gulf. The great winter temperature differential between North and South disappears as the hot, soggy blanket spreads from the Gulf coast to the Canadian border. Heat and humidity are naturally most oppressive in the South, but there is little comfort in the more northern latitudes. In Houston, Texas, the temperature on a typical July day reaches 93 °F (34 °C), with relative humidity averaging near 75 percent, but Minneapolis, Minn., more than 1,000 miles north, is only slightly cooler and less humid.

      Since the Gulf air is unstable as well as wet, convectional and frontal summer thunderstorms are endemic east of the Rockies, accounting for a majority of total summer rain. These storms usually drench small areas with short-lived, sometimes violent downpours, so that crops in one Midwestern county may prosper, those in another shrivel in drought, and those in yet another be flattened by hailstones. Relief from the humid heat comes in the northern Midwest from occasional outbursts of cool Canadian air; small but more consistent relief is found downwind from the Great Lakes and at high elevations in the Appalachians. East of the Rockies, however, U.S. summers are distinctly uncomfortable, and air conditioning is viewed as a desirable amenity in most areas.

      Again, the Pacific regime is different. The moist Aleutian air retreats northward, to be replaced by mild, stable air from over the subtropical but cool waters of the Pacific, and except in the mountains the Pacific Coast is nearly rainless though often foggy. In the meanwhile, a small but potent mass of dry hot air raises temperatures to blistering levels over much of the intermontane Southwest. In Yuma, Ariz., for example, the normal temperature in July reaches 107 °F (42 °C), while nearby Death Valley, Calif., holds the national record, 134 °F (57 °C). During its summer peak this scorching air mass spreads from the Pacific margin as far as Texas on the east and Idaho to the north, turning the whole interior basin into a summer desert.

      Over most of the United States, as in most continental climates, spring and autumn are agreeable but disappointingly brief. Autumn is particularly idyllic in the East, with a romantic Indian summer of ripening corn and brilliantly coloured foliage and of mild days and frosty nights. The shift in dominance between marine and continental air masses, however, spawns furious weather in some regions. Along the Atlantic and Gulf coasts, for example, autumn is the season for hurricanes—the American equivalent of typhoons of the Asian Pacific—which rage northward from the warm tropics to create havoc along the Gulf and Atlantic coasts as far north as New England. The Mississippi valley holds the dubious distinction of recording more tornadoes than any other area on Earth. These violent and often deadly storms usually occur over relatively small areas and are confined largely to spring and early summer.

The bioclimatic regions
      Three first-order bioclimatic zones encompass most of the coterminous United States—regions in which climatic conditions are similar enough to dictate similar conditions of mature (zonal) soil and potential climax vegetation (i.e., the assemblage of plants that would grow and reproduce indefinitely given stable climate and average conditions of soil and drainage). These are the Humid East, the Humid Pacific Coast, and the Dry West. In addition, the boundary zone between the Humid East and the Dry West is so large and important that it constitutes a separate region, the Humid–Arid Transition. Finally, because the Western Cordillera contains an intricate mosaic of climatic types, largely determined by local elevation and exposure, it is useful to distinguish the Western Mountain Climate. The first three zones, however, are very diverse and require further breakdown, producing a total of 10 main bioclimatic regions. For two reasons, the boundaries of these bioclimatic regions are much less distinct than boundaries of landform regions. First, climate varies from year to year, especially in boundary zones, whereas landforms obviously do not. Second, regions of climate, vegetation, and soils coincide generally but sometimes not precisely. Boundaries, therefore, should be interpreted as zonal and transitional, and rarely should be considered as sharp lines in the landscape.

      For all of their indistinct boundaries, however, these bioclimatic regions have strong and easily recognized identities. Such regional identity is strongly reinforced when a particular area falls entirely within a single bioclimatic region and at the same time a single landform region. The result—as in the Piedmont South, the central Midwest, or the western Great Plains—is a landscape with an unmistakable regional personality.

The Humid East
      The largest and in some ways the most important of the bioclimatic zones, the Humid East was where the Europeans first settled, tamed the land, and adapted to American conditions. In early times almost all of this territory was forested, a fact of central importance in American history that profoundly influenced both soils and wildlife. As in most of the world's humid lands, soluble minerals have been leached from the earth, leaving a great family of soils called pedalfers, rich in relatively insoluble iron and aluminum compounds.

      Both forests and soils, however, differ considerably within this vast region. Since rainfall is ample and summers are warm everywhere, the main differences result from the length and severity of winters, which determine the length of the growing season. Winter, obviously, differs according to latitude, so that the Humid East is sliced into four great east–west bands of soils and vegetation, with progressively more amenable winters as one travels southward. These changes occur very gradually, however, and the boundaries therefore are extremely subtle.

      The Sub-Boreal Forest Region is the northernmost of these bands. It is only a small and discontinuous part of the United States, representing the tattered southern fringe of the vast Canadian taiga—a scrubby forest dominated by evergreen needle-leaf species that can endure the ferocious winters and reproduce during the short, erratic summers. Average growing seasons are less than 120 days, though localities in Michigan's Upper Peninsula have recorded frost-free periods lasting as long as 161 days and as short as 76 days. Soils of this region that survived the scour of glaciation are miserably thin podzols—heavily leached, highly acid, and often interrupted by extensive stretches of bog. Most attempts at farming in the region long since have been abandoned.

      Farther south lies the Humid Microthermal Zone of milder winters and longer summers. Large broadleaf trees begin to predominate over the evergreens, producing a mixed forest of greater floristic variety and economic value that is famous for its brilliant autumn colours. As the forest grows richer in species, sterile podzols give way to more productive gray-brown podzolic soils, stained and fertilized with humus. Although winters are warmer than in the Sub-Boreal zone, and although the Great Lakes help temper the bitterest cold, January temperatures ordinarily average below freezing, and a winter without a few days of subzero temperatures is uncommon. Everywhere, the ground is solidly frozen and snow covered for several months of the year.

      Still farther south are the Humid Subtropics. The region's northern boundary is one of the country's most significant climatic lines: the approximate northern limit of a growing season of 180–200 days, the outer margin of cotton growing, and, hence, of the Old South. Most of the South lies in the Piedmont and Coastal Plain, for higher elevations in the Appalachians cause a peninsula of Northern mixed forest to extend as far south as northern Georgia. The red-brown podzolic soil, once moderately fertile, has been severely damaged by overcropping and burning. Thus much of the region that once sustained a rich, broadleaf-forest flora now supports poor piney woods. Throughout the South, summers are hot, muggy, long, and disagreeable; Dixie's “frosty mornings” bring a welcome respite in winter.

      The southern margins of Florida contain the only real tropics in the coterminous United States; it is an area in which frost is almost unknown. Hot, rainy summers alternate with warm and somewhat drier winters, with a secondary rainfall peak during the autumn hurricane season—altogether a typical monsoonal regime. Soils and vegetation are mostly immature, however, since southern Florida rises so slightly above sea level that substantial areas, such as the Everglades, are swampy and often brackish. Peat and sand frequently masquerade as soil, and much of the vegetation is either salt-loving mangrove or sawgrass prairie.

The Humid Pacific Coast
      The western humid region differs from its eastern counterpart in so many ways as to be a world apart. Much smaller, it is crammed into a narrow littoral belt to the windward of the Sierra–Cascade summit, dominated by mild Pacific air, and chopped by irregular topography into an intricate mosaic of climatic and biotic habitats. Throughout the region rainfall is extremely seasonal, falling mostly in the winter half of the year. Summers are droughty everywhere, but the main regional differences come from the length of drought—from about two months in humid Seattle, Wash., to nearly five months in semiarid San Diego, Calif.

      Western Washington, Oregon, and northern California lie within a zone that climatologists call Marine West Coast. Winters are raw, overcast, and drizzly—not unlike northwestern Europe—with subfreezing temperatures restricted mainly to the mountains, upon which enormous snow accumulations produce local alpine glaciers. Summers, by contrast, are brilliantly cloudless, cool, and frequently foggy along the West Coast and somewhat warmer in the inland valleys. This mild marine climate produces some of the world's greatest forests of enormous straight-boled evergreen trees that furnish the United States with much of its commercial timber. Mature soils are typical of humid midlatitude forestlands, a moderately leached gray-brown podzol.

      Toward the south, with diminishing coastal rain the moist marine climate gradually gives way to California's tiny but much-publicized Mediterranean regime. Although mountainous topography introduces a bewildering variety of local environments, scanty winter rains are quite inadequate to compensate for the long summer drought, and much of the region has a distinctly arid character. For much of the year, cool, stable Pacific air dominates the West Coast, bringing San Francisco its famous fogs and Los Angeles its infamous smoggy temperature inversions. Inland, however, summer temperatures reach blistering levels, so that in July, while Los Angeles expects a normal daily maximum of 83 °F (28 °C), Fresno expects 100 °F (38 °C) and is climatically a desert. As might be expected, Mediterranean California contains a huge variety of vegetal habitats, but the commonest perhaps is the chaparral, a drought-resistant, scrubby woodland of twisted hard-leafed trees, picturesque but of little economic value. Chaparral is a pyrophytic (fire-loving) vegetation—i.e., under natural conditions its growth and form depend on regular burning. These fires constitute a major environmental hazard in the suburban hills above Los Angeles and San Francisco Bay, especially in autumn, when hot dry Santa Ana winds from the interior regularly convert brush fires into infernos. Soils are similarly varied, but most of them are light in colour and rich in soluble minerals, qualities typical of subarid soils.

The Dry West
      In the United States, to speak of dry areas is to speak of the West. It covers an enormous region beyond the dependable reach of moist oceanic air, occupying the entire Intermontane area and sprawling from Canada to Mexico across the western part of the Great Plains. To Americans nurtured in the Humid East, this vast territory across the path of all transcontinental travelers has been harder to tame than any other—and no region has so gripped the national imagination as this fierce and dangerous land.

      In the Dry West nothing matters more than water. Thus, though temperatures may differ radically from place to place, the really important regional differences depend overwhelmingly on the degree of aridity, whether an area is extremely dry and hence desert or semiarid and therefore steppe.

      Americans of the 19th century were preoccupied by the myth of a Great American Desert, which supposedly occupied more than one-third of the entire country. True desert, however, is confined to the Southwest, with patchy outliers elsewhere, all without exception located in the lowland rain shadows of the Cordillera. Vegetation in these desert areas varies from nothing at all (a rare circumstance confined mainly to salt flats and sand dunes) to a low cover of scattered woody scrub and short-lived annuals that burst into flamboyant bloom after rains. Soils are usually thin, light-coloured, and rich in mineral salts. In some areas wind erosion has removed fine-grained material, leaving behind desert pavement, a barren veneer of broken rock.

      Most of the West, however, lies in the semiarid region, in which rainfall is scanty but adequate to support a thin cover of short bunchgrass, commonly alternating with scrubby brush. Here, as in the desert, soils fall into the large family of the pedocals, rich in calcium and other soluble minerals, but in the slightly wetter environments of the West, they are enriched with humus from decomposed grass roots. Under proper management, these chestnut-coloured steppe soils can be highly fertile.

      Weather in the West resembles that of other dry regions of the world, often extreme, violent, and reliably unreliable. Rainfall, for example, obeys a cruel natural law: as total precipitation decreases, it becomes more undependable. John Steinbeck's novel The Grapes of Wrath describes the problems of a family enticed to the arid frontier of Oklahoma during a wet period only to be driven out by the savage drought of the 1930s that turned the western Great Plains into the great American Dust Bowl. Temperatures in the West also fluctuate convulsively within short periods, and high winds are infamous throughout the region.

The Humid–Arid Transition
      East of the Rockies all climatic boundaries are gradational. None, however, is so important or so imperceptibly subtle as the boundary zone that separates the Humid East from the Dry West and that alternates unpredictably between arid and humid conditions from year to year. Stretching approximately from Texas to North Dakota in an ill-defined band between the 95th and 100th meridians, this transitional region deserves separate recognition, partly because of its great size, and partly because of the fine balance between surplus and deficit rainfall, which produces a unique and valuable combination of soils, flora, and fauna. The native vegetation, insofar as it can be reconstructed, was prairie, the legendary sea of tall, deep-rooted grass now almost entirely tilled and planted to grains. Soils, often of loessial derivation, include the enormously productive chernozem (black earth) in the north, with reddish prairie soils of nearly equal fertility in the south. Throughout the region temperatures are severely continental, with bitterly cold winters in the north and scorching summers everywhere.

      The western edge of the prairie fades gradually into the shortgrass steppe of the High Plains, the change a function of diminishing rainfall. The eastern edge, however, represents one of the few major discordances between a climatic and biotic boundary in the United States, for the grassland penetrates the eastern forest in a great salient across humid Illinois and Indiana. Many scholars believe this part of the prairie was artificially induced by repeated burning and consequent destruction of the forest margins by Indians.

The Western mountains
      Throughout the Cordillera and Intermontane regions, irregular topography shatters the grand bioclimatic pattern into an intricate mosaic of tiny regions that differ drastically according to elevation and exposure. No small- or medium-scale map can accurately record such complexity, and mountainous parts of the West are said, noncommittally, to have a “mountain climate.” Lowlands are usually dry, but increasing elevation brings lower temperature, decreased evaporation, and—if a slope faces prevailing winds—greater precipitation. Soils vary wildly from place to place, but vegetation is fairly predictable. From the desert or steppe of intermontane valleys, a climber typically ascends into parklike savanna, then through an orderly sequence of increasingly humid and boreal forests until, if the range is high enough, one reaches the timberline and Arctic tundra. The very highest peaks are snow-capped, although permanent glaciers rarely occur outside the cool humid highlands of the Pacific Northwest.

Peirce F. Lewis

Plant life
      The dominant features of the vegetation are indicated by the terms forest, grassland, desert, and alpine tundra.

      A coniferous forest of white and red pine, hemlock, spruce, jack pine, and balsam fir extends interruptedly in a narrow strip near the Canadian border from Maine to Minnesota and southward along the Appalachian Mountains. There may be found smaller stands of tamarack, spruce, paper birch, willow, alder, and aspen or poplar. Southward, a transition zone of mixed conifers and deciduous trees gives way to a hardwood forest of broad-leaved trees. This forest, with varying mixtures of maple, oak, ash, locust, linden, sweet gum, walnut, hickory, sycamore, beech, and the more southerly tulip tree, once extended uninterruptedly from New England to Missouri and eastern Texas. Pines are prominent on the Atlantic and Gulf coastal plain and adjacent uplands, often occurring in nearly pure stands called pine barrens. Pitch, longleaf, slash, shortleaf, Virginia, and loblolly pines are commonest. Hickory and various oaks combine to form a significant part of this forest, with magnolia, white cedar, and ash often seen. In the frequent swamps, bald cypress, tupelo, and white cedar predominate. Pines, palmettos, and live oaks are replaced at the southern tip of Florida by the more tropical royal and thatch palms, figs, satinwood, and mangrove.

      The grasslands occur principally in the Great Plains area and extend westward into the intermontane basins and benchlands of the Rocky Mountains. Numerous grasses such as buffalo, grama, side oat, bunch, needle, and wheat grass, together with many kinds of herbs, make up the plant cover. Coniferous forests cover the lesser mountains and high plateaus of the Rockies, Cascades, and Sierra Nevada. Ponderosa (yellow) pine, Douglas fir, western red cedar, western larch, white pine, lodgepole pine, several spruces, western hemlock, grand fir, red fir, and the lofty redwood are the principal trees of these forests. The densest growth occurs west of the Cascade and Coast ranges in Washington, Oregon, and northern California, where the trees are often 100 feet or more in height. There the forest floor is so dark that only ferns, mosses, and a few shade-loving shrubs and herbs may be found.

      The alpine tundra, located in the coterminous United States only in the mountains above the limit of trees, consists principally of small plants that bloom brilliantly for a short season. Sagebrush is the most common plant of the arid basins and semideserts west of the Rocky Mountains, but juniper, nut pine, and mountain mahogany are often found on the slopes and low ridges. The desert, extending from southeastern California to Texas, is noted for the many species of cactus, some of which grow to the height of trees, and for the Joshua tree and other yuccas, creosote bush, mesquite, and acacias.

      The United States is rich in the variety of its native forest trees, some of which, such as the species of sequoia, are the most massive known. More than 1,000 species and varieties have been described, of which almost 200 are of economic value, either because of the timber and other useful products that they yield or by reason of their importance in forestry.

      Besides the native flowering plants, estimated at between 20,000 and 25,000 species, many hundreds of species introduced from other regions—chiefly Europe, Asia, and tropical America—have become naturalized. A large proportion of these are common annual weeds of fields, pastures, and roadsides. In some districts these naturalized “aliens” constitute 50 percent or more of the total plant population.

Paul H. Oehser Reed C. Rollins Ed.

Animal life
      Along with most of North America, the United States lies in the Nearctic faunistic realm, a region containing an assemblage of species similar to that of Eurasia and North Africa but sharply different from those of the tropical and subtropical zones to the south. Main regional differences correspond roughly with primary climatic and vegetal patterns. Thus, for example, the animal communities of the Dry West differ sharply from those of the Humid East and from those of the Pacific Coast. Because animals tend to range over wider areas than plants, faunal regions are generally coarser than vegetal regions and harder to delineate sharply.

      The animal geography of the United States, however, is far from a natural pattern, for European settlement produced a series of environmental changes that grossly altered the distribution of animal communities. First, many species were hunted to extinction or near extinction, most conspicuously, perhaps, the American bison, which ranged by the millions nearly from coast to coast but now rarely lives outside of zoos and wildlife preserves. Second, habitats were upset or destroyed throughout most of the country—forests cut, grasslands plowed and overgrazed, and migration paths interrupted by fences, railroads, and highways. Third, certain introduced species found hospitable niches and, like the English sparrow, spread over huge areas, often preempting the habitats of native animals. Fourth, though their effects are not fully understood, chemical biocides such as DDT were used for so long and in such volume that they are believed at least partly responsible for catastrophic mortality rates among large mammals and birds, especially predators high on the food chain. Fifth, there has been a gradual northward migration of certain tropical and subtropical insects, birds, and mammals, perhaps encouraged by gradual climatic warming. In consequence, many native animals have been reduced to tiny fractions of their former ranges or exterminated completely, while other animals, both native and introduced, have found the new anthropocentric environment well suited to their needs, with explosive effects on their populations. The coyote, opossum, armadillo, and several species of deer are among the animals that now occupy much larger ranges than they once did.

Peirce F. Lewis
      Arranging the account of faunal distribution according to the climatic and vegetal regions has the merit that it can be compared further with the distribution of insects and of other invertebrates, some of which may be expected to fall into the same patterns as the vertebrates, while others, with different modes or different ages of dispersal, have geographic patterns of their own.

      The transcontinental zone of coniferous forest at the north, the taiga (boreal forest), and the tundra zone into which it merges at the northern limit of tree growth are strikingly paralleled by similar vertical zones in the Rockies and on Mount Washington in the east. There the area above the timberline and below the snow line is often inhabited by tundra animals such as the ptarmigan and the white Parnassius butterflies, while the spruce and other conifers below the timberline form a belt sharply set off from the grassland, hardwood forest, or desert at still lower altitudes.

      A whole series of important types of animals spread beyond the limits of such regions or zones, sometimes over most of the continent. Aquatic animals, in particular, may live equally in forest and plains, in the Gulf states, and at the Canadian border. Such widespread animals include the white-tailed (Virginia) deer and black bear, the puma (though only in the remotest parts of its former range) and bobcat, the river otter (though now rare in inland areas south of the Great Lakes) and mink, and the beaver and muskrat. The distinctive coyote ranges over all of western North America and eastward as far as Maine. The snapping turtle ranges from the Atlantic coast to the Rocky Mountains.

      In the northern coniferous forest zone, or taiga, the relations of animals with European or Eurasian representatives are numerous, and this zone is also essentially circumpolar. The relations are less close than in the Arctic forms, but the moose, beaver, hare, red fox, otter, wolverine, and wolf are recognizably related to Eurasian animals. Even some fishes, like the whitefishes (Coregonidae), the yellow perch, and the pike, exhibit this kind of Old World–New World relation. A distinctively North American animal in this taiga assemblage is the Canadian porcupine.

      The hardwood forest of the east and the pinelands of the southeast compose the most important of the faunal regions within the United States. A great variety of fishes, amphibians, and reptiles of this region have related forms in East Asia, and this pattern of representation is likewise found in the flora. This area is rich in catfishes, minnows, and suckers. The curious ganoid fishes, the bowfin and the gar, are ancient types. The spoonbill cat, a remarkable type of sturgeon in the lower Mississippi, is represented elsewhere in the world only in the Yangtze in China. The Appalachian region is headquarters for the salamanders of the world, with no fewer than seven of the eight families of this large group of amphibians represented; no other continent has more than three of the eight families together. The eellike sirens and amphiumas (congo snakes) are confined to the southeastern states. The lungless salamanders of the family Plethodontidae exhibit a remarkable variety of genera and a number of species centring in the Appalachians. There is a great variety of frogs, including tree frogs whose main development is South American and Australian. The emydid freshwater turtles of the southeast parallel those of East Asia to a remarkable degree, though the genus Clemmys is the only one represented in both regions. Much the same is true of the water snakes, pit vipers, rat snakes, and green snakes, though still others are peculiarly American. The familiar alligator has an Asiatic relative, the only other living true alligator being a species in central China.

      In its mammals and birds the southeastern fauna is less sharply distinguished from the life to the north and west and is less directly related to that of East Asia. The forest is the home of the white-tailed deer, the black bear, the gray fox, the raccoon, and the common opossum. The wild turkey and the extinct hosts of the passenger pigeon were characteristic. There is a remarkable variety of woodpeckers. The birdlife in general tends to differ from that of Eurasia in the presence of birds, like the tanagers, American orioles, and hummingbirds, that belong to South American families. Small mammals abound with types of the worldwide rodent family Cricetidae, and with distinctive moles and shrews.

      Most distinctive of the grassland animals proper is the American bison, whose nearly extinct European relative, the wisent, is a forest dweller. The most distinctive of the American hoofed animals is the pronghorn, or prongbuck, which represents a family intermediate between the deer and the true antelopes in that it sheds its horns like a deer but retains the bony horn cores. The pronghorn is perhaps primarily a desert mammal, but it formerly ranged widely into the shortgrass plains. Everywhere in open country in the West there are conspicuous and distinctive rodents. The burrowing pocket gopher is peculiarly American; rarely seen itself, it makes its presence known by pushed-out mounds of earth. The ground squirrels of the genus Citellus are related to those of Central Asia and resemble them in habit; in North America the gregarious prairie dog is a closely related form. The American badger, not especially related to the badger of Europe, has its headquarters in the grasslands. The prairie chicken is a bird distinctive of the plains region, which is invaded everywhere by birds from both the east and the west.

      The Southwestern deserts are a paradise for reptiles. Distinctive lizards such as the venomous Gila monster abound, and the rattlesnakes, of which only a few species are found elsewhere in the United States, are common there. Desert reptile species often range to the Pacific Coast and northward into the Great Basin. Noteworthy mammals are the graceful bipedal kangaroo rat, which is almost exclusively nocturnal; the ring-tailed cat, a relative of the raccoon; and the piglike peccary.

      The Rocky Mountains and other western ranges afford distinctive habitats for rock- and cliff-dwelling hoofed animals and rodents. The small pikas, related to the rabbit, inhabit talus areas at high altitudes as they do in the mountain ranges of East Asia. Marmots live in the Rockies as in the Alps. Every western range formerly had its own race of mountain sheep. At the north the Rocky Mountain goat lives at high altitudes—it is more properly a goat antelope, related to the takin of the mountains of western China. The dipper, remarkable for its habit of feeding in swift-flowing streams, though otherwise a bird without special aquatic adaptations, is a Rocky Mountain form with relatives in Asia and Europe.

      In the Pacific region the extremely distinctive primitive tailed frog Ascaphus, which inhabits icy mountain brooks, represents a family by itself, perhaps more nearly related to the frogs of New Zealand than to more familiar types. The Cascades and Sierras form centres for salamanders of the families Ambystomatidae and Plethodontidae second only to the Appalachians, and there are also distinctive newts. The burrowing lizards of the well-defined family Anniellidae are found only in a limited area in coastal California. The only family of birds distinctive of North America, that of the wren-tits, Chamaeidae, is found in the chaparral of California. The mountain beaver, or sewellel (which is not at all beaverlike), is likewise a type peculiar to North America, confined to the Cascades and Sierras, and there are distinct kinds of moles in the Pacific area.

      The mammals of the two coasts are strikingly different, though true seals (the harbour seal and the harp seal) are found on both. The sea lions, with longer necks and with projecting ears, are found only in the Pacific—the California sea lion, the more northern Steller's sea lion, and the fur seal. On the East Coast the larger rivers of Florida are inhabited by the Florida manatee, or sea cow, a close relative of the more widespread and more distinctively marine West Indian species.

Karl Patterson Schmidt Ed.

Settlement patterns
      Although the land that now constitutes the United States was occupied and much affected by diverse Indian cultures over many millennia, these pre-European settlement patterns have had virtually no impact upon the contemporary nation—except locally, as in parts of New Mexico. A benign habitat permitted a huge contiguous tract of settled land to materialize across nearly all the eastern half of the United States and within substantial patches of the West. The vastness of the land, the scarcity of labour, and the abundance of migratory opportunities in a land replete with raw physical resources contributed to exceptional human mobility and a quick succession of ephemeral forms of land use and settlement. Human endeavours have greatly transformed the landscape, but such efforts have been largely destructive. Most of the pre-European landscape in the United States was so swiftly and radically altered that it is difficult to conjecture intelligently about its earlier appearance.

      The overall impression of the settled portion of the American landscape, rural or urban, is one of disorder and incoherence, even in areas of strict geometric survey. The individual landscape unit is seldom in visual harmony with its neighbour, so that, however sound in design or construction the single structure may be, the general effect is untidy. These attributes have been intensified by the acute individualism of the American, vigorous speculation in land and other commodities, a strongly utilitarian attitude toward the land and the treasures above and below it, and government policy and law. The landscape is also remarkable for its extensive transportation facilities, which have greatly influenced the configuration of the land.

      Another special characteristic of American settlement, one that became obvious only by the mid-20th century, is the convergence of rural and urban modes of life. The farmsteads—and rural folk in general—have become increasingly urbanized, and agricultural operations have become more automated, while the metropolis grows more gelatinous, unfocused, and pseudo-bucolic along its margins.

Rural settlement
      Patterns of rural settlement indicate much about the history, economy, society, and minds of those who created them as well as about the land itself. The essential design of rural activity in the United States bears a strong family resemblance to that of other neo-European lands, such as Canada, Australia, New Zealand, South Africa, Argentina, or tsarist Siberia—places that have undergone rapid occupation and exploitation by immigrants intent upon short-term development and enrichment. In all such areas, under novel social and political conditions and with a relative abundance of territory and physical resources, ideas and institutions derived from a relatively stable medieval or early modern Europe have undergone major transformation. Further, these are nonpeasant countrysides, alike in having failed to achieve the intimate symbiosis of people and habitat, the humanized rural landscapes characteristic of many relatively dense, stable, earthbound communities in parts of Asia, Africa, Europe, and Latin America.

Early models of land allocation
      From the beginning the prevalent official policy of the British (except between 1763 and 1776) and then of the U.S. government was to promote agricultural and other settlement—to push the frontier westward as fast as physical and economic conditions permitted. The British crown's grants of large, often vaguely specified tracts to individual proprietors or companies enabled the grantees to draw settlers by the sale or lease of land at attractive prices or even by outright gift.

      Of the numerous attempts at group colonization, the most notable effort was the theocratic and collectivist New England town that flourished, especially in Massachusetts, Connecticut, and New Hampshire, during the first century of settlement. The town, the basic unit of government and comparable in area to townships in other states, allotted both rural and village parcels to single families by group decision. Contrary to earlier scholarly belief, in all but a few cases settlement was spatially dispersed in the socially cohesive towns, at least until about 1800. The relatively concentrated latter-day villages persist today as amoeba-like entities straggling along converging roads, neither fully rural nor agglomerated in form. The only latter-day settlement experiment of notable magnitude to achieve enduring success was a series of Mormon settlements in the Great Basin region of Utah and adjacent states, with their tightly concentrated farm villages reminiscent of the New England model. Other efforts have been made along ethnic, religious, or political lines, but success has been at best brief and fragile.

Creating the national domain
      With the coming of independence and after complex negotiations, the original 13 states surrendered to the new national government nearly all their claims to the unsettled western lands beyond their boundaries. Some tracts, however, were reserved for disposal to particular groups. Thus, the Western Reserve of northeastern Ohio gave preferential treatment to natives of Connecticut, while the military tracts in Ohio and Indiana were used as bonus payments to veterans of the American Revolution.

      A federally administered national domain was created, to which the great bulk of the territory acquired in 1803 in the Louisiana Purchase and later beyond the Mississippi and in 1819 in Florida was consigned. The only major exceptions were the public lands of Texas, which were left within that state's jurisdiction; such earlier French and Spanish land grants as were confirmed, often after tortuous litigation; and some Indian lands. In sharp contrast to the slipshod methods of colonial land survey and disposal, the federal land managers expeditiously surveyed, numbered, and mapped their territory in advance of settlement, beginning with Ohio in the 1780s, then sold or deeded it to settlers under inviting terms at a number of regional land offices.

      The design universally followed in the new survey system (except within the French, Spanish, and Indian grants) was a simple, efficient rectangular scheme. Townships were laid out as blocks, each six by six miles in size, oriented with the compass directions. Thirty-six sections, each one square mile, or 640 acres (260 hectares), in size, were designated within each township; and public roads were established along section lines and, where needed, along half-section lines. At irregular intervals, offsets in survey lines and roads were introduced to allow for the Earth's curvature. Individual property lines were coincident with, or parallel to, survey lines, and this pervasive rectangularity generally carried over into the geometry of fields and fences or into the townsites later superimposed upon the basic rural survey.
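The arithmetic of the township-and-section scheme is simple enough to sketch in code. The following Python fragment (an illustration only; the function name is ours, not a standard API) encodes the six-by-six-mile township, its 36 sections of 640 acres, and the standard federal section numbering, which the text does not detail: numbers run boustrophedon, starting at 1 in the northeast corner, west across the top row, then snaking back east, so that 36 falls in the southeast corner.

```python
# Sketch of the rectangular survey arithmetic described above.
# A township is a 6 x 6 mile block divided into 36 one-square-mile
# sections of 640 acres each.

MILES_PER_SIDE = 6
SECTIONS_PER_TOWNSHIP = MILES_PER_SIDE ** 2  # 36
ACRES_PER_SECTION = 640                      # one square mile

def section_number(row, col):
    """Section number for a 0-indexed grid cell: row 0 is the
    northernmost row, col 0 the westernmost column, following the
    snaking (boustrophedon) federal numbering order."""
    if row % 2 == 0:
        # even rows (counting from the north) are numbered westward,
        # so the easternmost cell of row 0 is section 1
        return row * MILES_PER_SIDE + (MILES_PER_SIDE - col)
    # odd rows are numbered eastward
    return row * MILES_PER_SIDE + col + 1

# A quarter section is the 160 acres later offered under the
# Homestead Act of 1862:
quarter_section_acres = ACRES_PER_SECTION // 4
```

With this numbering, `section_number(0, 5)` (the northeast corner) gives 1 and `section_number(5, 0)` (the southwest corner) gives 31, matching the conventional plat.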

      This all-encompassing checkerboard pattern is best appreciated from an airplane window over Iowa or Kansas. There, one sees few streams or other natural features and few diagonal highways or railroads interrupting the overwhelming squareness of the landscape. A systematic rectangular layout, rather less rigorous in form, also appears in much of Texas and in those portions of Maine, western New York and Pennsylvania, and southern Georgia that were settled after the 1780s.

Distribution of rural lands
      Since its formation, Congress has enacted a series of complex schemes for distribution of the national domain. The most famous of these plans was the Homestead Act of 1862, which offered title to 160 acres to individual settlers, subject only to residence for a certain period of time and to the making of minimal improvements to the land thus acquired. The legal provisions of such acts have varied with time as the nature of farming technology and of the remaining lands have changed, but their general effect has been to perpetuate the Jeffersonian ideal of a republic in which yeoman farmers own and till self-sufficient properties.

      The program was successful in providing private owners with relatively choice lands, aside from parcels reserved for schools and various township and municipal uses. More than one-third of the national territory, however, is still owned by federal and state governments, with much of this land in forest and wildlife preserves. A large proportion of this land is in the West and is unsuited for intensive agriculture or grazing because of the roughness, dryness, or salinity of the terrain; much of it is leased out for light grazing or for timber cutting.

Patterns of farm life
      During the classic period of American rural life, around 1900, the typical American lived or worked on a farm or was economically dependent upon farmers. In contrast to rural life in many other parts of the world, the farm family lived on an isolated farmstead some distance from town and often from farm neighbours; its property averaged less than one-quarter square mile. This farmstead varied in form and content with local tradition and economy. In particular, barn types were localized—for example, the tobacco barns of the South, the great dairy barns of Wisconsin, or the general-purpose forebay barns of southeastern Pennsylvania—as were modes of fencing. In general, however, the farmstead contained dwelling, barn, storage and sheds for small livestock and equipment, a small orchard, and a kitchen garden. A woodlot might be found in the least-accessible or least-fertile part of the farm.

      Successions of such farms were connected with one another and with the towns by means of a dense, usually rectangular lattice of roads, largely unimproved at the time. The hamlets, villages, and smaller cities were arrayed at relatively regular intervals, with size and affluence determined in large part by the presence and quality of rail service or status as the county seat. But, among people who have been historically rural, individualistic, and antiurban in bias, many services normally located in urban places might be found in rustic settings. Thus, much retail business was transacted by means of itinerant peddlers, while small shops for the fabrication, distribution, or repair of various items were often located in isolated farmsteads, as were many post offices.

      Social activity also tended to be widely dispersed among numerous rural churches, schools, or grange halls; and the climactic event of the year might well be the county fair, political rally, or religious encampment—again on a rural site. Not the least symptomatic signs of the strong tendency toward spatial isolation are the countless family burial plots and community cemeteries so liberally distributed across the countryside.

Regional small-town patterns
      There has been much regional variation among smaller villages and hamlets, but such phenomena have received relatively little attention from students of American culture or geography. The distinctive New England village, of course, is generally recognized and cherished: it consists of a loose clustering of white frame buildings, including a church (usually Congregationalist or Unitarian), town hall, shops, and stately homes with tall shade trees around the central green, or commons—a grassy expanse that may contain a bandstand and monuments or flowers. Derivative village forms were later carried westward to sections of the northern Midwest.

      Less widely known but equally distinctive is the town morphology characteristic of the Midland, or Pennsylvanian, culture area and most fully developed in southeastern and central Pennsylvania and Piedmont Maryland. It differs totally from the New England model in density, building materials, and general appearance. Closely packed, often contiguous buildings—mostly brick, but sometimes stone, frame, or stucco—abut directly on a sidewalk, which is often paved with brick and usually thickly planted with maple, sycamore, or other shade trees. Such towns are characteristically linear in plan, have dwellings intermingled with other types of buildings, have only one or two principal streets, and may radiate outward from a central square lined with commercial and governmental structures.

      The most characteristic U.S. small town is the one whose pattern evolved in the Midwest. Its simple scheme is usually based on the grid plan. Functions are rigidly segregated spatially, with the central business district, consisting of closely packed two- or three-story brick buildings, limited exclusively to commercial and administrative activity. The residences, generally set well back within spacious lots, are peripheral in location, as are most rail facilities, factories, and warehouses.

      Even the modest urbanization of the small town came late to the South. Most urban functions long were spatially dispersed—almost totally so in the early Chesapeake Bay country or North Carolina—or were performed entirely by the larger plantations dominating the economic life of much of the region. When city and town began to materialize in the 19th and 20th centuries, they tended to follow the Midwestern model in layout.

      Although quite limited in geographic area, the characteristic villages of the Mormon and Hispanic-American districts are of considerable interest. The Mormon settlement uncompromisingly followed the ecclesiastically imposed grid plan composed of square blocks, each with perhaps only four very large house lots, and the block surrounded by extremely wide streets. Those villages in New Mexico in which population and culture were derived from Old Mexico were often built according to the standard Latin-American plan. The distinctive feature is a central plaza dominated by a Roman Catholic church and encircled by low stone or adobe buildings.

The rural–urban transition

Weakening of the agrarian ideal
      The United States has had little success in achieving or maintaining the ideal of the family farm. Through purchase, inheritance, leasing, and other means, some of dubious legality, smaller properties have been merged into much larger entities. By the late 1980s, for example, when the average farm size had surpassed 460 acres, farms containing 2,000 or more acres accounted for almost half of all farmland and 20 percent of the cropland harvested, even though they comprised less than 3 percent of all farms. At the other extreme were those 60 percent of all farms that contained fewer than 180 acres and reported less than 15 percent of cropland harvested. This trend toward fewer but larger farms has continued.

      The huge, heavily capitalized “neoplantation,” essentially a factory in the field, is especially conspicuous in parts of California, Arizona, and the Mississippi Delta, but examples can be found in any state. There are also many smaller but intensive operations that call for large investments and advanced managerial skills. This trend toward large-scale, capital-intensive farm enterprise has been paralleled by a sharp drop in rural farm population—a slump from the all-time high of some 32,000,000 in the early 20th century to about 5,000,000 in the late 1980s; but even in 1940, when farm folk still numbered more than 30,000,000, nearly 40 percent of farm operators were tenants, and another 10 percent were only partial owners.

      As the agrarian population has dwindled, its immediate economic and political influence has lessened as well, though less swiftly. The rural United States, however, has been the source of many of the nation's values and images. The United States has become a highly urbanized, technologically advanced society far removed in daily life from cracker barrel, barnyard, corral, or logging camp. Although Americans have gravitated, sometimes reluctantly, to the big city, the memory of a rapidly vanishing agrarian America still shapes the daydreams and assumptions that guide many sociopolitical decisions. This is revealed not only in the works of contemporary novelists, poets, and painters but also throughout the popular arts: in movies, television, soap operas, folklore, country music, political oratory, and in much leisure activity.

Impact of the motor vehicle
      Since about 1920 more genuine change has occurred in American rural life than during the preceding three centuries of European settlement in North America. Although the basic explanation is the profound social and technological transformations engulfing most of the world, the most immediate agent of change has been the internal-combustion engine. The automobile, truck, bus, and paved highway have more than supplanted a moribund passenger and freight railroad system. While many local rail depots have been boarded up and scores of secondary lines have been abandoned, hundreds of thousands of miles of old dirt roads have been paved, and a vast system of interstate highways has been constructed to connect major cities in a single nonstop network. The net result has been a shrinking of travel time and an increase in miles traveled for the individual driver, rural or urban.

      Small towns in the United States have undergone a number of changes. Before 1970, towns near highways and urban centres generally prospered, while in the less-fortunate towns, where residents lingered on for the sake of relatively cheap housing, downtown businesses often became extinct. From the late 1960s until about 1981 the rural and small-town population grew at a faster rate than the metropolitan population, the so-called metro–nonmetro turnaround, thus reversing more than a century of relatively greater urban growth. Subsequent evidence, however, suggests an approach toward equilibrium between the urban and rural sectors.

      As Americans have become increasingly mobile, the visual aspect of rural America has altered drastically. The highway has become the central route, and many of the functions once confined to the local town or city now stretch for many miles along major roads.

Reversal of the classic rural dominance
      The metropolitanization of life in the United States has not been limited to city, suburb, or exurb; it now involves most of the rural area and population. The result has been the decline of local crafts and regional peculiarities, quite visibly in such items as farm implements, fencing, silos, and housing and in commodities such as clothing or bread. In many ways, the countryside is now economically dependent on the city.

      The city dweller is the dominant consumer for products other than those of field, quarry, or lumber mill; and city location tends to determine patterns of rural economy rather than the reverse. During weekends and the vacation seasons, swarms of city folk stream out to second homes in the countryside and to campgrounds, ski runs, beaches, boating areas, or hunting and fishing tracts. For many large rural areas, recreation is the principal source of income and employment; and such areas as northern New England and upstate New York have become playgrounds and sylvan refuges for many urban residents.

      The larger cities reach far into the countryside for their vital supplies of water and energy. There is an increasing reliance upon distant coalfields to provide fuel for electrical power plants, and cities have gone far afield in seeking out rural disposal sites for their ever-growing volumes of garbage.

      The majority of the rural population now lives within daily commuting range of a sizable city. This enables many farm residents to operate their farms while, at the same time, working part- or full-time at a city job, and it thus helps to prevent the drastic decline in rural population that has occurred in remoter parts of the country. Similarly, many small towns within the shadow of a metropolis, with fewer and fewer farmers to service, have become dormitory satellites, serving residents from nearby cities and suburbs.

Urban settlement
      The United States has moved from a predominantly rural pattern of settlement to an urban society. In so doing, it has followed the general path that other advanced nations have traveled and one along which developing nations have begun to hasten. About three-fourths of the population live clustered within officially designated urban places and urbanized areas, which account for less than 2 percent of the national territory. At least another 15 percent live in dispersed residences that are actually urban in economic or social orientation.

Classic patterns of siting and growth
      Although more than 95 percent of the population was rural during the colonial period and for the first years of independence, cities were crucial elements in the settlement system from the earliest days. Boston; New Amsterdam (later New York City); Jamestown, Va.; Charleston, S.C.; and Philadelphia were founded at the same time as the colonies they served. Like nearly all other North American colonial towns of consequence, they were ocean ports. Until at least the beginning of the 20th century the historical geography of U.S. cities was intimately related to that of successive transportation systems. The location of successful cities with respect to the areas they served, as well as their internal structure, was determined largely by the nature of these systems.

      The colonial cities acted as funnels for the collection and shipment of farm and forest products and other raw materials from the interior to trading partners in Europe, the Caribbean, or Africa and for the return flow of manufactured goods and other locally scarce items, as well as immigrants. Such cities were essentially marts and warehouses, and only minimal attention was given to social, military, educational, or religious functions. The inadequacy and high cost of overland traffic dictated sites along major ocean embayments or river estuaries; the only pre-1800 nonports worthy of notice were Lancaster and York, both in Pennsylvania, and Williamsburg, Va. With the populating of the interior and the spread of a system of canals and improved roads, such new cities as Pittsburgh, Pa.; Cincinnati, Ohio; Buffalo, N.Y.; and St. Louis, Mo., mushroomed at junctures between various routes or at which modes of transport were changed. Older ocean ports, such as New Castle, Del.; Newport, R.I.; Charleston, S.C.; Savannah, Ga.; and Portland, Maine, whose locations prevented them from serving large hinterlands, tended to stagnate.

      From about 1850 to 1920 the success of new cities and the further growth of older ones in large part were dependent on their location within the new steam railroad system and on their ability to dominate a large tributary territory. Such waterside rail hubs as Buffalo; Toledo, Ohio; Chicago; and San Francisco gained population and wealth rapidly, while such offspring of the rail era as Atlanta, Ga.; Indianapolis, Ind.; Minneapolis, Minn.; Fort Worth, Texas; and Tacoma, Wash., also grew dramatically. Much of the rapid industrialization of the 19th and early 20th centuries occurred in places already favoured by water or rail transport systems; but in some instances, such as in the cities of northeastern Pennsylvania's anthracite region, some New England mill towns, and the textile centres of the Carolina and Virginia Piedmont, manufacturing brought about rapid urbanization and the consequent attraction of transport facilities. The extraction of gold, silver, copper, coal, iron, and, in the 20th century, gas and oil led to rather ephemeral centres—unless these places were able to capitalize on local or regional advantages other than minerals.

      A strong early start, whatever the initial economic base may have been, was often the key factor in competition among cities. With sufficient early momentum, urban capital and population tended to expand almost automatically. The point is illustrated perfectly by the larger cities of the northeastern seaboard, from Portland, Maine, through Baltimore, Md. The nearby physical wealth is poor to mediocre, and they are now far off-centre on the national map; but a prosperous mercantile beginning, good land and sea connections with distant places, and a rich local accumulation of talent, capital, and initiative were sufficient to bring about the growth of one of the world's largest concentrations of industry, commerce, and people.

New factors in municipal development
      The pre-1900 development of the American city was almost completely a chronicle of the economics of the production, collection, and distribution of physical commodities and basic services dictated by geography, but there have been striking deviations from this pattern. The physical determinants of urban location and growth have given way to social factors. Increasingly, the most successful cities are oriented toward the more advanced modes for the production and consumption of services, specifically the knowledge, managerial, and recreational industries. The largest cities have become more dependent upon corporate headquarters, communications, and the manipulation of information for their sustenance. Washington, D.C., is the most obvious example of a metropolis in which government and ancillary activities have been the spur for vigorous growth; but almost all of the state capitals have displayed a similar demographic and economic vitality. Further, urban centres that contain a major college or university often have enjoyed remarkable expansion.

      With the coming of relative affluence and abundant leisure to the population and a decrease of labour input in industrial processes, a new breed of cities has sprouted across the land: those that cater to the pleasure-seeker, vacationer, and the retired—for example, the young, flourishing cities of Florida or Nevada and many locations in California, Arizona, and Colorado.

      The automobile as a means of personal transportation was developed about the time of World War I, and the American city was catapulted into a radically new period, both quantitatively and qualitatively, in the further evolution of physical form and function. The size, density, and internal structure of the city were previously constrained by the limitations of the pedestrian and early mass-transit systems. Only the well-to-do could afford horse and carriage or a secluded villa in the countryside. Cities were relatively small and compact, with a single clearly defined centre, and they grew by accretion along their edges, without any significant spatial hiatuses except where commuter railroads linked outlying towns to the largest of metropolises. Workers living beyond the immediate vicinity of their work had to locate within reach of the few horse-drawn omnibuses or the later electric street railways.

      The universality of the automobile, even among the less affluent, and the parallel proliferation of service facilities and highways greatly loosened and fragmented the American city, which spread over surrounding rural lands. Older, formerly autonomous towns grew swiftly. Many towns became satellites of the larger city or were absorbed. Many suburbs and subdivisions arose with single-family homes on lots larger than had been possible for the ordinary householder in the city. These communities were almost totally dependent on the highway for the flow of commuters, goods, and services, and many were located in splendid isolation, separated by tracts of farmland, brush, or forest from other such developments. At the major interchanges of the limited-access highways, a new form of agglomerated settlement sprang up. In a further elaboration of this trend, many larger cities have been girdled by a set of mushrooming complexes. These creations of private enterprise embody a novel concept of urban existence: a metropolitan module no longer reliant on the central city or its downtown. Usually anchored on a cluster of shopping malls and office parks, these “hypersuburbs,” whose residents and employees circulate freely within the outer metropolitan ring, offer virtually all of the social and economic facilities needed for the modern life-style.

The new look of the metropolitan area
      The outcome has been a broad, ragged, semiurbanized belt of land surrounding each city, large or small, and quite often blending imperceptibly into the suburban-exurban halo encircling a neighbouring metropolitan centre. There is a great similarity in the makeup and general appearance of all such tracts: the planless intermixture of scraps of the rural landscape with the fragments of the scattered metropolis; the randomly distributed subdivisions or single homes; the vast shopping centres, the large commercial cemeteries, drive-in theatres, junkyards, and golf courses and other recreational enterprises; and the regional or metropolitan airport, often with its own cluster of factories, warehouses, or travel-oriented businesses. The traditional city—unitary, concentric in form, with a single well-defined middle—has been replaced by a relatively amorphous, polycentric metropolitan sprawl.

      The inner city of a large U.S. metropolitan area displays some traits that are common to the larger centres of all advanced nations. A central business district, almost always the oldest section of the city, is surrounded by a succession of roughly circular zones, each distinctive in economic and social-ethnic character. The symmetry of this scheme is distorted by the irregularities of surface and drainage or the effects of radial highways and railroads. Land is most costly, and hence land use is most intensive, toward the centre. Major business, financial and governmental offices, department stores, and specialty shops dominate the downtown, which is usually fringed by a band of factories and warehouses. The outer parts of the city, like the suburbs, are mainly residential.

      With some exceptions—e.g., large apartment complexes in downtown Chicago—people do not reside in the downtown areas, and there is a steady downward gradient in population density per unit area (and more open land and single-family residences) as one moves from the inner city toward the open country. Conversely, there is a general rise in income and social status with increasing distance from the core. The sharply defined immigrant neighbourhoods of the 19th century generally persist in a somewhat diluted form, though specific ethnic groups may have shifted their location. Later migrant groups, notably Southern blacks and Latin Americans, generally dominate the more run-down neighbourhoods of the inner cities.

Individual and collective character of cities
      American cities, more so than the small-town or agrarian landscape, tend to be the product of a particular period rather than of location. The relatively venerable centres of the Eastern Seaboard—Boston; Philadelphia; Baltimore, Md.; Albany, N.Y.; Chester, Pa.; Alexandria, Va.; or Georgetown (a district of Washington, D.C.), for example—are virtual replicas of the fashionable European models of their early period rather than the fruition of a regional culture, unlike New Orleans and Santa Fe, N.M., which reflect other times and regions. The townscapes of Pittsburgh; Detroit, Mich.; Chicago; and Denver, Colo., depict national modes of thought and the technological development of their formative years, just as Dallas, Texas; Las Vegas, Nev.; San Diego, Calif.; Tucson, Ariz.; and Albuquerque, N.M., proclaim contemporary values and gadgetry more than any local distinctiveness. When strong-minded city founders instituted a highly individual plan and their successors managed to preserve it—as, for example, in Savannah, Ga.; Washington, D.C.; and Salt Lake City, Utah—or when there is a happy combination of a spectacular site and appreciative residents—as in San Francisco or Seattle, Wash.—a genuine individuality does seem to emerge. Such an identity also may develop where immigration has been highly selective, as in such places as Miami, Fla.; Phoenix, Ariz.; and Los Angeles.

      As a group, U.S. cities differ from cities in other countries in both type and degree. The national political structure, the social inclinations of the people, and the strong outward surge of urban development have led to the political fragmentation of metropolises that socially and economically are relatively cohesive units. The fact that a single metropolitan area may sprawl across numerous incorporated towns and cities, several townships, and two or more counties and states has a major impact upon both its appearance and the way it functions. Not the least of these effects is a dearth of overall physical and social planning (or its ineffectuality when attempted), and the rather chaotic, inharmonious appearance of both inner-city and peripheral zones painfully reflects the absence of any effective collective action concerning such matters.

      The American city is a place of sharp transitions. Construction, demolition, and reconstruction go on almost ceaselessly, though increasing thought has been given to preserving monuments and buildings. From present evidence, it would be impossible to guess that New York City and Albany date from the 1620s or that Detroit was founded in 1701. Preservation and restoration do occur, but often only when it makes sense in terms of tourist revenue. Physical and social blight has reached epidemic proportions in the slum areas of the inner city; but, despite the wholesale razing of such areas and the subsequent urban-renewal projects (sometimes as apartment or commercial developments for the affluent), the belief has become widespread that the ills of the U.S. city are incurable, especially with the increasing flight of capital, tax revenue, and the more highly educated, affluent elements of the population to suburban areas and the spatial and political polarization of whites and nonwhites.

      In the central sections of U.S. cities, there is little sense of history or continuity; instead, one finds evidence of the dominance of the engineering mentality and of the credo that the business of the city is business. Commercial and administrative activities are paramount, and usually there is little room for church buildings or for parks or other nonprofit enterprises. The role of the cathedral, so central in the medieval European city, is filled by a U.S. invention serving both utilitarian and symbolic purposes, the skyscraper. Some cities have felt the need for other bold secular monuments; hence the Gateway Arch looming over St. Louis, Seattle's Space Needle, and Houston's Astrodome. Future archaeologists may well conclude from their excavations that American society was ruled by an oligarchy of highway engineers, architects, and bulldozer operators. The great expressways converging upon, or looping, the downtown area and the huge amount of space devoted to parking lots and garages are even more impressive than the massive surgery executed upon U.S. cities a century ago to hack out room for railroad terminals and marshaling yards.

      Within many urban sites there has been radical physical transformation of shoreline, drainage systems, and land surface that would be difficult to match elsewhere in the world. Thus, in their physical lineaments, Manhattan and inner Boston bear scant resemblance to the landscapes seen by their initial settlers. The surface of downtown Chicago has been raised several feet above its former swamp level, the city's lakefront extensively reshaped, and the flow of the Chicago River reversed. Los Angeles, notorious for its disregard of the environment, has its concrete arroyo bottoms, terraced hillsides and landslides, and its own artificial microclimate.

The supercities
      The unprecedented outward sprawl of American urban settlement has created some novel settlement forms, for the quantitative change has been so great as to induce qualitative transformation. The conurbation—a territorial coalescence of two or more sizable cities whose peripheral zones have grown together—may have first appeared in early 19th-century Europe. There are major examples in Great Britain, the Low Countries, and Germany, as well as in Japan.

      Nothing elsewhere, however, rivals in size and complexity the aptly named megalopolis, that supercity stretching along the Atlantic from Portland, Maine, past Richmond, Va. Other large conurbations include, in the Great Lakes region, one centred on Chicago and containing large slices of Illinois, Wisconsin, and Indiana; another based in Detroit, embracing large parts of Michigan and Ohio and reaching into Canada; and a third stretching from Buffalo through Cleveland and back to Pittsburgh. All three are reaching toward one another and may form another megalopolis that, in turn, may soon be grafted onto the seaboard megalopolis by a corridor through central New York state.

      Another example of a growing megalopolis is the huge southern California conurbation reaching from Santa Barbara, through a dominating Los Angeles, to the Mexican border. The solid strip of urban territory that lines the eastern shore of Puget Sound is a smaller counterpart. Quite exceptional in form is the slender linear multicity occupying Florida's Atlantic coastline, from Jacksonville to Miami, and the loose swarm of medium-sized cities clustering along the Southern Piedmont, from south-central Virginia to Birmingham, Ala.; also of note are the Texas cities of Dallas–Fort Worth, Houston, and San Antonio, which have formed a rapidly growing—though discontinuous—urbanized triangle.

      One of the few predictions that seem safe in so dynamic and innovative a land as the United States is that, unless severe and painful controls are placed on land use, the shape of the urban environment will be increasingly megalopolitan: a small set of great constellations of polycentric urban zones, each complexly interlocked socially and physically with its neighbours.

Traditional regions of the United States
      The differences among America's traditional regions, or culture areas, tend to be slight and shallow as compared with such areas in most older, more stable countries. The muted, often subtle nature of interregional differences can be ascribed to the relative newness of American settlement, a perpetually high degree of mobility, a superb communications system, and the galloping centralization of economy and government. It might even be argued that some of these regions are quaint vestiges of a vanishing past, of interest only to antiquarians.

      Yet, in spite of the nationwide standardization in many areas of American thought and behaviour, the lingering effects of the older culture areas do remain potent. In the case of the South, for example, the differences helped to precipitate the gravest political crisis and bloodiest military conflict in the nation's history. More than a century after the Civil War, the South remains a powerful entity in political, economic, and social terms, and its peculiar status is recognized in religious, educational, athletic, and literary circles.

      Even more intriguing is the appearance of a series of essentially 20th-century regions. Southern California is the largest and perhaps the most distinctive region, and its special culture has attracted large numbers of immigrants to the state. Similar trends are visible in southern Florida; in Texas, whose mystique has captured the national imagination; and to a certain degree in the more ebullient regions of New Mexico and Arizona as well. At the metropolitan level, it is difficult to believe that such distinctive cities as San Francisco, Las Vegas, Dallas, Tucson, and Seattle have become like all other American cities. A detailed examination, however, would show significant if sometimes subtle interregional differences in terms of language, religion, diet, folklore, folk architecture and handicrafts, political behaviour, social etiquette, and a number of other cultural categories.

The hierarchy of culture areas
      A multitiered hierarchy of culture areas might be postulated for the United States; but the most interesting levels are, first, the nation as a whole and, second, the five to 10 large subnational regions, each embracing several states or major portions thereof. There is a remarkably close coincidence between the political United States and the cultural United States. Crossing into Mexico, the traveler passes across a cultural chasm. If the contrasts are less dramatic between the two sides of the U.S.-Canadian boundary, they are nonetheless real, especially to the Canadian. Erosion of the cultural barrier has been largely limited to the area that stretches from northern New York state to Aroostook county, Maine. There, a vigorous demographic and cultural immigration by French-Canadians has gone far toward eradicating international differences.

      While the international boundaries act as a cultural container, the interstate boundaries are curiously irrelevant. Even when the state had a strong autonomous early existence—as happened with Massachusetts, Virginia, or Pennsylvania—subsequent economic and political forces have tended to wash away such initial identities. Actually, it could be argued that the political divisions of the 48 coterminous states are anachronistic in the context of contemporary socioeconomic and cultural forces. Partially convincing cases might be built for equating Utah and Texas with their respective culture areas because of exceptional historical and physical circumstances, or perhaps Oklahoma, given its very late European occupation and its dubious distinction as the territory to which exiled Indian tribes of the East were relegated. In most instances, however, the states either contain two or more distinctly different culture and political areas or fragments thereof or are part of a much larger single culture area. Thus sharp North–South dichotomies characterize California, Missouri, Illinois, Indiana, Ohio, and Florida, while Tennessee advertises that there are really three Tennessees. In Virginia the opposing cultural forces were so strong that actual fission took place in 1863 (with the admission to the Union of West Virginia) along one of those rare interstate boundaries that approximate a genuine cultural divide.

      Much remains to be learned about the cause and effect relations between economic and culture areas in the United States. If the South or New England could at one time be correlated with a specific economic system, this is no longer easy to do. Cultural systems appear to respond more slowly to agents of change than do economic or urban systems. Thus the Manufacturing Belt, a core region for many social and economic activities, now spans parts of four traditional culture areas—New England, the Midland, the Midwest, and the northern fringes of the South. The great urban sprawl, from southern Maine to central Virginia, blithely ignores the cultural slopes that are still visible in its more rural tracts.

The cultural hearths
      The culture areas of the United States are generally European in origin, the result of importing European colonists and ways of life and the subsequent adaptation of social groups to new habitats. The aboriginal cultures have had relatively little influence on the nation's modern culture. In the Southwestern and the indistinct Oklahoman subregions, the Indian element merits consideration only as one of several ingredients making up the regional mosaic. With some exceptions, the map of American culture areas in the East can be explained in terms of the genesis, development, and expansion of the three principal colonial cultural hearths along the Atlantic seaboard. Each was basically British in character, but their personalities remain distinct because of, first, different sets of social and political conditions during the critical period of first effective settlement and, second, local physical and economic circumstances. The cultural gradients between them tend to be much steeper and the boundaries more distinct than is true for the remainder of the nation.

 New England was the dominant region during the century of rapid expansion following the American Revolution and not merely in terms of demographic or economic expansion. In social and cultural life—in education, politics, theology, literature, science, architecture, and the more advanced forms of mechanical and social technology—the area exercised its primacy. New England was the leading source of ideas and styles for the nation from about 1780 to 1880; it furnishes an impressive example of the capacity of strongly motivated communities to rise above the constraints of a harsh environment.

      During its first two centuries, New England had an unusually homogeneous population. With some exceptions, the British immigrants shared the same nonconformist religious beliefs, language, social organization, and general outlook. A distinctive regional culture took form, most noticeably in terms of dialect, town morphology, and folk architecture. The personality of the people also took on a regional coloration both in folklore and in actuality; there is sound basis for the belief that the traditional New England Yankee is self-reliant, thrifty, inventive, and enterprising. The influx of immigrants that began in the 1830s diluted and altered the New England identity, but much of its early personality survived.

      By virtue of location, wealth, and seniority, the Boston metropolitan area has become the cultural and economic centre of New England. This sovereignty is shared to some degree, however, with two other old centres, the lower Connecticut Valley and the Narragansett Bay region of Rhode Island.

      The early westward demographic and ideological expansion of New England was so influential that it is justifiable to call New York, northern New Jersey, northern Pennsylvania, and much of the Upper Midwest “New England Extended.” Further, the energetic endeavours of New England whalers, merchants, and missionaries had a considerable impact on the cultures of Hawaii, various other Pacific isles, and several points in the Caribbean. New Englanders also were active in the Americanization of early Oregon and Washington, with results that are still visible. Later, the overland diffusion of New England natives and practices meant a recognizable New England character not only for the Upper Midwest, from Ohio to the Dakotas, but also in the Pacific Northwest in general, though to a lesser degree.

The South
  By far the largest of the three original Anglo-American culture areas, the South is also the most idiosyncratic with respect to national norms—or slowest to accept them. The South was once so distinct from the non-South in almost every observable or quantifiable feature and so fiercely proud of its peculiarities that for some years the question of whether it could maintain political and social unity with the non-South was in serious doubt. These differences are still observable in almost every realm of human activity, including rural economy, dialect, diet, costume, folklore, politics, architecture, social customs, and recreation. Only during the 20th century can an argument be made that it has achieved a decisive convergence with the rest of the nation, at least in terms of economic behaviour and material culture.

      A persistent deviation from the national mainstream probably began in the first years of settlement. The first settlers of the South were almost purely British, not outwardly different from those who flocked to New England or the Midland, but almost certainly distinct in terms of motives and social values and more conservative in retaining the rurality and the family and social structure of premodern Europe. The vast importation of African slaves was also a major factor, as was a degree of contact with the Indians that was less pronounced farther north. In addition, the unusual pattern of economy (much different from that of northwestern Europe), settlement, and social organization, which were in part an adaptation to a starkly unfamiliar physical habitat, accentuated the South's deviation from other culture areas.

      In both origin and spatial structure, the South has been characterized by diffuseness. In the search for a single cultural hearth, the most plausible choice is the Chesapeake Bay area and the northeastern corner of North Carolina, the earliest area of recognizably Southern character. Early components of Southern population and culture also arrived from other sources. A narrow coastal strip from North Carolina to the Georgia–Florida border and including the Sea Islands is decidedly Southern in character, yet it stands apart self-consciously from other parts of the South. Though colonized directly from Great Britain, it also had significant connections with the West Indies, in which relation the African cultural contribution was strongest and purest. Charleston and Savannah, which nurtured their own distinctive civilizations, dominated this subregion. Similarly, French Louisiana received elements of culture and population—to be stirred into the special Creole mixture—not only, putatively, from the Chesapeake Bay hearth area but also indirectly from France, French Nova Scotia, the French West Indies, and Africa. In south central Texas, the Germanic and Hispanic influx was so heavy that a special subregion can be designated.

      It would seem, then, that the Southern culture area may be an example of convergent, or parallel, evolution of a variety of elements arriving along several paths but subject to some single general process that could mold one larger regional consciousness and way of life.

      Because of its slowness in joining the national technological mainstream, the South can be subdivided into a much greater number of subregions than is possible for any of the other older traditional regions. Those described above are of lesser order than the two principal Souths, variously called Upper and Lower (or Deep) South, Upland and Lowland South, or Yeoman and Plantation South.

      The Upland South, which comprises the southern Appalachians, the upper Appalachian Piedmont, the Cumberland and other low interior plateaus, and the Ozarks and Ouachitas, was colonized culturally and demographically from the Chesapeake Bay hearth area and the Midland; it is most emphatically white Anglo-Saxon Protestant (WASP) in character. The Lowland South, which contains a large black population, includes the greater part of the South Atlantic and Gulf coastal plains and the lower Appalachian Piedmont. Its early major influences came from the Chesapeake Bay area, with only minor elements from the coastal Carolina–Georgia belt, Louisiana, and elsewhere. The division between the two subregions remains distinct from Virginia to Texas, but each region can be further subdivided. Within the Upland South, the Ozark region might legitimately be detached from the Appalachian; and, within the latter, the proud and prosperous Kentucky Bluegrass, with its emphasis on tobacco and Thoroughbreds, certainly merits special recognition.

      Toward the margins of the South, the difficulties in delimiting subregions become greater. The outer limits themselves are a topic of special interest. There seems to be more than an accidental relation between these limits and various climatic factors. The fuzzy northern boundary, definitely not associated with the conventional Mason and Dixon Line or the Ohio River, seems most closely associated with length of frost-free season or with temperature during the winter. As the Southern cultural complex was carried to the West, it not only retained its strength but became more intense, in contrast to the influence of New England and the Midland. But the South finally fades away as one approaches the 100th meridian, with its critical decline in annual precipitation. The apparent correlation between the cultural South and a humid subtropical climatic regime is in many ways valid.

      The Texas subregion is so large, distinctive, vigorous, and self-assertive that it presents some vexing classificatory questions. Is Texas simply a subregion of the Greater South, or has it acquired so strong and divergent an identity that it can be regarded as a major region in its own right? It is likely that a major region has been born in a frontier zone in which several distinct cultural communities confront one another and in which the mixture has bred the vigorous, extroverted, aggressive Texas personality so widely celebrated in song and story. Similarly, peninsular Florida may be considered either within or juxtaposed to the South but not necessarily part of it. In the case of Florida, an almost empty territory began to receive significant settlement only after about 1890, and if, like Texas, most of it came from the older South, there were also vigorous infusions from elsewhere.

The Midland
 The significance of this region has not been less than that of New England or the South, but its characteristics are the least conspicuous to outsiders as well as to its own residents—reflecting, perhaps, its centrality in the course of U.S. development. The Midland (a term not to be confused with Midwest) comprises portions of Middle Atlantic and Upper Southern states: Pennsylvania, New Jersey, Delaware, and Maryland. Serious European settlement of the Midland began a generation or more after that of the other major cultural centres and after several earlier, relatively ineffectual trials by the Dutch, Swedes, Finns, and British. But once begun late in the 17th century by William Penn and his associates, the colonization of the area was a success. Within southeastern Pennsylvania this culture area first assumed its distinctive character: a prosperous, sober, industrious agricultural society that quickly became a mixed economy as mercantile and later industrial functions came to the fore. By the mid-18th century much of the region had acquired a markedly urban character, resembling in many ways the more advanced portions of the North Sea countries. In this respect, at least, the Midland was well ahead of neighbouring areas to the north and south.

      It differed also in its polyglot ethnicity. From almost the beginning, the various ethnic and religious groups of the British Isles were joined by immigrants from the European mainland. This diversity has grown and is likely to continue. The mosaic of colonial ethnic groups has persisted in much of Pennsylvania, New York, New Jersey, and Maryland, as has the remarkable variety of nationalities and churches in coalfields, company towns, cities, and many rural areas. Much of the same ethnic heterogeneity can be seen in New England, the Midwest, and a few other areas, but the Midland stands out as perhaps the most polyglot region of the nation. The Germanic element has always been notably strong, if irregularly distributed, in the Midland, accounting for more than 70 percent of the population of many towns. Had the Anglo-American culture not triumphed, the area might well have been designated Pennsylvania German.

      Physiography and migration carried the Midland culture area into the Maryland Piedmont. Although its width tapers quickly below the Potomac, it reaches into parts of Virginia and West Virginia, with traces legible far down the Appalachian zone and into the South.

      The northern half of the greater Midland region (the New York subregion, or New England Extended) cannot be assigned unequivocally to either New England or this Midland. Essentially it is a hybrid formed mainly from two regional strains of almost equal strength: New England and the post-1660 British element moving up the Hudson valley and beyond. There has also been a persistent, if slight, residue of early Dutch culture and some subtle filtering northward of Pennsylvanian influences. Apparently within the New York subregion occurred the first major fusion of American regional cultures, especially within the early 19th-century “Burned-Over District,” around the Finger Lakes and Genesee areas of central and western New York. This locality, the seedbed for a number of important social innovations, was a major staging area for westward migration and possibly a major source for the people and notions that were to build the Midwestern culture area.

      Toward the west the Midland retains its integrity for only a short distance—certainly no further than eastern Ohio—as it becomes submerged within the Midwest. Still, its significance in the genesis of the Midwest and the national culture should not be minimized. Its success in projecting its image upon so much of the country may have drawn attention away from the source area. As both name and location suggest, the Midland is intermediate in character in many respects, lying between New England and the South. Its residents are much less concerned with, or conscious of, a strong regional identity (excepting the Pennsylvania Dutch caricatures) than is true for the other regions, and, in addition, the Midland lacks their strong political and literary traditions, though it is unmistakable in its distinctive townscapes and farmsteads.

The newer culture areas

The Midwest
 There is no such self-effacement in the Midwest, that large triangular region justly regarded as the most nearly representative of the national average. Everyone within or outside of the Midwest knows of its existence, but no one is certain where it begins or ends. The older apex of the eastward-pointing triangle appears to rest around Pittsburgh, while the two western corners melt away somewhere in the Great Plains, possibly in southern Manitoba in the north and southern Kansas in the south. The eastern terminus and the southern and western borders are broad, indistinct transitional zones.

      Serious study of the historical geography of the Midwest began only in the 20th century, but it seems likely that this culture region was the combination of all three colonial regions and that this combination first took place in the upper Ohio valley. The early routes of travel—the Ohio and its tributaries, the Great Lakes, and the low, level corridor along the Mohawk and the coastal plains of Lake Ontario and Lake Erie—converge upon Ohio. There, the people and cultural traits from New England, the Midland, and the South were first funneled together. There seems to have been a fanlike widening of the new hybrid area into the West as settlers worked their way frontierward.

      Two major subregions are readily discerned, the Upper and Lower Midwest. They are separated by a line, roughly approximating the 41st parallel, that persists as far west as Colorado in terms of speech patterns and indicates differences in regional provenance in ethnic and religious terms as well. Much of the Upper Midwest retains a faint New England character, although Midland influences are probably as important. A rich mixture of German, Scandinavian, Slavic, and other non-WASP elements has greatly diversified a stock in which the British element usually remains dominant and the range of church denominations is great. The Lower Midwest, except for the relative scarcity of blacks, tends to resemble the South in its predominantly Protestant and British makeup. There are some areas with sizable Roman Catholic and non-WASP populations, but on the whole the subregion tends to be more WASP in inclination than most other parts of the nation.

The problem of “the West”
      The foregoing culture areas account for roughly the eastern half of the coterminous United States. There is a dilemma in classifying the remaining half. The concept of the American West, strong in the popular imagination, is reinforced constantly by romanticized cinematic and television images of the cowboy. It is facile, however, to accept the widespread Western livestock complex as epitomizing the full gamut of Western life: although the cattle industry may once have accounted for more than one-half of the active Western domain as measured in acres, it employed only a relatively small fraction of the total population. As a single subculture, it cannot represent the total regional culture.

      It is not clear whether there is a genuine, single, grand Western culture region. Unlike the East, where virtually all the land is developed and culture areas and subregions abut and overlap in splendid confusion, the eight major and many lesser nodes of population in the western United States resemble oases, separated from one another by wide expanses of nearly unpopulated mountain or arid desert. The only obvious properties these isolated clusters have in common are, first, the intermixture of several strains of culture, primarily from the East but with additions from Europe, Mexico, and East Asia, and, second, except for one subregion, a general modernity, having been settled in a serious way no earlier than the 1840s. Some areas may be viewed as inchoate, or partially formed, cultural entities; the others have acquired definite personalities but are difficult to classify as first-order or lesser order culture areas.

      There are several major tracts in the western United States that reveal a genuine cultural identity: the Upper Rio Grande region, the Mormon region, southern California, and, by some accounts, northern California. To this group one might add the anomalous Texan and Oklahoman subregions, which have elements of both the West and the South.

 The term Upper Rio Grande region was coined to denote the oldest and strongest of the three sectors of Hispanic-American activity in the Southwest, the others being southern California and portions of Texas. Although covering the valley of the upper Rio Grande, the region also embraces segments of Arizona and Colorado as well as other parts of New Mexico. European communities and culture have been present there, with only one interruption, since the late 16th century. The initial sources were Spain and Mexico, but after 1848 at least three distinct strains of Anglo-American culture were increasingly well represented—the Southern, Mormon, and a general undifferentiated Northeastern culture—plus a distinct Texan subcategory. For once this has occurred without obliterating the Indians, whose culture endures in various stages of dilution, from the strongly Americanized or Hispanicized to the almost undisturbed.

      The general mosaic is a fabric of Indian, Anglo, and Hispanic elements, and all three major groups, furthermore, are complex in character. The Indian component is made up of Navajo, Pueblo, and several smaller groups, each of which is quite distinct from the others. The Hispanic element is also diverse—modally Mexican mestizo, but ranging from pure Spanish to nearly pure pre-Spanish aboriginal.

  The Mormon region is expansive in the religious and demographic realms, though it has ceased to expand territorially as it did in the decades after the first settlement in the Salt Lake valley in 1847. Despite its Great Basin location and an exemplary adaptation to environmental constraints, this cultural complex appears somewhat non-Western in spirit: the Mormons may be in the West, but they are not entirely of it. Their historical derivation from the Midwest and from ultimate sources in New York and New England is still apparent, along with the generous admixture of European converts to their religion.

      As in New England, the power of the human will and an intensely cherished abstract design have triumphed over an unfriendly habitat. The Mormon way of life is expressed in the settlement landscape and economic activities within a region more homogeneous internally than any other U.S. culture area.

 In contrast, northern California has yet to gain its own strong cultural coloration. From the beginning of the great 1849 gold rush the area drew a diverse population from Europe and Asia as well as the older portions of the United States. Whether the greater part of northern California has produced a culture amounting to more than the sum of the contributions brought by immigrants is questionable. San Francisco, the regional metropolis, may have crossed the qualitative threshold. An unusually cosmopolitan outlook that includes an awareness of the Orient stronger than that of any other U.S. city, a fierce self-esteem, and a unique townscape may be symptomatic of a genuinely new, emergent local culture.

      Southern California is the most spectacular of the Western regions, not only in terms of economic and population growth but also for the luxuriance, regional particularism, and general avant-garde character of its swiftly evolving cultural pattern. Until the coming of a direct transcontinental rail connection in 1885, the region was remote, rural, and largely inconsequential. Since then, the invasion by persons from virtually every corner of North America and of the world has been massive, but since the 1960s in-migration has slackened perceptibly, and many residents have begun to question the doctrine of unlimited growth. In any event, a loosely articulated series of urban and suburban developments continues to encroach upon what little is left of arable or habitable land in the Coast Ranges and valleys from Santa Barbara to the Mexican border.

      Although every major ethnic and racial group and every other U.S. culture area is amply represented in southern California, there is reason to suspect that a process of selection for certain types of people, attitudes, and personality traits may have been at work at both source and destination. The region is distinct from, or perhaps in the vanguard of, the remainder of the nation. One might view southern California as the super-American region or the outpost of a postindustrial future, but its cultural distinctiveness is very evident in landscape and social behaviour. Southern California in no way approaches being a “traditional region,” or even the smudged facsimile of such, but rather the largest, boldest experiment in creating a “voluntary region,” one built through the self-selection of immigrants and their subsequent interaction.

 The remaining identifiable Western regions—the Willamette valley of Oregon, the Puget Sound region, the Inland Empire of eastern Washington and adjacent tracts of Idaho and Oregon, central Arizona, and the Colorado Piedmont—can be treated jointly as potential, or emergent, culture areas, still too close to the national mean to display any cultural distinctiveness. In all of these regions is evident the arrival of a cross section of the national population and the growth of regional life around one or more major metropolises. A New England element is noteworthy in the Willamette valley and Puget Sound regions, while a Hispanic-American component appears in the Colorado Piedmont and central Arizona. Only time and further study will reveal whether any of these regions, so distant from the historic sources of U.S. population and culture, have the capacity to become an independent cultural area.

Wilbur Zelinsky

The people
      A nation for little more than 225 years, the United States is a relatively new member of the global community, but its rapid growth since the 18th century is unparalleled. The early promise of the New World as a refuge and land of opportunity was realized dramatically in the 20th century with the emergence of the United States as a world power. With a total population exceeded only by those of China and India, the United States is also characterized by an extraordinary diversity in ethnic and racial ancestry. A steady stream of immigration, notably from the 1830s onward, formed a pool of foreign-born persons unmatched by any other nation; 60 million people immigrated to U.S. shores in the 18th and 19th centuries. Many were driven by political or economic hardship at home, while others were drawn by the demand for workers, abundant natural resources, and expansive cheap land. Most arrived hoping to remake themselves in the New World.

      Americans also have migrated internally with great vigour, exhibiting a restlessness that thrived in the open lands and on the frontier. Initially, migratory patterns ran east to west and from rural areas to cities, then, in the 20th century, from the South to the Northeast and Midwest. Since the 1950s, though, movement has been primarily from the cities to outlying suburbs, and from aging northern metropolises to the growing urban agglomerations of the South, Southwest, and West.

      At the dawn of the 21st century, the majority of the U.S. population had achieved a high level of material comfort, prosperity, and security. Nonetheless, Americans struggled with the unexpected problems of relative affluence, as well as the persistence of residual poverty. Crime, drug abuse, affordable energy sources, urban sprawl, voter apathy, pollution, high divorce rates, AIDS, and excessive litigation remained continuing subjects of concern, as were inequities and inadequacies in education and managed health care. Among the public policies widely debated were abortion, gun ownership, welfare reforms, and the death penalty.

      Many Americans perceive social tension as the product of their society's failure to extend the traditional dream of equality of opportunity to all people. Ideally, social, political, economic, and religious freedom would assure the like treatment of everyone, so that all could achieve goals in accord with their individual talents, if only they worked hard enough. This strongly held belief has united Americans throughout the centuries. The fact that some groups have not achieved full equality troubles citizens and policy-makers alike.

Ethnic distribution
      After decades of immigration and acculturation, many U.S. citizens can trace no discernible ethnic identity, describing themselves generically only as "American," while others claim mixed identities. The 2000 U.S. census introduced a new category for those who identified themselves as a member of more than one race; of 281.4 million counted, 2.4 percent chose this multiracial classification.

Ethnic European-Americans
      Although the term "ethnic" is frequently confined to the descendants of the newest immigrants, its broader meaning applies to all groups unified by their cultural heritage and experience in the New World. In the 19th century, Yankees formed one such group, marked by common religion and by habits shaped by the original Puritan settlers. From New England, the Yankees spread westward through New York, northern Ohio, Indiana, Illinois, Iowa, and Kansas. Tightly knit communities, firm religious values, and a belief in the value of education resulted in prominent positions for Yankees in business, in literature and law, and in cultural and philanthropic institutions. They long identified with the Republican Party. Southern whites and their descendants, by contrast, remained preponderantly rural as migration took them westward across Tennessee and Kentucky to Arkansas, Missouri, Oklahoma, and Texas. These people inhabited small towns until the industrialization of the South in the 20th century, and they preserved affiliations with the Democratic Party until the 1960s.

      The colonial population also contained other elements that long sustained their group identities. The Pennsylvania Germans, held together by religion and language, still pursue their own way of life after three centuries, as exemplified by the Amish. The great 19th-century German migrations, however, were made up of families who dispersed in the cities as well as in the agricultural areas to the West; to the extent that ethnic ties have survived, they are largely sentimental. That is also true of the Scots, Scotch-Irish, Welsh, and Dutch, whose colonial nuclei received some reinforcement after 1800 but who gradually adapted to the ways of the larger surrounding groups.

      Distinctive language and religion preserved some coherence among the descendants of the Scandinavian newcomers of the 19th century. Where these people clustered in sizeable settlements, as in Minnesota, they transmitted a sense of identity beyond the second generation; and emotional attachments to the lands of origin lingered.

      Religion was a powerful force for cohesion among the Roman Catholic Irish and the Jews, both tiny groups before 1840, both reinforced by mass migration thereafter. Both have now become strikingly heterogeneous, displaying a wide variety of economic and social conditions, as well as a degree of conformity to the styles of life of other Americans. But the pull of external concerns—in the one case, unification of Ireland; in the other, Israel's security—has helped to preserve group loyalty.

      Indeed, by the 1970s "ethnic" (in its narrow connotation) had come to be used to describe the Americans of Polish, Italian, Lithuanian, Czech, and Ukrainian extraction, along with those of other eastern and southern European ancestry. Tending to be Roman Catholic and middle-class, most settled in the North and Midwest. The city neighbourhoods in which many of them lived initially had their roots in the "Little Italys" and "Polish Hills" established by the immigrants. By the 1980s and '90s a significant number had left these enclaves for nearby suburbs. The only European ethnic group to arrive in large numbers at the end of the 20th century was the Russians, especially Russian Jews, benefiting from perestroika.

      In general, a pattern of immigration, self-support, and then assimilation was typical. Recently established ethnic groups often preserve greater visibility and greater cohesion. Their group identity is based not only upon a common cultural heritage but also on the common interests, needs, and problems they face in the present-day United States. Like immigrants before them, most have been taught to believe that success in the United States is achieved through individual effort. They tend to believe in equality of opportunity and self-improvement and attribute poverty to the failing of the individual and not to inequities in society. As the composition of the U.S. population changed, it was projected that sometime in the 21st century, Americans of European descent would be outnumbered by those from non-European ethnic groups.

African-Americans
      From colonial times, African-Americans arrived in large numbers as slaves and lived primarily on plantations in the South. In 1790 slave and free blacks together made up about one-fifth of the U.S. population. As the nation split between southern slave and northern free states prior to the American Civil War, the Underground Railroad spirited thousands of escaped slaves from South to North. In the century following abolition, this migration pattern became more pronounced as 6.5 million blacks moved from rural areas of the South to northern cities between 1910 and 1970. On the heels of this massive internal shift came new immigrants from West Africa and the black Caribbean, principally Haiti, Jamaica, and the Dominican Republic.

      The civil rights movement in the 1950s and '60s awakened the nation's conscience to the plight of African-Americans, who had long been denied first-class citizenship. The movement used nonviolence and passive resistance to change discriminatory laws and practices, primarily in the South. As a result, increases in median income and college enrollment among the black population were dramatic in the late 20th century. Widening access to professional and business opportunities included noteworthy political victories. By the early 1980s black mayors in Chicago, Los Angeles, Cleveland, Baltimore, Atlanta, and Washington, D.C., had gained election with white support. In 1984 and 1988 Jesse Jackson ran for U.S. president; he was the first African-American to contend seriously for a major party nomination. However, despite an expanding black middle class and equal-opportunity laws in education, housing, and employment, African-Americans continue to face staunch social and political challenges, especially those living in the inner cities, where some of American society's most difficult problems (such as crime and drug trafficking) are acute.

The Hispanics
      Like African-Americans, Hispanics (Latinos) make up about one-eighth of the U.S. population. Although they generally share Spanish as a second (and sometimes first) language, Hispanics are hardly a monolithic group. The majority, nearly three-fifths, are of Mexican origin—some descended from settlers in portions of the United States that were once part of Mexico (Texas, Arizona, New Mexico, and California), others legal and illegal migrants from across the loosely guarded Mexico–U.S. border. The greater opportunities and higher living standards in the United States have long attracted immigrants from Mexico and Central America.

      The Puerto Rican experience in the United States is markedly different from that of Mexican Americans. Most importantly, Puerto Ricans are American citizens by virtue of the island commonwealth's association with the United States. As a result, migration between Puerto Rico and the United States has been fairly fluid, mirroring the continuous process by which Americans have always moved to where chances seem best. While most of that migration traditionally has been toward the mainland, by the end of the 20th century in- and out-migration between the island and the United States equalized. Puerto Ricans now make up about one-tenth of the U.S. Latino population.

      Quite different, though also Spanish-speaking, are the Cubans who fled Fidel Castro's communist revolution of 1959 and their descendants. While representatives of every social group are among them, the initial wave of Cubans was distinctive because of the large number of professional and middle-class people who migrated. Their social and political attitudes differ significantly from those of Mexican Americans and Puerto Ricans, though this distinction was lessened by an influx of 120,000 Cuban refugees in the 1980s, known as the Mariel immigrants.

      After 1960 easy air travel and political and economic instability stimulated a significant migration from the Caribbean, Central America, and South America. The arrivals from Latin America in earlier years were often political refugees; more recently they usually have been economic refugees. Constituting about one-fourth of the Hispanic diaspora, this group comprises largely Central Americans, Colombians, and Dominicans, the last of whom have acted as a bridge between the black and Latino communities. Latinos have come together for better health, housing, and municipal services, for bilingual school programs, and for improved educational and economic opportunities.

Asian-Americans
      Asian-Americans as a group have confounded earlier expectations that they would form an indigestible mass in American society. The Chinese, earliest to arrive (in large numbers from the mid-19th century, principally as labourers, notably on the transcontinental railroad), and the Japanese were long victims of racial discrimination. In 1924 the law barred further entries; those already in the United States had been ineligible for citizenship since the previous year. In 1942 thousands of Japanese, many born in the United States and therefore American citizens, were interned in relocation camps because their loyalty was suspect after the United States engaged Japan in World War II. Subsequently, anti-Asian prejudice largely dissolved, and Chinese and Japanese, along with others such as the Vietnamese and Taiwanese, have adjusted and advanced. Among generally more recent arrivals, many Koreans, Filipinos, and Asian Indians have quickly enjoyed economic success. Though enumerated separately by the U.S. census, Pacific Islanders, such as native Hawaiians, constitute a small minority but contribute to making Hawaii and California the states with the largest percentages of Asian-Americans.

Middle Easterners
      Among the trends of Arab immigration in the 20th century were the arrival of Lebanese Christians in the first half of the century and Palestinian Muslims in the second half. Initially Arabs inhabited the East Coast, but by the end of the century there was a large settlement of Arabs in the greater Detroit area. Armenians, also from southwest Asia, arrived in large numbers in the early 20th century, eventually congregating largely in California, where, later in the century, Iranians were also concentrated. Some recent arrivals from the Middle East maintain national customs such as traditional dress.

Native Americans
      Native Americans form an ethnic group only in a very general sense. In the East, centuries of coexistence with whites have led to some degree of intermarriage and assimilation and to various patterns of stable adjustment. In the West the hasty expansion of agricultural settlement crowded the Native Americans into reservations, where federal policy has vacillated between efforts at assimilation and the desire to preserve tribal cultural identity, with unhappy consequences. The Native American population has risen from its low point of 235,000 in 1900 to 2.5 million at the turn of the 21st century.

      The reservations are often enclaves of deep poverty and social distress, although the many casinos operated on their land have created great wealth in some instances. The physical and social isolation of the reservation prompted many Native Americans to migrate to large cities, but, by the end of the 20th century, a modest repopulation occurred in rural counties of the Great Plains. In census numerations Native Americans are categorized with Alaskan natives, notably Aleuts and Eskimos. In the latter half of the 20th century, intertribal organizations were founded to give Native Americans a unified, national presence.

Religious groups
      The U.S. government has never supported an established church, and the diversity of the population has discouraged any tendency toward uniformity in worship. As a result of this individualism, thousands of religious denominations thrive within the country. Only about one-sixth of religious adherents are not Christian, and although Roman Catholicism is the largest single denomination (about one-fifth of the U.S. population), the many churches of Protestantism constitute the majority. Some are the products of native development—among them the Disciples of Christ (founded in the early 19th century), Church of Jesus Christ of Latter-day Saints (Mormons; 1830), Seventh-day Adventists (officially established 1863), Jehovah's Witnesses (1872), Christian Scientists (1879), and the various Pentecostal churches (late 19th century).

      Other denominations had their origins in the Old World, but even these have taken distinctive American forms. Affiliated Roman Catholics look to Rome for guidance, although there are variations in practice from diocese to diocese. More than 5.5 million Jews are affiliated with three national organizations (Orthodox, Conservative, and Reform), as well as with many smaller sects. Most Protestant denominations also have European roots, the largest being the Baptists, Pentecostals, and Methodists. Among other groups are Lutherans, Presbyterians, Episcopalians, various Eastern churches (including Orthodox), Congregationalists, Reformed, Mennonites and Amish, various Brethren, Unitarians, and the Friends (Quakers). By 2000 substantial numbers of recent immigrants had increased the Muslim, Buddhist, and Hindu presence to about 4 million, 2.5 million, and 1 million believers, respectively.

  Immigration legislation began in earnest in the late 19th century, but it was not until after World War I that the era of mass immigration came to an abrupt end. The Immigration Act of 1924 set an annual quota (fixed in 1929 at 150,000) and established the national-origins system, which was to characterize immigration policy for the next 40 years. Under it, quotas were established for each country based on the number of persons of that national origin who were living in the United States in 1920. The quotas reduced drastically the flow of immigrants from southeastern Europe in favour of the countries of northwestern Europe. The quota system was abolished in 1965 in favour of a predominantly first-come, first-served policy. An annual ceiling of immigrant visas was established for nations outside the Western Hemisphere (170,000, with 20,000 allowed to any one nation) and for all persons from the Western Hemisphere (120,000).

      The new policy radically changed the pattern of immigration. For the first time, non-Europeans formed the dominant immigrant group, with new arrivals from Asia, Latin America, the Caribbean, and the Middle East. In the 1980s and '90s immigration was further liberalized by granting amnesty to illegal aliens, raising admission limits, and creating a system for validating refugees. In recent decades the plurality of immigrants, both legal and illegal, has come from Mexico and elsewhere in Latin America, though Asians form a significant percentage.

Ed.: John Naisbitt, Thea K. Flaum, Oscar Handlin

The economy
 The United States is the world's greatest economic power in terms of gross domestic product (GDP) and is among the greatest powers in terms of GDP per capita. With less than 5 percent of the world's population, the United States produces about one-fifth of the world's economic output.

      The sheer size of the U.S. economy makes it the most important single factor in global trade. Its exports represent more than one-tenth of the world total. The United States also influences the economies of the rest of the world because it is a significant source of investment capital. Just as direct investment, primarily by the British, was a major factor in 19th-century U.S. economic growth, so direct investment abroad by U.S. firms is a major factor in the economic well-being of Canada, Mexico, China, and many countries in Latin America, Europe, and Asia.

Strengths and weaknesses
      The U.S. economy is marked by resilience, flexibility, and innovation. In the first decade of the 21st century, the economy was able to withstand a number of costly setbacks. These included the collapse of stock markets following an untenable run-up in technology shares, losses from corporate scandals, the September 11 attacks in 2001, wars in Afghanistan and Iraq, and a devastating hurricane along the Gulf Coast near New Orleans in 2005.

      For the most part, the U.S. government plays only a small direct role in running the nation's economic enterprises. Businesses are free to hire or fire employees and open or close operations. Unlike the situation in many other countries, new products and innovative practices can be introduced with minimal bureaucratic delays. The government does, however, regulate various aspects of all U.S. industries. Federal agencies oversee worker safety and work conditions, air and water pollution, food and prescription drug safety, transportation safety, and automotive fuel economy—to name just a few examples. Moreover, the Social Security Administration operates the country's pension system, which is funded through payroll taxes. The government also operates public health programs such as Medicaid (for the poor) and Medicare (for the elderly).

      In an economy dominated by privately owned businesses, there are still some government-owned companies. These include the U.S. Postal Service, the Nuclear Regulatory Commission, the National Railroad Passenger Corporation (Amtrak), and the Tennessee Valley Authority.

      The federal government also influences economic activity in other ways. As a purchaser of goods, it exerts considerable leverage on certain sectors of the economy—most notably in the defense and aerospace industries. It also implements antitrust laws to prevent companies from colluding on prices or monopolizing market shares.

      Despite its ability to weather economic shocks, in the earliest years of the 21st century, the U.S. economy developed many weaknesses that pointed to future risks. The country faces a chronic trade deficit; the value of imports greatly outweighs that of the goods and services the United States exports to other countries. For many citizens, household incomes have effectively stagnated since the 1970s, while indebtedness has reached record levels. Rising energy prices made it more costly to run businesses, heat homes, and transport goods and people. The country's aging population placed new burdens on public health spending and pension programs (including Social Security). At the same time, the burgeoning federal budget deficit limited the amount of funding available for social programs.

      Nearly all of the federal government's revenues come from taxes, with total income from federal taxes representing about one-fifth of GDP. The most important source of tax revenue is the personal income tax (accounting for roughly half of federal revenue). Gross receipts from corporate income taxes yield a far smaller fraction (about one-eighth) of total federal receipts. Excise duties yield yet another small portion (less than one-tenth) of total federal revenue; however, individual states levy their own excise and sales taxes. Federal excises rest heavily on alcohol, gasoline, and tobacco. Other sources of revenue include Medicare and social security payroll taxes (which account for almost two-fifths of federal revenue) and estate and gift taxes (yielding only about 1 percent of the total).

Labour force
      With an unemployment rate of roughly 5 percent, the U.S. labour market is in line with those of other developed countries. The service sector accounts for more than three-fourths of the country's jobs, whereas industrial and manufacturing trades employ less than one-fifth of the labour force.

      After peaking in the 1950s, when 36 percent of American workers were enrolled in unions, union membership at the beginning of the 21st century had fallen to less than 15 percent of U.S. workers, nearly half of them government employees. The transformation in the late 20th century to a service-based economy changed the nature of labour unions. Organizational efforts, once aimed primarily at manufacturing industries, are now focused on service industries. The country's largest union, the National Education Association (NEA), represents teachers. In 2005 three large labour unions broke their affiliation with the American Federation of Labor–Congress of Industrial Organizations (AFL-CIO), the nationwide federation of unions, and formed a new federation, the Change to Win coalition, with the goal of reviving union influence in the labour market. Although the freedom to strike is qualified with provisions requiring cooling-off periods and in some cases compulsory arbitration, major unions are able and sometimes willing to embark on long strikes.

Agriculture, forestry, and fishing
 Despite the enormous productivity of U.S. agriculture, the combined outputs of agriculture, forestry, and fishing contribute only a small percentage of GDP. Advances in farm productivity (stemming from mechanization and organizational changes in commercial farming) have enabled a smaller labour force to produce greater quantities than ever before. Improvements in yields have also resulted from the increased use of fertilizers, pesticides, and herbicides and from changes in agricultural techniques (such as irrigation). Among the most important crops are corn (maize), soybeans, wheat, cotton, grapes, and potatoes.

      The United States is the world's major producer of timber. More than four-fifths of the trees harvested are softwoods such as Douglas fir and southern pine. The major hardwood is oak.

      The United States also ranks among the world's largest producers of edible and nonedible fish products. Fish for human consumption accounts for more than half of the tonnage landed. Shellfish account for less than one-fifth of the annual catch but for nearly half the total value.

      Less than one-fiftieth of the GDP comes from mining and quarrying, yet the United States is a leading producer of coal, petroleum, and some metals.

Resources and power
      The United States is one of the world's leading producers of energy. It is also the world's biggest consumer of energy. It therefore relies on other countries for many energy sources—petroleum products in particular. The country is notable for its efficient use of natural resources, and it excels in transforming its resources into usable products.

      With major producing fields in Alaska, California, the Gulf of Mexico, Louisiana, and Oklahoma, the United States is one of the world's leading producers of refined petroleum and has important reserves of natural gas. It is also among the world's leading coal exporters. Recoverable coal deposits are concentrated largely in the Appalachian Mountains and in Wyoming. Nearly half the bituminous coal is mined in West Virginia and Kentucky, while Pennsylvania produces the country's only anthracite. Illinois, Indiana, and Ohio also produce coal.

      Iron ore is mined predominantly in Minnesota and Michigan. The United States also has important reserves of copper, magnesium, lead, and zinc. Copper production is concentrated in the mountainous western states of Arizona, Utah, Montana, Nevada, and New Mexico. Zinc is mined in Tennessee, Missouri, Idaho, and New York. Lead mining is concentrated in Missouri. Other metals mined in the United States are gold, silver, molybdenum, manganese, tungsten, bauxite, uranium, vanadium, and nickel. Important nonmetallic minerals produced are phosphates, potash, sulfur, stone, and clays.

Biological resources
      More than two-fifths of the total land area of the United States is devoted to farming (including pasture and range). Tobacco is produced in the Southeast and in Kentucky and cotton in the South and Southwest; California is noted for its vineyards, citrus groves, and truck gardens; the Midwest is the centre of corn and wheat farming, while dairy herds are concentrated in the Northern states. The Southwestern and Rocky Mountain states support large herds of livestock.

      Most of the U.S. forestland is located in the West (including Alaska), but significant forests also grow elsewhere. Almost half of the country's hardwood forests are located in Appalachia. Of total commercial forestland, more than two-thirds is privately owned. About one-fifth is owned or controlled by the federal government, the remainder being controlled by state and local governments.

      Hydroelectric resources are heavily concentrated in the Pacific and Mountain regions. Hydroelectricity, however, contributes less than one-tenth of the country's electricity supply. Coal-burning plants provide more than half of the country's power; nuclear generators contribute about one-fifth.

      Since the mid-20th century, services (such as health care, entertainment, and finance) have grown faster than any other sector of the economy. Nevertheless, while manufacturing jobs have declined since the 1960s, advances in productivity have caused manufacturing output, including construction, to remain relatively constant, at about one-fifth of GDP.

      Significant economic productivity occurs in a wide range of industries. The manufacture of transportation equipment (including motor vehicles, aircraft, and space equipment) represents a leading sector. Computer and telecommunications firms (including software and hardware) remain strong, despite a downturn in the early 21st century. Other important sectors include drug manufacturing and biotechnology, health services, food products, chemicals, electrical and nonelectrical machinery, energy, and insurance.

      Under the Federal Reserve System, which regulates bank credit and influences the money supply, central banking functions are exercised by 12 regional Federal Reserve banks. The Board of Governors, appointed by the U.S. president, supervises these banks. Based in Washington, D.C., the board does not necessarily act in accord with the administration's views on economic policy. The U.S. Treasury also influences the working of the monetary system through its management of the national debt (which can affect interest rates) and by changing its own deposits with the Federal Reserve banks (which can affect the volume of credit). While only about two-fifths of all commercial banks belong to the Federal Reserve System, these banks hold almost three-fourths of all commercial bank deposits. Banks incorporated under national charter must be members of the system, while banks incorporated under state charters may become members. Member banks must maintain minimum legal reserves and must deposit a percentage of their savings and checking accounts with a Federal Reserve bank. There are also thousands of nonbank credit agencies such as personal credit institutions and savings and loan associations (S&Ls).

      Although banks supply less than half of the funds used for corporate finance, bank loans represent the country's largest source of capital for business borrowing. A liberalizing trend in state banking laws in the 1970s and '80s encouraged both intra- and interstate expansion of bank facilities and bank holding companies. Subsequent mergers among the country's largest banks led to the formation of large regional and national banking and financial services corporations. In serving both individual and commercial customers, these institutions accept deposits, provide checking accounts, underwrite securities, originate loans, offer mortgages, manage investments, and sponsor credit cards.

      Financial services are also provided by insurance companies and security brokerages. The federal government sponsors credit agencies in the areas of housing (home mortgages), farming (agricultural loans), and higher education (student loans). New York City has three organized stock exchanges—the New York Stock Exchange (NYSE), the American Stock Exchange (AMEX), and the National Association of Securities Dealers Automated Quotations (NASDAQ) Stock Market—which account for the bulk of all stock sales in the United States. The country's leading markets for commodities, futures, and options are the Chicago Board of Trade (CBOT), the Chicago Mercantile Exchange (CME), and the Chicago Board Options Exchange (CBOE). The Chicago Climate Exchange (CCX) specializes in futures contracts for greenhouse gas emissions (carbon credits). Smaller exchanges operate in a number of American cities.

Foreign trade
      International trade is crucial to the national economy, with the combined value of imports and exports equivalent to about one-sixth of the gross national product. Canada, Mexico, Japan, China, and the United Kingdom are the principal trading partners. Leading exports include electrical and office machinery, chemical products, motor vehicles, airplanes and aviation parts, and scientific equipment. Major imports include manufactured goods, petroleum and fuel products, and machinery and transportation equipment.

E.I.U. Ed.

      The economic and social complexion of life in the United States mirrors the nation's extraordinary mobility. A pervasive transportation network has helped transform the vast geographic expanse into a surprisingly homogeneous and close-knit social and economic environment. Another aspect of mobility is flexibility, and this freedom to move is often seen as a major factor in the dynamism of the U.S. economy. Mobility has also had destructive effects: it has accelerated the deterioration of older urban areas, multiplied traffic congestion, intensified pollution of the environment, and diminished support for public transportation systems.

Roads and railroads
      Central to the U.S. transportation network is the 45,000-mile Interstate System, now known as the Dwight D. Eisenhower System of Interstate and Defense Highways. The system connects about nine-tenths of all cities of at least 50,000 population. Begun in the 1950s, the highway system carries about one-fifth of the country's motor traffic. Nearly nine-tenths of all households own at least one automobile or truck. At the end of the 20th century, these added up to more than 100 million privately owned vehicles. While most trips in metropolitan areas are made by automobile, the public transit and rail commuter lines play an important role in the most populous cities, with the majority of home-to-work commuters traveling by public carriers in such cities as New York City, Chicago, Philadelphia, and Boston. Although railroads once dominated both freight and passenger traffic in the United States, government regulation and increased competition from trucking reduced their role in transportation. Railroads move about one-third of the nation's intercity freight traffic. The most important items carried are coal, grain, chemicals, and motor vehicles. Many rail companies had given up passenger service by 1970, when Congress created the National Railroad Passenger Corporation (known as Amtrak), a government corporation, to take over passenger service. Amtrak operates a 21,000-mile system serving more than 500 stations across the country.

Water and air transport
 Navigable waterways are extensive and centre upon the Mississippi River system in the country's interior, the Great Lakes–St. Lawrence Seaway system in the north, and the Gulf Coast waterways along the Gulf of Mexico. Barges carry more than two-thirds of domestic waterborne traffic, transporting petroleum products, coal and coke, and grain. The country's largest ports in tonnage handled are the Port of South Louisiana; the Port of Houston, Texas; the Port of New York/New Jersey; and the Port of New Orleans, La.

      Air traffic has experienced spectacular growth in the United States since the mid-20th century. From 1970 to 1999, passenger traffic on certified air carriers increased 373 percent. Much of this growth occurred after airline deregulation, which began in 1978. There are more than 14,000 public and private airports, the busiest being in Atlanta, Ga., and Chicago for passenger traffic. Airports in Memphis, Tenn. (the hub of package-delivery company Federal Express), and Los Angeles handle the most freight cargo.

Government and society

Constitutional framework
      The Constitution of the United States, written to redress the deficiencies of the country's first constitution, the Articles of Confederation (1781–89), defines a federal system of government in which certain powers are delegated to the national government and others are reserved to the states. The national government consists of executive, legislative, and judicial branches that are designed to ensure, through separation of powers and through checks and balances, that no one branch of government is able to subordinate the other two. All three branches are interrelated, each with overlapping yet quite distinct authority.

      The U.S. Constitution, the world's oldest written national constitution still in effect, was ratified on June 21, 1788 (when New Hampshire became the ninth state to ratify the document) and formally entered into force on March 4, 1789; George Washington was sworn in as the country's first president the following month. Although the Constitution contains several specific provisions (such as age and residency requirements for holders of federal offices and powers granted to Congress), it is vague in many areas and could not have comprehensively addressed the myriad issues (historical, technological, and otherwise) that have arisen in the centuries since its ratification. Thus, the Constitution is considered a living document, its meaning changing over time as a result of new interpretations of its provisions. In addition, the framers allowed for changes to the document, outlining in Article V the procedures required to amend the Constitution. Amending the Constitution requires a proposal by a two-thirds vote of each house of Congress or by a national convention called at the request of the legislatures of two-thirds of the states, followed by ratification by three-fourths of the state legislatures or by conventions in as many states.
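The Article V thresholds are simple supermajority arithmetic. As an illustrative sketch only (the figures of 435 representatives, 100 senators, and 50 states are the modern ones, set by statute and by admission of states rather than by Article V itself):

```python
def votes_needed(members: int, num: int, den: int) -> int:
    """Smallest whole number of votes that is at least num/den of the
    membership (exact integer ceiling division, avoiding float rounding)."""
    return -(-members * num // den)

# Proposal: a two-thirds vote of each house of Congress
house_to_propose = votes_needed(435, 2, 3)   # representatives
senate_to_propose = votes_needed(100, 2, 3)  # senators

# Ratification: three-fourths of the 50 states
states_to_ratify = votes_needed(50, 3, 4)

print(house_to_propose, senate_to_propose, states_to_ratify)  # 290 67 38
```

Because 435 is divisible by 3, the House threshold is exact, while the Senate and state thresholds round up to the next whole vote.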

      In the more than two centuries since the Constitution's ratification, there have been 27 amendments. All successful amendments have been proposed by Congress, and all but one—the Twenty-first Amendment (1933), which repealed prohibition—have been ratified by state legislatures. The first 10 amendments, proposed by Congress in September 1789 and adopted in 1791, are known collectively as the Bill of Rights, which places limits on the federal government's power to curtail individual freedoms. The First Amendment, for example, provides that “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.” Though the First Amendment's language appears absolute, it has been interpreted to mean that the federal government (and later the state governments) cannot place undue restrictions on individual liberties but can regulate speech, religion, and other rights. The Second and Third amendments, which, respectively, guarantee the people's right to bear arms and limit the quartering of soldiers in private houses, reflect the hostility of the framers to standing armies. The Fourth through Eighth amendments establish the rights of the criminally accused, including safeguards against unreasonable searches and seizures, protection from double jeopardy (being tried twice for the same offense), the right to refuse to testify against oneself, and the right to a trial by jury. The Ninth and Tenth amendments underscore the general rights of the people. The Ninth Amendment protects the unenumerated residual rights of the people (i.e., those not explicitly granted in the Constitution), and the Tenth Amendment reserves to the states or to the people those powers not delegated to the United States nor denied to the states.

      The guarantees of the Bill of Rights are steeped in controversy, and debate continues over the limits that the federal government may appropriately place on individuals. One source of conflict has been the ambiguity in the wording of many of the Constitution's provisions—such as the Second Amendment's right “to keep and bear arms” and the Eighth Amendment's prohibition of “cruel and unusual punishments.” Also problematic is the Tenth Amendment's apparent contradiction of the body of the Constitution; Article I, Section 8, enumerates the powers of Congress but also allows that it may make all laws “which shall be necessary and proper,” while the Tenth Amendment stipulates that “powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.” The distinction between what powers should be left to the states or to the people and what is a necessary and proper law for Congress to pass has not always been clear.

      Between the ratification of the Bill of Rights and the American Civil War (1861–65), only two amendments were passed, and both were technical in nature. The Eleventh Amendment (1795) forbade suits against the states in federal courts, and the Twelfth Amendment (1804) corrected a constitutional error that came to light in the presidential election of 1800, when Democratic-Republicans Thomas Jefferson and Aaron Burr each won 73 electors because electors were unable to cast separate ballots for president and vice president. The Thirteenth, Fourteenth, and Fifteenth amendments were passed in the aftermath of the Civil War. The Thirteenth (1865) abolished slavery, while the Fifteenth (1870) forbade denial of the right to vote on account of race, colour, or previous condition of servitude. The Fourteenth Amendment, which granted citizenship rights to former slaves and guaranteed to every citizen due process and equal protection of the laws, was regarded for a while by the courts as limiting itself to the protection of freed slaves, but it has since been used to extend protections to all citizens. Initially, the Bill of Rights applied solely to the federal government and not to the states. In the 20th century, however, many (though not all) of the provisions of the Bill of Rights were extended by the Supreme Court through the Fourteenth Amendment to protect individuals from encroachments by the states. Notable amendments since the Civil War include the Sixteenth (1913), which enabled the imposition of a federal income tax; the Seventeenth (1913), which provided for the direct election of U.S. senators; the Nineteenth (1920), which established woman suffrage; the Twenty-fifth (1967), which established succession to the presidency and vice presidency; and the Twenty-sixth (1971), which extended voting rights to all citizens 18 years of age or older.

The executive branch
      The executive branch is headed by the president, who must be a natural-born citizen of the United States, at least 35 years old, and a resident of the country for at least 14 years. A president is elected indirectly by the people through an electoral college system to a four-year term and is limited to two elected terms of office by the Twenty-second Amendment (1951). The president's official residence and office is the White House, located at 1600 Pennsylvania Avenue N.W. in Washington, D.C. The formal constitutional responsibilities vested in the presidency of the United States include serving as commander in chief of the armed forces; negotiating treaties; appointing federal judges, ambassadors, and cabinet officials; and acting as head of state. In practice, presidential powers have expanded to include drafting legislation, formulating foreign policy, conducting personal diplomacy, and leading the president's political party.

      The members of the president's cabinet—the attorney general and the secretaries of State, Treasury, Defense, Homeland Security, Interior, Agriculture, Commerce, Labor, Health and Human Services, Housing and Urban Development, Transportation, Education, Energy, and Veterans Affairs—are appointed by the president with the approval of the Senate; although they are described in the Twenty-fifth Amendment as “the principal officers of the executive departments,” significant power has flowed to non-cabinet-level presidential aides, such as those serving in the Office of Management and Budget (OMB), the Council of Economic Advisers, the National Security Council (NSC), and the office of the White House Chief of Staff; cabinet-level rank may be conferred on the heads of such institutions at the discretion of the president. Members of the cabinet and presidential aides serve at the pleasure of the president and may be dismissed by him at any time.

      The executive branch also includes independent regulatory agencies such as the Federal Reserve System and the Securities and Exchange Commission. Governed by commissions appointed by the president and confirmed by the Senate (commissioners may not be removed by the president), these agencies protect the public interest by enforcing rules and resolving disputes over federal regulations. Also part of the executive branch are government corporations (e.g., the Tennessee Valley Authority, the National Railroad Passenger Corporation [Amtrak], and the U.S. Postal Service), which supply services to consumers that could be provided by private corporations, and independent executive agencies (e.g., the Central Intelligence Agency, the National Science Foundation, and the National Aeronautics and Space Administration), which comprise the remainder of the federal government.

The legislative branch
      The U.S. Congress (Congress of the United States), the legislative branch of the federal government, consists of two houses: the Senate and the House of Representatives (Representatives, House of). Powers granted to Congress under the Constitution include the power to levy taxes, borrow money, regulate interstate commerce, impeach and convict the president, declare war, discipline its own membership, and determine its rules of procedure.

      With the exception of revenue bills, which must originate in the House of Representatives, legislative bills may be introduced in and amended by either house, and a bill—with its amendments—must pass both houses in identical form and be signed by the president before it becomes law. The president may veto a bill, but a veto can be overridden by a two-thirds vote of both houses. The House of Representatives may impeach a president or another public official by a majority vote; trials of impeached officials are conducted by the Senate, and a two-thirds majority is necessary to convict and remove the individual from office. Congress is assisted in its duties by the Government Accountability Office (GAO; known until 2004 as the General Accounting Office), which examines all federal receipts and expenditures by auditing federal programs and assessing the fiscal impact of proposed legislation, and by the Congressional Budget Office (CBO), a legislative counterpart to the OMB, which assesses budget data, analyzes the fiscal impact of alternative policies, and makes economic forecasts.
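The passage-and-veto arithmetic described above can be sketched in a few lines. This is an illustrative simplification, not a model of actual congressional procedure: it assumes full attendance and reuses one vote tally for both passage and any override, and it ignores quorums, pocket vetoes, and the identical-text requirement.

```python
def becomes_law(house_yeas: int, senate_yeas: int, president_signs: bool,
                house_seats: int = 435, senate_seats: int = 100) -> bool:
    """Illustrative check of whether a bill becomes law (simplified)."""
    def majority(yeas, seats):
        return yeas * 2 > seats            # more than half of the chamber
    def two_thirds(yeas, seats):
        return yeas * 3 >= seats * 2       # at least two-thirds of the chamber

    # A bill must first pass both houses.
    if not (majority(house_yeas, house_seats) and majority(senate_yeas, senate_seats)):
        return False
    if president_signs:
        return True                        # signed into law
    # Vetoed: both houses must override by a two-thirds vote.
    return two_thirds(house_yeas, house_seats) and two_thirds(senate_yeas, senate_seats)
```

For example, a bill passing 250–185 in the House and 60–40 in the Senate becomes law if signed but dies if vetoed, since neither tally reaches two-thirds.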

      The House of Representatives is chosen by the direct vote of the electorate in single-member districts in each state. The number of representatives allotted to each state is based on its population as determined by a decennial census; states sometimes gain or lose seats, depending on population shifts. The overall membership of the House has been 435 since the 1910s, though it was temporarily expanded to 437 after Hawaii and Alaska were admitted as states in 1959. Members must be at least 25 years old, residents of the states from which they are elected, and citizens of the United States for at least seven years. It has become a practical imperative—though not a constitutional requirement—that a member be an inhabitant of the district that elects him. Members serve two-year terms, and there is no limit on the number of terms they may serve. The speaker of the House, who is chosen by the majority party, presides over debate, appoints members of select and conference committees, and performs other important duties; he is second in the line of presidential succession (following the vice president). The parliamentary leaders of the two main parties are the majority floor leader and the minority floor leader. The floor leaders are assisted by party whips, who are responsible for maintaining contact between the leadership and the members of the House. Bills introduced by members in the House of Representatives are received by standing committees, which can amend, expedite, delay, or kill legislation. Each committee is chaired by a member of the majority party, who traditionally attained this position on the basis of seniority, though the importance of seniority has eroded somewhat since the 1970s. Among the most important committees are those on Appropriations, Ways and Means, and Rules. The Rules Committee, for example, has significant power to determine which bills will be brought to the floor of the House for consideration and whether amendments will be allowed on a bill when it is debated by the entire House.

      Each state elects two senators at large. Senators must be at least 30 years old, residents of the state from which they are elected, and citizens of the United States for at least nine years. They serve six-year terms, which are arranged so that one-third of the Senate is elected every two years. Senators also are not subject to term limits. The vice president serves as president of the Senate, casting a vote only in the case of a tie, and in his absence the Senate is chaired by a president pro tempore, who is elected by the Senate and is third in the line of succession to the presidency. Among the Senate's most prominent standing committees are those on Foreign Relations, Finance, Appropriations, and Governmental Affairs. Debate is almost unlimited and may be used to delay a vote on a bill indefinitely. Such a delay, known as a filibuster, can be ended by three-fifths of the Senate through a procedure called cloture. Treaties negotiated by the president with other governments must be ratified by a two-thirds vote of the Senate. The Senate also has the power to confirm or reject presidentially appointed federal judges, ambassadors, and cabinet officials.

The judicial branch
      The judicial branch is headed by the Supreme Court of the United States, which interprets the Constitution and federal legislation. The Supreme Court consists of nine justices (including a chief justice) appointed to life terms by the president with the consent of the Senate. It has appellate jurisdiction over the lower federal courts and over state courts if a federal question is involved. It also has original jurisdiction (i.e., it serves as a trial court) in cases involving foreign ambassadors, ministers, and consuls and in cases to which a U.S. state is a party.

      Most cases reach the Supreme Court through its appellate jurisdiction. The Judiciary Act of 1925 provided the justices with the sole discretion to determine their caseload. In order to issue a writ of certiorari, which grants a court hearing to a case, at least four justices must agree (the “Rule of Four”). Three types of cases commonly reach the Supreme Court: cases involving litigants from different states, cases involving the interpretation of federal law, and cases involving the interpretation of the Constitution. The court can take official action with as few as six justices joining in deliberation, and a majority vote of the entire court is decisive; a tie vote sustains a lower-court decision. The official decision of the court is often supplemented by concurring opinions from justices who support the majority decision and dissenting opinions from justices who oppose it.
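The voting rules described here (the Rule of Four, the quorum of six, and the tie rule) reduce to a few comparisons. A minimal sketch, with function names of my own choosing:

```python
def certiorari_granted(votes_to_hear: int) -> bool:
    """Rule of Four: at least four of the nine justices must vote to hear a case."""
    return votes_to_hear >= 4

def outcome(votes_to_reverse: int, participating: int) -> str:
    """Illustrative outcome of a decided case: a majority of participating
    justices is needed to reverse, and a tie sustains the lower-court ruling."""
    if participating < 6:
        raise ValueError("a quorum of six justices is required to act")
    if votes_to_reverse * 2 > participating:
        return "reversed"
    return "lower-court decision stands"   # covers ties and losing votes
```

So with eight justices participating, a 4–4 split leaves the lower-court decision in place rather than setting a precedent.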

      Because the Constitution is vague and ambiguous in many places, it is often possible for critics to fault the Supreme Court for misinterpreting it. In the 1930s, for example, the Republican-dominated court was criticized for overturning much of the New Deal legislation of Democratic President Franklin D. Roosevelt. In the area of civil rights, the court has received criticism from various groups at different times. Its 1954 ruling in Brown v. Board of Education of Topeka, which declared school segregation unconstitutional, was harshly attacked by Southern political leaders, who were later joined by Northern conservatives. A number of decisions involving the pretrial rights of prisoners, including the granting of Miranda rights and the adoption of the exclusionary rule, also came under attack on the ground that the court had made it difficult to convict criminals. On divisive issues such as abortion, affirmative action, school prayer, and flag burning, the court's decisions have aroused considerable opposition and controversy, with opponents sometimes seeking constitutional amendments to overturn the court's decisions.

      At the lowest level of the federal court system are district courts (see United States District Court). Each state has at least one federal district court and at least one federal judge. District judges are appointed to life terms by the president with the consent of the Senate. Appeals from district-court decisions are carried to the U.S. courts of appeals (see United States Court of Appeals). Losing parties at this level may appeal for a hearing from the Supreme Court. Special courts handle property and contract damage suits against the United States (United States Court of Federal Claims), review customs rulings (United States Court of International Trade), hear complaints by individual taxpayers (United States Tax Court) or veterans (United States Court of Appeals for Veterans Claims), and apply the Uniform Code of Military Justice (United States Court of Appeals for the Armed Forces).

State and local government
      Because the U.S. Constitution establishes a federal system, the state governments enjoy extensive authority. The Constitution outlines the specific powers granted to the national government and reserves the remainder to the states. However, because of ambiguity in the Constitution and disparate historical interpretations by the federal courts, the powers actually exercised by the states have waxed and waned over time. Beginning in the last decades of the 20th century, for example, decisions by conservative-leaning federal courts, along with a general trend favouring the decentralization of government, increased the power of the states relative to the federal government. In some areas, the authority of the federal and state governments overlap; for example, the state and federal governments both have the power to tax, establish courts, and make and enforce laws. In other areas, such as the regulation of commerce within a state, the establishment of local governments, and action on public health, safety, and morals, the state governments have considerable discretion. The Constitution also denies to the states certain powers; for example, the Constitution forbids states to enter into treaties, to tax imports or exports, or to coin money. States also may not adopt laws that contradict the U.S. Constitution.

      The governments of the 50 states have structures closely paralleling those of the federal government. Each state has a governor, a legislature, and a judiciary. Each state also has its own constitution.

      Mirroring the U.S. Congress, all state legislatures are bicameral except Nebraska's, which is unicameral. Most state judicial systems are based upon elected justices of the peace (although in many states this term is not used), above whom are major trial courts, often called district courts, and appellate courts. Each state has its own supreme court. In addition, there are probate courts concerned with wills, estates, and guardianships. Most state judges are elected, though some states use an appointment process similar to the federal courts and some use a nonpartisan selection process known as the Missouri Plan.

      State governors are directly elected and serve varying terms (generally ranging from two to four years); in some states, the number of terms a governor may serve is limited. The powers of governors also vary, with some state constitutions ceding substantial authority to the chief executive (such as appointment and budgetary powers and the authority to veto legislation). In a few states, however, governors have highly circumscribed authority, with the constitution denying them the power to veto legislative bills.

      Most states have a lieutenant governor, who is often elected independently of the governor and is sometimes not a member of the governor's party. Lieutenant governors generally serve as the presiding officer of the state Senate. Other elected officials commonly include a secretary of state, state treasurer, state auditor, attorney general, and superintendent of public instruction.

      State governments have a wide array of functions, encompassing conservation, highway and motor vehicle supervision, public safety and corrections, professional licensing, regulation of agriculture and of intrastate business and industry, and certain aspects of education, public health, and welfare. The administrative departments that oversee these activities are headed by the governor.

      Each state may establish local governments to assist it in carrying out its constitutional powers. Local governments exercise only those powers that are granted to them by the states, and a state may redefine the role and authority of local government as it deems appropriate. The country has a long tradition of local democracy (e.g., the town meeting), and even some of the smallest areas have their own governments. There are some 85,000 local government units in the United States. The largest local government unit is the county (called a parish in Louisiana or a borough in Alaska). Counties range in population from as few as 100 people to millions (e.g., Los Angeles county). They often provide local services in rural areas and are responsible for law enforcement and keeping vital records. Smaller units include townships, villages, school districts, and special districts (e.g., housing authorities, conservation districts, and water authorities).

      Municipal, or city, governments are responsible for delivering most local services, particularly in urban areas. At the beginning of the 21st century there were some 20,000 municipal governments in the United States. They are more diverse in structure than state governments. There are three basic types: mayor-council, commission, and council-manager governments. The mayor-council form, which is used in Boston, New York City, Philadelphia, Chicago, and thousands of smaller cities, consists of an elected mayor and council. The powers of mayors and councils vary from city to city: in most cities the mayor has limited powers and serves largely as a ceremonial leader, but in some cities (particularly large urban areas) the council is nominally responsible for formulating city ordinances, which the mayor enforces, though in practice the mayor often controls the actions of the council. In the commission type, used less frequently now than it was in the early 20th century, voters elect a number of commissioners, each of whom serves as head of a city department; the presiding commissioner is generally the mayor. In the council-manager type, used in large cities such as Charlotte (North Carolina), Dallas (Texas), Phoenix (Arizona), and San Diego (California), an elected council hires a city manager to administer the city departments. The mayor, elected by the council, simply chairs the council and officiates at important functions.

      As society has become increasingly urban, politics and government have become more complex. Many problems of the cities, including transportation, housing, education, health, and welfare, can no longer be handled entirely on the local level. Because even the states do not have the necessary resources, cities have often turned to the federal government for assistance, though proponents of local control have urged that the federal government provide block-grant aid to state and local governments without federal restrictions.

Political process
      The framers of the U.S. Constitution focused their efforts primarily on the role, power, and function of the state and national governments, only briefly addressing the political and electoral process. Indeed, three of the Constitution's four references to the election of public officials left the details to be determined by Congress or the states. The fourth reference, in Article II, Section 1, prescribed the role of the electoral college in choosing the president, but this section was soon amended (in 1804 by the Twelfth Amendment) to remedy the technical defects that had arisen in 1800, when all Democratic-Republican Party electors cast their votes for Thomas Jefferson and Aaron Burr, thereby creating a tie because electors were unable to differentiate between their presidential and vice presidential choices. (The election of 1800 was finally settled by the House of Representatives, which selected Jefferson as president after 36 ballots.)

      In establishing the electoral college, the framers stipulated that “Congress may determine the Time of chusing [sic] the Electors, and the Day on which they shall give their votes; which Day shall be the same throughout the United States.” In 1845 Congress established that presidential electors would be appointed on the first Tuesday after the first Monday in November; the electors cast their ballots on the Monday following the second Wednesday in December. Article I, establishing Congress, merely provides (Section 2) that representatives are to be “chosen every second Year by the People of the several States” and that voting qualifications are to be the same for Congress as for the “most numerous Branch of the State Legislature.” Initially, senators were chosen by their respective state legislatures (Section 3), though this was changed to popular election by the Seventeenth Amendment in 1913. Section 4 leaves to the states the prescription of the “Times, Places and Manner of holding Elections for Senators and Representatives” but gives Congress the power “at any time by Law [to] make or alter such Regulations, except as to the Places of chusing Senators.” In 1875 Congress designated the first Tuesday after the first Monday in November in even years as federal election day.

      All citizens at least 18 years of age are eligible to vote. (Prisoners, ex-felons, and individuals on probation or parole are prohibited, sometimes permanently, from voting in some states.) The history of voting rights in the United States has been one of gradual extension of the franchise. Religion, property ownership, race, and gender have disappeared one by one as legal barriers to voting. In 1870, through the Fifteenth Amendment, former slaves were granted the right to vote, though African Americans were subsequently still denied the franchise (particularly in the South) through devices such as literacy tests, poll taxes, and grandfather clauses. Only in the 1960s, through the Twenty-fourth Amendment (barring poll taxes) and the Voting Rights Act, were the full voting rights of African Americans guaranteed. Though universal manhood suffrage had theoretically been achieved following the American Civil War, woman suffrage was not fully guaranteed until 1920 with the enactment of the Nineteenth Amendment (several states, particularly in the West, had begun granting women the right to vote and to run for political office beginning in the late 19th century). Suffrage was also extended by the Twenty-sixth Amendment (1971), which lowered the minimum voting age to 18.

Voting and elections
      Voters go to the polls in the United States not only to elect members of Congress and presidential electors but also to cast ballots for state and local officials, including governors, mayors, and judges, and on ballot initiatives and referendums that may range from local bond issues to state constitutional amendments (see referendum and initiative). The 435 members of the House of Representatives are chosen by the direct vote of the electorate in single-member districts in each state. State legislatures (sometimes with input from the courts) draw congressional district boundaries, often for partisan advantage (see gerrymandering); incumbents have always enjoyed an electoral advantage over challengers, but, as computer technology has made redistricting more sophisticated and easier to manipulate, elections to the House of Representatives have become even less competitive, with more than 90 percent of incumbents who choose to run for reelection regularly winning—often by significant margins. By contrast, Senate elections are generally more competitive.

      Voters indirectly elect the president and vice president through the electoral college. Instead of choosing a candidate, voters actually choose electors committed to support a particular candidate. Each state is allotted one electoral vote for each of its senators and representatives in Congress; the Twenty-third Amendment (1961) granted electoral votes to the District of Columbia, which does not have congressional representation. A candidate must win a majority (270) of the 538 electoral votes to be elected president. If no candidate wins a majority, the House of Representatives selects the president, with each state delegation receiving one vote; the Senate elects the vice president if no vice presidential candidate secures an electoral college majority. A candidate may lose the popular vote but be elected president by winning a majority of the electoral vote (as George W. Bush did in 2000), though such inversions are rare. Presidential elections are costly and generate much media and public attention—sometimes years before the actual date of the general election. Indeed, some presidential aspirants have declared their candidacies years in advance of the first primaries and caucuses, and some White House hopefuls drop out of the grueling process long before the first votes are cast.

      Voting in the United States is not compulsory, and, in contrast to most other Western countries, voter turnout is quite low. In the late 20th and the early 21st century, about 50 percent of Americans cast ballots in presidential elections; turnout was even lower for congressional and state and local elections, with participation dropping under 40 percent for most congressional midterm elections (held midway through a president's four-year term). Indeed, in some local elections (such as school board elections or bond issues) and primaries or caucuses, turnout has sometimes fallen below 10 percent. High abstention rates led to efforts to encourage voter participation by making voting easier. For example, in 1993 Congress passed the National Voter Registration Act (the so-called “motor-voter law”), which required states to allow citizens to register to vote when they received their driver's licenses, and in 1998 voters in Oregon approved a referendum that established a mail-in voting system. In addition, some states now allow residents to register to vote on election day, polls are opened on multiple days and in multiple locations in some states, and Internet voting has even been introduced on a limited basis for some elections.

Money and campaigns
      Campaigns for all levels of office are expensive in the United States compared with those in most other democratic countries. In an attempt to reduce the influence of money in the political process, reforms were instituted in the 1970s that required public disclosure of contributions and limited the amounts of contributions to candidates for federal office. Individuals were allowed to contribute directly to a candidate no more than $1,000 in so-called “hard money” (i.e., money regulated by federal election law) per candidate per election. The law, however, allowed labour unions, corporations, political advocacy groups, and political parties to raise and spend unregulated “soft money,” so long as funds were not spent specifically to support a candidate for federal office (in practice, this distinction was often blurry). Because there were no limits on such soft money, individuals or groups could contribute to political parties any sum at their disposal or spend limitlessly to advocate policy positions (often to the benefit or detriment of particular candidates). In the 2000 election cycle, it is estimated that more than $1 billion was spent by the Democratic and Republican parties and candidates for office, with more than two-fifths of this total coming from soft money contributions.

      Concerns about campaign financing led to the passage of the Bipartisan Campaign Reform Act of 2002 (popularly called the “McCain-Feingold law” for its two chief sponsors in the Senate, Republican John McCain and Democrat Russell Feingold), which banned national political parties from raising soft money. The law also increased the amount individuals could contribute to candidates (indexing the amount for inflation) and prevented interest groups from broadcasting advertisements that specifically referred to a candidate within 30 days of a primary election and 60 days of a general election.

      There are no federal limits on how much an individual may spend on his or her own candidacy. In 1992, for example, Ross Perot spent more than $60 million of his fortune on his unsuccessful bid to become president of the United States, and Michael Bloomberg was elected mayor of New York City in 2001 after spending nearly $70 million of his own funds. The campaign finance law of 2002 allowed candidates for federal office to raise amounts greater than the normal limit on individual hard money contributions when running against wealthy, largely self-financed opponents.

      The United States has two major national political parties, the Democratic Party and the Republican Party. Although the parties contest presidential elections every four years and have national party organizations, between elections they are often little more than loose alliances of state and local party organizations. Other parties have occasionally challenged the Democrats and Republicans. Since the Republican Party's rise to major party status in the 1850s, however, minor parties have had only limited electoral success, generally restricted either to influencing the platforms of the major parties or to siphoning off enough votes from a major party to deprive that party of victory in a presidential election. In the 1912 election, for example, former Republican president Theodore Roosevelt challenged Republican president William Howard Taft, splitting the votes of Republicans and allowing Democrat Woodrow Wilson to win the presidency with only 42 percent of the vote. Similarly, the 2.7 percent of the vote won by Green Party nominee Ralph Nader in 2000 may have tipped the presidency toward Republican George W. Bush by attracting votes that otherwise would have been cast for Democrat Al Gore.

      There are several reasons for the failure of minor parties and the resilience of America's two-party system. In order to win a national election, a party must appeal to a broad base of voters and a wide spectrum of interests. The two major parties have tended to adopt centrist political programs, and sometimes there are only minor differences between them on major issues, especially those related to foreign affairs. Each party has both conservative and liberal wings, and on some issues (e.g., affirmative action) conservative Democrats have more in common with conservative Republicans than with liberal Democrats. The country's “winner-take-all” plurality system, in contrast to the proportional representation used in many other countries (whereby a party, for example, that won 5 percent of the vote would be entitled to roughly 5 percent of the seats in the legislature), has penalized minor parties by requiring them to win a plurality of the vote in individual districts in order to gain representation. The Democratic and Republican Party candidates are automatically placed on the general election ballot, while minor parties often have to expend considerable resources collecting enough signatures from registered voters to secure a position on the ballot. Finally, the cost of campaigns, particularly presidential campaigns, often discourages minor parties. Since the 1970s, presidential campaigns (primaries and caucuses, national conventions, and general elections) have been publicly funded through a tax checkoff system, whereby taxpayers can designate whether a portion of their federal taxes (in the early 21st century, $3 for an individual and $6 for a married couple) should be allocated to the presidential campaign fund. 
Whereas the Democratic and Republican presidential candidates receive full federal financing (nearly $75 million in 2004) for the general election, a minor party is eligible for only a portion of the federal funds, and only if its candidate won more than 5 percent of the national vote in the prior presidential election (parties whose candidates won at least 25 percent of the national vote in the prior election qualify as major parties and are entitled to full, equal funding). A new party contesting the presidential election is entitled to federal funds after the election if it received at least 5 percent of the national vote.

      Both the Democratic and Republican parties have undergone significant ideological transformations throughout their histories. The modern Democratic Party traditionally supports organized labour, minorities, and progressive reforms. Nationally, it generally espouses a liberal political philosophy, supporting greater governmental intervention in the economy and less governmental regulation of the private lives of citizens. It also generally supports higher taxes (particularly on the wealthy) to finance social welfare benefits that provide assistance to the elderly, the poor, the unemployed, and children. By contrast, the national Republican Party supports limited government regulation of the economy, lower taxes, and more conservative (traditional) social policies.

  At the state level, political parties reflect the diversity of the population. Democrats in the Southern states are generally more conservative than Democrats in New England or the Pacific Coast states; likewise, Republicans in New England or the mid-Atlantic states also generally adopt more liberal positions than Republicans in the South or the mountain states of the West. Large urban centres are more likely to support the Democratic Party, whereas rural areas, small cities, and suburban areas tend more often to vote Republican. Some states have traditionally given majorities to one particular party. For example, because of the legacy of the Civil War and its aftermath, the Democratic Party dominated the 11 Southern states of the former Confederacy until the mid-20th century. Since the 1960s, however, the South and the mountain states of the West have heavily favoured the Republican Party; in other areas, such as New England, the mid-Atlantic, and the Pacific Coast, support for the Democratic Party is strong. Compare, for example, the 1960 and 2000 presidential elections.

      Both the Democratic and Republican parties select their candidates for office through primary elections. Traditionally, individuals worked their way up through the party organization, belonging to a neighbourhood party club, helping to raise funds, getting out the vote, watching the polls, and gradually rising to become a candidate for local, state, and—depending on chance, talent, political expediency, and a host of other factors—higher office. Because American elections are now more heavily candidate-centred rather than party-centred and are less susceptible to control by party bosses, wealthy candidates have often been able to circumvent the traditional party organization to win their party's nomination.

National security
      The September 11 attacks of 2001 precipitated the creation of the Department of Homeland Security, which is charged with protecting the United States against terrorist attacks. The legislation establishing the department—the largest government reorganization in 50 years—consolidated much of the country's security infrastructure, integrating the functions of more than 20 agencies under Homeland Security. The department's substantive responsibilities are divided into four directorates: border and transportation security, emergency preparedness, information analysis and infrastructure protection, and science and technology. The Secret Service, which protects the president, vice president, and other designated individuals, is also under the department's jurisdiction.

      The country's military forces consist of the U.S. Army, Navy (including the Marine Corps), and Air Force, under the umbrella of the Department of Defense, which is headquartered in the Pentagon building in Arlington county, Virginia. (A related force, the Coast Guard, is under the jurisdiction of the Department of Homeland Security.) Conscription was ended in 1973, and since that time the United States has maintained a wholly volunteer military force; since 1980, however, all male citizens (as well as immigrant alien males) between 18 and 25 years of age have been required to register for selective service in case a draft is necessary during a crisis. The armed services also maintain reserve forces that may be called upon in time of war. Each state has a National Guard consisting of reserve groups subject to call at any time by the governor of the state.

      Because a large portion of the military budget, which generally constitutes about 15 to 20 percent of government expenditures, is spent on matériel and research and development, military programs have considerable economic and political impact. The influence of the military also extends to other countries through a variety of multilateral and bilateral treaties and organizations (e.g., the North Atlantic Treaty Organization) for mutual defense and military assistance. The United States has military bases in Africa, Asia, Europe, and Latin America.

      The National Security Act of 1947 created a coordinated command for security and intelligence-gathering activities. The act established the National Security Council (NSC) and the Central Intelligence Agency (CIA), the latter under the authority of the NSC and responsible for foreign intelligence. The National Security Agency, an agency of the Department of Defense, is responsible for cryptographic and communications intelligence. The Department of Homeland Security analyzes information gathered by the CIA and its domestic counterpart, the Federal Bureau of Investigation (FBI), to assess threat levels against the United States.

Domestic law enforcement
      Traditionally, law enforcement in the United States has been concentrated in the hands of local police officials, though the number of federal law-enforcement officers began to increase in the late 20th century. The bulk of the work is performed by police and detectives in the cities and by sheriffs and constables in rural areas. Many state governments also have law-enforcement agencies, and all of them have highway-patrol systems for enforcing traffic law.

      The investigation of crimes that come under federal jurisdiction (e.g., those committed in more than one state) is the responsibility of the FBI (Federal Bureau of Investigation), which also provides assistance with fingerprint identification and technical laboratory services to state and local law-enforcement agencies. In addition, certain federal agencies—such as the Drug Enforcement Administration of the Department of Justice and the Bureau of Alcohol, Tobacco, and Firearms of the Department of the Treasury—are empowered to enforce specific federal laws.

Health and welfare
      Despite the country's enormous wealth, poverty remains a reality for many people in the United States, though programs such as Social Security and Medicare have significantly reduced the poverty rate among senior citizens. In the early 21st century, more than one-tenth of the general population—and about one-sixth of children under 18 years of age—lived in poverty. About half the poor live in homes in which the head of the household is a full- or part-time wage earner. Of the others living in poverty, many are too old to work or are disabled, and a large percentage are mothers of young children. The states provide assistance to the poor in varying amounts, and the United States Department of Agriculture subsidizes the distribution of low-cost food and food stamps to the poor through the state and local governments. Unemployment assistance, provided for by the 1935 Social Security Act, is funded through worker and employer contributions.

      Increasing public concern with poverty and welfare led to new federal legislation beginning in the 1960s, especially the Great Society programs of the presidential administration of Lyndon B. Johnson. Work, training, and rehabilitation programs were established in 1964 for welfare recipients. Between 1964 and 1969 the Office of Economic Opportunity began a number of programs, including the Head Start program for preschool children, the Neighborhood Youth Corps, and the Teacher Corps. Responding to allegations of abuse in the country's welfare system and charges that it encouraged dependency, the federal government introduced reforms in 1996, including limiting long-term benefits, requiring recipients to find work, and devolving much of the decision making to the states.

      Persons who have been employed are eligible for retirement pensions under the Social Security program, and their surviving spouses and dependent children are generally eligible for survivor benefits. Many employers provide additional retirement benefits, usually funded by worker and employer contributions. In addition, millions of Americans maintain individual retirement savings accounts; among the most popular is the 401(k) plan, which is organized by employers and allows workers (sometimes with matching funds from their employer) to contribute part of their earnings on a tax-deferred basis to individual investment accounts.

      With total health-care spending significantly exceeding $1 trillion annually, the provision of medical and health care is one of the largest industries in the United States. There are, nevertheless, many inadequacies in medical services, particularly in rural and poor areas. Some two-thirds of the population is covered by employer-based health-insurance plans, and about one-sixth of the population, including members of the armed forces and their families, receives medical care paid for or subsidized by the federal government, with that for the poor provided by Medicaid. Approximately one-sixth of the population is not covered by any form of health insurance. Though the United States spends a larger proportion of its gross domestic product (GDP) on health care than any other major industrialized country, it is the only such country that does not guarantee health-care coverage for all its citizens. During the late 20th and the early 21st century, rising health-care and prescription drug costs were major concerns for both workers and employers.

      The federal Department of Health and Human Services, through its National Institutes of Health, supports much of the biomedical research in the United States. Grants are also made to researchers in clinics and medical schools.

      About three-fifths of the housing units in the United States are detached single-family homes, and about two-thirds are owner-occupied. Most houses are constructed of wood, and many are covered with shingles or brick veneer. The housing stock is relatively modern; nearly one-third of all units have been constructed since 1980, while about one-fifth of units were built prior to 1940. The average home is relatively large, with more than two-thirds of homes consisting of five or more rooms.

      Housing has long been considered a private rather than a public concern. The growth of urban slums, however, led many municipal governments to enact stricter building codes and sanitary regulations. In 1934 the Federal Housing Administration was established to make loans to institutions that would build low-rent dwellings. However, efforts to reduce slums in large cities by developing low-cost housing in other areas were frequently resisted by local residents who feared a subsequent decline in property values. For many years the restrictive covenant, by which property owners pledged not to sell to certain racial or religious groups, served to bar those groups from many communities. In 1948 the Supreme Court declared such covenants unenforceable, and in 1962 President John F. Kennedy issued an executive order prohibiting discrimination in housing built with federal aid. Since that time many states and cities have adopted fair-housing laws and set up fair-housing commissions. Nevertheless, there are considerable racial disparities in home ownership; about three-fourths of whites but only about half of Hispanics and African Americans own their housing units.

      During the 1950s and '60s large high-rise public housing units were built for low-income families in many large U.S. cities, but these often became centres of crime and unemployment, and minority groups and the poor continued to live in segregated urban ghettos. During the 1990s and the early 21st century, efforts were made to demolish many of the housing projects and to replace them with joint public-private housing communities that would include varying income levels.

      The interplay of local, state, and national programs and policies is particularly evident in education. Historically, education has been considered the province of the state and local governments. Of the approximately 4,000 colleges and universities (including branch campuses), the academies of the armed services are among the few federal institutions. (The federal government also administers, among others, the University of the Virgin Islands.) However, since 1862—when public lands were granted to the states to sell to fund the establishment of colleges of agricultural and mechanical arts, called land-grant colleges—the federal government has been involved in education at all levels. Additionally, the federal government supports school lunch programs, administers American Indian education, makes research grants to universities, underwrites loans to college students, and finances education for veterans. It has been widely debated whether the government should also give assistance to private and parochial (religious) schools or tax deductions to parents choosing to send their children to such schools. Although the Supreme Court has ruled that direct assistance to parochial schools is barred by the Constitution's First Amendment—which states that “Congress shall make no law respecting an establishment of religion”—it has allowed the provision of textbooks and so-called supplementary educational centres on the grounds that their primary purpose is educative rather than religious.

      Public secondary and elementary education is free and provided primarily by local government. Education is compulsory, generally from age 7 through 16, though the age requirements vary somewhat among the states. The literacy rate exceeds 95 percent. In order to address the educational needs of a complex society, governments at all levels have pursued diverse strategies, including preschool programs, classes in the community, summer and night schools, additional facilities for exceptional children, and programs aimed at culturally deprived and disaffected students.

      Although primary responsibility for elementary education rests with local government, it is increasingly affected by state and national policies. The Civil Rights Act of 1964, for example, required federal agencies to discontinue financial aid to school districts that were not racially integrated, and in Swann v. Charlotte-Mecklenburg County (North Carolina) Board of Education (1971) the Supreme Court mandated busing to achieve racially integrated schools, a remedy that often required long commutes for African American children living in largely segregated enclaves. In the late 20th and the early 21st century, busing remained a controversial political issue, and many localities (including Charlotte) ended their busing programs or had them terminated by federal judges. In addition, the No Child Left Behind Act, enacted in 2002, increased the federal role in elementary and secondary education by requiring states to implement standards of accountability for public elementary and secondary schools.

James T. Harris Ed.

Cultural life
      The great art historian Sir Ernst Hans Josef Gombrich (Gombrich, Sir Ernst Hans Josef) once wrote that there is really no such thing as “art”; there are only artists. This is a useful reminder to anyone studying, much less setting out to try to define, anything as big and varied as the culture of the United States. For the culture that endures in any country is made not by vast impersonal forces or by unfolding historical necessities but by uniquely talented men and women, one-of-a-kind people doing one thing at a time—doing what they can, or must. In the United States, particularly, where there is no more a truly “established” art than an established religion—no real academies, no real official art—culture is where one finds it, and many of the most gifted artists have chosen to make their art far from the parades and rallies of worldly life.

 Some of the keenest students of the American arts have even come to dislike the word culture as a catchall for the plastic and literary arts, since it is a term borrowed from anthropology, with its implication that there is some kind of seamless unity to the things that writers and poets and painters have made. The art of some of the greatest American artists and writers, after all, has been made in deliberate seclusion and has taken as its material the interior life of the mind and heart that shapes and precedes shared “national” experience. It is American art before it is the culture of the United States. Even if it is true that these habits of retreat are, in turn, themselves in part traditions, and culturally shaped, it is also true that the least illuminating way to approach the poems of Emily Dickinson (Dickinson, Emily) or the paintings of Winslow Homer (Homer, Winslow), to take only two imposing instances, is as the consequence of large-scale sociological phenomena.

      Still, many, perhaps even most, American culture-makers have not only found themselves, as all Americans do, caught in the common life of their country—they have chosen to make the common catch their common subject. Their involvement with the problems they share with their neighbours, near and far, has given their art a common shape and often a common substance. And if one quarrel has absorbed American artists and thinkers more than any other, it has been that one between the values of a mass, democratic, popular culture and those of a refined elite culture accessible only to the few—the quarrel between “low” and “high.” From the very beginnings of American art, the “top down” model of all European civilization, with a fine art made for an elite class of patrons by a specialized class of artists, was in doubt, in part because many Americans did not want that kind of art, in part because, even if they wanted it, the social institutions—a court or a cathedral—just were not there to produce and welcome it. What came in its place was a commercial culture, a marketplace of the arts, which sometimes degraded art into mere commerce and at other times raised the common voice of the people to the level of high art.

      In the 20th century, this was, in some part, a problem that science left on the doorstep of the arts. Beginning at the turn of the century, the growth of the technology of mass communications—the movies, the phonograph, radio, and eventually television—created a potential audience for stories and music and theatre larger than anyone could previously have dreamed, making it possible for music and drama and pictures to reach more people than ever before. People in San Francisco could look at the latest pictures or hear the latest music from New York months, or even moments, after they were made; a great performance demanded a pilgrimage no longer than the path to a corner movie theatre. High culture had come to the American living room.

 But, though interest in a “democratic” culture that could compete with traditional high culture has grown in recent times, it is hardly a new preoccupation. One has only to read such 19th-century classics as Mark Twain (Twain, Mark)'s The Innocents Abroad (1869) to be reminded of just how long, and just how keenly, Americans have asked themselves if all the stained glass and sacred music of European culture is all it is cracked up to be, and if the tall tales and cigar-store Indians did not have more juice and life in them for a new people in a new land. Twain's whole example, after all, was to show that American speech as it was actually spoken was closer to Homer than imported finery was.

      In this way, the new machines of mass reproduction and diffusion that fill modern times, from the daguerreotype to the World Wide Web, came not simply as a new or threatening force but also as the fulfillment of a standing American dream. Mass culture seemed to promise a democratic culture: a cultural life directed not to an aristocracy but to all men and women. It was not that the new machines produced new ideals but that the new machines made the old dreams seem suddenly a practical possibility.

      The practical appearance of this dream began in a spirit of hope. Much American art at the turn of the 20th century and through the 1920s, from the paintings of Charles Sheeler (Sheeler, Charles) to the poetry of Hart Crane (Crane, Hart), hymned the power of the new technology and the dream of a common culture. By the middle of the century, however, many people recoiled in dismay at what had happened to the American arts, high and low, and thought that these old dreams of a common, unifying culture had been irrevocably crushed. The new technology of mass communications, for the most part, seemed to have achieved not a generous democratization but a bland homogenization of culture. Many people thought that the control of culture had passed into the hands of advertisers, people who used the means of a common culture just to make a buck. It was not only that most of the new music and drama that had been made for movies and radio, and later for television, seemed shallow; it was also that the high or serious culture that had become available through the means of mass reproduction seemed to have been reduced to a string of popularized hits, which concealed the real complexity of art. Culture, made democratic, had become too easy.

 As a consequence, many intellectuals and artists around the end of World War II began to try to construct new kinds of elite “high” culture, art that would be deliberately difficult—and to many people it seemed that this new work was merely difficult. Much of the new art and dance seemed puzzling and deliberately obscure. Difficult art happened, above all, in New York City. During World War II, New York had seen an influx of avant-garde artists escaping Adolf Hitler's Europe, including the painters Max Ernst (Ernst, Max), Piet Mondrian (Mondrian, Piet), and Joan Miró (Miró, Joan), as well as the composer Igor Stravinsky (Stravinsky, Igor). They imported many of the ideals of the European avant-garde, particularly the belief that art should always be difficult and “ahead of its time.” (It is a paradox that the avant-garde movement in Europe had begun, in the late 19th century, in rebellion against what its advocates thought were the oppressive and stifling standards of high, official culture in Europe and that it had often looked to American mass culture for inspiration.) In the United States, however, the practice of avant-garde art became a way for artists and intellectuals to isolate themselves from what they thought was the cheapening of standards.

      And yet this counterculture had, by the 1960s, become in large American cities an official culture of its own. For many intellectuals around 1960, this gloomy situation seemed to be all too permanent. One could choose between an undemanding low culture and an austere but isolated high culture. For much of the century, scholars of culture saw these two worlds—the public world of popular culture and the private world of modern art—as irreconcilable antagonists and thought that American culture was defined by the abyss between them.

      As the century and its obsessions closed, however, more and more scholars came to see in the most enduring inventions of American culture patterns of cyclical renewal between high and low. And as scholars have studied particular cases instead of abstract ideas, it has become apparent that the contrast between high and low has often been overdrawn. Instead of a simple opposition between popular culture and elite culture, it is possible to recognize in the prolix and varied forms of popular culture innovations and inspirations that have enlivened the most original high American culture—and to then see how the inventions of high culture circulate back into the street, in a spiraling, creative flow. In the astonishing achievements of the American jazz musicians, who took the popular songs of Tin Pan Alley and the Broadway musical and inflected them with their own improvisational genius; in the works of great choreographers like Paul Taylor and George Balanchine, who found in tap dances and marches and ballroom bebop new kinds of movement that they then incorporated into the language of high dance; in the “dream boxes” of the American avant-garde artist Joseph Cornell (Cornell, Joseph), who took for his material the mundane goods of Woolworth's and the department store and used them as private symbols in surreal dioramas: in the work of all of these artists, and so many more, we see the same kind of inspiring dialogue between the austere discipline of avant-garde art and the enlivening touch of the vernacular.

      This argument has been so widely resolved, in fact, that, in the decades bracketing the turn of the 21st century, the old central and shaping American debate between high and low has been in part replaced by a new and, for the moment, still more clamorous argument. It might be said that if the old debate was between high and low, this one is between the “centre” and the “margins.” The argument between high and low was what gave the modern era its special savour. A new generation of critics and artists, defining themselves as “postmodern,” have argued passionately that the real central issue of culture is the “construction” of cultural values, whether high or low, and that these values reflect less enduring truth and beauty, or even authentic popular taste, than the prejudices of professors. Since culture has mostly been made by white males praising dead white males to other white males in classrooms, they argue, the resulting view of American culture has been made unduly pale, masculine, and lifeless. It is not only the art of African Americans and other minorities that has been unfairly excluded from the canon of what is read, seen, and taught, these scholars argue, often with more passion than evidence; it is also the work of anonymous artists, particularly women, that has been “marginalized” or treated as trivial. This argument can conclude with a rational, undeniable demand that more attention be paid to obscure and neglected writers and artists, or it can take the strong and often irrational form that all aesthetic values are merely prejudices enforced by power. If the old debate between high and low asked if real values could rise from humble beginnings, the new debate about American culture asks if true value, as opposed to mere power, exists at all.

      Because the most articulate artists are, by definition, writers, most of the arguments about what culture is and ought to do have been about what literature is and ought to do—and this can skew our perception of American culture a little, because the most memorable American art has not always appeared in books and novels and stories and plays. In part, perhaps, this is because writing was the first art form to undergo a revolution of mass technology; books were being printed in thousands of copies, while one still had to make a pilgrimage to hear a symphony or see a painting. The basic dispute between mass experience and individual experience has been therefore perhaps less keenly felt as an everyday fact in writing in the 20th and 21st centuries than it has been in other art forms. Still, writers have seen and recorded this quarrel as a feature of the world around them, and the evolution of American writing in the past 50 years has shown some of the same basic patterns that can be found in painting and dance and the theatre.

      In the United States after World War II, many writers, in opposition to what they perceived as the bland flattening out of cultural life, made their subject all the things that set Americans apart from one another. Although for many Americans, ethnic and even religious differences had become increasingly less important as the century moved on—holiday rather than everyday material—many writers after World War II seized on these differences to achieve a detached point of view on American life. Beginning in the 1940s and '50s, three groups in particular seemed to be “outsider-insiders” who could bring a special vision to fiction: Southerners, Jews, and African Americans.

      Each group had a sense of uncertainty, mixed emotions, and stifled aspirations that lent a questioning counterpoint to the general chorus of affirmation in American life. The Southerners—William Faulkner (Faulkner, William), Eudora Welty (Welty, Eudora), and Flannery O'Connor (O'Connor, Flannery) most particularly—thought that a noble tradition of defeat and failure had been part of the fabric of Southern life since the Civil War. At a time when “official” American culture often insisted that the American story was one of endless triumphs and optimism, they told stories of tragic fate. Jewish writers—most prominently Chicago novelist Saul Bellow (Bellow, Saul), who won the Nobel Prize for Literature in 1976, Bernard Malamud (Malamud, Bernard), and Philip Roth (Roth, Philip)—found in the “golden exile” of Jews in the United States a juxtaposition of surface affluence with deeper unease and perplexity that seemed to many of their fellow Americans to offer a common predicament in a heightened form.

      For African Americans, of course, the promise of American life had in many respects never been fulfilled. “What happens to a dream deferred,” the poet Langston Hughes (Hughes, Langston) asked, and many African American writers attempted to answer that question, variously, through stories that mingled pride, perplexity, and rage. African American literature achieved one of the few unquestioned masterpieces of late 20th-century American fiction writing in Ralph Ellison (Ellison, Ralph)'s Invisible Man (1952). More recently, the rise of feminism as a political movement has given many women a sense that their experience too is richly and importantly outside the mainstream; since at least the 1960s, there has been an explosion of women's fiction, including the much-admired work of Toni Morrison (Morrison, Toni), the first African American female to win the Nobel Prize for Literature (1993); Anne Tyler (Tyler, Anne); and Ann Beattie (Beattie, Ann).

      Perhaps precisely because so many novelists sought to make their fiction from experiences that were deliberately imagined as marginal, set aside from the general condition of American life, many other writers had the sense that fiction, and particularly the novel, might not any longer be the best way to try to record American life. For many writers the novel seemed to have become above all a form of private, interior expression and could no longer keep up with the extravagant oddities of the United States. Many gifted writers took up journalism with some of the passion for perfection of style that had once been reserved for fiction. The exemplars of this form of poetic journalism included the masters of The New Yorker (New Yorker, The) magazine, most notably A.J. Liebling, whose books included The Earl of Louisiana (1961), a study of an election in Louisiana, as well as Joseph Mitchell, who in his books The Bottom of the Harbour (1944) and Joe Gould's Secret (1942) offered dark and perplexing accounts of the life of the American metropolis. The dream of combining real facts and lyrical fire also achieved a masterpiece in the poet James Agee (Agee, James)'s Let Us Now Praise Famous Men (1941; with photographs by Walker Evans (Evans, Walker)), an account of sharecropper life in the South that is a landmark in the struggle for fact writing that would have the beauty and permanence of poetry.

      As the century continued, this genre of imaginative nonfiction (sometimes called the documentary novel or the nonfiction novel) continued to evolve and took on many different forms. In the writing of Calvin Trillin, John McPhee (McPhee, John), Neil Sheehan, and Truman Capote (Capote, Truman), all among Liebling's and Mitchell's successors at The New Yorker, this new form continued to seek a tone of subdued and even amused understatement. Tom Wolfe (Wolfe, Tom), whose influential books included The Right Stuff (1979), an account of the early days of the American space program, and Norman Mailer (Mailer, Norman), whose books included Miami and the Siege of Chicago (1968), a ruminative piece about the Republican and Democratic national conventions in 1968, deliberately took on huge public subjects and subjected them to the insights (and, many people thought, the idiosyncratic whims) of a personal sensibility.

      As the nonfiction novel often pursued extremes of grandiosity and hyperbole, the American short story assumed a previously unexpected importance in the life of American writing, becoming the voice of private vision and private lives. With its natural insistence on the unique moment and the infrangible glimpse of something private and fragile, the short story gained a new prominence. Its rise is bracketed by two remarkable books: J.D. Salinger (Salinger, J D)'s Nine Stories (1953) and Raymond Carver (Carver, Raymond)'s collection What We Talk About When We Talk About Love (1981). Salinger inspired a generation by imagining that the serious search for a spiritual life could be reconciled with an art of gaiety and charm; Carver confirmed in the next generation their sense of a loss of spirituality in an art of taciturn reserve and cloaked emotions.

      Since Carver's death in 1988, the great novelist and man of letters John Updike (Updike, John) has remained perhaps the last undisputed master of literature in the high American sense that emerged with Ernest Hemingway (Hemingway, Ernest) and Faulkner. Yet in no area of the American arts, perhaps, have the claims of the marginal to take their place at the centre of the table been so fruitful, subtle, or varied as in literature. Perhaps because writing is inescapably personal, the trap of turning art into mere ideology has been most deftly avoided in its realm. This can be seen in the dramatically expanded horizons of the feminist and minority writers whose work first appeared in the 1970s and '80s, including the Chinese American Amy Tan (Tan, Amy). A new freedom to write about human erotic experience previously considered strange or even deviant shaped much new writing, from the comic obsessive novels of Nicholson Baker through the work of those short-story writers and novelists, including Edmund White (White, Edmund) and David Leavitt, who have made art out of previously repressed and unnarrated areas of homoerotic experience. Literature is above all the narrative medium of the arts, the one that still best relates What Happened to Me, and American literature, at least, has only been enriched by new “mes” and new narratives. (See also American literature.)

The visual arts and postmodernism
 Perhaps the greatest, and certainly the loudest, event in American cultural life since World War II was what the critic Irving Sandler has called “The Triumph of American Painting”—the emergence of a new form of art that allowed American painting to dominate the world. This dominance lasted for at least 40 years, from the birth of the so-called New York school, or Abstract Expressionism, around 1945 until at least the mid-1980s, and it took in many different kinds of art and artists. In its first flowering, in the epic-scaled abstractions of Jackson Pollock (Pollock, Jackson), Mark Rothko (Rothko, Mark), Willem de Kooning (de Kooning, Willem), and the other members of the New York school, this new painting seemed abstract, rarefied, and constructed from a series of negations, from saying “no!” to everything except the purest elements of painting. Abstract Expressionism seemed to stand at the farthest possible remove from the common life of American culture and particularly from the life of American popular culture. Even this painting, however, later came under a new and perhaps less-austere scrutiny; and the art historian Robert Rosenblum has persuasively argued that many of the elements of Abstract Expressionism, for all their apparent hermetic distance from common experience, are inspired by the scale and light of the American landscape and American 19th-century landscape painting—by elements that run deep and centrally in Americans' sense of themselves and their country.

 It is certainly true that the next generation of painters, who throughout the 1950s continued the unparalleled dominance of American influence in the visual arts, made their art aggressively and unmistakably of the dialogue between the studio and the street. Jasper Johns (Johns, Jasper), for instance, took as his subject the most common and even banal of American symbols—maps of the 48 continental states, the flag itself—and depicted the quickly read and immediately identifiable common icons with a slow, meditative, painterly scrutiny. His contemporary and occasional partner Robert Rauschenberg (Rauschenberg, Robert) took up the same dialogue in a different form; his art consisted of dreamlike collages of images silk-screened from the mass media, combined with personal artifacts and personal symbols, all brought together in a mélange of jokes and deliberately perverse associations. In a remarkably similar spirit, the eccentric surrealist Joseph Cornell (Cornell, Joseph) made little shoe-box-like dioramas in which images taken from popular culture were made into a dreamlike language of nostalgia and poetic reverie. Although Cornell, like William Blake (Blake, William), whom he in many ways resembled, worked largely in isolation, his sense of the poetry that lurks unseen in even the most absurd everyday objects had a profound effect on other artists.

      By the early 1960s, with the explosion of the new art form called Pop art, the engagement of painting and drawing with popular culture seemed so explicit as to be almost overwhelming and, at times, risked losing any sense of private life and personal inflection at all—it risked becoming all street and no studio. Artists such as Andy Warhol (Warhol, Andy), Roy Lichtenstein (Lichtenstein, Roy), and Claes Oldenburg (Oldenburg, Claes) took the styles and objects of popular culture—everything from comic books to lipstick tubes—and treated them with the absorption and grave seriousness previously reserved for religious icons. But this art too had its secrets, as well as its strong individual voices and visions. In his series of drawings called Proposals for Monumental Buildings, 1965–69, Oldenburg drew ordinary things—fire hydrants, ice-cream bars, bananas—as though they were as big as skyscrapers. His pictures combined a virtuoso's gift for drawing with a vision, at once celebratory and satirical, of the P.T. Barnum (Barnum, P.T.) spirit of American life. Warhol silk-screened images of popular movie stars and Campbell's soup cans; in replicating them, he suggested that their reiteration by mass production had emptied them of their humanity but also given them a kind of hieratic immortality. Lichtenstein used the techniques of comic-book illustration to paraphrase some of the monuments of modern painting, making a coolly witty art in which Henri Matisse (Matisse, Henri) danced with Captain Marvel.

      But these artists who self-consciously chose to make their art out of popular materials and images were not the only ones who had something to say about the traffic between mass and elite culture. The so-called minimalists, who made abstract art out of simple and usually hard-edged geometric forms, from one point of view carried on the tradition of austere abstraction. But it was also the minimalists, as art historians have pointed out, who carried over the vocabulary of the new International Style of unornamented architecture into the world of the fine arts; minimalism imagined the dialogue between street and studio in terms of hard edges and simple forms rather than in terms of imagery, but it took part in the same dialogue. In some cases, the play between high and low has been carried out as a dialogue between Pop and minimalist styles themselves. Frank Stella (Stella, Frank), thought by many to be the preeminent American painter of the late 20th century, began as a minimalist, making extremely simple paintings of black chevrons from which everything was banished except the barest minimum of painterly cues. Yet in his subsequent work he became almost extravagantly “maximalist” and, as he began to make bas-reliefs, added to the stark elegance of his early paintings wild, Pop-art elements of outthrusting spirals and Day-Glo colors—even sequins and glitter—that deliberately suggested the invigorating vulgarity of the Las Vegas Strip. Stella's flamboyant reliefs combine the spare elegance of abstraction with the greedy vitality of the American street.

      In the 1980s and '90s, it was in the visual arts, however, that the debates over postmodern marginality and the construction of a fixed canon became, perhaps, most fierce—yet, oddly, were at the same time least eloquent, or least fully realized in emotionally potent works of art. Pictures and objects do not “argue” particularly well, so the tone of much contemporary American art became debased, with the cryptic languages of high abstraction and conceptual art put in the service of narrow ideological arguments. It became standard in the American avant-garde art of the 1980s and '90s for an installation to encode an inarguable social message—for instance, that there should be fewer homeless people in the streets—in a highly oblique, Surrealist manner, with the duty of the viewer then reduced to decoding the manner back into the message. The long journey of American art in the 20th century away from socially “responsible” art that lacked intense artistic originality seemed to have been short-circuited, without necessarily producing much of a gain in clarity or accessibility.

      No subject or idea has been as powerful, or as controversial, in American arts and letters at the end of the 20th century and into the new millennium as the idea of the “postmodern,” and in no sphere has the argument been as lively as in that of the plastic arts. The idea of the postmodern has been powerful in the United States exactly because the idea of the modern was so powerful; where Europe has struggled with the idea of modernity, in the United States it has been largely triumphant, thus leaving the question of “what comes next” all the more problematic. Since the 1960s, the ascendance of postmodern culture has been argued—now it is even sometimes said that a “post-postmodern” epoch has begun, but what exactly that means is remarkably vague.

      In some media, what is meant by postmodern is clear and easy enough to point to: it is the rejection of the utopian aspects of modernism, and particularly of the attempt to express that utopianism in ideal or absolute form—the kind experienced in Bauhaus architecture or in minimalist painting. Postmodernism is an attempt to muddy lines drawn falsely clear. In American architecture, for instance, the meaning of postmodern is reasonably plain. Beginning with the work of Robert Venturi (Venturi, Robert), Denise Scott Brown, and Peter Eisenman (Eisenman, Peter), postmodern architects deliberately rejected the pure forms and “truth to materials” of the modern architect and put in their place irony, ornament, historical reference, and deliberate paradox. Some American postmodern architecture has been ornamental and cheerfully cosmetic, as in the later work of Philip Johnson (Johnson, Philip C.) and the mid-1980s work of Michael Graves (Graves, Michael). Some has been demanding and deliberately challenging even to conventional ideas of spatial lucidity, as in Eisenman's Wexner Center in Columbus, Ohio. But one can see the difference just by looking.

      In painting and sculpture (Western sculpture), on the other hand, it is often harder to know where exactly to draw the line—and why the line is drawn. In the paintings of the American artist David Salle (Salle, David) or the photographs of Cindy Sherman (Sherman, Cindy), for instance, one sees apparently postmodern elements of pastiche, borrowed imagery, and deliberately “impure” collage. But all of these devices are also components of modernism and part of the heritage of Surrealism, though the formal devices of a Rauschenberg or Johns were used in a different emotional key. The true common element among the postmodern perhaps lies in a note of extreme pessimism and melancholy about the possibility of escaping from borrowed imagery into “authentic” experience. It is this emotional tone that gives postmodernism its peculiar register and, one might almost say, its authenticity.

      In literature, the postmodern is, once again, hard to separate from the modern, since many of its keynotes—for instance, a love of complicated artifice and obviously literary devices, along with the mixing of realistic and frankly fantastic or magical devices—are at least as old as James Joyce (Joyce, James)'s founding modernist fictions. But certainly the expansion of possible sources, the liberation from the narrowly white male view of the world, and a broadening of testimony given and testimony taken are part of what postmodern literature has in common with other kinds of postmodern culture. It has been part of the postmodern transformation in American fiction as well to place authors previously marginalized as genre writers at the centre of attention. The African American crime writer Chester Himes (Himes, Chester), for example, has been given serious critical attention, while the strange visionary science-fiction writer Philip K. Dick (Dick, Philip K.) was ushered, in 2007, from his long exile in paperback into the Library of America.

      What is at stake in the debates over modern and postmodern is finally the American idea of the individual. Where modernism in the United States placed its emphasis on the autonomous individual, the heroic artist, postmodernism places its emphasis on the “de-centred” subject, the artist as a prisoner, rueful or miserable, of culture. Art is seen as a social event rather than as communication between persons. If in modernism an individual artist made something that in turn created a community of observers, in the postmodern epoch the opposite is true: the social circumstance, the chain of connections that make seeming opposites unite, key off the artist and make him what he is. In the work of the artist Jeff Koons (Koons, Jeff), for instance—who makes nothing but has things, from kitsch figurines to giant puppies composed of flowers, made for him—this postmodern rejection of the handmade or authentic is given a weirdly comic tone, at once eccentric and humorous. It is the impurities of culture, rather than the purity of the artist's vision, that haunt contemporary art.

      Nonetheless, if the push and charge that had been so unlooked-for in American art since the 1940s seemed diminished, the turn of the 21st century was a rich time for second and even third acts. Richard Serra (Serra, Richard), John Baldessari (Baldessari, John), Elizabeth Murray (Murray, Elizabeth), and Chuck Close (Close, Chuck) were all American artists who continued to produce arresting, original work—most often balanced on that fine knife edge between the blankly literal and the disturbingly metaphoric—without worrying overmuch about theoretical fashions or fashionable theory.

      As recently as the 1980s, most surveys of American culture might not have thought photography of much importance. But at the turn of the century, photography began to lay a new claim to attention as a serious art form. For much of the first half of the 20th century, the most remarkable American photographers had, on the whole, tried to make photography into a “fine art” by divorcing it from its ubiquitous presence as a recorder of moments and by splicing it onto older, painterly traditions. A clutch of gifted photographers, however, have, since the end of World War II, been able to transcend the distinction between media image and aesthetic object—between art and photojournalism—to make from a single, pregnant moment a complete and enduring image. Walker Evans (Evans, Walker), Margaret Bourke-White (Bourke-White, Margaret), and Robert Frank (Frank, Robert) (the last, like so many artists of the postwar period, an émigré), for instance, rather than trying to make of photography something as calculated and considered as the traditional fine arts, found in the instantaneous vision of the camera something at once personal and permanent. Frank's book The Americans (1956), the record of a tour of the United States that combined the sense of accident of a family slide show with a sense of the ominous worthy of the Italian painter Giorgio de Chirico (de Chirico, Giorgio), was the masterpiece of this vision; and no work of the postwar era was more influential in all fields of visual expression. Robert Mapplethorpe (Mapplethorpe, Robert), Diane Arbus (Arbus, Diane), and, above all, Richard Avedon (Avedon, Richard) and Irving Penn (Penn, Irving), who together dominated both fashion and portrait photography for almost half a century and straddled the lines between museum and magazine, high portraiture and low commercials, all came to seem, in their oscillations between glamour and gloom, exemplary of the predicaments facing the American artist.

The theatre (theatre, Western)
 Perhaps more than any other art form, the American theatre suffered from the invention of the new technologies of mass reproduction. Where painting and writing could choose their distance from (or intimacy with) the new mass culture, many of the age-old materials of the theatre had by the 1980s been subsumed by movies and television. What the theatre could do that could not be done elsewhere was not always clear. As a consequence, the Broadway theatre—which in the 1920s had still seemed a vital area of American culture and, in the high period of the playwright Eugene O'Neill (O'Neill, Eugene), a place of cultural renaissance—had by the end of the 1980s become very nearly defunct. A brief and largely false spring had taken place in the period just after World War II. Tennessee Williams (Williams, Tennessee) and Arthur Miller (Miller, Arthur), in particular, both wrote movingly and even courageously about the lives of the “left-out” Americans, demanding attention for the outcasts of a relentlessly commercial society. Viewed from the 21st century, however, both seem more traditional and less profoundly innovative than their contemporaries in the other arts, more closely tied to the conventions of European naturalist theatre and less inclined or able to renew and rejuvenate the language of their form.

 Also much influenced by European models, though in his case by the absurdist theatre of Eugène Ionesco (Ionesco, Eugène) and Samuel Beckett (Beckett, Samuel), was Edward Albee (Albee, Edward), the most prominent American playwright of the 1960s. As Broadway's dominance of the American stage waned in the 1970s, regional theatre took on new importance, and cities such as Chicago, San Francisco, and Louisville, Ky., provided significant proving grounds for a new generation of playwrights. On those smaller but still potent stages, theatre continues to speak powerfully. An African American renaissance in the theatre has taken place, with its most notable figure being August Wilson (Wilson, August), whose 1985 play Fences won the Pulitzer Prize. And, for the renewal and preservation of the American language, there is still nothing to equal the stage: David Mamet (Mamet, David), in his plays, among them Glengarry Glen Ross (1983) and Speed-the-Plow (1987), both caught and created an American vernacular—verbose, repetitive, obscene, and eloquent—that combined the local colour of Damon Runyon (Runyon, Damon) and the bleak truthfulness of Harold Pinter (Pinter, Harold). The one completely original American contribution to the stage, the musical theatre, blossomed in the 1940s and '50s in the works of Frank Loesser (Loesser, Frank) (especially Guys and Dolls, which the critic Kenneth Tynan regarded as one of the greatest of American plays) but became heavy-handed and exists at the beginning of the 21st century largely as a revival art and in the brave “holdout” work of composer and lyricist Stephen Sondheim (Sondheim, Stephen) (Company, Sweeney Todd, and Into the Woods).

Motion pictures (motion picture)
 In some respects the motion picture is the American art form par excellence, and no area of art has undergone a more dramatic revision in critical appraisal in the recent past. Throughout most of the 1940s and '50s, serious critics, with a few honourable exceptions (notably, James Agee and Manny Farber), took it for granted that, excepting the work of D.W. Griffith (Griffith, D W) and Orson Welles (Welles, Orson), the commercial Hollywood movie was, judged as art, hopelessly compromised by commerce; even critics who took the cinema seriously as a potential artistic medium shared the assumption. In the 1950s in France, however, a generation of critics associated with the magazine Cahiers du cinéma (many of whom would later become well-known filmmakers themselves, including François Truffaut (Truffaut, François) and Jean-Luc Godard (Godard, Jean-Luc)) argued that the American commercial film, precisely because its need to please a mass audience had helped it break out of the limiting gentility of the European cinema, had a vitality and, even more surprisingly, a set of master-makers (auteurs) without equal in the world. New studies and appreciations of such Hollywood filmmakers as John Ford (Ford, John), Howard Hawks (Hawks, Howard), and William Wyler (Wyler, William) resulted, and eventually this reevaluation worked its way back into the United States, changing and amending preconceptions that had hardened into prejudices: another demonstration that one country's low art can become another country's high art.

 The new appreciation of the individual vision of the Hollywood film was to inspire a whole generation of young American filmmakers, including Francis Ford Coppola (Coppola, Francis Ford), Martin Scorsese (Scorsese, Martin), and George Lucas (Lucas, George), to attempt to use the commercial film as at once a form of personal expression and a means of empire building, with predictably mixed results. By the end of the century, another new wave of filmmakers (notably Spike Lee (Lee, Spike) and Steven Soderbergh), like the previous generation mostly trained in film schools, had graduated from independent filmmaking to the mainstream, and the American tradition of film comedy stretching from Buster Keaton (Keaton, Buster) and Charlie Chaplin (Chaplin, Charlie) to Billy Wilder (Wilder, Billy), Preston Sturges (Sturges, Preston), and Woody Allen (Allen, Woody) had come to include the quirky sensibilities of Joel and Ethan Coen and Wes Anderson. In mixing a kind of eccentric, off-focus comedy with a private, screw-loose vision, they came close to defining another kind of postmodernism, one that was as antiheroic as the more academic sort but cheerfully self-possessed in tone. As the gap between big studio-made entertainment—produced for vast international audiences—and the small “art” or independent film widened, the best of the independents came to have the tone and idiosyncratic charm of good small novels: Nicole Holofcener's Lovely & Amazing (2001) or Kenneth Lonergan's You Can Count on Me (2000) reached audiences that felt bereft by the steady run of Batmans and Lethal Weapons. But with that achievement came a sense too that the audience for such serious work as Francis Ford Coppola's Godfather films and Chinatown (1974), which had been intact as late as the 1970s, had fragmented beyond recomposition.

 If the Martian visitor beloved of anthropological storytelling were to visit the United States at the beginning of the 21st century, all of the art forms enumerated here—painting and sculpture and literature, perhaps even motion pictures and popular music—would seem like tiny minority activities compared with the great gaping eye of American life: “the box,” television. Since the mid-1950s, television has been more than just the common language of American culture; it has been a common atmosphere. For many Americans television is not the chief manner of interpreting reality but a substitute for it, a wraparound simulated experience that has come to be more real than reality itself. Indeed, beginning in the 1990s, American television was inundated with a spate of “reality” programs, a wildly popular format that employed documentary techniques to examine “ordinary” people placed in unlikely situations, from the game-show structure of Survivor (marooned contestants struggling for supremacy) to courtroom and police shows such as The People's Court and Cops, to American Idol, the often caustically judged talent show that made instant stars of some of its contestants. Certainly, no medium—not even motion pictures at the height of their popular appeal in the 1930s—has created so much hostility, fear, and disdain in some “right-thinking” people. Television has been dismissed as chewing gum for the eyes and was famously characterized as “a vast wasteland” in 1961 by Newton Minow, then chairman of the Federal Communications Commission. When someone in the movies is meant to be shown living a life of meaningless alienation, he is usually shown watching television.

      Yet television itself is, of course, no one thing, nor, despite the many efforts since the time of the Canadian philosopher Marshall McLuhan (McLuhan, Marshall) to define its essence, has it been shown to have a single nature that deforms the things it shows. Television can be everything from Monday Night Football to the Persian Gulf War's Operation Desert Storm to Who Wants to Be a Millionaire? The curious thing, perhaps, is that, unlike motion pictures, where unquestioned masters and undoubted masterpieces and a language of criticism had already emerged, television still waits for a way to be appreciated. Television is the dominant contemporary cultural reality, but it is still in many ways the poor relation. (It is not unusual for magazines and newspapers that keep on hand three art critics to have but one part-time television reviewer—in part because the art critic is in large part a cultural broker, a “cultural explainer,” and few think that television needs to be explained.)

      When television first appeared in the late 1940s, it threatened to be a “ghastly gelatinous nirvana,” in James Agee (Agee, James)'s memorable phrase. Yet the 1950s, the first full decade of television's impact on American life, was called then, and is still sometimes called, a “Golden Age.” Serious drama, inspired comedy, and high culture all found a place in prime-time programming. From Sid Caesar to Lucille Ball (Ball, Lucille), the performers of this period retain a special place in American affections. Yet in some ways these good things were derivative of other, older media, adaptations of the manner and styles of theatre and radio. It was perhaps only in the 1960s that television came into its own, not just as a way of showing things in a new way but as a way of seeing things in a new way. Events as widely varied in tone and feeling as the broadcast of the Olympic Games and the assassination and burial of Pres. John F. Kennedy (Kennedy, John F.)—extended events that took place in real time—brought the country together around a set of shared, collective images and narratives that often had neither an “author” nor an intended point or moral. The Vietnam War became known as the “living room war” because images (though still made on film) were broadcast every night into American homes; later conflicts, such as the Persian Gulf War and the Iraq War, were actually brought live and on direct video feed from the site of the battles into American homes. Lesser but still compelling live events, from the marriage of Charles, prince of Wales, and Lady Diana Spencer (Diana, princess of Wales) to the pursuit of then murder suspect O.J. Simpson (Simpson, O.J.) in his white Bronco by the Los Angeles police in 1994, came to have the urgency and shared common currency that had once belonged exclusively to high art. 
From ordinary television viewers to professors of the new field of cultural studies, many Americans sought in live televised events the kind of meaning and significance that they had once thought it possible to find only in highly wrought and artful myth. Beginning in the late 1960s with CBS's 60 Minutes, this epic quality also informed the TV newsmagazine; presented with an in-depth approach that emphasized narrative drama, the personalities of the presenters as well as of their subjects, and the muckraking exposure of malfeasance, it became one of television's most popular and enduring formats.

      Even in the countless fictional programs that filled American evening television, a sense of spontaneity and immediacy seemed to be sought and found. Though television produced many stars and celebrities, they lacked the aura of distance and glamour that had once attached to the great performers of the Hollywood era. Yet if this implied a certain diminishment in splendour, it also meant that, particularly as American film became more and more dominated by the demands of sheer spectacle, a space opened on television for a more modest and convincing kind of realism. Television series, comedy and drama alike, now play the role that movies played in the earlier part of the century or that novels played in the 19th century: they are the modest mirror of their time, where Americans see, in forms stylized or natural, the best image of their own manners. The most acclaimed of these series—whether produced for broadcast television and its diminishing market share (thirtysomething, NYPD Blue, and Seinfeld) or the creations of cable providers (The Sopranos and Six Feet Under)—seem as likely to endure as popular storytelling as any literature made in the late 20th and early 21st centuries.

Popular music
      Every epoch since the Renaissance has had an art form that seems to become a kind of universal language, one dominant artistic form that sweeps the world and becomes the common property of an entire civilization, from one country to another. Italian painting in the 15th century, German music in the 18th century, or French painting in the 19th and early 20th centuries—all of these forms seem to transcend their local sources and become the one essential soundscape or image of their time. Johann Sebastian Bach (Bach, Johann Sebastian) and George Frideric Handel (Handel, George Frideric), like Claude Monet (Monet, Claude) and Édouard Manet (Manet, Édouard), are local and more.

      At the beginning of the 21st century, and seen from a worldwide perspective, it is the American popular music that had its origins among African Americans at the end of the 19th century that, in all its many forms—ragtime, jazz, swing, jazz-influenced popular song, blues, rock and roll and its art legacy as rock, and later hip-hop—has become America's greatest contribution to the world's culture, the one indispensable and unavoidable art form of the 20th century.

      The recognition of this fact was a long time coming and has had to battle prejudice and misunderstanding that continue today. Indeed, jazz-inspired American popular music has not always been well served by its own defenders, who have tended to romanticize rather than explain and describe. In broad outline, the history of American popular music is often told as the adulteration of a “pure” form of folk music, largely inspired by the work and spiritual and protest music of African Americans. But it involves less the adulteration of those pure forms by commercial motives and commercial sounds than the constant, fruitful hybridization of folk forms by other sounds, other musics—art and avant-garde and purely commercial, Bach and Broadway meeting at Birdland. Most of the watershed years turn out to be permeable; as the man who is by now recognized by many as the greatest of all American musicians, Louis Armstrong (Armstrong, Louis), once said, “There ain't but two kinds of music in this world. Good music and bad music, and good music you tap your toe to.”

 Armstrong's own career is a good model of the nature and evolution of American popular music at its best. Beginning in impossibly hard circumstances, he took up the trumpet at a time when it was above all a military instrument, filled with the marching sounds of another American original, John Philip Sousa (Sousa, John Philip). On the riverboats and in the brothels of New Orleans, as the protégé of King Oliver (Oliver, King), Armstrong learned to play a new kind of syncopated ensemble music, decorated with solos. By the time he traveled to Chicago in the mid-1920s, his jazz had become a full-fledged art music, “full of a melancholy and majesty that were new to American music,” as Whitney Balliett has written. The duets he played with the renowned pianist Earl Hines (Hines, Earl), such as the 1928 version of "Weather Bird," have never been equalled in surprise and authority. This art music in turn became a kind of popular music, commercialized by the swing bands that dominated American popular music in the 1930s, one of which Armstrong fronted himself, becoming a popular vocalist who in turn influenced such white pop vocalists as Bing Crosby (Crosby, Bing). The decline of the big bands led Armstrong back to a revival of his own earlier style, and, at the end, when he was no longer able to play the trumpet, he became, ironically, a still more celebrated straight “pop” performer, making hits out of Broadway tunes, among them the German-born Kurt Weill (Weill, Kurt)'s "Mack the Knife" and Jerry Herman's "Hello, Dolly!" Throughout his career, Armstrong engaged in a constant cycle of creative crossbreeding—Sousa and the blues and Broadway each adding its own element to the mix.

 By the 1940s, the craze for jazz as a popular music had begun to recede, and it began instead to become an art music. Duke Ellington (Ellington, Duke), considered by many to be the greatest American composer, assembled a matchless band to play his ambitious and inimitable compositions, and by the 1950s jazz had become dominated by such formidable and uncompromising creators as Miles Davis (Davis, Miles) and John Lewis of the Modern Jazz Quartet.

      Beginning in the 1940s, it was the singers whom jazz had helped spawn—those who used microphones in place of pure lung power and who adapted the Viennese operetta-inspired songs of the great Broadway composers (who had, in turn, already been changed by jazz)—who became the bearers of the next dominant American style. Simply to list their names is to evoke a social history of the United States since World War II: Frank Sinatra (Sinatra, Frank), Nat King Cole (Cole, Nat King), Mel Tormé (Tormé, Mel), Ella Fitzgerald (Fitzgerald, Ella), Billie Holiday (Holiday, Billie), Doris Day (Day, Doris), Sarah Vaughan (Vaughan, Sarah), Peggy Lee (Lee, Peggy), Joe Williams (Williams, Joe), Judy Garland (Garland, Judy), Patsy Cline (Cline, Patsy), Willie Nelson (Nelson, Willie), Tony Bennett (Bennett, Tony), and many others. More than any other single form or sound, it was their voices that created a national soundtrack of longing, fulfillment, and forever-renewed hope that sounded like America to Americans, and then sounded like America to the world.

 July 1954 is generally credited as the next watershed in the evolution of American popular music, when a recent high-school graduate and truck driver named Elvis Presley (Presley, Elvis) went into the Memphis Recording Service and recorded a series of songs for a small label called Sun. An easy, swinging mixture of country music, rhythm and blues, and pop ballad singing, these were, if not the first, then the seminal recordings of a new music that, it is hardly an exaggeration to say, would make all other kinds of music in the world a minority taste: rock and roll (rock). What is impressive in retrospect is that, like Armstrong's leap a quarter century before, this was less the sudden shout of a new generation coming into being than, once again, the self-consciously eclectic manufacture of a hybrid thing. According to Presley's biographer Peter Guralnick, Presley and Sam Phillips, Sun's owner, knew exactly what they were doing when they blended country style, white pop singing, and African American rhythm and blues. What was new was the mixture, not the act of mixing.

      The subsequent evolution of this music into the single musical language of the last quarter of the 20th century hardly needs to be told—like jazz, it showed an even more accelerated evolution from folk to pop to art music, though, unlike jazz, this was an evolution that depended on new machines and technologies for the DNA of its growth. Where even the best-selling recording artists of the earlier generations had learned their craft in live performance, Presley was a recording artist before he was a performing one, and the British musicians who would feed on his innovations knew him first and best through records (and, in the case of the Beatles (Beatles, the) particularly, made their own innovations in the privacy of the recording studio). Yet once again, the lines between the new music and the old—between rock and roll and the pop and jazz that came before it—can be, and often are, much too strongly drawn. Instead, the evolution of American popular music has been an ongoing dialogue between past and present—between the African-derived banjo and bluegrass, Beat poets (Beat movement) and bebop—that brought together the most heartfelt interests of poor black and white Americans in ways that Reconstruction could not, its common cause replaced for working-class whites by supremacist diversions. It became, to use Greil Marcus's phrase, an Invisible Republic, not only where Presley chose to sing Arthur (“Big Boy”) Crudup (Crudup, Arthur)'s song "That's All Right Mama" but where Chuck Berry (Berry, Chuck), a brown-eyed handsome man (his own segregation-era euphemism), revved up Louis Jordan (Jordan, Louis)'s jump blues to turn "Ida Red," a country-and-western ditty, into "Maybellene," along the way inventing a telegraphic poetry that finally coupled adolescent love and lust.
It was a crossroads where Delta bluesman Robert Johnson (Johnson, Robert), more often channeled as a guitarist and singer, wrote songs that were as much a part of the musical education of Bob Dylan (Dylan, Bob) as were those of Woody Guthrie (Guthrie, Woody) and Weill.

      A single, strikingly American descriptive term, coined in the 1960s to describe a new form of African American rhythm and blues, encompasses this extraordinary flowering of creativity: soul music. All good American popular music, from Armstrong forward, can fairly be called soul music, not only in the sense of emotional directness but in the stronger sense that great emotion can be created within simple forms and limited time, that the crucial contribution of soul is, perhaps, a willingness to surrender to feeling rather than calculating it, to appear effortless even at the risk of seeming simpleminded—to surrender to plain form, direct emotion, unabashed sentiment, and even what in more austere precincts of art would be called sentimentality. What American soul music, in this broad, inclusive sense, has, and what makes it matter so much in the world, is the ability to generate emotion without seeming to engineer emotion—to sing without seeming to sweat too much. The test of the truth of this new soulfulness is, however, its universality. Revered and catalogued in France and imitated in England, this American soul music is adored throughout the world.

      It is, perhaps, necessary for an American to live abroad to grasp how entirely American soul music had become the model and template for a universal language of emotion by the end of the 20th century. And for an American abroad, perhaps what is most surprising is how, for all the national reputation for energy, vim, and future-focused forgetfulness, the best of all this music—from the mournful majesty of Armstrong to the heartaching quiver of Presley—has a small-scale plangency and plaintive emotion that belies the national reputation for the overblown and hyperbolic. In every sense, American culture has given the world the gift of the blues.

Dance
 Serious dance hardly existed in the United States in the first half of the 20th century. One remarkable American, Isadora Duncan (Duncan, Isadora), had played as large a role at the turn of the century and after as anyone in the emancipation of dance from the rigid rules of classical ballet into a form of intense and improvisatory personal expression. But most of Duncan's work was done and her life spent in Europe, and she bequeathed to the American imagination a shining, influential image rather than a set of steps. Ruth St. Denis (St. Denis, Ruth) and Ted Shawn (Shawn, Ted), throughout the 1920s, kept dance in America alive; but it was in the work of the choreographer Martha Graham (Graham, Martha) that the tradition of modern dance in the United States that Duncan had invented found its first and most influential master. Graham's work, like that of her contemporaries among the Abstract Expressionist painters, sought a basic, timeless vocabulary of primal expression; but even after her own work seemed to belong only to a period, in the most direct sense she founded a tradition: a Graham dancer, Paul Taylor (Taylor, Paul), became the most influential modern dance master of the next generation, and a Taylor dancer, Twyla Tharp (Tharp, Twyla), in turn the most influential choreographer of the generation after that. Where Graham had deliberately turned her back on popular culture, however, both Taylor and Tharp, typical of their generations, viewed it quizzically, admiringly, and hungrily. Whether the low inspiration comes from music—as in Tharp's Sinatra Songs, choreographed to recordings by Frank Sinatra and employing and transforming the language of the ballroom dance—or comes directly off the street—as in a famous section of Taylor's dance Cloven Kingdom, in which the dancer's movement is inspired by the way Americans walk and strut and fight—both Taylor and Tharp continue to feed upon popular culture without being consumed by it. 
Perhaps for this reason, their art continues to seem of increasing stature around the world; they are intensely local yet greatly prized elsewhere.

 A similar arc can be traced from the contributions of African American dance pioneers Katherine Dunham (Dunham, Katherine), beginning in the 1930s, and Alvin Ailey (Ailey, Alvin, Jr.), who formed his own company in 1958, to Savion Glover (Glover, Savion), whose pounding style of tap dancing, known as “hitting,” was the rage of Broadway in the mid-1990s with Bring in 'Da Noise, Bring in 'Da Funk.

      George Balanchine (Balanchine, George), the choreographer who dominated the greatest of American ballet troupes, the New York City Ballet, from its founding in 1946 as the Ballet Society until his death in 1983, might be considered outside the bounds of purely “American” culture. Yet this only serves to remind us of how limited and provisional such national groupings must always be. For, though Mr. B., as he was always known, was born and educated in Russia and took his inspiration from a language of dance codified in France in the 19th century, no one has imagined the gestures of American life with more verve, love, or originality. His was an art made with every window in the soul open: to popular music (he choreographed major classical ballets to Sousa marches and George Gershwin (Gershwin, George) songs) as well as to austere and demanding American classical music (as in Ivesiana, his works choreographed to the music of Charles Ives (Ives, Charles)). He created new standards of beauty for both men and women dancers (and, not incidentally, helped spread those new standards of athletic beauty into the culture at large) and invented an audience for dance in the United States where none had existed before. By the end of his life, this Russian-born choreographer, who spoke all his life with a heavy accent, was perhaps the greatest and certainly among the most American of all artists.

Sports
      In many countries, the inclusion of sports, and particularly spectator sports, as part of “culture,” as opposed to the inclusion of recreation or medicine, would seem strange, even dubious. But no one can make sense of the culture of the United States without recognizing that Americans are crazy about games—playing them, watching them, and thinking about them. In no country have sports, especially commercialized, professional spectator sports, played so central a role as they have in the United States. Italy and England have their football (soccer) fanatics; the World Cups of rugby and cricket attract endless interest from the West Indies to Australia; but only in the United States do spectator sports, from “amateur” college (gridiron) football and basketball to the four major professional leagues—hockey, basketball, football, and baseball—play such a large role as a source of diversion, commerce, and, above all, shared common myth. In watching men (and sometimes women) play ball and comparing it with the way other men have played ball before, Americans have found their "proto-myth," a shared common romantic culture that unites them in ways that merely procedural laws cannot.

      Sports are central to American culture in two ways. First, they are themselves a part of the culture, binding, unifying theatrical events that bring together cities, classes, and regions not only in a common cause, however cynically conceived, but in shared experience. They have also provided essential material for culture, the means for writing and movies and poetry. If there is a “Matter of America” in the way that the King Arthur stories were the “Matter of Britain” and La Chanson de Roland the “Matter of France,” then it lies in the lore of professional sports and, perhaps, above all in the lore of baseball.

      Baseball, more than any other sport played in the United States, remains the central national pastime and seems to attract mythmakers as Troy attracted poets. Some of the mythmaking has been naive or fatuous—onetime Major League Baseball commissioner Bartlett Giamatti wrote a book called Take Time for Paradise, finding in baseball a powerful metaphor for the time before the Fall. But the myths of baseball remain powerful even when they are not aided, or adulterated, by too-self-conscious appeals to poetry. The rhythm and variety of the game, the way in which its meanings and achievements depend crucially on a context, a learned history—the way that every swing of Hank Aaron was bound by the ghost of every swing by Babe Ruth—have served generations of Americans as their first contact with the nature of aesthetic experience, which, too, always depends on context and a sense of history, on what things mean in relation to other things that have come before. It may not be necessary to understand baseball to understand the United States, as someone once wrote, but it may be that many Americans get their first ideas about the power of the performing arts by seeing the art with which baseball players perform.

 Although baseball, with the declining and violent sport of boxing, remains by far the most literary of all American games, in recent decades it has been basketball—a sport invented as a small-town recreation more than a century ago and turned on American city playgrounds into the most spectacular and acrobatic of all team sports—that has attracted the most eager followers and passionate students. If baseball has provided generations of Americans with their first glimpse of the power of aesthetic context to make meaning—of the way that what happened before makes sense out of what happens next—then a new generation of spectators has often gotten its first essential glimpse of the poetry implicit in dance and sculpture, the unlimitable expressive power of the human body in motion, by watching such inimitable performers as Julius Erving, Magic Johnson, and Michael Jordan, a performer who, at the end of the 20th century, seemed to transcend not merely the boundaries between sport and art but even those between reality and myth, as larger-than-life as Paul Bunyan and as iconic as Bugs Bunny, with whom he even shared the motion picture screen (Space Jam [1996]).

      By the beginning of the 21st century, the Super Bowl, professional football's championship game, American sports' gold standard of hype and commercial synergy, and the august “October classic,” Major League Baseball's World Series, had been surpassed for many as a shared event by college basketball's national championship. Mirroring a similar phenomenon on the high-school and state level, known popularly as March Madness, this single-elimination tournament whose early rounds feature David versus Goliath matchups and television coverage that shifts between a bevy of regional venues not only has been statistically proved to reduce the productivity of the American workers who monitor the progress of their brackets (predictions of winners and pairings on the way to the Final Four) but for a festive month both reminds the United States of its vanishing regional diversity and transforms the country into one gigantic community. In a similar way, the growth of fantasy baseball and football leagues—in which the participants “draft” real players—has created small communities while offering an escape, at least in fantasy, from the increasingly cynical world of commercial sports.

      Art is made by artists, but it is possible only with audiences; and perhaps the most worrying trait of American culture in the past half century, with high and low dancing their sometimes happy, sometimes challenging dance, has been the threatened disappearance of a broad middlebrow audience for the arts. Many magazines that had helped sustain a sense of community and debate among educated readers—Collier's, The Saturday Evening Post, Look—had all stopped publishing by the late 20th century or continued only as a newspaper insert (Life). Others, including Harper's and the Atlantic Monthly, continue principally as philanthropies.

      As the elephantine growth and devouring appetite of television have reduced the middle audience, there has also been a concurrent growth in the support of the arts in the university. The public support of higher education in the United States, although its ostensible purposes were often merely pragmatic and intended simply to produce skilled scientific workers for industry, has had the perhaps unintended effect of making the universities into cathedrals of culture. The positive side of this development should never be overlooked; things that began as scholarly pursuits—for instance, the enthusiasm for authentic performances of early music—have, after their incubation in the academy, given pleasure to increasingly larger audiences. The growth of the universities has also, for good or ill, helped decentralize culture; the Guthrie Theater in Minnesota, for instance, or the regional opera companies of St. Louis, Mo., and Santa Fe, N.M., are difficult to imagine without the support and involvement of local universities. But many people believe that the “academicization” of the arts has also had the negative effect of encouraging art made by college professors for other college professors. In literature, some people believe, for instance, this has led to the development of a literature that is valued less for its engagement with the world than for its engagement with other kinds of writing.

      Yet a broad, middle-class audience for the arts, if it is endangered, continues to flourish too. The establishment of the Lincoln Center for the Performing Arts in the early 1960s provided a model for subsequent centres across the country, including the John F. Kennedy Center for the Performing Arts in Washington, D.C., which opened in 1971. It is sometimes said, sourly, that the audiences who attend concerts and recitals at these centres are mere “consumers” of culture, rather than people engaged passionately in the ongoing life of the arts. But it seems probable that the motives that lead Americans to the concert hall or opera house are just as mixed as they have been in every other historical period: a desire for prestige, a sense of duty, and real love of the form all commingled.

      The deeper problem that has led to one financial crisis after another for theatre companies and dance troupes and museums (the Twyla Tharp dance company, for instance, despite its worldwide reputation and a popular orientation that included several successful seasons on Broadway, was compelled to survive only by being absorbed into American Ballet Theatre) rests on hard and fixed facts about the economics of the arts, and about the economics of the performing arts in particular. Ballet, opera, symphony, and drama are labour-intensive industries in an era of labour-saving devices. Other industries have remained competitive by substituting automated labour for human labour; but, for all that new stage devices can help cut costs, the basic demands of the old art forms are hard to alter. The corps of a ballet cannot be mechanized or stored on software; voices belong to singers, and singers cannot be replicated. Many Americans, accustomed to the simple connection between popularity and financial success, have had a hard time grasping this fact; perhaps this is one of the reasons for the uniquely impoverished condition of government funding for the arts in the United States.

      First the movies, then broadcast television, then cable television, and now the Internet—again and again, some new technology promises to revolutionize the delivery systems of culture and therefore change culture with it. Promising at once a larger audience than ever before (a truly global village) and a smaller one (e.g., tiny groups interested only in Gershwin having their choice today of 50 Gershwin Web sites), the Internet is only the latest of these candidates. Cable television, the most trumpeted of the more recent mass technologies, has so far sadly failed to multiply the opportunities for new experience of the arts open to Americans. The problem of the “lowest common denominator” is not that it is low but that it is common. It is not that there is no audience for music and dance and jazz. It is that a much larger group is interested in sex and violent images and action, and the common taste is therefore all too easy to please.

      Yet the growing anxiety about the future of the arts reflects, in part, the extraordinary demands Americans have come to make on them. No country has ever before, for good or ill, invested so much in the ideal of a common culture; the arts for most Americans are imagined as therapy, as education, as a common inheritance, as, in some sense, the definition of life itself and the summum bonum. Americans have increasingly asked art to play the role that religious ritual played in older cultures.

      The problem of American culture in the end is inseparable from the triumph of liberalism and of the free-market, largely libertarian social model that, at least for a while at the end of the 20th century, seemed entirely ascendant and which much of the world, despite understandable fits and starts, emulated. On the one hand, liberal societies create liberty and prosperity and abundance, and the United States, as the liberal society par excellence, has not only given freedom to its own artists but allowed artists from elsewhere, from John James Audubon to Marcel Duchamp, to exercise their freedom: artists, however marginalized, are free in the United States to create weird forms, new dance steps, strange rhythms, free verse, and inverted novels.

      At the same time, however, liberal societies break down the consensus, the commonality, and the shared viewpoint that is part of what is meant by traditional culture, and what is left that is held in common is often common in the wrong way. The division between mass product and art made for small and specific audiences has perhaps never seemed so vast as it does at the dawn of the new millennium, and the odds of leaping past the divisions into common language or even merely a decent commonplace civilization have never seemed greater. Even those who are generally enthusiastic about the democratization of culture in American history are bound to find a catch in their throat of protest or self-doubt as they watch bad television reality shows become still worse or bad comic-book movies become still more dominant. The appeal of the lowest common denominator, after all, does not mean that all the people who are watching something have no other or better interests; it just means that the one thing they can all be interested in at once is this kind of thing.

      Liberal societies create freedoms and end commonalities, and that is why they are both praised for their fertility and condemned for their pervasive alienation of audiences from artists, and of art from people. The history of the accompanying longing for authentic community may be a dubious and even comic one, but anyone who has spent a night in front of a screen watching the cynicism and proliferation of gratuitous violence and sexuality at the root of much of what passes for entertainment for most Americans cannot help but feel a little soul-deadened. In this way, as the 21st century began, the cultural paradoxes of American society—the constant oscillation between energy and cynicism, the capacity to make new things and the incapacity to protect the best of tradition—seemed likely not only to become still more evident but also to become the ground for the worldwide debate about the United States itself. Still, if there were not causes for triumph, there were grounds for hope.

      It is in the creative life of Americans that all the disparate parts of American culture can, for the length of a story or play or ballet, at least, come together. What is wonderful, and perhaps special, in the culture of the United States is that the marginal and central, like the high and the low, are not in permanent battle but instead always changing places. The sideshow becomes the centre ring of the circus, the thing repressed the thing admired. The world of American culture, at its best, is a circle, not a ladder. High and low link hands.

Adam Gopnik

 The territory represented by the continental United States had, of course, been discovered, perhaps several times, before the voyages of Christopher Columbus. When Columbus arrived, he found the New World inhabited by peoples who in all likelihood had originally come from the continent of Asia. Probably these first inhabitants had arrived 20,000 to 35,000 years before in a series of migrations from Asia to North America by way of the Bering Strait. By the time the first Europeans appeared, the indigenous people (commonly referred to as Indians) had spread and occupied all portions of the New World.

      The foods and other resources available in each physiographic region largely determined the type of culture prevailing there. Fish and sea mammals, for example, contributed the bulk of the food supply of coastal peoples, while the acorn was a staple for California Indians; plant life and wild game (especially the American bison, or buffalo) were sources for the Plains Indians; and small-game hunting and fishing (depending again on local resources) provided for Midwestern and Eastern American Indian groups. These foods were supplemented by corn (maize), which was a staple food for the Indians of the Southwest. The procurement of these foods called for the employment of fishing, hunting, plant and berry gathering, and farming techniques, the application of which depended, in turn, upon the food resources utilized in given areas.

 Foods and other raw materials likewise conditioned the material culture of the respective regional groups. All Indians transported goods by human carrier; the use of dogs to pull sleds or travois was widespread; and rafts, boats, and canoes were used where navigable water was available. The horse, imported by the Spanish in the early 16th century, was quickly adopted by the Indians once it had made its appearance. Notably, it came to be used widely by the buffalo-hunting Indians of the Great Plains.

      American Indian culture groups were distinguished, among other ways, by house types. Dome-shaped ice houses (igloos) were developed by the Eskimos (called Inuit in Canada) in what would become Alaska; rectangular plank houses were produced by the Northwest Coast Indians; earth and skin lodges and tepees, by plains and prairie tribes; flat-roofed and often multistoried houses, by some of the Pueblo Indians of the Southwest; and barrel houses, by the Northeast Indians. Clothing, or the lack of it, likewise varied with native groups, as did crafts, weapons, and tribal economic, social, and religious customs.

      At the time of Columbus's arrival there were roughly 1.5 million American Indians in what is now the continental United States, although estimates vary greatly. In order to assess the role and the impact of the American Indian upon the subsequent history of the United States in any meaningful way, one must understand the differentiating factors between Native American peoples, such as those mentioned above. Generally speaking, it may be said, however, that the American Indians as a whole exercised an important influence upon the civilization transplanted from Europe to the New World. Indian foods and herbs, articles of manufacture, methods of raising some crops, war techniques, words, a rich folklore, and ethnic infusions are among the more obvious general contributions of the Indians to their European conquerors. The protracted and brutal westward-moving conflict caused by “white” expansionism and Indian resistance constitutes one of the most tragic chapters in the history of the United States.

Oscar O. Winther

Colonial America to 1763
The European background
      The English colonization of North America was but one chapter in the larger story of European expansion throughout the globe. The Portuguese, beginning with a voyage to Porto Santo off the coast of West Africa in 1418, were the first Europeans to promote overseas exploration and colonization. By 1487 the Portuguese had traveled all the way to the southern tip of Africa, establishing trading stations at Arguin, Sierra Leone, and El Mina. In 1497 Vasco da Gama rounded the Cape of Good Hope and sailed up the eastern coast of Africa, laying the groundwork for Portugal's later commercial control of India. By 1500, when Pedro Álvares Cabral stumbled across the coast of Brazil en route to India, Portuguese influence had expanded to the New World as well.

      Though initially lagging behind the Portuguese in the arts of navigation and exploration, the Spanish quickly closed that gap in the decades following Columbus's voyages to America. First in the Caribbean and then in spectacular conquests of New Spain and Peru, they captured the imagination, and the envy, of the European world.

 France, occupied with wars in Europe to preserve its own territorial integrity, was not able to devote as much time or effort to overseas expansion as did Spain and Portugal. Beginning in the early 16th century, however, French fishermen established an outpost in Newfoundland, and in 1534 Jacques Cartier began exploring the Gulf of St. Lawrence. By 1543 the French had ceased their efforts to colonize the northeast portion of the New World. In the last half of the 16th century, France attempted to found colonies in Florida and Brazil, but each of these efforts failed, and by the end of the century Spain and Portugal remained the only two European nations to have established successful colonies in America.

 The English, although eager to duplicate the Spanish and Portuguese successes, nevertheless lagged far behind in their colonization efforts. The English possessed a theoretical claim to the North American mainland by dint of the 1497 voyage of John Cabot off the coast of Nova Scotia, but in fact they had neither the means nor the desire to back up that claim during the 16th century. Thus it was that England relied instead on private trading companies, which were interested principally in commercial rather than territorial expansion, to defend its interests in the expanding European world. The first of these commercial ventures began with the formation of the Muscovy Company in 1554. In 1576–78 the English mariner Martin Frobisher undertook three voyages in search of a Northwest Passage to the Far East. In 1577 Sir Francis Drake made his famous voyage around the world, plundering the western coast of South America en route. A year later Sir Humphrey Gilbert, one of the most dedicated of Elizabethan imperialists, began a series of ventures aimed at establishing permanent colonies in North America. All his efforts met with what was, at best, limited success. Finally, in September 1583, Gilbert, with five vessels and 260 men, disappeared in the North Atlantic. With the failure of Gilbert's voyage, the English turned to a new man, Sir Walter Raleigh, and a new strategy—a southern rather than a northern route to North America—to advance England's fortunes in the New World. Although Raleigh's efforts to found a permanent colony off the coast of Virginia did finally fail with the mysterious destruction of the Roanoke Island colony in 1587, they awakened popular interest in a permanent colonizing venture.

      During the years separating the failure of the Roanoke attempt and the establishment in 1607 of Jamestown colony, English propagandists worked hard to convince the public that a settlement in America would yield instant and easily exploitable wealth. Even men such as the English geographer Richard Hakluyt were not certain that the Spanish colonization experience could or should be imitated but hoped nevertheless that the English colonies in the New World would prove to be a source of immediate commercial gain. There were, of course, other motives for colonization. Some hoped to discover the much-sought-after route to the Orient (East Asia) in North America. English imperialists thought it necessary to settle in the New World in order to limit Spanish expansion. Once it was proved that America was a suitable place for settlement, some Englishmen would travel to those particular colonies that promised to free them from religious persecution. There were also Englishmen, primarily of lower- and middle-class origin, who hoped the New World would provide them with increased economic opportunity in the form of free or inexpensive land. These last two motives, while they have been given considerable attention by historians, appear not to have been so much original motives for English colonization as they were shifts of attitude once colonization had begun.

  The leaders of the Virginia Company, a joint-stock company in charge of the Jamestown enterprise, were for the most part wealthy and wellborn commercial and military adventurers eager to find new outlets for investment. During the first two years of its existence, the Virginia colony, under the charter of 1607, proved an extraordinarily bad investment. This was principally due to the unwillingness of the early colonizers to do the necessary work of providing for themselves and to the chronic shortage of capital to supply the venture.

      A new charter in 1609 significantly broadened membership in the Virginia Company, thereby temporarily increasing the supply of capital at the disposal of its directors, but most of the settlers continued to act as though they expected the Indians to provide for their existence, a notion that the Indians fiercely rejected. As a result, the enterprise still failed to yield any profits, and the number of investors again declined.

      The crown issued a third charter in 1612, authorizing the company to institute a lottery to raise more capital for the floundering enterprise. In that same year, John Rolfe harvested the first crop of a high-grade and therefore potentially profitable strain of tobacco. At about the same time, with the arrival of Sir Thomas Dale in the colony as governor in 1611, the settlers gradually began to practice the discipline necessary for their survival, though at an enormous personal cost.

      Dale carried with him the “Laws Divine, Morall, and Martial,” which were intended to supervise nearly every aspect of the settlers' lives. Each person in Virginia, including women and children, was given a military rank, with duties spelled out in minute detail. Penalties imposed for violating these rules were severe: those who failed to obey the work regulations were to be forced to lie with neck and heels together all night for the first offense, whipped for the second, and sent to a year's service in English galleys (convict ships) for the third. The settlers could hardly protest the harshness of the code, for that might be deemed slander against the company—an offense punishable by service in the galleys or by death.

      Dale's code brought order to the Virginia experiment, but it hardly served to attract new settlers. To increase incentive, the company, beginning in 1618, offered 50 acres (about 20 hectares) of land to those settlers who could pay their transportation to Virginia and a promise of 50 acres after seven years of service to those who could not pay their passage. Concurrently, the new governor of Virginia, Sir George Yeardley, issued a call for the election of representatives to a House of Burgesses, which was to convene in Jamestown in July 1619. In its original form the House of Burgesses was little more than an agency of the governing board of the Virginia Company, but it would later expand its powers and prerogatives and become an important force for colonial self-government.

      Despite the introduction of these reforms, the years from 1619 to 1624 proved fatal to the future of the Virginia Company. Epidemics, constant warfare with the Indians, and internal disputes took a heavy toll on the colony. In 1624 the crown finally revoked the charter of the company and placed the colony under royal control. The introduction of royal government into Virginia, while it was to have important long-range consequences, did not produce an immediate change in the character of the colony. The economic and political life of the colony continued as it had in the past. The House of Burgesses, though its future under the royal commission of 1624 was uncertain, continued to meet on an informal basis; by 1629 it had been officially reestablished. The crown also grudgingly acquiesced to the decision of the Virginia settlers to continue to direct most of their energies to the growth and exportation of tobacco. By 1630 the Virginia colony, while not prosperous, at least was showing signs that it was capable of surviving without royal subsidy.

  Maryland, Virginia's neighbour to the north, was the first English colony to be controlled by a single proprietor rather than by a joint-stock company. Lord Baltimore (George Calvert) had been an investor in a number of colonizing schemes before being given a grant of land from the crown in 1632. Baltimore was given a sizable grant of power to go along with his grant of land; he had control over the trade and political system of the colony so long as he did nothing to deviate from the laws of England. Baltimore's son Cecilius Calvert took over the project at his father's death and promoted a settlement at St. Mary's on the Potomac. Supplied in part by Virginia, the Maryland colonists managed to sustain their settlement in modest fashion from the beginning. As in Virginia, however, the early 17th-century settlement in Maryland was often unstable and unrefined; composed overwhelmingly of young single males—many of them indentured servants—it lacked the stabilizing force of a strong family structure to temper the rigours of life in the wilderness.

 The colony was intended to serve at least two purposes. Baltimore, a Roman Catholic, was eager to found a colony where Catholics could live in peace, but he was also eager to see his colony yield him as large a profit as possible. From the outset, Protestants outnumbered Catholics, although a few prominent Catholics tended to own an inordinate share of the land in the colony. Despite this favouritism in the area of land policy, Baltimore was for the most part a good and fair administrator.

      Following the accession of William III and Mary II to the English throne, however, control of the colony was taken away from the Calvert family and entrusted to the royal government. Shortly thereafter, the crown decreed that Anglicanism would be the established religion of the colony. In 1715, after the Calvert family had renounced Catholicism and embraced Anglicanism, the colony reverted to a proprietary form of government.

The New England colonies
      Although lacking a charter, the founders of Plymouth in Massachusetts were, like their counterparts in Virginia, dependent upon private investments from profit-minded backers to finance their colony. The nucleus of that settlement was drawn from an enclave of English émigrés in Leiden, Holland (now in The Netherlands). These religious Separatists believed that the true church was a voluntary company of the faithful under the “guidance” of a pastor and tended to be exceedingly individualistic in matters of church doctrine. Unlike the settlers of Massachusetts Bay, these Pilgrims chose to “separate” from the Church of England rather than to reform it from within.

  In 1620, the first year of settlement, nearly half the Pilgrim settlers died of disease. From that time forward, however, and despite decreasing support from English investors, the health and the economic position of the colonists improved. The Pilgrims soon secured peace treaties with most of the Indians around them, enabling them to devote their time to building a strong, stable economic base rather than diverting their efforts toward costly and time-consuming problems of defending the colony from attack. Although none of their principal economic pursuits—farming, fishing, and trading—promised them lavish wealth, the Pilgrims in America were, after only five years, self-sufficient.

 Although the Pilgrims were always a minority in Plymouth, they nevertheless controlled the entire governmental structure of their colony during the first four decades of settlement. Before disembarking from the Mayflower in 1620, the Pilgrim founders, led by William Bradford, demanded that all the adult males aboard who were able to do so sign a compact promising obedience to the laws and ordinances drafted by the leaders of the enterprise. Although the Mayflower Compact has been interpreted as an important step in the evolution of democratic government in America, in fact the compact represented a one-sided arrangement, with the settlers promising obedience and the Pilgrim founders promising very little. Although nearly all the male inhabitants were permitted to vote for deputies to a provincial assembly and for a governor, the colony, for at least the first 40 years of its existence, remained in the tight control of a few men. After 1660 the people of Plymouth gradually gained a greater voice in both their church and civic affairs, and by 1691, when Plymouth colony (also known as the Old Colony) was annexed to Massachusetts Bay, the Plymouth settlers had distinguished themselves by their quiet, orderly ways.

      The Puritans of the Massachusetts Bay Colony, like the Pilgrims, sailed to America principally to free themselves from religious restraints. Unlike the Pilgrims, the Puritans did not desire to “separate” themselves from the Church of England but, rather, hoped by their example to reform it. Nonetheless, one of the recurring problems facing the leaders of the Massachusetts Bay colony was to be the tendency of some, in their desire to free themselves from the alleged corruption of the Church of England, to espouse Separatist doctrine. When these tendencies or any other hinting at deviation from orthodox Puritan doctrine developed, those holding them were either quickly corrected or expelled from the colony. The leaders of the Massachusetts Bay enterprise never intended their colony to be an outpost of toleration in the New World; rather, they intended it to be a “Zion in the wilderness,” a model of purity and orthodoxy, with all backsliders subject to immediate correction.

 The civil government of the colony was guided by a similar authoritarian spirit. Men such as John Winthrop, the first governor of Massachusetts Bay, believed that it was the duty of the governors of society not to act as the direct representatives of their constituents but rather to decide, independently, what measures were in the best interests of the total society. The original charter of 1629 gave all power in the colony to a General Court composed of only a small number of shareholders in the company. On arriving in Massachusetts, many disfranchised settlers immediately protested against this provision and caused the franchise to be widened to include all church members. These “freemen” were given the right to vote in the General Court once each year for a governor and a Council of Assistants. Although the charter of 1629 technically gave the General Court the power to decide on all matters affecting the colony, the members of the ruling elite initially refused to allow the freemen in the General Court to take part in the lawmaking process on the grounds that their numbers would render the court inefficient.

      In 1634 the General Court adopted a new plan of representation whereby the freemen of each town would be permitted to select two or three delegates and assistants, elected separately but sitting together in the General Court, who would be responsible for all legislation. Tension always existed between the smaller, more prestigious group of assistants and the larger group of deputies. In 1644, as a result of this continuing tension, the two groups were officially lodged in separate houses of the General Court, with each house reserving a veto power over the other.

      Despite the authoritarian tendencies of the Massachusetts Bay colony, a spirit of community developed there as perhaps in no other colony. The same spirit that caused the residents of Massachusetts to report on their neighbours for deviation from the true principles of Puritan morality also prompted them to be extraordinarily solicitous about their neighbours' needs. Although life in Massachusetts was made difficult for those who dissented from the prevailing orthodoxy, it was marked by a feeling of attachment and community for those who lived within the enforced consensus of the society.

      Many New Englanders, however, refused to live within the orthodoxy imposed by the ruling elite of Massachusetts, and both Connecticut and Rhode Island were founded as a by-product of their discontent. The Rev. Thomas Hooker, who had arrived in Massachusetts Bay in 1633, soon found himself in opposition to the colony's restrictive policy regarding the admission of church members and to the oligarchic power of the leaders of the colony. Motivated both by a distaste for the religious and political structure of Massachusetts and by a desire to open up new land, Hooker and his followers began moving into the Connecticut valley in 1635. By 1636 they had succeeded in founding three towns—Hartford, Windsor, and Wethersfield. In 1638 the separate colony of New Haven was founded; in 1662 Connecticut received a royal charter, and New Haven was absorbed into Connecticut shortly thereafter.

      Roger Williams, the man closely associated with the founding of Rhode Island, was banished from Massachusetts because of his unwillingness to conform to the orthodoxy established in that colony. Williams's views conflicted with those of the ruling hierarchy of Massachusetts in several important ways. His own strict criteria for determining who was regenerate, and therefore eligible for church membership, finally led him to deny any practical way to admit anyone into the church. Once he recognized that no church could ensure the purity of its congregation, he ceased using purity as a criterion and instead opened church membership to nearly everyone in the community. Moreover, Williams showed distinctly Separatist leanings, preaching that the Puritan church could not possibly achieve purity as long as it remained within the Church of England. Finally, and perhaps most serious, he openly disputed the right of the Massachusetts leaders to occupy land without first purchasing it from the Native Americans.

      The unpopularity of Williams's views forced him to flee Massachusetts Bay for Providence in 1636. In 1639 William Coddington, another dissenter in Massachusetts, settled his congregation in Newport. Four years later Samuel Gorton, yet another minister banished from Massachusetts Bay because of his differences with the ruling oligarchy, settled in Shawomet (later renamed Warwick). In 1644 these three communities joined with a fourth, Portsmouth, under one charter to become the colony of Providence Plantations in Narragansett Bay.

      The early settlers of New Hampshire and Maine were also ruled by the government of Massachusetts Bay. New Hampshire was permanently separated from Massachusetts in 1692, although it was not until 1741 that it was given its own royal governor. Maine remained under the jurisdiction of Massachusetts until 1820.

The middle colonies
 New Netherland, founded in 1624 at Fort Orange (now Albany) by the Dutch West India Company, was but one element in a wider program of Dutch expansion in the first half of the 17th century. In 1664 the English captured the colony of New Netherland, renaming it New York after James, duke of York, brother of Charles II, and placing it under the proprietary control of the duke. In return for an annual gift to the king of 40 beaver skins, the duke of York and his resident board of governors were given extraordinary discretion in the ruling of the colony. Although the grant to the duke of York made mention of a representative assembly, the duke was not legally obliged to summon it and in fact did not summon it until 1683. The duke's interest in the colony was chiefly economic, not political, but most of his efforts to derive economic gain from New York proved futile. Indians, foreign interlopers (the Dutch actually recaptured New York in 1673 and held it for more than a year), and the success of the colonists in evading taxes made the proprietor's job a frustrating one.

      In February 1685 the duke of York found himself not only proprietor of New York but also king of England, a fact that changed the status of New York from that of a proprietary to a royal colony. The process of royal consolidation was accelerated when in 1688 the colony, along with the New England and New Jersey colonies, was made part of the ill-fated Dominion of New England. In 1689 Jacob Leisler, a German merchant living on Long Island, led a successful revolt against the rule of the deputy governor, Francis Nicholson. The revolt, which was a product of dissatisfaction with a small aristocratic ruling elite and a more general dislike of the consolidated scheme of government of the Dominion of New England, served to hasten the demise of the dominion.

  Pennsylvania, in part because of the liberal policies of its founder, William Penn, was destined to become the most diverse, dynamic, and prosperous of all the North American colonies. Penn himself was a liberal, but by no means radical, English Whig. His Quaker (Society of Friends) faith was marked not by the religious extremism of some Quaker leaders of the day but rather by an adherence to certain dominant tenets of the faith—liberty of conscience and pacifism—and by an attachment to some of the basic tenets of Whig doctrine. Penn sought to implement these ideals in his “holy experiment” in the New World.

 Penn received his grant of land along the Delaware River in 1681 from Charles II as a reward for his father's service to the crown. The first “frame of government” proposed by Penn in 1682 provided for a council and an assembly, each to be elected by the freeholders of the colony. The council was to have the sole power of initiating legislation; the lower house could only approve or veto bills submitted by the council. After numerous objections about the “oligarchic” nature of this form of government, Penn issued a second frame of government in 1683 and then a third in 1696, but even these did not wholly satisfy the residents of the colony. Finally, in 1701, a Charter of Privileges, giving the lower house all legislative power and transforming the council into an appointive body with advisory functions only, was approved by the citizens. The Charter of Privileges, like the other three frames of government, continued to guarantee the principle of religious toleration to all Protestants.

 Pennsylvania prospered from the outset. Although there was some jealousy between the original settlers (who had received the best land and important commercial privileges) and the later arrivals, economic opportunity in Pennsylvania was on the whole greater than in any other colony. Beginning in 1683 with the immigration of Germans into the Delaware valley and continuing with an enormous influx of Irish and Scotch-Irish in the 1720s and '30s, the population of Pennsylvania increased and diversified. The fertile soil of the countryside, in conjunction with a generous government land policy, kept immigration at high levels throughout the 18th century. Ultimately, however, the continuing influx of European settlers hungry for land spelled doom for the pacific Indian policy initially envisioned by Penn. “Economic opportunity” for European settlers often depended on the dislocation, and frequent extermination, of the American Indian residents who had initially occupied the land in Penn's colony.

  New Jersey remained in the shadow of both New York and Pennsylvania throughout most of the colonial period. Part of the territory ceded to the duke of York by the English crown in 1664 lay in what would later become the colony of New Jersey. The duke of York in turn granted that portion of his lands to John Berkeley and George Carteret, two close friends and allies of the king. In 1665 Berkeley and Carteret established a proprietary government under their own direction. Constant clashes, however, developed between the New Jersey and the New York proprietors over the precise nature of the New Jersey grant. The legal status of New Jersey became even more tangled when Berkeley sold his half interest in the colony to two Quakers, who in turn placed the management of the colony in the hands of three trustees, one of whom was Penn. The area was then divided into East Jersey, controlled by Carteret, and West Jersey, controlled by Penn and the other Quaker trustees. In 1682 the Quakers bought East Jersey. A multiplicity of owners and an uncertainty of administration caused both colonists and colonizers to feel dissatisfied with the proprietary arrangement, and in 1702 the crown united the two Jerseys into a single royal province.

      When the Quakers purchased East Jersey, they also acquired the tract of land that was to become Delaware, in order to protect their water route to Pennsylvania. That territory remained part of the Pennsylvania colony until 1704, when it was given an assembly of its own. It remained under the Pennsylvania governor, however, until the American Revolution.

The Carolinas and Georgia
      The English crown had issued grants to the Carolina territory as early as 1629, but it was not until 1663 that a group of eight proprietors—most of them men of extraordinary wealth and power even by English standards—actually began colonizing the area. The proprietors hoped to grow silk in the warm climate of the Carolinas, but all efforts to produce that valuable commodity failed. Moreover, it proved difficult to attract settlers to the Carolinas; it was not until 1718, after a series of violent Indian wars had subsided, that the population began to increase substantially. The pattern of settlement, once begun, followed two paths. North Carolina, which was largely cut off from the European and Caribbean trade by its unpromising coastline, developed into a colony of small to medium farms. South Carolina, with close ties to both the Caribbean and Europe, produced rice and, after 1742, indigo for a world market. The early settlers in both areas came primarily from the West Indian colonies. This pattern of migration was not, however, as distinctive in North Carolina, where many of the residents were part of the spillover from the natural expansion of Virginians southward.

      The original framework of government for the Carolinas, the Fundamental Constitutions, drafted in 1669 by Anthony Ashley Cooper (Lord Shaftesbury) with the help of the philosopher John Locke, was largely ineffective because of its restrictive and feudal nature. The Fundamental Constitutions was abandoned in 1693 and replaced by a frame of government diminishing the powers of the proprietors and increasing the prerogatives of the provincial assembly. In 1729, primarily because of the proprietors' inability to meet the pressing problems of defense, the Carolinas were converted into the two separate royal colonies of North and South Carolina.

      The proprietors of Georgia, led by James Oglethorpe, were wealthy philanthropic English gentlemen. It was Oglethorpe's plan to transport imprisoned debtors to Georgia, where they could rehabilitate themselves by profitable labour and make money for the proprietors in the process. Those who actually settled in Georgia—and by no means all of them were impoverished debtors—encountered a highly restrictive economic and social system. Oglethorpe and his partners limited the size of individual landholdings to 500 acres (about 200 hectares), prohibited slavery, forbade the drinking of rum, and instituted a system of inheritance that further restricted the accumulation of large estates. The regulations, though noble in intention, created considerable tension between some of the more enterprising settlers and the proprietors. Moreover, the economy did not live up to the expectations of the colony's promoters. The silk industry in Georgia, like that in the Carolinas, failed to produce even one profitable crop.

      The settlers were also dissatisfied with the political structure of the colony; the proprietors, concerned primarily with keeping close control over their utopian experiment, failed to provide for local institutions of self-government. As protests against the proprietors' policies mounted, the crown in 1752 assumed control over the colony; subsequently, many of the restrictions that the settlers had complained about, notably those discouraging the institution of slavery, were lifted.

Imperial organization
      British policy toward the American colonies was inevitably affected by the domestic politics of England; since the politics of England in the 17th and 18th centuries were never wholly stable, it is not surprising that British colonial policy during those years never developed along clear and consistent lines. During the first half century of colonization, it was even more difficult for England to establish an intelligent colonial policy because of the very disorganization of the colonies themselves. It was nearly impossible for England to predict what role Virginia, Maryland, Massachusetts, Connecticut, and Rhode Island would play in the overall scheme of empire because of the diversity of the aims and governmental structures of those colonies. By 1660, however, England had taken the first steps in reorganizing her empire in a more profitable manner. The Navigation Act of 1660, a modification and amplification of a temporary series of acts passed in 1651, provided that goods bound to England or to English colonies, regardless of origin, had to be shipped only in English vessels; that three-fourths of the personnel of those ships had to be Englishmen; and that certain “enumerated articles,” such as sugar, cotton, and tobacco, were to be shipped only to England, with trade in those items with other countries prohibited. This last provision hit Virginia and Maryland particularly hard; although those two colonies were awarded a monopoly over the English tobacco market at the same time that they were prohibited from marketing their tobacco elsewhere, there was no way that England alone could absorb their tobacco production.

      The 1660 act proved inadequate to safeguard the entire British commercial empire, and in subsequent years other navigation acts were passed, strengthening the system. In 1663 Parliament passed an act requiring all vessels with European goods bound for the colonies to pass first through English ports to pay customs duties. In order to prevent merchants from shipping the enumerated articles from colony to colony in the coastal trade and then taking them to a foreign country, in 1673 Parliament required that merchants post bond guaranteeing that those goods would be taken only to England. Finally, in 1696 Parliament established a Board of Trade to oversee Britain's commercial empire, instituted mechanisms to ensure that the colonial governors aided in the enforcement of trade regulations, and set up vice admiralty courts in America for the prosecution of those who violated the Navigation Acts. On the whole, this attempt at imperial consolidation—what some historians have called the process of Anglicization—was successful in bringing the economic activities of the colonies under closer crown control. While a significant amount of colonial trade continued to evade British regulation, it is nevertheless clear that the British were at least partially successful in imposing greater commercial and political order on the American colonies during the period from the late 17th to the mid-18th century.

      In addition to the agencies of royal control in England, there were a number of royal officials in America responsible not only for aiding in the regulation of Britain's commercial empire but also for overseeing the internal affairs of the colonies. The weaknesses of royal authority in the politics of provincial America were striking, however. In some areas, particularly in the corporate colonies of New England during the 17th century and in the proprietary colonies throughout their entire existence, direct royal authority in the person of a governor responsible to the crown was nonexistent. The absence of a royal governor in those colonies had a particularly deleterious effect on the enforcement of trade regulations. In fact, the lack of royal control over the political and commercial activities of New England prompted the Lords of Trade to overturn the Massachusetts Bay charter in 1684 and to consolidate Massachusetts, along with the other New England colonies and New York, into the Dominion of New England. After the colonists, aided by the turmoil of the Glorious Revolution of 1688 in England, succeeded in overthrowing the dominion scheme, the crown installed a royal governor in Massachusetts to protect its interests.

      In those colonies with royal governors—the number of those colonies grew from one in 1650 to eight in 1760—the crown possessed a mechanism by which to ensure that royal policy was enforced. The Privy Council issued each royal governor in America a set of instructions carefully defining the limits of provincial authority. The royal governors were to have the power to decide when to call the provincial assemblies together, to prorogue or dissolve the assemblies, and to veto any legislation passed by those assemblies. The governor's power over other aspects of the political structure of the colony was just as great. In most royal colonies he was the one official primarily responsible for the composition of the upper houses of the colonial legislatures and for the appointment of important provincial officials, such as the treasurer, attorney general, and all colonial judges. Moreover, the governor had enormous patronage powers over the local agencies of government. The officials of the county court, who were the principal agents of local government, were appointed by the governor in most of the royal colonies. Thus, the governor had direct or indirect control over every agency of government in America.

The growth of provincial power

Political growth
      The distance separating England and America, the powerful pressures exerted on royal officials by Americans, and the inevitable inefficiency of any large bureaucracy all served to weaken royal power and to strengthen the hold of provincial leaders on the affairs of their respective colonies. During the 18th century the colonial legislatures gained control over their own parliamentary prerogatives, achieved primary responsibility for legislation affecting taxation and defense, and ultimately took control over the salaries paid to royal officials. Provincial leaders also made significant inroads into the governor's patronage powers. Although theoretically the governor continued to control the appointments of local officials, in reality he most often automatically followed the recommendations of the provincial leaders in the localities in question. Similarly, the governor's councils, theoretically agents of royal authority, came to be dominated by prominent provincial leaders who tended to reflect the interests of the leadership of the lower house of assembly rather than those of the royal government in London.

      Thus, by the mid-18th century most political power in America was concentrated in the hands of provincial rather than royal officials. These provincial leaders undoubtedly represented the interests of their constituents more faithfully than any royal official could, but it is clear that the politics of provincial America were hardly democratic by modern standards. In general, both social prestige and political power tended to be determined by economic standing, and the economic resources of colonial America, though not as unevenly distributed as in Europe, were nevertheless controlled by relatively few men.

      In the Chesapeake Bay societies of Virginia and Maryland, and particularly in the regions east of the Blue Ridge Mountains, a planter class came to dominate nearly every aspect of those colonies' economic life. These same planters, joined by a few prominent merchants and lawyers, dominated the two most important agencies of local government—the county courts and the provincial assemblies. This extraordinary concentration of power in the hands of a wealthy few occurred in spite of the fact that a large percentage of the free adult male population (some have estimated as high as 80 to 90 percent) was able to participate in the political process. The ordinary citizens of the Chesapeake society, and those of most colonies, nevertheless continued to defer to those whom they considered to be their “betters.” Although the societal ethic that enabled power to be concentrated in the hands of a few was hardly a democratic one, there is little evidence, at least for Virginia and Maryland, that the people of those societies were dissatisfied with their rulers. In general, they believed that their local officials ruled responsively.

      In the Carolinas a small group of rice and indigo planters monopolized much of the wealth. As in Virginia and Maryland, the planter class came to constitute a social elite. As a rule, the planter class of the Carolinas did not have the same long tradition of responsible government as did the ruling oligarchies of Virginia and Maryland, and, as a consequence, they tended to be absentee landlords and governors, often passing much of their time in Charleston, away from their plantations and their political responsibilities.

      The western regions of both the Chesapeake and Carolina societies displayed distinctive characteristics of their own. Ruling traditions were fewer, accumulations of land and wealth less striking, and the social hierarchy less rigid in the west. In fact, in some western areas antagonism toward the restrictiveness of the east and toward eastern control of the political structure led to actual conflict. In both North and South Carolina armed risings of varying intensity erupted against the unresponsive nature of the eastern ruling elite. As the 18th century progressed, however, and as more men accumulated wealth and social prestige, the societies of the west came more closely to resemble those of the east.

      New England society was more diverse and the political system less oligarchic than that of the South. In New England the mechanisms of town government served to broaden popular participation in government beyond the narrow base of the county courts.

      The town meetings, which elected the members of the provincial assemblies, were open to nearly all free adult males. Despite this, a relatively small group of men dominated the provincial governments of New England. As in the South, men of high occupational status and social prestige were closely concentrated in leadership positions in their respective colonies; in New England, merchants, lawyers, and to a lesser extent clergymen made up the bulk of the social and political elite.

      The social and political structure of the middle colonies was more diverse than that of any other region in America. New York, with its extensive system of manors and manor lords, often displayed genuinely feudal characteristics. The tenants on large manors often found it impossible to escape the influence of their manor lords. The administration of justice, the election of representatives, and the collection of taxes often took place on the manor itself. As a consequence, the large landowning families exercised an inordinate amount of economic and political power. The Great Rebellion of 1766, a short-lived outburst directed against the manor lords, was a symptom of the widespread discontent among the lower and middle classes. By contrast, Pennsylvania's governmental system was more open and responsive than that of any other colony in America. A unicameral legislature, free from the restraints imposed by a powerful governor's council, allowed Pennsylvania to be relatively independent of the influence of both the crown and the proprietor. This fact, in combination with the tolerant and relatively egalitarian bent of the early Quaker settlers and the subsequent immigration of large numbers of Europeans, made the social and political structure of Pennsylvania more democratic but more faction-ridden than that of any other colony.

Population growth
  The increasing political autonomy of the American colonies was a natural reflection of their increased stature in the overall scheme of the British Empire. In 1650 the population of the colonies had been about 52,000; in 1700 it was perhaps 250,000, and by 1760 it was approaching 1,700,000. Virginia had increased from about 54,000 in 1700 to approximately 340,000 in 1760. Pennsylvania had begun with about 500 settlers in 1681 and had attracted at least 250,000 people by 1760. And America's cities were beginning to grow as well. By 1765 Boston had reached 15,000; New York City, 16,000–17,000; and Philadelphia, the largest city in the colonies, 20,000.

      Part of that population growth was the result of the involuntary immigration of African slaves. During the 17th century, slaves remained a tiny minority of the population. By the mid-18th century, after Southern colonists discovered that the profits generated by their plantations could support the relatively large initial investments needed for slave labour, the volume of the slave trade increased markedly. In Virginia the slave population leaped from about 2,000 in 1670 to perhaps 23,000 in 1715 and reached 150,000 on the eve of the American Revolution. In South Carolina the increase was even more dramatic. In 1700 there were probably no more than 2,500 blacks in the population; by 1765 there were 80,000–90,000, with blacks outnumbering whites by about 2 to 1.

      One of the principal attractions for the immigrants who moved to America voluntarily was the availability of inexpensive arable land. The westward migration to America's frontier—in the early 17th century all of America was a frontier, and by the 18th century the frontier ranged anywhere from 10 to 200 miles (16 to 320 km) from the coastline—was to become one of the distinctive elements in American history. English Puritans, beginning in 1629 and continuing through 1640, were the first to immigrate in large numbers to America. Throughout the 17th century most of the immigrants were English; but, beginning in the second decade of the 18th century, a wave of Germans, principally from the Rhineland Palatinate, arrived in America: by 1770 between 225,000 and 250,000 Germans had immigrated to America, more than 70 percent of them settling in the middle colonies, where generous land policies and religious toleration made life more comfortable for them. The Scotch-Irish and Irish immigration, which began on a large scale after 1713 and continued past the American Revolution, was more evenly distributed. By 1750 both Scotch-Irish and Irish could be found in the western portions of nearly every colony. In almost all the regions in which Europeans sought greater economic opportunity, however, that same quest for independence and self-sufficiency led to tragic conflict with Indians over the control of land. And in nearly every instance the outcome was similar: the Europeans, failing to respect Indian claims either to land or to cultural autonomy, pushed the Indians of North America farther and farther into the periphery.

 Provincial America came to be less dependent upon subsistence agriculture and more on the cultivation and manufacture of products for the world market. Land, which initially served only individual needs, came to be the fundamental source of economic enterprise. The independent yeoman farmer continued to exist, particularly in New England and the middle colonies, but most settled land in North America by 1750 was devoted to the cultivation of a cash crop. New England turned its land over to the raising of meat products for export. The middle colonies were the principal producers of grains. By 1700 Philadelphia exported more than 350,000 bushels of wheat and more than 18,000 tons of flour annually. The Southern colonies were, of course, even more closely tied to the cash crop system. South Carolina, aided by British incentives, turned to the production of rice and indigo. North Carolina, although less oriented toward the market economy than South Carolina, was nevertheless one of the principal suppliers of naval stores. Virginia and Maryland steadily increased their economic dependence on tobacco and on the London merchants who purchased that tobacco, and for the most part they ignored those who recommended that they diversify their economies by turning part of their land over to the cultivation of wheat. Their near-total dependence upon the world tobacco price would ultimately prove disastrous, but for most of the 18th century Virginia and Maryland soil remained productive enough to make a single-crop system reasonably profitable.

 As America evolved from subsistence to commercial agriculture, an influential commercial class increased its power in nearly every colony. Boston was the centre of the merchant elite of New England, who not only dominated economic life but also wielded social and political power. Merchants such as James De Lancey and Philip Livingston in New York and Joseph Galloway, Robert Morris, and Thomas Wharton in Philadelphia exerted an influence far beyond the confines of their occupations. In Charleston the Pinckney, Rutledge, and Lowndes families controlled much of the trade that passed through that port. Even in Virginia, where a strong merchant class was nonexistent, those people with the most economic and political power were those commercial farmers who best combined the occupations of merchant and farmer. And it is clear that the commercial importance of the colonies was increasing. During the years 1700–10, approximately £265,000 sterling was exported annually to Great Britain from the colonies, with roughly the same amount being imported by the Americans from Great Britain. By the decade 1760–70, that figure had risen to more than £1,000,000 sterling of goods exported annually to Great Britain and £1,760,000 annually imported from Great Britain.

Richard R. Beeman

      Although Frederick Jackson Turner's 1893 “frontier thesis”—that American democracy was the result of an abundance of free land—has long been seriously challenged and modified, it is clear that the plentifulness of virgin acres and the lack of workers to till them did cause a loosening of the constraints of authority in the colonial and early national periods. Once it became clear that the easiest path to success for Britain's New World “plantations” lay in raising export crops, there was a constant demand for agricultural labour, which in turn spurred practices that—with the notable exception of slavery—compromised a strictly hierarchical social order.

      In all the colonies, whether governed directly by the king, by proprietors, or by chartered corporations, it was essential to attract settlers, and what governors had most plentifully to offer was land. Sometimes large grants were made to entire religious communities numbering in the hundreds or more. Sometimes tracts were allotted to wealthy men on the “head rights” (literally “per capita”) system of so many acres for each family member they brought over. Few Englishmen or Europeans had the means to buy farms outright, so the simple sale of homesteads by large-scale grantees was less common than renting. But there was another well-traveled road to individual proprietorship that also provided a workforce: the system of contract labour known as indentured service. Under it, an impecunious new arrival would sign on with a landowner for a period of service—commonly seven years—binding him to work in return for subsistence and sometimes for the repayment of his passage money to the ship captain who had taken him across the Atlantic (such immigrants were called “redemptioners”). At the end of this term, the indentured servant would in many cases be rewarded by the colony itself with “freedom dues,” a title to 50 or more acres of land in a yet-unsettled area. This somewhat biblically inspired precapitalist system of transfer was not unlike apprenticeship, the economic and social tool that added to the supply of skilled labour. The apprentice system called for a prepubescent boy to be “bound out” to a craftsman who would take him into his own home and there teach him his art while serving as a surrogate parent. (Girls were perennially “apprenticed” to their mothers as homemakers.) Both indentured servants and apprentices were subject to the discipline of the master, and their lot varied with his generosity or hard-fistedness. There must have been plenty of the latter type of master, as running away was common. 
The first Africans taken to Virginia, or at least some of them, appear to have worked as indentured servants. Not until the case of John Punch in the 1640s did it become legally established that black “servants” were to remain such for life. Having escaped, been caught, and brought to trial, Punch, an indentured servant of African descent, and two other indentured servants of European descent received very different sentences, with Punch's punishment being servitude for the “rest of his natural life” while that for the other two was merely an extension of their service.

      The harshness of New England's climate and topography meant that for most of its people the road to economic independence lay in trade, seafaring, fishing, or craftsmanship. But the craving for an individually owned subsistence farm grew stronger as the first generations of religious settlers who had “planted” by congregation died off. In the process the communal holding of land by townships—with small allotted family garden plots and common grazing and orchard lands, much in the style of medieval communities—yielded gradually to the more conventional privately owned fenced farm. The invitation that available land offered—individual control of one's life—was irresistible. Property in land also conferred civic privileges, so an unusually large number of male colonists were qualified for suffrage by the Revolution's eve, even though not all of them exercised the vote freely or without traditional deference to the elite.

      Slavery was the backbone of large-scale cultivation of such crops as tobacco and hence took strongest root in the Southern colonies. But thousands of white freeholders of small acreages also lived in those colonies; moreover, slavery on a small scale (mainly in domestic service and unskilled labour) was implanted in the North. The line between a free and a slaveholding America had not yet been sharply drawn.

      One truly destabilizing system of acquiring land was simply “squatting.” On the western fringes of settlement, it was not possible for colonial administrators to use police powers to expel those who helped themselves to acres technically owned by proprietors in the seaboard counties. Far from seeing themselves as outlaws, the squatters believed that they were doing civilization's work in putting new land into production, and they saw themselves as the moral superiors of eastern “owners” for whom land was a mere speculative commodity that they did not, with great danger and hardship, cultivate themselves. Squatting became a regular feature of westward expansion throughout early U.S. history.

Bernard A. Weisberger

Cultural and religious development

Colonial culture
  America's intellectual attainments during the 17th and 18th centuries, while not inferior to those of the countries of Europe, were nevertheless of a decidedly different character. It was the techniques of applied science that most excited the minds of Americans, who, faced with the problem of subduing an often wild and unruly land, saw in science the best way to explain, and eventually to harness, those forces around them. Ultimately this scientific mode of thought might be applied to the problems of civil society as well, but for the most part the emphasis in colonial America remained on science and technology, not politics or metaphysics. Typical of America's peculiar scientific genius was John Bartram of Pennsylvania, who collected and classified important botanical data from the New World. The American Philosophical Society, founded in 1744, is justly remembered as the focus of intellectual life in America. Men such as David Rittenhouse, an astronomer who built the first planetarium in America; Cadwallader Colden, the lieutenant governor of New York, whose accomplishments as a botanist and as an anthropologist probably outmatched his achievements as a politician; and Benjamin Rush, a pioneer in numerous areas of social reform as well as one of colonial America's foremost physicians, were among the many active members of the society. At the centre of the society was one of its founders, Benjamin Franklin, who (in his experiments concerning the flow of electricity) proved to be one of the few American scientists to achieve a major theoretical breakthrough but who was more adept at the kinds of applied research that resulted in the manufacture of more efficient stoves and the development of the lightning rod.

 American cultural achievements in nonscientific fields were less impressive. American literature, at least in the traditional European forms, was nearly nonexistent. The most important American contribution to literature was neither in fiction nor in metaphysics but rather in such histories as Robert Beverley's History and Present State of Virginia (1705) or William Byrd's History of the Dividing Line (1728–29, but not published until 1841). The most important cultural medium in America was not the book but the newspaper. The high cost of printing tended to eliminate all but the most vital news, and local gossip or extended speculative efforts were thus sacrificed so that more important material such as classified advertisements and reports of crop prices could be included. Next to newspapers, almanacs were the most popular literary form in America, Franklin's Poor Richard's being only the most famous among scores of similar projects. Not until 1741 and the first installment of Franklin's General Magazine did literary magazines make their first appearance in America. Most of the 18th-century magazines, however, failed to attract subscribers, and nearly all of them collapsed after only a few years of operation.

 The visual and performing arts, though flourishing somewhat more than literature, were nevertheless slow to achieve real distinction in America. America did produce one good historical painter in Benjamin West and two excellent portrait painters in John Singleton Copley and Gilbert Stuart, but it is significant that all three men passed much of their lives in London, where they received more attention and higher fees.

      The Southern colonies, particularly Charleston, seemed to be more interested in providing good theatre for their residents than did other regions, but in no colony did the theatre approach the excellence of that of Europe. In New England, Puritan influence was an obstacle to the performance of plays, and even in cosmopolitan Philadelphia the Quakers for a long time discouraged the development of the dramatic arts.

  If Americans in the colonial period did not excel in achieving a high level of traditional cultural attainment, they did manage at least to disseminate what culture they had in a manner slightly more equitable than that of most countries of the world. Newspapers and almanacs, though hardly on the same intellectual level as the Encyclopédie produced by the European philosophes, probably had a wider audience than any European cultural medium. The New England colonies, although they did not always manage to keep pace with population growth, pioneered in the field of public education. Outside New England, education remained the preserve of those who could afford to send their children to private schools, although the existence of privately supported but tuition-free charity schools and of relatively inexpensive “academies” made it possible for the children of the American middle class to receive at least some education. The principal institutions of higher learning—Harvard (1636), William and Mary (1693), Yale (1701), Princeton (1747), Pennsylvania (a college since 1755), King's College (1754, now Columbia University), Rhode Island College (1764, now Brown University), Queen's College (1766, now Rutgers University), and Dartmouth (1769)—served the upper class almost exclusively; and most of them had a close relationship with a particular religious point of view (e.g., Harvard was a training ground for Congregational ministers, and Princeton was closely associated with Presbyterianism).

Richard R. Beeman

From a city on a hill to the Great Awakening
      The part played by religion in the shaping of the American mind, while sometimes overstated, remains crucial. Over the first century and a half of colonial life, the strong religious impulses present in the original settlements—particularly those in New England—were somewhat secularized and democratized but kept much of their original power.

      When the Pilgrim Fathers signed the Mayflower Compact in 1620, resolving themselves into a “civil body politic,” they were explicitly making religious fellowship the basis of a political community. But even from the start, there were nonmembers of the Leiden Separatist congregation on the passenger list—the “strangers” among the “saints”—and they sought steady expansion of their rights in Plymouth colony until its absorption into Massachusetts in 1691.

      The Puritans were even more determined that their community be, as John Winthrop called it in his founding sermon, “A Model of Christian Charity,” a “city on a hill,” to which all humankind should look for an example of heaven on earth. This theme, in various guises, resounds in every corner of American history. The traditional image of Massachusetts Puritanism is one of repressive authority, but what is overlooked is the consensus among Winthrop and his followers that they should be bound together by love and shared faith, an expectation that left them “free” to do voluntarily what they all agreed was right. It was a kind of elective theocracy for the insiders.

      The theocratic model, however, did not apply to nonmembers of the church, to whom the franchise was not originally extended, and problems soon arose in maintaining membership. Only those who had undergone a personal experience of “conversion” reassuring them of their salvation could be full members of the church and baptize their children. As the first generation died off, however, many of those children could not themselves personally testify to such conversion and so bring their own offspring into the church. They were finally allowed to do so by the Half-Way Covenant of 1662 but did not enjoy all the rights of full membership. Such apparent theological hair-splitting illustrated the power of the colony's expanding and dispersing population. As congregations hived off to different towns and immigration continued to bring in worshippers of other faiths, the rigidity of Puritan doctrine was forced to bend somewhat before the wind.

      Nevertheless, in the first few years of Massachusetts's history, Puritan disagreements over the proper interpretation of doctrine led to schisms, exilings, and the foundation of new colonies. Only in America could dissenters move into neighbouring “wilderness” and start anew, as they did in Rhode Island and Connecticut. So the American experience encouraged religious diversity from the start. Even the grim practice of punishing dissidents such as the Quakers (and “witches”) fell into disuse by the end of the 17th century.

      Toleration was a slow-growing plant, but circumstances sowed its seeds early in the colonial experience. Maryland's founders, the well-born Catholic Calvert family, extended liberty to their fellow parishioners and other non-Anglicans in the Toleration Act of 1649. Despite the fact that Anglicanism was later established in Maryland, it remained the first locus of American Catholicism, and the first “American” bishop named after the Revolution, John Carroll, was of English stock. Not until the 19th century would significant immigration from Germany, Ireland, Italy, and Poland provide U.S. Catholicism its own “melting pot.” Pennsylvania was not merely a refuge for the oppressed community who shared William Penn's Quaker faith but by design a model “commonwealth” of brotherly love in general. And Georgia was founded by idealistic and religious gentlemen to provide a second chance in the New World for debtors in a setting where both rum and slavery were banned, though neither prohibition lasted long.

      American Protestantism was also diversified by immigration. The arrival of thousands of Germans early in the 18th century brought, especially to western Pennsylvania, islands of German pietism as practiced by Mennonites, Moravians, Schwenkfelders, and others.

      Anabaptists, also freshly arrived from the German states, broadened the foundations of the Baptist church in the new land. French Huguenots fleeing fresh persecutions after 1685 (they had already begun arriving in North America in the 1650s) added a Gallic brand of Calvinism to the patchwork quilt of American faith. Jews arrived in what was then Dutch New Amsterdam in 1654 and were granted asylum by the Dutch West India Company, to the dismay of Gov. Peter Stuyvesant, who gloomily foresaw that it would be a precedent for liberality toward Quakers, Lutherans, and “Papists.” By 1763, synagogues had been established in New York, Philadelphia, Newport (R.I.), Savannah (Ga.), and other seaport cities where small Jewish mercantile communities existed.

      Religious life in the American colonies already had a distinctive stamp in the 1740s. Some of its original zeal had cooled as material prosperity increased and the hardships of the founding era faded in memory. But then came a shake-up.

Bernard A. Weisberger
      A series of religious revivals known collectively as the Great Awakening swept over the colonies in the 1730s and '40s. Its impact was first felt in the middle colonies, where Theodorus J. Frelinghuysen, a minister of the Dutch Reformed Church, began preaching in the 1720s. In New England in the early 1730s, men such as Jonathan Edwards, perhaps the most learned theologian of the 18th century, were responsible for a reawakening of religious fervour. By the late 1740s the movement had extended into the Southern colonies, where itinerant preachers such as Samuel Davies and George Whitefield exerted considerable influence, particularly in the backcountry.

      The Great Awakening represented a reaction against the increasing secularization of society and against the corporate and materialistic nature of the principal churches of American society. By making conversion the initial step on the road to salvation and by opening up the conversion experience to all who recognized their own sinfulness, the ministers of the Great Awakening, some intentionally and others unwittingly, democratized Calvinist theology. The technique of many of the preachers of the Great Awakening was to inspire in their listeners a fear of the consequences of their sinful lives and a respect for the omnipotence of God. This sense of the ferocity of God was often tempered by the implied promise that a rejection of worldliness and a return to faith would result in a return to grace and an avoidance of the horrible punishments of an angry God. There was a certain contradictory quality about these two strains of Great Awakening theology, however. Predestination, one of the principal tenets of the Calvinist theology of most of the ministers of the Great Awakening, was ultimately incompatible with the promise that man could, by a voluntary act of faith, achieve salvation by his own efforts. Furthermore, the call for a return to complete faith and the emphasis on the omnipotence of God were the very antithesis of Enlightenment thought, which called for a greater questioning of faith and a diminishing role for God in the daily affairs of man. On the other hand, Edwards, one of the principal figures of the Great Awakening in America, explicitly drew on the thought of men such as John Locke and Isaac Newton in an attempt to make religion rational.
Perhaps most important, the evangelical styles of religious worship promoted by the Great Awakening helped make the religious doctrines of many of the insurgent church denominations—particularly those of the Baptists and the Methodists—more accessible to a wider cross section of the American population. This expansion in church membership extended to blacks as well as to those of European descent, and the ritual forms of Evangelical Protestantism possessed features that facilitated the syncretism of African and American forms of religious worship.

Colonial America, England, and the wider world
      The American colonies, though in many ways isolated from the countries of Europe, were nevertheless continually subject to diplomatic and military pressures from abroad. In particular, Spain and France were always nearby, waiting to exploit any signs of British weakness in America in order to increase their commercial and territorial designs on the North American mainland. The Great War for the Empire—or the French and Indian War, as it is known to Americans—was but another round in a century of warfare between the major European powers. First in King William's War (1689–97), then in Queen Anne's War (1702–13), and later in King George's War (1744–48; the American phase of the War of the Austrian Succession), Englishmen and Frenchmen had vied for control over the Indians, for possession of the territory lying to the north of the North American colonies, for access to the trade in the Northwest, and for commercial superiority in the West Indies. In most of these encounters, France had been aided by Spain. Because of its own holdings immediately south and west of the British colonies and in the Caribbean, Spain realized that it was in its own interest to join with the French in limiting British expansion. The culmination of these struggles came in 1754 with the Great War for the Empire. Whereas previous contests between Great Britain and France in North America had been mostly provincial affairs, with American colonists doing most of the fighting for the British, the Great War for the Empire saw sizable commitments of British troops to America. The strategy of the British under William Pitt the Elder was to allow their ally, Prussia, to carry the brunt of the fighting in Europe and thus free Britain to concentrate its troops in America.

   Despite the fact that they were outnumbered 15 to 1 by the British colonial population in America, the French were nevertheless well equipped to hold their own. They had a larger military organization in America than did the English; their troops were better trained; and they were more successful than the British in forming military alliances with the Indians. The early engagements of the war went to the French: the surrender of George Washington to a superior French force at Fort Necessity, the annihilation of Gen. Edward Braddock at the Monongahela River, and French victories at Oswego and Fort William Henry all made it seem as if the war would be a short and unsuccessful one for the British. Even as these defeats took place, however, the British were able to increase their supplies of both men and matériel in America. By 1758, with its strength finally up to a satisfactory level, Britain began to implement its larger strategy, which involved sending a combined land and sea force to gain control of the St. Lawrence and a large land force aimed at Fort Ticonderoga to eliminate French control of Lake Champlain. The first expedition against the French at Ticonderoga was a disaster, as Gen. James Abercrombie led about 15,000 British and colonial troops in an attack against the French before his forces were adequately prepared. The British assault on Louisbourg, the key to the St. Lawrence, was more successful. In July 1758 Lord Jeffery Amherst led a naval attack in which his troops landed on the shores from small boats, established beachheads, and then captured the fort at Louisbourg.

 In 1759, after several months of sporadic fighting, the forces of James Wolfe captured Quebec from the French army led by the marquis de Montcalm. This was probably the turning point of the war. By the fall of 1760, the British had taken Montreal, and Britain possessed practical control of all of the North American continent. It took another two years for Britain to defeat its rivals in other parts of the world, but the contest for control of North America had been settled.

      In the Treaty of Paris of 1763, Great Britain took possession of all of Canada, East and West Florida, all territory east of the Mississippi in North America, and St. Vincent, Tobago, and Dominica in the Caribbean. At the time, the British victory seemed one of the greatest in the nation's history. The British Empire in North America had been not only secured but also greatly expanded. But in winning the war Britain had dissolved the empire's most potent material adhesives. Conflicts arose as the needs and interests of the British Empire began to differ from those of the American colonies; and the colonies, now economically powerful, culturally distinct, and steadily becoming more independent politically, would ultimately rebel before submitting to the British plan of empire.

Richard R. Beeman

The Native American response
      The other major players in this struggle for control of North America were, of course, the American Indians. Modern historians no longer see the encounters between Native Americans and Europeans through the old lens in which “discoverers of a New World” find a “wilderness” inhabited by “savages.” Instead they see a story of different cultures interacting, with the better-armed Europeans eventually subduing the local population, but not before each side had borrowed practices and techniques from the other and certainly not according to any uniform plan.

  The English significantly differed from the Spanish and French colonizers in North America. Spain's widespread empire in the Southwest relied on scattered garrisons and missions to keep the Indians under control and “usefully” occupied. The French in Canada dealt with “their” Indians essentially as the gatherers of fur, who could therefore be left in de facto possession of vast forest tracts. English colonies, in what would eventually become their strength, came around to encouraging the immigration of an agricultural population that would require the exclusive use of large land areas to cultivate—which would have to be secured from native possessors.

      English colonial officials began by making land purchases, but such transactions worked to the disadvantage of the Indians, to whom the very concept of group or individual “ownership” of natural resources was alien. After a “sale” was concluded with representatives of Indian peoples (who themselves were not always the “proprietors” of what they signed away), the Indians were surprised to learn that they had relinquished their hunting and fishing rights, and settlers assumed an unqualified sovereignty that Native American culture did not recognize.

      In time, conflict was inevitable. In the early days of settlement, Indian-European cooperation could and did take place, as with, for example, the assistance rendered by Squanto to the settlers of Plymouth colony or the semidiplomatic marriage of Virginia's John Rolfe to Pocahontas, the daughter of Powhatan. The Native Americans taught the newcomers techniques of survival in their new environment and in turn were introduced to and quickly adopted metal utensils, European fabrics, and especially firearms. They were less adept in countering two European advantages—the possession of a common written language and a modern system of exchange—so purchases of Indian lands by colonial officials often turned into thinly disguised landgrabs. William Penn and Roger Williams made particular efforts to deal fairly with the Native Americans, but they were rare exceptions.

 The impact of Indian involvement in the affairs of the colonists was especially evident in the Franco-British struggle over Canada. For furs the French had depended on the Huron people settled around the Great Lakes, but the Iroquois Confederacy, based in western New York and southern Ontario, succeeded in crushing the Hurons and drove Huron allies such as the Susquehannocks and the Delawares southward into Pennsylvania. This action put the British in debt to the Iroquois because it diverted some of the fur trade from French Montreal and Quebec city to British Albany and New York City. European-Indian alliances also affected the way in which Choctaws, influenced by the French in Louisiana, battled with Spanish-supported Apalachees from Florida and with the Cherokees, who were armed by the British in Georgia.

 The French and Indian War not only strengthened the military experience and self-awa