Health and Disease

▪ 2009


Food and Drug Safety.
      In 2008 the contamination of infant formula and related dairy products with melamine in China led to widespread health problems in children, including urinary problems, kidney stones, and possible blockages of the renal tubules. According to the World Health Organization (WHO), by late 2008 the contamination had led to four infant deaths and the hospitalization of more than 14,000 infants. Melamine, a chemical compound with many industrial uses, had no approved use in food, but according to health officials, it was sometimes added to foods illegally to inflate their apparent protein content as measured with standard tests. Following inspections conducted by China's national inspection agency, at least 22 dairy manufacturers across the country were found to have melamine in some of their products. The Chinese government responded to the public health crisis by announcing a major shake-up in the dairy industry to improve safety all along the supply chain for dairy products, and it said that it would establish a tracking system to record their flow and delivery.

      Several countries reported finding melamine in exported Chinese dairy products, including liquid milk and frozen yogurt dessert. All of these products had likely been manufactured with ingredients made from melamine-contaminated milk, according to WHO. Although there had been no reports of illness from contaminated Chinese milk products in the United States, in November the U.S. Food and Drug Administration (FDA) ordered that imported Chinese milk products be held at the border until tests proved that they were not contaminated. Recalls of melamine-tainted products occurred in Australia, Britain, Hong Kong, New Zealand, Singapore, and Thailand. American consumers had become aware of the deadly effects of melamine contamination in 2007 when tainted pet food from China killed more than 4,000 dogs and cats in the U.S.

      Contaminated lots of the blood-thinning drug heparin were blamed for having caused allergic-type reactions—such as a drop in blood pressure and shortness of breath—in hundreds of persons in the United States from late 2007 through early 2008. The adverse reactions were initially linked to heparin marketed by Baxter, which recalled its heparin products. The contaminant was subsequently identified as a heparin-like synthetic substance called oversulfated chondroitin sulfate (OSCS), and the FDA urged all American suppliers of heparin products to use sophisticated screening to determine whether their products were free of the contaminant. In April the agency said that OSCS-contaminated heparin had been found in 11 countries, including the U.S., and that it had been traced to a number of Chinese companies that were involved in heparin manufacture. The FDA tallied at least 81 death reports in the U.S. of persons who had been administered heparin of any kind and had experienced an adverse reaction. Many were patients who had been undergoing surgery or had underlying life-threatening conditions, and the specific cause of death was difficult to determine. A study published in The New England Journal of Medicine in late December showed conclusively that OSCS-contaminated heparin caused adverse reactions in patients but did not establish it as a cause of death.

HIV/AIDS.
      In late 2007 a campaign that had been launched by WHO four years earlier to bring antiretroviral therapy to HIV-positive people in less-developed countries reached its goal of treating three million individuals. Although the milestone was achieved two years after its targeted date, a report on the initiative noted that access to the therapy was expected to continue to improve at a greater pace.

      In the United States, researchers at the CDC reported that the number of Americans newly infected with HIV each year was, and had long been, higher than previously assumed. This conclusion was the result of an improved calculation method that distinguished recent HIV infections from older ones. Using the new method, the CDC estimated that about 56,300 new HIV infections occurred in the United States in 2006, 40% more than the previous estimate of 40,000. In addition, the CDC reported that new diagnoses of HIV infection across 33 states increased by 12% annually between 2001 and 2006 among young gay and bisexual men. The rise was especially pronounced among young black men aged 13 to 24 who had sex with men: new HIV diagnoses in this group increased by 15% annually, compared with annual increases of 9% and 8% among their white and Hispanic peers, respectively.
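The size of the revision can be checked with simple arithmetic (the figures are those in the paragraph above; the sketch below is purely illustrative):

```python
# Back-of-the-envelope check of the CDC's revised HIV-incidence estimate.
old_estimate = 40_000  # previous annual estimate of new U.S. HIV infections
new_estimate = 56_300  # 2006 estimate from the improved calculation method

# Relative increase of the new estimate over the old one.
relative_increase = (new_estimate - old_estimate) / old_estimate
# ~41%, consistent with the CDC's rounded figure of "40% more"
print(f"Increase over previous estimate: {relative_increase:.0%}")
```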

      In another development, scientists found evidence that HIV arose decades earlier than previously believed. According to a study by Michael Worobey, an assistant professor of ecology and evolutionary biology at the University of Arizona in Tucson, and colleagues, HIV began spreading in sub-Saharan Africa between 1884 and 1924, about the time urban centres were established in west-central Africa. Scientists had believed that HIV originated in about 1930. Previous studies had shown that HIV spread to humans from chimpanzees in southeastern Cameroon. Worobey believed that the growth of cities and high-risk behaviours for HIV infection among city dwellers might have been a principal cause of the subsequent spread of the virus.

Polio.
      The Global Polio Eradication Initiative (GPEI) announced in March that Somalia was polio-free once again. Somalia, which had wiped out the disease in 2002, became reinfected in 2005 by poliovirus that originated in Nigeria. The new eradication effort in Somalia had been particularly challenging because of widespread armed conflict, shifting populations, and the lack of a functioning government infrastructure in the war-torn country. The effort involved more than 10,000 Somali volunteers and health workers, who vaccinated more than 1.8 million children under the age of five.

      Although polio remained endemic in only four countries—Afghanistan, India, Nigeria, and Pakistan—through 2008 the GPEI reported confirmed cases of infection by wild poliovirus in 13 other countries, including 7 countries that had reported no cases in 2007. Health officials reported a resurgence of polio in Nigeria's northern states, where more than 20% of children remained unimmunized. From 2003 to 2006, poliovirus of Nigerian origin spread to 20 countries, with outbreaks that reached as far as Indonesia.

Avian Influenza.
      In April Egypt confirmed its 50th human case of bird flu—in a two-year-old boy. Bangladesh confirmed its first human case in May, and two new cases of human infection were reported in Indonesia in December. Out of a total of 139 human cases in Indonesia since 2004—the highest total reported by any country—113 had been fatal.

      A study published in The New England Journal of Medicine reported that scientists had developed a whole-virus bird-flu vaccine, Celvapan, that appeared to be safe and more effective than the bird-flu vaccine then approved for human use. The study, conducted by Baxter, Celvapan's manufacturer, found that 75% of volunteers produced antibodies against the virus after receiving a second dose of the vaccine, compared with only 45% of those who received the approved vaccine. The study's authors said that Celvapan provided protection against several bird-flu virus strains, that it could be produced in less than one-half the time required by traditional methods, and that it did not require an additive to boost the immune response. Baxter was seeking approval of the vaccine for use in Europe and the United States.

Other Infectious Diseases.
      A report published in February found that the incidence rate of multidrug-resistant tuberculosis (MDR-TB) was at its highest ever and that extensively drug-resistant tuberculosis, which was considered virtually untreatable, had been recorded in 45 countries. The report, entitled Anti-tuberculosis Drug Resistance in the World, represented the largest survey to date of the extent of drug resistance in cases of tuberculosis and was based on data collected from 90,000 TB patients in 81 countries from 2002 to 2006. The report also found a link between HIV infection and MDR-TB. Surveys in Latvia and Ukraine found about twice the level of MDR-TB among TB patients with HIV compared with patients who were free of HIV. On the basis of the survey data, WHO estimated that there were about 500,000 new cases of MDR-TB annually—about 5% of the total of 9,000,000 new cases of TB of all types. The highest rate was recorded in Baku, the capital of Azerbaijan, where about one-fourth of all new TB cases were multidrug-resistant. Although new vaccines could potentially prevent TB, the removal of one strain might allow a previously suppressed strain to take hold, according to research from the University of Bristol, Eng. A vaccination program could therefore result in the proliferation of strains more likely to be, or become, drug resistant.
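WHO's headline share can be reproduced from the figures above (a rough check of the reported numbers, not the survey's own estimation method):

```python
# Share of new TB cases that were multidrug-resistant, per WHO's estimates.
mdr_tb_cases = 500_000    # estimated new MDR-TB cases per year
all_tb_cases = 9_000_000  # estimated new TB cases of all types per year

share = mdr_tb_cases / all_tb_cases
print(f"MDR-TB share of new TB cases: {share:.1%}")  # ~5.6%, i.e., "about 5%"
```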

      In February Paraguay declared a public health emergency following an outbreak of yellow fever. In response, an initial one million yellow-fever vaccination doses were supplied by Brazil, Peru, and other Latin American countries, and an additional two million were supplied by UNICEF. Officials reported at least 66 suspected cases of yellow fever. Of those, 15 were confirmed and 7 people died. Of the suspected cases, 26 were located in urban centres close to Asunción. Meanwhile, Brazil also reported cases of yellow fever in six states.

      The Ministry of Health of Guinea-Bissau battled to control a cholera epidemic that began in May and quickly spread across the country. As of November, 14,129 persons in the country had contracted cholera, and 221 had died. In addition to providing $750,000 in aid, UNICEF assisted with the disinfection of Bissau's water system and traditional wells and coordinated hygiene and public health initiatives. Elsewhere on the African continent, Zimbabwe declared a national emergency in early December in the wake of a cholera epidemic that had resulted in more than 560 deaths since August, according to the United Nations. By year's end, health officials in Zimbabwe had reported more than 1,500 deaths from a total of about 26,000 cases of cholera.

      In May China reported a fast-spreading outbreak of hand, foot, and mouth disease (which was unrelated to the foot-and-mouth disease of livestock). The outbreak killed 22 children and sickened about 4,500 others in Anhui province. All of the fatalities were children younger than six years of age, and most were younger than two. The outbreak was caused by a virulent intestinal virus known as enterovirus 71. Symptoms began with a fever and typically included mouth ulcers and blisters on the hands, feet, and buttocks. Most patients recovered in a week without treatment. In severe cases, however, brain swelling led to paralysis or death. There was no vaccine or known cure for the disease.

      As a consequence of using improved computational statistical techniques, WHO reported in September that there were many fewer cases of malaria in the world than had been thought. In its 2008 annual malaria report, WHO said that there were about 250 million cases of malaria in the world annually, with about 880,000 deaths in 2006. Previous official estimates had ranged from 350 million to 500 million cases and more than 1 million deaths. Nevertheless, WHO called for continued aggressive efforts to attack the disease. The number of estimated cases in Africa—where 91% of malaria deaths occurred—had remained relatively unchanged. The number of cases in Asia had been overestimated, however, because they were derived from population and vegetation maps that dated to the 1960s. Since that time, millions of Asians had migrated to cities, and large regions had been deforested, which, according to the report, reduced mosquito habitat. Moreover, countries such as India had grown wealthier and had improved health care and mosquito control.

Cancer.
      Researchers for the first time sequenced the entire genome of a cancer patient—a woman who had acute myelogenous leukemia. In a paper published in November in Nature, the researchers reported that they had found 10 mutations in the woman's cancer-cell DNA compared with the DNA from her normal skin cells and that 8 of the mutations had not been previously linked to her disease. In April the International Cancer Genome Consortium was formed by research organizations from around the world to produce high-quality genomic data on up to 50 types of cancer. Each consortium member planned to conduct a comprehensive high-resolution analysis of the full range of genomic changes in at least one specific type or subtype of cancer, and each analysis was expected to involve specimens from at least 500 patients and to cost an estimated $20 million.

      A study presented in September at the American Society of Clinical Oncology's 2008 Breast Cancer Symposium in Washington, D.C., highlighted progress in making diagnoses. An experimental screening technique known as molecular breast imaging, which used an injected radioactive tracer, detected three times as many breast cancers as conventional scanning techniques in women who had dense breasts and who were at a higher risk of developing the disease.

      In a study published in August in The New England Journal of Medicine, Jane J. Kim and Sue J. Goldie of Harvard University analyzed the cost-effectiveness of vaccination programs for immunizing women against viruses that caused cervical cancer and evaluated the implications of their findings for vaccination guidelines. As a result of the success of clinical trials and subsequent national vaccination programs, within just a few years tens of millions of girls and women had received doses of Gardasil or Cervarix, vaccines that targeted two strains of the human papillomavirus that together caused an estimated 70% of cervical cancers. The authors concluded that the vaccines would be cost-effective if they proved to protect women for a lifetime and if current methods for screening for cervical cancer by means of Pap smears could be safely adjusted to reduce costs. In an accompanying editorial, Charlotte J. Haug, editor of The Journal of the Norwegian Medical Association, observed that many uncertainties remained concerning the vaccines, such as how long the immunity would last and whether eliminating some strains of cancer-causing virus might decrease the body's natural immunity to other strains. She urged that clinical trials and follow-up studies be continued to test unproven assumptions about the two vaccines.

      A study published in October in the Journal of Clinical Oncology found that persons with pancreatic cancer were more likely than those without the disease to have been previously infected with the hepatitis B virus. Lead author James L. Abbruzzese from the M.D. Anderson Cancer Center in Houston noted that although the study had shown an association between hepatitis B and pancreatic cancer, it did not prove a cause and effect. (Hepatitis B was known to cause liver cancer in some patients.) Though uncommon, pancreatic cancer was among the deadliest forms of cancer, and a vaccine existed to prevent hepatitis B.

Cardiovascular Disease.
      The American Academy of Pediatrics in July recommended that some children as young as eight years of age take cholesterol-fighting drugs to ward off potential future heart problems. The academy also recommended low-fat milk for one-year-olds, as well as more cholesterol testing. Stephen Daniels, of the academy's nutrition committee, said that the advice was based on mounting evidence that the cardiovascular damage that leads to heart disease begins early in life. He added that the recommendation for the cholesterol-fighting drugs stemmed from recent research that showed that they were generally safe for children. In general, the drug treatment would be targeted for children at least eight years old who had too much LDL, or “bad,” cholesterol as well as risky conditions such as obesity and high blood pressure. The new recommendation prompted debate among pediatricians, with critics saying that there was no evidence that giving statins to children would prevent heart attacks later in life and that there were no data on the potential side effects of taking the drugs for decades.

      More than 100 heart patients took part in clinical trials to test the effectiveness of the HeartNet Ventricular Support System to stop advanced heart failure. The system used an elastic metallic-alloy mesh that was wrapped around the heart through a minimally invasive implant procedure. According to HeartNet's developer, Paracor Medical, the mesh exerted a mild pressure on the heart and was designed to slow or stop the enlargement of the heart that was associated with heart failure. The HeartNet device was first implanted in 2006.

Alzheimer Disease.
      Research to develop drugs that could cure or halt the progression of Alzheimer disease experienced setbacks during 2008. At the time, available medications treated only the symptoms of the disease, such as memory loss and confusion. During the year, Myriad Genetics announced that Flurizan, a drug that it had developed to treat Alzheimer disease, had failed in a late-stage clinical trial. Flurizan was one of the first drugs to reach late-stage testing that worked by trying to prevent the buildup in the brain of toxic amyloid plaques, which had been thought to cause the disease. Moreover, a study published in July in The Lancet said that a once-promising experimental vaccine called AN1792 had failed to prevent the progression of Alzheimer disease, even though it cleared amyloid plaques in the brain.

      On the positive side, another study published in The Lancet in July reported that an older drug called dimebon significantly helped Alzheimer symptoms. Rachelle S. Doody, at Baylor College of Medicine in Houston, and colleagues studied the effects of dimebon on 183 patients in Russia with mild to moderate Alzheimer disease. The drug was not being marketed and had been previously used in Russia as an antihistamine. Doody's team found that patients on dimebon had a significant increase in cognitive ability, compared with those who received a placebo. Treated patients also showed improvement in thinking abilities, behavioral symptoms, and daily skills.

      At the International Conference on Alzheimer Disease in August, John Ronald, of the University of Western Ontario, and his colleagues reported that they had identified the brain plaques associated with the disease by using magnetic resonance imaging. Previously, Alzheimer disease could be distinguished with certainty from other dementias only by postmortem examination. The imaging advance was expected to make it easier to identify people with the disease and thus start treatment early.

      Public health officials had expressed concern in recent years that some parents, fearful about vaccine safety, were declining to get their children vaccinated, which consequently made the children more likely to catch and spread preventable childhood diseases. In September, however, the CDC reported that in the preceding year record numbers of toddlers in the U.S. had received the vaccinations recommended by the CDC's Advisory Committee on Immunization Practices. According to the CDC, a record 77.4% of children aged 18 months to three years received the full recommended series of vaccinations, and 90% of children got all but one of the six individual vaccines in the series (the exception was the four doses of the vaccine for diphtheria, tetanus, and pertussis [whooping cough]). The report was issued one day before another study was released that concluded there was no link, as had been claimed by critics, between autism and the vaccine for measles, mumps, and rubella.

      The CDC reported significant progress toward the introduction and use of Haemophilus influenzae type b (Hib) vaccine in less-developed countries. The CDC estimated that Hib disease annually caused three million cases of meningitis (swelling of the membranes surrounding the brain and spinal cord) and severe pneumonia and about 386,000 deaths worldwide in children five years of age and younger. Hib vaccines had been widely used in developed countries for almost 20 years but had been relatively unavailable in the world's poorest countries.

      Prostate-cancer specialists reported that the drug finasteride could reduce men's risk of developing the disease by 30%. Finasteride was already used by millions of men to shrink the prostate. As many as 100,000 cases of prostate cancer could be prevented annually by taking the drug, according to Eric Klein of the Cleveland Clinic. The finding arose from an analysis of a large American study of finasteride. Nevertheless, it was debated whether men should take the drug to prevent prostate cancer, which, because it was relatively slow-growing and often not lethal, might or might not prove dangerous in a given individual. On the one hand, doctors said, many men diagnosed with prostate cancer chose to be treated, which could potentially leave them impotent or incontinent. On the other hand, men who considered taking finasteride would need to weigh the risk of unanticipated side effects that might emerge years later, even if they never developed prostate problems.

Other Developments.
      For the first time, doctors performed a human trachea (windpipe) transplant by using tissue that was grown from the recipient's own stem cells—a procedure intended to prevent the immune system from rejecting the new organ. Doctors from four European universities performed the surgery in Barcelona on a 30-year-old woman who suffered from a severely collapsed lung owing to tuberculosis. In preparation for the transplant, a donor's trachea was first stripped of cells that would have been rejected when transplanted. Stem cells from the woman's bone marrow were then used to create cartilage and tissue cells to cover and line the trachea. Details of the procedure were published in The Lancet, which reported that the woman did not require immune-suppressing drugs and was doing well months after the surgery. Although the surgery was considered an important advance in stem-cell technology, scientists said that the ability to grow entire organs with stem cells remained only a far-off possibility.

      A newly passed U.S. law required insurance companies to provide equal coverage for mental and physical illnesses. As a result, more than one-third of all Americans were expected to receive better coverage for mental health treatments. Previously, many insurers had set higher co-payments and deductibles and stricter limits on treatment for addiction and mental illnesses than for physical ailments. The new law would make it easier for people to obtain treatment for a wide range of conditions, including depression, autism, schizophrenia, eating disorders, and alcohol and drug abuse. Federal officials said that the law would improve coverage for 113 million persons, including 82 million in employer-sponsored health plans that were not subject to state regulation. The effective date for most of the plans would be Jan. 1, 2010.

      Taiwanese researchers reported in June that the already high worldwide rate of chronic kidney disease (CKD) was increasing and that because it raised the risk of death, addressing the disease should be a public health priority. The study, which analyzed data from 462,293 persons, found that the 12% of participants who had CKD were 83% more likely to die from any cause and twice as likely to die from cardiovascular causes, compared with those without CKD. About 40% of deaths in the CKD group occurred before age 65. Of the deaths in the entire study group, 10.3% were attributable to CKD, but this figure increased to 17.5% among people with low socioeconomic status. The researchers also found that people who regularly used Chinese herbal medicines had a 20% increased risk of developing CKD. The study was published in The Lancet.

      Rates of childhood obesity, which had been rising for more than two decades, appeared to have hit a plateau in 2006. The finding, based on data gathered from 1999 to 2006 by the CDC, was published in May in the Journal of the American Medical Association. The study said that it was unclear whether the slowdown in childhood weight gain was permanent or the short-term result of public antiobesity efforts such as curbing junk food and increasing physical activity in schools. Even if the trend held, 32% of American schoolchildren remained overweight or obese, doctors noted. The data came from thousands of children who had taken part in the National Health and Nutrition Examination Surveys, which had been compiled by the National Center for Health Statistics at the CDC since the 1960s. The plateau followed years of weight gain among American children. In 1980, 6.5% of children from ages 6 to 11 were obese, but by 1994 that number had climbed to 11.3%. The rate had jumped to 16.3% by 2002 and in 2006 had stabilized at about 17%.

Kevin Davis

▪ 2008

Obesity and diabetes increased to epidemic proportions, and deadly staph infections and counterfeit drugs raised serious concerns. International efforts against several endemic diseases made headway, and new developments were reported in stem-cell research and in finding the genetic basis of certain disorders.

Global Warming.
      Along with shifting climate patterns, scientists were concerned that climate change would contribute to the spread of infectious diseases. In 2007 the Intergovernmental Panel on Climate Change released a report that predicted that global warming could create unprecedented health risks, including deadly heat waves, droughts, rising sea levels, and fierce storms. Flooding and drought could lead to contaminated water supplies, which in turn could result in the spread of infectious waterborne diseases. The report warned of the possible spread of mosquitoborne illnesses such as malaria, dengue fever, yellow fever, and encephalitis. Some mosquitoborne diseases already were spreading beyond their normal ranges as mosquitoes moved to higher altitudes and into areas that were once too cold for them to survive.

Child Mortality.
 In 2007 the estimated worldwide number of deaths of children under five years of age fell below 10 million annually—to 9.7 million—for the first time since record keeping began in 1960. UNICEF attributed the decrease to widespread campaigns against measles and malaria, promotion of breastfeeding over bottle-feeding (which was a potential source of contaminated water), and the economic improvements experienced in many countries (with the exception of countries in Africa). The agency reported that vaccination drives had helped reduce measles deaths and that more babies were sleeping under mosquito nets, which protected against malaria and other mosquitoborne illnesses. In 1960 about 20 million children died before the age of five, and the increase in population since then underlined the significance of the reduction in childhood deaths. If young children were still dying at 1960 rates, 25 million would have succumbed in 2007. According to UNICEF, the most rapid improvements had been seen in Latin America and the Caribbean, Central and Eastern Europe, eastern Asia, and Oceania. The situation worsened, however, in countries of southern Africa that had been hit hard by AIDS and in countries such as the Democratic Republic of the Congo and Sierra Leone, which had been ravaged by war. The highest rates of child mortality were reported in western and central Africa, where more than 150 of every 1,000 children born were expected to die before the age of five.
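UNICEF's counterfactual implies a simple figure for deaths averted (an illustrative calculation from the paragraph's numbers; the agency's actual demographic modeling is more involved):

```python
# Under-five deaths averted in 2007 relative to 1960 child-mortality rates.
actual_deaths_2007 = 9.7     # millions, reported under-five deaths in 2007
deaths_at_1960_rates = 25.0  # millions, projected had 1960 rates persisted

averted = deaths_at_1960_rates - actual_deaths_2007
print(f"Deaths averted in 2007: about {averted:.1f} million")
```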

HIV/AIDS.
      A report published by the World Health Organization (WHO), the Joint UN Programme on HIV/AIDS, and UNICEF said that the number of persons in low- and middle-income countries who were receiving antiretroviral treatment had increased to 2 million in 2007, from 1.3 million a year earlier. Although the news was promising, the figure was still far short of the 3 million people WHO had hoped would have access to such drugs by the end of 2005, and it corresponded to only 28% of the 7.1 million people with advanced AIDS in low- and middle-income countries. In sub-Saharan Africa the rate of infection continued to be high, and for every person in this region who received anti-HIV drugs, another five were newly infected. According to a report released in June by the Global HIV Prevention Working Group, if current trends continued, the region (which had about 25 million infected persons) would face 36 million new infections by 2015.

      While access to existing HIV/AIDS drugs slowly improved, researchers also developed new ones. Two new drugs, each representing a new class of anti-HIV medication, were reported to be safe and effective and would add to the four classes already available to HIV/AIDS patients. One of the drugs, maraviroc, worked by blocking a protein used by HIV to enter human immune-system cells. Maraviroc was developed by Pfizer and received approval from the U.S. Food and Drug Administration in August. The drug would be used to treat people with advanced HIV/AIDS who had not responded to other drugs. The other drug, raltegravir (formerly known as MK-0518), was developed by Merck and received FDA approval in October. The drug worked by blocking the HIV enzyme integrase, one of three enzymes that HIV needed in order to replicate in the body. According to the company, the integrase inhibitor would prevent HIV from inserting its genetic material into the DNA of human cells. The company said that the drug was safe and effective for patients who had multidrug-resistant HIV.

Staph Infection.
 A study led by R. Monina Klevens of the Centers for Disease Control and Prevention (CDC) and published in October in the Journal of the American Medical Association sparked concern about the prevalence of serious infections caused by methicillin-resistant Staphylococcus aureus, or MRSA, a type of staph bacteria that was resistant not only to the antibiotic methicillin but also to other antibiotics. Most MRSA infections occurred in hospitals and other health care settings, and they could be invasive and potentially deadly when a wound or medical procedure provided a port of entry for the bacteria into the body. Noninvasive MRSA infections typically occurred as mild, treatable infections of the skin. The study found that invasive MRSA infections were more common, both in and out of hospitals, than health experts had thought; the study estimated that in the U.S. in 2005 there had been about 94,000 cases of invasive MRSA, which had resulted in about 19,000 deaths. Another study by CDC researchers indicated that staph infections—mainly minor skin and soft-tissue infections—were responsible for an estimated 12 million outpatient visits annually in the U.S. and that the percentage of staph infections caused by MRSA was growing.

      The increased awareness of MRSA that resulted from these studies, together with the reported deaths of a number of school-age children who had contracted MRSA, helped spark a health scare during which many U.S. schools closed for disinfection. Health officials stressed that good hygiene, such as washing hands with soap and water, not sharing towels and other personal items, and keeping cuts and other wounds bandaged, greatly reduced the risk of infection by the bacteria.

Polio.
      Although new cases of polio appeared in 2007 in Chad and Myanmar (Burma), which had been free of cases the year before, the total number of new cases of polio around the world declined significantly as the work of the Global Polio Eradication Initiative, begun in 1988, continued. As of late December, 1,083 cases had been reported, compared with 1,997 cases for 2006. More than one-third of the 17 countries that had reported cases of polio in 2006, including Kenya and Indonesia, did not report new cases in 2007. A campaign was carried out to provide polio vaccinations in some of the world's most troubled and dangerous regions. In Afghanistan nearly 7.3 million children were vaccinated in April. In September health workers in Iraq began an effort to vaccinate 4.8 million children, and in the northern area of The Sudan a vaccination effort that was to reach about 5 million children was begun in August.

Avian Influenza.
      The H5N1 strain of avian influenza (bird flu) had infected poultry throughout much of Southeast Asia, central Asia, Africa, and Europe. Millions of birds had been destroyed in an effort to stop its spread. The disease could be transmitted to humans in close contact with infected birds, and since 2005 more than 100 persons had died worldwide from H5N1 infection. The virus did not have the ability to be readily transmitted between humans, but health officials were concerned that it could acquire such an ability and—because humans had no immunity to bird flu—cause a pandemic that could claim millions of lives.

      In January WHO reported that H5N1 viruses with resistance to the antiviral drug oseltamivir had been isolated from two family members in Egypt. WHO called the development potentially dangerous because oseltamivir, commonly sold under the name Tamiflu, was the chief weapon against H5N1. The resistant viruses did not spread to anyone else.

      FDA officials announced in April the approval of the first bird-flu vaccine for humans, although the vaccine had to be given in a high dose and was only about 50% effective in clinical trials. Despite the vaccine's limitations, the U.S. government planned to buy several million doses as part of the country's strategic national stockpile of medicine, which was maintained by the CDC.

Other Infectious Diseases.
 In March WHO reported that the worldwide incidence of tuberculosis (TB) had leveled off for the first time since 1993, when the organization had declared a tuberculosis emergency. According to WHO, the percentage of the world's population with TB peaked in 2004, and the total number of cases in 2005 (the latest year for which statistics were available) was 8.79 million, up about 70,000 from 2004. At the same time, WHO officials expressed concern that the spread of drug-resistant TB strains could reverse the progress made against the disease. In South Africa the AIDS epidemic had led to an increase in TB cases, including cases of drug-resistant strains. Of the 343,000 cases reported there in 2006, an estimated 6,000 were multidrug resistant, and in one outbreak that year, extensively drug-resistant tuberculosis (XDR-TB), which responded neither to the best first-line nor to second-line tuberculosis drugs, killed 52 of 53 patients, all of whom were infected with HIV.

      WHO reported in August that dengue fever was spreading across Southeast Asia and warned that the region might face the worst outbreak of the disease in about a decade. The mosquitoborne disease infected about 25,000 people in Cambodia and killed nearly 300 children under the age of 15; WHO reported that the caseload was three times the total for all of 2005. Dengue fever, a severe flulike illness, affected infants, young children, and adults. It seldom caused death, though dengue hemorrhagic fever was a potentially deadly complication.

      In September, after an absence of two years, the Ebola virus reappeared in the Democratic Republic of the Congo. The highly contagious disease, caused by one of the world's deadliest pathogens, killed 50–90% of those it infected. WHO reported on October 3 that 25 of 76 suspected cases had been confirmed. WHO later reported an Ebola outbreak in western Uganda, where by December 7 there were 93 suspected cases, including 22 fatalities, 4 of them health care workers. A new species of Ebola virus was identified in 9 of the cases.

      Researchers reported in the British medical journal The Lancet that there was a rapidly growing epidemic of syphilis in China, where that sexually transmitted disease had been almost eliminated from 1960 to 1980. The researchers found that the incidence of syphilis increased from under 0.2 to 6.5 cases per 100,000 persons in the period 1993–99 and that congenital syphilis increased from 0.01 to 19.68 cases per 100,000 live births in the period 1991–2005. A coauthor of the report, Myron S. Cohen, director of the Center for Infectious Diseases at the University of North Carolina at Chapel Hill School of Medicine, said that the data demonstrated “a syphilis epidemic of such scope and magnitude that it will require terrific effort to intervene.”

      In early 2007 WHO and UNICEF announced that the number of deaths from measles had been reduced 60% worldwide from 1999 to 2005, when there were an estimated 345,000 measles deaths. The greatest success was in Africa, where measles deaths fell by 75%. During the period 1999–2005, global measles-immunization coverage with the first routine dose increased from 71% to 77%, and more than 360 million children from 9 months to 15 years old had received measles vaccine through immunization campaigns.

      In what U.S. health officials called promising news, a report by the CDC and major U.S. cancer organizations released in October found that national cancer death rates had fallen by 2.1% each year from 2002 through 2004. The drop was about double the 1.1% annual decline from 1993 through 2002. There was a decline in the death rates of most of the top 15 cancers, including lung, prostate, and colorectal cancers in men and colorectal and breast cancer in women. The death rate from lung cancer among women continued to increase but at a slower rate. Incidence rates for all cancers decreased slightly from 1992 through 2004 after having increased between 1975 and 1992.

      Controversy surrounded the use of a vaccine, Gardasil, that helped prevent cervical cancer, the second most common cancer in women worldwide. Gardasil was approved by the FDA in 2006 and protected against four types of sexually transmitted human papillomavirus (HPV), including two types that had been identified as the cause of most cases of cervical cancer. The CDC's Advisory Committee on Immunization Practices recommended the use of Gardasil for girls aged 11 and 12 and for females aged 13 to 26 who had not yet been immunized. In the U.S., lawmakers in many states debated whether to require girls who entered sixth grade to be vaccinated with Gardasil. At issue was whether it was ethical or cost-effective to mandate a vaccine for a disease that was transmitted sexually. In September the European Union approved the sale of Cervarix, another vaccine against certain types of HPV. The approval allowed doctors in EU countries to prescribe the vaccine to females aged 10 to 25. An FDA decision whether to approve Cervarix was expected in 2008.

Cardiovascular Disease.
      Researchers found that South Asians typically suffered heart attacks nearly 6 years earlier than their counterparts from other regions and that they typically died from cardiovascular disease 5 to 10 years earlier. The difference was attributed to a higher prevalence of risk factors among South Asians, including smoking, a history of diabetes, hypertension, depression, and stress. The study, conducted by the Government Medical College in Nagpur, India, and published in the Journal of the American Medical Association, determined that the average age for a first heart attack in South Asian countries was 53, compared with 58.8 in other countries. The researchers also noted that South Asians were less likely to adopt lifestyle habits that helped protect against heart attack, including daily consumption of fruits and vegetables and leisure-time physical activity.

      Researchers from the Institute for Medical Informatics, Biometry and Epidemiology at the University of Duisburg-Essen, Ger., found a relationship between living close to heavy traffic—such as near a heavily traveled street or highway—and atherosclerosis, or hardening of the arteries, which was a risk factor for heart attack or stroke. The researchers attributed the effect to vehicular air pollution in the form of particulates, which had previously been linked to heart attacks and strokes.

      A number of experimental technologies to treat cardiovascular problems were introduced in 2007. They included an implantable device that stimulated the body's cardiovascular regulatory system to control high blood pressure, a computer that automated balloon inflation during angioplasty, and a microcapsule that could be tracked with X-rays to simplify the delivery of stem cells to tissues that needed new blood vessels. The devices were introduced at the American College of Cardiology's Innovation in Intervention: i2 Summit 2007 in March.

Obesity.
      In the United States, where more than one in three children and adolescents was overweight or obese, the Robert Wood Johnson Foundation, an American philanthropic organization, launched an unprecedented effort to reverse the childhood obesity epidemic. It announced plans to spend $500 million during the next five years on public-health efforts to curb childhood obesity. The money would fund research and programs to improve nutrition and physical activity in schools and would help provide better access to healthy foods in poor and underserved communities.

      Growing numbers of persons in parts of the world where obesity was once rare were also gaining excessive weight. According to WHO—which considered global obesity an epidemic—on the basis of 2005 data, the most recent available, about 1.6 billion adults worldwide were overweight and 400 million were obese. WHO had also found that obesity rates in Europe had tripled in recent years. An American epidemiologist reported in September that in China the obesity rate for men and women had jumped from less than 1% in 1990 to more than 20% and that for Mexican women obesity rates reached 71%. In October French researchers reported that worldwide, 40% of men and 30% of women were overweight and that 24% of men and 27% of women were obese. The causes of obesity followed similar patterns in all countries—diets rich in sweeteners and saturated fats, lack of exercise, and the availability of inexpensive processed foods.

Vaccines.
      Malaria remained the greatest threat to children in Africa, particularly in the sub-Saharan region of the continent, and in 2007 at least nine vaccines were in development. One vaccine, being developed by GlaxoSmithKline and tentatively named Mosquirix, reduced the risk of malaria infection by 65% after a course of three shots and was shown to offer protection to infants under one year of age.

      Health officials believed that a new meningitis vaccine that was being used for West African children would make it possible to eliminate the meningococcal epidemics that had afflicted the continent for more than 100 years. The Meningitis Vaccine Project (MVP) reported in June that preliminary results of a phase-two vaccine trial showed that the vaccine was safe and could slash the incidence of epidemics in the “meningitis belt,” which extended through 21 countries of sub-Saharan Africa. MVP (a partnership between WHO and Program for Appropriate Technology in Health, a U.S.-based nonprofit organization) was working with an Indian firm to produce the new vaccine against serogroup A Neisseria meningitidis (meningococcus). The vaccine was expected to block infection and extend protection to the entire population, including those who had not been vaccinated.

      Researchers who were trying to develop a vaccine to treat Alzheimer disease had hit several roadblocks in recent years but now believed that they were moving forward. In a study with mice, American scientists found that a transdermal, or skin-patch, vaccine was safe and effective in clearing away the brain plaques that were associated with the disease. The vaccine worked by stimulating the immune system to act against beta-amyloid, the protein that accumulated in the plaques. The results of the study indicated that the side effects that had plagued an earlier human vaccine could potentially be eliminated. In that earlier clinical trial, a small percentage of study participants had developed brain inflammation as an autoimmune response and died.

      An inexpensive antimalarial pill, developed through a multinational collaboration of universities and pharmaceutical companies, was introduced in March. The medicine, called ASAQ, was to cost less than $1 for adults and less than 50 cents for children. The medicine combined amodiaquine with artemisinin, which was derived from sweet wormwood. Doctors believed that combining the two antimalarial drugs would reduce the possibility that resistance to either drug would develop.

 The sale and distribution of counterfeit drugs reached crisis proportions in Asia in 2007, and experts reported that the problem was growing worldwide. Counterfeiters appeared to target antimalarial medications—artemisinin, in particular—though fake antibiotics and other counterfeit drugs were also reported. In some cases fake antimalarial drugs contained inert substances such as chalk or starch, but in other cases they contained potentially dangerous drugs. WHO estimated that the number of avoidable malaria deaths that resulted from the inadvertent use of counterfeit drugs ranged from tens of thousands to more than 200,000 every year. In China, which was believed to be the source of most of the world's fake drugs, the former chief of China's food and drug administration was executed in July for having taken $850,000 from pharmaceutical companies and having approved fake drugs.

Stem Cells.
      Stem-cell research took a promising new turn in 2007 when two separate research teams, one based in Japan and the other in the United States, reported that they had been able to turn human skin cells into cells similar to embryonic stem cells. The development could have far-reaching implications, because the process of acquiring embryonic stem cells involved destroying embryos and had therefore been at the centre of a long-standing controversy. Supporters of embryonic stem-cell research argued that the potential to cure disorders such as diabetes and Alzheimer disease made the research worthwhile, whereas opponents considered the destruction of embryos to be unethical. In the new research, both groups of scientists added four master regulatory genes to the skin cells. (The two groups' sets of four genes overlapped in only two.) The genes reprogrammed the skin cells to have the characteristics of pluripotent stem cells. Such cells had the potential to develop into the more than 200 types of human cells that constituted the body's tissues and organs. The induced pluripotent stem cells required further study and evaluation, but the researchers said that they would be in a position to create patient- and disease-specific stem cells without using human eggs or embryos. Such cells could help scientists understand disease mechanisms and aid in the search for safe and effective drugs.

Genome Research.
      Ongoing research into the human genome was helping to pinpoint the causes of various diseases and eventually could lead to new drugs and treatments. The findings were part of a continuing wave of discoveries made by means of DNA microarrays, or chips, which could quickly read the sequence of human DNA at up to 500,000 points across an individual's genome. In an approach called whole-genome association, scientists were using the technology to compare the genomes of large numbers of patients with those of healthy individuals to identify differences that might be associated with disease.
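The whole-genome association approach described above can be sketched as a per-variant case-control comparison: for each site read by the microarray, allele counts in patients are tested against those in healthy controls, and sites with large frequency differences are flagged. The following minimal illustration uses a plain chi-square test on a 2×2 allele-count table; the SNP identifiers, counts, and significance threshold are all invented for the example, and real studies apply far stricter multiple-testing corrections.

```python
# Sketch of a whole-genome association scan: flag variant sites whose
# allele frequencies differ between patients ("cases") and healthy
# controls. Counts and SNP ids below are hypothetical.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    denom = (a + b) * (c + d) * (a + c) * (b + d)
    if denom == 0:
        return 0.0  # a marginal is empty; no evidence either way
    return n * (a * d - b * c) ** 2 / denom

def associated_snps(case_counts, control_counts, threshold=10.83):
    """Return ids of SNPs whose case/control allele-frequency difference
    exceeds the chi-square threshold (10.83 corresponds to p < 0.001)."""
    hits = []
    for snp_id, (case_alt, case_ref) in case_counts.items():
        ctrl_alt, ctrl_ref = control_counts[snp_id]
        stat = chi_square_2x2(case_alt, case_ref, ctrl_alt, ctrl_ref)
        if stat > threshold:
            hits.append(snp_id)
    return hits

# Toy data: (alt-allele count, ref-allele count) per SNP.
cases = {"rs0001": (180, 20), "rs0002": (100, 100)}
controls = {"rs0001": (100, 100), "rs0002": (95, 105)}
print(associated_snps(cases, controls))  # → ['rs0001']
```

In this toy run only rs0001 is flagged: its alternate allele appears in 90% of case chromosomes versus 50% of controls, while rs0002's frequencies are nearly identical in the two groups.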

      In June scientists in Britain reported that with whole-genome association they had detected DNA variations that underlay seven common diseases. Their work revealed the genetics of bipolar disorder, coronary artery disease, Crohn disease, hypertension, rheumatoid arthritis, and type 1 and type 2 diabetes. Similarly, researchers in Iceland and Sweden discovered the genetic basis for a major type of glaucoma, a leading cause of blindness, and two independent research teams in Germany and Iceland identified three variant sites on the human genome that predisposed people to restless legs syndrome—a condition characterized by an urge to move the legs, typically when at rest. In addition, researchers in May reported finding six variant sites on the genome that increased the risk of breast cancer. The discovery added to already-identified genes and accounted for most of the overall genetic risk of breast cancer.

Other Developments.
      In 2007 a variety of products manufactured in China were found to be tainted or were recalled because of health and safety concerns. The incidents raised questions about product safety regulations and enforcement in China. In May diethylene glycol, a poisonous industrial solvent that was sometimes used in antifreeze, was found in Chinese-made toothpaste in Panama, and officials in several other countries also discovered and seized Chinese-made toothpaste that contained the chemical. No deaths were reported, unlike the previous year, when at least 50 persons in Panama had died from cold medicine contaminated with diethylene glycol from China that had been mislabeled as glycerin. Chinese regulators claimed that diethylene glycol in small amounts was safe and that the toothpaste was meant to be spit out, not swallowed. In July, however, China prohibited manufacturers from using the chemical in toothpaste, and in October the Chinese government said that it had arrested 774 people during a two-month period as part of a crackdown on the production and sale of tainted food, drugs, and agricultural products.

      The recalls of Chinese products included millions of toys that were decorated with paint containing lead—a metal that is toxic when ingested. In response, China signed an agreement to prohibit the use of lead paint on toys exported to the United States. The agreement was announced in September at the second U.S.-China meeting on consumer product safety. The Chinese government also vowed to step up safety efforts by increasing inspections of goods headed for export and by investigating companies suspected of violating the law. After one of the largest pet-food recalls in U.S. history, China gave U.S. regulators permission to enter the country to investigate whether suppliers had exported contaminated pet-food ingredients to the United States. Previously, FDA representatives had been blocked from entering China. FDA officials said that there was evidence that tainted pet food from China had killed at least 16 cats and dogs in the U.S. and sickened thousands of other animals. They believed that the contamination originated with Chinese exporters of wheat gluten and other animal-feed ingredients.

Kevin Davis

▪ 2007

Well over one million people in less-developed countries were on HIV/AIDS antiretroviral drug therapy, and routine HIV testing was recommended in the U.S. In an unusual case, bird flu spread person to person in Indonesia. A study suggested that most lung cancers could be caught early and cured.

      June 5, 2006, marked the 25th anniversary of the first published report of an unknown deadly infectious disease that had sickened five previously healthy young men in Los Angeles. The disease—acquired immunodeficiency syndrome (AIDS), which is caused by the human immunodeficiency virus (HIV)—soon grew into a global pandemic, and in the quarter century since that report, HIV had infected 65 million people worldwide and killed 25 million. Nevertheless, stunning progress had been made in understanding, preventing, and treating HIV infections. A study published in November 2006 indicated that a person diagnosed with HIV in the United States could expect to live an average of 24 years with treatment. The pandemic continued to wreak havoc, however, particularly in poor countries. Indeed, the vast majority of the 4.3 million new HIV infections and the 2.9 million AIDS deaths in 2006 were in people living in less-developed countries.

      An end-of-the-year report by the Joint United Nations Programme on HIV/AIDS ( UNAIDS) and the World Health Organization (WHO) painted a detailed picture of the epidemic. The number of people who had been infected with HIV was growing in every region of the world. The most dramatic increases were seen in Eastern Europe and Central Asia, where the numbers of newly infected people in 2006 were almost 70% higher than in 2004. Of the 37.2 million HIV-infected adults, just under one-half were women. In sub-Saharan Africa, which continued to be the region most devastated by HIV/AIDS (with about 63% of the world's HIV-infected population), the ratio of infected women to men was 14 to 10.

      The UNAIDS/WHO report described “a global revolution in the delivery of complex therapy in resource-limited settings,” although only 24% of people infected with HIV and in need of treatment had access to it. From 2001 through 2005, the number of people on antiretroviral drugs in low- and middle-income countries increased from 240,000 to 1,300,000, and the number of health care sites that provided antiretroviral drugs grew from 500 to more than 5,000. Expanded access to treatment was estimated to have averted 250,000–350,000 AIDS deaths between 2003 and 2005. Universal access to treatment remained an important goal, but many public health leaders warned that treatment without prevention could never be sustained.

      Among the most promising prevention technologies on the horizon were microbicides that could be applied inside the vagina or rectum to prevent sexual transmission of HIV. Five products had passed safety tests and were in large-scale clinical trials to evaluate their effectiveness, and many others were in earlier-stage trials or under development. Having a prevention method that women could use without a partner's participation had become a high priority. A keynote speaker at a conference on microbicides in Cape Town in April explained, “Asking women to simply abstain, be faithful, or use condoms is not practical. Nor is it enough—especially when UNAIDS reports that 75% of new infections are acquired from a spouse or regular partner…. Marriage, or being in what a woman thinks is a monogamous, faithful relationship, is sadly one of the biggest HIV risk factors for many young African women.”

      Another prevention approach that was being explored was male circumcision. Numerous studies had shown an inverse correlation between rates of male circumcision and rates of HIV infection. In West Africa, where circumcision was common, the prevalence of sexually transmitted HIV infection was low. In southern Africa, where circumcision was not common, the reverse was true. In India uncircumcised men had a sevenfold higher incidence of HIV infection than circumcised men. Biology largely accounted for these differences—the tissue of the internal foreskin contains cells that are specific targets for HIV, and removal of the foreskin substantially lowers men's susceptibility. A clinical trial in South Africa that involved more than 3,000 male volunteers began in 2004 and was stopped in 2005 when it became clear that circumcision reduced sexual transmission of HIV from women to men by 60%. Using data from that trial, an international team of scientists estimated that in sub-Saharan Africa male circumcision could prevent six million new infections and save three million lives over 20 years.

      The U.S. Centers for Disease Control and Prevention (CDC) issued significantly revised HIV-testing recommendations that took effect in September. Specifically, the health agency recommended HIV testing in the United States for everyone aged 13–64 as a part of routine health care. It also specified that prevention counseling and written consent at the time of a test should no longer be required; surveys suggested that for many people those previous requirements were deterrents to getting tested. CDC officials believed that the new recommendations would, among other goals, reduce the stigma associated with HIV testing and enable people who learned that they were infected to take steps to prevent their infecting others.

      The Global Polio Eradication Initiative, begun in 1988 (when about 350,000 people in 125 countries had the crippling viral disease), did not meet its revised goal of ridding the world of polio by the end of 2005. Nonetheless, in October 2005 an advisory committee reaffirmed the feasibility of eradication “in the near future.” In four countries—Nigeria, India, Afghanistan, and Pakistan—the chain of polio transmission had yet to be entirely broken. Nigeria recorded 1,062 new cases of polio through mid-November 2006, compared with a total of 830 in 2005, and India had 624 new cases through late November 2006, compared with a total of 66 in 2005. An outbreak in western Uttar Pradesh state spread to many previously polio-free areas within India and to four formerly polio-free countries: Bangladesh, Nepal, Angola, and Namibia. Afghanistan had been on the verge of eradication in 2005, when 9 polio cases were recorded; in 2006, the number exceeded 30. Vaccination efforts had been compromised in Afghanistan's violence-ridden south. At the end of the year, WHO and UNICEF appealed to both government and anti-government forces to agree upon “Days of Tranquility” so that polio vaccinators could safely reach every child.

      The mosquitoborne tropical disease malaria continued to cause enormous human suffering in many parts of the world. Each year hundreds of millions of people suffered from malaria's fever, chills, and flulike symptoms, and more than one million died; children in sub-Saharan Africa were by far the most vulnerable. One of the most effective means of controlling malaria was the use of the insecticide dichlorodiphenyltrichloroethane (DDT) in indoor residual spraying (the spraying of the walls and ceilings of houses with the insecticide, which in residual amounts continues to kill mosquitoes that land on the sprayed surfaces up to several months later). Concerns that DDT endangered wildlife, the environment, and human health (concerns that stemmed mainly from the chemical's once widespread use in agriculture) had led to the banning of DDT in many countries, including the U.S. in 1972. WHO, which had long supported the ban on DDT, reversed its position in 2006 and recommended the use of the chemical as a principal tool in the ongoing war against malaria. WHO had found that DDT presented no health risk when used properly and that in epidemic areas where DDT had been reintroduced, it had reduced malaria transmission by up to 90%. In its 2007 budget the U.S. Agency for International Development allotted $20 million to support indoor spraying with DDT. “DDT specifically has an advantage over other insecticides when long persistence is needed on porous surfaces, such as unpainted mud walls, which are found in many African communities, particularly in rural or semi-urban areas,” the agency pointed out.

      In the late 1990s cases of multidrug-resistant tuberculosis (MDR-TB) that was resistant to the first-line drugs isoniazid and rifampicin emerged in much of the world. Such cases required treatment with second-line drugs, which were more costly, more likely to cause adverse effects, and generally less effective than the first-line medications. By 2006, according to a CDC/WHO survey, 20% of TB isolates from 48 countries were MDR-TB. In March 2006 the CDC published the first comprehensive data on “extensively drug-resistant TB” (XDR-TB)—cases that were resistant to at least two first-line drugs and three or more of the six classes of second-line drugs.

      Hospitals in two South African provinces reported more than 70 deaths from XDR-TB between January 2005 and October 2006. The majority of the cases were in persons with AIDS. An infectious disease specialist working at one of the hospitals called the XDR-TB/AIDS problem “a potential time bomb.” Although TB was at an all-time low in the United States, San Francisco, which had the country's highest rate, was seeing virtually untreatable XDR-TB cases. Some patients who had not responded to any tuberculosis drugs had to undergo surgery to remove part of a lung. A TB expert in the city noted, “It's really turned back the time to [the] pre-antibiotic era.”

Avian Influenza.
      By the end of 2006, millions of birds across much of the globe had died or been destroyed as a result of outbreaks of avian influenza (bird flu) caused by the lethal strain of influenza A known as H5N1. Although H5N1 remained mainly a bird virus, the cumulative number of laboratory-confirmed human cases since late 2003, when the virus began spreading across Asia, was 263, about 60% of which had been fatal. (See Map.) Each new human case increased concerns that the virus might be gaining the ability to spread among people—a dreaded development with the potential to set off a global pandemic.

       Vietnam, the country in which bird flu had taken the greatest human toll through 2005—93 cases, 42 deaths—reported neither human cases nor poultry outbreaks in 2006, which public health officials viewed as evidence that aggressive measures, such as killing infected flocks, inoculating healthy ones, and educating farmers about protecting themselves, had worked. In Indonesia, however, bird flu devastated poultry in 2006 and infected 55 people, of whom 45 died. In May WHO investigators focused on a cluster of cases among members of an extended family on the island of Sumatra. The initial case was a woman who kept chickens at home, and although no viral samples were taken from the chickens or the woman, investigators concluded that she had likely contracted the H5N1 virus from the chickens. Seven additional family members became infected, and only one of them survived. Nevertheless, WHO investigators did not believe this instance of person-to-person infection was cause for excessive alarm, because despite multiple opportunities for the virus to have spread to more family members and to health care workers, it had not.

      Researchers in Wisconsin and The Netherlands discovered why the H5N1 virus was not spreading easily among people. They found that unlike seasonal flu viruses, which lodge in the upper respiratory tract and are spread by coughs and sneezes, the H5N1 virus attaches itself to lung cells deep in the respiratory tract, from which viral particles cannot readily be expelled. British scientists studying the H5N1 virus in Vietnam found that once the virus is in the respiratory tract it reproduces rapidly and causes patients to drown in the fluid produced in their own lungs. The scientists also determined that treatment with antiviral drugs within the first 48 hours of infection has the potential to suppress the virus and that drugs given any later are unlikely to prevent a patient's rapid decline to death.

Foodborne Illness.
      In the U.S. during September and October, about 200 people scattered over 26 states became ill after eating spinach that was contaminated with the O157:H7 strain of Escherichia coli. One-half of the patients became sick enough to be hospitalized. The typical symptoms—severe bloody diarrhea and abdominal cramps—developed within three to four days of eating the contaminated spinach. About 16% of the patients developed hemolytic uremic syndrome, a type of kidney failure that required treatment with blood transfusions and dialysis. Three people (two elderly women and a two-year-old child) died. The outbreak was controlled through a nationwide ban on spinach and a recall of spinach products grown in three central California counties. Ultimately, field investigators in California found a strain of E. coli in cattle feces that was identical to the bacterium in the tainted spinach, but the precise method of contamination was unknown.

      In November and December another E. coli outbreak sickened about 70 persons who had eaten contaminated food at Taco Bell restaurants, mainly in four northeastern states. No fatalities were recorded, but about three-quarters of those who became ill were hospitalized. The O157:H7 strain was again responsible, and the contaminated food—originally thought to have been green onions—was later believed to have been lettuce.

Vaccines.
      In June the U.S. Food and Drug Administration (FDA) licensed the vaccine Gardasil against four types of human papillomavirus (HPV)—6, 11, 16, and 18. The vaccine, developed with the help of research by Australian immunologist Ian Frazer (see Biographies), was expected to have a substantial impact on the health of women worldwide. HPV types 16 and 18 were responsible for 70% of cervical cancers and types 6 and 11 for 90% of sexually transmitted genital warts. Cervical cancer was the second most common cancer in women worldwide, with about 500,000 new cases and more than 200,000 deaths occurring each year. Gardasil was approved for use by girls and women aged 9 to 26. Three injections—ideally given to 11- and 12-year-olds over a period of six months—were recommended. In clinical trials the vaccine was almost 100% effective in preventing precancerous cervical lesions. Another HPV vaccine, Cervarix, was being reviewed for approval in the European Union.

      In May the FDA licensed the first vaccine against shingles, an often-painful nerve-cell infection characterized by a blistering rash. Shingles is caused by reactivation of the varicella-zoster virus, the same virus that causes chickenpox; anyone who has had chickenpox is at risk for shingles. The vaccine, Zostavax, which was meant for people aged 60 and older, was a stronger version of the pediatric chickenpox vaccine and could lessen the likelihood of an outbreak or reduce the severity of one if it occurred.

Pharmaceuticals and Medical Devices.
      “No American should have to cut pills in half, decide between taking medicine and putting food on the table, or go without medicines altogether,” said Wal-Mart CEO Lee Scott about the groundbreaking program his company, the largest retailer in the United States, launched in September. First in Florida and subsequently in all states but North Dakota, the company offered more than 300 generic versions of prescription drugs at a cost of $4 for a month's supply. Shortly after Wal-Mart launched its program, Target, the sixth largest U.S. retailer, also offered $4 generic drugs at about 1,200 stores in 46 states.

      Severely depressed people who had not responded to at least two antidepressant medications benefited from a single low-dose injection of ketamine, a drug that was developed in the early 1960s and first used in the Vietnam War as a battlefield anesthetic. (It had also been used as a recreational drug that produced hallucinations and out-of-body experiences.) Because existing antidepressants took four to eight weeks to relieve depression, researchers had been seeking faster-acting medications. In a study of 18 treatment-resistant patients, depression improved within 24 hours in 12 patients, and 5 patients were nearly symptom-free. In two patients the effects lasted for two weeks. Ketamine acts on different brain receptors from the ones affected by existing antidepressants. Owing to its side effects and abuse potential, however, ketamine was not considered appropriate for the treatment of depression outside controlled research settings; the researchers' goal was to find substances that affected the same brain pathways and chemicals that ketamine did.

      The FDA approved the first insulin delivered by inhalation for people with type 1 or type 2 diabetes. The product, Exubera, was a fast-acting form of human insulin that could replace the short-acting insulin that many patients injected at mealtimes. Inhaled insulin was not recommended for children, pregnant women, people who had smoked within six months, or people with breathing disorders. Januvia, the first in a new class of drugs for type 2 diabetes (DPP-4 inhibitors), also gained FDA approval. Taken in pill form once a day, Januvia aided the activity of a protein that both stimulated insulin production when blood-sugar levels were elevated and lowered liver glucose production. In clinical trials Januvia was less likely than other oral antidiabetes drugs to cause weight gain or severe drops in blood sugar. Another drug in the class, Galvus, was under FDA review.

      After a long politically charged debate, the FDA approved the sale in the United States of the morning-after, or next-day, pill (Plan B) to women (and men) aged 18 and older without a prescription. At least 40 other countries already sold such emergency contraceptives over the counter, and Plan B had been available in the U.S. by prescription since 1999. Plan B was a synthetic form of the hormone progesterone and, if taken within 72 hours of unprotected sex, was about 90% effective in preventing pregnancy.

      In November, following an exhaustive review of the safety of silicone-gel breast implants, the FDA lifted a 14-year ban on their use in the United States, and it licensed two companies to manufacture them. The devices had been banned because of allegations that they caused cancer and autoimmune disorders if they leaked or ruptured. The implants would be available to all women for breast reconstruction following breast cancer or trauma and to women 22 years of age and older for breast augmentation. The FDA required the two manufacturers to monitor the safety of the implants by collecting detailed data on their use in 80,000 women.

Cardiovascular Disease.
      Several reports published or presented at conferences during the year indicated that common treatments for coronary artery disease were being used inappropriately, to the detriment of patients and at enormous cost. For more than a decade, cardiologists had used stents—tiny metal-mesh tubes that were guided into an area of blockage in a coronary artery during a balloon angioplasty procedure—to prop open the vessel and improve blood flow to the heart. Reclosure of the stented artery months after the procedure was a problem, however, in as many as 30% of treated arteries. Drug-eluting stents, which were coated with drugs that inhibited cell growth in the artery's inner lining, were introduced in 2003–04, and by late 2006 they had been used to treat an estimated six million patients worldwide. About 18 months to three years after having received a drug-eluting stent, however, some patients developed blood clots, which increased the risk of heart attack and death. A suspected reason was that drug-eluting stents were being used to treat longer lesions in larger vessels than those for which they had been officially approved. One study suggested that drug-eluting stents were of benefit in only about one-third of patients who received them, and cardiologists at the University of California, Los Angeles, calculated that more than 2,100 patients a year were dying needlessly because they had received drug-eluting stents. It made “little clinical, economic, or common sense,” they concluded, “to forsake a therapy that works well for most patients (bare-metal stents) in favor of a costly new therapy (drug-eluting stents) that has no effect on important clinical outcomes but increases the risk for … a life-threatening complication.”

      Opening blocked arteries with angioplasty and stents in people who had experienced a heart attack could be life-saving if it was done within about 12 hours of the attack. In the U.S., however, only about one-third of the one million people who had heart attacks each year received care within that time frame. Nevertheless, many underwent angioplasty (with or without the insertion of stents) days or weeks after the attack, because it was widely assumed that opening an artery might help prevent a future heart attack, heart failure, or death. That long-held belief, however, was shown to be unfounded in a large international study. The study found that angioplasty performed 3 to 28 days after a patient had a heart attack offered none of the assumed benefits and, in fact, was associated with an increased risk of the recurrence of a heart attack.

Cancer.
      At a breast cancer symposium near the end of the year, U.S. investigators reported that 14,000 fewer breast cancer cases were diagnosed among American women of all ages in 2003 than in 2002—a 7% drop. An even sharper decline of 12% was seen among women aged 50–69 in the type of breast cancer that is dependent on the hormone estrogen for its growth. The researchers believed that the drops in that one-year period could be attributed to the fact that millions of American women stopped taking hormone-replacement therapy (HRT) in 2002 after widely publicized results from a major clinical trial indicated that women who took estrogen and progestin had higher rates of breast cancer, heart disease, stroke, and blood clots than women who took placebos. Prior to the release of those findings, about 30% of postmenopausal American women had been on HRT for such purposes as treating menopausal symptoms and reducing the risk of osteoporosis. By the end of 2002, one-half of the women on HRT had discontinued it. Cancer experts surmised that a substantial number of the women who discontinued HRT might have had tiny tumours that either stopped growing or regressed once they were deprived of supplemental estrogen. Likewise, HRT prescriptions for Canadian women plummeted in late 2002, which presumably explained Canada's 6% drop in overall breast cancer cases in 2003.

      In a large international study, more than 30,000 cigarette smokers were screened every 7 to 18 months with a spiral computed tomography (CT) scan—a procedure in which an imaging machine rotates rapidly around the body and takes more than 100 pictures in sequence. This method detected small lung tumours at a very early stage in more than 400 subjects. (Generally, lung cancers were diagnosed at later stages, when treatment was unlikely to be curative; even with the best treatment, only 15% of patients survived for five years, and ultimately 95% of patients diagnosed with lung cancer died from it.) The researchers estimated that 92% of patients with tumours that were caught very early and surgically removed within one month of diagnosis would survive for 10 years and that 80% of lung-cancer deaths could be prevented through annual CT screening of smokers and others at risk for lung cancer.

      Those estimates, however, sparked considerable controversy. The study was criticized for not having had a comparison group of people who were not screened. Critics pointed out that many of the lumps that were detected might never grow or cause problems, and they cautioned that lung-cancer screening through routine CT scans might lead to unnecessary biopsies and surgeries.

Emergency Care.
      In June the Institute of Medicine (IOM), an arm of the U.S. National Academy of Sciences, issued three reports that detailed a crisis in emergency medical care in the United States. The reports described the inability of hospital emergency departments (EDs) to meet the ever-growing demand for their services. Between 1993 and 2003 the number of people seeking care in EDs increased by 26%, whereas the number of EDs declined by 425. Emergency medical services were found to be highly fragmented, poorly coordinated, and ill-equipped to manage the flow of patients, which resulted in some EDs' remaining empty while others were unmanageably overcrowded. Crowding in the ED affected the whole hospital; often patients were, in effect, boarded in the ED until inpatient beds in the hospital became available. Critically ill patients frequently waited the longest, because intensive-care beds were in shortest supply.

      Other key findings were that ambulances were frequently diverted to distant hospitals; in emergencies most children were taken to general hospitals, which often lacked pediatric expertise and equipment; and the American emergency-care system was ill-prepared for a major disaster—be it a natural disaster, a disease outbreak, or a terrorist attack. The IOM recommended strategies to address each of the system's shortfalls, including the creation of a single agency within the U.S. Department of Health and Human Services to coordinate and oversee emergency and trauma services.

Ellen Bernstein

▪ 2006

Outbreaks of bird flu in poultry spread and government officials prepared for a potential pandemic among humans. Polio reemerged in Indonesia and elsewhere. Physicians reported the first successful treatment for a patient with advanced rabies, and researchers identified the animal reservoir for SARS. “We don't know when it will start, we don't know where it will start, we don't know how severe it will be, we don't even know for certain from where the causative virus will come.” So said David Nabarro, senior coordinator of the UN response to avian influenza, or “bird flu,” in a BBC News interview in November 2005. Those were the unknowns. Nabarro went on to list the things he knew—that sooner or later there would be a flu pandemic, that such a pandemic would cause widespread deaths, and, above all, that the world was not prepared. Since mid-2003 a deadly influenza A strain known as H5N1 had been circulating among poultry flocks in Southeast Asia, and in 2005 outbreaks spread to other areas of Asia and to Eastern Europe. Although H5N1 influenza remained essentially an illness in birds, by late 2005 more than 140 people in six Asian countries—Cambodia, China, Indonesia, Thailand, Turkey, and Vietnam—had come down with the particularly virulent flu after having had contact with infected poultry; more than half of them died. Vietnam was the most severely affected, with more than 90 cases. (See World Affairs: Vietnam: Sidebar, “Bird Flu: The Next Human Pandemic?”) The first human cases outside China and Southeast Asia occurred in eastern Turkey at the end of December.

      The expanding geographic range of H5N1-infected birds sharply increased opportunities for human exposure, and influenza experts warned that each additional human case increased the opportunity for the virus to transform itself into a strain capable of being spread easily among humans and setting off a pandemic. The best defense against pandemic flu would be a vaccine, and vaccine manufacturers were developing and testing vaccines against the H5N1 virus present in birds. It would take many months to prepare hundreds of millions of doses, however, and existing global vaccine production capacity was not sufficient to meet the demand. Antiviral medications were expected to play an important role in treatment plans. Two existing antiviral drugs—oseltamivir (Tamiflu), a pill, and zanamivir (Relenza), a nasal spray—were likely to be able to shorten the duration and severity of flu caused by an H5N1 influenza strain if they were used soon after a person became infected. A pandemic-preparedness plan for the United States was announced by the administration of Pres. George W. Bush on November 1. The plan included the purchase of 20 million doses of a vaccine against the existing H5N1 virus and the stockpiling of enough antiviral medication for another 20 million people. In addition, $2.8 billion was allotted toward research into more reliable and faster ways to produce vaccines.

      As the world was scrambling to prepare for a flu pandemic, American scientists working in a high-security laboratory at the Centers for Disease Control and Prevention in Atlanta discovered that the deadliest influenza in history, the 1918–19 “Spanish flu,” which killed 50 million people, was in fact a bird flu that became an extremely lethal human flu through the slow accumulation of genetic mutations. The timely research not only offered insights into the evolution of avian influenza viruses but also revealed that the H5N1 virus that was circulating in Asia in 2005 shared some of the mutations found in the 1918 virus.

HIV/AIDS.
 The highly ambitious 3 by 5 Initiative of the World Health Organization (WHO) and the Joint UN Program on HIV/AIDS aimed to provide life-prolonging antiretroviral drugs to three million people living with HIV/AIDS in less-developed countries (mainly in Africa) by the end of 2005. In June the sponsoring UN agencies issued a progress report that described clear accomplishments since the launching of the initiative in December 2003 but acknowledged that the 2005 goal would not be met.

      Problems with drug procurement were greater than expected, and donors had delivered only about $9 billion of the $27 billion pledged. WHO Director-General Lee Jong Wook was not discouraged. When the initiative began, there were only about 400,000 persons who were receiving treatment in the target countries; a year and a half later, there were one million. “This is the first time that complex therapy for a chronic condition has been introduced at anything approaching this scale in the developing world,” said Lee. “The challenges in providing sustainable care in resource-poor settings are enormous, as we expected them to be. But every day demonstrates that this type of care can and must be provided.” The UN special envoy for HIV/AIDS in Africa, Stephen Lewis, expressed his belief that the 3 by 5 Initiative would “be seen one day as one of the UN's finest hours.” As he traveled through Africa, Lewis observed governments “moving heaven and earth to keep their people alive, and nothing will stop that driving impulse.”

      Researchers had been struggling for two decades to produce a vaccine against HIV that would be safe and effective in diverse populations. More than 100 candidates had been tested in animals and humans, but none had achieved that goal. A trial that was under way in six countries and involved 1,500 healthy volunteers had scientists excited, however. The vaccine used a disabled common cold virus to deliver three HIV genes into cells to stimulate an immune response against HIV. (No live HIV was used in the production of the vaccine, so it could not cause HIV infection.) The early results were so promising—the vaccine had generated a potent, lasting response—that the researchers were doubling the number of enrollees in the trial.

Polio Eradication.
      The suspension of polio vaccination in Muslim states in Nigeria in 2003–04 led to polio outbreaks in children and set back the Global Polio Eradication Initiative, which aimed to wipe polio off the face of the Earth by the end of 2005. Polio vaccination resumed in Nigeria in late 2004 but not before 788 youngsters had been afflicted, and the polio strain that crippled Nigerian children spread across Africa. Thanks to a massive international public-health effort and $135 million in emergency vaccination funds, an estimated 100 million children in 23 African countries received multiple doses of polio vaccine over an 18-month period, and epidemics of polio in 10 countries—Benin, Burkina Faso, Cameroon, the Central African Republic, Chad, Côte d'Ivoire, Ghana, Guinea, Mali, and Togo—were stopped.

      Approximately 1,500 polio cases in 16 countries were recorded during the year—a 99% reduction since the global eradication initiative began in 1988. For the first time the number of cases was greater in countries that had been reinfected after having been polio-free than in countries in which the chain of polio transmission had never been interrupted.

      In May polio reemerged in Indonesia, which was the world's fourth most populous country and which had been without polio for a decade. By the end of November, there were nearly 300 cases. In response to the outbreaks, an estimated 24 million Indonesian children were vaccinated. In Yemen, which had not seen a case of polio since 1999, a 2005 polio outbreak was thought to have been started by pilgrims returning from Mecca. In May and July five million Yemeni children were immunized. Alarmed by the reemergence of polio in the Middle East, Iraq undertook a vaccination drive to deliver drops of polio vaccine to an estimated five million children. The UN even partnered with mobile-phone service providers to send text messages to Iraqi parents with cellular phones, urging them to take their children to clinics to be vaccinated. To curb the spread of polio during the 2005–06 hajj, or pilgrimage to Mecca, Saudi Arabia took the unprecedented step of requiring all children from countries experiencing polio to bring proof of polio vaccination.

      Since 1963 most polio vaccines given around the world had included weakened forms of the three existing polioviruses (types 1, 2, and 3) in one oral dose. In May researchers in Egypt and India began testing a new polio vaccine composed solely of type 1 virus. Experts believed that mass immunization with the new vaccine in areas where types 2 and 3 had already been eliminated could rapidly finish the job of eradication. (Wild poliovirus type 2 had not been found anywhere in the world since 1999; type 3 continued to circulate in Africa, Pakistan, and Afghanistan.) On the basis of the success of the trials of type 1 vaccine, WHO contracted with a French vaccine maker to produce tens of millions of doses.

Other Infectious Diseases.
      HIV/AIDS and polio, of course, were not the only infectious diseases that were causing misery and death around the world. Between March and the end of August, Uíge province in Angola experienced an outbreak of highly infectious Marburg hemorrhagic fever—the largest such outbreak the world had ever seen. More than 300 persons died from the viral illness, including most of the patients in the pediatric ward of one hospital and more than a dozen health care workers who treated victims of the disease.

      Marburg is a close relative of the Ebola virus, which had previously caused lethal epidemics elsewhere in Africa. A WHO epidemiologist who had witnessed outbreaks of both viruses noted, “Marburg is a very bad virus, even worse than Ebola.” Symptoms included high fever, diarrhea, vomiting, and bleeding from bodily orifices; most of those infected died within one week. The virus was spread via contact with the bodily fluids (such as blood, saliva, sweat, or semen) of an infected person. Corpses too were highly infectious; thus, victims had to be buried rapidly. Some families were reported to have hidden sick loved ones rather than allow them to be put in the isolation unit of a hospital, where they were likely to die and then be buried without a traditional family funeral.

      The mosquitoborne viral illness Japanese encephalitis, which causes high fever, blinding headaches, coma, and sometimes death, took an especially harsh toll on young people in the state of Uttar Pradesh, India. In the month of August alone, the viral disease was responsible for more than 1,100 deaths. Those who survived were at risk of mental retardation and other neurological problems. (The virus grows mainly in pigs; mosquitoes transmit it from pigs to humans, and children are the most susceptible.) An effective Japanese encephalitis vaccine existed, but only 200,000 of Uttar Pradesh's 7,000,000 children had received it. At least 300 Japanese encephalitis deaths were also reported in neighbouring Nepal.

 There had been woefully little progress in the fight against another mosquitoborne illness, malaria, which killed more than one million persons a year, the vast majority of them children in Africa. In October a major infusion of funds, three grants totaling $258.3 million from the Bill & Melinda Gates Foundation, offered hope that the suffering and deaths associated with malaria could finally be reduced. “It's a disgrace that the world has allowed malaria deaths to double in the last 20 years, when so much more could be done to stop the disease,” said Bill Gates, cofounder of the foundation. One grant would support advanced human trials of a malaria vaccine that had shown promise in early trials in children in Mozambique. Another would support research into new antimalarial drugs, which were desperately needed in Africa because malaria parasites had developed high levels of resistance to available drugs. At least 20 promising compounds were in the pipeline, and several were in clinical trials. The third grant would support efforts to find more effective methods of controlling mosquitoes—among them, improved insecticide-treated bed nets. “As we step up malaria research, it's also critically important to save lives today with existing tools. Bed nets cost just a few dollars each, but only a fraction of African children sleep under one,” said Gates. The Gates Foundation gave another $35 million to help establish a program in Zambia to use proven malaria-control strategies—such as bed nets—to cut malaria deaths by 75% over three years.

      The life of a 15-year-old Wisconsin girl was saved by a first-of-its-kind treatment after she contracted rabies from a bat bite. (Rabies is a viral illness; the virus travels from the site of a bite via nerves to the spinal cord and brain, where it multiplies and causes serious neurological damage.) The disease had always been fatal if an infected person did not immediately receive multiple doses of rabies vaccine. In this case the girl ignored her bite for a month, so by the time she developed symptoms—including nausea, blurred vision, fever, numbness, slurred speech, and tremors—it was too late for the vaccine to be effective. Rather than watch her die, her parents allowed a team of Milwaukee physicians to try an aggressive experimental treatment. To protect her brain from injury, the doctors gave her drugs that put her into a deep coma. They also gave her antiviral medications, which they hoped would stimulate her immune system to mount a response against the rabies virus. After a week the physicians tapered the drugs. Once she woke from her coma, her senses returned gradually. A month after she entered the hospital, tests showed that she no longer had transmissible rabies, so she was able to move out of isolation. Over the next couple of months, she progressed rapidly; by the time she left the hospital—76 days after she entered it—she was able to walk with the aid of a walker, feed herself, and speak intelligibly. Five months after her treatment, she still had some neurological impairment, including a condition characterized by involuntary bodily movements, but she was able to attend high school part time and enjoy many normal teenage activities. She was the first unvaccinated person known to have survived rabies. During the year doctors in Germany used a similar strategy in an unsuccessful attempt to cure three transplant recipients who had contracted rabies from infected donor organs.

      On the research front, two independent international teams of scientists reported that they had identified the animal reservoir of the virus responsible for severe acute respiratory syndrome (SARS), which infected more than 8,000 persons and killed about 800 in 26 countries in 2002–03. (Animal reservoirs are hosts for an infectious organism that causes illness in other species; the host generally does not become ill.) At the time of the frightening SARS outbreak, attention was focused on Himalayan palm civets and raccoon dogs that were sold in live food markets in Guangdong province in China as the source of SARS. According to the new findings, however, they were only intermediaries. Chinese horseshoe bats, which were also sold at the markets, were the actual reservoir. The most likely scenario, according to the scientists, was that the bats in markets infected civets and raccoon dogs, and humans who had contact with the latter animals then became infected.

Cancer.
      The findings of four large clinical trials published in the October 20 issue of The New England Journal of Medicine were called “revolutionary,” “simply stunning,” and “truly life-saving results in a major disease.” The studies found that the cancer drug trastuzumab (Herceptin) dramatically reduced the chances of cancer recurrence in patients with early-stage disease when the drug was given for one year following standard chemotherapy. Trastuzumab had been used since 1998 to prolong survival in women with advanced-stage breast cancer. The drug is a monoclonal antibody that specifically blocks the activity of human epidermal growth factor receptor 2 (HER2), which is found on the cells of up to 30% of breast cancers. HER2-positive tumours tend to be aggressive and unresponsive to most chemotherapy agents. The latest results were so impressive that a leading breast cancer specialist who was not involved in the studies declared, “Our care of patients with HER2-positive breast cancer must change today.”

Cardiovascular Disease.
      Cardiologists in the United States reported in the February 10 issue of The New England Journal of Medicine on a unique cardiac syndrome that they had seen in 18 previously healthy women and one man. Each of the patients had been hospitalized with heart-attack-like symptoms after having been “stunned” in some profound way (ranging from a car crash to a surprise birthday party). The cases were unique in that none of the patients had blood clots, clogged arteries, or other signs of heart attack; all had distinctly abnormal electrocardiograms not indicative of heart attack; and all recovered completely with no lasting damage to the heart. On the basis of the results of extensive tests, the authors concluded that in each case a stunning event had triggered a significant burst of the stress hormone adrenaline, which was toxic to the heart muscle and temporarily impeded its ability to contract properly. They dubbed the syndrome “stress cardiomyopathy.”

      Numerous trials had shown that a low-dose regimen of aspirin reduced the risk of a first heart attack in men (although it did not lower their risk of stroke to any substantial degree), and many women therefore also followed such a regimen in hope of staving off heart attacks. During the year the surprising results of the Women's Health Study, which involved almost 40,000 initially healthy women, were published. Most of the women who took 100 mg of aspirin every other day had outcomes that were essentially the opposite of those in men: their risk of heart attack and of dying from heart disease was not reduced, but they did have a significantly lower likelihood of stroke. (For a subset of women in the trial—those aged 65 years and older—the risk of heart attack was reduced.)

      In the United States the advertising of prescription drugs directly to consumers—particularly on television—came under fire in the fall of 2004 when the widely advertised arthritis medication and pain reliever Vioxx (rofecoxib) was forced off the market because postmarketing studies had found that it doubled the risk of heart attacks and strokes. Critics of direct-to-consumer (DTC) drug advertising contended that commercials such as those for Vioxx prompted patients to ask their doctors for expensive prescription medications that they did not need, and that, with considerable regularity, doctors complied. Indeed, 93 million prescriptions were written for Vioxx from the time it was approved in 1999 to the time it was taken off the market in September 2004.

      The pharmaceutical industry, which spent more than $4 billion on advertising in 2004, called DTC advertising “an invaluable communications tool” that both increased public awareness of diseases and symptoms and potentially averted underuse of effective treatments. Nonetheless, in response to widespread criticism, the Pharmaceutical Research and Manufacturers of America, which represented pharmaceutical research and biotechnology companies, drew up new guidelines on DTC advertising. The guidelines called for pharmaceutical manufacturers to put off advertising new drugs directly to consumers for “an appropriate amount of time” in order for drug companies “to educate health professionals about new medicines.” The guidelines also discouraged TV commercials that promoted drugs without saying what they were for (such ads instead encouraged consumers to “ask your doctor if…is right for you”). Those ads were popular with drug companies because by not saying what a drug was for, they were not required to list the side effects and risks that were associated with it.

      Starting Jan. 1, 2006, Medicare—the U.S. government's health care program for people aged 65 and older and for some people with disabilities—would begin offering insurance coverage for prescription drugs, known as Medicare Part D. Between Nov. 15, 2005, and May 15, 2006, beneficiaries could enroll in one of the private insurance plans that Medicare had approved. In most states more than 40 prescription-drug plans, with widely varying benefits and costs, were available. The government estimated that with the average plan beneficiaries would pay a monthly premium of about $37, with a yearly deductible of up to $250. Plan beneficiaries would also pay a share of their yearly prescription-drug costs. For the first $2,000 in prescription-drug costs beyond the deductible, they would pay a 25% share; for the next $2,850, they would pay a 100% share (a coverage gap that came to be known as the “doughnut hole”); and for prescription-drug costs beyond $5,100, they would pay a 5% share. People with limited income and resources would be eligible for extra help with paying for prescription drugs.
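The tiered cost sharing described above amounts to a simple piecewise calculation. The sketch below is illustrative only: it assumes the standard benefit exactly as outlined (a $250 deductible, then a 25% share of the next $2,000, a 100% share of the next $2,850, and a 5% share thereafter), ignores the monthly premium, and does not reflect the varying terms of actual plans.

```python
def part_d_out_of_pocket(total_drug_cost):
    """Estimate a beneficiary's yearly out-of-pocket cost under the
    standard 2006 Medicare Part D benefit as described above
    (monthly premiums not included)."""
    # (tier size in dollars, beneficiary's share), in the order tiers fill
    tiers = [
        (250, 1.00),           # deductible: beneficiary pays all
        (2000, 0.25),          # initial coverage: 25% coinsurance
        (2850, 1.00),          # coverage gap: beneficiary pays all
        (float("inf"), 0.05),  # catastrophic coverage: 5%
    ]
    remaining = total_drug_cost
    out_of_pocket = 0.0
    for size, share in tiers:
        spent_in_tier = min(remaining, size)
        out_of_pocket += spent_in_tier * share
        remaining -= spent_in_tier
        if remaining <= 0:
            break
    return out_of_pocket

# A beneficiary with $5,100 in total drug costs reaches the catastrophic
# threshold having paid 250 + 500 + 2,850 = $3,600 out of pocket.
print(part_d_out_of_pocket(5100))  # 3600.0
```

The calculation makes the shape of the benefit plain: out-of-pocket costs rise slowly through the initial coverage tier, jump steeply through the gap, and flatten once catastrophic coverage begins.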

      President Bush called the plan “the greatest advance in health care for seniors” in 40 years, but many seniors found Part D in general and the enrollment process in particular to be complicated and confusing. In a letter to the editor of the New York Times, a senior citizen from New Jersey wrote, “I have two engineering degrees and an M.B.A. and find it almost impossible to compare the different plans offered for the new Medicare drug benefit. It is not an apples-to-apples comparison, but rather apples to every other kind of fruit.” The U.S. secretary of health and human services, Michael O. Leavitt, responded to such criticism by saying, “Health care is complicated. We acknowledge that. Lots of things in life are complicated: filling out a tax return, registering your car, getting cable television. It is going to take time for seniors to become comfortable with the drug benefit.”

Other Developments.
      In 2005 the U.S. Department of Agriculture released a redesigned food-guide pyramid, which presented the government's newly revised dietary guidelines as a graphic for use by the general public. The new pyramid, known as MyPyramid, was available as an online tool that could be personalized. (See Graphic.)

      Surgeons in France performed the first partial face transplant. The surgeons grafted the nose, lips, and chin from a deceased donor onto the face of a woman who had been severely disfigured in an attack by a dog.

      An advance in human-cloning research reported in May 2005 by a team led by Hwang Woo Suk, a South Korean scientist, raised expectations that stem cells derived from embryos cloned from the skin cells of individuals with a disease or injury could be readily obtained for therapeutic use. By the end of the year, however, the report had been discredited, and the results of his other stem-cell work had fallen under scrutiny.

Ellen Bernstein

▪ 2005

      More than 17,000 delegates gathered in Bangkok on July 11–16, 2004, for the 15th International AIDS Conference, the theme of which was “Access for All.” The biennial event had evolved from a strictly scientific conference into a forum that covered all facets of the HIV/AIDS pandemic and was attended by persons who represented a large variety of voices, experiences, and concerns.

      On the eve of the conference, the Joint United Nations Programme on HIV/AIDS (UNAIDS) released its 2004 Report on the Global AIDS Epidemic, which painted a very grim picture. The report indicated that in 2003 more people had contracted HIV—close to 5 million—and more people had died from AIDS—nearly 3 million—than in any other single year since the deadly virus emerged. Nowhere was the picture bleaker than in sub-Saharan Africa, home to 25 million of the estimated 38 million people infected with HIV worldwide.

      The report sounded an alarm over the rapid rise of HIV in Eastern Europe and Asia. China, Vietnam, Indonesia, and Russia were experiencing the steepest increases in HIV infections, while India had the largest number of infected people outside South Africa. UNAIDS Executive Director Peter Piot compared the epidemic in Asia in 2004 to the situation in southern Africa 15 years earlier.

      Globally women made up almost half of the number of adults who were living with HIV/AIDS, and the number of infected women had increased in every region of the world. Moreover, women were physically more susceptible to HIV infection than men, and gender-based violence in many countries exacerbated their vulnerability. A widely discussed topic at the Bangkok conference was the development of microbicides—in the form of gels, creams, or other substances—that could be applied vaginally to reduce the risk of the transmission of HIV and other sexually transmitted infections. The promise of microbicides was that they would offer women a prevention method that they could control.

      UNAIDS described the 3 by 5 Initiative of the World Health Organization (WHO) as one of the most ambitious health projects ever conceived. Launched on World AIDS Day (December 1) in 2003, the 3 by 5 Initiative was established to provide antiretroviral drug treatment to three million people in less-developed countries by the end of 2005. A six-month progress report on the initiative indicated that 40,000 people had started therapy by mid-2004; the target had been 100,000. Not all efforts had fallen short of their goals, however. Notable progress had been made in many countries in building health care infrastructures with the capacity to support HIV/AIDS treatment; about 15,000 health care workers had been trained to deliver and monitor antiretroviral therapy, and nearly 5,000 sites were providing HIV testing and counseling.

      Another positive development was the reduction in cost of antiretroviral drugs in less-developed countries to about $150 per person annually. New generic formulations that combined three drugs in a single pill were found to be as effective as expensive patented drugs used by patients in developed countries but at only about two-fifths the cost.

Avian Flu.
      Beginning in late December 2003, an epidemic of avian (bird) flu, a deadly disease of birds caused by type A influenza viruses, devastated poultry populations in most of Southeast Asia. The outbreaks led to the slaughter of more than 100 million fowl.

      In late January 2004 the first cases of human infection with the avian flu strain known as A(H5N1) were reported in Vietnam and Thailand. By the end of October, 44 human cases and 32 deaths had been confirmed in the two countries. That humans could catch bird flu had been demonstrated in Hong Kong in 1997, when 18 people were infected with A(H5N1) and 6 died. In fact, several avian flu strains were known to have “jumped the species barrier” and infected humans. The human cases of the disease in Vietnam and Thailand were acquired through either direct or indirect contact with infected poultry. In late September, however, a 26-year-old woman who lived in a suburb of Bangkok might have contracted the illness directly from her daughter. While staying with her aunt in a rural village, the 11-year-old girl had become ill with H5N1 flu after she helped dispose of sick chickens. The mother tended her severely ill daughter until the daughter died. Several days later the mother came down with the flu, and she died shortly thereafter. WHO called this a possible case of human-to-human transmission.

      Although a relatively small number of humans had been infected, the death rate was extraordinarily high—72%. Public health officials were duly alarmed. There was considerable evidence that in the nearly seven years since the Hong Kong outbreak, H5N1 had grown more virulent. It had also acquired the ability to replicate in mammals—most notably in pigs. Because pigs are susceptible to both avian and human influenza viruses, flu experts believed that they might serve as “mixing vessels” in which the H5N1 virus could swap genetic material with a human type A influenza virus; a virulent new strain that could be readily transmitted from human to human and to which humans would have no immunity might then emerge. There was little doubt that such a scenario could set off a pandemic on the scale of the deadliest influenza outbreak the world had ever seen, the 1918 “Spanish flu.” (Ominously, American and British influenza researchers—who had been studying preserved lung tissue from people who succumbed to the 1918 flu—reported in February that it was likely that the influenza, responsible for 20 million to 40 million deaths worldwide, started as a form of bird flu. Their studies suggested that minute changes in a single amino acid in an avian flu virus might have allowed it to infect humans.)

      In mid-November WHO convened a meeting of international vaccine manufacturers and national health officials to address the need for sufficient quantities of vaccine to protect people around the world against H5N1. Klaus Stöhr, WHO's senior influenza expert, said that it was “not a question of if, but of when” a pandemic would occur. The U.S. government did not wait for the November meeting to take steps to prepare for a flu pandemic; in September it contracted with a vaccine manufacturer to prepare and store two million doses of an H5N1 vaccine.

Flu-Shot Shortage.
      In the midst of the alarm over a possible avian flu outbreak, people around the world were taking the imminent 2004–05 flu season very seriously, and unprecedented numbers sought flu shots, the best protection available. In the United States, however, 50 million doses of flu vaccine—about half the intended U.S. supply—were never delivered. The government had contracted with only two manufacturers for its supply. (In contrast, the U.K. obtained its supply of influenza vaccine from five manufacturers and was not caught short, nor were shortages a problem elsewhere in Europe.) One supplier of American vaccine, Chiron Corp., discovered in August that some of the flu vaccine produced in its manufacturing plant in Liverpool, Eng., was contaminated by bacteria. Although the company claimed that the problem was “limited in scope to a few batches,” in early October the British Medicines and Healthcare Products Regulatory Agency suspended Chiron's license, which effectively prevented the company from releasing any of its influenza vaccine. The other supplier, Aventis Pasteur, delivered about 58 million doses. The U.S. Centers for Disease Control and Prevention (CDC) issued guidelines for rationing the sharply reduced supply of influenza vaccine. Priority groups included children aged 6–23 months, adults 65 and older, people with chronic medical conditions, pregnant women, residents of nursing homes and long-term-care facilities, children on continuous aspirin therapy, health care workers involved in direct patient care, and people in close contact with children younger than six months.

      In addition to the Aventis Pasteur flu shots, there were about three million doses of FluMist, a live influenza virus nasal spray, which was an immunization option for healthy individuals aged 5–49. The government had also stockpiled enough antiviral medication to treat more than seven million flu-infected people. In early December the Department of Health and Human Services purchased 1.2 million doses of flu vaccine from U.K.-based pharmaceutical company GlaxoSmithKline (GSK). Because the vaccine had not undergone U.S. Food and Drug Administration (FDA) approval, it would have investigational new drug status, and recipients would be required to sign an informed-consent form.

      Researchers tested the possibility of stretching the flu vaccine supply by injecting a small amount into skin (rather than muscle). They found that a dose as small as one-fifth of a standard flu shot was as effective as or more effective than a full dose in healthy adults younger than 60. An advantage of injecting vaccine directly into the skin was that the skin contains abundant dendritic cells, white blood cells that are capable of triggering a strong immune response. Health officials did not recommend that shots be given in this way until the method had been more extensively tested and technical challenges and regulatory hurdles had been overcome.

Other Infectious Diseases.
      The West Nile virus (WNV) season in the U.S. in 2004 was mild compared with that of 2003, when 9,862 human cases and 264 deaths were reported to the CDC. In 2004 there were 2,448 confirmed human cases of WNV and 87 deaths in 41 states; 36% of infections were severe and involved inflammation of the brain (encephalitis) or the membrane that surrounds the brain or spinal cord (meningitis). (WNV is most often transmitted by mosquitoes that have fed on birds that harboured the virus.) Since the first WNV outbreak in the U.S., which occurred in 1999 and was confined to the New York City area, annual outbreaks had pushed steadily westward. Before 2004 California had experienced only a few human cases; in 2004, however, the state reported 760 cases—almost twice as many as any other state. Washington remained free of WNV, and Oregon experienced only three human cases.

      In 2004 deaths from tuberculosis (TB) increased for the first time in more than 40 years. One reason was the rise of drug-resistant strains of Mycobacterium tuberculosis, the causative organism. A WHO survey found that of an estimated 300,000 new cases of drug-resistant TB in 2004, nearly 80% were caused by superstrains—that is, strains resistant to at least three of the four drugs commonly used to treat active TB. Another reason for the increase in TB deaths was that 12 million people worldwide were coinfected with TB and HIV. The synergistic effects of HIV and M. tuberculosis are especially lethal. TB had become the leading killer of people with AIDS, responsible for one-third of the deaths in that group.

       SARS (severe acute respiratory syndrome), the deadly new infectious disease that took the world by surprise in 2003, when it infected almost 8,000 people and killed about 800, fortunately did not reemerge in epidemic fashion in 2004. SARS did, however, infect a handful of people in Beijing and in Anhui province in China. The outbreak was traced to two workers at the National Institute of Virology in Beijing, where experiments on the SARS virus had taken place but biosafety practices reportedly were lax. The workers spread the infection to at least nine people outside the lab, including one lab worker's mother, who died. Chinese authorities acted swiftly—they closed the Beijing lab, traced the contacts of those known to be infected, quarantined more than 500 persons, and screened travelers at airports and railroad stations—and there were no additional cases.

Cardiovascular Disease.
      Statins, a family of drugs also known as HMG-CoA reductase inhibitors, were much in the news in 2004. Statins lower low-density lipoprotein cholesterol (LDL-C) and can reduce the risk of a heart attack or stroke by as much as 40%. The U.S. National Cholesterol Education Program (NCEP) issued new cholesterol guidelines based on the findings of five clinical trials that had involved more than 50,000 people and had been completed since 2001, when the NCEP's guidelines were last revised.

      One of the key new recommendations for people at high risk of heart attack was that statins be used to achieve an extreme lowering of LDL-C levels, to under 70 mg/dl (milligrams per decilitre) of blood. The guidelines increased the number of people in the U.S. who met the criteria for statin therapy to 36 million—more than three times the number who took the drugs in 2004. Globally, there was an even bigger gap. Though it was estimated that more than 200 million people would benefit, only 25 million were receiving statin therapy.

      In July pharmacies in the U.K. began to sell the statin drug simvastatin (Zocor) in a low (10-mg) dosage without a doctor's prescription to people at moderate risk of heart disease—a group that was estimated to include 5 million–10 million people. The decision to make simvastatin available over the counter was based on the consensus of experts that the benefits outweighed the risks. In general, the drugs were considered extremely safe—much safer, in fact, than aspirin, which millions of people took on a daily basis to prevent heart attacks.

      The question whether men should have an annual blood test that measures prostate-specific antigen (PSA), a protein produced by the prostate gland, had long been controversial. Generally, the higher a man's PSA level was, the more likely it was that he had prostate cancer, and the test was widely used to screen men over age 50 for prostate cancer. The majority of tumours discovered by PSA tests were harmless, however, so it remained unclear whether the chief reason for PSA screening—to catch tumours early—outweighed the risk of complications from unnecessary treatment. Moreover, there remained no way to distinguish between cancers that could be safely left alone and those that would kill.

      The findings of two studies shed new light on the limitations of PSA screening. A National Cancer Institute (NCI)-sponsored study of men with low or normal PSA levels (four nanograms of PSA per millilitre of blood or less) found, through biopsies, that 15% of them had prostate tumours, of which 15% were high-grade and aggressive. In other words, standard PSA screening would have missed a significant number of potentially deadly tumours.
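The scale of that finding can be made concrete with a back-of-the-envelope calculation. The figures below are purely illustrative, applying the two 15% rates reported by the study to a hypothetical cohort of 1,000 men:

```python
# Illustrative arithmetic based on the NCI-sponsored study's figures:
# 15% of men with "normal" PSA (4 ng/ml or less) had prostate tumours,
# and 15% of those tumours were high-grade and aggressive.
cohort = 1000                       # hypothetical men, all with normal PSA
tumours_missed = cohort * 0.15      # cancers a 4-ng/ml cutoff would not flag
high_grade = tumours_missed * 0.15  # the potentially deadly subset

print(f"Tumours missed per 1,000 men screened: {tumours_missed:.0f}")
print(f"High-grade among them: {high_grade:.1f}")
```

Roughly 150 cancers per 1,000 men with "normal" PSA, about 22 of them high-grade, would fall below the standard cutoff, which is the sense in which routine screening would have missed a significant number of potentially deadly tumours.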

      In a study of men who had been treated for prostate cancer, those whose PSA levels rose more than two points in the year prior to their cancer diagnosis had a higher risk of dying from aggressive tumours within seven years, even after they underwent radical surgery. The investigators calculated that the change in a man's PSA level in the year before diagnosis was 10 times more predictive of deadly prostate tumours than the level per se.

      Twin epidemics of obesity and diabetes were intimately linked and threatened to reduce both the quality and the length of life for people around the world. Though the guru of low-carbohydrate diets, Robert Atkins, died in 2003, the craze he started for high-protein, high-fat, low-carb eating endured. A few studies published during the year found that, in the short term, people lost more weight faster on restricted-carbohydrate eating plans than on low-fat diets, but at one year the difference in weight loss with the two diets was minimal. No studies had demonstrated that weight lost the low-carb way would be maintained, and the long-term health effects were unknown. (See Agriculture: Sidebar (Craze for Curbing Carbs).)

      One of the biggest medical news stories of the year was the withdrawal from the market of the arthritis drug rofecoxib (Vioxx) in late September. Available by prescription since 1999, Vioxx had been used by more than 80 million people worldwide, and at the time the manufacturer, Merck & Co., decided to take it off the market, the drug was being used by almost two million people for relief from the symptoms of arthritis, acute pain, and menstrual pain. The decision followed the discovery that people who took the drug for more than 18 months had twice the risk of heart attack and stroke compared with those who took a placebo.

      A week after the withdrawal of Vioxx, a paper published online by the British medical journal The Lancet reviewed 18 trials of rofecoxib that had included more than 25,000 patients and found that a significant risk of heart attack in patients who took the drug had been evident by the end of 2000. An accompanying editorial by The Lancet's editor lambasted Merck and the FDA for having “acted out of ruthless, short-sighted, and irresponsible self-interest” in not having recalled Vioxx years earlier.

      Following the recall, David Graham, a senior drug reviewer in the FDA's Office of Drug Safety, testified before the Senate Finance Committee that the regulatory agency was “incapable of protecting America.” Referring to the increased cardiovascular risks with Vioxx, he said, “We are faced with what may be the single greatest drug-safety catastrophe in this country or the history of the world.” In his criticism of the FDA, he cited five other drugs that he believed should not be on the market. They were valdecoxib (Bextra); sibutramine hydrochloride monohydrate (Meridia), a diet drug associated with serious cardiovascular problems and sometimes death; salmeterol xinafoate (Serevent), an asthma medication that had caused life-threatening asthma episodes and deaths; rosuvastatin calcium (Crestor), a cholesterol-lowering agent linked to acute kidney failure and a serious muscle-weakening disease; and isotretinoin (Accutane), an acne drug that had caused severe birth defects.

      Revelation of the heart-attack risk associated with rofecoxib prompted further scrutiny of other drugs in its class, cyclooxygenase-2 (COX-2) inhibitors. The class rapidly became popular when it was introduced in the late 1990s because the drugs appeared to cause fewer adverse gastrointestinal effects than traditional nonsteroidal anti-inflammatory drugs (NSAIDs) such as aspirin, ibuprofen, and naproxen. There was little evidence, however, that COX-2 inhibitors (which are also NSAIDs) offered superior relief of pain or inflammation. Of the two other COX-2 inhibitors on the market, celecoxib (Celebrex) and valdecoxib (Bextra), valdecoxib had been shown to increase heart-attack risk in patients who had undergone coronary-artery bypass surgery. In mid-December a large NCI trial that was investigating the potential of celecoxib to prevent colon cancer was discontinued when data revealed a 2.5-fold increased risk of cardiovascular events for participants taking celecoxib (200 mg twice daily) compared with those who were taking a placebo. Less than a week later, a study of the potential of celecoxib and the over-the-counter NSAID naproxen (Aleve) to prevent Alzheimer disease found that people taking naproxen had a significantly increased risk of stroke and heart attack; no increased risk was evident for people taking celecoxib. In light of the new evidence, the FDA advised caution concerning the use of COX-2 inhibitors, naproxen, and other NSAIDs pending further review of data that were continuing to be collected.

      The Vioxx withdrawal raised important questions about the role of direct-to-consumer advertising in creating “blockbuster” drugs. Merck had spent at least $100 million annually to promote Vioxx to consumers, which paid off in $2.5 billion in sales in 2003. In late December the manufacturer of Celebrex and Bextra, Pfizer Inc., agreed that it would sell Bextra with a warning label and would pull all consumer-aimed ads for Celebrex. Sales of the two drugs had totaled more than $2.5 billion in 2003. Critics of direct-to-consumer advertising emphasized that consumers were being bombarded by ads for costly newer drugs, for which the long-term effects were unknown.

      The pharmaceutical industry was at the centre of another drug-safety controversy, which concerned the manufacturers of selective serotonin reuptake inhibitors (SSRIs), a widely used class of antidepressants. The manufacturers had failed to disclose the results of clinical trials that found that the drugs lacked effectiveness in children and teenagers and that they were associated with an increased risk of suicidal thoughts and acts. In June, New York Attorney General Eliot Spitzer (see Biographies (Spitzer, Eliot)) sued one of the manufacturers, U.K.-based GSK, for having committed fraud by withholding the results of four of five studies on the use of the SSRI paroxetine (Paxil) in children and adolescents. A settlement was reached in August under which GSK agreed to disclose all information on clinical studies of Paxil to the public on its Web site. GSK also made plans to post all of its clinical trial data for its other marketed medicines on the Internet.

      The first part of the reforms made to Medicare took effect in June, when drug discount cards became available. The cards were an interim measure meant to offer relief to seniors (as well as to some people with disabilities) from high prescription-drug prices until the full prescription-drug benefits took effect in 2006. But confusion reigned as seniors were faced with more than 70 different cards from which to choose. An analysis carried out by the House of Representatives Government Reform Committee revealed that “the prices available with the new Medicare discount drug cards [were] far higher than the prices available in Canada and…no lower than the prices…available to individuals who [did] not have the cards.” (See Social Protection: Sidebar (Medicare's New Prescription-Drug Program).)

      Ironically, the Medicare Web site that allowed beneficiaries to compare drug prices may have driven more people to have their prescriptions filled in Canada. Indeed, well over one million Americans were buying their prescription drugs from Canada, where the same drug often cost 25% to 80% less than in the United States. Despite the huge traffic in cross-border prescription-drug sales, the FDA maintained its opposition to the importation of foreign drugs on safety grounds.

Stem Cells.
      Early in the year, Hwang Woo Suk and Moon Shin Yong (see Biographies (Hwang Woo Suk and Moon Shin Yong)) at Seoul National University reported that through a complex process of cloning called nuclear transfer, they had created human embryos, from which they had then extracted stem cells. The cells were capable of developing into virtually any tissue type or organ, and the stem-cell line they created could be grown in a laboratory culture indefinitely. The South Koreans published a detailed report of their work in the journal Science. They stated that their intention was solely to advance understanding of human diseases and provide the foundation for novel therapies. Upon learning of the achievement by the South Koreans, Leon R. Kass, chairman of the U.S. President's Council on Bioethics, said, “The age of human cloning has apparently arrived: today, cloned blastocysts for research, tomorrow cloned blastocysts for babymaking.” He went on to call for Congress to enact a law that would ban all human cloning.

      In March, Boston scientists reported that they had derived 17 new human embryonic stem-cell lines from 286 frozen human embryos produced by in vitro fertilization. Their goal was to facilitate the “understanding of the mechanisms by which differentiation of embryonic stem cells may be controlled to produce cell types for drug development and for transplantation in the treatment of disease.” They were making the newly created stem-cell lines available to researchers, but because of regulations that had been imposed by U.S. Pres. George W. Bush in August 2001, none of the lines could be used for federally funded research.

      Although the president had not budged on his position, in the November election California voters decisively approved Proposition 71, the Stem Cell Research and Cures Initiative, a $3 billion bond measure to fund stem-cell research. The passage of “Prop 71” was expected to make California a global leader in the pioneering field of stem-cell research.

Ellen Bernstein

▪ 2004

      In early 2003 a virulent new infectious disease caught the world off guard. The Chinese Ministry of Health reported to the World Health Organization (WHO) in mid-February that 305 people in Guangdong province had developed an acute pneumonia-like illness and that 5 of them had died. Laboratory tests had been negative for influenza viruses, anthrax, plague, and other infectious pathogens. By mid-March WHO realized that hundreds of people in Hong Kong, mainland China, Vietnam, and Canada had come down with the mysterious, rapidly spreading disease, which was not responding to antibiotics or antiviral drugs, and for the first time in its history it issued a “global alert.” Three days later WHO issued emergency guidance for travelers and airlines. By that time it was known that a doctor who had attended patients with the unusual pneumonia in Guangdong was ill with the disease when he subsequently visited Hong Kong. There he spread the illness to fellow travelers, who took it to Hanoi, Singapore, and Toronto, seeding major outbreaks in all three metropolises.

      WHO called the illness severe acute respiratory syndrome (SARS). Over the next few months, SARS spread to more than two dozen countries on six continents. (See Map.) The last confirmed case of the outbreak occurred in Taiwan in mid-June, and by late July the SARS pandemic was considered over. The final count was 8,098 cases and 774 deaths, with health care workers accounting for 20% of cases. In fact, the first cases of SARS had occurred in Guangdong province in November 2002, but China failed to report the outbreak until three months later.

      Determination of the cause—a coronavirus unlike any other known human or animal virus in its family—and sequencing of the virus's genetic makeup occurred with impressive rapidity. Subsequent epidemiological investigations determined that Himalayan palm civets and raccoon dogs sold at food markets in Guangzhou, the provincial capital, were the likely source of SARS.

      Ultimately, SARS illustrated the impact that a new disease could have in a highly mobile world. Every city with an international airport was regarded as a potential hot spot for an outbreak. Many observers noted that the public fears inspired by SARS spread faster than the virus itself. (See Special Report (What's Next After SARS?).)

Other Infectious Diseases.
      In May WHO extolled the Americas for having gone six months without a case of measles, the leading vaccine-preventable childhood disease. In other parts of the world, however, measles continued to take a terrible toll, affecting over 30 million children and killing some 745,000 each year, more than half of that number in Africa.

      WHO and UNICEF brought together key players in the fight against measles for a summit in Cape Town in October. These leaders mapped out a strategy for reducing the toll of childhood measles deaths, which stood at some 2,000 a day. Shortly thereafter, all of Uganda's 12.7 million children were immunized against measles in about two weeks' time. The hugely successful campaign was carried out with the support of the government, churches, kings, and tribal leaders.

      The WHO-led global campaign to eradicate polio by 2005 shifted its overall strategy during the year, owing to a resurgence of the viral disease in India, Pakistan, and Nigeria. In 2001 just 329 polio cases were reported worldwide, down from an estimated 350,000 cases in 1988, the year the global campaign began. In 2002, however, the number increased nearly sixfold to 1,919 cases, with 1,556 in India. Consequently, WHO cut back immunization activity in 93 countries and concentrated it in the 13 countries where cases were still occurring and where there was a high risk of polio's return.

      Although the outbreak in India was a setback, leaders of the eradication effort remained confident that their goal could be accomplished. In September WHO Director-General Lee Jong Wook (see Biographies (Lee Jong Wook )), while attending the launch of a five-day immunization blitz that targeted tens of millions of Indian children, warned that even a single case of polio remaining in the world could allow the disease to spread. The scenario that Lee warned of was played out in late October when polio spread from Nigeria to neighbouring countries Benin, Burkina Faso, Ghana, Niger, and Togo. A tragedy was averted when hundreds of thousands of volunteers and health workers participated in a three-day campaign to vaccinate every child in those countries.

      Between mid-May and late June, the first outbreak in the Western Hemisphere of monkeypox in humans occurred in six states in the U.S. Midwest. Of 72 cases reported, 37 were confirmed by laboratory tests. Monkeypox, so named because it was first observed in monkeys, is a relative of smallpox and occurs mainly in rainforests of central and western Africa. Those affected in the U.S. typically experienced fever, headaches, dry cough, swollen lymph nodes, chills, and sweating, followed by blisterlike skin lesions. The source of infection was traced to Gambian giant pouched rats and dormice imported from Ghana and purchased by an exotic-pets dealer in Illinois, who housed them in the same facility as some 200 prairie dogs. People became infected through close contact with infected prairie dogs. The Centers for Disease Control and Prevention (CDC), Atlanta, Ga., recommended smallpox vaccines for persons who had been exposed to the virus. The CDC and the Food and Drug Administration (FDA) banned the importation of all rodents from Africa as well as the sale, transport, or release into the environment of prairie dogs.

      The fifth annual outbreak of West Nile virus (WNV) in the U.S. started in early July. By the end of November, 8,567 cases had been reported in 46 states, with 199 deaths; Colorado, with 2,477 cases, was hardest hit. (In 2002 there were 4,156 cases and 284 deaths in 44 states.) For the first time, rural areas were sharply affected. The mosquito that spread WNV in western states was Culex tarsalis, a particularly hardy species found mainly on farmland but able to travel great distances. A CDC official called the species “the most efficient vector of West Nile virus ever discovered.”

      During the 2002 WNV season, the virus had been found to be transmissible from person to person through blood transfusions and organ transplantation. Fortunately, by the start of the 2003 season, a new blood-screening test was available and detected WNV in more than 600 donors. Nevertheless, the screening process was not foolproof. At least two transfusion recipients developed severe West Nile illness with encephalitis (inflammation of the brain).

      By mid-November Canada had experienced 1,314 probable or confirmed cases of human WNV and 10 deaths during its third annual outbreak. In 2002 the total number of laboratory-confirmed human cases had been under 100. Mexico reported having tested more than 500 people for WNV, 4 of whom were classified as WNV-positive.

      Using improved epidemiological monitoring methods, UNAIDS (Joint United Nations Programme on HIV/AIDS) and WHO revised the estimate of the number of people in the world living with HIV from 42 million to 40 million. The reduction was apparent rather than real and reflected a change in surveillance methods, not in the overall toll of the pandemic. A comprehensive report issued by the agencies in late November estimated that during the year HIV infected five million people, while AIDS killed three million—the highest numbers ever.

      Although most people with HIV in developed countries were living a decade or more beyond diagnosis, thanks to life-sustaining drugs, the epidemic in those countries was far from over. In the U.S., public health officials were alarmed by a 17.7% surge in new cases among gay and bisexual men since 1999.

      A mounting problem in developed countries was the appearance of strains of HIV that were resistant to available drugs. At a meeting of the International AIDS Society in Paris in July, results of the largest study ever conducted on antiretroviral drug resistance were presented. The study found that 10% of newly infected Europeans had viral strains resistant to at least one antiretroviral drug. That meant that HIV-infected people on antiretroviral therapy and carrying a virus that had developed resistance were passing on the virus by engaging in high-risk sex or needle sharing.

      It had long been believed that drug-resistant strains of HIV were most likely to arise and thrive when patients took their drugs erratically. Investigators based in San Francisco found, however, that irregular drug use by individuals of low economic status, primarily the homeless, did not lead to the development of drug resistance. In fact, they found nearly twice as many drug-resistant mutations of HIV in blood samples of those who took their drugs conscientiously as in the blood of those who were noncompliant.

      In March the FDA approved enfuvirtide (Fuzeon), the first in a new class of antiretroviral medications for HIV/AIDS called fusion inhibitors, which prevent HIV from entering host cells. Fuzeon had to be given by injection and was meant for those who had used other drugs but still had evidence of active disease.

      While life-prolonging drugs were extending the lives of people who had access to therapy, the vast majority of those living with HIV were in sub-Saharan Africa, where only about 50,000 were receiving treatment. In September at a high-level UN meeting, representatives of WHO and UNAIDS announced their organizations' commitment to providing drug treatment to three million people in the less-developed countries by the end of 2005—a plan dubbed the “3 by 5 Initiative.”

      Many international public health professionals held that it would be impossible to provide AIDS drugs to people in Africa because too many were infected, the drugs were too costly, the regimens were too complicated, and there was no way to ensure compliance with therapy. Nevertheless, surveys carried out in 2002 and 2003 in Botswana, Uganda, Senegal, South Africa, and Zambia found that compliance with treatment among Africans was extremely high—higher, in fact, than among AIDS patients in developed countries. Jeffrey Stringer, working at the Center for Infectious Disease Research in Zambia, was quoted in the September 13 issue of The Lancet as saying: “High rates of antiretroviral adherence are clearly possible in African settings, and while the unique set of issues around adherence to medication in African populations should be considered carefully as we design antiretroviral treatment programs, it in no way should delay large-scale implementation.”

      Availability of low-cost, high-quality antiretroviral drugs would be crucial to the success of the “3 by 5” program. Thus, it was welcome news when the drug company GlaxoSmithKline said it would further cut the prices of its AIDS drugs for the world's poorest countries by as much as 47%. In addition, former U.S. president Bill Clinton announced that he had brokered a deal with four generic-drug companies to cut the cost of AIDS drugs for African and Caribbean countries by as much as one-half.

      Providing medications to all who needed them was an undisputed necessity, yet most AIDS experts believed that the greatest hope for reversing the pandemic would be an effective vaccine. In 2003 more than 25 potential vaccines were being tested in some 12,000 volunteers worldwide.

      In late February the California firm VaxGen Inc. announced the results of the first large-scale clinical trial of an AIDS vaccine to reach completion. The subjects were more than 5,000 North American and European volunteers; none were infected with HIV at the start of the trial, but all were at high risk for sexual exposure to the virus. Two-thirds received injections of the experimental vaccine over a period of three years, while one-third received a placebo. All participants were advised on safer sex practices.

      In the study population as a whole, the vaccine did not provide protection against HIV infection. A surprising finding requiring further study, however, was that minorities other than Hispanics who received the vaccine had 67% fewer HIV infections than minorities who received the placebo. Black vaccine recipients had 78% fewer infections than black placebo recipients.

      The results of a second large-scale VaxGen trial—conducted in Thailand—were released in November. Some 2,500 injection-drug users who were not infected with HIV at the start of the 36-week trial received either vaccine or placebo. This vaccine too failed to protect the recipients from becoming infected with HIV; furthermore, it did not slow the progression of AIDS in those who became infected. Although the overall results of these important trials were disappointing, vaccine proponents remained confident that an effective AIDS vaccine could ultimately be developed.

Bioterrorism Preparedness.
      In December 2002 U.S. Pres. George W. Bush announced a smallpox vaccination program to protect Americans in the event of a terrorist attack with the deadly virus. The plan called for immunizing about 500,000 health care workers first, then as many as 10 million emergency responders—police, firefighters, and paramedics. The CDC had estimated that 1.2 million immunized health care workers would be needed to vaccinate the entire U.S. population within 10 days of a smallpox attack.

      The program was highly controversial because there was no imminent threat of a smallpox outbreak and because the vaccine was known to carry significant risks of life-threatening complications and death. (About 450,000 members of the U.S. military were successfully vaccinated against smallpox between December 2002 and June 2003, with very few serious adverse events.) The program to vaccinate civilian health care workers got under way in January but was riddled with problems. The federal government had estimated that each vaccination would cost $13, but state and local health officials reported the actual cost to be $75–$265. Many hospital workers initially refused the vaccine because no provisions had been made to compensate people who suffered adverse reactions. By the end of March, the CDC had reports of 72 cases of heart problems among military and civilian vaccinees—notably inflammation of the heart muscle (myocarditis)—and three fatal heart attacks. (In April Congress finally approved a bill that would ensure compensation for those who experienced short-term or permanent disability or death from the vaccine.) Although the relationship between the vaccinations and the medical problems was not clear, the CDC said that persons with heart disease or major cardiac risk factors should no longer receive the vaccine. In the end, only about 38,000 civilian health care workers were immunized.

      Meanwhile, a study of Americans previously vaccinated against smallpox (before 1972, when routine vaccination was discontinued in the U.S.) found that more than 90%—even people vaccinated as far back as 1928—still had the full range of antibodies to smallpox. The results suggested that a significant proportion of middle-aged and older Americans would be protected in the event of a smallpox attack.

Cardiovascular Disease.
      For decades, anyone with blood pressure under 140/90 was considered to be in the healthy range. Recently acquired knowledge about the damage done to arteries when blood pressure was even slightly elevated, however, prompted the U.S. National Heart, Lung, and Blood Institute to issue new guidelines, according to which adults with blood-pressure levels previously considered normal (some 45 million in the U.S.) would now be in a category called prehypertension. This group included people with systolic pressure (top number) of 120–139 or diastolic pressure (bottom number) of 80–89. Those in the new category were urged to make lifestyle changes such as losing excess weight, quitting smoking, and consuming less sodium. Those with systolic readings of 140–159 or diastolic readings of 90–99 were in a category called stage 1 hypertension and in most cases would require treatment with blood-pressure-lowering medication. For those with 160/100 and higher—stage 2 hypertension—aggressive treatment with medication to lower blood pressure to at least 140/90 was strongly advised.
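The new NHLBI guidelines amount to a simple threshold rule: a reading is assigned to the highest category that either the systolic or the diastolic number reaches. A minimal sketch of that rule in Python (the function name is illustrative, not from the guidelines):

```python
def classify_bp(systolic: int, diastolic: int) -> str:
    """Classify a blood-pressure reading per the 2003 NHLBI categories
    described above; the higher of the two numbers determines the category."""
    if systolic >= 160 or diastolic >= 100:
        return "stage 2 hypertension"   # aggressive drug treatment advised
    if systolic >= 140 or diastolic >= 90:
        return "stage 1 hypertension"   # medication in most cases
    if systolic >= 120 or diastolic >= 80:
        return "prehypertension"        # lifestyle changes urged
    return "normal"

print(classify_bp(118, 76))   # normal
print(classify_bp(132, 78))   # prehypertension (systolic in 120-139)
print(classify_bp(134, 92))   # stage 1 hypertension (diastolic in 90-99)
```

Note that a reading of 134/92 is stage 1 even though the systolic number alone would be prehypertensive, which is why the checks run from the highest category down.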

      Cardiologists had long believed that about half of all heart disease was unrelated to any of the best-known risk factors: high blood pressure, high cholesterol, smoking, and diabetes. Two reports published in the Journal of the American Medical Association in August, however, found that 80–90% of people with heart disease had at least one of the four risk factors.

      By 2003 most medical scientists had come to appreciate that injury to the arteries resulting from factors such as high blood pressure, high cholesterol, and smoking triggered an inflammatory reaction. A number of biochemical markers of inflammation had been found, but the one for which the most accurate and sensitive test had been devised was C-reactive protein (CRP), a substance found in the blood and produced by the liver in response to inflammation in the body. One study of healthy women found CRP to be a better predictor of cardiovascular disease risk than low-density lipoprotein (the “bad” cholesterol).

      In January the CDC and the American Heart Association issued guidelines for physicians on when to order the CRP test (called high sensitivity CRP, or hs-CRP). The guidelines specified that hs-CRP would be useful mainly when it was unclear whether an individual would benefit from preventive treatment (lifestyle changes, medication, or both). A good candidate for the test might be a healthy person with normal blood pressure, cholesterol, and blood sugar but with a family history of heart disease. Most cardiovascular experts believed that considerable further investigation was needed before the implications of elevated CRP in the blood would be fully understood. Moreover, the guidelines emphasized that many things other than damaged arteries could cause inflammation—e.g., infection and autoimmune diseases.

Cancer.
      Results of a huge American Cancer Society study found that excess body weight significantly increased the risk of death from cancer. The study followed more than 900,000 initially cancer-free American adults for 16 years, during which time slightly more than 57,000 died from cancer. The investigators correlated the volunteers' body-mass index (weight in kilograms divided by the square of height in metres) at the time of entry into the study with the subsequent development of deadly cancers. On the basis of the findings, they estimated that “current patterns of overweight and obesity in the United States could account for 14% of all deaths from cancer in men and 20% of those in women.” The study identified several types of cancer that previously had not been associated with excess body weight: cancers of the stomach (in men), liver, pancreas, prostate, cervix, and ovary, as well as non-Hodgkin lymphoma and multiple myeloma.
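The body-mass index used in the study is a one-line calculation. A minimal sketch, with the conventional WHO cutoffs for overweight and obesity added as an assumption (this paragraph does not state them):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    # Body-mass index: weight in kilograms divided by the square of height in metres.
    return weight_kg / height_m ** 2

# Conventional cutoffs (assumed here): 25 <= BMI < 30 is overweight, BMI >= 30 is obese.
print(round(bmi(85, 1.75), 1))   # 27.8 -> overweight range
print(round(bmi(100, 1.75), 1))  # 32.7 -> obese range
```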

      Cancer treatment specialists were elated about a Canadian-led study's finding that a drug in the class known as aromatase inhibitors significantly prolonged disease-free survival of women who had had breast cancer. The standard, highly effective regimen for women with breast cancer after tumour removal was to take the drug tamoxifen, an antiestrogen, for five years. Beyond that period, however, taking tamoxifen offered no benefit. The new study, which involved more than 5,000 women in Canada, the U.S., and Europe, was stopped early when it became clear that taking the aromatase inhibitor letrozole (Femara) following a five-year course of tamoxifen significantly reduced the likelihood of developing cancer in the other breast and of having the original cancer recur or spread to other sites in the body. Consequently, it was likely that letrozole would be offered to most postmenopausal women with estrogen-receptor-positive breast cancer following tamoxifen treatment, although the optimal length of letrozole therapy was not yet known. (Estrogen stimulates the growth of such cancer cells. In postmenopausal women, androgens produced by the adrenal glands are converted to estrogens by the enzyme aromatase. Letrozole works by blocking the action of aromatase and thereby inhibiting the conversion of androgens to estrogens.)

Women's Health.
      The Women's Health Initiative (WHI) was established by the U.S. National Institutes of Health in the early 1990s as a long-term research program to address the most common causes of death and disability in postmenopausal women. In 2002 a landmark WHI clinical trial was stopped several years early when it became clear that women receiving hormone replacement therapy (HRT) had an increased risk of developing breast cancer, heart disease, stroke, and blood clots and that the risk significantly outweighed any health benefits from HRT. After the study's results were released, the number of women on HRT—i.e., taking estrogen plus progestin—plummeted from an estimated six million to three million in the U.S. alone.

      During 2003 more bad news about HRT emerged from additional analyses of the data from the WHI trial. These in-depth studies found that HRT doubled the risk of Alzheimer disease and other dementias in women who began using hormones at age 65 or older. It increased the risk of cognitive decline by a slight but clinically significant amount and increased the risk of stroke. It caused changes in breast tissue that increased the likelihood of abnormal mammograms and impaired the early detection of tumours by mammography. It increased the risk of heart disease by 81% in the first year of therapy. Moreover, HRT failed to improve women's quality of life.

Ellen Bernstein

▪ 2003

      Bioterrorism preparedness became a national priority in many countries in 2002 in the wake of the previous year's September 11 terrorist attacks and subsequent anthrax mailings in the U.S. The possibility that terrorists would use deadly pathogens as weapons underscored the need for new drugs to treat and prevent infectious diseases. The Pharmaceutical Research and Manufacturers of America reported in April that more than 100 companies, predominantly American firms, were developing 256 such medicines, which included vaccines, antibiotics, and antiviral agents. At the same time, the pharmaceutical industry was identifying existing antibiotics that could be used to counter bacterial agents, among them anthrax, tularemia, and plague, if they were used as weapons.

      By far the major focus of bioterrorism planning was on smallpox, which was eradicated from the planet in 1980 and for which routine vaccination in the U.S. ceased in 1972. Only two high-security laboratories—at the Centers for Disease Control and Prevention (CDC), Atlanta, Ga., and the State Research Center of Virology and Biotechnology, Koltsovo, Russia—were known to have live samples of the smallpox virus. Government security officials, however, had good reason to suspect that clandestine samples could be in the hands of potential terrorists.

      In late October the Food and Drug Administration (FDA) licensed the use of the U.S. government's 30-year-old stockpile of smallpox vaccine—15.4 million doses. The government also possessed 75 million doses that the French vaccine maker Aventis Pasteur discovered in its storage facilities during the year, and it ordered a further 209 million new doses from the British company Acambis, to be prepared by means of modern cell-culture techniques.

      Securing an ample vaccine supply to protect the entire U.S. population proved easier than determining who should be vaccinated, especially because smallpox vaccine has significant risks. For every million persons vaccinated, hundreds would be likely to develop severe rashes or other non-life-threatening illnesses, 15 would likely have life-threatening complications, and 1 or 2 would die. Furthermore, for every million receiving the vaccine, the live vaccinia virus from which the vaccine is made could spread by contact to as many as 27 others who had not been vaccinated and who then would be at risk for various adverse effects. Because of these risks a federal advisory panel on immunizations specified that certain groups should not be vaccinated against smallpox. They included people with current or past eczema, atopic dermatitis, or similar skin diseases, as well as people living with someone who has such a skin disease; people with HIV; people with impaired immunity; pregnant women; and women trying to become pregnant.

      On December 13 U.S. Pres. George Bush announced his long-awaited smallpox-vaccination plan. Its first phase called for about 500,000 military and other personnel serving in high-risk areas to be immunized immediately. In addition, civilian health-care and emergency workers who would be likely to come in contact with the initial victims of a smallpox attack on the U.S. would be asked to volunteer for immunizations. Subsequently the vaccine would be offered to more traditional first responders such as fire, police, and emergency medical service personnel. At the time, Bush recommended against vaccination for the general public. (Well before the December announcement, more than 15,000 soldiers and health-care workers in Israel had received smallpox vaccine on a voluntary basis, with relatively few adverse effects.)

      In July American scientists reported having successfully created a poliovirus from scratch—that is, from only its genome sequence, which was available in the public domain, and genetic material provided by a scientific mail-order supplier. J. Craig Venter, one of the geneticists instrumental in the sequencing of the human genome—an accomplishment announced in 2000—called the work, which had been financed in part by the Pentagon, “inflammatory without scientific justification” and “irresponsible.” The relative ease with which the experiment was completed led many scientists to wonder whether other, potentially more lethal viruses such as smallpox or Ebola virus could also be synthesized.

Infectious Diseases.
      The dreaded Ebola hemorrhagic fever, called one of the “most virulent viral diseases known to humankind,” struck Gabon in late 2001 and quickly spread to neighbouring villages in the Republic of the Congo. By March 2002 about 100 persons had been infected, and 80% of them had died. The speedy arrival of international health teams helped curtail the outbreak and undoubtedly saved many lives. In May the U.S. National Institutes of Health (NIH) contracted with Crucell, a small Dutch biotechnology company, to develop the first human vaccine against Ebola hemorrhagic fever; the collaborators hoped to have a product ready to test in humans within two years.

      An alarming rise in the number of cases of gonorrhea resistant to the first-line drugs used to treat the sexually transmitted disease (STD) was seen in California. Strains of Neisseria gonorrhoeae resistant to antibiotics known as fluoroquinolones had migrated from East Asia to Hawaii and then to California. In response, the state issued new guidelines for treating gonorrhea, specifying that another drug group, cephalosporins, should replace fluoroquinolones. Late in the year two new vaccines against STDs were reported to be highly effective—one against human papillomavirus type 16, which is responsible for half of all cervical cancers, and the other against genital herpes (herpes simplex viruses types 1 and 2) in women. Neither vaccine would be on the market until considerable further testing was completed.

      The mosquitoborne disease West Nile virus (WNV) made its fourth annual late-summer appearance in the U.S., striking with a vengeance. As of mid-December, 3,829 human cases had been reported in 39 states and the District of Columbia, with 225 deaths. The virus was found in 29 species of mosquitoes, at least 120 species of birds, and many mammals, including squirrels, dogs, horses, mules, goats, and rabbits. A number of exotic species housed in zoos had also been infected, including penguins, cormorants, and flamingos. During the year evidence emerged that WNV could be transmitted between humans via blood transfusion and organ transplantation and possibly by infected mothers to infants through breast milk.

      The U.S. National Institute of Allergy and Infectious Diseases continued to sponsor research on several potential WNV vaccines, with hopes that one might be ready for trials in 2003. The FDA was developing a blood-screening process for WNV, which could be in use by mid-2003.

      Following the first outbreak of WNV in the New York City area in the summer of 1999, health authorities in Canada had begun to plan for its possible arrival in that country. In the summer of 2001, WNV was confirmed in mosquitoes and birds in southern Ontario. The first human cases occurred in 2002; from August through October there were 79 probable cases and 31 confirmed cases.

      Some 17,000 participants from 124 countries gathered in Barcelona, Spain, in July for the 14th—and largest—International AIDS Conference. Twenty-one years after the first cases of a new deadly disease were diagnosed in the U.S., the AIDS pandemic had become one of the most virulent scourges in human history. Worldwide, 40 million people were infected with HIV, and new infections were occurring at a rate of 15,000 a day. The lethal virus had taken 20 million lives and created at least 14 million “AIDS orphans,” defined as children under age 15 who had lost one or both parents to AIDS. In seven countries in sub-Saharan Africa, more than 20% of adults were infected with HIV, and life expectancy had been reduced to less than 40 years.

      A major report prepared by a team of public health experts, clinicians, research scientists, and people affected by HIV/AIDS was released just prior to the conference and largely set the tone of the weeklong meeting. Entitled “Global Mobilization for HIV Prevention: A Blueprint for Action,” it argued that massive expansion of the HIV/AIDS epidemic was not inevitable. Rather, if significantly scaled-up and appropriately targeted prevention efforts were initiated without delay, they could reverse the course of the pandemic by 2010 and prevent about 28 million new infections. According to the report, despite the “immense resources” at the global community's disposal, prevention efforts were reaching fewer than 20% of those at risk. Cited were dozens of examples of prevention strategies, such as school sex education and programs to increase condom use, that had curbed the spread of HIV in high-risk groups. Many of the successes were in less-developed countries.

      The report's view that prevention and treatment were “natural partners in the global fight against HIV/AIDS” was echoed by the World Health Organization (WHO), which acknowledged that the battle against AIDS would never be won as long as drugs remained unavailable to nearly six million HIV-infected people in less-developed countries. WHO took several important steps toward changing that situation. For the first time, it issued guidelines on the various combinations of three drugs—so-called AIDS drug cocktails—that were known to work best, and it stressed that they should be made available to people in poor countries. It also outlined the minimal acceptable laboratory tests both for diagnosing HIV infection and for monitoring treatment. Furthermore, WHO added a dozen antiretroviral drugs to its essential-drugs list in an effort to encourage generic companies to increase their output of inexpensive effective drugs for treating HIV infection.

      An alarming report released in September by the National Intelligence Council, an advisory group for the U.S. Central Intelligence Agency, predicted that the growth of AIDS in five countries—India, China, Nigeria, Russia, and Ethiopia—would pose economic, social, and political security threats to the respective regions as well as to the U.S. HIV epidemics in each of the countries were in their infancy but were poised for explosion. The report estimated that by 2010 the number of cases in those five countries, which together represented 40% of the world's population, would be 50 million to 75 million.

      On the clinical front, there was considerable excitement about a new class of antiretroviral drugs called fusion inhibitors, which act by preventing HIV's entry into host cells. (The other classes of antiretroviral drugs act by preventing replication of HIV after entry.) Trials in the U.S., Europe, Australia, and South America, involving people whose HIV infections were partially or wholly resistant to existing drugs, were focused on an experimental drug called T-20 (or enfuvirtide), which would be marketed under the trade name Fuzeon. Study participants who received T-20 in combination with customized AIDS drug cocktails experienced significant reductions in the amount of virus in their systems as well as increases in their healthy immune cells. It was expected that the FDA would approve the drug by early 2003.

Cardiovascular Disease.
      In the hundreds of thousands of balloon angioplasty procedures performed each year to open blocked coronary arteries, it was common practice to place tiny mesh tubes, called stents, in the treated artery to help keep it open. In up to 20% of cases, however, scar tissue formed at the stent site, causing reblockage (restenosis). During the year, investigators reported promising results from trials that had tested the ability of stents coated with an immunosuppressive drug to inhibit restenosis. The coated stents prevented renarrowing of the artery in 96–100% of recipients. They were expected to receive FDA approval and be available in the U.S. in 2003.

      Another approach to staving off restenosis after angioplasty was investigated by Swiss and American researchers. Previous studies had shown that high blood levels of the amino acid homocysteine were highly predictive of restenosis following angioplasty. It was also known that a group of B vitamins lowered homocysteine levels. Accordingly, the researchers gave patients who had undergone angioplasty a combination of the B vitamin folic acid and vitamins B12 and B6, in dosages considerably higher than those in standard multivitamins, for a period of six months. Compared with angioplasty patients who were not given the vitamin regimen, those receiving vitamins had a significantly decreased incidence of restenosis and other adverse cardiac events—outcomes that lasted well beyond the time they took the vitamins.

Cancer.
      In mid-August the FDA approved the drug oxaliplatin (Eloxatin) for patients with advanced colon cancer that had failed to respond to existing drugs. The approval, which occurred in the record time of seven weeks, was based on a trial that found that oxaliplatin used in conjunction with two other chemotherapeutic drugs, 5-fluorouracil and leucovorin, shrank tumours by at least 30% in about 9% of patients and prevented tumours from growing again for several months. At the time oxaliplatin was approved in the U.S., it was already in use in more than 55 countries.

      Cancer death rates for African Americans, compared with those for whites, had been disproportionately high ever since statistics on cancer were first collected. Some scientists thought the difference had a biological basis. In 2002 a team of researchers published a review of data on nearly 190,000 whites and 32,000 blacks with 14 different types of cancer. Rather than identifying any biological differences between the two groups, the review found that blacks received less-optimal care than whites and were generally diagnosed at a later, less-curable stage of the disease. The researchers believed that it was time to abandon the biological trail and focus on remedying the underlying socioeconomic causes of elevated cancer mortality among blacks.

Obesity.
      The latest data from an ongoing government-sponsored survey of the health and nutrition of the U.S. population indicated that nearly 65% of American adults were overweight and more than 30% were obese. The most disquieting finding was that more than 80% of all black women over age 40 were overweight and half were obese. In a separate report focusing on children and adolescents, 15% of those aged 6–19 were overweight, with the highest prevalences in Mexican American and black adolescents.

      CDC researchers published the disturbing results of a 20-year study that analyzed hospital-discharge records of children. They found that overweight children were increasingly being diagnosed with illnesses formerly seen mainly in overweight or obese adults. These included type II (non-insulin-dependent) diabetes, gallbladder disease, and sleep apnea. Although the overall numbers of children with these serious conditions remained relatively low, the increases over the period 1979–99 were striking. For example, the diagnosis of gallbladder disease in 6–17-year-olds rose 228%.

      A report on obesity among children worldwide by the London-based International Obesity Task Force was presented in May at the annual meeting of the World Health Assembly, WHO's decision-making body. The task force estimated that 22 million children under age five were overweight or obese. Among 10-year-olds, the U.S. had the third highest prevalence of overweight children, after Malta and Italy. Much to the surprise of many health professionals, obesity was found to be a growing problem in less-developed countries. In Morocco and Zambia, for example, more children were overweight than malnourished. In Egypt, Chile, Mexico, and Peru, as many as 25% of children aged 4–10 were overweight or obese.

      Two hormones associated with appetite and weight gain were identified during the year. One appeared to stimulate appetite and the other to suppress it. Ghrelin, a hormone secreted by cells in the stomach and small intestine, was shown to increase hunger, slow metabolism, and decrease the body's fat-burning capacity. Researchers found that people who had lost significant weight produced large quantities of ghrelin, which helped explain why maintaining weight loss was so difficult. On the other hand, extremely obese people who had undergone gastric bypass surgery, which reduces the size of the stomach as well as the ability of the small intestine to absorb nutrients, had low levels of ghrelin and decreased appetites. This finding helped explain why those who had received the surgery tended to be successful at keeping weight off. Whether these findings would lead to new treatments for obesity, such as a drug that turns off ghrelin production, remained unclear.

      Scientists had known about the substance called peptide YY3-36 for years but did not know what role it played in controlling appetite. Recently they found that the hormone was directly linked to the feeling of fullness that tells a person to stop eating. When it was given to study subjects two hours before a buffet meal, they consumed about 33% fewer calories than they did when they were not given the hormone. The appetite suppression lasted about 12 hours. Even after the hormone's effects had worn off, subjects did not overeat to make up for their reduced caloric intake. Although further research was needed, obesity specialists were enthusiastic about the possibility of using the hormone to help people lose weight. It appeared to have no adverse effects and was relatively easy and inexpensive to synthesize.

Women's Health.
      The medical story that probably received the most attention during the year was the discontinuation of a major study of postmenopausal hormone replacement therapy (HRT) three years earlier than planned. The study was part of the Women's Health Initiative (WHI), a long-term project to study diseases that affect women. It involved more than 16,000 healthy women between the ages of 50 and 79 who took either estrogen plus progestin or a placebo. When it became clear a little over five years into the study that women taking the hormones were developing breast cancer as well as heart disease, stroke, and blood clots more often than those taking the placebo, the investigators decided that the risks of HRT exceeded any health benefits.

      The news about these previously unknown risks was a source of great concern not only for the millions of women on HRT but also for the doctors who had been enthusiastically prescribing it. Its wide use had been encouraged by long-term observational studies of large groups of women, the results of which had suggested multiple benefits. HRT not only eased the hot flushes, night sweats, and vaginal dryness of menopause but also appeared to lower the risk of osteoporosis, heart disease, Alzheimer disease, incontinence, and even depression. In speculating on how doctors and patients drew false assurance from these observations, surgeon and breast cancer specialist Susan Love, in an op-ed article in the New York Times (July 16), wrote that “medical practice … got ahead of medical science” and that although the observations of HRT's benefits led to hypotheses, “observation … can't prove cause and effect.” Only a large randomized placebo-controlled study could do that.

      In October the U.S. National Institutes of Health (NIH) convened a meeting at which experts offered guidance to clinicians on key HRT questions. On the whole, they agreed that no healthy woman should take HRT to prevent heart disease or other chronic conditions. For women using hormones to prevent osteoporosis, there were better options, such as calcium and vitamin D supplements, weight-bearing exercise, and the nonhormonal prescription drugs alendronate (Fosamax) and raloxifene (Evista). For women suffering from acute menopausal symptoms, alternatives should be considered first, but for some, HRT might be appropriate at the lowest possible dosage for the shortest possible time.

Ellen Bernstein

▪ 2002

      The medical response to the havoc wreaked by the terrorist attacks of September 11, in which four hijacked jetliners crashed, was massive and rapid at all three impact sites: Lower Manhattan, the Pentagon in Virginia, and rural Shanksville, Pa. It was in New York City, however, that the need for an unprecedented level of trauma care seemed likely, at least at first. A few hours after the World Trade Center's twin towers collapsed, five designated city hospitals were prepared for the worst. Triage centres were set up within a few blocks of “ground zero,” fully staffed and equipped to treat any possible injury and perform lifesaving surgery. To be sure, about 600 people were treated on September 11, about 150 of whom were critically injured, but as the day wore on, the numbers of new patients dwindled, and the anticipated deluge never materialized.

      It soon became obvious that far more people had perished than had survived with injuries. It was the rescue crews, not medical personnel, who had their work cut out for them—digging through the rubble day and night in a mostly vain search for the still living, at significant risk to themselves. In fact, the need was greater for specially trained rescue dogs than for doctors to aid in the on-site search and recovery.

      Fears of bioterrorism in the wake of the September 11 terrorist attacks led the U.S. government to evaluate its supply of vaccines against anthrax and smallpox. The available anthrax vaccine was of questionable potency and had safety risks. Although new vaccines were in development, none was available in early October, when a smattering of anonymous letters carrying spores of Bacillus anthracis began arriving in the mailboxes of broadcast and print media on the East Coast and federal offices in Washington, D.C. Dissemination of the spores as the letters were processed through postal machinery and handled at their destinations was believed responsible for nearly 20 confirmed cases of cutaneous and inhalation (pulmonary) anthrax and several deaths from the rapidly fatal inhalation form.

      Because anthrax was preventable and treatable with antibiotics, the U.S. government's strategy was not to vaccinate but to treat everyone who may have been exposed to the bacterium with the antibiotic ciprofloxacin (Cipro). The Food and Drug Administration (FDA) took action to approve two other widely available generic antibiotics, doxycycline and penicillin, for treatment of inhalation anthrax in the event of a large-scale terrorist attack. Anthrax could not be spread by infected individuals, which rendered many of the usual communicable-disease-prevention measures unnecessary. Various actions, including widespread testing of suspected locations for the presence of spores and decontamination of spore-tainted buildings, offices, and mail-sorting equipment, were taken in an attempt to limit further dispersal. Mail from contaminated postal facilities was impounded for several weeks until it could be sanitized by irradiation and returned to the mail stream for delivery. Government authorities also moved to install equipment in post offices that would kill anthrax spores during regular mail processing.

      Smallpox, unlike anthrax, was highly contagious, and an estimated 80% of the U.S. population was thought to be susceptible. The devastating viral disease was effectively eradicated from the world in 1977, but samples of the virus still existed and could get into the hands of terrorists. Consequently, the federal government sought to increase its relatively meagre supply of vaccine, 15.4 million doses. Medical scientists at several universities were exploring the possibility of diluting the existing supply to increase the number of doses. At the same time, the government arranged to acquire new smallpox vaccine from several pharmaceutical companies—up to the 300 million doses needed to protect everyone in the U.S.

Stem Cell Research and Human Cloning.
      Although the tragedy of September 11 and the threat of bioterrorism overshadowed so many events in the world, during the year there were myriad noteworthy developments in health and disease. The field that probably generated the most excitement, and the most heated political debate, was research on human stem cells. Stem cells were described as “unspecialized,” “primordial,” and “pluripotent” cells that could be coaxed to become specific kinds of cells—e.g., of skin, cartilage, muscle, cornea, brain, heart, pancreas, or liver. The ideal source of these cells was considered to be a five-day-old human embryo, comprising 200–250 cells. (Stem cells were also available from adults, but they appeared to have less promise than embryonic stem cells.)

      A long-awaited pronouncement on the future of embryonic stem cell research in the U.S. came on August 9. In a television address Pres. George W. Bush stated that he would allow federal support of such research, but only on cell lines that already existed and had been derived from “leftover” embryos grown in infertility clinics. This restriction, according to President Bush, would permit research “without crossing a fundamental moral line by providing taxpayer funding that would sanction or encourage further destruction of human embryos that have at least the potential for life.” Many research scientists considered the decision severely limiting, and in September a committee of the Institute of Medicine (IOM), a part of the U.S. National Academies, issued a report concluding that new cell lines would still be needed down the road, in part because the existing lines would likely accumulate harmful genetic mutations over time.

      In November a private Massachusetts biotechnology firm, Advanced Cell Technology, provoked much sound and fury when it announced that it had taken the first steps toward cloning human embryos. According to the company, the goal was not to clone a human being but to produce stem cells for treating disease. In fact, most of the embryos died before reaching even an eight-cell stage, without producing the desired stem cells. President Bush, religious and political leaders, and many scientists condemned the work as immoral and a dangerous move in the wrong direction.

Infectious Disease.
      The World Health Organization's (WHO) Communicable Disease Surveillance and Response service, which tracked major infectious diseases worldwide, reported a number of major outbreaks. They included cholera in West Africa, Chad, Tanzania, South Africa, Pakistan, and India; Ebola hemorrhagic fever in Uganda; measles in South Korea; yellow fever in Brazil, Peru, Côte d'Ivoire, Liberia, and Guinea; plague in Zambia; dengue fever in Venezuela; meningococcal disease in Angola, Ethiopia, Democratic Republic of the Congo, and the “African meningitis belt,” an area that extended across the middle of the continent and included all or part of at least 15 countries between Senegal and Ethiopia; Crimean-Congo hemorrhagic fever in Pakistan and the Kosovo province of Yugoslavia; legionellosis in Spain and Norway; and an illness described as a “highly lethal variant of measles” in India.

      One of the greatest scourges of all time, poliomyelitis, came closer to being a thing of the past, thanks to a massive global eradication effort coordinated by WHO, UNICEF, Rotary International, and the U.S. Centers for Disease Control and Prevention (CDC). From 1999 to 2000 the number of polio cases in the world was cut in half to 3,500, and the number of endemic countries (those with naturally occurring poliovirus) dropped from 50 to 20. As of mid-2001, India, which once bore the world's greatest polio burden, had only 10 confirmed cases. The target date for global eradication was 2005, but completion of the task would require an all-out vaccination effort in Southeast Asia, the eastern Mediterranean, and Africa, at a cost of $400 million.

      Although childhood vaccines had saved millions of youngsters the world over from infectious disease, deformity, and death, their safety continued to be a source of controversy. Studies published during the year demonstrated that some alleged risks of vaccine use were not real. Combination vaccines against diphtheria, pertussis, and tetanus (DPT) and measles, mumps, and rubella (MMR) were shown not to be associated with long-term risks of seizures or other neurological problems in children. Furthermore, no evidence was found that hepatitis B vaccine caused or aggravated multiple sclerosis. Public health professionals hoped these and other “negative results” would alleviate some of the public's fears.

      The year 2001 was the 20th anniversary of the initial reports of a mysterious deadly immune-system disorder that came to be known as AIDS. The medical community, international AIDS organizations, and especially the media saw the occasion as a time to reflect upon the relentless epidemic that had killed more than 21 million people on every continent and from every walk of life. In 2001 an estimated 36 million people were living with HIV infection.

      The long-held hope for an AIDS vaccine continued to be pursued. Although as many as 80 potential vaccines had been tried in humans, only one had reached large-scale human trials. About 8,000 volunteers at high risk for HIV in North America, The Netherlands, and Thailand had received either an experimental preventive vaccine developed by the California-based firm VaxGen or a placebo. Periodically they were being tested for HIV. The trials would continue until 2002–03.

      At the 8th Conference on Retroviruses and Opportunistic Infections, held in Chicago in February, HIV/AIDS treatment specialists issued an urgent call for newer and safer drugs and pointed out that the highly lauded combination-drug therapies, also known as AIDS “drug cocktails,” were not working for thousands of patients. Clinicians reported a range of adverse effects associated with the life-prolonging drugs, including high cholesterol, diabetes, fat accumulations in the neck and abdomen, weakened bones, and nerve damage in the extremities. Among the many experimental drugs that were described at the conference, perhaps most promising was a new class called entry inhibitors, which blocked the binding of HIV to key receptors on the cell surface.

      Excitement about new treatments, however, had little relevance for the millions of people in less-developed countries living with HIV, many of whom had no access to treatment. The high cost of existing drugs and their unavailability to the vast majority of HIV/AIDS sufferers had aroused considerable ire among government officials and others trying to combat AIDS in less-developed countries. To make treatment more accessible, a handful of pharmaceutical companies in India, Thailand, and other countries began producing cheaper generic versions of the patented agents used in drug cocktails, a move vigorously opposed by the multinational companies holding the patents. As sentiments against the drug giants mounted, however, several bowed to the pressure and slashed their prices on AIDS drugs for less-developed countries, and a few waived their patent rights. Some 39 major companies that manufactured AIDS drugs had sued South Africa in 1998 in an effort to bar the country from importing cheaper drugs. In April 2001 the companies dropped their case.

      UN Secretary-General Kofi Annan called the battle against AIDS one of his personal priorities when he initiated a global fund to allot between $7 billion and $10 billion annually to combat a trio of diseases that continued to ravage the Third World—AIDS, tuberculosis, and malaria. Addressing the delegates to the first UN summit on AIDS, held in New York City in June, Annan said, “This year we have seen a turning point. AIDS can no longer do its deadly work in the dark. The world has started to wake up.”

      China was one country that “woke up” to its AIDS crisis. In August its deputy health minister, Yin Dakui, admitted that the country was “facing a very serious epidemic of HIV/AIDS” and that the government had “not effectively stemmed the epidemic.” An estimated 70% of China's cases were among intravenous drug users. The Chinese government claimed that about 600,000 citizens were infected with HIV, whereas the UN estimated the number at more than one million.

      In the U.S. the incidence of new HIV infections among homosexual African American men aged 23 to 29 was called “explosive.” CDC surveys found that 30% of men in this group were HIV-positive.

      Rarely do research scientists become unreservedly exuberant over a new treatment. Nevertheless, this was the overwhelming sentiment among cancer specialists about a new drug, imatinib (marketed as Gleevec in the U.S. and Glivec in Europe). Imatinib was one of a new class of anticancer agents known as growth-factor inhibitors, which targeted cancer cells by recognizing their unique molecular defects. The FDA approved imatinib in record time after tests showed that it had induced remissions in 53 of 54 patients with chronic myelogenous leukemia (CML). Less than a month after publication of the CML results, scientists reported that 60% of nearly 200 patients with gastrointestinal stromal cancer (GIST) treated with imatinib had become symptom-free. GIST was a rare intestinal malignancy for which there had been no known treatment.

      An IOM report issued in June put some of the fanfare about new cancer treatments in perspective. “The reality is that half of all patients diagnosed with cancer will die of their disease within a few years,” the report stated. The expert panel that prepared the report was highly critical of the “almost single-minded focus on attempts to cure every patient at every stage of disease.” It found that at least half of dying cancer patients suffered symptoms for which they received little or no treatment; these included pain, difficulty breathing, emotional distress, nausea, and confusion. The report called for a vastly stepped-up program to ensure that suffering cancer patients received palliative (symptom-abating) treatments.

      Diabetes was fast becoming one of the most worrisome epidemics of the 21st century. In 2001 more than 135 million people worldwide were affected, and the number was expected to reach 300 million by 2025. The vast majority had type 2, or non-insulin-dependent, diabetes. With globalization, less-developed countries were experiencing some of the steepest increases. A survey published in September indicated that during the decade of the 1990s the proportion of Americans with diabetes increased 49%. Duly alarmed, CDC Director Jeffrey Koplan said, “If we continue on this course for the next decade, the public health implications in terms of both disease and health care costs will be staggering.”

      As a counterpoint to these dire predictions, a study carried out in Finland found that overweight middle-aged women and men who increased their activity level and ate a low-fat, high-fibre diet were unlikely to develop diabetes, even if their weight loss was minimal. In August a similar study in the U.S. was cut short when it became clear that lifestyle changes were overwhelmingly effective at staving off diabetes in those at high risk.

      Three studies reported during the year showed that a common class of drugs for high blood pressure, angiotensin II receptor blockers, could significantly delay the otherwise inexorable deterioration of the kidneys in people with diabetes. Commenting on these results, one of the investigators said, “For pennies … we can prevent a lot of disease and ultimately save billions of dollars in treatment.”

      A novel antidiabetes drug, nateglinide (Starlix), which became available in a number of countries, offered a new option for people with poorly controlled blood sugar. Studies found that when taken just before a meal, nateglinide triggered an immediate release of insulin by the pancreas. The insulin prevented spikes in postmeal glucose levels; such spikes were associated with blood vessel damage.

Cardiovascular Disease.
      Balloon angioplasty was among the most frequently performed procedures for restoring blood flow to partially obstructed coronary arteries. In 90% of angioplasties, after a catheter-delivered balloon had been inflated to widen the artery, a tiny mesh tube (stent) was inserted to help keep the artery open. In as many as 20% of cases, however, the artery renarrowed at the site of treatment within six months as a result of scar formation, a process called restenosis. During the year an experimental technique for preventing restenosis after angioplasty was hailed as a “major breakthrough” by the American Heart Association. The approach used stents that were coated with an antibiotic and designed to release the medication slowly over a one-month period to prevent local scar-tissue formation. In a European trial more than 100 patients who received antibiotic-coated stents had no incidence of restenosis seven months after angioplasty.

      In July the first of a new type of artificial heart, developed by the Massachusetts-based firm Abiomed, Inc., was implanted in Robert Tools, aged 58, at Jewish Hospital in Louisville, Ky. Tools had diabetes and end-stage heart disease and was far too sick to be considered for a heart transplant. After removing most of his diseased heart, the surgical team attached the grapefruit-sized device, made mostly of titanium and plastic, to the remains of the two upper heart chambers and aorta. A battery pack worn outside the body transmitted power to the implanted device with no skin penetration. By contrast, the first artificial heart, the Jarvik-7, which had been implanted in a few deathly ill patients in the early 1980s, had tubes leading from an internal pump to cumbersome external compressors and consoles. Tools's recovery exceeded his surgeons' expectations over the first four months, but in November his condition worsened, and he died from severe abdominal bleeding associated with his preimplant illness. Four subsequent patients successfully received artificial hearts during the year. The goal of these first implants was to enable the severely ill recipients to live an extra six months with a satisfactory quality of life. Abiomed expressed the hope that later generations of its device would be suitable for a broader group of patients, who would gain five or more years of life.

Alzheimer Disease.
      The Alzheimer's Association estimated that about 4 million people in the U.S. had Alzheimer disease (AD) at the start of the 21st century and predicted that by 2050 the number would jump to 14 million. WHO estimated that there were 37 million people worldwide with dementia, the large majority of whom had AD. As of 2001, there was still no cure or treatment that could significantly halt the progression of the disease.

      During the year about three dozen clinical trials were either under way or in the recruitment stage to test potential AD treatments. At the University of California, San Diego, medical researchers began testing the first gene therapy procedure for AD. Their first volunteer was a 60-year-old woman with early-stage disease. Initially, skin cells were taken from the woman and genetically modified to produce large amounts of nerve growth factor. Then, in an 11-hour operation, neurosurgeons implanted the cells into diseased tissue in her brain. The primary goal was to see if the treatment was safe. The researchers hoped that ultimately the therapy would prevent the death of specific nerve cells that are affected by AD and enhance the function of others, which would thereby delay the onset of major symptoms.

      An optimistic report published in the March 13 Proceedings of the National Academy of Sciences found that people who were physically and mentally active in early adulthood and middle age had an excellent chance of avoiding AD. Similar findings were emerging from a unique ongoing investigation known as the Nun Study. (See Sidebar: Alzheimer Disease: Clues from Convents.)

Ellen Bernstein

▪ 2001

      There are very few years in which a single achievement in medicine overshadows all others. Such a year was 2000, however, and such an achievement was the sequencing of the entire human genome.

      At a June 26 White House ceremony marking the occasion, Francis Collins (see Biographies: Collins, Francis), who led the international, publicly funded Human Genome Project, said, “This is a milestone in biology unlike any other.” J. Craig Venter, head of Celera Genomics, a private company that entered the genome race in 1998, looked ahead: “It's my belief that the basic knowledge that we are providing the world will have a profound impact on the human condition.” Whether one considered the sequencing of the genome to be the end of a colossal project or the beginning of a new science of human beings, there was no question that it would revolutionize medicine. (See Life Sciences Special Report: Human Genome Project: Road Map for Science and Medicine.)

Infectious Diseases.
      Across the globe there were outbreaks of old and new infectious diseases. They included Ebola hemorrhagic fever in Uganda; cholera in at least 15 African countries, Afghanistan, and Micronesia; dengue fever in Paraguay; leptospirosis in Canada and France; yellow fever in Liberia; measles in Ireland; Legionnaires' disease in Australia; polio in China; variant Creutzfeldt-Jakob disease in France, the U.K., and Ireland; and hantavirus in Panama.

      Malaria, long a scourge of the tropical world, was increasing at a rate of about 130 million new cases a year. Some 90% of cases were in Africa, where in the late 1990s close to one million children were dying of the mosquitoborne disease annually. In April the World Health Organization (WHO) convened the first sub-Saharan African summit on malaria, for which leading health economists prepared an eye-opening report on the true costs of the disease. The authors calculated not only the direct medical costs and short-term losses of economic growth and productivity but also the devastating longer-run losses to tourism, foreign investment, and commerce, and they factored in the social and emotional costs of pain and suffering. Their analysis showed that controlling malaria in Africa would save “in the dozens of billions of dollars per year” in a matter of just a few years. The summit ended with a pledge of nearly $750 million in extra funds to fight the disease. The outpouring of cash, which came from the World Bank and several wealthy countries, was earmarked for the already established Roll Back Malaria program, which had the ambitious goal of cutting the incidence of malaria in Africa in half by 2010.

      West Nile virus, another mosquitoborne threat, made a comeback in the northeastern U.S. in mid-2000, after having first appeared in the Western Hemisphere a year earlier. The virus normally circulates between birds and mosquitoes and is capable of infecting humans and other mammals. At its most virulent, the virus causes inflammation of the brain and spinal cord (meningoencephalitis) and death. In the 1999 outbreak, 62 people were infected and 5 died, all in the New York City area. The sweep of the 2000 outbreak was broader—infected birds and mosquitoes were found in New York, Connecticut, New Jersey, Massachusetts, and Maryland—but the toll on humans was comparatively mild. Twelve people were hospitalized with serious nervous system infections, and one person died.

      A far more significant West Nile virus outbreak occurred in Israel, where the virus was in familiar territory. In late September Israeli health authorities declared an epidemic when it appeared that thousands of people were suffering from symptoms of the disease and at least 12 had died.

Antimicrobial Drug Resistance.
      For many years disease authorities around the world had been warning that antimicrobial drugs employed to treat common infections were becoming increasingly ineffectual, which was allowing the comeback of previously conquered diseases and the emergence of virulent new infections. A WHO report issued during the year documented the extent to which infectious diseases, including malaria, tuberculosis (TB), AIDS, pneumonia, and diarrheal diseases, were “arrayed in the increasingly impenetrable armour of antimicrobial resistance.” It noted that in less-developed countries antibiotics and other antimicrobial agents tended to be underused or misused but that in developed countries they were notoriously overused. The report recommended that access to these drugs be widened to include the world's poorest people, but at the same time it stressed that antibiotics should be reserved “to treat only those diseases for which they are specifically required.”

      On a positive note, a Centers for Disease Control and Prevention (CDC) survey showed that in the late 1990s American doctors were writing 34% fewer prescriptions for antibiotics for children than they had at the beginning of the decade. This finding suggested that physicians were getting the message that antibiotics are not effective for colds and other viral illnesses and that inappropriate use promotes resistant bacteria.

      One tactic in the battle against antimicrobial resistance was investment in the development of new antibiotics. In April the U.S. Food and Drug Administration (FDA) approved a long-awaited drug, linezolid (Zyvox), the first in a new class of antibiotics, the oxazolidinones. Zyvox was designed to block bacterial protein synthesis at an unusually early stage. The FDA specifically approved the drug for use in adults with severe hospital-acquired infections. Welcomed as it was, Zyvox was not a magic bullet; even before it came on the market, physicians had encountered at least 15 cases of infection resistant to it.

      In order to surmount the growing problem of multidrug-resistant TB, an alliance of researchers and drug companies announced plans to accelerate development of fast-acting TB drugs. Standard TB drugs must be taken for six to nine months to eradicate the infection. Many patients, however, were failing to take the complete course, and the TB organisms were thus allowed to survive and grow resistant to available medications. Having drugs that could wipe out the infection in a shorter period would be a huge boon to a world in which more than 5,000 people died from TB each day and as many as eight million were newly infected each year.

Vaccine Developments.
      An immunization against herpes simplex 2 (genital herpes) was tested in medical centres in the U.S., Canada, Australia, New Zealand, Italy, and the U.K. To the surprise of the investigators, it was highly effective in women but not in men. The trials involved couples in which one member had herpes and the other did not. Experts said that once the herpes vaccine was on the market, it would have the greatest impact if it was given to prepubescent girls.

      The first vaccine against the varicella-zoster virus, which causes chicken pox and shingles, was approved in 1995 and subsequently was administered to more than 10 million American children. In 2000, researchers studying children in Los Angeles County reported that chicken pox cases had fallen 80% between 1995 and 1999. The vaccine protected not only those children who received it but many children who did not—a phenomenon known as herd immunity.

      In 2000 Alzheimer disease affected about 12 million people in the world. That number could reach 22 million by 2025 unless effective means of prevention or cure were found. Scientists began the first human trials of a vaccine intended to prevent the accumulation in the brain of amyloid plaques, a hallmark of the disease.

      In July more than 12,000 attendees gathered in Durban, S.Af., for the 13th International AIDS Conference. The setting could not have been more poignant—70% of the world's 34 million AIDS cases were in sub-Saharan Africa, where life expectancy would be reduced to about 30 years by the year 2010 unless dramatic steps were taken. Prior to the start of the conference, 5,228 physicians and scientists from 84 countries signed a manifesto called the Durban Declaration. Its message was that “the evidence that AIDS is caused by HIV-1 or HIV-2 is clear-cut, exhaustive and unambiguous.” The declaration was issued in anticipation of the opening remarks of South African Pres. Thabo Mbeki, who had previously expressed doubts about whether HIV was the cause of AIDS. At the conference he questioned whether Western treatments were appropriate for African AIDS. “We are just trying to find solutions that are situated to South Africa, the southern Africa region, and the continent as a whole,” Mbeki told the delegates. The closing speech was delivered by South Africa's former president Nelson Mandela, who urged the delegates to rise above their differences and not be distracted from the main task—that is, stepping up efforts to stop the spread of HIV.

      Sobering statistics indicated that HIV infections and AIDS were spreading rapidly in Eastern Europe, the Caribbean, China, and India. In the U.S., public health practitioners were alarmed by a surge in new cases among homosexual men in San Francisco. That rise was attributed to complacency, brought about in part by the availability of effective treatments. A global HIV/AIDS surveillance report issued by WHO and the UNAIDS program at the end of 2000 indicated that for the first time the incidence of new infections in sub-Saharan Africa had stabilized rather than increased. That good news, however, was offset by the increase in the number of people in the region suffering and dying from AIDS. The same report estimated that the number of AIDS deaths worldwide since the beginning of the pandemic (in the early 1980s) was 21.8 million.

      On the clinical front, a study reported in the journal Nature found that some people with HIV who began highly aggressive antiretroviral therapy very soon after their diagnosis could take a “holiday” from the drugs. Although their viral levels rose with the cessation of the drugs, their immune systems seemed to keep severe illnesses at bay. This suggested that in the future people with AIDS would be able to have “structured treatment interruptions” from complicated and expensive drug regimens.

      British scientists started trials of a vaccine against the strain of HIV most prevalent in Africa. If the vaccine proved safe for the first recipients—18 volunteers in the U.K.—wider trials were expected to begin in Nairobi, Kenya, within a year. Another HIV vaccine trial involving 2,500 volunteers began in Thailand; this was the first large-scale clinical trial of an AIDS vaccine in a less-developed country. Vaccines were considered the single intervention most likely to alter the frightening course of the AIDS pandemic, and in 2000 more than 70 different vaccines were being tested.

New Treatments.
      In 2000 stroke disabled at least 570,000 people in the U.S. alone. Until recently little could be done for the paralysis and loss of function that typically occur. For one thing, it had long been thought that the adult brain was not capable of regeneration. During the year researchers in Birmingham, Ala., and Jena, Ger., helped prove that regeneration was possible even years after a stroke. The scientists tested a technique called constraint-induced-movement therapy in 13 patients with paralyzed arms. The treatment required exercising the disabled arm a full six hours a day for several weeks. Immediately following the therapy, images of the brain showed that nerve connections in distinct areas of brain circuitry had almost doubled; six months later the changes were still evident. The subjects regained about 75% of the use of their arms.

      Canadian researchers developed a unique islet-cell transplant technique that eliminated the need for insulin injections in seven persons with poorly controlled type 1 (insulin-dependent) diabetes. The achievement was so impressive that The New England Journal of Medicine posted the paper describing the work on the Internet nearly two months prior to its scheduled publication.

      In April the FDA approved two new nondrug treatments for gastroesophageal reflux disease, a severe, persistent form of heartburn. The treatments repaired the actual cause of the problem, a faulty muscular valve (lower esophageal sphincter) between the esophagus and stomach. Both treatments were minimally invasive and were performed by means of a tube that was positioned in the throat. One placed stitches in the sphincter; the other seared it with radio-frequency energy. The procedures enhanced the valve's barrier function, thereby preventing the reflux of bile and stomach acid into the esophagus.

      An especially promising study found that the drug interferon beta-1a (Avonex) could delay the development of clinically definite multiple sclerosis (MS), a disease that gradually destroys the myelin covering of nerve fibres. The trial, carried out in the U.S. and Canada, involved people who had experienced a single, isolated neurological event suggestive of MS—for example, weakness of a limb or a visual disturbance. Interferon beta-1a previously had been available only for people with diagnosed MS, and Avonex's manufacturer, Biogen, Inc., sought FDA approval for expanded use of the drug. An existing cancer drug, mitoxantrone (Novantrone), was approved for treating patients with advanced or chronic MS. The drug was found to reduce the number of relapses and help patients keep their mobility longer.

      Investigators in Germany had impressive early results from a novel vaccinelike treatment given to 17 patients with advanced kidney cancer. The treatment used cells from the patients' own tumours that were fused with immune-system cells from healthy donors. Four patients had been cancer-free for at least 11 months, and two others had tumour shrinkage of more than 50%. In an experimental protocol in the U.S., 15 patients with advanced kidney cancer received a transplant of cancer-fighting cells from the immune system of a sibling. Nine patients were still alive after more than a year, and in four subjects all signs of the cancer were gone. In others, tumours shrank by more than half.

      A treatment already available to women in 15 countries around the world finally became available to women in the U.S. late in the year. The so-called abortion pill—RU-486, or mifepristone—was approved in late September and was on the market before the end of the year, selling under the brand name Mifeprex. Owing to strict regulations imposed by the FDA on the use of mifepristone, many U.S. doctors opted not to dispense it.

Drugs off the Market.
      Troglitazone (Rezulin), a prescription drug used to treat type 2 diabetes, was removed from the market in March because of its potential to cause severe liver toxicity. Two newly approved diabetes drugs, rosiglitazone (Avandia) and pioglitazone (Actos), offered the same benefits without the risk.

      In November the FDA asked manufacturers of hundreds of widely sold over-the-counter appetite suppressants, decongestants, and cold and cough remedies containing phenylpropanolamine (PPA) to stop marketing them. PPA was linked to a slight but significant risk of stroke in women. Among the products pulled from store shelves were various forms of Contac, Alka-Seltzer, Acutrim, Dexatrim, Robitussin, and Triaminic. The FDA was taking steps to ban PPA as an ingredient in all drug products.

      A drug for irritable bowel syndrome in women, Lotronex (alosetron), was approved in February. In late June, after cases of serious intestinal problems were reported in some women taking the drug, the FDA required pharmacists to distribute a “medication guide” that warned patients directly about the risks. In November the manufacturer voluntarily withdrew Lotronex from the market, at which point 70 cases of adverse effects and 5 deaths had been reported.

Colorectal Cancer.
      In March Katie Couric, cohost of the NBC morning show Today, took a camera crew to her doctor's office, where, under mild sedation, she underwent a colonoscopy examination before millions of television viewers. Couric's husband had died of colon cancer in 1998, and her goal was to convince viewers of the importance of screening. The procedure involved insertion of a flexible lighted tube through the rectum into the colon; video technology enabled the doctor to see the entire lining of the approximately 150-cm (60-in)-long large intestine.

      Two studies published in July found that colonoscopy was a far more reliable way to detect cancerous lesions and precancerous polyps than the recommended preliminary screening procedure, sigmoidoscopy (which allows the doctor to see only inside the rectum and lower colon). The studies suggested that as many as half of all cancerous lesions in the upper portion of the colon were missed by routine sigmoidoscopy. An editorial commenting on the findings compared “relying on flexible sigmoidoscopy” to “performing mammography of one breast” and called for insurers to cover the cost of the more effective screening method for colorectal cancer. At the end of the year the results of an 18-year study showed that the simplest screen for colon cancer—the fecal occult blood test, which detects traces of blood in the stool—had the potential to reduce the rate of colorectal cancer by as much as 20%. In mid-November the FDA approved a laser system that improves a physician's ability to distinguish small harmless growths from precancerous growths in the colon. The device can be used during sigmoidoscopy or colonoscopy.

Alternative Medicine.
      According to the Nutrition Business Journal, the U.S. public was expected to spend an estimated $15.7 billion on herbal products, vitamins, minerals, and other dietary supplements in 2000. Although the FDA did not require manufacturers of these products to establish their safety or efficacy before marketing them, a private company began assessing hundreds of products sold to the public for the purpose of promoting health and wellness and published its evaluations on its World Wide Web site.

      A review of 20 herbal preparations purportedly containing the stimulant ephedra (also known as ma huang) was published in the American Journal of Health-System Pharmacy in May. The products were found to contain anywhere from 0% to 154% of the amount of ephedra listed on the label, and considerable variation was found between lots of the same product. A report later in the year linked ephedra in dietary supplements to 10 deaths and 13 cases of permanent disability.

      At least 21 million people in the U.S. alone suffered from osteoarthritis, characterized by stiffness, pain, inflammation in the joints, and often some degree of debilitation. When a best-selling book published in 1997 claimed that glucosamine and chondroitin were a “cure” for arthritis, the medical establishment was profoundly skeptical. Despite the lack of scientific evidence that the substances—both natural components of cartilage—worked, millions of arthritis sufferers began using them either separately or combined. Boston University researchers analyzed the results of 15 studies on chondroitin and glucosamine. They found that glucosamine (extracted commercially from crustacean shells) was by itself moderately effective in relieving symptoms, while chondroitin (made from cow, pig, or shark cartilage) offered more significant relief. A problem, however, was that most chondroitin products on the market were of unreliable quality. The side effects of both were fewer and milder than those associated with standard arthritis pain relievers.

      During the year the Washington Post carried out the first survey in the U.S. on the illness and death associated with the growing use of supplements. Among other things, the survey found that poison-control centres in many states were seeing a dramatic increase in the number of adverse reactions caused by supplements, including ephedra, Saint-John's-wort, melatonin, and ginseng; that people taking products containing ephedra or its derivatives for weight loss or extra energy experienced adverse effects ranging from jitteriness to chest pains, insomnia, addiction, stroke, and death; and that children increasingly were being given supplements and suffering adverse reactions. The survey revealed rampant abuse of body-building supplements like gamma-hydroxybutyrate, or GHB, which was held responsible for hundreds of hospital and poison-centre visits and several deaths. Dangerous contaminants such as mercury, arsenic, and lead were found in supplements, especially in herbal products from Asia.

Health Systems.
      The first assessment ever attempted of the world's health systems was published by WHO in June. Countries whose systems were ranked highest (on the basis of five indicators) included France, Italy, San Marino, Andorra, Malta, Singapore, Spain, Oman, Austria, and Japan. The assessment found wide variation in the performance of health systems, even among countries with similar levels of income and expenditures on health. Other key findings were that the vast majority of countries were underutilizing available resources and that poorly performing health systems had profound effects on the poorest people, often driving them deeper into poverty. It was not surprising that the lowest-ranking systems were in sub-Saharan Africa. The U.S., which of all countries spent the highest proportion of its gross domestic product on health, received the highest ranking for one indicator—the availability of resources. Overall, however, the country ranked 37th out of 191 countries evaluated. (See Special Report: Socialized Medicine's Aches and Pains.)

Stem Cell Research.
      In August the U.S. National Institutes of Health released new rules governing the use of human stem cells in medical research. Stem cells are undifferentiated cells that can be coaxed to grow into various types of specific cells and thus have great potential for the repair of damaged or defective tissues and organs. The rules stipulated that federally funded researchers could work with embryonic stem cells but that the cells had to come from excess frozen embryos (those already destined for destruction) obtained from private fertility centres. Prior to the release of the guidelines, American researchers had been experimenting with stem cells derived from adult organs. Although adult stem cells had distinct therapeutic possibilities, they were sometimes difficult to isolate and purify, and they had less capacity to proliferate than embryonic cells. The biomedical research community, therefore, enthusiastically welcomed the ruling.

Ellen Bernstein

▪ 2000

      In 1999 the international team of scientists participating in the $3 billion Human Genome Project made impressive strides toward the goal of locating, analyzing, and identifying virtually every one of the estimated 100,000 human genes. On December 1 it was announced that cooperating scientists from four institutions had meticulously mapped 97% of the genetic material contained on chromosome 22. As Francis Collins, chairperson of the publicly funded international project, noted, “This is the first time that we've had a complete chapter in the human construction book.” Mutations in genes located on chromosome 22 were known to play a role in several dozen human diseases, including disorders of the heart and immune system, certain cancers, mental retardation, and schizophrenia. Although chromosome 22 represents only about 1.1% of the DNA in the human genome, the scientists involved in the decoding effort expected to complete a “first draft” of the entire genome early in 2000—several years ahead of the originally projected completion date.

      A major bioethical debate during the year centred on research using human embryonic stem and germ cells, both first isolated in late 1998. While such research held great promise for scientific advances, it also raised serious ethical questions. (See Special Report: The Science and Ethics of Embryonic Stem Cell Research.)

      In October the Washington Post surveyed about 2,000 Americans to find out what issues were worrying them most. From a list of 51 possible “worries,” the single greatest concern, irrespective of political leanings, was that “insurance companies are making decisions about medical care that doctors and patients should be making.” Two other health care issues ranked among the respondents' top five worries—that elderly Americans would not be able to afford the prescription drugs they need and that the respondents' current medical benefits would be reduced or eliminated.

      Americans had good reason to fear the power of insurance companies when in June the U.S. Justice Department's antitrust division approved the takeover of Prudential Health Care by Aetna, Inc., creating the nation's largest managed-care company. A spokesman for the group Consumers for Quality Care called the takeover “a black eye for the Clinton Administration in terms of patient protection.”

      Americans worried about health care may have gained some relief in November when the UnitedHealth Group, which insured 14.5 million people—8.7 million in managed-care plans—announced that it would let doctors make their own decisions on care. The Minneapolis, Minn.-based insurer would no longer interfere with physicians' treatment choices, including the decision to hospitalize a patient. Physicians' groups hailed the step. Thomas Reardon, president of the American Medical Association, called it “historic” and “a long overdue victory for American patients and the care they receive.”

      Thanks to the wide use and effectiveness of so-called highly active antiretroviral therapy (HAART)—potent combinations of anti-HIV medications—in industrialized countries many people with HIV/AIDS were living longer and healthier lives. Several studies published in 1999, however, pointed to the inherent limitations of HAART. One study showed that even though some people who had taken the drugs had the amount of HIV in their blood reduced to near-undetectable levels, the virus continued to lurk in their immune systems. The finding suggested that people who responded to HAART might need to keep taking the drugs indefinitely. Moreover, those same people remained capable of transmitting the virus, despite their apparent wellness.

      Another study shed important new light on the earliest stages of HIV infection. It found that almost as soon as the virus invades the body, it establishes a “reservoir of infection” that is especially refractory to attack by antiviral drugs and the body's own immune system. The finding was viewed as a setback for those seeking to create an AIDS vaccine. Yet another disheartening discovery was that about one-sixth of new HIV infections were drug-resistant.

      Although AIDS death rates in the U.S. were declining, the rate of new HIV infections was a cause for concern. From July 1998 through June 1999, a total of 47,083 HIV/AIDS cases were reported in the U.S. Notably high rates were seen in African Americans, Hispanics, and women. In late November UNAIDS, the United Nations agency charged with combating the spread of HIV/AIDS, reported 2.6 million deaths from AIDS worldwide in 1999 and 5.6 million new HIV infections. According to Peter Piot, the agency's executive director, “The epidemic is far from over. The crisis is actually growing.”

      The gloomiest AIDS news came from the less-developed world. In sub-Saharan Africa an estimated 22.5 million adults and 1 million children of the region's 600 million people were HIV-infected. Those unprecedented rates had reduced life expectancy from 64 to 47 years. Only two African countries, Uganda and Senegal, were effectively controlling the spread of AIDS. (See Special Report: Africa's Struggle Against AIDS.)

      In January an international team of researchers reported that they had traced the origin of the AIDS virus to a subspecies of chimpanzee in Gabon. Specifically, the team found that HIV-1, the virus that had caused the overwhelming majority of the world's estimated 34 million AIDS cases to date, was originally transmitted to humans by chimpanzees of the subspecies Pan troglodytes troglodytes. (HIV-2, which causes a milder and far-less-common form of AIDS, previously had been linked to a species of African monkey.) The path to the latest discovery involved conducting sophisticated genetic tests on viruses isolated from four chimpanzees that carried a simian virus nearly identical to HIV-1; the infected primates, however, did not become ill. Scientists speculated that humans living in the native habitat of the chimpanzee subspecies contracted the virus through exposure to the blood of butchered animals, but they could not explain how a microbe that had such a benign effect in the apes became so virulent when it infected humans.

Infectious Diseases.
      According to the World Health Organization (WHO), the top six infectious killers worldwide, ranked by the number of lives they took in 1998, were acute respiratory infections, including influenza and pneumonia (3.5 million), AIDS (2.3 million), diarrheal diseases (2.2 million), tuberculosis (1.5 million), malaria (1.1 million), and measles (900,000). A report issued by WHO in 1999 described “an infectious disease crisis of global proportions,” accounting for 13.3 million of the 53.9 million deaths worldwide in 1998. It noted that infectious diseases were the biggest killer of children and young adults and that one in two deaths in less-developed countries had an infectious cause. The report also pointed out that the situation had worsened as a result of mass population movements; in particular, refugees and displaced persons, who were highly vulnerable to infection, were readily spreading infectious diseases into new areas. In addition, the report emphasized that the arsenal of drugs available to treat infectious diseases was being progressively depleted owing to the growing drug resistance of microbes.

      Outbreaks of infectious disease occurred across the globe, some more alarming than others. In April Angola experienced an outbreak of paralytic poliomyelitis. In response, WHO mounted an emergency campaign to immunize 700,000 Angolan children against the highly contagious disease. Although there remained a few “hot spots” for polio in Africa and Asia, WHO still aimed for global eradication of the disease by the end of 2000. Its eradication campaign, launched in 1988, had been monumentally successful in most of the world.

      Probably the most publicized infectious disease outbreak in 1999 was one in New York City that was responsible for only about 50 cases of human illness and fewer than 10 deaths. In August a mosquitoborne illness initially thought to be St. Louis encephalitis killed scores of domestic and exotic birds, particularly crows; it also caused serious illness and brain inflammation in a few humans. In September epidemiologists confirmed that the responsible microbe was a virus similar to one that causes West Nile fever, an illness never previously seen in the Western Hemisphere. The original source of the outbreak was not known; one speculation, however, was that the disease reached the U.S. through illicitly imported African birds. The New York epidemic was effectively brought under control when the city mounted an all-out ground and aerial insecticide-spraying effort and advised city dwellers to protect themselves from mosquitoes.

      As in other recent years, a large number of infectious outbreaks in 1999 were traced to human consumption of contaminated food. In one of the biggest food-poisoning incidents, more than 1,000 people who had attended a county fair in upstate New York were infected with a deadly strain of Escherichia coli. The source was a well that had been contaminated by runoff from cow manure; fairgoers presumably consumed water from the well in the form of ice in soft drinks, lemonade, snow cones, and other refreshments. Although many became seriously ill, only two people died—a 3-year-old girl and a 79-year-old man.

      In Denmark 25 confirmed cases of virulent salmonella infection occurred in people who had eaten meat from infected pigs; 11 were hospitalized and 2 died. The strain had reduced susceptibility to treatment with an important class of antibiotics (fluoroquinolones). The outbreak caused considerable alarm among infectious-disease experts worldwide. It was recognized that when pigs and other farm animals are given antibiotics to eliminate infection and enhance growth—a commonplace practice—bacteria grow increasingly resistant to the drugs. The scientists who investigated the outbreak called for sharp restrictions on the use of fluoroquinolones in food animals.

      In October, about a year after a new vaccine for children was licensed in the U.S., it was withdrawn by its American manufacturer. The vaccine, Rotashield, was designed to protect against a potentially fatal diarrheal illness caused by rotavirus. In the late 1990s rotavirus caused about three million cases of illness annually in the U.S. alone and killed an estimated 600,000 children worldwide. At the time the vaccine was licensed, a federal health-advisory panel recommended three doses of Rotashield before a child's first birthday. Subsequently, however, the vaccine was found to be responsible for a potentially fatal bowel obstruction in dozens of babies who had received it.

      The year was also one of major achievements in the conquest of infectious diseases. On October 6 WHO acknowledged a public health triumph that could never have been achieved without the intimate collaboration of the public and private sectors. As recently as the 1970s, the disease known as onchocerciasis, or river blindness, annually robbed hundreds of thousands of West Africans of their sight. The disease agent, a parasitic worm, is transmitted by the bite of blackflies that breed in fast-flowing rivers. To wipe out onchocerciasis, WHO, the Carter Center in Atlanta, Ga., and more than 20 donor countries and agencies teamed up with Merck & Co., a major pharmaceutical manufacturer. Merck donated a veterinary drug, ivermectin (Mectizan), which effectively prevented sight loss in people living in endemic areas; a single pill offered protection for about a year. Thanks to the cooperative campaign, which started in 1974, an estimated 12 million children who would have been at high risk of becoming blind were spared that fate. Programs to eliminate river blindness were under way in other parts of Africa and in six countries in Latin America.

      At its peak in the early 1940s, measles affected almost 900,000 people annually in the U.S. and killed more than 2,000. In 1998 there were only 100 confirmed U.S. cases of the vaccine-preventable viral illness, 71% of which were imported from other countries. In 1999, although it remained a major infectious killer globally, measles was on its way to being eradicated in the U.S.

      In 1999 two new drugs, one inhaled (zanamivir [Relenza]) and one in pill form (oseltamivir [Tamiflu]), were approved for treating the two common strains of influenza virus (types A and B). The drugs were the first of a new class of antiviral compounds called neuraminidase inhibitors. Studies showed that zanamivir was 80% effective in preventing flu. It also reduced the duration of flu symptoms by a day or two. Oseltamivir was similarly effective, preventing flu in up to 84% of adults who took the medication once daily for six weeks. Many physicians questioned whether use of the new drugs would be cost-effective, and infectious-disease experts emphasized that inexpensive, widely available vaccines at the beginning of the flu season remained the best protection.

      A study published in the journal Nature in July raised considerable hope about the possibility of preventing Alzheimer's disease with a vaccine. Researchers at the San Francisco firm Elan Pharmaceuticals vaccinated mice that had been genetically programmed to overproduce amyloid, a protein-carbohydrate complex that forms harmful deposits in the brain, known as plaques. Amyloid plaques are a hallmark of Alzheimer's disease. In young healthy mice the vaccine prevented the formation of brain-clogging plaques altogether, and in older mice it prevented further progression of existing plaques.

      Multiple myeloma is a severe, often fatal cancer of the bone marrow. In the 1990s treatment typically involved massive doses of chemotherapeutic drugs, but even after the most rigorous therapy, patients commonly relapsed. The five-year survival rate for treated patients was only about 29%. In a trial extending over several years, oncologists at the University of Arkansas treated 84 patients who had advanced multiple myeloma with thalidomide, a drug developed in the 1950s (and notorious for having caused birth deformities in infants born of mothers who had taken it during early pregnancy). One-third of the myeloma patients were helped, and two patients experienced complete remission. Experts considered such results “remarkable.” In studies under way at the end of the year, researchers were hoping to learn why thalidomide worked so well in some patients and not at all in others.

      In 1997 research pioneer Judah Folkman of Children's Hospital in Boston showed that two recently discovered substances, angiostatin and endostatin, could shrink and in some cases obliterate malignant tumours—even massive ones—in mice. These natural proteins worked by inhibiting angiogenesis, or blood vessel development, which provides tumours with their own blood supply and thereby allows them to grow from tiny, harmless masses into large, spreading malignancies. Folkman's studies generated enormous excitement among cancer treatment specialists and the public alike. At first, other scientists failed to duplicate his results, but in early 1999 researchers in several laboratories across the U.S. succeeded in suppressing mouse tumours with angiogenesis inhibitors. Moreover, during the year Folkman and his colleagues achieved the impressive feat of using endostatin to eliminate human prostate cancers that had been implanted in mice. In October the first human trials of endostatin began at Boston's Dana-Farber Cancer Institute. They were Phase I trials to determine the safety of the drug and the dose at which it should be given.

      A common and often fatal cancer of women was the focus of attention in February. Officials at the U.S. National Cancer Institute notified cancer specialists around the world of a major advance in the treatment of cervical cancer. Five separate studies had shown that a combination of chemotherapy and the standard treatment, radiation, reduced cervical cancer death rates by 30–50%. Experts speculated that the combined treatment worked so well because the drugs made cancer cells more vulnerable to radiation.

      In the less-developed world, cervical cancer killed more women than any other cancer, largely because the women lacked access to an inexpensive and accurate screening method. Whereas about 70% of women in industrialized countries received routine Pap smears, only about 5% in less-developed countries had such tests. Researchers in Zimbabwe, however, developed a “low-tech” means of diagnosing cervical abnormalities quite accurately at an early stage. By simply swabbing a woman's cervix with vinegar, then checking the cervix visually to see if any cells turned white, they found it possible to detect precancerous or at least suspicious lesions. Women with suspected abnormalities could then be referred for a more decisive test such as a biopsy.

Cardiovascular Disease.
      On rare occasions a clinical trial involving a large number of subjects is terminated early because the preliminary results are so dramatic. This was the case in November when an international team of cardiovascular researchers realized the lifesaving potential of the drug ramipril (Altace) for a broad array of patients at high risk for heart attack, stroke, or death from cardiovascular causes. In the trial one group took ramipril, an angiotensin converting enzyme (ACE) inhibitor; the other group received a placebo. Less than four years into the trial (scheduled to last five years), the treated group had significantly lower rates of death, heart attack, and stroke than those in the placebo group. Ramipril was not a new drug but one that had been used successfully for nearly a decade to treat high blood pressure. (As an ACE inhibitor, the drug relaxes blood vessels and thereby lowers blood pressure and decreases the heart's workload.) Because of the “potential therapeutic implications” for so many patients, The New England Journal of Medicine took the uncommon step of posting the study's findings on the Internet three months before the final version of the report was scheduled for publication.

Antioxidants in Disease Prevention.
      Blueberries, pomegranates, green tea, and cabernet wine were among many antioxidant-rich foods and drinks shown to prevent disease in 1999. Antioxidants prevent the damage done to cells by free radicals, molecules that are released during the normal metabolic process of oxidation. Oxidation can lead to cancerous changes, accelerate the aging process, and contribute to heart disease and degenerative diseases such as arthritis.

      Although it did not go so far as to state that “ketchup prevents cancer,” a major report in the Journal of the National Cancer Institute concluded that people who consumed large amounts of “tomatoes and tomato products [were] at a substantially decreased risk of numerous cancers, although probably not all cancers.” Researchers at Harvard Medical School analyzed 72 studies that had looked at the link between tomato consumption and cancer; 35 of those studies found a statistically significant reduction in risk, while 15 were inconclusive or showed a slight reduction. A number of the studies had focused in particular on lycopene, the nutrient in tomatoes that acts as a powerful antioxidant and also gives the fruit its red colour. The strongest evidence of risk reduction was for cancers of the prostate, lung, and stomach, but there was also evidence of benefit against pancreatic, colorectal, esophageal, oral, breast, and cervical cancers. Raw and cooked tomatoes and processed tomato products that did not contain excessive sugar or unhealthy fats were all found to be beneficial.

      A study published in the Journal of the American Medical Association found that the consumption of at least five servings a day of fruits and vegetables—in particular citrus fruits and juices, leafy green vegetables, and cruciferous vegetables such as broccoli, cabbage, and turnip—cut the risk of ischemic stroke by 30%. (Ischemic stroke occurs when blood clots block the flow of blood to the brain, which results in brain injury or death.) The study also found that drinking a single glass of orange juice a day lowered stroke risk by 25%. Juice manufacturers wasted no time in advertising this finding.

      A recently published survey of American adults found that the prevalence of obesity (defined as a body-mass index [BMI] of 30 or more) increased from 12% of the population in 1991 to 18% at the end of 1998. (BMI is determined by dividing one's weight in kilograms by the square of one's height in metres.) A second survey found that about half of the U.S. population was overweight (having a BMI of 25 or higher) and that excess weight was strongly associated with chronic diseases, including high blood pressure, high blood cholesterol, type II (non-insulin-dependent) diabetes, gallbladder disease, coronary heart disease, and osteoarthritis.
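The BMI arithmetic behind these survey definitions can be sketched in a few lines of Python. The formula (weight in kilograms divided by the square of height in metres) and the cutoffs of 25 and 30 come from the text above; the function names and the example person are hypothetical.

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body-mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def classify(bmi_value: float) -> str:
    """Apply the surveys' cutoffs: 25 or higher is overweight, 30 or higher is obese."""
    if bmi_value >= 30:
        return "obese"
    if bmi_value >= 25:
        return "overweight"
    return "not overweight"

# A hypothetical person weighing 95 kg and standing 1.75 m tall:
value = bmi(95, 1.75)  # 95 / (1.75 * 1.75), roughly 31.0
print(round(value, 1), classify(value))
```

Note that "overweight" as defined here includes the "obese" range, which is why the two survey figures (about half the population overweight, 18% obese) overlap rather than add.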

      Orlistat (Xenical), a new drug for the treatment of obesity, was approved by the U.S. Food and Drug Administration in April. Unlike most other medications for weight loss, orlistat works in the intestine, where it blocks about one-third of the fat a person consumes from being absorbed; undigested fat is eliminated in feces. Orlistat was designed to be used in conjunction with a reduced-fat, reduced-calorie diet. In clinical trials subjects taking orlistat for one year lost an average of 6.1 kg (13.4 lb), whereas those on a reduced-calorie diet alone lost 2.6 kg (5.8 lb). Side effects were mainly gastrointestinal (e.g., diarrhea, oily stools, and flatulence). Although the drug was a prescription item and meant for people who were at least 20% overweight, orlistat was widely advertised to the general public and available over the Internet after an on-line medical “consultation.”

U.S. Failures.
      Reports issued at the end of the year drew attention to critical failures in the U.S. health care system. A committee of the Institute of Medicine found “stunningly high rates of medical errors—resulting in deaths, permanent disability, and unnecessary suffering.” Calling such mistakes “unacceptable in a medical system that promises first to ‘do no harm,’” the committee drew up a comprehensive strategy to reduce medical errors by 50% over five years.

      A scathing report issued by U.S. Surgeon General David Satcher noted that mental illness affected one in five Americans and that more than half of those who needed treatment did not get it. The report, posted on the Internet, was critical of insurance policies that did not provide adequate coverage for mental illness and of American society, which continued to stigmatize the illness.

Ellen Bernstein

▪ 1999


Medical Developments
      In 1998 antibiotic-resistant organisms were spreading in both less-developed and industrialized countries, a situation that was presenting an increasing threat to public health worldwide. The global scope of tuberculosis (TB) was highlighted by a World Health Organization (WHO) survey that found drug-resistant cases of the disease in 35 countries. The proliferation of resistant TB strains was largely attributable to weaknesses in TB-control programs. At the same time, however, there were disquieting signs that, at least in some locations, the tubercle bacillus, Mycobacterium tuberculosis, was becoming inherently more virulent.

      Increasing drug resistance was seen in Salmonella typhimurium, a major agent of food poisoning. This prompted calls for stricter controls on the use of antibiotics in farm animals (to promote growth and prevent disease). Particularly prevalent in England and Wales, multidrug-resistant S. typhimurium had also emerged in several European countries and the U.S. One American survey showed that the prevalence of salmonella strains unresponsive to five antibiotics (ampicillin, chloramphenicol, streptomycin, sulfonamides, and tetracyclines) had increased from 0.6% to 34% in 16 years.

      Researchers who analyzed more than 1,000 strains of Streptococcus pneumoniae (pneumococcus) from hospitals in the U.S. and Canada reported in the October issue of Clinical Infectious Diseases that the common bacterium had grown increasingly resistant to penicillin and cephalosporin antibiotics. S. pneumoniae, the bacterium most frequently responsible for infections of the bloodstream, pneumonia, and ear infections, was the third most common cause of bacterial meningitis in children.

      The year was also one in which a number of long-term investments in basic scientific research bore fruit. American researchers succeeded in extending the life span of human cells grown in the laboratory. Most human cells can divide only a finite number of times before entering a state known as senescence. As some types of cells age, their telomeres—protective caps at the ends of the chromosomes—shorten. This probably happens because telomerase, the enzyme that facilitates normal rebuilding of the telomeres, becomes less active. By incorporating the gene that gives rise to telomerase into senescent cells, however, scientists were able to reextend telomeres and thereby rejuvenate the cells. This achievement prompted speculation that it may one day be possible to maintain normal cells in a youthful state and thereby prevent many aging-related changes in the human body. There was, however, a catch-22 associated with telomerase (sometimes dubbed the "immortality enzyme"): the longer cells lived, the greater their chances were of becoming cancerous.

      Equally dramatic was the isolation and growth in the laboratory of a key type of cell from human embryos and fetuses that gives rise to specialized tissues throughout the developing body. The cells, known as human embryonic stem cells, have the potential to be grown in the laboratory in large quantities and to replenish damaged tissues in patients suffering from an array of illnesses. The new findings, reported independently in November by teams from the University of Wisconsin and Johns Hopkins University, Baltimore, Md., were viewed by most members of the scientific community as a breakthrough with enormous potential. Other groups, who were opposed to any kind of research on human embryos, were critical of the work.

      Another exciting but surprising finding came from scientists in Sweden and California, who discovered for the first time that the adult human brain may be capable of producing new nerve cells, or neurons. This finding flew in the face of the long-held dogma that human brain cells do not regenerate. In the long run this new insight may lead to new means of treating the victims of stroke and certain degenerative brain conditions, including Alzheimer's disease and Parkinson's disease.

      The medical event that arguably received the greatest publicity was the approval and subsequent marketing of the drug sildenafil (Viagra) for the treatment of male impotence. (See Sidebar: Viagra: A Second Honeymoon?)

      In December two teams of researchers in the U.S. announced that they had identified a molecule, interleukin-13 (IL-13), which may be responsible for the airway inflammation characteristic of asthma. One group treated asthma-prone mice with a drug that blocks the action of IL-13 and then exposed the animals to a substance that normally triggers an asthma attack; the mice did not develop breathing problems. The other group treated the nasal passages of mice with a substance that blocks IL-13. When exposed to an asthma-triggering protein, these mice had few asthma symptoms.

      In the U.S. the Centers for Disease Control and Prevention (CDC) issued a major report in April indicating that asthma rates had jumped 75% between 1980 and 1994, to an estimated 13.7 million sufferers nationwide. Increases in reported asthma cases and deaths affected all ages and racial groups, but rates of emergency room visits, hospitalizations, and deaths were consistently higher among African-Americans, as compared with whites.

      On the global front international asthma experts met in December and launched an initiative aimed at reducing the burden of childhood asthma over the next five years. The goals were to reduce asthma death rates in children by at least 50%, to reduce the number of school days lost owing to asthma by 50%, and to cut asthma-related hospitalizations by at least 25%.

      Investigators in the U.S. called a halt to a large clinical trial of the hormonal drug tamoxifen 14 months earlier than originally planned when they found that women at high risk of breast cancer who took this drug had reduced their chances of developing the disease by 44%. U.K. researchers were critical of the decision to end the trial, pointing out that tamoxifen may simply have delayed the development of the breast cancer. The debate continued when the results of two smaller European trials published later in the year showed that the drug offered no protection against breast cancer in healthy women. The latter trials, however, may have included too few women for any benefit to have become apparent.

      Despite these uncertainties, the U.S. Food and Drug Administration (FDA) approved tamoxifen for the prevention of breast cancer in otherwise healthy women who were at high risk for the disease. (Previously, the drug was approved only as a treatment for diagnosed cancer.) Cancer specialists stressed that benefits of the treatment for individual patients would have to be weighed carefully against the risks, since the drug was known to cause potential adverse effects, including uterine cancer and blood clots in the veins or lungs. Factors that put a woman at high risk of breast cancer included advancing age, personal history of abnormal breast changes, family history of the disease, birth of a first child at age 30 or older, or onset of menstruation before age 12.

      Two studies found that a new biologically engineered weapon against breast cancer, Herceptin, boosted the benefits of chemotherapy in women with invasive breast cancer. Herceptin is a type of protein (a so-called monoclonal antibody) created from mouse cells and designed to bind to the receptors that control growth in breast cells. In September the FDA approved Herceptin for an especially aggressive form of breast cancer marked by overexpression of the HER-2/neu growth-factor receptor.

      The once-obscure work of Boston's Children's Hospital researcher Judah Folkman (see BIOGRAPHIES: Folkman, Judah) became front-page news when his laboratory announced it had discovered two new drugs that had eradicated malignant tumours—even huge ones—in mice. For more than three decades, Folkman had been studying angiogenesis, the process by which localized blood-vessel growth feeds malignant tissues, enabling solid tumours to thrive and spread. Widespread publicity about the success of the drugs—endostatin and angiostatin—which had no apparent side effects—led to speculation that such an approach would work equally well in humans. Indeed, the National Cancer Institute (NCI) announced that getting the drugs into clinical trials was a top priority. Despite the promising prospects of the new drugs, experienced researchers, including Folkman himself, felt that the media coverage had raised premature hopes of a cure among cancer patients.

      Cancer statistics for the U.S., released in March by the NCI, CDC, and American Cancer Society, showed that rates of new cases and deaths had declined overall. During the period 1990-95, the overall rates for new cases decreased about 0.7% annually, and overall cancer death rates declined about 0.5% per year. The good news, however, did not include all Americans. African-American men bore a disproportionate share of the cancer burden. The comprehensive survey had looked at 23 types of cancer in four ethnic/racial groups: whites, African-Americans, Hispanics, and Asians or Pacific Islanders. The prostate, lung, breast, and colon-rectum were the four leading cancer sites, accounting for nearly half of all newly diagnosed cases during the six-year period.

      A study involving 29,000 male smokers, carried out by researchers from the NCI and the National Public Health Institute of Finland, suggested that long-term use of a vitamin E supplement significantly reduced subjects' risk of prostate cancer. The research found that men between the ages of 50 and 69 who took 50 mg a day of vitamin E in the form of alpha-tocopherol for five to eight years had 32% fewer cases of prostate cancer and 41% fewer deaths from the disease than men who did not receive the supplement. Experts urged that additional studies be carried out to confirm the beneficial effect.

Cardiovascular Disease.
      For many years medical researchers had suspected that women with heart disease did not respond as well as their male counterparts to existing therapies. A large-scale study carried out under the auspices of the U.S. National Heart, Lung, and Blood Institute put at least some of those doubts to rest. The study followed men and women who underwent either coronary artery bypass surgery or balloon angioplasty—procedures that promote blood flow to the heart. After five years 87% of the women patients were alive, a rate almost identical to that for men in the study.

      Encouraging news for the early prevention of heart disease came from a U.S. government survey showing declines in total cholesterol levels in American adolescents between the late 1960s and the early 1990s. Elevated blood cholesterol levels early in life were known to increase the risk of heart disease in adulthood. Data from the survey showed that '90s teenagers had lower intakes of saturated fat and total fat than did their '60s counterparts. Previous surveys had shown that similar changes in the consumption patterns of American adults had contributed to a 50% decline in coronary heart disease deaths.

      Lifestyle changes, including weight loss through diet and exercise and reduction of salt intake, can reduce, or possibly eliminate, the need for medication to control hypertension (high blood pressure) among elderly individuals. A clinical trial involving 975 men and women between the ages of 60 and 80 was the first of sufficient size and scope to confirm that older people who changed their behaviour could reduce their reliance on antihypertensive drugs.

      In June a prescription drug for lowering blood pressure and treating angina pain, which had been available for less than a year, was taken off the market in 38 countries. The drug, Posicor (mibefradil), a calcium-channel blocker, was used by an estimated 400,000 patients worldwide. The reason for the swift withdrawal was that the drug was found to cause toxic—and sometimes lethal—reactions when taken in combination with certain other drugs.

      Researchers in the U.S. and the U.K. announced that they had completely sequenced the genome of the roundworm Caenorhabditis elegans. This was the first time scientists had sequenced the genetic instructions for a complete animal. Although the lowly roundworm is tiny (about 1 mm long; some 25 laid end to end would span an inch), it can shed light on many characteristics of humans. It has a relatively complex nervous system, and about 40% of its 19,099 genes match those of other organisms. Scientists around the world hailed the accomplishment, which they saw as an invaluable research tool for studying everything from embryonic development to aging.

      The complete genomes of three important human pathogens—M. tuberculosis, Treponema pallidum (the syphilis spirochete), and Chlamydia trachomatis (responsible for the most common sexually transmitted disease)—were sequenced during the year. Having deciphered the complete sets of instructions that these microbes need to infect human cells and thrive therein, scientists were now better equipped than ever before to find new ways to eliminate the diseases they caused.

      In October a worldwide effort involving 64 scientists produced a new gene map, marking the chromosomal locations of more than 30,000 human genes, nearly half of the human genome. The compilation, called GeneMap'98, was accessible on the Internet and was expected to help in the identification of numerous human disease-causing genes. Owing to this more-rapid-than-expected progress, American leaders of the Human Genome Project proposed that the goal of sequencing the entire three billion base-pair human genome could be accomplished by the end of 2003, two years ahead of schedule.

      In January researchers reported the discovery of the first gene associated with human hair loss. The gene, called hairless, was found in a Pakistani family with the rare disorder alopecia universalis, in which those affected have no head or body hair, eyebrows, or eyelashes. Scientists speculated that this discovery could lead to a new understanding of more common types of hair loss, including alopecia areata, an autoimmune disorder characterized by the loss of large patches of head hair and affecting as many as 2.5 million people in the U.S. alone, and male-pattern baldness, a hormone-controlled disorder that causes some degree of hair thinning or baldness in up to 80% of men and women. If scientists were to discover a gene responsible for male-pattern baldness, it might be possible to prevent or treat the most common form of hair loss with gene therapy.

Infectious Diseases.
      Vaccines against the tick-borne bacterial infection Lyme disease were developed by two pharmaceutical companies, one of which received FDA approval in late December to market its product LYMErix. Lyme disease can affect the skin, joints, heart, and nervous system and be highly debilitating. In clinical trials the vaccine demonstrated efficacy rates of 78% after three doses and 50% after two doses against symptomatic Lyme disease. Although the vaccine was approved for marketing, it was likely to be given only to very select individuals, such as those planning to travel to heavily tick-infested areas. Meanwhile, uncertainties remained over a number of issues, including the number of booster doses likely to be required for continued protection and the safety and efficacy of the vaccine in children, who represented 23% of Lyme disease cases.

      An experimental influenza vaccine, given by nasal spray rather than hypodermic syringe, was found to be 93% effective in healthy American youngsters aged 15 months to 6 years. The nasal spray also proved to be highly protective against otitis media, a common ear infection of children. Healthy children were not routinely immunized against influenza, but public health officials were hopeful that the new product would be approved for use by the FDA in 1999 and that the availability of a safe, effective, and painless vaccine would lead to widespread vaccination of most children in schools and clinics.

      In August the FDA licensed the first vaccine to prevent serious rotavirus infections, the most common cause of severe diarrhea and vomiting among American infants. Prior to the development of the vaccine, about 80% of children under age five experienced rotavirus symptoms annually, and about 55,000 were hospitalized for severe diarrhea and potentially life-threatening dehydration. The new vaccine was to be given orally at ages two, four, and six months. Authorities pointed out, however, that the new vaccine was too expensive to use in many countries, where rotaviral disease was responsible for about 870,000 deaths each year.

      In the U.S. there were further signs of progress in the fight against AIDS, and the disease dropped off the list of the nation's top 10 killers. The CDC reported in October that HIV infection had dropped from the 8th leading cause of death to number 14; moreover, age-adjusted death rates from HIV infection dropped an unprecedented 47% between 1996 and 1997. The declining death rate was largely attributed to the success of combination drug therapies that included protease inhibitors. Health authorities noted, however, that the incidence of HIV infections—i.e., the number of new cases reported per year—had not declined, which suggested that prevention efforts needed to be stepped up.

      Internationally the HIV/AIDS news was much grimmer. A United Nations country-by-country survey found that there were 30 million people in the world infected with HIV and that 21 million of them were in Africa. About 90% of all AIDS deaths were in sub-Saharan Africa, where the vast majority of the victims had no access to the life-prolonging drugs available in the West.

      Physicians reported that transplantation of bone marrow from an unrelated donor whose tissues were matched with those of the recipient was a safe and effective therapy for selected patients with chronic myeloid leukemia. Their results indicated that this procedure was potentially curative for most victims of the disease aged 50 or under. Previously the possibility of a cure was considered to be realistic for only a minority of young patients with this type of cancer of the blood cells.

      Immunologists in the U.S. genetically modified bone marrow cells in mice in such a way that grafts of foreign tissues were no longer rejected, which suggested that the same method could be used to facilitate the transplantation of tissues from nonhuman donors into humans. Given that there were major shortages of human donor organs in most countries, this achievement could make xenotransplantation (animal to human transplants) a more acceptable and feasible prospect.

      In September Clint Hallam, an Australian man who had lost his arm as the result of an industrial injury, received a transplanted forearm and hand. An international team of surgeons in Lyon, France, transplanted the donor arm in a 13 1/2-hour microsurgical procedure that involved carefully attaching the patient's nerves, blood vessels, tendons, muscles, bones, and skin to those of the donor arm. A previous attempt at such a transplant, in Ecuador in 1964, had failed when the patient's body rejected the donor arm two weeks after the operation. The French doctors had high hopes that antirejection drugs would prevent such an outcome. In mid-November there were no signs of rejection in Hallam's new arm, and he was able to move each of the donor fingers. The surgeons estimated that it would be at least a year before they knew whether the recipient would be able to feel sensations in his new appendage.

Other Developments.
      A team of pediatricians and cardiologists in Italy may have discovered the underlying basis of a large proportion of cases of sudden infant death syndrome (SIDS), or cot death, as it was known in Britain. Electrocardiographic (ECG) testing of more than 33,000 infants a few days after birth revealed a developmental heart rhythm defect, indicated by a prolongation of the so-called QT interval, in more than one-third of those who eventually became SIDS victims. This suggested that routine neonatal ECG screening may allow physicians to identify babies at greatest risk; preventive measures could then be initiated.

      Although previous surveys had shown that the "Back to Sleep" campaign launched in 1994 in the U.S. had been enormously effective at reducing the incidence of SIDS (by 38% between 1992 and 1996), three 1998 studies indicated that certain segments of the population were not heeding the public health admonition to put infants to sleep on their backs, not on their abdomens. New efforts were proposed to target groups that were not being reached by the advice.

      The FDA announced that it would require new alcohol-warning labels on all nonprescription pain relievers and fever reducers, including aspirin, acetaminophen, and ibuprofen. Labels would advise those who consumed three or more alcoholic drinks daily to consult their doctors before taking such medications because their drinking could put them at increased risk of liver damage or bleeding in the stomach. The ruling was to take effect on April 23, 1999.

      Scientists in Seoul, S.Kor., announced that they had combined an egg and a human cell from an infertile woman to produce an early-stage embryo. They allowed the cloned cell to grow only into a four-cell embryo and did not take the critical step of implanting it in the woman's uterus. If they had done so, it would have been theoretically possible for the embryo to grow into a fetus that was genetically identical to the woman. In other cloning experiments during the year, scientists in Hawaii produced mice that were cloned from the cells of a single mouse, and Japanese scientists produced calf clones.

      The Journal of the American Medical Association devoted an entire issue to alternative therapies, which, according to a Harvard Medical School survey, were used by 4 out of 10 American adults in 1997. The same survey found that women were more likely than men to try alternative treatments, as were higher-income, well-educated members of the baby-boom generation. Australian physicians reported that Chinese herbal medicines were helpful in relieving the symptoms of irritable bowel syndrome. A study from China found that moxibustion (the application of burning herbs to acupuncture points on the body) given to women in the 33rd to 35th weeks of pregnancy stimulated fetal movements and helped alter the position of babies presenting in the breech position. Yoga was effective in relieving the hand and wrist pain of carpal tunnel syndrome. On the other hand, chiropractic spinal manipulation did not seem to relieve chronic tension headaches, and an Indian herbal product called Garcinia cambogia was of little or no help in promoting weight loss. Nor did acupuncture alleviate pain associated with HIV-related nerve damage.

      Gro Harlem Brundtland, former three-term prime minister of Norway, was elected to a five-year term as director general of WHO. She pledged, among other things, to restore credibility to the beleaguered organization. She took the helm of the agency in late July when Hiroshi Nakajima, director general for a stormy 10 years, stepped down. Brundtland, who had a medical degree from the University of Oslo and a public health degree from Harvard, was highly respected for her political skills and had been recognized for her leadership in environmental health. She said that one of her goals was to persuade heads of government to put the health needs of their people at the top of their political agendas.


Mental Health
      An experimental treatment for schizophrenia, emerging from neuroscience research in the U.S., appeared to lack the disadvantages of currently used drugs. Investigators described an experimental compound that reduced levels of the chemical glutamate—one of the neurotransmitters that relays messages between nerve cells—in the brain. Given to rats with symptoms (such as incessant head turning) that paralleled the psychotic symptoms of human schizophrenia, it brought marked relief with no evidence of harmful side effects. The discovery stemmed from the finding that phencyclidine (PCP, or "angel dust") induced effects similar to schizophrenia in healthy individuals by altering glutamate transmission in the brain. Current treatments for schizophrenia worked by interfering with another neurotransmitter, dopamine. These, however, often failed to control all of the symptoms. The new approach might provide the alternative type of therapy that had been sought for many years.

      A study in Manchester, Eng., established that intensive cognitive behaviour therapy can provide help for patients with chronic schizophrenia. Psychiatrists compared patients receiving medication and other routine care with those also given cognitive behaviour therapy (which aims to change thought processes, behaviour, and emotions). Those having the additional treatment were eight times more likely to show major improvements in their psychotic symptoms, which can be intensely disabling, than were the patients treated conventionally. Another Manchester team systematically reviewed six independent investigations on the value of cognitive behaviour therapy to combat childhood and adolescent depressive disorder. Their study of 208 patients (aged 8-19) treated in this way showed that cognitive behaviour therapy was indeed effective in helping those with moderately severe depressive disorders. The same approach, however, could not yet be recommended for those suffering from severe depression.

      Research in Australia and the U.K. shed light on the underlying basis of major depressive disorder in older people, which often has a poor prognosis. Brain scanning revealed that many such people have undergone changes known as deep white-matter lesions. Depression, it was discovered, is much more likely to relapse and to become chronic in individuals with the lesions than in those lacking them. Investigations now under way on the chemical basis of these changes could lead to improved therapies.

      Depressive symptoms in older women were found to be associated with higher mortality. A seven-year study of white American women aged 67 or above showed that the mortality rate for those with six or more depressive symptoms was 24%, compared with 17% for those with three to five symptoms and 7% for women with no symptoms. The greatest increased risk was that of dying from cardiovascular disease.

      Evidence continued to accumulate on the relationship between psychiatric illness and adverse factors such as unemployment and poverty. Analyses of more than 7,000 British people established that both of these social factors were linked with the continuation, but not the onset, of most common mental disorders. Individuals suffering from more than a year's poverty and financial strain were significantly more likely to continue suffering from a psychiatric illness.

      Another U.K. survey, of more than 10,000 adults, showed that, independent of other influences, a low standard of living was associated with an increased prevalence of neurotic psychiatric disorders. The authors of this study observed that during the previous 20 years one of the largest increases in income inequality in the Western world had taken place in the U.K., and they argued that this may have had adverse consequences for the mental health of the population.

      American researchers pinpointed a gene that predisposes its carriers to psychiatric illness. Wolfram syndrome, characterized by diabetes mellitus and degeneration of the optic nerve and other parts of the nervous system, had previously been known to occur in people whose cells contained two copies of a particular mutant gene. The new findings concerned individuals who had one mutant and one normal gene. Although they did not suffer from Wolfram syndrome, they were 26 times more likely than average to develop psychiatric disease requiring hospital care. This discovery could explain the occasional reports that relatives of Wolfram syndrome patients were unusually likely to attempt suicide and to be admitted to a hospital for psychiatric reasons.

      Researchers in The Netherlands implicated smoking in Alzheimer's disease. Contrary to previous studies, which implied that the habit might be protective, they reported that in a study of 7,000 individuals, smoking was associated with a doubling of the risk of dementia and Alzheimer's disease. Because the investigation was prospective, following the subjects forward over time, the conclusions were likely to be more reliable than those of earlier studies based on less-rigorous methods.

      Correlation of information regarding traffic accidents in the U.K. with information about drugs prescribed for the drivers showed that those taking benzodiazepines or zopiclone had an increased risk of experiencing an accident. The investigators concluded that users of those tranquilizers should be advised not to drive.


Veterinary Medicine
      Concerns arose during 1998 that the widespread use of antibiotics in farm animals could result in a loss of effectiveness when antibiotics were used to treat human infections. There had been suggestions that the use of products based on quinolone and fluoroquinolone could contribute to the creation of resistant strains of foodborne bacteria such as Salmonella and Campylobacter, which cause severe illness in humans as well as animals. A World Health Organization (WHO) meeting convened in Geneva in June recommended international cooperation to gather data, standardize testing methods, and develop a code of practice for the use of such products.

      The biennial congress of the International Pig Veterinary Society, July 5-9, attracted more than 1,500 veterinarians to Birmingham, Eng. Delegates from 50 countries discussed problems in the production, health, welfare, and disease control of hogs. There was particular emphasis on porcine reproductive and respiratory syndrome, a viral disease that occurred worldwide and could cause serious losses among affected animals.

      A symposium organized by the Office International des Epizooties on classical swine fever was held in conjunction with the conference. The disease had resisted international efforts to eradicate it and remained widespread, causing economic problems in Asia, Europe, and Latin America. A recent resurgence in Western and Central Europe affected pig breeding and called into question the effectiveness of prevention and control strategies. Recent developments in diagnostic and vaccine technology, however, were said to offer prospects for new approaches to controlling the disease.

      A new variant strain of foot-and-mouth disease identified by the World Reference Laboratory, Pirbright, Eng., as originating in Iran and named A/Iran/96 had spread to Turkey by 1998. Existing vaccines had proved ineffective, and so vaccines incorporating the new strain were produced. Vaccination of all ruminants in nearby areas was urged.

      Scrapie is a disease of sheep caused by a prion protein (PrP) that has links with Creutzfeldt-Jakob disease in humans; material from scrapie-infected sheep was also believed to be the origin of bovine spongiform encephalopathy ("mad cow" disease) in cattle. Attempts to eradicate scrapie would be greatly helped by the availability of a test to diagnose it before signs of the disease appeared. B.E.C. Schreuder and colleagues at The Netherlands Institute for Animal Science and Health devised a test that detected scrapie infection at 10 months of age, about halfway through the incubation period and well before clinical signs developed. The test was simple to perform and relatively noninvasive, using biopsies of material taken from the tonsil of the animal.

      Knowledge of the weight of a horse is essential for calculating the dosage of medicines, formulating rations, and training for optimum condition. Methods of assessing the weight in the absence of a weighbridge (a platform scale flush with the roadway) included specially calibrated tapes, formulas based on body girth and length, precalculated tables, and visual estimation relying on the experience of the observer. J.M. Ellis of Warwickshire College, Moreton Morrell, Eng., and colleague Teresa Hollands endeavoured to establish the comparative accuracy of different methods by comparing the results in 600 horses of similar size and age against the actual weight. The accuracy of the results varied widely, the degree of error depending on the height of the horse. Most accurate, at 98.6%, was a formula developed in 1988 by C.L. Carroll and P.J. Huntington: weight (kg) equals the square of the girth (cm) multiplied by the body length (cm) divided by 11,877. Least accurate was visual estimation, scoring 88.3%.
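      As a minimal sketch (not part of the original report), the weight-estimation formula described above can be written as a short function; the measurements in the example call are hypothetical, chosen only for illustration:

```python
def estimate_horse_weight(girth_cm: float, length_cm: float) -> float:
    """Estimate a horse's weight in kilograms using the 1988
    Carroll-Huntington formula: girth (cm) squared, multiplied by
    body length (cm), divided by 11,877."""
    return girth_cm ** 2 * length_cm / 11_877

# Hypothetical measurements for illustration only:
weight = estimate_horse_weight(girth_cm=180, length_cm=175)
print(f"Estimated weight: {weight:.1f} kg")  # about 477 kg
```

      Note that both measurements must be taken in centimetres; the divisor 11,877 folds the unit conversions into a single constant.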


▪ 1998


Medical Developments
      The degree to which medical and scientific experts should interfere with the natural order of things, in both creating and terminating life, became a major concern in medical science in 1997. In February a startled world said hello to a cloned Scottish sheep named Dolly. The surprising scientific feat stirred moral and legal concerns about the prospect that genetically identical humans could be created as well. (See Special Report: Uses and Ethics of Cloning.) Meanwhile, medical science was already providing an array of high-tech pregnancy assistance, sometimes with dramatic consequences. In November Bobbi McCaughey, a Carlisle, Iowa, woman who had taken a fertility drug, gave birth to septuplets, four sons and three daughters, the first known case in the United States of seven live human births. A month earlier an Atlanta, Ga., fertility clinic had announced that for the first time in the U.S., two healthy baby boys had been born from eggs that had been frozen and thawed before being fertilized, a technique that was being studied in a number of countries.

      At the other end of the spectrum, legal debates about how and when it is appropriate to end life confronted the United States Supreme Court, which ruled that terminally ill patients do not have a constitutional right to physician-assisted suicide. The states were left free to take action, however, and in November Oregon voters reaffirmed a controversial Death with Dignity Act allowing doctors to prescribe drugs to help terminally ill people die.

      The United States also got a favourable new health report card. An annual report from the federal Centers for Disease Control and Prevention showed a dramatic decline in the AIDS death rate, drops in homicide and suicide rates, and a continuing reduction in the teenage birthrate. American life expectancy achieved an all-time high of 76.1 years in 1996, and infant mortality reached a new low, 7.2 deaths per 1,000 live births. An estimated 15% reduction in mortality rates from sudden infant death syndrome helped account for the continuing infant mortality decline.

      Amid growing concern about food safety, the U.S. Food and Drug Administration (FDA) in December approved the use of irradiation to control disease-causing microorganisms in meat products. It said that studies found the procedure to be safe and to have no effect on nutrition, taste, or appearance of fresh and frozen meat, including beef, pork, and lamb. The FDA said that irradiation could help kill dangerous Escherichia coli bacteria, which had been traced to undercooked hamburger. In another food-safety initiative, U.S. Pres. Bill Clinton announced in October that the government would be undertaking new steps to ensure the safety of imported as well as domestic fruits and vegetables.

      Two major studies of depression in the elderly demonstrated that poor physical and mental health seem to go hand in hand. They found that older patients who suffer from significant signs of depression are far more likely to suffer serious physical illnesses.

      Several studies indicated that people living in Europe were receiving insufficient quantities of selenium, which plays a vital role in thyroid hormone metabolism and in various other bodily processes. Although the element is found in cereals, meat, fish, and poultry, the decline in intake was largely attributed to a fall in imports from North America of selenium-rich, high-protein wheat for bread making. This prompted calls for flour to be supplemented with selenium and for selenium to be more widely used in fertilizers (as had been done recently in Finland).

      There was progress in the treatment of rheumatoid arthritis. Although the causes of this condition were not fully understood, a role was thought to be played by tumour necrosis factor (TNF), which otherwise has beneficial effects in the body. U.S. researchers therefore developed a protein specifically engineered to interfere with the action of TNF. Given to 180 patients whose rheumatoid arthritis had not responded to conventional treatments, it reduced their symptoms and appeared to be safe and well-tolerated.

      Medical experts gathered by the U.S. National Institutes of Health (NIH) approved the use of more widespread genetic testing for cystic fibrosis, the most common inherited disorder for people of northern European descent. The independent panel recommended that testing for gene mutations that cause cystic fibrosis be offered to all couples expecting babies and those planning pregnancy, as well as individuals with a family history of the disease and their partners. The debilitating and often deadly lung and digestive disease occurs when a child inherits a defective gene from each parent. Genetic testing can identify healthy adult carriers with only one defective gene—about one in 29 Caucasians—that may be passed on to their offspring.

      Significant research progress continued to be made in the Human Genome Project. University of Washington molecular biologists reported in Science magazine that by the end of 1997, partial genetic sequences from approximately 40,000 to 50,000 human genes, roughly half of the total, had been recorded in various databases around the world. The detailed sequencing of the three billion base pairs, or genetic building blocks, of the human genome was, however, just beginning, with only about 2% of the total analyzed by the year's end. The genomes of the E. coli bacterium, yeast, and 11 other microbes were completely sequenced, which greatly improved the basic understanding of genetics.

      During the year the genes responsible for several heritable diseases were found. These included tuberous sclerosis, which causes distinctive tumours in the brain, skin, heart, lungs, and kidneys; Niemann-Pick type C disease, a fatal condition resulting from a failure to process cholesterol; one form of age-related macular degeneration, the most common uncorrectable cause of loss of vision in the elderly; and a type of familial atrial fibrillation, which causes strokes and heart-rhythm abnormalities. Researchers also reported a link between autism and a specific gene, and others found a gene that can suppress the development of tumours in the brain, breast, and prostate. All these discoveries could in time lead to earlier detection and, perhaps, treatment for the conditions concerned.

Cardiovascular Disease.
      Obesity and efforts to combat obesity continued to pose major health problems, particularly to the heart. In the United States two popular prescription diet drugs—dexfenfluramine and fenfluramine—were withdrawn from the market in September at the recommendation of the FDA. The two drugs were often taken singly or with another drug, phentermine, in a combination known popularly as "fen-phen." In November preliminary studies suggested that as many as one-third of the drugs' users may have suffered heart valve damage. People who had taken either of the diet drugs were urged to consider having a medical checkup. Valve damage can make people more vulnerable to bacterial infection of the heart following dental and medical procedures.

      Obesity is a problem that often begins in childhood. New research from a heart study done in Bogalusa, La., found that the children of parents with heart disease were more often overweight than were other children. They also had a higher incidence of obesity—and heart disease risk factors like elevated cholesterol levels—when they became young adults.

      In addition to reducing their weight, Americans fighting heart disease needed to double the amount of fibre in their diets to help lower their blood cholesterol and control their body weight, according to a report from an American Heart Association nutrition committee. The committee suggested that a variety of grains, beans, other vegetables, and fruits—important sources of fibre—be included in the diet.

      A new Harvard University study found that margarine and other foods made with hardened vegetable oils, including many baked goods, contain a "trans fat" that could increase the risk of heart disease by as much as one-third. Such fats may be even worse than the saturated fats found in meats and cheese, according to the study of more than 80,000 women in the Harvard Nurses' Health Study. University of Washington researchers released a new study that found that lowering the amount of total fat in the diet to about 30% of total intake helped lower cholesterol in people with high levels. This supported national guidelines set by government experts. The study found, however, that more aggressive fat-restriction diets may not help and may even hurt by decreasing levels of high-density lipoprotein, the so-called good form of cholesterol.

      In addition to a low-fat diet and exercise, new research evidence suggested that cholesterol-lowering drugs, such as lovastatin, may be valuable in reducing heart attacks in patients with borderline cholesterol levels and no sign of heart disease as well as in patients with high cholesterol and a history of heart disease. Such medication might be important for patients with a family history of heart disease.

      U.S. researchers demonstrated the beneficial effect of fish consumption in relation to coronary heart disease. They had a unique opportunity to study a group of 1,800 men who were aged 40 to 55 and free of cardiovascular disease when they were first enrolled in a health study in 1957. Follow-up studies showed that those who were eating 35 g (1.2 oz) or more of fish each day at the outset were much less likely to have suffered a fatal heart attack over the ensuing 30 years than were those who avoided fish altogether.

Cancer.
      American women in their 40s received conflicting advice about whether to get regular screening mammograms for breast cancer detection. Two major cancer organizations recommended that women aged 40-49 be regularly screened with mammography, a low-dose X-ray test intended to pick up hidden breast tumours. The American Cancer Society, a major voluntary group, urged all women 40 and older to get a mammogram every year, and the government's National Cancer Institute (NCI) said that women 40 and older should be screened every one to two years. Earlier, however, an advisory group convened by the NIH had concluded that the available scientific evidence was not strong enough to warrant a universal recommendation that all women in their 40s get screening mammograms. The panel said women in that age group should decide for themselves, after weighing the risks and benefits. There had long been strong medical agreement that women aged 50 and older should obtain mammograms on a regular basis.

      In the treatment of breast cancer, researchers with the U.S. National Surgical Adjuvant Breast and Bowel Project, a federally funded group, recommended that most patients with early-stage disease, regardless of their age, the type of tumour, or the chance that the cancer has spread to nearby lymph nodes, consider undergoing chemotherapy in addition to surgery. The recommendation followed a large new study showing that even lower-risk patients with localized breast cancer that had not spread and who were estrogen-receptor-positive, a sign of a more positive outcome, were more likely to live longer and be disease-free five years after having received a combination of chemotherapy and hormone therapy with tamoxifen.

      Evidence from Taiwan showed that the prevention of hepatitis B could also lead to a reduction in the incidence of hepatocellular carcinoma. This type of liver cancer had long been associated with the hepatitis B virus, although the precise relationship was unclear. The Taiwan study revealed that during the decade since the inception of a nationwide immunization program, not only had hepatitis B declined, but hepatocellular carcinoma in children had also fallen to half its original level.

      Research in Australia on the effect of diet on cancer suggested that the risk of breast cancer was lower in women who had a high intake of phytoestrogens. These are chemicals, found in many edible plants, whose chemical structures are similar to that of estrogen. One type occurs predominantly in soy products, and another is in grains, fruits, and vegetables.

      International collaboration clarified the previously uncertain relationship between breast cancer and hormone replacement therapy (HRT). An analysis showed a slight increase in risk of the disease for every year of use of HRT. The effect is reduced after cessation of the therapy and largely, if not entirely, disappears after five years.

Infectious Diseases.
      Communicable diseases were the subject of both good and bad news in 1997. Several newly emerging infections provoked concern, as did the spread of strains of bacteria that cause familiar diseases but that have become resistant to the antibiotics routinely used for treatment. There was also evidence, however, that this problem could be ameliorated by more prudent use of antibiotics.

      Arguably the most ominous development was the isolation of a resistant strain of Yersinia pestis, the organism responsible for bubonic plague, from a patient in Madagascar. It was insensitive to every one of the antibiotics normally administered to combat this life-threatening disease. Because bubonic plague is acquired from fleas that carry the bacterium from infected rats, there was little chance of epidemics in most countries. Nevertheless, the discovery of the multidrug-resistant strain was disquieting, especially since the resistance could be passed on to other, initially sensitive strains of Y. pestis.

      There was dramatic evidence from Finland of how more selective prescribing of antibiotics could lead to a decline in the prevalence of resistant bacteria. This followed anxiety earlier in the 1990s over increasing resistance to erythromycin in streptococci, which cause skin and respiratory infections. National guidelines were instituted so that hospital outpatients received erythromycin only when strictly necessary and only in the required dosage. As a result, the frequency of resistant strains in throat swabs and pus samples fell over four years from 16.5% to 8.6%. Although such a trend might have been predicted, its magnitude was unexpected. It was also the first conclusive demonstration of the benefits of the discriminating deployment of antibiotics.

      Federal health statistics documented a dramatic one-year decline in the U.S. death rate for HIV/AIDS, a 26% drop between 1995 and 1996. The latest annual report showed that HIV infection, which had been the leading killer of Americans 25-44, now ranked second in that age group, behind accidents and their adverse effects (largely from car crashes) and just ahead of cancer. Mortality from HIV had increased significantly between 1987 and 1994; the first evidence that mortality was leveling off appeared in 1995.

      Nonetheless, there was renewed concern about prevention of new HIV cases, particularly among young people. It was highlighted by an alarming case in which a 20-year-old HIV-infected man may have created a one-man AIDS epidemic. Nushawn Williams, apparently aware of his HIV status, had unprotected sex with numerous young women in a rural area of western New York and in New York City. In an unusual move, health authorities obtained court permission to bypass AIDS confidentiality laws and release the man's name, which led dozens of women to get their blood tested for signs of infection.

      On the AIDS treatment front, many patients receiving powerful combination drug therapy remained in good health. Hopes for a permanent cure were put on hold, however, when three teams of researchers reported that the virus could hide out in the body's immune cells even in patients with no signs of the virus in their blood for as long as two years. The research suggested that although the virus can be held at bay, patients may have to stick with the drug treatment indefinitely unless new approaches can be developed.

      Scientists found that along with AIDS, other serious infectious diseases, such as tuberculosis, had recently been spread in medical settings by the use of contaminated instruments. Patients in South Carolina and Maryland were found to be infected with TB after they had undergone a common procedure, fibre-optic bronchoscopy, in which a lighted tube is inserted into the lungs for diagnosis or treatment. Researchers were able to prove that the infections were caused by bronchoscopes that had not been properly cleaned. About 460,000 patients undergo fibre-optic bronchoscopy in the U.S. each year.

      Stricter hygiene precautions in slaughterhouses and butcher shops were recommended in the report of an inquiry into the previous year's outbreak of food poisoning in Scotland attributed to E. coli. Although this bacterium was at one time considered to be entirely innocuous, strain O157 not only attacks the intestinal tract but also can trigger life-threatening kidney failure. Nearly 500 people became ill and 19 died during the Scottish epidemic—the world's second worst outbreak of disease caused by E. coli. Public health authorities in other countries were advised on measures to prevent the organism, which occurs in the feces of infected cattle, from reaching meat for human consumption.

      An international team reported that a new drug, zanamivir, reduces the symptoms of influenza A or B if treatment is begun sufficiently early. Trials in 38 centres in North America and Europe indicated that zanamivir is a valuable supplement to vaccines for the treatment of influenza; some vaccines may not be effective against new strains of the virus.

      Early in December a new strain of influenza appeared in Hong Kong. By the year's end at least 16 people were known to have been infected, and 4 had died. Researchers determined that the virus was the first ever to have been transmitted directly from birds to humans. More than one million chickens, ducks, and geese were subsequently slaughtered in Hong Kong.

      U.S. virologists also reported success in preventing rhinovirus infections by spraying into chimpanzees' noses a substance to prevent the virus from invading their cells. Researchers believed that this method could be used to prevent the 50% of human colds that are caused by rhinoviruses.

      There was a major advance in immunization against meningitis and pneumonia produced by Haemophilus influenzae type B (Hib). Vaccines had been highly effective in recent years in preventing Hib infections in industrialized countries. There was, however, no comparable evidence of their efficacy in less-developed countries. The success of a new trial, conducted in The Gambia, indicated that a Hib vaccine will substantially reduce childhood deaths due to meningitis and pneumonia in less-developed countries.

Alternative Medicine.
      Acupuncture, the ancient Chinese practice of using thin needles for treating various ailments, gained an endorsement from American mainstream medicine. An NIH panel concluded that it could be effective in treating nausea and vomiting from surgery, chemotherapy, or pregnancy, as well as postoperative dental pain. The panel said that there was some evidence that acupuncture could also be helpful in treating muscle and skeletal aches, low back pain, headache, drug addiction, arthritis, and asthma. (See Special Report: Alternative Medicine.)

Smoking.
      There was considerable progress in clarifying the effect of smoking, especially passive smoking, on cancer and other conditions. First, a large-scale analysis by London-based epidemiologists, bringing together 19 separate research studies, concluded that marriage to a smoker increased by 26% the chances of a nonsmoking partner's developing lung cancer. There was also a clear dose-response relationship: those breathing in more tobacco smoke were correspondingly more likely to contract the disease. Another major study, conducted at the Harvard School of Public Health, showed that regular exposure to other people's smoke nearly doubled a nonsmoker's risk of contracting coronary artery disease. An analysis by the London group put the increased risk at about 25%.

      A new analysis of five major studies of the health effects of cigarettes found that the hazards to women smokers were rising most quickly, with the largest increases occurring in the risks of lung cancer and other smoking-related cancers. The 565-page report released by the NCI found that overall smoking-related mortality rates from all causes, including cancer, heart disease, stroke, and lung disease, had increased among both women and men since the first Surgeon General's report, in 1964, on the health hazards of smoking. For example, a comparison of two long-term studies, one starting in 1959 and the other in 1982, found that lung cancer risks of male smokers doubled between the two studies, whereas the relative risk increased more than fourfold among female smokers. The report noted that cigarettes currently contained smaller amounts of hazardous tar and nicotine than in the past, but the lifetime exposure to cigarette smoke was greater because smokers started earlier, inhaled more deeply, and consumed more cigarettes per day. Another study warned that China, the country with the most smokers in the world, was in the early stages of a smoking epidemic that would likely get much worse. Unless control measures were taken, half of the current 300 million Chinese smokers could die from smoking-related illnesses, according to an estimate by a University of Hong Kong research group. Among China's male smokers, the chief causes of death were cancers of the lung, esophagus, and liver.

      Yet another indictment of cigarettes came from a Chinese study showing that children whose fathers smoked faced a higher risk of developing early childhood cancers than those of nonsmoking fathers. The study, conducted in Shanghai by Chinese and American researchers, suggested that the risk occurred before conception from sperm damaged by paternal smoking.

      Regarding the effects of active smoking, another analysis in London showed that habitual use of cigarettes also contributed to the loss of bone density. This, in turn, increased the risk of hip fracture by about 50%. Experience reported from the Mayo Clinic in Rochester, Minn., revealed that patients having the operation known as percutaneous coronary revascularization, to increase blood flow to the heart muscle, should be discouraged from smoking. Those who continued to smoke after surgery were much more likely to develop serious irregularities of the heartbeat—and to die—than those who gave up the habit.

      This article updates medicine.

Mental Health
      In 1997 investigations carried out in Iowa City, Iowa, on 17 patients at an early stage in their illness shed new light on the cause of schizophrenia. Prior research had suggested that individuals with schizophrenia have abnormally low metabolic activity in a part of the brain called the prefrontal cortex. Past studies were not conclusive, however, since results may have been affected by medication and by the chronic state of the patients' illness. The 10 young men and 7 young women in the new study had not yet been treated with drugs. Most were experiencing symptoms for the first time and had not been previously admitted to a hospital. The research team used a scanning technique, positron emission tomography, to examine blood flow in various parts of the patients' brains. The brains of 17 healthy volunteers of similar ages were also examined.

      Compared with the volunteers who were used as controls, the schizophrenia patients had extensive areas of abnormally low blood flow in several regions of the prefrontal cortex, but other regions showed abnormally high blood flow. This suggested an imbalance between different parts of the brain. The investigators concluded that the dysfunction they observed may impair the normal function of the brain in schizophrenia patients so that the brain cannot process input efficiently or produce output effectively. This impairment leads to hallucinations, delusions, and difficulty in making decisions.

      The value of clozapine in treating schizophrenia also became clearer as a result of the first long-term assessment of its use. Clozapine, a relatively expensive drug, was already known to be effective in reducing hallucinations and delusions. Psychiatrists were uncertain as to whether it was more effective than other drugs, especially haloperidol, in dealing with lack of motivation and other symptoms. The new study involved more than 400 patients whose schizophrenia was difficult to manage and who required frequent hospitalization. They were monitored over one year at 15 Veterans Affairs medical centres in the U.S. The results confirmed that clozapine was somewhat more effective than haloperidol. Clozapine also had fewer side effects, and the overall costs were similar for both drugs.

      Research conducted in Oxford, Eng., threw new light on the brain chemistry responsible for the relatively common disorder known as depression, greatly strengthening the hypothesis that the condition is associated with reduced activity of serotonin, a neurotransmitter that nerve cells use to communicate with each other. Substances that increase serotonin activity in the brain act as antidepressants; however, there was no direct evidence that depression can be triggered by a low level of serotonin. The Oxford findings provided the most convincing evidence to date.

      The subjects in the Oxford study were 15 women who had suffered recurrent episodes of depression but had recovered and were no longer on drug treatment. In laboratory tests they drank a mixture of amino acids that either included or excluded tryptophan, the amino acid from which serotonin is synthesized. Before the tests and seven hours afterward, the women were evaluated to determine their level of depression; they also rated their own mood. After drinking the tryptophan-free solution (which reduced the level of tryptophan in their bloodstream by 75%), 10 of the 15 women experienced temporary but significant depressive symptoms. The mixture including tryptophan had no such effect. Thus, it seemed that a rapid decrease in tryptophan could precipitate depression in vulnerable individuals, probably by depleting the amount of serotonin in the brain.

      There was a major innovation in London in applying a technique—already widely used in the treatment of psychiatric disorders—to help people who are not suffering from specific mental health problems. The technique was cognitive behavioural training (CBT), which aimed to modify patients' perceptions of the external and internal reasons for their successes and failures in life. It had been successfully adopted in dealing with depression and obsessive-compulsive disorder. The London researchers felt that the same approach could be helpful for the long-term unemployed: it might aid individuals who are free of psychiatric illness but who have developed reduced expectations and self-esteem, which lessen the likelihood of a successful job hunt and perhaps reduce the motivation to look for work at all.

      A total of 200 volunteers took part in the experiment and were divided at random into equal groups to receive either three-hour CBT sessions each week for seven weeks or corresponding sessions that simply emphasized social support. Before and after the program, participants completed questionnaires regarding their mental health and their efforts to find employment. Both groups improved their mental health scores during the training period, but the improvement was significantly greater among those who had received CBT. The groups did not differ in their job-seeking activity during or immediately after training. Four months later, however, 34% of the individuals given CBT had found full-time work, as compared with only 13% of those on the alternative program. The organizers of the study concluded that CBT can improve mental health status and produce tangible results in job hunting, to the benefit of individuals and society at large.
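The size of the CBT trial's employment effect can be illustrated with a standard relative-risk calculation. The sketch below assumes exactly 100 participants per arm (the article reports 200 volunteers split equally) and treats the 34% and 13% figures as 34 and 13 individuals in full-time work; these counts are an illustrative reconstruction, not figures taken from the original report.

```python
import math

# Assumed counts, reconstructed from the reported percentages:
# 200 volunteers split equally; 34% vs. 13% in full-time work four months later.
n_cbt, employed_cbt = 100, 34
n_ctrl, employed_ctrl = 100, 13

p_cbt = employed_cbt / n_cbt
p_ctrl = employed_ctrl / n_ctrl

# How many times likelier CBT recipients were to have found full-time work.
relative_risk = p_cbt / p_ctrl

# 95% confidence interval for the relative risk, using the usual
# log-normal approximation for the standard error of ln(RR).
se_log_rr = math.sqrt((1 - p_cbt) / employed_cbt + (1 - p_ctrl) / employed_ctrl)
lo = math.exp(math.log(relative_risk) - 1.96 * se_log_rr)
hi = math.exp(math.log(relative_risk) + 1.96 * se_log_rr)

print(f"relative risk: {relative_risk:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# → relative risk: 2.62 (95% CI 1.47-4.65)
```

Because the interval excludes 1.0, the roughly 2.6-fold advantage reported for the CBT group would be unlikely to arise from chance alone under these assumptions.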


      This article updates mental disorder.

Veterinary Medicine
      A global overview of animal health problems by the Office International des Epizooties (OIE) in 1997 surveyed progress in controlling major disease threats to the world's livestock population. Rinderpest was restricted to areas of sub-Saharan Africa, the Middle East, and India; a coordinated vaccination program aimed to eradicate the disease completely in the coming decade. Foot-and-mouth disease was eliminated or controlled in North America, southern South America, Europe, Oceania, Japan, and Southeast Asia. An outbreak in Taiwan, however, resulted in the slaughter of three million pigs. Contagious bovine pleuropneumonia remained a serious concern in Africa, where it was spreading to the south of the continent.

      Another cause for concern was classical swine fever, which reappeared in Europe. In The Netherlands economic losses resulting from the disease exceeded $250 million. Bovine tuberculosis was again seen in many regions, while the incidence of brucellosis in small ruminants and trypanosomiasis in cattle in Africa and horses in Asia and the Middle East was also creating problems. Rabies was being controlled in Western Europe as a result of an oral-vaccination campaign in foxes, but it continued to represent a growing threat in less-developed countries and in Eastern Europe.

      The number of suspected cases of bovine spongiform encephalopathy (BSE; "mad cow" disease) in Great Britain continued to fall. By the middle of 1997, the number of new cases reported each week had dropped to about 100 from a peak of 1,000 in 1993. About 80% of the cases were confirmed. The policy of culling all cattle over 30 months old, introduced in 1996 at the direction of the European Commission, resulted in the slaughter of 1.3 million cattle. A ban on British exports of beef and beef products remained in force, although progress was made toward lifting the ban on certified BSE-free herds where lifetime identity records had been maintained.

      Actuarial studies of life expectancy and the causes of mortality in different breeds of dog were in their infancy. The increasing amount of insurance taken out for companion animals was, however, establishing a database from which patterns were beginning to emerge. A study of data on more than 220,000 Swedish dogs enrolled in life insurance programs analyzed rates of mortality and identified 25 breeds that had either consistently high or consistently low mortality. Large breeds generally tended to die earlier, with the Irish wolfhound topping the list; smaller breeds had much lower rates of mortality, the rate for the soft-coated wheaten terrier, for example, being one-ninth that for the Irish wolfhound.

      At the Roslin Institute, Edinburgh, researchers cloned a lamb from a single cell derived from the mammary gland of an adult sheep. The cell nucleus was implanted into an egg from another sheep and transferred to a third, which carried the embryo to full term and gave birth to a healthy lamb, named Dolly. A similar cloning technique was used to produce a transgenic lamb carrying a human gene for a therapeutic protein, which could be harvested from the adult sheep's milk during lactation and purified for medical use. The technique was expected to facilitate the production of a range of proteins with specific medical applications, such as the treatment of cystic fibrosis. (See Special Report (Life Sciences).)

      Described as the largest small-animal congress ever held, a joint meeting of the World Small Animal Veterinary Association, the British Small Animal Veterinary Association, and the Federation of European Companion Animal Veterinary Associations at Birmingham, Eng., in April attracted an attendance of more than 6,700. The meeting paid particular attention to the potential of information technology for improving veterinary services.


      See also Molecular Biology (Life Sciences).

▪ 1997

      Health issues played a prominent role in politics around the world in 1996. In the United States, Pres. Bill Clinton made the prevention of tobacco use by children a campaign issue. His Republican opponent, Bob Dole, citing statistics showing a resurgence in teenage drug use, charged the Clinton administration with failure to combat drug abuse among American youth. While Dole opposed abortion, Clinton championed abortion rights and vetoed a controversial bill banning late-term "partial birth" abortions. There was bipartisan support, however, for legislation to make health insurance "portable" when workers change or lose jobs.

      Amid new revelations about exposure of U.S. troops to chemical weapons in the 1991 Persian Gulf War, Congress and a White House commission investigated critics' charges that the Pentagon had failed to respond adequately to Gulf veterans' health problems. Great Britain, the Czech Republic, and Slovakia also announced that they would broaden investigations of their Gulf troops' health complaints.

      In November Russian Pres. Boris Yeltsin underwent coronary artery bypass surgery to treat the severe heart disease that had threatened his personal and political health throughout the year. In Great Britain the government of Prime Minister John Major faced a crisis in consumer confidence as evidence mounted of a link between "mad cow" disease and a new form of a fatal human brain disorder; fears about the safety of the food supply led many European countries to ban the import of British beef. Another controversy erupted in July when fertility clinics in England were required to discard thousands of unclaimed frozen human embryos that had been stored for five years, the maximum time allowed by law.

      The pace of the Human Genome Project accelerated in 1996. In October scientists from the U.S., Canada, Europe, and Japan published the most complete map to date, detailing the sequence and location of more than 16,000 of the estimated 50,000-100,000 human genes. The new map, available on the Internet through the U.S. National Library of Medicine, was expected to be a valuable tool in the search for genes that predispose individuals to disease.

      Progress continued to be made in locating specific disease-related genes. Scientists in Seattle, Wash., identified the gene for Werner's syndrome, a rare inherited disease marked by premature aging. Affected individuals usually die in their 40s of heart attacks or cancer. Further study of the gene, located on chromosome 8, was expected to yield clues to the normal aging process.

      After a four-year search U.S., Australian, and Swedish researchers cloned a tumour suppressor gene that, when mutated, was believed to be responsible for basal-cell skin cancers. A research team in Philadelphia identified a gene that may be involved in esophageal, stomach, and colon cancers, and U.S. and Swedish scientists announced the discovery of a gene believed to predispose men to cancer of the prostate. U.S. and Italian researchers identified a site on chromosome 4 that is linked to some cases of Parkinson's disease, a common neurodegenerative disorder.

      A collaborative study by researchers in several countries revealed the gene responsible for Friedreich's ataxia, a disorder that affects gait and strength in the legs and confines most victims to a wheelchair by their late 20s.

      The prospect of a simple, noninvasive prenatal test based on the isolation of fetal cells from the mother's blood was advanced by scientists at the University of California, San Francisco, who used the technique to diagnose inherited blood disorders in two fetuses. As genetic testing became more feasible, its potential pitfalls became more apparent. A study of individuals with a family history of breast or ovarian cancer found that fewer than half wanted to undergo an experimental test for genetic susceptibility. Many declined because of concerns about job and insurance discrimination if they tested positive. Another study of 332 individuals in families with genetic disorders found that one-fourth believed they had been discriminated against in terms of obtaining life insurance; one-fourth reported discrimination in obtaining health insurance, and 13% reported discrimination in employment.

Cardiovascular Disease.
      Two studies sponsored by the U.S. National Institutes of Health (NIH) found that dietary changes and weight loss can prevent and control high blood pressure (hypertension). A multicentre investigation involving more than 450 adults with and without hypertension found that reducing overall fat intake and eating more fruits and vegetables (9 to 10 servings a day) and low-fat dairy products (3 servings a day) were as effective as drugs in lowering blood pressure. A study of more than 900 people aged 60 to 80 found that blood pressure control could be safely maintained in many subjects by means of weight loss and reduced salt intake without the use of antihypertensive drugs.

      The debate over the contribution of dietary sodium (salt) to hypertension continued. A meta-analysis of 56 trials of salt restriction published in May found that older people with high blood pressure benefited, but younger individuals with normal blood pressure did not. The Canadian authors concluded that current recommendations calling for universal dietary sodium restriction are unnecessary. They were challenged, however, by scientists from the Intersalt study, an investigation of the relationship between salt and blood pressure in more than 10,000 people in 32 countries. An update of the 1988 Intersalt data, also published in May, reaffirmed the importance of salt restriction in the control of blood pressure.

      Researchers at Tufts University, Medford, Mass., found that high levels of a "bad" form of cholesterol, known as lipoprotein(a), can double a man's risk of premature heart attack. New research showed that the fatty substances known as triglycerides can thicken the blood and increase the risk of heart attacks at lower levels than previously thought. A large study of an experimental blood-thinning drug called clopidogrel found it to be more effective and safer than aspirin in preventing heart attacks, while a separate study showed that an experimental clot-inhibitor, integrelin, works better than aspirin in patients suffering from reduced blood flow to the heart.

      There was further evidence of the life-saving potential of thrombolytic, or clot-dissolving, drugs for treatment of acute heart attack. Research at Erasmus University, Rotterdam, Neth., highlighted the importance of administering the drugs as soon as possible after the attack. This conclusion was supported by work in Scotland showing that among patients receiving thrombolytic drugs two or more hours after the onset of symptoms, every hour's delay in administration appreciably reduced long-term survival.

      A Dutch study published in January found that laser surgery to open blocked arteries—laser angioplasty—is no more effective than balloon angioplasty, a procedure in which a balloon threaded into a blocked vessel is inflated at the site of the blockage. Not all physicians who perform balloon angioplasty have sufficient experience with the procedure, however. According to a report presented at the annual meeting of the American Heart Association in November, patients whose doctors performed an annual average of only 30 balloon angioplasties had higher death rates and required additional surgery more often than those whose clinicians performed 50 or more procedures.

      An analysis of 300 men and women with coronary heart disease in Belgium reawakened interest in personality as a factor that—along with cholesterol, blood pressure, and other variables—can affect prognosis. It showed that those with so-called type-D personality—characterized by depression, social alienation, and the suppression of feelings—had significantly higher death rates than those with other personality types. A Harvard Medical School study of 1,305 veterans found that the grumpiest old men—those who reported episodes of extreme anger—were at three times greater risk of heart disease than their more placid counterparts.

      More than three decades after the first U.S. surgeon general's report on smoking and lung cancer, scientists finally uncovered a distinct biological mechanism by which tobacco use can cause lung tumours. Researchers demonstrated that benzo[a]pyrene, a component of tobacco smoke, damages specific regions of the key tumour suppressor gene p53. These same regions are commonly found to be mutated in human lung cancer patients.

      The National Cancer Institute (NCI) reported in November that U.S. cancer death rates had decreased in the 1990s—the first such drop recorded in the 20th century. Experts attributed the reduction of nearly 3% between 1991 and 1995 to inroads against smoking, earlier diagnosis, and better treatments. The decline in cancer deaths was greater in men than in women. This disparity was attributed to the drop in deaths from lung, colorectal, and prostate cancers. The smaller reduction observed among women reflected declining death rates from breast, colorectal, and gynecologic cancers. Lung cancer deaths in women had continued to rise, however. Moreover, despite lower mortality rates, U.S. cancer incidence—the number of new cases reported—had increased during the 1990s.

      Researchers at the Harvard School of Public Health announced that the vast majority of cancer deaths resulted from unhealthy lifestyles. Their report, published in the journal Cancer Causes and Control, concluded that only 2% of cancer mortality was attributable to environmental exposures and only 10% to genetics.

      A study by the NCI found that black men have higher overall rates of new cancers and of cancer deaths than whites, largely because they have a disproportionate incidence of prostate and lung cancers. An expert panel convened by the NIH called for broader screening for cervical cancer, noting that regular use of the Pap test could virtually eradicate the disease.

Infectious Diseases.
      The 11th International Conference on AIDS, held in July in Vancouver, B.C., was marked by unprecedented optimism. Most encouraging was the news that for many people with AIDS, proper treatment may prolong life indefinitely. Several studies found that combination therapy—the concurrent use of two or more anti-HIV drugs—reduces the amount of virus circulating in the blood, delays the progression of HIV infection to AIDS, and improves patient survival. By the year's end U.S. physicians had access to nine different anti-HIV drugs for use alone or in combination. In much of the world, however, the cost of such therapies put them out of reach of most HIV-infected individuals.

      The year was also marked by significant advances in the understanding of how HIV infects cells and why some people who are exposed to the virus do not become infected. First, two coreceptors for HIV were identified. Coreceptors are molecules on a cell's surface that, together with other molecules, mediate the entry of substances into the cell. A single HIV receptor, CD4, had long been recognized, but the existence of others was suspected. In 1996 scientists identified two of these, which they named fusin and CKR-5 (also called CCR-5). It was subsequently discovered that people who possess two mutated copies of the gene that codes for CKR-5 are virtually immune to the most common strains of HIV. Moreover, infected persons who have one mutated gene for the receptor are slower than others to progress to AIDS.

      Nearly every continent was affected by outbreaks of infectious disease in 1996. Dengue, a mosquitoborne viral infection, was responsible for at least 8,000 cases of illness in the Mekong delta of Vietnam, while the even more deadly variant known as dengue hemorrhagic fever killed some 300 people and made thousands more ill in New Delhi. Public health officials in South America worked to contain an epidemic of another mosquitoborne viral infection, Venezuelan equine encephalitis, a disease that affects both horses and humans. Meningitis killed more than 4,500 in western Africa, and the rare but extremely lethal Ebola virus, which had surfaced in Zaire in 1995, claimed 10 lives in Gabon.

      Foodborne diseases captured the headlines in developed countries. A deadly strain of the bacterium Escherichia coli, earlier blamed for the deaths of U.S. youngsters who ate undercooked hamburgers, affected about 9,500 people in Japan between May and November. In the U.S. a small outbreak of E. coli infection was traced to unpasteurized apple juice. Individuals in 11 states and Canada fell victim to a gastrointestinal disorder attributed to a little-known organism, Cyclospora cayetanensis; investigators traced the infection to raspberries grown in Guatemala, but the source of the contamination remained undetermined.

Lifestyle, Habits, and Health.
      Blindness was added to the long list of adverse consequences of smoking. Two Boston studies found that pack-a-day male and female smokers were more than twice as likely as nonsmokers to develop age-related macular degeneration, the leading cause of blindness in elderly Americans. A study of deaths among Minnesota alcoholics found that one-half had died of smoking-related causes, including heart disease and cancer, while only about one-third succumbed to alcohol-related disorders.

      Research conducted in Sydney, Australia, and London added to the rapidly accumulating evidence that passive smoking is a substantial cause of heart disease. The investigation focused on the capacity of the arteries to dilate in response to bodily demands for increased blood supply. Impairment of this capacity had been implicated in the onset of atherosclerosis (narrowing of the arteries due to buildup of fatty deposits) and had been demonstrated in young cigarette smokers. The new research showed that the capacity of the arteries to dilate is also significantly reduced in young adults who have never smoked but have been exposed to tobacco smoke for at least one hour daily for three or more years.

      Two studies, however, found intriguing evidence of tobacco's positive effect on the brain. Neuroscientists at Baylor College of Medicine, Houston, Texas, described how low concentrations of nicotine in the blood help to improve memory by triggering communication between nerve cells, while scientists at Case Western Reserve University, Cleveland, Ohio, found that nicotine may help prevent or delay the formation of neural plaques, brain lesions characteristic of Alzheimer's disease.

      The Centers for Disease Control and Prevention (CDC) reported that about 90% of tobacco users had begun using tobacco by age 18 and that tobacco use among teens was continuing to rise. Concern about this problem propelled the U.S. Food and Drug Administration (FDA) to declare tobacco an addictive drug. The agency also finalized new regulations that required store clerks to verify the age of young people buying tobacco products. (The minimum age for purchase was 18.) Restrictions on tobacco advertising and promotion were also made more stringent.

      The federal government reported that alcohol-related driving deaths rose by 4% in 1995, the first such increase in a decade. About 4 out of 10 traffic fatalities involved alcohol.

      A 17-year follow-up survey of 11,000 vegetarians and "health-conscious" people in the U.K. showed that those who ate fresh fruit each day had a 24% lower risk of dying from coronary heart disease and a 32% lower risk of dying as a result of a stroke. The overall death rate in this group was also 21% lower than that of a control group of individuals who did not eat fruit regularly. In addition to reduced mortality from heart disease and stroke, the decline in deaths overall was largely attributable to a decreased rate of deaths from lung cancer and respiratory conditions, which may have reflected the low proportion of smokers (11%) in the sample.

      In April the FDA approved the first antiobesity drug in 23 years, a chemical called dexfenfluramine, which helps dieters eat less by reducing the craving for food. The drug was already available in Europe. Dexfenfluramine and related medications were not without risk, however. A European study found a slight increase in primary pulmonary hypertension (elevated pressure in vessels carrying blood to the lungs), a potentially fatal condition, in patients taking these drugs.

      The first U.S. surgeon general's report on physical activity and health, released in July, concluded that any activity that burns at least 150 calories per day can help reduce the risk of such chronic ailments as heart disease, diabetes, and depression. Such activity can include swimming laps (20 minutes), gardening (30-45 minutes), and washing and waxing a car (45-60 minutes). The report noted, however, that more than 60% of U.S. adults are not physically active on a regular basis, and of these, 25% do not get any exercise at all.

Other Developments.
      An extensive investigation into sudden infant death syndrome (SIDS, or cot death) in the U.K. demonstrated that infants were at heightened risk if the mother had smoked during pregnancy. Household exposure to tobacco smoke had an additional, independent effect in increasing the likelihood of SIDS. Clinicians in New Zealand reported that there was an appreciably lower risk of SIDS among infants who slept in the same room—but not those who slept in the same bed—as their parents.

      An analysis conducted at the Stanford University School of Medicine clarified the benefits to be obtained by screening healthy individuals for the ulcer-causing bacterium Helicobacter pylori and treating those who are infected with antibiotic drugs. When present in the stomach, H. pylori is an important risk factor for gastric cancer, the second leading cause of death from cancer worldwide. Although the bacterium occurs in 30-40% of the U.S. population, fewer than 1% of these people develop cancer. The Stanford study showed, however, that screening and treatment are potentially cost-effective in preventing gastric cancer, especially in high-risk populations like that of Japan and other Asian countries. In the U.S. a breath test for detecting the bacterium was licensed for use by health care professionals.

      Still other newsworthy developments of 1996 included the following:

      The Pentagon announced that it was stepping up efforts to investigate possible causes of Persian Gulf War veterans' health complaints. Although two reports published during the year showed that U.S. troops who served in the Gulf did not have higher death or hospitalization rates than other military personnel, preliminary findings from two other studies indicated that Gulf War vets were more likely than others to suffer from serious, even disabling, medical conditions.

      A University of Kentucky study compared brief autobiographies written by young nuns 60 years earlier with tests of the brain function of the now-elderly women. The researchers found that the nuns whose early writings demonstrated high idea density and grammatical complexity were less likely to have developed Alzheimer's disease in later life than those whose prose style was simple. This finding may indicate that the brain deterioration of Alzheimer's begins long before typical signs of the disease—cognitive impairment and personality changes—become apparent. In other Alzheimer's research, estrogen replacement therapy was shown to reduce the risk of the disease in postmenopausal women, and one small study—of only 12 subjects—found that estrogen treatment significantly improved memory and concentration in elderly women diagnosed with Alzheimer's.

      The United Network for Organ Sharing, the agency responsible for allocating all donor organs in the U.S., changed the national liver transplantation guidelines, giving top priority on the waiting list to critically ill individuals with sudden onset of a liver disorder, who had the best survival chances. Previously, priority had been given to those suffering from chronic liver disease, who, because of their long-term illness, were less likely to make a successful recovery.

      The CDC formally endorsed a change in the schedule of routine childhood immunizations, recommending that all U.S. children receive two injections of inactivated polio vaccine (IPV), followed by two doses of oral polio vaccine (OPV). Previously, the schedule had called for four doses of OPV. The change was made because the oral vaccine, which uses live virus, was occasionally known to cause the disease. IPV was already being used for routine immunization in Scandinavia, France, The Netherlands, and Canada.

      U.S. automotive safety experts issued new warnings that air bags, intended to save lives during a car crash, may pose a significant risk of death to children under 12 and to some small adults sitting in the front passenger seat. They advised drivers to keep younger children belted in the back seat.


      In 1996, for the first time, a multinational study was able to demonstrate clear similarities and differences in the rates of specific mental illnesses in several countries throughout the world. Unlike previous investigations, in which different methods were employed in different countries, the new survey was based on a uniform methodology. Its purpose was to assess the pattern and extent of two conditions, major depression and bipolar disorder, in Canada, France, the former West Germany, Italy, South Korea, Lebanon, New Zealand, Puerto Rico, Taiwan, and the U.S.

      One principal finding was that the frequency of major depression varied considerably. The lifetime rate ranged from 1.5 cases per 100 adults in Taiwan to 19 cases per 100 adults in Lebanon. Similarly, the annual rate ranged from 0.8 cases per 100 adults in Taiwan to 5.8 cases per 100 adults in New Zealand.

      There was, however, much less variation in the pattern of major depression. In all countries in the study, this condition had a similar age of onset (usually the mid- to late 20s) and affected more women than men. Persons who were separated or divorced had significantly higher rates of major depression than married persons in most of the countries. The majority of those affected reported both insomnia and loss of energy.

      In the case of bipolar disorder, the data showed more uniformity in both the frequency and the pattern of the disorder. The lifetime rates ranged only from 0.3 per 100 adults in Taiwan to 1.5 per 100 adults in New Zealand. The sex ratios were nearly equal, and the age at onset was earlier than for major depression.

      The investigators believed that cultural differences or varying risk factors may at least partially explain the differing rates of major depression. Nevertheless, some of the findings remained puzzling. For example, Paris, a city with a temperate climate and a stable economy and political structure, had a rate of major depression almost as high as that of Beirut, Leb., which was ravaged by war for some 15 years.

      Psychiatrists in London reported that black Caribbean and African patients suffering from certain psychotic illnesses differed from whites in the likelihood of involuntary hospitalization. The subjects were individuals from two areas in the south of the city; they had conditions such as schizophrenia and psychotic affective disorder. The higher rate of compulsory detention for blacks was independent of psychiatric diagnosis and was irrespective of other factors such as employment and marital status. The reasons for the disparity were unclear. The authors of the report speculated, however, that black people may perceive mental health services as "untherapeutic." They thus delay seeking help and thereby increase the chances that they will be hospitalized involuntarily.

      A major problem in treating psychotic patients was that up to 80% of them failed to take their medication as directed. Given the efficacy of modern antipsychotic drugs and the potentially serious consequences of relapse, ensuring compliance became a major goal of mental health researchers. One group in London showed that an approach known as "compliance therapy" could significantly improve patients' reliability in taking their drugs. It also produced long-lasting results.

      The therapy aimed to help people to change their behaviour by means of interviews intended to provide motivation but to avoid the confrontation and stalemate that often impair the relationship between patient and psychiatrist. In the London experiment, those receiving the new therapy were five times more likely to attain an acceptable level of compliance than patients simply given their medication and instructed to take it regularly but provided with no further encouragement or support.

      Researchers in Valencia, Spain, reported progress in helping patients with severe depression resistant to all drugs normally used to treat this condition. The new technique involved placing an electrical coil on the patient's scalp and creating a rapidly changing magnetic field, which reached the brain structures beneath, specifically a region known to be linked with depression. When 17 patients with intractable depression were treated in this way, 11 experienced a pronounced improvement that lasted for about two weeks. With further refinement of the therapy, more permanent results should be possible.

      An English trial of estrogen therapy for severe postnatal depression reported encouraging results. The subjects were 61 women who within three months of childbirth had developed major depression, which had then persisted for up to 18 months. During three months of treatment with estrogen delivered by means of a skin patch, the women found that their depression waned rapidly. Although those receiving a placebo also felt slightly better, improvement in the estrogen-treated women was much more dramatic. (BERNARD DIXON)

      This article updates mental disorder.

      Ramifications of the fatal cattle disease bovine spongiform encephalopathy (BSE), also known as "mad cow" disease, dominated the veterinary news in much of the Western world in 1996. The disease, found mainly in the U.K., was attributed to the practice of feeding dairy cows manufactured feeds containing protein material from sheep infected with scrapie, a similar disease. It takes some years for signs of BSE to appear in infected animals, the main manifestation being erratic behaviour and increasing difficulty in moving.

      First identified in England in 1986, BSE had been the subject of an ongoing eradication process based on the slaughter of affected animals. This process had been proceeding more or less according to plan; the numbers of new cases of BSE had declined sharply from a peak in 1993, and it was predicted that the disease would be eliminated from herds in the U.K. soon after the year 2000.

      In 1996, however, scientists announced a possible link between consumption of beef from BSE-infected cows and several cases of a new form of Creutzfeldt-Jakob disease, a fatal neurodegenerative disorder. Further, contrary to earlier predictions, preliminary results of long-term studies in the U.K. suggested that BSE might be transmissible from cow to calf. As a result, the European Commission prohibited the U.K. from exporting cattle, beef, and beef products, and beef sales dropped sharply throughout Europe.

      The new findings prompted demands from the Commission for more urgent measures to eradicate the disease before the export ban could be lifted. These included the establishment of rigid precautions in slaughterhouses, as well as the destruction of hundreds of thousands of cattle born before 1993. The proposals for mass slaughter, which meant that many healthy cattle would have to be killed, caused an outcry from veterinarians, farmers, and animal welfare activists in the U.K. Many argued that the program already in place would have eradicated the disease just as quickly as the new plan.

      A report from the World Health Organization (WHO) on the progress of the European campaign to control—and, it was hoped, eradicate—rabies by laying vaccine-impregnated baits for foxes, the main carrier of the disease, found that rabies prevalence had been reduced to 20% of its former level. The success rates in the 14 participating countries differed considerably, however, and the cost—$83 million in total to date—was causing support to wane in some areas. The authors of the WHO report called for a review of the campaign to identify problems responsible for the variable success rate and to draw up guidelines for the future.

      The physical attributes of different breeds of dog are well defined, but the temperamental characteristics, although of equal importance to a potential owner, are much less so. J.W.S. Bradshaw and colleagues at the University of Southampton, Eng., surveyed veterinarians and animal-care professionals to establish an objective assessment of behavioural traits such as excitability, watchdog behaviour, and aggression toward other dogs in 50 popular breeds. They also asked whether males or females were more likely to exhibit a particular behaviour. The results broadly confirmed existing anecdotal opinion. They showed that females were, in general, easier to train, more demanding of affection, and more mature than males. The most aggressive breeds were rottweilers, German shepherds, Doberman pinschers, and bull terriers; the least aggressive included the spaniels, setters, and sheepdogs. (EDWARD BODEN)

      See also Molecular Biology (Life Sciences ).

      This article updates diagnosis; disease; infection (infectious disease); medicine.

▪ 1996


Medical Developments.
      Celebrities attracted international attention to a variety of medical causes in 1995. The announcement in late 1994 that former U.S. president Ronald Reagan was suffering from Alzheimer's disease led to the establishment of a new institute to conduct research into this brain disorder. Baseball legend Mickey Mantle's (see OBITUARIES (Mantle, Mickey )) liver transplant and subsequent death promoted public awareness of the acute need for donor organs and the ethical issues involved in deciding who is to receive them. Superman star Christopher Reeve's paralysis following a fall from a horse publicized the devastating consequences of spinal cord injuries. The murder trial of former football great O.J. Simpson focused attention on the problem of domestic violence.

      A deadly tickborne illness known as human granulocytic ehrlichiosis was reported in the United States, an outbreak of the killer Ebola virus surfaced in Zaire, and health officials from Central and South America launched an emergency plan to combat a major epidemic of dengue hemorrhagic fever, which is spread by the Aedes aegypti mosquito.

      Chronic diseases continued to take the greatest toll in the industrialized world, however. A mid-decade report from the U.S. Department of Health and Human Services found that Americans were making progress in some respects (living longer, smoking less, and cutting deaths from heart disease, stroke, and alcohol-related automobile crashes) but that setbacks had occurred in efforts to reduce obesity and in the prevention of violence, teen pregnancy, and deaths from pneumonia and influenza.

      The Human Genome Project, an international effort to identify and analyze the 100,000 or so genes that make up the entire human genetic complement, was progressing faster than expected. Laboratories in the U.S., France, and Britain reported that detailed mapping efforts already had determined the approximate location of about 75% of the human genes, and more than 50% had been sequenced (i.e., broken down into their constituent parts). Experts predicted that 99% of the genome may be sequenced by the year 2002. The first-ever sequencing of the full genome of a free-living organism, the infectious bacterium Haemophilus influenzae, was reported by J. Craig Venter (see BIOGRAPHIES (Venter, J. Craig )) and co-workers.

      Efforts to isolate specific disease-related genes also raced ahead. Researchers at the University of Texas Health Science Center at San Antonio reported that the BRCA1 gene, isolated in 1994 in women with a family history of breast cancer, also plays a role in the more common nonfamilial form of the disease. Another study found that a significant proportion of Ashkenazi, or Eastern European, Jews carry a particular mutation of BRCA1 that puts them at a much greater than average risk of breast and ovarian cancer. British scientists announced in December the discovery of a second gene linked to breast cancer, BRCA2. Still another piece of the breast cancer puzzle may have been supplied by the discovery of the gene defect responsible for ataxia telangiectasia (AT), a progressive, fatal neurological disorder. AT first becomes apparent as an unsteady gait in toddlers. Affected individuals, who have two copies of the mutated gene, usually die in their teens or 20s. Carriers—those who inherit only one copy of the mutated gene—have three to five times the normal risk of cancer, and women who carry the mutated gene may have as much as six times the normal risk of breast cancer. About 1% of the U.S. population—2.5 million people—may be carriers.

      Back-to-back reports identified two genes responsible for early-onset forms of Alzheimer's disease, which tend to run in families. A University of Toronto team announced in June that a gene on chromosome 14 appears to be responsible for as many as 80% of familial cases. In August investigators from Seattle, Wash., and Boston simultaneously reported that a similar gene on chromosome 1 may account for most other such cases. Scientists hoped these findings would speed the understanding of all forms of Alzheimer's disease.

      In New York City, Rockefeller University investigators, who cloned an obesity gene in 1994, reported in July 1995 that the protein product of the gene dramatically reduced body weight in mice after only two weeks of treatment. Additional research published in October suggested that the protein, dubbed leptin (from the Greek root leptos, "thin"), plays a role in regulating fat storage in the body.

      The first clear evidence that a gene plays a role in non-insulin-dependent diabetes mellitus (NIDDM), a disorder that usually develops in later life, was announced by researchers in France. Scientists in Sweden, France, and the U.S. reported in August that they had pinpointed another gene that was associated with both obesity and earlier-than-usual onset of NIDDM in some populations.

      Dean Hamer and his colleagues at the National Institutes of Health (NIH) confirmed and extended their 1993 work suggesting that a particular region of the X chromosome influences the development of homosexuality in males. Other "finds" included the gene believed responsible for Batten disease, the most common neurodegenerative disorder afflicting children; a mutation that increases susceptibility to venous thrombosis (blood clots in the veins); and two genes that cause the heart disorder known as long QT syndrome.

      Pioneering gene therapy protocols were evaluated and found to have produced mixed results. Treatment of a rare condition called adenosine deaminase deficiency was beneficial, while no therapeutic improvements were seen in patients with cystic fibrosis or Duchenne muscular dystrophy.

Cardiovascular Disease.
      Although heart transplantation is an accepted procedure, its success is compromised in some recipients by the development of high blood cholesterol levels. Elevated cholesterol, in turn, may cause fatty deposits, blocking the coronary arteries and producing the symptoms that necessitated the operation in the first place. Researchers at the University of California, Los Angeles, School of Medicine and Brigham and Women's Hospital, Boston, showed that the cholesterol-lowering drug pravastatin markedly reduces the risk of restenosis (i.e., renarrowing of the arteries) after heart transplantation. Patients given pravastatin had much lower cholesterol levels a year after transplantation than those not receiving the drug. They were also much less likely to reject their new hearts, and their survival rate was significantly higher.

      Several studies raised concerns about the safety of calcium channel blocking drugs used in treating millions of patients in the U.S. and elsewhere with hypertension (high blood pressure) and certain heart disorders. The National Heart, Lung, and Blood Institute issued a warning in September that one of these drugs, short-acting nifedipine, should be used with great caution, if at all, but declared that more research was needed on other calcium channel blockers.

      Evidence of the role of diet in cardiovascular disease continued to accumulate. A University of Washington study showed that eating as little as one serving per week of "fatty" fish, such as salmon, tuna, or mackerel, can reduce the risk of cardiac arrest. These kinds of fish are rich in omega-3 fatty acids. Another report from the same institution concluded that folic acid, a B vitamin already known to play a part in preventing birth defects, also helps prevent coronary heart disease. Paralleling an earlier finding in women, a report by investigators at Harvard Medical School demonstrated that men who eat a diet high in fruits and vegetables have a significantly reduced risk of stroke compared with men who consume less of these antioxidant-rich foods.

Cancer.
      A report issued in February by the National Cancer Institute found that the rate of new cancer cases in the U.S. had risen nearly 19% in men and 12% in women from the mid-1970s to the early '90s, largely because of more widespread early detection of prostate and breast cancers and increased incidence of smoking-related lung cancers. The rates of several less common cancers, such as non-Hodgkin's lymphoma and skin, kidney, testicular, and brain cancers, also had increased.

      The form of leukemia known as adult T-cell leukemia-lymphoma, which is associated with a virus similar to the one that causes AIDS, is one of the most difficult cancers to treat. In 1995, however, studies in several hospitals in both France and the U.S. showed that alpha interferon, combined with zidovudine (which is also used to combat AIDS), was effective even in patients in whom conventional therapies had failed.

Infectious Diseases.
      The incidence of tuberculosis (TB) increased in several countries, especially among economically disadvantaged groups. Research in England and Wales established that TB cases had risen by 35% in the poorest tenth of the population over four years and by 13% in the next two-tenths; there was no change in incidence among the remaining 70% of the population. Investigators concluded that socioeconomic factors (such as crowded living conditions) were the major reason for the increase, the immigration of infected persons making only a minor contribution.

      Physicians in The Netherlands expressed concern that TB was spreading more rapidly than expected from high-risk groups to the general population. The number of cases reported in Amsterdam in 1995 rose by 37% over the previous year's total to reach the highest figure since 1966. Although there was a 20% increase in TB incidence among immigrants from countries with high TB rates, new cases rose by 74% among people born in The Netherlands.

      A study in New York City, a locale hard hit by the recent resurgence of TB, suggested that in that city, at least, the tide may have been turned; reported cases had declined by 21% over a two-year period. Reasons for the change included measures to reduce the spread of infection in institutions such as jails and to ensure that patients complete the prolonged (up to one year) course of drug treatment. Failure to complete antibiotic therapy was a factor in the continued spread of the disease, as well as in the rise of drug-resistant strains of the tubercle bacillus.

      Strains of the bacillus insensitive to once-effective antibiotics such as streptomycin posed ongoing problems, however. Especially alarming was the emergence in New York City of organisms resistant to fluoroquinolones—drugs hitherto effective against tubercle bacilli that had become resistant to other agents.

      The emergence of drug-resistant forms of a bacterium that causes pneumonia, Streptococcus pneumoniae, aroused particular concern in the U.S. A survey in metropolitan Atlanta, Ga., showed that a quarter of the strains isolated from both children and adults suffering from invasive pneumonia were resistant to penicillin, formerly the first-choice antibiotic for this disease. This finding prompted calls for more widespread use of the vaccine against S. pneumoniae.

      Studies published during the year confirmed that combination therapy is more effective than monotherapy (i.e., use of a single drug) in combating HIV. Scientists at Wellcome Research Laboratories in Kent, England, found that when the drugs AZT (zidovudine) and 3TC (lamivudine) were administered together, they were far more effective in reducing the level of circulating virus particles and protecting vulnerable immune cells than either drug used singly. Patients were also less likely to develop drug resistance. In November 3TC was approved for sale in the U.S. under the trade name Epivir.

      A new class of anti-HIV drugs, called protease inhibitors, was showing promise in clinical trials. These agents attack the virus at a different stage in its life cycle than drugs like AZT. In a finding that had implications for both AIDS vaccine and drug therapy research, researchers at the Macfarlane Burnet Centre for Medical Research in Victoria, Australia, reported in Science in November that they had found a genetically weakened strain of HIV in a small cluster of patients who remained healthy despite having been infected for more than a decade.

Health Behaviours.
      Male former smokers gain about 4.5 kg (10 lb) and females 5 kg (11 lb) in the decade after they quit, but according to the U.S. Centers for Disease Control and Prevention, the decline in smoking in recent years accounted for less than one-fourth of the overall weight gain in the U.S. population in the 1980s. During this period the proportion of Americans who were overweight rose nearly 10% in men and 8% in women.

      U.S. teenagers were engaging in unhealthy behaviours in greater numbers than before and at ever-younger ages. Data published in July based on a 1992 government survey of more than 10,000 youths aged 12 to 21 showed that more than one-fourth were current smokers, one-fourth said they had indulged in "binge drinking" (five or more drinks in a row), one in 10 had smoked marijuana, and one in seven had carried a weapon in the previous month. Six out of 10 never-married youths had engaged in sexual intercourse. In August the administration of Pres. Bill Clinton launched an unprecedented attack on teen smoking, proposing curbs on advertising and vending machine sales and mandating new antismoking education campaigns. Tobacco companies responded by taking the government to court.

Women and Infants.
      An international consensus emerged as to the most effective way of dealing with eclampsia—the occurrence of convulsions (not attributable to a condition such as epilepsy) in women who develop high blood pressure during pregnancy. In what the British Medical Journal described as "the most important obstetric trial of the 20th century," researchers at 23 centres in eight countries assessed the different therapies currently in use worldwide and concluded that magnesium sulfate (rather than the formerly widely used phenytoin or diazepam) should be the treatment of choice in the future.

      In the wake of complaints that the medical problems of women had received short shrift in the past, basic and clinical research in the field of women's health continued to grow. A Harvard Medical School study of more than 115,000 women found that even being mildly to moderately overweight is hazardous to health. In this study a gain of 6.8-9.1 kg (15-20 lb) after age 18 was associated with an increased risk of heart attack in later life. Even being of "average" weight increased a woman's risk of dying prematurely. As a result of these and other data, government agencies were revising—downward—the weight guidelines for adults.

      A three-year NIH study of healthy women aged 45 to 64 found that taking any one of four hormone regimens significantly increased blood levels of high-density lipoprotein (HDL), the "good" cholesterol, and decreased low-density lipoprotein (LDL), the harmful form. HDL increases had been shown to reduce the risk of coronary heart disease, the number one killer of men and women alike in most Western countries. Women who took estrogen alone (as opposed to a combination of estrogen and progestin) had the greatest heart benefits but were also at increased risk of uterine cancer. Thus, women who still had a uterus were advised to opt for combination therapy.

      Studies evaluating the breast cancer risk of hormone replacement therapy came to conflicting conclusions. Data from the Nurses' Health Study, a long-term epidemiological investigation of more than 100,000 female nurses, found a slightly increased rate of breast cancer among women who used hormones for five or more years after menopause. A smaller study published almost simultaneously found no link between hormone use and breast cancer.

      A survey commissioned by the Henry J. Kaiser Family Foundation found that the number of U.S. doctors, particularly younger ones, willing to do surgical abortions was declining. Overall only about one-third of practicing obstetrician-gynecologists said they currently performed such procedures. These findings gave added impetus to the search for nonsurgical approaches to ending early-stage pregnancies. In September the New York City-based Population Council completed the clinical part of a U.S. study that could clear the way for government approval of mifepristone, or RU 486, an abortifacient drug already used extensively in Europe.

      Calling it a "silent violent epidemic," the American Medical Association (AMA) issued new guidelines to help physicians become more involved in preventing and treating sexual assault. The AMA said that about 6 out of 10 female victims were under age 18, and three-quarters of sexual assaults were committed by someone known to the victim, such as a friend, acquaintance, partner, or family member. Male victims represented only about 5% of reported sexual assaults.

Other Developments.
      A clue to understanding and treating chronic fatigue syndrome (CFS), a puzzling condition most common in young women but also found in men and women of all ages and occasionally reported in localized outbreaks, came from two small studies at Johns Hopkins Hospital in Baltimore, Md. Doctors identified an abnormality in blood pressure regulation, known as neurally mediated hypotension, that may increase an individual's vulnerability to CFS. Preliminary results suggested that drugs to treat the abnormality and increased salt in the diet could help reduce CFS symptoms. A larger government-funded study was planned for 1996.


      In additional developments worthy of note:

      Investigators at Boston University School of Medicine found that excessive vitamin A intake—more than 10,000 international units per day (the amount found in two to three multivitamin pills)—early in pregnancy increases the risk of birth defects.

      Epidemiologists comparing 200 infants who had died of sudden infant death syndrome, or SIDS, with 200 healthy controls found that exposure to secondhand smoke was strongly associated with sudden unexplained death in otherwise healthy babies.

      A study from the University of Kentucky suggested that soy protein can lower elevated blood cholesterol levels, especially levels of LDL.

      An NIH trial demonstrated that daily doses of hydroxyurea, a drug used for some years to treat certain cancers, significantly reduced the number of painful episodes in patients with sickle-cell disease. Those taking the drug also required fewer hospitalizations and fewer transfusions than their untreated counterparts.

      One of the largest studies ever to evaluate air quality in the U.S. concluded that the risk of death was 15% higher in those cities with the dirtiest air. The higher death rates were attributed to the respiratory effects of microscopic particles in automobile exhaust and industrial emissions.

      A team led by scientists at Yale University School of Medicine confirmed what many had long suspected—that men and women think differently. The Yale investigators used functional magnetic resonance imaging to compare the brain function of men and women while reading; they found that male and female subjects used different parts of their brains while performing the task.

      A report from Denmark indicated that drinking wine—but not beer or liquor—reduces the incidence of deaths from all causes. The beneficial effects were particularly evident with respect to mortality from cardiovascular disease.

      Scientists at Memorial Sloan-Kettering Cancer Center in New York City announced that they had slowed the growth of prostate cancer in laboratory mice by cutting the amount of fat in the animals' diets. They reduced the percentage of fat the mice consumed by nearly half, to 21%. (The average American diet is about 36% fat.)

      The first vaccine to prevent chicken pox was licensed for use in the U.S. The Advisory Committee on Immunization Practices of the U.S. Public Health Service recommended that all children be immunized between 12 and 18 months of age.

      This updates the articles diagnosis; disease; infection (infectious disease); medicine.

      Depression, suicide, suicidal behaviours, and other psychosocial disorders were all increasing rapidly among young people throughout Europe and North America, according to a major international survey conducted in 1995. The study group, chaired by Sir Michael Rutter of the Institute of Psychiatry at the University of London, could find no clear explanation for this growing problem, which was accompanied by similar trends in alcohol and drug dependence. Virtually the only area of mental health that did not show unambiguously worsening figures among teenagers was that of eating disorders. The survey also indicated that the incidence of suicide, substance abuse, and crime was particularly high among males, whereas depression, eating disorders, and suicidal behaviours were especially prevalent among females; however, the male and female rates for depression, substance abuse, suicidal behaviours, and crime were beginning to converge.

      From a global perspective, the outlook appeared no more optimistic. In a report issued in May at United Nations headquarters in New York City, a team of health authorities from 30 countries warned that increasing rates of mental illness in less developed countries threatened the social stability of the Third World. The group cited not only neuropsychiatric disorders such as epilepsy and schizophrenia but also behavioral problems such as substance abuse and violence. It noted that war and political upheaval were responsible for an increased risk of depression, anxiety disorders, and other forms of mental distress among the world's more than 40 million refugees and displaced persons.

      Concern about rising suicide rates among men under 35 in Europe prompted researchers in Helsinki, Finland, to assess the incidence of mental disorders in such individuals. The results showed that significantly more of these men had suffered from a psychotic illness, compared with those aged 35-59 who had committed suicide. The latter had higher rates of alcohol dependence and depression. The prevalence of psychotic disorders in the under-35 age group was much higher (25%) than in previous studies in similar groups in Canada (9%) and in Sweden and the U.S. (17%). However, the prevalence of personality disorders (43%) was about the same as in earlier surveys conducted elsewhere.

      Researchers in Edinburgh reported a disturbing trend in the rate of suicide during the first 28 days after discharge from psychiatric hospitals in Scotland during the years 1968-92. They found that although the incidence of suicide had declined by 40% among discharged male patients, the rate among female patients had almost trebled. The investigators pointed out that this development had occurred during a period when mental health services had changed from largely institutional to predominantly community-based programs, the number of psychiatric beds for adults having declined by 60%.

      A strong association between suicide and parasuicide (an act of self-injury not motivated by a genuine desire to die) emerged from work carried out in Bristol, England. Despite the difference in motivation between the two types of acts, socioeconomic deprivation emerged as a common element.

      A report by the Royal College of Psychiatrists and Royal College of Physicians of London focused on the importance of paying attention to the psychological needs and difficulties of medical patients. People with appreciable physical illness have at least twice the rate of psychiatric disorder of the population at large, yet many hospitals fail to provide appropriate services to assist with these problems, which include depression, mood disorders, and cognitive impairment. In addition to citing direct benefits to the patient, the report included evidence from the U.S. of economic benefits—for example, orthopedic patients in the U.S. who received psychiatric counseling had shorter hospital stays than those not offered such assistance. The report advocated integrated physical and psychiatric care for all patients with significant physical illness.

      Research published during the year contributed to the understanding of auditory verbal hallucinations ("hearing voices") in patients with schizophrenia. The investigators, psychiatrists and neurologists in New York City and London, used brain scanning to study patients with schizophrenia who complained of hearing voices. They also studied schizophrenics who did not hear voices, as well as a group of normal, healthy individuals (controls). The scans were designed to reveal alterations in blood flow as various parts of the brain became active. The procedure showed that there were no differences in blood flow between the hallucinators and the controls when they were asked to "think in sentences." There were differences, however, when the subjects were asked to imagine sentences being spoken in another person's voice—a task that required them to both generate and monitor so-called inner speech. In the latter case one brain region in the hallucinators functioned normally, but abnormally low responses occurred in two other regions, which were activated in both the controls and the nonhallucinating schizophrenics. This finding strongly suggested that a predisposition to "hearing voices" is associated with a failure to activate areas of the brain that play a role in monitoring inner speech. Those who are affected may misperceive such verbal thoughts as coming from external sources, or they may simply be unaware of having them. (BERNARD DIXON)

      This updates the article mental disorder.

      Japan was host to the 1995 World Veterinary Congress in September. The event, which was opened in Yokohama by Emperor Akihito, attracted representatives from 82 countries. The emperor noted that veterinary scientists, with their deep understanding of and rich experience with animals, had provided "many suggestions regarding the optimal relationship between human beings and animals."

      Speakers included Jean Blancou of the International Office of Epizootics, who reviewed the often devastating consequences of past disease outbreaks associated with the movement of animals between countries. As international trade in animals and animal products was likely to increase as a result of the newly established World Trade Organization, Blancou observed, a strengthening of veterinary surveillance arrangements and increased research on animal vaccines were called for.

      At the World Small Animal Veterinary Association Congress, which was held concurrently, the association's president, Peter Bedford of London, announced that the group's Eastern Europe continuing education program, which aimed to update veterinarians in former Eastern bloc countries, would be extended to help less developed nations elsewhere.

      Bovine viral diarrhea is a disorder that affects cattle worldwide and has serious adverse effects on health and productivity. The virus is passed from dam to fetus in the womb, and the calf is born with the infection. Calves often show no signs of disease until they acquire a form of the virus that rapidly causes mucosal disease and death. In the absence of any effective treatment, vaccination of female cattle before they are bred has been recognized as the route to control. Live vaccines have been developed and used in some countries, including the U.S., but have not eradicated the problem. In 1995 a new inactivated vaccine was shown to protect heifers exposed to the virus, and calves subsequently born to them, unlike control animals, were free from infection.

      The production of identical calves potentially would be valuable to the livestock industry by increasing the number of offspring from high-quality parents and to scientific research by providing genetically identical animals for comparative studies. Embryo-transfer and cell-division techniques have been used to this end, but the maximum number of calves produced by these methods was three. In 1995, however, W.H. Johnson and colleagues at the University of Guelph, Ont., succeeded in producing four identical calves from a single embryo. The embryo was divided at the four-cell stage and transferred to two recipients, which resulted in the births of two sets of identical twins—four genetically identical animals. (EDWARD BODEN)

      See also Molecular Biology (Life Sciences).

▪ 1995


Medical Developments
      In 1994 scientists made major strides in understanding the genetic underpinnings of a number of conditions, including inherited forms of cancer, the skin disease psoriasis, dyslexia (a learning disorder), and even obesity. At the same time, public health authorities issued new warnings about the dangers of emerging and resurgent infectious diseases. Reversing a steady decline of nearly 40 years, tuberculosis deaths in Eastern Europe were again on the rise. An epidemic of pneumonic plague erupted in India, and cholera broke out among refugees fleeing the civil war in Rwanda. At an international meeting in Yokohama, Japan, AIDS researchers acknowledged that HIV was proving stubbornly resistant to their efforts.

      Scientific reports published during the year challenged the conventional wisdom on several fronts. Two large studies questioned the value of vitamin supplements in preventing cancer. Researchers at Harvard Medical School suggested that, in the U.S. at least, popular procedures for treating coronary artery disease were being greatly overused. And an ongoing survey of nutrition and eating habits in the U.S. found that despite the health and fitness craze, more Americans were obese than ever before.

      The keenly contested race to identify genes associated with breast cancer susceptibility culminated in the isolation of one such gene, BRCA1, on chromosome 17, followed by the identification of another, BRCA2, located within a particular region of chromosome 13. Between them, mutations in these two genes may be responsible for most hereditary forms of the disease (which, in turn, account for 5-10% of all breast cancer cases).

      The cloning of BRCA1, accomplished by researchers at the University of Utah Medical Center, Salt Lake City, and colleagues at other U.S. and Canadian institutions, was potentially highly significant for women with a strong family history of breast cancer. More than half of those who carried mutated forms of the gene would be diagnosed with breast cancer by age 50, and more than 85% would develop the disease by age 70.

      Another gene race ended in a tie in March as two teams reported that they had independently isolated a second gene involved in a common form of colon cancer, hereditary nonpolyposis colorectal cancer (HNPCC). In December 1993 many of these same researchers had announced isolation of the first such colon cancer gene. Both genes were known to be involved in the repair of DNA. Together, defects in the two were thought to account for most cases of HNPCC. About one in 200 people carried an inherited mutation for this form of colon cancer, and the defective genes were also involved in uterine and ovarian cancers.

      The pace of research in cancer genetics raised the prospect of widespread testing to identify those who were susceptible to inherited forms of breast and colon cancer. In March a U.S. National Institutes of Health (NIH) advisory council warned that it was premature to offer DNA testing or screening for cancer predisposition outside of carefully controlled research projects.

      Investigators in England, Wales, and The Netherlands succeeded in isolating the gene responsible for autosomal dominant polycystic kidney disease (ADPKD), one of the most common disorders attributed to a single abnormal gene. ADPKD causes progressive damage as fluid-filled cysts grow in the kidneys, commonly leading to total kidney failure by about the age of 60. About 10% of kidney transplant recipients in Europe and the U.S. suffered from ADPKD. The breakthrough would facilitate both understanding of the disease and earlier diagnosis, allowing complications such as hypertension (high blood pressure) and urinary tract infection to be treated more quickly.

      The gene defect responsible for achondroplasia, the most common form of inherited dwarfism in most parts of the world, was identified by researchers at the University of California at Irvine. The gene, located on chromosome 4, codes for a protein that binds to growth factors. A tiny change in the amino acids that constitute the protein results in the characteristic skeletal deformations.

      In other notable developments, an Australian team identified a single gene that has a significant influence on bone density and, by extension, risk of osteoporosis. Scientists studying families with a history of dyslexia found a characteristic defect within a particular region on chromosome 6, confirming the view that this learning disorder may have a biological basis. And investigators at the Howard Hughes Medical Institute, Rockefeller University, New York City, announced that they had cloned a gene that apparently regulates the size of the body's fat stores. In mice a mutation in this gene causes a severe hereditary form of obesity.

Cardiovascular Disease.
      Experts continued to debate the best treatment options for heart disease sufferers. Two separate clinical trials by U.S. and German researchers concluded that angioplasty was a reasonable alternative to coronary artery bypass surgery in treating some symptomatic heart patients with multiple blocked arteries. Their reports, published simultaneously in the New England Journal of Medicine, found that the two procedures had similar overall risks of complications and death in such patients. Those who underwent bypass surgery were initially hospitalized much longer and were more likely to have procedure-related heart attacks. On the other hand, those who underwent angioplasty, a simpler procedure in which a tiny balloon is inflated within a blocked artery, were far more likely to require repeat procedures within the next one to three years and to require medication for angina (chest pain). Heart disease specialists emphasized that treatment choices had to be made on an individual basis.

      Health policy analysts at Harvard Medical School opined, however, that these treatments were being greatly overused. On the basis of a review of Medicare data on 200,000 elderly Americans hospitalized with heart attacks, the Harvard group concluded that invasive heart procedures, such as cardiac catheterization, angioplasty, and bypass surgery, could be reduced by more than 25% with no effect on death rates. They suggested that redirecting resources toward better emergency care of heart attack victims would do more to reduce mortality.

      A meta-analysis of numerous trials of antiplatelet therapy (i.e., treatment to inhibit blood clotting) confirmed that regular consumption of aspirin (75-325 mg per day) provided worthwhile protection against a subsequent heart attack or stroke and decreased the risk of death in individuals with circulatory and related conditions. There was, however, no clear evidence for recommending routine aspirin use among apparently healthy people with no history of cardiovascular problems.

      Paralleling previous findings in the U.S., evidence from the U.K. established that men received better treatment than women for acute myocardial infarction (heart attack). One study in Nottingham showed that the survival chances of female patients both in the hospital and after discharge were poorer than those of males, in part because the women had longer delays in reaching the hospital, were less likely to be admitted to a coronary care unit, and were less likely to be given drugs to inhibit blood clotting. Research in London confirmed that female heart attack victims had an inferior prognosis over the first 30 days as a result of receiving less vigorous treatment than their male counterparts.

      A formerly controversial surgical procedure received an endorsement in October when the directors of a multicentre U.S. and Canadian study reported their finding that the operation, called carotid endarterectomy, reduced by about half the projected risk of stroke in patients who had narrowed carotid arteries but no symptoms of incipient stroke. The carotid, a major artery in the neck, carries blood to the brain. Fatty deposits inside the artery can decrease blood flow and eventually cause a stroke. The investigators were puzzled by one result of the investigation: the risk reduction in women was considerably smaller than that in men.

      A report presented in November at the annual meeting of the American Heart Association could have far-reaching implications for patients with coronary heart disease. Scandinavian scientists found that a cholesterol-lowering drug reduced the risk of death in such patients by 42%—the first "proof" that these medications have an impact on survival.

Cancer.
      An independent panel of experts assessed the U.S. government's war on cancer and found that, overall, cancer incidence had increased 18% and the death rate had risen by 7% since the effort was launched in 1971. While there had been progress in basic research and in treatments that keep patients alive longer, the panel concluded that more needed to be done to improve quality of life and access to care. The report noted that government policies subsidizing tobacco—the leading preventable cause of disability and death in the U.S. and many other countries—were undermining cancer-prevention efforts.

      Throughout 1994 the U.S. Congress and the Food and Drug Administration (FDA) engaged in the first serious national inquiry over whether to regulate the nicotine in tobacco products as a drug. An FDA advisory committee concluded that nicotine in tobacco is indeed addictive. Congressional hearings were held, but the issue of tobacco regulation remained unresolved. (See Sidebar (TOBACCO: Last puff postponed ).)

      Two studies published in the New England Journal of Medicine cast doubt on the theory that antioxidant vitamin supplements can prevent cancer. In April a major trial of beta-carotene and vitamin E supplements, administered for five to eight years to more than 29,000 male smokers in Finland, found no significant protective effects against lung cancer. In July researchers at Dartmouth Medical School, Hanover, N.H., and five other U.S. medical centres said that administering beta-carotene, vitamin C, or vitamin E for four years did not reduce the development of new colon cancers in patients who had had a polyp removed before entering the study. Both studies were apparently at odds with the vast body of epidemiological evidence showing that people whose diets are rich in fruits and vegetables have reduced cancer risks. It was not clear whether the vitamins in these foods or some other protective substances were responsible for their anticancer properties. Longer-term studies already under way were expected to shed light on the question.

Infectious Diseases.
      In 1994 leading authorities warned the public and the medical community that the international spread of drug-resistant organisms threatened to become a major health crisis by the end of the 20th century. Convening in Prague for the sixth International Congress for Infectious Diseases, U.S. microbiologist Alexander Tomasz and other government and academic experts observed that the world was entering an era in which some common disease-causing bacteria could become resistant to all available drug therapies. Few new antibiotics were being introduced, and an informal survey of large U.S. and Japanese pharmaceutical companies found that about half had reduced or phased out their antibacterial research programs, in part because of an erroneous assumption that bacterial diseases had already been brought under control.

      Sensational headlines about flesh-eating "killer bugs" dominated the newspapers in the U.K. after several reports of serious invasive disease due to a particularly virulent strain of group A streptococcus. British public health authorities were quick to point out that such infections, although extremely grave, were not new and had not increased appreciably in recent years.

      In the U.S. a number of events served as reminders of the persistence of microbial threats to health. Several hundred passengers on two cruise ships had to be evacuated as a result of outbreaks of Legionnaires' disease and shigellosis, and a 15-state salmonella epidemic was reported among customers of a Minnesota dairy. In response to a recent increase in foodborne disease attributed to a virulent strain of Escherichia coli—which can cause a fatal kidney condition—a group of medical, public health, and food industry experts suggested changes in the U.S. meat-inspection process and recommended that some ground beef be irradiated.

      After a little more than a decade of controversy over the significance of the bacterium Helicobacter pylori—the so-called ulcer bug—several independent pieces of evidence confirmed the role of the organism as possibly the most important factor in the development of duodenal ulcers. Following earlier studies showing the effectiveness of a regimen of antibiotics plus acid-suppressing drugs, a clinical trial at the Prince of Wales Hospital, Hong Kong, demonstrated that in most ulcer patients antibiotics alone eradicated the bacterium and healed the ulcers. Longer-term research at the Royal Perth (Australia) Hospital and the University of Virginia confirmed that the reduction in ulcer recurrence following eradication of H. pylori persisted for at least seven years. Further, a survey in Stoke-on-Trent, England, showed that adults from crowded childhood homes were particularly likely to carry antibodies to H. pylori—an indication that the bacterium is transmitted directly from person to person and may be commonly acquired in early life. On the strength of these and other recent studies, an NIH panel issued an official statement endorsing antimicrobial drugs for the treatment of ulcers.

Diet and Nutrition.
      In October the U.S. National Task Force on the Prevention and Treatment of Obesity published a review of nearly 30 years of medical research on "yo-yo" dieting. Contrary to some individual studies, this overview found no convincing evidence that repeated loss and gain of weight carried significant health hazards. The task force concluded that obesity posed a far more serious medical problem than did dieting.

      A panel of experts convened by the NIH concluded in June that a large percentage of Americans were not getting enough calcium. They noted that children and young adults must consume an adequate amount of calcium if they are to reach their peak bone mass. Individuals who fail to achieve their peak are more vulnerable to the effects of bone loss in later life. The panel issued new recommendations for optimal daily calcium consumption. For several age groups the suggested levels were considerably higher than the recommended dietary allowances, or RDAs.

      Clinicians in Cambridge, England, investigated the effect of milk consumption in childhood and early adulthood on the bone density of women aged 44 to 74. Their study showed that the frequent drinking of milk earlier in life had a favourable effect on the bone mass of the hip at the later age. The benefit was independent of factors such as body size, smoking, and hormone replacement therapy, which also influence bone density. Another study examined the purported relationship of coffee consumption and decreased bone density. The researchers, from the University of California at San Diego, found a positive correlation between caffeinated coffee intake and low bone mineral content, but they also determined that the harmful effects of coffee drinking on bone mass could be offset by regular consumption of milk.

      Medical and nutrition professionals around the world continued to examine the health benefits of low-fat, high-fibre diets. One style of eating that was receiving a major share of attention was the diet of the Mediterranean region, where the population had traditionally enjoyed low rates of heart disease and some cancers. In 1994 an international group of experts interested in traditional eating patterns developed the Mediterranean diet pyramid (see Figure) as a model for healthful eating. The Mediterranean pyramid called for a largely plant-based diet. Cheese, yogurt, and olive oil were included with fruits, vegetables, and grains as foods that could be eaten daily, while red meat was to be consumed only a few times a month. Not all nutrition authorities were in favour of the concept. For one thing, the diet of the Mediterranean region derives more than 30% of its calories from fat, and current U.S. dietary recommendations call for limiting fat calories to 30% or less. For another, wine is a regular feature of meals in Mediterranean countries, and many U.S. public health authorities hesitated to advocate a regimen that included alcohol as even an optional element.

      Meanwhile, in France investigators from the Lyon Heart Study demonstrated that a Mediterranean-style diet was effective in reducing the risk of further heart problems in individuals who had already experienced a heart attack. Some 300 patients were encouraged to increase their consumption of grains, fruits, and vegetables and to eat less red meat and more poultry. The butter in their diet was replaced by a spread rich in alpha-linolenic acid, which some experts believed to have cardioprotective effects. During a follow-up, which averaged 27 months, there were three coronary deaths and five nonfatal heart attacks among those on the diet, compared with corresponding figures of 16 and 17 in a similar group that received no dietary advice.

      The health benefits of a vegetarian diet were substantiated by the results of a 12-year survey conducted by nutritionists in London and Oxford, England. Comparing the fates of more than 5,000 British meat eaters with those of some 6,000 who were not meat eaters, the investigators reported a 40% lower rate of death from cancer among the vegetarians. Those who did not eat meat also had a markedly lower rate of atherosclerotic heart disease, though this was at least partly attributable to their much larger proportion of nonsmokers.

Other Developments.
      A major advance was reported in the treatment of Crohn's disease, a chronic inflammatory bowel disorder. Although corticosteroids had proved useful in the past, they sometimes produced potentially serious side effects. A multicentre Canadian study of the synthetic steroid budesonide showed not only that the drug was effective but also that those who received it had no greater incidence of adverse effects than patients who took a placebo.

      A potentially significant finding about the etiology of amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease (and, in the U.K., as motor neurone disease), was reported from Scotland. Researchers in Glasgow, searching for possible signs of an infectious agent, found evidence of viral genetic material in spinal cord tissue from a high proportion of patients who had died of the disease but not in tissue samples from matched controls. The scientists cautioned that association of the virus—an enterovirus (a member of the family that includes the poliovirus)—with the disease did not prove a cause-and-effect relationship. In the meantime, there were cautiously optimistic claims from researchers in Paris that an experimental drug, riluzole, appeared to slow the progression of the inevitably fatal condition and to improve survival in certain patients. Perhaps the most promising development of the year, however, was the announcement by scientists in the U.S. that they had created a strain of mice genetically engineered to contract ALS. The existence of an animal model for the disease was expected to speed the search for effective therapies.

      Evidence of the harmful effects of smoking continued to accumulate during 1994. Research conducted at the University of Melbourne, Australia, greatly strengthened the previously suspected link between smoking by women and an increased risk of osteoporosis. A study directed by investigators from the U.S. National Cancer Institute found that breast cancer patients who smoked had a 25% greater risk of dying from the disease than their nonsmoking counterparts. Scientists studying children and teens with high cholesterol levels found that those whose parents smoked had considerably lower levels of the so-called good cholesterol (believed to help prevent heart attacks) than the children from nonsmoking families. Since all other variables were the same, it seemed likely that exposure to secondhand smoke was responsible for the difference in the youngsters' cholesterol profiles. On the positive side, a multicentre U.S.-Canadian study published in November showed that smokers who already had chronic bronchitis and emphysema could effectively prevent further deterioration in lung function by quitting smoking.

      A study from Sweden contributed to the ongoing disagreement as to whether radon gas, which was known to cause lung cancer in miners, was also responsible for the disease in people exposed to radon at home, albeit at much lower concentrations. Scientists from several Swedish environmental agencies and medical institutions studied 1,360 men and women with lung cancer and measured radon levels in nearly 9,000 buildings in which the individuals had lived in the past. A comparison with control subjects showed that the risk of lung cancer clearly increased in accordance with the level of radon exposure. The Swedish researchers concluded that residential exposure to radon was an important cause of lung cancer in the general population. U.S. and Canadian studies published during the year found just the opposite, however.


      See also Life Sciences: Molecular Biology.

      This updates the articles diagnosis; disease; infection (infectious disease); medicine.

Mental Health
      Paralleling the trend in other areas of medical research, advances were made in the understanding of the genetic basis of mental illness and the effects of abnormal genes on brain function. Researchers in Japan reported evidence that a variant of the gene that encodes one of the receptors for the neurotransmitter dopamine may be a risk factor for some types of schizophrenia. Comparing 156 schizophrenic patients with controls, they found that the frequency of the gene was significantly higher among patients, especially in those whose illness had begun before the age of 25 and those with a family history of the condition.

      Fragile X syndrome—the most common form of mental retardation caused by a single gene defect—was also yielding some of its secrets. In 1991 molecular geneticists had discovered that the mutation responsible for the condition consists of large numbers of repeated sequences of nucleotides (the subunits that constitute DNA). Newer research showed that carriers have "premutations"—smaller numbers of nucleotide repeats—that have the potential to increase as the gene is transmitted to subsequent generations. Besides accounting for the development of the disease, these discoveries facilitated prenatal diagnosis.

      Concurrent with developments of this sort, there was growing interest in the social context of mental illness. The relationship between mental disorders and unemployment was a matter of increasing concern in the U.K. Investigators based in Bristol, England, reported on a detailed analysis of the relationship between the occupancy of psychiatric hospital beds and the numbers of people out of work in different parts of their region. Their findings showed that unemployment was an extremely powerful indicator of the rate of serious mental illness requiring hospital treatment among individuals under 65.

      Psychiatrists at the Clinical Research Centre, Harrow, England, and other U.K. centres published the results of a study that examined the social adjustment in childhood of people who developed psychiatric disorders as adults. The investigators consulted teachers' assessments of the social behaviour of 7- and 11-year-olds who by age 28 had been hospitalized for schizophrenia, affective psychoses (e.g., major depression accompanied by hallucinations), or neurotic illness (e.g., milder forms of depression). The results showed that whereas the individuals in the second category had differed little from normal controls at the younger ages, those later diagnosed as schizophrenic had all been rated at age seven as manifesting more social maladjustment. This was more apparent in boys than in girls. By the age of 11 the pre-neurotic children, especially the girls, also had an increased rating of maladjustment.

      Several studies carried out in different parts of the world provided encouraging evidence of the effectiveness of a new drug for the treatment of schizophrenia. The drug, risperidone, was introduced in the U.K. in 1993 and approved in the U.S. in 1994. Risperidone was reported to help patients who had failed to respond to other antipsychotic drugs and to have a beneficial effect on a wider range of symptoms than some of these existing alternatives. As was often the case with new drug compounds, however, it was considerably more expensive than the older agents.

      Research at the University of California at San Diego clarified earlier claims from several European countries that low blood pressure was sometimes accompanied by an increased prevalence of weeping, fatigue, and psychological dysfunction. Psychiatrists in the U.S. and the U.K. had generally been skeptical about such reports. The new evidence came from a study of 594 male residents, aged 60-89, of Rancho Bernardo, Calif., who were categorized as having low, normal, or high blood pressure. The researchers observed a significant association between relatively low blood pressure and higher scores for both overt depression and symptoms of depression, irrespective of age or weight loss.

      There was also progress in understanding the basis of the mental deterioration that often occurs in elderly people who are not suffering from Alzheimer's disease or other well-defined dementing disorders. One possible explanation was that a decline in cognitive function could be attributed to narrowing of the arteries that supply blood to the brain. Exploring the link between mental status and circulatory disease, researchers at Erasmus University Medical School, Rotterdam, Neth., examined some 5,000 subjects aged 55-94 for clinical signs of atherosclerosis and gave them tests of memory, attention, and other mental skills. The results were compatible with the view that impaired blood flow to the brain accounts for a considerable proportion of cognitive impairment among the elderly. (BERNARD DIXON)

      This updates the article mental disorder.

Veterinary Medicine
      A major international symposium to assess the past and predict the future progress of the veterinary profession was held by the Royal College of Veterinary Surgeons in London. The event was the keynote of the 150th-anniversary celebrations marking the granting of a royal charter to the college by Queen Victoria in 1844. The charter had set the seal on the professional status of veterinarians in the U.K. and, by extension, it affected the development of veterinary practice throughout the English-speaking world. Speakers at the symposium reviewed the contemporary demands placed on the training of veterinarians and discussed issues in the care and welfare of animals, the production of livestock, and the safeguarding of public health. The symposium identified enormous potential benefits arising from biotechnology but also noted that such advances—for example, the use of bovine somatotrophin to increase milk production—raise serious ethical considerations.

      HIV, the virus believed responsible for AIDS, is the best known of the lentiviruses (slow viruses), but others affect cats, horses, sheep, goats, and monkeys. Unlike other members of the group, all of which eventually cause disease in the host animal, bovine immunodeficiency virus (BIV), which affects cattle, could be carried for years without producing clinical signs. In 1994, however, this accepted view was challenged when BIV was discovered in a Cheshire, England, herd that was suffering from a mysterious wasting disease. Confirmation of the virus's role in causing the illness was hampered by the very slow development of the disease—a problem similar to that encountered in the study of bovine spongiform encephalopathy, a neurological disorder that also affects cattle.

      The practice of judging the age of a horse by the appearance of its teeth goes back well over 2,000 years, but there had never been any scientific validation of the method. J.D. Richardson and her colleagues at the Universities of Bristol, England, and London undertook a study to establish whether tooth wear is in fact an accurate measure of age. They examined the teeth of horses of known age and then compared estimated age, as indicated by the teeth, with the actual age. They found that up to the age of five the actual and estimated ages were similar. In older horses, however, the results were much less accurate. The pattern of wear was affected by diet, environment, and breed as well as by age. They concluded that while a horse's teeth could provide a convenient practical guide to its age, the result was more an informed guess than a precise answer.

      Concern over the effects of high humidity on animals competing in the equine events at the 1996 Olympic Games in Atlanta, Ga., led the International Equestrian Federation to study the effects of high temperature and humidity on exercising horses. Work carried out at the Animal Health Trust in England involved treadmill exercises in an environment-controlled building. The tests demonstrated that high humidity, as might be encountered in Atlanta, could cause health problems resulting from increased fatigue. As a result, the rules of the three-day event might need to be changed to protect the horses' welfare.


▪ 1994


Medical Developments
      In 1993 exciting developments in the application of genetics to the diagnosis, understanding, and potential treatment of a number of diseases shared the stage with the worsening epidemics of AIDS and tuberculosis (TB). The year was also marked by growing concern not only about the emergence of previously unrecognized infectious diseases but also about the capacity of familiar—and apparently vanquished—infections to exact further human tolls.

      As scientists around the world observed the 40th anniversary of the elucidation of the molecular structure of DNA, French researchers announced that they had succeeded in constructing the first rough map of all the human chromosomes. Progress continued in the ongoing hunt for genes responsible for particular diseases. Among the disorders whose underlying genetic defects were pinpointed in 1993 were neurofibromatosis type 2, the inherited cancer syndrome known as von Hippel-Lindau disease, one type of diabetes, and a form of amyotrophic lateral sclerosis (ALS; Lou Gehrig's disease).

      The crowning achievement of the year was the discovery on chromosome 4 of the gene for Huntington's disease, a hereditary neurological affliction that leads to uncoordinated limb movements, mental deterioration, and, eventually, death. The search for the gene took 10 years and involved more than 50 researchers in laboratories in the U.S. and Europe. The mutation was an unusual type that so far had been found in only four other diseases: fragile-X syndrome (the most common type of inherited mental retardation), myotonic dystrophy (a kind of muscular dystrophy that affects adults), spinobulbar muscular atrophy (Kennedy's disease), and spinocerebellar ataxia type 1. Its basis is a genetic "mistake" in which a sequence of three nucleotides (the building blocks of DNA) is repeated in a manner some have likened to a stutter. Affected individuals were found to have as many as 100 of these repetitions. People who had a greater number of repetitions seemed to develop the disease earlier and had more severe cases.
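      The "stuttering" repeat described above lends itself to a simple illustration. The following sketch (an editor's illustration; the flanking sequences are hypothetical, though the Huntington's repeat unit is indeed the triplet CAG) counts the longest run of consecutive repeats in a DNA string, which is essentially how repeat length distinguishes an expanded allele from a normal one:

```python
import re

def longest_repeat_run(seq, unit="CAG"):
    """Length, in repeat units, of the longest run of consecutive
    `unit` triplets found anywhere in the DNA string `seq`."""
    runs = re.findall(f"(?:{unit})+", seq)
    return max((len(r) // len(unit) for r in runs), default=0)

# Hypothetical alleles: one in the normal range, one greatly expanded.
normal = "GGC" + "CAG" * 20 + "TTC"
expanded = "GGC" + "CAG" * 85 + "TTC"

print(longest_repeat_run(normal))    # 20
print(longest_repeat_run(expanded))  # 85
```

The longer the run, the earlier and more severe the disease tended to be, as the studies cited above reported.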

      Because of both the unusual nature of the mutation and the lack of knowledge about the function of the normal gene's protein product, dubbed huntingtin by researchers, no treatment was yet in hand. However, it was possible to identify those who would eventually get the disease. This situation could create psychological difficulties for members of affected families. Whereas previously they could only wait for signs of the disease to appear—usually in middle age—now they had the option of seeking early diagnosis through DNA analysis. If the test was positive, they would then have to cope with the news that they faced inevitable, devastating disease later in life.

      Progress also occurred in the understanding of genetic factors in Alzheimer's disease. Allen Roses and colleagues at Duke University Medical Center, Durham, N.C., found that people with one variant of the gene for the cholesterol-carrying protein apolipoprotein E were at increased risk of getting late-onset Alzheimer's. (The late-developing form of the disease accounted for about 80% of U.S. cases.) The protein binds a substance called beta-amyloid, which is known to accumulate in the brains of Alzheimer's patients.

      The pace of gene-therapy trials quickened considerably. Two different teams performed gene therapy in patients with cystic fibrosis (CF), introducing normal versions of the CF gene by aerosol into airway cells. Ronald Crystal of Cornell University Medical Center, New York City, used a genetically modified cold virus to carry the normal genes; James Wilson, at the University of Pennsylvania, used a slightly different method. Wilson also used gene therapy to successfully treat a few patients with familial hypercholesterolemia, an inherited disorder in which the gene for the low-density lipoprotein (LDL) receptor is defective, resulting in failure to remove LDL from the blood and allowing fatty deposits to build up in blood vessels.

      Genetics researchers also published the results of studies that suggested an inherited basis for sexual orientation. Investigators at the National Cancer Institute (NCI), Bethesda, Md., linked homosexuality in men to a region on the X, or female, chromosome, the sex chromosome that males inherit from their mothers. (Women have two X chromosomes; men have one X and one Y.) The researchers first interviewed gay men, finding that they had a higher-than-expected number of homosexual relatives on the maternal side. Focusing on 40 families in which there were two gay brothers, they found that in 33 pairs the two brothers had identical regions at the tip of the X chromosome—a much higher proportion than would be expected. No specific gene was identified as predisposing to homosexuality, however, and the work done thus far would have to be confirmed by others. Moreover, the trait seemed to be paternal in some families. Nonetheless, the study was regarded as the most scientific in the field to date, and the group had already started a study of female homosexuality. Another study of lesbians, based solely on interviews, found that in 71 identical twin pairs, 48% of the sisters were either lesbian or bisexual, compared with only 16% of 37 nonidentical twin pairs and 6% of 35 adoptive sisters.
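      The strength of the 33-of-40 finding can be appreciated with elementary probability: if each pair of brothers had only an even chance of sharing the marker region, a match in 33 or more of 40 pairs would be exceedingly unlikely. A minimal sketch (this binomial tail calculation is the editor's illustration, not the statistic reported by the NCI group):

```python
from math import comb

def binom_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Chance of 33 or more matches among 40 brother pairs if each pair
# shared the X-chromosome marker region with probability 0.5:
print(f"{binom_tail(40, 33):.1e}")  # 2.1e-05
```

So chance alone would produce such concordance roughly twice in 100,000 repetitions of the study, which is why the result attracted attention despite the caveats noted above.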

      In one of the most controversial developments of the year, Robert Stillman, a fertility specialist at George Washington University Medical Center, Washington, D.C., reported in October that he had experimentally cloned human embryos, using techniques already well known in the breeding of livestock and other animals. The report raised the possibility that identical twins could be born years, or even generations, apart. The ethical and legal dilemmas posed by such a capability would not easily be resolved.

Infectious Diseases.
      U.S. microbiologists reported a marked increase in cases of pulmonary TB in New York City caused by organisms resistant to the drugs normally used to treat the disease. There were also reports that the drug-resistant bacteria could reinfect patients while they were being treated (or after they were treated) for TB caused by antibiotic-sensitive strains. This finding contradicted previous beliefs that because of immunity conferred by the initial infection, reinfection occurred only very rarely. (See Sidebar: Drug-resistant Diseases.)

      Cholera loomed large as a global health problem. In mid-1993 the World Health Organization (WHO) reported that the continuing pandemic of the disease, caused by the so-called El Tor strain of Vibrio cholerae (the strain responsible for the pandemic that began in Indonesia in 1961), had claimed more than three million victims and caused tens of thousands of deaths. In August researchers at the International Centre for Diarrhoeal Diseases Research in Bangladesh described epidemics in Bangladesh and India caused by a new strain of V. cholerae, prompting fears of the start of yet another pandemic, the eighth such worldwide outbreak in history.

      The role of an infectious agent, the bacterium Helicobacter pylori, in both duodenal ulcers and gastric cancer became clearer during the year. A study carried out in Austria demonstrated that administration of antibiotics to eradicate this bacterium was followed by a marked reduction in ulcer recurrence. And a major international study strongly implicated H. pylori as a cause of gastric cancer. Researchers in Britain and Italy showed that in five out of six patients with one particular type of stomach cancer, elimination of the bacterium through antibiotic therapy was followed by regression of the tumour.

      That new infectious diseases continue to emerge was dramatically demonstrated by the appearance in the southwestern U.S. of a mysterious, often fatal flulike illness that brought about rapid respiratory failure in young, previously healthy individuals. A number of cases were subsequently reported in other parts of the country, from California to Louisiana. An intensive investigation led to the identification of a rodent-borne hantavirus of a type not previously associated with human illness in the U.S. Authorities speculated that increased rainfall may have caused a population explosion in the deer mouse, which carries the organism. The virus itself was not believed to be "new."

      The AIDS epidemic continued virtually unabated, and WHO estimated that the number of those infected with HIV would reach 30 million-40 million by the year 2000. The WHO data also showed that half of new HIV infections were occurring in people 25 years old or younger. Modes of transmission varied by region. In Africa and parts of Asia, heterosexual sex was the primary factor in the spread of HIV. In the U.S., Europe, and South America, however, transmission via intravenous drug abuse and homosexual contacts still predominated.

      Enormous disappointments occurred during the year when several studies, principally the collaborative British-French Concorde study, showed that use of AZT (zidovudine; Retrovir) early in the course of HIV infection was not as effective in forestalling AIDS as had been believed. A second disappointment came when a widely hailed study showing great promise for a combination therapy—concurrent use of three anti-HIV drugs—was revealed to have been flawed. Because of the initial excitement over the study, a national clinical trial of the treatment had already begun; it was apparently to continue despite the revelation.

      In light of the failure of efforts to find better treatment, more experts stressed prevention. Michael Merson, head of WHO's global AIDS program, urged the spending of an additional $1.5 billion to $2.9 billion a year in less developed countries on prevention programs. "Governments can pay now or pay a much higher price later," he said. The strategies would include continued emphasis on the importance of condom use; a concerted effort to treat other sexually transmitted diseases, which potentiate the risk of contracting HIV; an increase in AIDS education programs; and the operation of needle-exchange programs for people who inject illicit drugs.

Cardiovascular Disease.
      The American Heart Association, in a position statement published in February, confirmed what many already believed and practiced—that taking aspirin can prevent some cardiovascular problems and treat others. Although some people—those with severe hypertension (high blood pressure), for example—should not take aspirin, in general it was recommended for treatment of heart attacks in progress, transient ischemic attacks (short-lived mini-strokes), and the type of unpredictable chest pain called unstable angina. In addition, aspirin can be used to prevent heart attacks, strokes caused by obstructions of blood flow in the brain, and early occlusion of coronary artery bypass grafts. While aspirin use was known to be effective in men, there had been some doubt about its ability to provide cardiovascular benefits in women. One 1993 study showed that aspirin does not dissolve blood clots as rapidly in women as in men, which may account for the gender difference in its protective ability.

      A study in the New England Journal of Medicine indicated that atherectomy devices—which cut out and remove lipid-containing plaques from the insides of coronary arteries—yielded no better outcomes than balloon angioplasty techniques, in which an inflatable device is used to reopen blocked arteries. Another study showed that women who underwent angioplasty had a higher in-hospital death rate from the procedure than men.

      Attention focused on the growing evidence that regular light alcohol consumption affords some degree of protection against coronary heart disease (CHD). Danish researchers demonstrated that moderate drinking is especially protective in men of a certain blood type who are at particularly high risk of developing—and dying from—CHD.

      Most of the recent work on the cardiovascular benefits of regular drinking had indicated that the active agent was ethyl alcohol itself. One puzzle, however, had been the low incidence of atherosclerosis (narrowing of blood vessels caused by fatty deposits) in some regions of France where the typical diet contains considerable quantities of saturated fat. This finding was in marked contrast to the close correlation observed elsewhere between saturated fat intake and CHD. The discovery of this "French paradox" prompted suggestions that something in red wine—the preferred beverage in France—other than its alcohol content may exert a protective effect. Working in collaboration, researchers at the University of California at Davis and the Volcani Center in Israel demonstrated that phenolic compounds, which occur in red wine, could prevent oxidation of LDL. Other antioxidants had been shown in animal experiments to reduce atherosclerosis.

Cancer.
      Several teams were working during the year to isolate a gene—called BRCA1—whose inherited mutation confers a very high risk of developing breast and ovarian cancer. Using molecular markers and studying families in which both diseases were highly prevalent, researchers traced BRCA1 to chromosome 17. Although the gene itself had not yet been isolated, linkage analysis made it possible to identify women in these families who carried defective forms of it.

      In a major breakthrough in cancer genetics, two teams of researchers, one headed by Bert Vogelstein of the Johns Hopkins Oncology Center, Baltimore, Md., and the other directed by Richard D. Kolodner of the Dana-Farber Cancer Institute, Boston, and Richard Fishel of the University of Vermont Medical School, announced in December that they had isolated a gene implicated in an extremely common type of colon cancer, hereditary nonpolyposis colorectal cancer. The gene directs the synthesis of a protein that corrects mistakes in the pairing of nucleotides. It was estimated that one in every 200 people inherits the defective form of the gene; such individuals face a 70 to 90% chance of developing colon cancer. In addition, women with the mutation are at greater risk for uterine and ovarian cancer. Scientists predicted that a test for people from families with a history of colon cancer could be ready as early as the middle of 1994, enabling presymptomatic screening for those at risk. A similar test was developed in 1993 for another hereditary condition that often leads to colon cancer.
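      The figures quoted above imply a substantial population burden, as a back-of-envelope calculation shows (an editor's sketch; the one-in-200 carrier frequency and the 70-90% lifetime risk are the estimates from the text):

```python
carrier_freq = 1 / 200  # estimated frequency of the defective gene
penetrance = (0.70, 0.90)  # estimated lifetime colon cancer risk in carriers

# Fraction of the general population expected to develop colon cancer
# through this one gene alone:
low, high = (carrier_freq * p for p in penetrance)
print(f"{low:.2%} to {high:.2%}")  # 0.35% to 0.45%
```

In other words, roughly 1 person in every 220 to 290 would be expected to develop colon cancer from this single inherited defect, which is why a presymptomatic screening test was considered so significant.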

      Such genetic screening techniques may someday replace the stool tests for fecal occult blood that were currently used to screen for colorectal cancer. The drawbacks of stool tests led one group of physicians during the year to call the method ineffective in detecting colon cancer, although another group concluded that regular testing decreased colon cancer deaths by 33%. Such tests were inexpensive but also inconclusive. A positive stool test would therefore have to be confirmed by further—and much more costly—studies.

      In breast cancer, use of another screening test, mammography, had been controversial for women in their 40s because of doubts that such screening reduced the death rate in this age group, as it did in women over 50. Articles published and meetings held during the year did not resolve the controversy.

      Can colorectal cancer be prevented by regular use of aspirin or other nonsteroidal anti-inflammatory drugs? Several studies published in 1993 suggested that it can, although another discounted this conclusion. Experts said that long-term, randomized studies involving several aspirin doses would be necessary before any specific recommendations could be made.

      Cancer of the prostate gland, the second most common cancer (after lung cancer) in U.S. men, got increased attention in 1993, although major controversies continued about how to diagnose and treat it and whether small, localized prostate cancers should be treated at all. One study showed that radiation therapy was more likely to cure large, inoperable prostate tumours if the men were given hormones for two months before radiation therapy. And a major study from Harvard looked at 300 men (out of a group of more than 47,000) who had been diagnosed with prostate cancer between 1986 and 1990 and found that the intake of animal fat, especially fat from red meat, was directly correlated with the risk of developing advanced disease.

Other Developments.
      A major, 10-year study of nearly 1,500 persons with type I (insulin-dependent) diabetes showed that intensive treatment to control the level of blood glucose worked better than conventional treatment in reducing common complications of the disease, such as eye and kidney problems and nerve damage. The intensive regime involved taking three to five insulin injections per day or using a pump that automatically injected insulin into the bloodstream, testing of blood glucose four to seven times per day, and working closely with a medical and support team. A major drawback, however, was that the intensive regimen was difficult to adhere to and more costly than the conventional approach.

      There was a step forward in the understanding of sudden infant death syndrome (SIDS), also known as crib death or, in Britain, cot death. Although several retrospective studies had indicated that babies who slept in the prone position were at increased risk of SIDS and had suggested that infants be placed on their backs or sides, many pediatricians were reluctant to make this recommendation because of uncertainty as to why the prone position should be hazardous. The new research, carried out in Australia and New Zealand, not only confirmed the higher risk of prone sleeping but also identified four contributing factors that increase risk—the use of soft (natural fibre) mattresses, swaddling, recent illness, and overheating of bedrooms.

      The apparent role of vitamins and minerals, both from dietary sources and as supplements, continued to grow in importance. In one NCI study, conducted in a rural area of China where intake of fresh fruits, meat, and dairy products was limited, people who took a supplement of beta-carotene, vitamin E, and selenium for five years had a 13% lower risk of dying from cancer, a 10% reduction in the risk of death from stroke, and a 9% lower risk of death due to all causes. The group taking the supplement also experienced a 21% decline in deaths from stomach cancer, which occurs at a particularly high rate in this region.

      Studies conducted in the U.S. indicated that dietary consumption of antioxidants—especially beta-carotene and vitamin E—reduces the risk of stroke in general and the thickness of the walls of the carotid arteries (which carry blood to the brain) in particular. Moreover, reports from two ongoing studies of U.S. health professionals showed that taking vitamin E supplements was associated with a significant decrease in the risk of CHD.

      A number of studies had shown that women who take folic acid supplements have a reduced risk of bearing children with the congenital malformations known collectively as neural tube defects. However, it was very difficult to obtain a protective level of folic acid from the diet alone. Therefore, in 1993 the U.S. Food and Drug Administration recommended that folic acid be used to fortify common grain products so that women who become pregnant will have enough of the substance in their bodies very early in pregnancy, when the neural tube (which becomes the brain and spinal cord) begins to form.


Mental Health.
      Two advances in the understanding of the most common of the serious mental illnesses, schizophrenia, emerged from work at King's College Hospital, London. First, a team there set out to locate the basis of the auditory hallucinations ("hearing voices") that characterize this illness. According to one view, such hallucinations occur because schizophrenics are unable to monitor their own thoughts, or "inner speech," which they therefore regard as alien.

      The researchers used an imaging technique known as single photon emission tomography (SPET) to study 12 men both when they were and when they were not experiencing such hallucinations. The scans showed that blood flow in a part of the brain called Broca's area was significantly greater during hallucinations than at other times. This suggested that the production of auditory hallucinations is associated with increased activity in one of the main regions of the brain that is specialized for language.

      Research by the same group also threw light on the origin of schizophrenia. Four different studies showed that exposure of pregnant women to influenza during the fifth or sixth month of gestation increased the risk of schizophrenia's appearing later in the offspring. The effect was more pronounced in female than in male children.

      A collaborative survey conducted in London and Bordeaux, France, highlighted significant differences in the diagnosis of schizophrenia by British and French psychiatrists. This disparity, which reflected the greater influence of psychoanalytic ideas in France, could partially explain why first hospital admission rates in France for schizophrenia are much higher before the age of 45 but lower after that age. Particularly in light of political and economic union, which means that medical professionals in the European Community (EC) will be able to practice in any EC country, the authors of the report argued that further work was necessary to ensure that psychiatrists speak a common language.

      Another disparity that came to light during 1993 was that of the prevalence of dementia in different elderly populations. A study among 85-year-olds in Göteborg, Sweden, showed that 29.8% of them were suffering from dementia. This was similar to the figure reported following a recent survey in Shanghai but contrasted with the 47.2% found by community-based screening in East Boston, Mass., and the 12.6% rate for medically diagnosed cases of dementia in Rochester, Minn. It was not yet clear whether these differences reflected regional variations in incidence or differing diagnostic criteria. In the Swedish study almost half of those with dementia appeared to have a form of the disease related to circulatory problems, which may be more amenable to treatment or prevention than Alzheimer's disease.

      Reflecting increasing interest in the relationship between mind and body, a link between mental illness and the circulatory system emerged from a study carried out at the University of California at San Diego. Researchers there were interested in finding out why, according to several recent clinical trials of cholesterol-lowering drugs, benefits in the reduction of deaths from coronary heart disease were accompanied by significant increases in suicides and other violent deaths. One possible explanation was that the lowering of blood cholesterol triggered a rise in depressive illness. The study supported this explanation: in men aged 70 and older, depression was three times more common in those with low cholesterol than in those with higher levels. Since health authorities now widely recommended measures to reduce blood cholesterol, the investigators were further studying the significance of this relationship and possible mechanisms responsible for it.

      A Danish survey established that psychological distress late in pregnancy is associated with a heightened risk of preterm delivery, which is in turn linked with increased rates of infant death and other adverse consequences. Although there had been similar suggestions previously, this study of some 6,000 women put the matter beyond dispute, indicating the need for intervention to avoid psychological distress during pregnancy. (BERNARD DIXON)

      This updates the article mental disorder.

Dentistry.
      As Pres. Bill Clinton introduced his U.S. health reform proposal during 1993, the American Dental Association (ADA) concurred with the recommendation that a high priority be placed on children's dental services but sharply disagreed with the administration's contention that costs would limit the scope of coverage in the early years. A preliminary draft of the plan proposed coverage of children's preventive services to age 18 and preventive care for adults by the year 2000.

      A team of researchers at the University of Florida College of Dentistry was using recombinant DNA technology to construct a non-acid-producing microorganism that could be used in the mouth to replace the common oral bacterium Streptococcus mutans, which lives on the teeth and appears to cause most dental cavities. Replacement therapy depends on finding a bacterial strain that does not cause disease itself and that, by virtue of its presence, prevents infection by a pathogenic strain. In previous studies the team constructed an organism that effectively prevented dental cavities in animals.

      Contrary to earlier reports, two studies published during the year failed to find a link between drinking fluoridated water and having an increased risk of osteoporosis, the thinning of the bones that occurs with age. Fluoridation drastically reduced dental cavities around the world and resulted in generations of people being virtually free of tooth decay. A long-term Canadian study showed that hip-fracture rates were the same in Edmonton, Alta., which had fluoride levels of one part per million, and Calgary, Alta., with a fluoride level of 0.3 ppm. A second study, conducted by the Mayo Clinic, found that hip-fracture rates in Rochester, Minn., were slightly higher prior to fluoridation of drinking water in 1960 than they were afterward.

      Smoking has been linked for several decades with an increased risk of various health problems, including periodontal disease. New findings suggested that smoking not only affects the development of gum disease but also reduces the success of treatment. Research conducted at the University of Texas at San Antonio and the University of California at Los Angeles found that patients with severe gum disease who smoke at least half a pack of cigarettes daily do not respond to treatment as well as do nonsmokers. Periodontal therapy successfully eliminated bacteria associated with the disease in only 48% of the patients who were smokers, compared with a 70% success rate in the nonsmokers. It was not yet known whether a short-term cessation of smoking during the therapy would be sufficient to improve the results. (LOU JOSEPH)

Veterinary Medicine.
      It had long been suspected that some dogs could predict the onset of an epileptic seizure in humans, but little solid evidence had been produced. In 1993 Andrew Edney, a veterinarian in the U.K., published the results of a survey of objective accounts of such incidents. Respondents reported significant behaviour changes in dogs preceding a seizure in their owners. The dogs barked or whined, licked the subjects' faces or hands, or sought assistance. Edney suggested that further work might make it possible to identify dogs possessing the ability, with a view toward helping people with epilepsy.

      Although it had been used for centuries, the practice of "firing" in the treatment of equine lameness—application of a hot iron to the affected area—had become discredited. Instead, several alternative treatments were suggested, including anti-inflammatory drugs such as corticosteroids. Substances that combined anti-inflammatory action with replacement of defective synovial fluid (a natural fluid that bathes the joints) also were used in treating equine lameness caused by joint disease. One such substance was sodium hyaluronate, a natural constituent of both cartilage tissue and synovial fluid. It had several drawbacks, however. The product had to be injected directly into the affected joint, which was not always easy, and it was expensive. Traditional manufacture was based on extraction of material from animal tissues. In a major advance, scientists working in the U.S. for pharmaceuticals manufacturer Bayer devised a method of producing hyaluronic acid from bacteria by a fermentation process. Not only did this reduce the cost, but it also enabled a product of greater purity to be produced. The new product could be administered intravenously.

      An ambitious worldwide concept for an information database for practicing veterinarians was launched in 1993 by a group based in Cambridge, England. Called Vetstream, the system envisaged a "central information depot" linked by telephone line and satellite to veterinary practices. Each would have a computer terminal with core information on CD-ROMs plus a facility for continual updating. Comprehensive clinical information would be presented via text, sound, and pictures. For example, a veterinarian consulting the system about a particular heart condition would be able to see a typical cardiogram and simultaneously listen to the related heart sounds. If a surgical procedure was indicated, it could be screened in full colour, complete with commentary. (EDWARD BODEN)

      See also Molecular Biology (Life Sciences).

      This updates the articles diagnosis; disease; infection (infectious disease); medicine.

