pharmaceutical industry

Producers of pharmaceuticals, substances used in the diagnosis, treatment, and prevention of disease and the modification of organic functions.

The earliest records of medicinal plants and minerals are those of the ancient Chinese, Hindu, and Mediterranean civilizations. Medicines were prepared first by physicians and later in apothecary shops. The modern pharmaceutical industry began in the 19th century with the discovery of highly active medicinal compounds that could be manufactured most efficiently on a large scale. As these drugs replaced the herbal medicines of earlier times, the occurrence and severity of such diseases as rheumatic fever, typhoid fever, pneumonia, poliomyelitis, syphilis, and tuberculosis were greatly reduced. Many drugs are extracted from plant substances; alkaloids such as quinine, cocaine, and morphine are among the best-known examples. Others are made from animal substances, such as the glandular extracts that are used to produce insulin. Pharmaceutical industry research has greatly aided medical progress, and many new drugs have been discovered and produced in industrial laboratories. Increasing health-care costs, government regulation, and research ethics are all issues of concern to the industry.

* * *

Introduction

      The discovery, development, and manufacture of drugs and medications (pharmaceuticals) by public and private organizations.

      The modern era of the pharmaceutical industry—of isolation and purification of compounds, chemical synthesis, and computer-aided drug design—is considered to have begun in the 19th century, thousands of years after intuition and trial and error led humans to believe that plants, animals, and minerals contained medicinal properties. The increasing integration in the 20th century of research in fields such as chemistry and physiology deepened the understanding of basic drug-discovery processes. Identifying new drug targets, attaining regulatory approval from government agencies, and refining techniques in drug discovery and development are among the challenges that face the pharmaceutical industry today. The continual evolution and advancement of the pharmaceutical industry is fundamental to the control and elimination of disease around the world.

      The following sections provide a detailed explanation of the progression of drug discovery and development throughout history, the process of drug development in the modern pharmaceutical industry, and the procedures that are followed to ensure the production of safe drugs. For further information about drugs, see drug. For a comprehensive description of the practice of medicine and the role of drug research in the health care industry, see medicine.

History

The origin of medicines
Medicines of ancient civilizations
      The oldest records of medicinal preparations made from plants, animals, or minerals are those of the early Chinese, Hindu, and Mediterranean civilizations. An herbal compendium, said to have been written in the 28th century BC by the legendary emperor Shennong (Shen Nung), described the antifever capabilities of a substance known as chang shan (from the plant species Dichroa febrifuga), which has since been shown to contain antimalarial alkaloids (alkaline organic chemicals containing nitrogen). Workers at the school of alchemy that flourished in Alexandria, Egypt, in the 2nd century BC prepared several relatively purified inorganic chemicals, including lead carbonate, arsenic, and mercury. According to De materia medica, written by the Greek physician Pedanius Dioscorides in the 1st century AD, verdigris (basic cupric acetate) and cupric sulfate were prescribed as medicinal agents. While attempts were made to use many of the mineral preparations as drugs, most proved to be too toxic to be used in this manner.

      Many plant-derived medications employed by the ancients are still in use today. Egyptians treated constipation with senna pods and castor oil and indigestion with peppermint and caraway. Various plants containing digitalis-like compounds (cardiac stimulants) were employed to treat a number of ailments. Ancient Chinese physicians employed ma huang, a plant containing ephedrine, for a variety of purposes. Today ephedrine is used in many pharmaceutical preparations intended for the treatment of cold and allergy symptoms. The Greek physician Galen (c. 130–c. 200 AD) included opium and squill among the drugs in his apothecary shop. Today derivatives of opium alkaloids are widely employed for pain relief, and, while squill was used for a time as a cardiac stimulant, it is better known as a rat poison. Although many of the medicinal preparations used by Galen are obsolete, he made many important conceptual contributions to modern medicine. For example, he was among the first practitioners to insist on purity for drugs. He also recognized the importance of using the right variety and age of botanical specimens in making drugs.

Pharmaceutical science in the 16th and 17th centuries
      Pharmaceutical science improved markedly in the 16th and 17th centuries. In 1546 the first pharmacopoeia, or collected list of drugs and medicinal chemicals with directions for making pharmaceutical preparations, appeared in Nürnberg, Ger. Previous to this time, medical preparations had varied in concentration and even in constituents. Other pharmacopoeias followed in Basel (1561), Augsburg (1564), and London (1618). The London Pharmacopoeia became mandatory for the whole of England and thus became the first example of a national pharmacopoeia. Another important advance was initiated by Paracelsus, a 16th-century Swiss physician-chemist. He admonished his contemporaries not to use chemistry, as it had widely been employed prior to his time, in the speculative science of alchemy and the making of gold. Instead, Paracelsus advocated the use of chemistry to study the preparation of medicines.

      In London the Society of Apothecaries (pharmacists) was founded in 1617. This marked the emergence of pharmacy as a distinct and separate entity. The separation of apothecaries from grocers was authorized by King James I, who also mandated that only a member of the society could keep an apothecary's shop and make or sell pharmaceutical preparations. In 1841 the Pharmaceutical Society of Great Britain was founded. This society oversaw the education and training of pharmacists to assure a scientific basis for the profession. Today professional societies around the world play a prominent role in supervising the education and practice of their members.

      In 1783 the English physician and botanist William Withering published his famous monograph on the use of digitalis (an extract from the flowering purple foxglove, Digitalis purpurea). His book, An Account of the Foxglove and Some of Its Medicinal Uses: With Practical Remarks on Dropsy and Other Diseases, described in detail the use of digitalis preparations and included suggestions as to how their toxicity might be reduced. Plants containing digitalis-like compounds had been employed by ancient Egyptians thousands of years earlier, but their use had been erratic. Withering believed that the primary action of digitalis was on the kidney, thereby preventing dropsy (edema). Later, with a better understanding of the circulation, it was found that the primary action of digitalis is to improve cardiac performance (cardiac output), with the reduction in edema resulting from improved cardiovascular function. Nevertheless, the observations in Withering's monograph led to a more rational and scientifically based use of digitalis and eventually of other drugs.

Isolation and synthesis of compounds
      In the 1800s many important compounds were isolated from plants for the first time. About 1804 the active ingredient, morphine, was isolated from opium. In 1820 quinine (a malaria treatment) was isolated from cinchona bark and colchicine (a gout treatment) from autumn crocus. In 1833 atropine (used for a variety of purposes) was purified from Atropa belladonna, and in 1860 cocaine (a local anesthetic) was isolated from coca leaves. Isolation and purification of these medicinal compounds were of tremendous importance for several reasons. First, accurate doses of the drugs could be administered, something that had not been possible previously because the plants contained unknown and variable amounts of the active drug. Second, toxic effects due to impurities in the plant products could be eliminated if only the pure active ingredients were used. Finally, knowledge of the chemical structure of pure drugs enabled laboratory synthesis of many structurally related compounds and the development of valuable drugs.

      Pain relief has been an important goal of medicine development for millennia. Prior to the mid-19th century, surgeons took great pride in the speed with which they could complete a surgical procedure. Faster surgery meant that the patient would endure the excruciating pain for a shorter period of time. In 1842 ether (ethyl ether) was first employed as an anesthetic during surgery, and chloroform followed soon after in 1847. These agents revolutionized the practice of surgery. After their introduction, careful attention could be paid to the prevention of tissue damage, and longer and more-complex surgical procedures could be carried out more safely. Although both ether and chloroform were employed in anesthesia for more than a century, their current use is severely limited by their side effects; ether is very flammable and explosive, and chloroform may cause severe liver toxicity in some patients. However, because pharmaceutical chemists knew the chemical structures of these two anesthetics, they were able to synthesize newer anesthetics that share many chemical similarities with ether and chloroform but do not burn or cause liver toxicity.

The development of anti-infective agents
Discovery of antiseptics and vaccines
      Prior to the development of anesthesia, many patients succumbed to the pain and stress of surgery. Many others died when their wounds became infected. In 1865 the British surgeon and medical scientist Joseph Lister initiated the era of antiseptic surgery in England. While many of the innovations of the antiseptic era were procedural (use of gloves and other sterile procedures), Lister also introduced the use of phenol as an anti-infective agent.

      In the prevention of infectious diseases, an even more important innovation took place near the beginning of the 19th century with the introduction of smallpox vaccine. In the late 1790s the English surgeon Edward Jenner observed that milkmaids who had been infected with the relatively benign cowpox virus were protected against the much more deadly smallpox. After this observation he developed an immunization procedure based on the use of crude material from the cowpox lesions. This success was followed in 1885 by the development of rabies vaccine by the French chemist and microbiologist Louis Pasteur. Widespread vaccination programs have dramatically reduced the incidence of many infectious diseases that once were common. Indeed, vaccination programs have eliminated smallpox infections. The virus no longer exists in the wild, and, unless it is reintroduced from caches of smallpox virus held in laboratories in the United States and Russia, smallpox will no longer occur in humans. A similar effort is under way with widespread polio vaccination; however, it remains unknown whether vaccines will eliminate polio as a human disease.

Improvement in drug administration
      While it may seem obvious today, it was not always clearly understood that medications must be delivered to the diseased tissue in order to be effective. Indeed, at times apothecaries made pills that were designed to be swallowed, pass through the gastrointestinal tract, be retrieved from the stool, and used again. While most drugs are effective and safe when taken orally, some are not reliably absorbed into the body from the gastrointestinal tract and must be delivered by other routes. In the middle of the 17th century, Richard Lower and Christopher Wren, working at the University of Oxford, demonstrated that drugs could be injected into the bloodstream of dogs using a hollow quill. In 1853 the French surgeon Charles Gabriel Pravaz invented the hollow hypodermic needle, which was first used in the treatment of disease in the same year by the Scottish physician Alexander Wood. The hollow hypodermic needle had a tremendous influence on drug administration. Because drugs could be injected directly into the bloodstream, rapid and dependable drug action became much easier to achieve. Development of the hollow hypodermic needle also led to an understanding that drugs could be administered by multiple routes and was of great significance for the development of the modern science of pharmaceutics, or dosage-form development.

Drug development in the 19th and 20th centuries
New classes of pharmaceuticals
      In the latter part of the 19th century a number of important new classes of pharmaceuticals were developed. In 1869 chloral hydrate became the first synthetic sedative-hypnotic (sleep-producing) drug. In 1879 it was discovered that organic nitrates such as nitroglycerin could relax blood vessels, eventually leading to the use of these organic nitrates in the treatment of heart problems. In 1875 several salts of salicylic acid were developed for their antipyretic (fever-reducing) action. Salicylate-like preparations in the form of willow bark extracts (which contain salicin) had been in use for at least 100 years prior to the identification and synthesis of the purified compounds. In 1879 the artificial sweetener saccharin was introduced. In 1886 acetanilide, the first analgesic-antipyretic drug (relieving pain and fever), was introduced, but it was replaced in 1887 by the less toxic phenacetin. In 1899 aspirin (acetylsalicylic acid) was introduced and became the most effective and popular anti-inflammatory, analgesic-antipyretic drug for at least the next 60 years. Cocaine, derived from the coca leaf, was the only known local anesthetic until about 1900, when the synthetic compound benzocaine was introduced. Benzocaine was the first of many local anesthetics with similar chemical structures, and it led to the synthesis and introduction of a variety of compounds with more efficacy and less toxicity.

Transitions in drug discovery
      In the late 19th and early 20th centuries, a number of social, cultural, and technical changes of importance to pharmaceutical discovery, development, and manufacturing were taking place. One of the most important changes occurred when universities began to encourage their faculties to form a more coherent understanding of existing information. Some chemists developed new and improved ways to separate chemicals from minerals, plants, and animals, while others developed ways to synthesize novel compounds. Biologists did research to improve understanding of the processes fundamental to life in species of microbes, plants, and animals. Developments in science were happening at a greatly accelerated rate, and the way in which pharmacists and physicians were educated changed. Prior to this transformation the primary means of educating physicians and pharmacists had been through apprenticeships. While apprenticeship teaching remained important to the education process (in the form of clerkships, internships, and residencies), pharmacy and medical schools began to create science departments and hire faculty to teach students the new information in basic biology and chemistry. New faculty were expected to carry out research or scholarship of their own. With the rapid advances in chemical separations and synthesis, individual pharmacists no longer had the skills and resources to make the newer, chemically pure drugs. Instead, large chemical and pharmaceutical companies began to appear, employing university-trained scientists equipped with knowledge of the latest technologies and information in their fields.

      As the 20th century progressed, the benefits of medical, chemical, and biological research began to be appreciated by the general public and by politicians, prompting governments to develop mechanisms to provide support for university research. In the United States, for instance, the National Institutes of Health, the National Science Foundation, the Department of Agriculture, and many other agencies undertook their own research or supported research and discovery at universities that could then be used for pharmaceutical development. Nonprofit organizations were also developed to support research, including the Australian Heart Foundation, the American Heart Association, the Heart and Stroke Foundation of Canada, and H.E.A.R.T UK. The symbiotic relationship between large public institutions carrying out fundamental research and private companies making use of the new knowledge to develop and produce new pharmaceutical products has contributed greatly to the advancement of medicine.

Establishing the fight against infectious disease
Early efforts in the development of anti-infective drugs
      For much of history, infectious diseases were the leading cause of death in most of the world. The widespread use of vaccines and implementation of public health measures, such as building reliable sewer systems and chlorinating water to assure safe supplies for drinking, were of great benefit in decreasing the impact of infectious diseases in the industrialized world. However, even with these measures, pharmaceutical treatments for infectious diseases were needed. The first of these was arsphenamine, which was developed in 1910 by the German medical scientist Paul Ehrlich for the treatment of syphilis. Arsphenamine was the 606th chemical studied by Ehrlich in his quest for an antisyphilitic drug. Its efficacy was first demonstrated in mice with syphilis and then in humans. Arsphenamine was marketed under the trade name Salvarsan and was used to treat syphilis until the 1940s, when it was replaced by penicillin. Ehrlich referred to his approach as chemotherapy, the use of a specific chemical to combat a specific infectious organism. Arsphenamine was important not only because it was the first synthetic compound to kill a specific invading microorganism but also because of the approach Ehrlich used to find it. In essence, he synthesized a large number of compounds and screened each one to find a chemical that would be effective. Screening for efficacy became one of the most important means used by the pharmaceutical industry to develop new drugs.

      The next great advance in the development of drugs for treatment of infections came in the 1930s, when it was shown that certain azo dyes, which contained sulfonamide groups, were effective in treating streptococcal infections in mice. One of the dyes, known as Prontosil, was later found to be metabolized in the patient to sulfanilamide, which was the active antibacterial molecule. In 1933 Prontosil was given to the first patient, an infant with a systemic staphylococcal infection. The infant underwent a dramatic cure. In subsequent years many derivatives of sulfonamides, or sulfa drugs, were synthesized and tested for antibacterial and other activities.

Discovery of penicillin
  The first description of penicillin was published in 1929 by the Scottish bacteriologist Alexander Fleming. Fleming had been studying staphylococcal bacteria in the laboratory at St. Mary's Hospital in London. He noticed that a mold had contaminated one of his cultures, causing the bacteria in its vicinity to undergo lysis (membrane rupture) and die. Since the mold was from the genus Penicillium, Fleming named the active antibacterial substance penicillin. At first the significance of Fleming's discovery was not widely recognized. More than 10 years passed before the British biochemist Ernst Boris Chain and the Australian pathologist Howard Florey, working at the University of Oxford, showed that a crude penicillin preparation produced a dramatic curative effect when administered to mice with streptococcal infections.

      The production of large quantities of penicillin was difficult with the facilities available to the investigators. However, by 1941 they had enough penicillin to carry out a clinical trial in several patients with severe staphylococcal and streptococcal infections. The effects of penicillin were remarkable, although there was not enough drug available to save the lives of all the patients in the trial.

      In an effort to develop large quantities of penicillin, the collaboration of scientists at the United States Department of Agriculture's Northern Regional Research Laboratories in Peoria, Ill., was enlisted. The laboratories in Peoria had large fermentation vats that could be used in an attempt to grow an abundance of the mold. In England the first penicillin had been produced by growing the Penicillium notatum mold in small containers. However, P. notatum would not grow well in the large fermentation vats available in Peoria, so scientists from the laboratories searched for another strain of Penicillium. Eventually a strain of Penicillium chrysogenum that had been isolated from an overripe cantaloupe was found to grow very well in the deep culture vats. After the process of growing the penicillin-producing organisms was developed, pharmaceutical firms were recruited to further develop and market the drug for clinical use. The use of penicillin very quickly revolutionized the treatment of serious bacterial infections. The discovery, development, and marketing of penicillin provides an excellent example of the beneficial collaborative interaction of not-for-profit researchers and the pharmaceutical industry.

Discovery and development of hormones and vitamins
Isolation of insulin
      The vast majority of hormones were identified, had their biological activity defined, and were synthesized in the first half of the 20th century. Illnesses relating to their excess or deficiency were also beginning to be understood at that time. Hormones, produced in specific organs, released into the circulation, and carried to other organs, significantly affect metabolism and homeostasis. Some examples of hormones are insulin (from the pancreas), epinephrine (or adrenaline; from the adrenal medulla), thyroxine (from the thyroid gland), cortisol (from the adrenal cortex), estrogen (from the ovaries), and testosterone (from the testes). As a result of discovering these hormones and their mechanisms of action in the body, it became possible to treat illnesses of deficiency or excess effectively. The discovery and use of insulin to treat diabetes is an example of these developments.

      In 1869 Paul Langerhans, a medical student in Germany, was studying the histology of the pancreas. He noted that this organ has two distinct types of cells—acinar cells, now known to secrete digestive enzymes, and islet cells, now called the islets of Langerhans. The function of islet cells was suggested in 1889, when the German physiologist and pathologist Oskar Minkowski and the German physician Joseph von Mering showed that removing the pancreas from a dog caused the animal to exhibit a disorder quite similar to human diabetes mellitus (elevated blood glucose and metabolic changes). After this discovery, a number of scientists in various parts of the world attempted to extract the active substance from the pancreas so that it could be used to treat diabetes. It is now known that these attempts were largely unsuccessful because the digestive enzymes present in the acinar cells metabolized the insulin from the islet cells when the pancreas was disrupted.

      In 1921 Frederick Banting, a young Canadian surgeon in Toronto, convinced a physiology professor to allow him use of a laboratory to search for the active substance from the pancreas. Banting guessed correctly that the islet cells secreted insulin, which was destroyed by enzymes from the acinar cells. By this time Banting had enlisted the support of Charles Best, a fourth-year medical student. Together they tied off the pancreatic ducts through which acinar cells release the digestive enzymes. This insult caused the acinar cells to die. Subsequently, the remainder of the pancreas was homogenized and extracted with ethyl alcohol and acid. The extract thus obtained decreased blood glucose levels in dogs with a form of diabetes. Shortly thereafter, in 1922, a 14-year-old boy with severe diabetes was the first human to be treated successfully with the pancreatic extracts.

      After this success other scientists became involved in the quest to develop large quantities of purified insulin extracts. Eventually, extracts from pig and cow pancreases provided a sufficient and reliable supply of insulin. For the next 50 years most of the insulin used to treat diabetes was extracted from porcine and bovine sources. There are only slight differences in chemical structure between bovine, porcine, and human insulin, and their hormonal activities are essentially equivalent. Today, as a result of recombinant DNA technology, most of the insulin used in therapy is synthesized by pharmaceutical companies and is identical to human insulin (see below Synthetic human proteins).

Identification of vitamins
      Vitamins are organic compounds that are necessary for body metabolism and, generally, must be provided in the diet. For centuries many diseases of dietary deficiency had been recognized, although not well defined. Most of the vitamin deficiency disorders were biochemically and physiologically defined in the late 19th and early 20th centuries. The discovery of thiamin (vitamin B1) exemplifies how vitamin deficiencies and their treatments were discovered.

      Thiamin deficiency produces beriberi, a word from the Sinhalese meaning “extreme weakness.” The symptoms include spasms and rigidity of the legs, possible paralysis of a limb, personality disturbances, and depression. This disease became widespread in Asia in the 19th century because steam-powered rice mills produced polished rice, which lacked the vitamin-rich husk. A dietary deficiency was first suggested as the cause of beriberi in 1880, when a new diet was instituted for the Japanese navy. When fish, meat, barley, and vegetables were added to the sailors' diet of polished rice, the incidence of beriberi in the navy was significantly reduced. In 1897 the Dutch physician Christiaan Eijkman, working in Java, showed that fowl fed a diet of polished rice developed symptoms similar to those of beriberi. He was also able to demonstrate that unpolished rice in the diet prevented and cured the symptoms in fowl and humans. By 1912 a highly concentrated extract of the active ingredient had been prepared by the Polish biochemist Casimir Funk, who recognized that it belonged to a new class of essential foods called vitamins. Thiamin was isolated in 1926, and its chemical structure was determined in 1936. The chemical structures of the other vitamins were determined prior to 1940.

Emergence of modern diseases and treatment
      The rapid decline in the number of deaths from infections due to the development of vaccines and antibiotics led to the unveiling of a new list of deadly diseases in the industrialized world during the second half of the 20th century. Included in this list are cardiovascular disease, cancer, and stroke. While these remain the three leading causes of death today, a great deal of progress in decreasing mortality and disability caused by these diseases has been made since the 1940s. As with treatment of any complex disease, there are many events of importance in the development of effective therapy. For decreasing death and disability from cardiovascular diseases and stroke, one of the most important developments was the discovery of effective treatments for hypertension (high blood pressure)—i.e., the discovery of thiazide diuretics. For decreasing death and disability from cancer, one very important step was the development of cancer chemotherapy.

      Hypertension has been labeled the “silent killer.” It usually has minimal or no symptoms and typically is not regarded as a primary cause of death. Untreated hypertension, however, increases the incidence and severity of cardiovascular diseases and stroke. Before 1950 there were no effective treatments for hypertension. U.S. Pres. Franklin D. Roosevelt died after a stroke in 1945, despite a large effort by his physicians to control his very high blood pressure by prescribing sedatives and rest.

      When sulfanilamide was introduced into therapy, one of the side effects it produced was metabolic acidosis (an acid-base imbalance). After further study, it was learned that the acidosis was caused by inhibition of the enzyme carbonic anhydrase. Inhibition of carbonic anhydrase produces diuresis (increased urine formation). Subsequently, many sulfanilamide-like compounds were synthesized and screened for their ability to inhibit carbonic anhydrase. Acetazolamide, which was developed by scientists at Lederle Laboratories (now a part of Wyeth Pharmaceuticals, Inc.), became the first of a class of diuretics that serve as carbonic anhydrase inhibitors. In an attempt to produce a carbonic anhydrase inhibitor more effective than acetazolamide, chlorothiazide was synthesized by a team of scientists led by Karl Henry Beyer at Merck & Co., Inc., and it became the first successful thiazide diuretic. While acetazolamide causes diuresis by increasing sodium bicarbonate excretion, chlorothiazide was found to increase sodium chloride excretion. More importantly, by the mid-1950s it had been shown that chlorothiazide lowers blood pressure in patients with hypertension. Over the next 50 years many other classes of drugs that lower blood pressure (antihypertensive drugs) were added to the physician's armamentarium for the treatment of hypertension. Partially as a result of effective treatment of this disease, the death rate from cardiovascular diseases and stroke decreased dramatically during this period.

      The discovery of chlorothiazide exemplifies two important pathways to effective drug development. The first is screening for a biological effect. Thousands of drugs have been developed through effective screening for a biological activity. The second pathway is serendipity—i.e., making fortunate discoveries by chance. While creating experiments that can lead to chance outcomes does not require particular scientific skill, recognizing the importance of accidental discoveries is one of the hallmarks of sound science. Many authorities doubt that Fleming was the first scientist to notice that when agar plates were contaminated with Penicillium mold, bacteria did not grow near the mold. However, what made Fleming great was that he was the first to recognize the importance of what he had seen. In the case of chlorothiazide, it was serendipitous that sulfanilamide was found to cause metabolic acidosis, and it was serendipitous that chlorothiazide was recognized to cause sodium chloride excretion and an antihypertensive effect.

Early progress in cancer drug development
      Sulfur mustard was synthesized in 1854. By the late 1880s it was recognized that sulfur mustard could cause blistering of the skin, eye irritation possibly leading to blindness, and severe lung injury if inhaled. In 1917, during World War I, sulfur mustard was first used as a chemical weapon. By 1919 it was realized that exposure to sulfur mustard also produced very serious systemic toxicities. Among other effects, it caused leukopenia (decreased white blood cells) and damage to bone marrow and lymphoid tissue. During the interval between World War I and World War II, there was extensive research into the biological and chemical effects of nitrogen mustards (chemical analogs of sulfur mustard) and similar chemical-warfare compounds. The toxicity of nitrogen mustard on lymphoid tissue led researchers to study its effect on lymphomas in mice. In the early 1940s nitrogen mustard (mechlorethamine) was discovered to be effective in the treatment of human lymphomas. The efficacy of this treatment led to the widespread realization that chemotherapy for cancer could be effective. In turn, this realization led to extensive research, discovery, and development of other cancer chemotherapeutic agents.

Pharmaceutical industry in the modern era
      The pharmaceutical industry has become a large and very complex enterprise. At the end of the 20th century, most of the world's largest pharmaceutical companies were located in North America, Europe, and Japan; many of the largest were multinational, with research, manufacturing, and sales operations in multiple countries. Since pharmaceuticals can be quite profitable, many countries are trying to develop the infrastructure necessary for their domestic drug companies to become larger and to compete on a worldwide scale. The industry has also come to be characterized by outsourcing; that is, many companies contract with specialty manufacturers or research firms to carry out parts of the drug development process for them, while others try to retain most of the process within their own company. Since the pharmaceutical industry is driven largely by profits and competition—each company striving to be the first to find cures for specific diseases—it is anticipated that the industry will continue to change and evolve over time.

Drug discovery and development

Drug development process
  A variety of approaches is employed to identify chemical compounds that may be developed and marketed. The current state of the chemical and biological sciences required for pharmaceutical development dictates that 5,000–10,000 chemical compounds must undergo laboratory screening for each new drug approved for use in humans. Of the 5,000–10,000 compounds that are screened, approximately 250 will enter preclinical testing, and 5 will enter clinical testing. The overall process from discovery to marketing of a drug can take 10 to 15 years. This section describes some of the processes used by the industry to discover and develop new drugs and provides an overall summary of this developmental process.
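
      The attrition just described reduces to simple arithmetic. The short sketch below is illustrative only; it uses the round figures quoted in this paragraph to compute the fraction of compounds surviving each transition:

```python
# Attrition funnel using the figures quoted above: roughly 10,000 screened
# compounds yield about 250 preclinical candidates, 5 clinical candidates,
# and 1 approved drug.
def attrition(screened=10_000, preclinical=250, clinical=5, approved=1):
    stages = [("screened", screened), ("preclinical", preclinical),
              ("clinical", clinical), ("approved", approved)]
    for (name_a, n_a), (name_b, n_b) in zip(stages, stages[1:]):
        print(f"{name_a} -> {name_b}: {n_b / n_a:.2%} advance")

attrition()
# screened -> preclinical: 2.50% advance
# preclinical -> clinical: 2.00% advance
# clinical -> approved: 20.00% advance
```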

Research and discovery
      Pharmaceuticals are produced as a result of activities carried out by a complex array of public and private organizations that are engaged in the development and manufacture of drugs. As part of this process, scientists at many publicly funded institutions carry out basic research in subjects such as chemistry, biochemistry, physiology, microbiology, and pharmacology. Basic research is almost always directed at developing new understanding of natural substances or physiological processes rather than being directed specifically at development of a product or invention. This enables scientists at public institutions and in private industry to apply new knowledge to the development of new products. The first steps in this process are carried out largely by basic scientists and physicians working in a variety of research institutions and universities. The results of their studies are published in scientific and medical journals. These results facilitate the identification of potential new targets for drug discovery. The targets could be a drug receptor, an enzyme, a biological transport process, or any other process involved in body metabolism. Once a target is identified, the bulk of the remaining work involved in discovery and development of a drug is carried out or directed by pharmaceutical companies.

Contribution of scientific knowledge to drug discovery
      Two classes of antihypertensive drugs serve as an example of how enhanced biochemical and physiological knowledge of one body system contributed to drug development. Hypertension (high blood pressure) is a major risk factor for the development of cardiovascular disease, and an important way to prevent cardiovascular disease is to control high blood pressure. One of the physiological systems involved in blood pressure control is the renin-angiotensin system. Renin is an enzyme produced in the kidney. It acts on a blood protein to produce angiotensin. The details of the biochemistry and physiology of this system were worked out by biomedical scientists working at hospitals, universities, and government research laboratories around the world. Two important steps in the production of the physiological effect of the renin-angiotensin system are the conversion of inactive angiotensin I to active angiotensin II by angiotensin-converting enzyme (ACE) and the interaction of angiotensin II with its physiologic receptors, including AT1 receptors. Angiotensin II interacts with AT1 receptors to raise blood pressure. Knowledge of the biochemistry and physiology of this system suggested to scientists that new drugs could be developed to lower abnormally high blood pressure.

      A drug that inhibited ACE would decrease the formation of angiotensin II. Decreasing angiotensin II formation would, in turn, result in decreased activation of AT1 receptors. Thus, it was assumed that drugs that inhibit ACE would lower blood pressure. This assumption turned out to be correct, and a class of antihypertensive drugs called ACE inhibitors was developed. Similarly, once the role of AT1 receptors in blood pressure maintenance was understood, it was assumed that drugs that could block AT1 receptors would produce antihypertensive effects. Once again, this assumption proved correct, and a second class of antihypertensive drugs, the AT1 receptor antagonists, was developed. Agonists are drugs or naturally occurring substances that activate physiologic receptors, whereas antagonists are drugs that block those receptors. In this case, angiotensin II is an agonist at AT1 receptors, and the antihypertensive AT1 drugs are antagonists. Antihypertensives illustrate the value of discovering novel drug targets that are useful for large-scale screening tests to identify lead chemicals for drug development.
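
      This reasoning can be made quantitative with the standard receptor-occupancy model of pharmacology; the equations below are a conventional textbook sketch, not formulas given in this article. An agonist A (here, angiotensin II) produces an effect that saturates as its concentration rises, and a competitive antagonist B (an AT1 blocker) shifts the agonist curve to the right without lowering the maximal effect:

```latex
E = \frac{E_{\max}\,[A]}{[A] + \mathrm{EC}_{50}}
\qquad\text{and, with a competitive antagonist,}\qquad
E = \frac{E_{\max}\,[A]}{[A] + \mathrm{EC}_{50}\left(1 + [B]/K_B\right)}
```

In these terms, an ACE inhibitor lowers [A] itself by blocking angiotensin II formation, while an AT1 antagonist raises the apparent EC50 by the factor (1 + [B]/K_B); either change reduces AT1 receptor activation and thus blood pressure.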

Drug screening
Sources of compounds
      Screening chemical compounds for potential pharmacological effects is a very important process for drug discovery and development. Virtually every chemical and pharmaceutical company in the world has a library of chemical compounds that have been synthesized over many decades. Historically, many diverse chemicals have been derived from natural products such as plants, animals, and microorganisms. Many more chemical compounds are available from university chemists. Additionally, automated, high-throughput combinatorial chemistry methods have added hundreds of thousands of new compounds. Whether any of these millions of compounds have the characteristics that will allow them to become drugs remains to be discovered through rapid, high-efficiency drug screening.

Lead chemical identification
      It took Paul Ehrlich years to screen the 606 chemicals that resulted in the development of arsphenamine as the first effective drug treatment for syphilis. From about the time of Ehrlich's success (1910) until the latter half of the 20th century, most screening tests for potential new drugs relied almost exclusively on screens in whole animals such as rats and mice. Ehrlich screened his compounds in mice with syphilis, and his procedures proved to be much more efficient than those of his contemporaries. Since the latter part of the 20th century, automated in vitro screening techniques have allowed tens of thousands of chemical compounds to be screened for efficacy in a single day. In large-capacity in vitro screens, individual chemicals are mixed with drug targets in small, test-tube-like wells of microtiter plates, and desirable interactions of the chemicals with the drug targets are identified by a variety of chemical techniques. The drug targets in the screens can be cell-free (an enzyme, drug receptor, biological transporter, or ion channel), or they can contain cultured bacteria, yeasts, or mammalian cells. Chemicals that interact with drug targets in desirable ways become known as leads and are subjected to further developmental tests. Additional chemicals with slightly altered structures may be synthesized if the lead compound does not appear to be ideal. Once a lead chemical is identified, it will undergo several years of animal studies in pharmacology and toxicology to predict future human safety and efficacy.
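
      The selection logic of such a screen is simple to state in code. The sketch below is a toy model with made-up compound IDs, readings, and cutoff; real screens add robotics, plate readers, and statistical quality controls:

```python
# Flag "lead" compounds from a microtiter-plate screen: any well whose
# measured response (here, percent inhibition of the target) clears a
# cutoff is carried forward for further testing.
def find_leads(plate_readings: dict[str, float], cutoff: float = 50.0) -> list[str]:
    return [cid for cid, inhibition in plate_readings.items()
            if inhibition >= cutoff]

readings = {"CMP-0001": 12.5, "CMP-0002": 83.0, "CMP-0003": 55.4}  # made-up data
print(find_leads(readings))  # ['CMP-0002', 'CMP-0003']
```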

Lead compounds from natural products
      Another very important way to find new drugs is to isolate chemicals from natural products. Digitalis, ephedrine, atropine, quinine, colchicine, and cocaine were all purified from plants. Thyroid hormone (thyroxine), cortisol, and insulin originally were isolated from animals, whereas penicillin and other antibiotics were derived from microbes. In many cases plant-derived products were used for hundreds or thousands of years by indigenous peoples around the world prior to their “discovery” by scientists from industrialized countries. In most cases these indigenous peoples learned which plants had medicinal value the same way they learned which plants were safe to eat—trial and error. Ethnopharmacology is a branch of medical science in which the medicinal products used by indigenous peoples are investigated using modern scientific techniques. In some cases chemicals with desirable pharmacological properties are isolated and eventually become drugs with properties recognizable in the natural product. In other cases chemicals with unique or unusual chemical structures are identified in the natural product. These new chemical structures are then subjected to drug screens to determine if they have potential pharmacological or medicinal value. There are many cases in which such chemical structures and their synthetic analogs are developed as drugs with uses unlike those of the natural product. One such compound is the important anticancer drug taxol, which was isolated from the Pacific yew (Taxus brevifolia).

Taxol and the Pacific yew
      As a member of the yew family, Taxaceae, the Pacific yew (Taxus brevifolia) has flat, evergreen needles and produces red, berrylike fruits. The toxicity of members of the yew family was described in ancient Greek literature. Indeed, the genus name Taxus derives from the Greek word toxon, which can be translated as toxin or poison. Pliny the Elder described people who died after drinking wine that had been stored in containers made from yew wood. Julius Caesar described how one of his enemies, Catuvolcus, poisoned himself using a yew plant. The early Japanese used yew plant parts to induce abortion and to treat diabetes (diabetes mellitus), and Native Americans used yew to treat arthritis and fever. In part because of widespread historical accounts of the pronounced biological effects inherent in members of the yew family, samples of the Pacific yew were included in screens for potential anticancer drugs.

      This screening process was initiated as a cooperative venture between the United States Department of Agriculture (USDA) and the National Cancer Institute (NCI) of the United States. Extracts from the Pacific yew were tested against two cancer cell lines in 1964 and found to have promising effects. After a sufficient quantity of the extract was prepared, the active compound, taxol, was isolated in 1969. In 1979 pharmacologist Susan Horwitz and her coworkers at Yeshiva University's Albert Einstein College of Medicine reported a unique mechanism of action for taxol. In 1983 NCI-supported clinical trials with taxol were begun, and by 1989 NCI-supported clinical researchers at Johns Hopkins University reported very positive effects in the treatment of ovarian cancer. Also in 1989 the NCI reached an agreement with Bristol-Myers Squibb (Bristol-Myers Squibb Company) to increase production, supplies, and marketing of taxol. Taxol marketing for the treatment of ovarian cancer began in 1992. Bristol-Myers Squibb applied to trademark the name taxol, which became Taxol®, and the generic name became paclitaxel.

      Initially, the sole source of taxol was the bark of the Pacific yew, native to the old-growth forests along the northwest coast of the United States and in British Columbia. This led to considerable public controversy. Environmental groups feared that harvesting of the yew would endanger its survival. It took the bark of between three and ten 100-year-old plants to make enough drug to treat one patient. There were also fears that harvesting the yew would lead to environmental damage to the area and could potentially destroy much of the habitat for the endangered spotted owl. After several years of controversy, Bristol-Myers Squibb adopted a semisynthetic process for making taxol. This process uses a precursor, which is chemically converted to taxol. The precursor is extracted from the needles (renewable biomass) of Taxus baccata, which is grown in the Himalayas and in Europe. Although there were some political controversies surrounding the discovery and development of taxol, the story of its development and marketing provides another example of how public and private enterprise can cooperate in the development of new discoveries and new drugs.

Strategies for drug design and production
Structure-activity relationship
      The term structure-activity relationship (SAR) is now used to describe the process that Ehrlich used to develop arsphenamine, the first successful treatment for syphilis. In essence, Ehrlich synthesized a series of structurally related chemical compounds and tested each one to determine its pharmacological activity. In subsequent years many drugs were developed using the SAR approach. For example, the β-adrenergic antagonists (antihypertensive drugs) and the β2 agonists (asthma drugs) were developed initially by making minor modifications to the chemical structures of the naturally occurring agonists epinephrine (adrenaline) and norepinephrine (noradrenaline). Once a series of chemical compounds had been synthesized and tested, medicinal chemists began to understand which chemical substitutions would produce agonists and which would produce antagonists. Additionally, substitutions that would cause metabolic enzyme blockade and increase the gastrointestinal absorption or duration of action began to be understood. Three-dimensional molecular models of agonists and antagonists that fit the drug receptor allowed scientists to gain important information about the three-dimensional structure of the drug receptor site. By the 1960s SAR had been further refined by creating mathematical relationships between chemical structure and biological activity. This refinement, known as quantitative structure-activity relationship (QSAR), simplified the search for chemical structures that could activate or block various drug receptors.
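
      One classic form of such a mathematical relationship is the Hansch-type equation from the 1960s QSAR literature. It is shown here as a representative example, not as a formula quoted in this article:

```latex
\log\!\left(\frac{1}{C}\right) = k_1\,\pi + k_2\,\sigma + k_3\,E_s + k_4
```

Here C is the molar concentration of a compound producing a standard biological response, π is a substituent hydrophobicity parameter, σ is the Hammett electronic constant, E_s is a steric parameter, and the coefficients k_1 through k_4 are fitted by regression against the measured activities of a compound series. Searching for structures that maximize predicted activity then becomes a numerical problem rather than pure trial and error.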

Computer-aided design of drugs
      A further refinement of new drug design and production was provided by the process of computer-aided design (CAD). With the availability of powerful computers and sophisticated graphics software, it is possible for the medicinal chemist to design new molecules and evaluate their potential interaction with a receptor or an enzyme before they are synthesized. This means that the chemist may be able to synthesize and test only the most promising compounds, thus allowing potential new drugs to be synthesized more efficiently and cheaply.

Combinatorial chemistry
      Combinatorial chemistry was a development of the 1990s. It originated in the field of peptide chemistry but has since become an important tool of the medicinal chemist. Traditional organic synthesis is essentially a linear process with molecular building blocks being assembled in a series of individual steps. Part A of the new molecule is joined to part B to form part AB. After part AB is made, part C can be joined to it to make ABC. This step-wise construction is continued until the new molecule is complete. Using this approach, a medicinal chemist can, on average, synthesize about 25 new compounds per year. In combinatorial chemistry, one might start with five compounds (A1–A5). These five compounds would be reacted with building blocks B1–B5 and building blocks C1–C5. These reactions take place in parallel rather than in series, so that A1 would combine with B1, B2, B3, B4, and B5. Each one of these combinations would also combine with each of the C1–C5 building blocks, so that 125 compounds would be synthesized. Using robotic synthesis and combinatorial chemistry, hundreds of thousands of compounds can be synthesized in much less time than would have been required to synthesize a few compounds in the past.
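
      The parallel assembly described above is easy to sketch. The following toy example (the block labels are hypothetical; real building blocks are reactive chemical fragments) enumerates every A-B-C combination and confirms the count of 125:

```python
# Parallel combinatorial assembly: 5 A-blocks x 5 B-blocks x 5 C-blocks
# gives 5 * 5 * 5 = 125 products, versus one product per linear
# A -> AB -> ABC synthesis.
from itertools import product

a_blocks = [f"A{i}" for i in range(1, 6)]
b_blocks = [f"B{i}" for i in range(1, 6)]
c_blocks = [f"C{i}" for i in range(1, 6)]

library = ["-".join(parts) for parts in product(a_blocks, b_blocks, c_blocks)]
print(len(library))   # 125
print(library[:3])    # ['A1-B1-C1', 'A1-B1-C2', 'A1-B1-C3']
```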

Synthetic human proteins
      Another important milestone for medical science and for the pharmaceutical industry occurred in 1982, when regulatory and marketing approval for Humulin®, human insulin, was granted in the United Kingdom and the United States. This marketing approval was an important advancement because it represented the first time a clinically important, synthetic human protein had been made into a pharmaceutical product. Again, the venture was successful because of cooperative efforts between physicians and scientists working in research institutions, universities, hospitals, and the pharmaceutical industry.

      Human insulin is a small protein composed of 51 amino acids and has a molecular weight of 5,808 daltons (units of atomic mass). The amino acid sequence and chemical structure of insulin had been known for a number of years prior to the marketing of Humulin®. Indeed, the synthesis of sheep insulin had been reported in 1963 and that of human insulin in 1966. It took almost another 20 years to bring synthetic human insulin to market because a synthetic process capable of producing the quantities necessary to supply market needs had not been developed.

      In 1976 a new pharmaceutical firm, Genentech Inc., was formed. The goal of Genentech's founders was to use recombinant DNA technology in bacterial cells to produce human proteins such as insulin and growth hormone. Since the amino acid sequence and chemical structure of human insulin were known, the sequence of DNA that coded for the synthesis of insulin could be reproduced in the laboratory. The DNA sequence coding for insulin production was synthesized and incorporated into a laboratory strain of the bacterium Escherichia coli. In other words, genes made in a laboratory were designed to direct the synthesis of insulin in bacteria. Once the laboratory synthesis of insulin by bacteria was accomplished, scientists at Genentech worked with their counterparts at Eli Lilly & Co. to scale up the new synthetic process so that marketable quantities of human insulin could be made. Regulatory approval for marketing human insulin came just six years after Genentech was founded.
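
      The core idea, that a laboratory-designed DNA sequence specifies a protein's amino acid sequence, can be illustrated with a toy translation routine. The fragment and the abbreviated codon table below are illustrative only; they are not the insulin gene:

```python
# Translate a DNA coding strand codon by codon until a stop codon.
# Only a handful of codons are included; a real table has 64 entries.
CODON_TABLE = {"ATG": "Met", "TTT": "Phe", "GTT": "Val",
               "AAC": "Asn", "CAA": "Gln", "TAA": "STOP"}

def translate(dna: str) -> list[str]:
    peptide = []
    for i in range(0, len(dna) - 2, 3):
        residue = CODON_TABLE[dna[i:i + 3]]
        if residue == "STOP":
            break
        peptide.append(residue)
    return peptide

print(translate("ATGTTTGTTAACCAATAA"))  # ['Met', 'Phe', 'Val', 'Asn', 'Gln']
```

Designing the gene runs this mapping in reverse: starting from the known 51-residue amino acid sequence of insulin, chemists chose codons for each residue and synthesized the corresponding DNA.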

      In some ways, the production of human growth hormone by recombinant DNA technology, first approved for use in 1985, was more important than the synthesis of insulin. Prior to the availability of human insulin, most people with diabetes could be treated with the bovine or porcine insulin products that had been available for 50 years (see above Isolation of insulin). Unlike insulin, growth hormone differs in its effects from species to species, so before the synthesis of human growth hormone, the only source of the hormone suitable for humans was cadaver pituitaries. Today a number of recombinant preparations of human growth hormone and other human peptides and proteins are on the market.

Drug regulation and approval

Regulation by government agencies
      Concerns related to the efficacy and safety of drugs have caused most governments to develop regulatory agencies to oversee development and marketing of drug products and medical devices. Use of any drug carries with it some degree of risk of an adverse event. For most drugs the risk-to-benefit ratio is favourable; that is, the benefit derived from using the drug far outweighs the risk incurred from its use. However, there have been unfortunate circumstances in which drugs have caused considerable harm. The harm has come from drug products containing toxic impurities, from drugs with unrecognized severe adverse reactions, from adulterated drug products, and from fake or counterfeit drugs. Because of these issues, effective drug regulation is required to ensure the safety and efficacy of drugs for the general public.

Public influence on drug regulation
      The process of drug regulation has evolved over time. Laws regulating drug marketing and development, government regulatory agencies with oversight of drug development and use, drug evaluation boards, drug information centres, and quality control laboratories have become part of the cooperative venture that produces and develops drugs. In some countries drug laws omit or exempt certain areas of pharmaceutical activity from regulation. For example, some countries exempt herbal or homeopathic products from regulation, while in other countries very little regulation is imposed on drug importation. Over time, the scope of drug laws and the authority vested in regulatory agencies have gradually expanded. In some instances, the strengthening of drug laws has been the result of a drug-related catastrophe that prompted public demand for more restrictive legislation to provide greater protection for the public. One such example occurred in the 1960s with thalidomide, which was prescribed to treat morning sickness in pregnant women. Thalidomide had been on the market for several years before it was recognized as the cause of a rare birth defect, known as phocomelia, that had begun appearing in epidemic proportions. The reaction to the devastation caused by thalidomide was dramatic, especially because the drug was considered medically unnecessary.

      At other times the public has perceived that drug regulation and regulatory authorities have been too restrictive or too cautious in approving drugs for the market. This concern typically has been related to individuals with serious or life-threatening illnesses who might benefit from drugs that have been denied market approval or whose approval has been inordinately delayed because regulations are too strict. At times, governments have responded to these concerns by streamlining drug laws and regulations. Examples of types of drugs given expedited approval are cancer drugs and AIDS drugs. Regulatory measures that make rapid approval of new drugs paramount sometimes have led to marketing of drugs with more toxicity than the public finds acceptable. Thus, drug regulations can and probably will remain in a state of flux, becoming more lax when the public perceives a need for new drugs and more strict following a drug catastrophe.

Objectives and organization of drug regulatory agencies
      Effective regulation of drugs requires a variety of functions. Important functions include (1) evaluation of safety and efficacy data from animal and clinical trials, (2) licensing and inspection of manufacturing facilities and distribution channels to assure that drugs are not contaminated, (3) monitoring of adverse drug reactions for investigational and marketed drugs, and (4) quality control of drug promotion and advertising to assure that safety and efficacy claims are accurate. In some countries all functions surrounding drug regulation come under a single agency. In others, particularly those with a federal system of government, some drug regulatory authority is assumed by state or provincial governments.

      Around the world, financing of drug regulatory agencies varies. Many governments provide support for such agencies with revenue from general tax funds. The theory behind this type of financing is that the common good is served by effective regulations that provide for safe and effective medicines. In other countries the agencies are supported entirely by fees paid by the pharmaceutical firms seeking regulatory approval. In still other countries the work of drug regulatory agencies is supported by a mixture of direct government support and user fees. The World Health Organization (WHO) has developed international panels of experts in medicine, law, and pharmaceutical development that are responsible for recommending standards for national drug laws and regulations.

Drug approval processes
      Drug approval processes are designed to allow safe and effective drugs to be marketed. Drug regulatory agencies in various countries rely on premarketing scientific studies of the effects of drugs in animals and humans in order to determine whether new drugs have a favourable risk-to-benefit ratio. Although most countries require similar types of premarketing studies to be completed, differences in specific regulations and guidelines exist. Thus, pharmaceutical firms that wish to market their new drugs in many countries may face challenges created by the differing regulations and guidelines for premarketing studies. In order to simplify the approval process for multinational marketing of drugs, the WHO and many drug regulatory agencies have attempted to harmonize regulations in various parts of the world. Harmonization, which aims to make regulations and guidelines more uniform, theoretically can lower the price of new drugs by reducing the cost of development and regulatory approval. Because every new drug is somewhat different from preexisting ones, however, unforeseen safety or efficacy issues may arise during regulatory review, and some of these issues may prompt an individual regulatory agency to require additional safety or efficacy studies. Thus, agreements on harmonization of regulations and guidelines can be more complicated and difficult to achieve than might be expected.

      The following sections describe in general terms the steps required for regulatory approval of drugs in one country—the United States. Although the descriptions are based on the Food and Drug Administration (FDA) regulations and guidelines, these requirements are similar to those in many other countries.

Drug applications

The Investigational New Drug application
      Two important written documents are required from a pharmaceutical firm seeking regulatory approval from the U.S. FDA (Food and Drug Administration). The first is the Investigational New Drug (IND) application. The IND is required for approval to begin studies of a new drug in humans. Clinical trials for new drugs are conducted prior to marketing as part of the development process. The purpose of these trials is to determine if newly developed drugs are safe and effective in humans. Pharmaceutical companies provide selected physicians with developmental drugs to be studied in their patients. These physicians recruit patients, provide them with the study drug, evaluate the effect of the drug on their disease, and record observations and clinical data.

      There are three phases—designated Phase 1, Phase 2, and Phase 3—of human clinical studies required for drug approval and marketing. Phase 1 studies mark the first use of a new drug in humans. These studies are designed to determine the pharmacological and pharmacokinetic profile of the drug and to assess the adverse effects associated with increasing drug doses. Phase 1 studies provide important data for the design of scientifically sound Phase 2 and Phase 3 studies. Phase 1 studies generally enroll 20–200 subjects who either are healthy or are patients with the disease that the drug is intended to treat. Phase 2 studies are designed primarily to assess the efficacy of the drug in the disease to be treated, although some data on adverse events or toxicities may also be collected. Phase 2 studies usually enroll several hundred patients. Phase 3 studies enroll several hundred to several thousand patients and are designed to collect data concerning both adverse events and efficacy. When these data have been collected and analyzed, a judgment can be made about whether the drug should be marketed and whether specific restrictions should be placed on its use. An IND should contain information about the chemical makeup of the drug and the dosage form, summaries of animal pharmacology and toxicology studies, pharmacokinetic data, and information about any previous clinical investigations. Typically, Phase 1 protocols (descriptions of the trials to be conducted) are briefer and less detailed than Phase 2 and Phase 3 protocols.

      Prior to its regulatory approval, a drug is generally restricted to use in patients who are formally enrolled in a clinical trial. In some cases a drug that has not yet been approved for marketing can be made available to patients with a life-threatening disease for whom no satisfactory alternative treatment is available. If the patient is not enrolled in one of the clinical trials, the drug can be made available under what is called a Treatment IND. A Treatment IND, which has sometimes been called a compassionate use protocol, is subject to regulatory requirements very similar to those of a regular IND.

The New Drug Application
      The second important regulatory document required by the FDA is the New Drug Application (NDA). The NDA contains all of the information and data that the FDA requires for market approval of a drug. Depending on the intended use of the drug (one-time use or long-term use) and the risk associated with that use, INDs may be from tens to hundreds of pages long. In contrast, NDAs typically are much larger and much more detailed. In some instances they can represent stacks of documents up to several metres high. Basically, an NDA is a detailed and comprehensive report on what is known about the new drug under review. It contains technical sections on (1) chemistry, manufacturing, and dosage forms, (2) animal pharmacology and toxicology, (3) human pharmacokinetics and bioavailability, (4) comprehensive results of clinical trials, (5) statistics, and (6) microbiology (in the case of anti-infective or antiviral drugs).

      Another important NDA component is the proposed labeling for the new drug. The label of a prescription drug is actually a comprehensive summary of information made available to health care providers. It contains the claims that the pharmaceutical company wants to make for the efficacy and safety of the drug. As part of the review process, the company and the FDA negotiate the exact wording of the label because it is the document that determines what claims the company legally can make for use of the drug once it is marketed.

Safety testing in animals
      A number of safety tests are performed on animals, prior to clinical trials in humans, in order to select the most suitable lead chemical and dosage form for drug development. The safety tests can include studies of acute toxicity, subacute and chronic toxicity, carcinogenicity, reproductive and developmental toxicity, and mutagenicity.

Toxicity tests
      In acute toxicity studies, a single large or potentially toxic dose of the drug is administered to animals via the intended route of human administration, and the animals are observed for one to four weeks, depending on the drug. At the end of the observation period, organ and tissue toxicities are evaluated. Acute toxicity studies generally are required to be carried out in two mammalian species prior to beginning any Phase 1 (safety) study in humans. Subchronic toxicity studies (up to three months) and chronic toxicity studies (longer than three months) require daily drug administration and usually do not start until after Phase 1 studies are completed. This is because the drug may be withdrawn after Phase 1 testing and because data on the effect of the drug in humans may be important for the design of longer-duration animal studies. When these studies are required, they are conducted in two mammalian species and are designed to allow for detection of neurological, physiological, biochemical, and hematological abnormalities occurring during the course of the study. Organ and tissue toxicity and pathology are evaluated when the studies are terminated.

      The number and type of animal safety tests required vary with the intended duration of human use of the drug. If the drug is to be used for only a few days in humans, acute and subacute animal toxicity studies may be all that is required. If the human drug use is for six months or longer, animal toxicity studies of six months or more may be required before the drug is marketed. Carcinogenicity (potential to cause cancer) studies are generally required if humans will use the drug for longer than six months. They usually are conducted concurrently with Phase 3 (large-scale safety and efficacy) clinical trials but may begin earlier if there is reason to suspect that the drug is a carcinogen.

Teratogenicity and mutagenicity tests
      If a drug is intended for use during pregnancy or in women of childbearing potential, animal reproductive and developmental toxicity studies are indicated. These studies include tests that evaluate male and female fertility, embryonic and fetal death, and teratogenicity (induction of severe birth defects). Also evaluated are the integrity of the lactation process and the quality of the mother's care for her young.

      Genetic toxicity, or mutagenicity, studies have become an integral component of regulatory requirements. Since no single mutagenicity test can evaluate all types of genetic toxicity, two or three tests are usually performed. Typical mutagenicity tests include a bacterial point mutation test (the Ames test), a chromosomal aberrations test in mammalian cells in vitro, and an in vivo test in intact animals.

Biopharmaceutical studies

Pharmacokinetic investigation
      In addition to the animal toxicity studies outlined above, biopharmaceutical studies are required for all new drugs. The chemical makeup of the drug and the dosage form of the drug to be used in trials must be described. The stability of the drug in the dosage form and the ability of the dosage form to release the drug appropriately have to be evaluated. Bioavailability (how completely the drug is absorbed from its dosage form) and pharmacokinetic studies in animals and humans also have become important to include in a drug development plan. Pharmacokinetics is the study of the rates and extent of drug absorption, distribution within the body, metabolism, and excretion. Pharmacokinetic studies give investigators information about how often a drug should be taken to achieve adequate blood levels. The metabolism and excretion data can also provide clues about whether a new drug will interact with other drugs a patient may be taking. For example, if two drugs are inactivated (metabolized or excreted) via the same biological process, one or even both of the drugs might have its sojourn in the body prolonged, resulting in increased blood levels and increased toxicity. Conversely, some drugs induce the metabolism and shorten the body sojourn of other drugs, resulting in blood levels inadequate to produce the desired pharmacological effect.
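
      The quantitative reasoning behind these statements can be illustrated with a simple model. The sketch below assumes a one-compartment model with first-order elimination and repeated intravenous bolus dosing; the dose, dosing interval, and parameter values are hypothetical and chosen only for illustration.

        import math

        def concentration(t, dose_mg, vd_l, ke_per_h, tau_h):
            """Plasma concentration (mg/L) at time t under repeated IV bolus
            dosing: one-compartment model, first-order elimination, with
            superposition of all doses given at 0, tau, 2*tau, ..."""
            n_doses = int(t // tau_h) + 1
            return sum(
                (dose_mg / vd_l) * math.exp(-ke_per_h * (t - i * tau_h))
                for i in range(n_doses)
            )

        # Hypothetical drug: 100 mg every 8 h, volume of distribution 40 L,
        # elimination half-life 6 h.
        ke = math.log(2) / 6.0  # first-order elimination rate constant (1/h)
        for t in (1, 8, 24, 48):
            print(f"t = {t:>2} h: C = {concentration(t, 100, 40, ke, 8):.2f} mg/L")

      Under these assumptions the printed concentrations rise over the first several doses and then level off, the accumulation behaviour that dosing intervals are chosen to control.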

Dosage form development
      Drugs are rarely administered to a patient solely as a pure chemical entity. For clinical use they are almost always administered as a formulation designed to deliver the drug in a manner that is safe, effective, and acceptable to the patient. One of the most important objectives of dosage form design is to produce a product that will achieve a predictable and reliable therapeutic response. The dosage form must also be suitable for manufacture on a large scale with reproducible quality. The table (Administration of drugs) shows routes of drug administration and common dosage forms.

Tablets
      Tablets are by far the most common dosage form. Normally, they are intended for the oral or the sublingual routes of administration. They are made by compressing powdered drug along with various excipients in a tablet press. Excipients are more or less inert substances added to the powdered drug in order to (1) facilitate the tablet-making process, (2) bind the tablet together so it will not break apart during shipping and handling, (3) facilitate dissolution after the tablet has been consumed, (4) enhance appearance and patient acceptance, and (5) allow for identification. Frequently, the active ingredient makes up a relatively small percentage of the weight of a tablet. Tablets with two or three milligrams of active drug may weigh several hundred milligrams. Tablets for oral administration may be coated with inert substances such as wax. Uncoated tablets have a slight powdery appearance and feel at the tablet surface. Coatings usually produce a tablet with a smooth, shiny appearance and decrease the likelihood that the patient will taste the tablet contents when the tablet is in the mouth before swallowing. Enteric coated tablets have a coating that is designed not to dissolve in the acidic environment of the stomach but to pass through the stomach into the small intestine prior to the beginning of dissolution. Sublingual tablets generally do not have a coating and are designed so that they will dissolve when placed under the tongue.

      Tablets are traditionally referred to as pills. Prior to the widespread use of the machine-compressed tablet, pills were very popular products that usually were prepared by a pharmacist. To make a pill, powdered drug and excipients were mixed together with water or other liquid and a gumlike binding agent such as acacia or tragacanth. The mixture was made into a plastic mass and rolled into a tube. The tube was cut into small sections that were rolled to form spheres, thereby making pills. Pills fell into disfavour because they are more expensive to make than tablets or capsules and because the amount of drug released from pills varies more than from tablets or capsules.

Capsules
      Capsules are another common oral dosage form. Like tablets, capsules almost always contain inert ingredients to facilitate manufacture. There are two general types of capsules—hard gelatin capsules and soft gelatin capsules. Hard gelatin capsules are by far the most common type. They can be filled with powder, granules, or pellets. In some cases they are filled with a small capsule plus powder or a small tablet plus powder. Typically, the small internal capsule or tablet contains one or more of the active ingredients. Soft gelatin capsules may contain a liquid or a solid. Both hard and soft gelatin capsules are designed to mask unpleasant tastes.

Other solid dosage forms
      Other solid dosage forms include powders, lozenges, and suppositories. Powders are mixtures of active drug and excipients that usually are sold in the form of powder papers. The powder is contained inside a folded and sealed piece of special paper. Lozenges usually consist of a mixture of sugar and either gum or gelatin, which are compressed to form a solid mass. Lozenges are designed to release drug while slowly dissolving in the mouth. Suppositories are solid dosage forms designed for introduction into the rectum or vagina. Typically, they are made of substances that melt or dissolve at body temperature, thereby releasing the drug from its dosage form.

Liquid dosage forms
      Liquid dosage forms are either solutions or suspensions of active drug in a liquid such as water, alcohol, or other solvent. Since liquid dosage forms for oral use bring the drug and vehicle into contact with the mouth and tongue, they often contain various flavours and sweeteners to mask unpleasant tastes. They usually also require sterilization or the addition of preservatives to prevent contamination or degradation. Syrups are water-based solutions of drug containing high concentrations of sugar. They usually also contain added flavours and colours. Some syrups contain up to 85 percent sugar on a weight-to-volume basis. Elixirs are sweetened hydro-alcoholic (water and alcohol) liquids for oral use. Typically, alcohol and water are used as solvents when the drug will not dissolve in water alone. In addition to active drug, they usually contain flavouring and colouring agents to improve patient acceptance.

      Since some drugs will not dissolve in solvents suitable for medicinal use, they are made into suspensions. Suspensions consist of a finely divided solid dispersed in a water-based liquid. Like solutions and elixirs, suspensions often contain preservatives, sweeteners, flavours, and dyes to enhance patient acceptance. They frequently also contain some form of thickening or suspending agent to decrease the rate at which the suspended drug settles to the container bottom. Emulsions consist of one liquid suspended in another. Oil-in-water emulsions will mix readily with water-based liquids, while water-in-oil emulsions mix more easily with oils. Milk is a common example of an oil-in-water emulsion. In order to prevent the separation of the two liquids, most pharmaceutical emulsions contain a naturally occurring emulsifying agent such as cholesterol or tragacanth or a synthetic emulsifying agent such as a nonionic detergent. Antimicrobial agents may also be included in emulsions in order to prevent the growth of microorganisms in the aqueous phase. Emulsions are created using a wide variety of homogenizers, agitators, or sonicators.

Semisolid dosage forms
      Semisolid dosage forms include ointments and creams. Ointments are preparations for external use, intended for application to the skin. Typically, they have an oily or greasy consistency and can appear “stiff” as they are applied to the skin. Ointments contain drug that may act on the skin or be absorbed through the skin for systemic action. Many ointments are made from petroleum jelly. Like many other pharmaceutical preparations, they frequently contain preservatives and may also contain aromatic substances and dyes to enhance patient acceptance. Although there is generally no agreed-upon pharmaceutical definition for creams, they are very much like ointments in their use. Their composition is somewhat like that of ointments except that creams often have water-in-oil emulsions as the base of the formulation. When applied to the skin, creams feel soft and supple and spread easily.

Specialized dosage forms
      Specialized dosage forms of many types exist. Sprays are most often used to irrigate nasal passages or to introduce drugs into the nose. Most nasal sprays are intended for treatment of colds or respiratory tract allergies. They contain medications designed to relieve nasal congestion and to decrease nasal discharges. Aerosols are pressurized dosage forms that are expelled from their container upon activation of a release valve. Aerosol propellants typically are compressed, liquefied volatile gases. Other aerosol ingredients are either suspended or dissolved in the propellant. When the release valve is activated, the liquid is expelled into the air at atmospheric pressure. This causes the propellant to vaporize, leaving very finely subdivided liquid or solid particles dispersed in the vaporized propellant. Some aerosols are intended for delivery of substances such as local anesthetics, disinfectants, and spray-on bandages to the skin. Metered-dose aerosols typically are used to deliver calibrated doses of drug to the respiratory tract. Usually, the metered-dose aerosol or inhaler is placed in the mouth for use. When the release valve is activated, a predetermined dose of drug is expelled. The patient inhales the expelled drug, delivering it to the bronchial airways. Patches are dosage forms intended to deliver drug across the skin and are placed on the skin much like a self-adhesive bandage. The patch is worn for a predetermined length of time in order to deliver the correct amount of drug to the systemic circulation.

Modified-release dosage forms
      Modified-release dosage forms have been developed to deliver drug to the part of the body where it will be absorbed, to simplify dosing schedules, and to assure that concentration of drug is maintained over an appropriate time interval. One type of modified-release dosage form is the enteric coated tablet. Enteric coating prevents irritation of the stomach by the drug and protects the drug from stomach acid. Most modified-release dosage forms are tablets and capsules designed to deliver drug to the circulating blood over an extended time period. A tablet that releases its drug contents immediately may need to be taken as many as four or six times a day to produce the desired blood-concentration level and therapeutic effect. Such a drug might be formulated into an extended-release dosage form so that the modified tablet or capsule need be taken only once or twice a day. Repeat-action tablets are one type of extended-release dosage form. They usually contain two single doses of medication, one for immediate release and one for delayed release. Typically, the immediately released drug comes from the exterior portion of the tablet, with the delayed release coming from the interior portion. Essentially, there is a tablet within a tablet, with the interior tablet having a coating that delays release of its contents for a predetermined time.

      An additional type of extended-release dosage form is accomplished by incorporating coated beads or granules into tablets or capsules. Drug is distributed onto or into the beads. Some of the granules are uncoated for immediate release while others receive varying coats of lipid, which delays release of the drug. Another variation of the coated bead approach is to granulate the drug and then microencapsulate some of the granules with gelatin or a synthetic polymer. Microencapsulated granules can be incorporated into a tablet or capsule with the release rate for the drug being determined by the thickness of the coating. Embedding drug into a slowly eroding hydrophilic matrix can also allow for sustained release. As the tablet matrix hydrates in the intestine, it erodes and the drug is slowly released. Another type of sustained release is produced by embedding drug into an inert plastic matrix. To accomplish this, drug is mixed with a polymer powder that forms a solid matrix when the tablet is compressed by a tablet machine. The drug leaches out of the matrix as the largely intact tablet passes through the gastrointestinal tract. Drug may be adsorbed onto ion exchange resins (ion-exchange resin) in order to bring about sustained release. For example, a cationic, or positively charged, drug can be bound to an anionic, or negatively charged, resin. The resin can be incorporated into tablets, capsules, or liquids. As the resin passes through the small intestine, the drug is released slowly.
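
      The rate at which drug leaches from such an inert or eroding matrix is classically approximated by the Higuchi model, in which the cumulative amount released grows with the square root of time. A minimal sketch, using a hypothetical release constant and valid only for the early portion of release:

        import math

        def higuchi_fraction_released(t_h, k_per_sqrt_h):
            # Fraction of drug released at time t under the classical
            # Higuchi square-root-of-time approximation.
            return min(k_per_sqrt_h * math.sqrt(t_h), 1.0)

        # Hypothetical matrix tablet with release constant k = 0.25 h**-0.5
        for t in (0.25, 1, 4, 9, 16):
            print(f"t = {t:>5} h: {higuchi_fraction_released(t, 0.25):.0%} released")

      Doubling the elapsed time thus increases the released fraction by a factor of about 1.4 rather than 2, which is why matrix systems stretch release over many hours.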

Parenteral dosage forms
      Parenteral dosage forms are intended for administration as an injection or infusion. Common injection types are intravenous (into a vein), subcutaneous (under the skin), and intramuscular (into muscle). Infusions typically are given by intravenous route. Parenteral dosage forms may be solutions, suspensions, or emulsions, but they must be sterile. If they are to be administered intravenously, they must readily mix with blood.

Radiopharmaceuticals
      Radioactive dosage forms, or radiopharmaceuticals, are substances that contain one or more radioactive atoms and are used for diagnosis or treatment of disease. In some cases the radioactive atoms are incorporated into a larger molecule. The larger molecule helps to direct the radioactive atoms to the organ or tissue of interest. In other cases the diagnostic or therapeutic molecule is the radioactive atom itself. For example, radioactive iodine, such as iodine-131, can be used in thyroid studies, and radioactive gases, such as xenon-133, can be used for lung function studies. However, more often than not, the radioactive atom allows detection or imaging of the tissue of interest, and the physiological or pharmacological properties of the larger molecule direct the radiopharmaceutical to the target tissue. For diagnostic purposes, radiopharmaceuticals are administered in amounts as small as possible so as not to perturb the biological process being evaluated in the diagnosis. For therapeutic purposes, such as treatment of various types of cancer, it is the radiation produced by the radioactive atom that kills the tumour cells. As is the case for many diagnostic agents, the pharmacological effect produced by the larger molecule, into which the radioactive atom is incorporated, is of little or no consequence for the therapeutic effect of the radiopharmaceutical. Many authorities believe that monoclonal antibodies will become powerful tools for directing radiopharmaceuticals to specific tumours, thereby revolutionizing the treatment of cancer.

Obstacles in drug development

Adverse reactions
      Adverse drug events are unanticipated or unwanted effects of drugs. In general, adverse drug reactions are of two types, dose-dependent and dose-independent. When any drug is administered in sufficiently high dose, many individuals will experience a dose-dependent drug reaction. For example, if a person being treated for high blood pressure (hypertension) accidentally takes a drug dose severalfold higher than prescribed, this person will probably experience low blood pressure (hypotension), which could result in light-headedness and fainting. Other dose-dependent drug reactions occur because of biological variability. For a variety of reasons, including heredity, coexisting diseases, and age, different individuals can require different doses of a drug to produce the same therapeutic effect. A therapeutic dose for one individual might be a toxic dose in another. Many drugs are metabolized and inactivated in the liver, whereas others are excreted by the kidney. In some patients with liver or kidney disease, lower doses of drugs may be required to produce appropriate therapeutic effects. Elderly individuals often develop dose-related adverse effects in response to doses that are well tolerated in younger individuals. This is because of age-related changes in body composition and organ function that alter the metabolism of and response to drugs.
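
      How reduced kidney function translates into a lower dose can be sketched quantitatively. The example below estimates creatinine clearance with the widely used Cockcroft-Gault formula and then scales a maintenance dose in simple proportion to that clearance; the drug and its dose are hypothetical, and real adjustments follow the approved product labeling rather than this simple linear assumption.

        def creatinine_clearance(age_yr, weight_kg, scr_mg_dl, female):
            """Cockcroft-Gault estimate of creatinine clearance (mL/min)."""
            crcl = (140 - age_yr) * weight_kg / (72 * scr_mg_dl)
            return crcl * 0.85 if female else crcl

        def adjusted_dose(usual_dose_mg, crcl_ml_min, normal_crcl=120.0):
            # Illustrative linear reduction for a hypothetical drug
            # eliminated entirely by the kidney.
            return usual_dose_mg * min(crcl_ml_min / normal_crcl, 1.0)

        crcl = creatinine_clearance(78, 60, 1.8, female=True)
        print(f"Estimated CrCl: {crcl:.0f} mL/min")              # ~24 mL/min
        print(f"Adjusted dose:  {adjusted_dose(400, crcl):.0f} mg")  # ~81 mg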

      The fetus is also susceptible to the toxic effects of drugs that cross the placental barrier from the pregnant mother. Body organs begin to develop during the first three months of pregnancy (first trimester). Some drugs will cause teratogenicity in the fetus if they are administered to the mother during this period. Drugs given to the mother during the second and third trimester can also affect the fetus by altering the function of normally formed organs or tissues. Fortunately, very few drugs cause teratogenicity in humans, and many of those that do are detected in animal teratology studies during drug development. However, animal teratogenicity screens are not perfect predictors of all human effects, so there remains some potential of drug-induced birth defects (congenital disorder).

      Dose-independent adverse reactions are less common than dose-dependent ones. They are generally caused by allergic reactions to the drug or, in some cases, to other ingredients present in the dosage form. They occur in patients who were sensitized by a previous exposure to the drug or to another chemical with cross-antigenicity to the drug. Dose-independent adverse reactions can range from mild rhinitis or dermatitis to life-threatening respiratory difficulties, blood abnormalities, or liver dysfunction.

Postmarketing adverse drug events
      Although there may have been several thousand patients enrolled in Phase 1, 2, and 3 clinical trials, some adverse drug events may not be identified before the drug is marketed. For example, if 3,000 patients participated in the clinical trials and an unforeseen adverse event occurs only once in 10,000 patients, it is unlikely that the adverse event will have been identified during the clinical trials. Thus, postmarketing adverse-event data are collected and evaluated by the FDA. The pharmaceutical company is responsible for reporting adverse drug events to the FDA on a regularly scheduled basis. There have been many examples of serious adverse drug events that were not identified until the drug was marketed and available to the population as a whole.
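
      The arithmetic behind the 3,000-patient example is straightforward. If each patient independently has a 1-in-10,000 chance of the event, a 3,000-patient trial has only about a 26 percent chance of observing it even once; by the common "rule of three," roughly 30,000 patients would be needed to reach about 95 percent. A short calculation using the figures from the example above:

        def prob_at_least_one(event_rate, n_patients):
            # Probability that at least one of n independent patients
            # experiences an adverse event occurring at the given rate.
            return 1.0 - (1.0 - event_rate) ** n_patients

        print(f"{prob_at_least_one(1 / 10_000, 3_000):.0%}")    # ~26%
        print(f"{prob_at_least_one(1 / 10_000, 30_000):.0%}")   # ~95%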

      Identifying adverse drug events is not always easy or straightforward. For example, the FDA may receive a few reports of fever or hepatitis (liver inflammation) associated with use of a new drug. Both fever and hepatitis can occur in the absence of any drug. If either occurs at the same time someone is taking a new drug, it is not always easy or even possible to say whether the event was caused by the drug. There are established procedures that can help determine whether the adverse event is related in a cause-and-effect manner with the drug use. If one stops taking the drug and the adverse event disappears, this suggests the event may be related to use of the drug. If the adverse event reappears when the drug is re-administered, this provides even more evidence that the two events are related. However, for serious adverse events, it is often not advisable to reintroduce a drug suspected of causing the event. Because of difficulties in associating adverse events with a causative agent, these drug-induced adverse events sometimes go unrecognized for a long period of time. There have been instances when pharmaceutical manufacturers and the FDA have been criticized for failing to warn the public about an adverse drug event early enough. In some circumstances the manufacturer and the FDA had suspected that an adverse event might be caused by a drug, but they did not have sufficient data to connect the drug and the event with reasonable accuracy. This issue can be particularly difficult if the drug in question helps severely ill patients, since premature or incorrect reporting of an adverse event may result in a drug being withheld from patients who are in great need of treatment.

Drug interactions
      Drug interactions occur when one drug alters the pharmacological effect of another drug. The pharmacological effect of one or both drugs may be increased or decreased, or a new and unanticipated adverse effect may be produced. Drug interactions may result from pharmacokinetic interactions (in absorption, distribution, metabolism, or excretion) or from interactions at drug receptors.

      Interactions during drug absorption may lower the amount of drug absorbed and decrease therapeutic effectiveness. One such interaction occurs when the antibiotic tetracycline is taken along with substances such as milk or antacids, which contain calcium, magnesium, or aluminum ions. These metal ions bind with tetracycline and produce an insoluble product that is very poorly absorbed from the gastrointestinal tract. In addition, drug interactions may affect drug distribution, which is determined largely by protein binding. Many drugs are bound to proteins in the blood. If two drugs bind to the same or adjacent sites on the proteins, they can alter the distribution of each other within the body.

      Drug interactions during metabolism can alter the activation or inactivation of many drugs. One drug can decrease the metabolism of a second drug by inhibiting metabolic enzymes. If metabolism of a drug is inhibited, the drug will remain longer in the body, so that its concentration will increase if it continues to be taken. Conversely, some drugs can increase the formation of the enzymes that metabolize other drugs; increasing the metabolism of a drug decreases its concentration in the body and thus its therapeutic effect. Drugs can also interact by binding to the same receptor. Two agonists or two antagonists would intensify each other's actions, whereas an agonist and an antagonist would tend to diminish each other's pharmacological effects. In some interactions, drugs may produce biochemical changes that alter sensitivity to toxicities produced by other drugs. For example, thiazide diuretics can cause a gradual decrease in body potassium, which in turn may increase the toxicity of cardiac drugs such as digitalis. Finally, in the case of drugs excreted by the kidney, one drug may alter kidney function in such a manner that the excretion of another drug is increased or decreased.
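
      The effect of metabolic inhibition on blood levels follows directly from the standard steady-state relation Css,avg = F x Dose / (CL x tau), where F is bioavailability, CL is clearance, and tau is the dosing interval. A minimal sketch with hypothetical numbers shows that halving clearance doubles the steady-state concentration at an unchanged dose:

        def css_avg(dose_mg, tau_h, clearance_l_per_h, bioavailability=1.0):
            """Average steady-state plasma concentration (mg/L):
            Css,avg = F * dose / (CL * tau)."""
            return bioavailability * dose_mg / (clearance_l_per_h * tau_h)

        # Hypothetical drug: 200 mg every 12 h, normal clearance 10 L/h.
        print(f"Css, normal:    {css_avg(200, 12, 10.0):.2f} mg/L")  # 1.67
        # A second drug inhibits the metabolizing enzyme, halving clearance.
        print(f"Css, inhibited: {css_avg(200, 12, 5.0):.2f} mg/L")   # 3.33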

      While it is important to recognize that drug interactions can cause many adverse effects, it is also important to point out that a number of drug interactions are therapeutically beneficial. For example, thiazide diuretics (which cause potassium loss) can interact with other diuretics that cause potassium retention in such a way that the combination has no significant impact on body potassium. Cancer chemotherapeutic agents are often given in combination because cellular interactions (such as inhibiting cell replication and promoting apoptosis) among the drugs cause more cancer cell death. Antihypertensive drugs are often given in combination because some of the side effects produced by one drug are overcome by the actions of the other. These are just a few of the many examples of beneficial drug interactions.

Drug patents
      Most governments grant patents to pharmaceutical firms. A patent allows the firm to be the only company to market the drug in the country issuing the patent. During the life of the patent, the patented drug has no direct market competition, which allows the pharmaceutical company to charge higher prices for the product so that it can recover the cost of developing the drug. Virtually all drugs have brand names created by the companies that develop them; all drugs also have generic names. After the patent has expired, other companies may market the drug under its generic name or under another brand name, and the price of the formerly patented drug usually decreases because of competition from companies that begin marketing generic versions. The cost of developing a generic version of a drug for market is significantly less than the cost of developing the patented drug, since many of the studies required for first regulatory approval of a drug are not required for marketing approval of subsequent generic versions. Essentially, the only requirement is to demonstrate that the new version is bioequivalent to the already approved drug. Bioequivalent drug products have the same rate and extent of absorption and produce the same blood concentration of drug when given in the same dose and the same dosage form.
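
      In practice, "same rate and extent of absorption" is assessed by comparing the peak blood concentration (Cmax, reflecting rate) and the area under the concentration-time curve (AUC, reflecting extent) of the generic and original products; regulators such as the FDA generally require the ratio of these measures, with statistical confidence bounds, to fall within 80 to 125 percent. The sketch below computes both measures from hypothetical sampled blood-concentration profiles:

        def auc_trapezoid(times_h, conc_mg_l):
            """Area under the concentration-time curve (mg*h/L) by the
            linear trapezoidal rule -- the 'extent of absorption'."""
            points = list(zip(times_h, conc_mg_l))
            return sum(
                (t2 - t1) * (c1 + c2) / 2.0
                for (t1, c1), (t2, c2) in zip(points, points[1:])
            )

        # Hypothetical profiles for a brand-name and a generic product.
        t = [0, 0.5, 1, 2, 4, 8, 12]
        brand = [0.0, 1.8, 2.6, 2.2, 1.4, 0.6, 0.2]
        generic = [0.0, 1.7, 2.5, 2.3, 1.4, 0.6, 0.2]

        print(f"AUC ratio:  {auc_trapezoid(t, generic) / auc_trapezoid(t, brand):.2f}")
        print(f"Cmax ratio: {max(generic) / max(brand):.2f}")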

John W. Dailey

Additional Reading
Albert S. Lyons and R. Joseph Petrucelli II, Medicine: An Illustrated History (1978, reprinted 1987), provides a historical account of important developments in medicine and pharmacy through the 20th century. John C. Krantz, Jr., Historical Medical Classics Involving New Drugs (1974), presents a series of short stories about important drug developments with emphasis on the individuals primarily responsible for those developments. Jordan Goodman and Vivien Walsh, The Story of Taxol: Nature and Politics in the Pursuit of an Anti-cancer Drug (2001), describes how taxol was discovered, developed, manufactured, and marketed. Ramakrishna Seethala and Prabhavathi B. Fernandes (eds.), Handbook of Drug Screening (2001), provides details concerning the drug screening processes used by the pharmaceutical industry. Richard A. Guarino (ed.), New Drug Approval Process: The Global Challenge, 3rd ed. (2000), describes how to develop new drugs and obtain regulatory approval in global markets. Michael E. Aulton (ed.), Pharmaceutics: The Science of Dosage Form Design, 2nd ed. (2002), provides a detailed description of dosage forms and how they are manufactured. Sauwakon Ratanawijitrasin and Eshetu Wondemagegnehu, Effective Drug Regulation: A Multicountry Study (2002), compares and summarizes drug regulation in representative countries around the world.
