

Precautions Taken Due to Flu Epidemic

Smithsonian Archives - History Div
Smithsonian Institution Archives, Record Unit 157, Smithsonian Institution, Buildings Management Department, Records, 1881-1973, Box 54, Folder: General, 1881-1945; Hygienic and Sanitary Conditions

In response to the influenza epidemic raging across the globe, H. S. Mustard, the Medical Officer in Charge, District of Columbia Sanitary Zone, Bureau of Public Health Service, Treasury Department, issues an edict requiring stringent sanitary measures in the National Museum. The buildings are to be thoroughly ventilated before employees arrive each morning. Twice each day, at noon and 2:30 pm, all employees are to take a recess and remain outdoors for fifteen minutes. The buildings are to be thoroughly ventilated while they are outside. All telephone transmitters must be disinfected twice each day. Gauze masks, covering the nose and mouth, are to be worn by all employees who have direct contact with the public. The order was cancelled on November 13, 1918.

Collecting an Epidemic: The AIDS Memorial Quilt

National Museum of American History

How Advertising Shaped the First Opioid Epidemic

Smithsonian Magazine

When historians trace back the roots of today’s opioid epidemic, they often find themselves returning to the wave of addiction that swept the U.S. in the late 19th century. That was when physicians first got their hands on morphine: a truly effective treatment for pain, delivered first by tablet and then by the newly invented hypodermic syringe. With no criminal regulations on morphine, opium or heroin, many of these drugs became the "secret ingredient" in readily available, dubiously effective medicines.

In the 19th century, after all, there was no Food and Drug Administration (FDA) to regulate the advertising claims of health products. In such a climate, a popular so-called “patent medicine” market flourished. Manufacturers of these nostrums often made misleading claims and kept their full ingredients list and formulas proprietary, though we now know they often contained cocaine, opium, morphine, alcohol and other intoxicants or toxins.

Products like heroin cough drops and cocaine-laced toothache medicine were sold openly and freely over the counter, using colorful advertisements that can be downright shocking to modern eyes. Take this 1885 print ad for Mrs. Winslow’s Soothing Syrup for Teething Children, for instance, showing a mother and her two children looking suspiciously beatific. The morphine content may have helped.

1885 advertisement for Mrs. Winslow's Soothing Syrup, a product for teething children that contained morphine. (NIH National Library of Medicine)

19th-century advertisement published in Mumbles Railway Publishing. (NIH National Library of Medicine)

Yet while it’s easy to blame patent medicines and American negligence for the start of the first opioid epidemic, the real story is more complicated. First, it would be a mistake to assume that Victorian-era Americans were just hunky dory with giving infants morphine syrup. The problem was, they just didn’t know. It took the work of muckraking journalists such as Samuel Hopkins Adams, whose exposé series “The Great American Fraud” appeared in Collier’s from 1905 to 1906, to pull back the curtain.

But more than that, widespread opiate use in Victorian America didn’t start with the patent medicines. It started with doctors.

The Origins of Addiction

Patent medicines typically contained relatively small quantities of morphine and other drugs, says David Herzberg, a professor of history at SUNY-University at Buffalo. “It’s pretty well recognized that none of those products produced any addiction,” says Herzberg, who is currently writing a history of legal narcotics in America.

Until the Harrison Narcotics Act of 1914, there were no federal laws regulating drugs such as morphine or cocaine. Moreover, even in those states that had regulations on the sale of narcotics beginning in the 1880s, Herzberg notes that “laws were not part of the criminal code, instead they were part of medical/pharmacy regulations.”

The laws that existed weren't well-enforced. Unlike today, a person addicted to morphine could take the same “tattered old prescription” back to a compliant druggist again and again for a refill, says David Courtwright, a historian of drug use and policy at the University of North Florida.

And for certain ailments, patent medicines could be highly effective, he adds. “Quite apart from the placebo effect, a patent medicine might contain a drug like opium,” says Courtwright, whose book Dark Paradise: A History of Opiate Addiction in America, provides much of the original scholarship in this area. “If buyers took a spoonful because they had, say, a case of the runs, the medicine probably worked.” (After all, he points out, “opium is a constipating agent.”)

Patent medicines may not have been as safe as we would demand today or live up to claims of panacea, but when it came to coughs and diarrhea, they probably got the job done. “Those drugs are really famous, and they do speak to a time where markets were a little bit out of control,” Herzberg says. "But the vast majority of addiction during their heyday was caused by physicians.”

Handbills and pamphlets advertising glyco-heroin, 1900-1920, from the College of Physicians of Philadelphia's collection of medical trade ephemera. (Historical Medical Library, College of Physicians of Philadelphia)

Marketing to Doctors

For 19th century physicians, cures were hard to come by. But beginning in 1805, they were handed a way to reliably make patients feel better. That’s the year German pharmacist Friedrich Sertürner isolated morphine from opium, the first “opiate” (the term opioid once referred to purely synthetic morphine-like drugs, Courtwright notes, before becoming a catchall covering even those drugs derived from opium).

Delivered by tablet, topically and, by mid-century, through the newly invented hypodermic syringe, morphine quickly made itself indispensable. Widespread use by soldiers during the Civil War also helped trigger the epidemic, as Erick Trickey reports in Smithsonian.com. By the 1870s, morphine became something of “a magic wand [doctors] could wave to make painful symptoms temporarily go away,” says Courtwright.

Doctors used morphine liberally to treat everything from the pain of war wounds to menstrual cramps. “It’s clear that that was the primary driver of the epidemic,” Courtwright says. And 19th century surveys Courtwright studied showed most opiate addicts to be female, white, middle-aged, and of “respectable social background”—in other words, precisely the kind of people who might seek out physicians with the latest tools.

Industry was quick to make sure physicians knew about the latest tools. Ads for morphine tablets ran in medical trade journals, Courtwright says, and, in a maneuver with echoes today, industry sales people distributed pamphlets to physicians. The College of Physicians of Philadelphia Historical Medical Library has a collection of such “medical trade ephemera” that includes a 1910 pamphlet from The Bayer Company titled, “The Substitute for the Opiates.”

The substitute? Heroin hydrochloride, at the time a new drug initially believed to be less addictive than morphine. Pamphlets from the Antikamnia Chemical Company, circa 1895, show an easy cheat-sheet catalog of the company’s wares, from quinine tablets to codeine and heroin tablets.

(College of Physicians of Philadelphia's Historical Medical Library)

Physicians and pharmacists were the key drivers in increasing America's per capita consumption of drugs like morphine by threefold in the 1870s and 80s, Courtwright writes in a 2015 paper for the New England Journal of Medicine. But it was also physicians and pharmacists who ultimately helped bring the crisis back under control.

In 1889, Boston physician James Adams estimated that about 150,000 Americans were "medical addicts": those addicted through morphine or some other prescribed opiate rather than through recreational use such as smoking opium. Physicians like Adams began encouraging their colleagues to prescribe “newer, non-opiate analgesics,” drugs that did not lead to depression, constipation and addiction.

“By 1900, doctors had been thoroughly warned and younger, more recently trained doctors were creating fewer addicts than those trained in the mid-nineteenth century,” writes Courtwright.

This was a conversation between doctors, and between doctors and industry. Unlike today, drug makers did not market directly to the public and took pride in that contrast with the patent medicine manufacturers, Herzberg says. “They called themselves the ethical drug industry and they would only advertise to physicians.”

But that would begin to change in the early 20th century, driven in part by a backlash to the marketing efforts of the 19th century patent medicine peddlers.

"San Diego lynx bares its fangs vigorously when zoo veterinarian is near cage, vet says it acts this way because it fears his hypodermics," reads the first photo caption for this Librium advertisement. "Tranquil as a tabby," says the second. (LIFE Magazine)

Marketing to the Masses

In 1906, reporting like Samuel Hopkins Adams’ helped drum up support for the Pure Food and Drug Act. That gave rise to what would become the Food and Drug Administration, as well as the notion that food and drug products should be labeled with their ingredients so consumers could make reasoned choices.

That idea shapes federal policy right up until today, says Jeremy Greene, a colleague of Herzberg’s and a professor of the history of medicine at Johns Hopkins University School of Medicine: “That path-dependent story is part of the reason why we are one of the only countries in the world that allows direct-to-consumer advertising," he says.

At the same time, in the 1950s and 60s, pharmaceutical promotion became more creative, coevolving with the new regulatory landscape, according to Herzberg. As regulators have set out the game, he says, “Pharma has regularly figured out how to play that game in ways that benefit them.”

Though the tradition of eschewing direct marketing to the public continued, advertising in medical journals increased. So, too, did more unorthodox methods. Companies staged attention-grabbing gimmicks, such as Carter Products commissioning Salvador Dali to make a sculpture promoting its tranquilizer, Miltown, for a conference. Competitor Roche Pharmaceuticals invited reporters to watch as its tranquilizer Librium was used to sedate a wild lynx.

Alternatively, some began taking their messaging straight to the press.

“You would feed one of your friendly journalists the most outlandishly hyped-up promise of what your drug could do,” Greene says. “Then there is no peer review. There is no one checking to see if it’s true; it’s journalism!” In their article, Greene and Herzberg detail how ostensibly independent freelance science journalists were actually on the industry payroll, penning stories about new wonder drugs for popular magazines long before native advertising became a thing.

One prolific writer, Donald Cooley, wrote articles with headlines such as “Will Wonder Drugs Never Cease!” for magazines like Better Homes and Gardens and Cosmopolitan. “Don’t confuse the new drugs with sedatives, sleeping pills, barbiturates or a cure,” Cooley wrote in an article titled “The New Nerve Pills and Your Health.” “Do realize they help the average person relax.”

As Herzberg and Greene documented in a 2010 article in the American Journal of Public Health, Cooley was actually one of a stable of writers commissioned by the Medical and Pharmaceutical Information Bureau, a public relations firm, working for the industry. In a discovery Herzberg plans to detail in an upcoming book, it turns out there is “a rich history of companies knocking at the door, trying to claim that new narcotics are in fact non-addictive” and running advertisements in medical trade journals that get swatted down by federal authorities.

A 1932 ad in the Montgomery Advertiser, for instance, teases a new “pain relieving drug, five times as potent as morphine, as harmless as water and with no habit forming qualities.” This compound, “di-hydro-morphinone-hydrochlorid,” is better known by the brand name Dilaudid, and is most definitely habit forming, according to Dr. Caleb Alexander, co-director of the Center for Drug Safety and Effectiveness at Johns Hopkins.

And while it’s not clear if the manufacturer truly believed it was harmless, Alexander says it illustrates the danger credulity presents when it comes to drug development. “If it sounds too good to be true, it probably is,” he says. “It is this sort of thinking, decades later, that has driven the epidemic."

A selection of contemporary ads for painkillers.

It wasn’t until 1995, when Purdue Pharma successfully introduced OxyContin, that one of these attempts succeeded, says Herzberg. “OxyContin passed because it was claimed to be a new, less-addictive type of drug, but the substance itself had been swatted down repeatedly by authorities since the 1940s,” he says. OxyContin is simply oxycodone, developed in 1917, in a time-release formulation Purdue argued allowed a single dose to last 12 hours, mitigating the potential for addiction.

Ads targeting physicians bore the tagline, “Remember, effective relief just takes two.”

“If OxyContin had been proposed as a drug in 1957, authorities would have laughed and said no,” Herzberg says.

Captivating the Consumer

In 1997, the FDA changed its advertising guidelines to open the door to direct-to-consumer marketing of drugs by the pharmaceutical industry. There were a number of reasons for this reversal of more than a century of practice, Greene and Herzberg say, from the ongoing ripples of the Reagan-era wave of deregulation, to the advent of the “blockbuster” pharmaceutical, to advocacy by AIDS patients rights groups.

The consequences were profound: a surge of industry spending on print and television advertising describing non-opioid drugs to the public that hit a peak of $3.3 billion in 2006. And while ads for opioid drugs were typically not shown on television, Greene says the cultural and political shifts that made direct-to-consumer advertising possible also changed the reception to the persistent pushing of opioids by industry.

Once again, it was not the public but physicians who were the targets of opioid marketing, which was often quite aggressive. The advertising campaign for OxyContin, for instance, was in many ways unprecedented.

Purdue Pharma provided physicians with starter coupons that gave patients a free seven- to 30-day supply of the drug. The company's sales force—which more than doubled in size from 1996 to 2000—handed doctors OxyContin-branded swag including fishing hats and plush toys. A music CD was distributed with the title “Get in the Swing with OxyContin.” Prescriptions for OxyContin for non-cancer related pain boomed from 670,000 written in 1997 to 6.2 million in 2002.

But even this aggressive marketing campaign was in many ways just the smoke. The real fire, Alexander argues, was a behind-the-scenes effort to establish a more lax attitude toward prescribing opioid medications generally, one which made regulators and physicians alike more accepting of OxyContin.

“When I was in residency training, we were taught that one needn’t worry about the addictive potential of opioids if a patient had true pain,” he says. Physicians were cultivated to overestimate the effectiveness of opioids for treating chronic, non-cancer pain, while underestimating the risks, and Alexander argues this was no accident.

Purdue Pharma funded more than 20,000 educational programs designed to promote the use of opioids for chronic pain other than cancer, and provided financial support for groups such as the American Pain Society. That society, in turn, launched a campaign calling pain “the fifth vital sign,” which helped contribute to the perception that there was a medical consensus that opioids were under-, not over-prescribed.

.....

Are there lessons that can be drawn from all this? Herzberg thinks so, starting with the understanding that “gray area” marketing is more problematic than open advertising. People complain about direct-to-consumer advertising, but if there must be drug marketing, “I say keep those ads and get rid of all the rest," he says, "because at least those ads have to tell the truth, at least so far as we can establish what that is.”

Even better, Herzberg says, would be to ban the marketing of controlled narcotics, stimulants and sedatives altogether. “This could be done administratively with existing drug laws, I believe, based on the DEA’s power to license the manufacturers of controlled substances.” The point, he says, would not be to restrict access to such medications for those who need them, but to subtract “an evangelical effort to expand their use.”

Another lesson from history, Courtwright says, is that physicians can be retrained. If physicians in the late 19th century learned to be judicious with morphine, physicians today can relearn that lesson with the wide array of opioids now available.

That won’t fix everything, he notes, especially given the vast black market that did not exist at the turn of the previous century, but it’s a proven start. As Courtwright puts it: Addiction is a highway with a lot of on-ramps, and prescription opioids are one of them. If we remove the billboards advertising the exit, maybe we can reduce, if not eliminate the number of travelers.

“That’s how things work in public health,” he says. “Reduction is the name of the game.”

The Science Behind Florida’s Sinkhole Epidemic

Smithsonian Magazine

There are many good reasons why The Villages is known as “Disney for Seniors.”

The largest retirement community in the world, The Villages is also one of America’s safest and most leisurely places to live. Sumter County, populated almost entirely by Villagers, is 62nd among 67 counties in Florida for violent crime—likely because the median age is 66.6, the oldest of any U.S. county. The ubiquity of gates, guard booths and mandatory visitor I.D. cards contributes to the low crime rate. Vehicular deaths are very low, which makes sense given that Villagers commute in golf carts more than cars. The Villages is also located in the safest area of Florida for hurricanes.

But Villagers are increasingly fearful of a growing, surreal threat: the ground suddenly opening up and swallowing them whole.

“Everybody is worried,” a 10-year resident of the Village of Calumet Grove told me this March, pointing to a saucer-sized hole at his curb where sinkhole specialists drilled to check for weak spots. A month earlier, in mid-February, seven sinkholes opened across the street and into a golf course, forming a zig-zag crack across the facade of one house and forcing the residents of four homes to evacuate. One home is now condemned. A town hall that week attracted five times more Villagers than usual. “It’s not a good time to sell,” the elderly neighbor says, with a weary laugh. (He asked me not to use his name.)

In contrast to its otherwise serene status, The Villages is a hotbed of sinkholes. They occur more frequently in Florida than any other state, though this week we've seen them appear on Maryland roads and even in front of the White House. And The Villages is smack in the middle of Sinkhole Alley—a swath of counties in Central Florida that carry the greatest risk. (A German bakery near The Villages even sells a popular pastry called the Sinkhole.)

Typically sinkholes are no more than a headache for property owners, but when tragedy does strike, it’s the stuff of nightmares. Among the six recorded deaths from sinkholes in Florida history is Jeffrey Bush, who was sleeping in his bedroom when a sinkhole sucked him 20 feet underground. His body was never recovered.

The number of reported sinkholes in The Villages has spiked in recent years. An official with The Villages Public Safety Department told the Orlando Sentinel that residents had reported “several” sinkholes in 2016, though none affecting homes—an assessment matched by the archives of Villages-News. Ditto for 2015; in 2014 three sinkholes affected six homes.

In 2017, by stark contrast, at least 32 sinkholes were reported by that independent news site. At least eight homes were affected, plus a country club, a busy intersection, a Lowe's home improvement store, and the local American Legion post, the largest in the world. (The Daily Sun, a large newspaper owned by The Villages’ developer, reported on none of them except the one at the busy intersection, only to say the hole was “later determined not likely” a sinkhole.) In just the first three months of 2018, at least 11 sinkholes were reported by Villages-News, affecting eight homes—all before sinkhole season even started, in early spring. Four more sinkholes sprang up this week.

Golf carts parked along main street in The Villages retirement community in Central Florida. (RSBPhoto / Alamy)

That there’s such a thing as “sinkhole season,” just as there’s a “tornado season” and “hurricane season,” speaks to the many factors that contribute to the threat. Underlying all of them is the fact that Florida is built on a bedrock of carbonate, primarily limestone. That rock dissolves relatively easily in rainwater, which becomes acidic as it seeps through the soil. The resulting terrain, called “karst,” is honeycombed with cavities. When a cavity becomes too big to support its ceiling, it suddenly gives way, collapsing the clay and sand above to leave a cavernous hole at the surface.

The main trigger for sinkholes is water—too much of it, or too little. The normally moist soil of Florida has a stabilizing effect on karst. But during a drought, cavities that were supported by groundwater empty out and become unstable. During a heavy rainstorm, the weight of pooled water can strain the soil, and the sudden influx of groundwater can wash out cavities. Central Florida was in a severe drought at the beginning of 2017, followed by the intense rainfall of Hurricane Irma that hit The Villages in September—and a deluge after a drought is the optimal condition for a sinkhole outbreak.

But those major events from Mother Nature in 2017 don’t account for the spate of sinkholes this year already. The weather in Sumter County has been pretty typical. So what’s going on?

Man-made development, it turns out, is the most persistent factor for increased sinkholes. Earth-moving equipment scrapes away protective layers of soil; parking lots and paved roads divert rainwater to new infiltration points; the weight of new buildings presses down on weak spots; buried infrastructure can lead to leaking pipes; and, perhaps most of all, the pumping of groundwater disrupts the delicate water table that keeps the karst stable. “Our preliminary research indicates that the risk of sinkholes is 11 times greater in developed areas than undeveloped ones,” says George Veni, the executive director of the National Cave and Karst Research Institute who conducted a field study in Sinkhole Alley.

And The Villages has been in development overdrive. It was the fastest-growing metropolitan area in the U.S. four years in a row (2013-16), and it’s still in the top 10. In his 2008 book Leisureville, journalist Andrew Blechman reported that The Villages would “finish its build-out—an industry term for the point when a project is complete—in the very near future,” peaking at “110,000 residents.” Yet a decade later, the population has sped past 125,000. Last year The Villages reported a 93 percent boom in housing construction and a new purchase of land that will yield up to 20,000 homes. Another land deal for 8,000 new homes is nearing completion.

Those new homes will bring more golf courses, and The Villages already has 49 of them (#2 per capita among all U.S. counties). The retention ponds built on those courses can leak into the karst and trigger sinkholes. Irrigating those 49 courses and the tens of thousands of lawns in The Villages is also a significant risk factor. In his 2016 book Oh, Florida, veteran reporter Craig Pittman reveals how his friend who worked at the Daily Sun said the staff was never to write about two things: 1) anything complimentary of Barack Obama, and 2) “The numerous sinkholes that open up because of all the water being pumped from the aquifer to keep lawns and golf courses green.”

In a scathing column, Orlando Sentinel’s Lauren Ritchie notes how the fledgling community in 1991 had a water permit to use 65 million gallons a year, but by 2017 that rate reached “a stunning 12.4 billion gallons a year.” The local aquifer in Sumter County is also threatened by a controversial plan by a bottling company to pump nearly a half-million gallons of water a day—and double that rate during peak months. Despite the protests of Villagers worried that a falling water table will spur sinkholes, pumping will begin soon.

.....

A screenshot from a YouTube video after four sinkholes recently opened up in The Villages. (Marion County Sheriff’s Office)

The Villages shouldn't be singled out when it comes to sinkholes. Marion and Lake, the two counties that The Villages pokes into, are #4 and #10, respectively, on RiskMeter’s 2011 list of the most sinkhole-prone counties in Florida. Number one is Pasco, which abuts Sumter to the south. Last summer a 260-foot-wide sinkhole yawned underneath a Pasco neighborhood, consuming two homes and condemning seven more, making it the county’s largest in 30 years. That massive chasm rivaled the epic Winter Park sinkhole in Orange County—#8 for RiskMeter, an online tool providing hazard analysis for insurers.

Citrus, directly to the west of Sumter, is both #6 for RiskMeter and the fourth “grayest” county in the U.S., based on the percentage of residents over age 65. Pasco and Marion are also among the top 10 counties nationwide with both a high concentration and high number of older people.

In Ocala, near The Villages, a sinkhole in a fast-food lot swallowed a car and forced the elderly couple inside to crawl out. A man simply standing on the grass in The Villages fell through a trapdoor-like opening into a five-foot hole. In the Village of Glenbrook, a retired couple found a sinkhole literally on their doorstep. Another Villager reported a “prowler” to 911 only to discover a dark void instead. In the nearby city of Apopka, half a couple’s home collapsed and “nearly 50 years of memories sank with it.”

I spoke with geologist and sinkhole expert David Wilshaw on the same day he was returning from a trip to The Villages to inspect a suspected sinkhole. It turned out to be a false alarm—the small depression was caused by a leaking irrigation line—but the shaken resident told Wilshaw she hadn’t slept the whole night, afraid the ground would gobble her up. Injuries from sinkholes are rare, but “perception is everything,” says Wilshaw, “particularly with the elderly population. They’re also fearful they may lose their best investment”—their house—“and lose it during their retirement years,” when they’re most vulnerable.

Central to that fear factor is how unpredictable sinkholes are. They usually form without warning, and it’s difficult to detect weak spots in the ground. “Drilling exploration holes in The Villages is a challenge,” says Wilshaw, “since rock will be 5 feet down in some places and 100 feet down if you move 20 feet to the side.” Wilshaw, who runs his own company specializing in assessing sinkhole risk, is often hired to survey sites using ground penetrating radar (GPR), which is the best way to detect cavities. But he says many homebuilders “will do absolutely nothing and instead rely on the end user” to check for cavities, since Florida law doesn’t require it. “It’s a little bit of the Wild West,” he says.

Can anything besides GPR help predict sinkholes? NASA technology has shown potential: Interferometric Synthetic Aperture Radar (InSAR) detects subtle changes in ground elevation over time when the sensor is flown repeatedly over an area susceptible to sinkholes, especially the slow-forming ones called “cover-subsidence.” When that use for InSAR emerged in 2014, the Florida Department of Environmental Protection (DEP) reached out to NASA for help, but when I checked in with a DEP spokesperson, she said that’s not happening anytime soon.

Even when a site is surveyed and deemed safe from sinkholes, one can still form a few years later, given the precarious nature of karst. “It’s best to just cross your fingers and buy insurance," says Wilshaw. But homeowners insurance only covers “catastrophic ground collapse”—when a sinkhole makes a home uninhabitable. Any damage just short of that must be covered by sinkhole insurance, whose deductible in Florida is typically 10 percent of the home’s value.

“Not all homes qualify for [that] broader coverage, which is admittedly a scary proposition in Florida,” according to a front-page article in the May 2018 issue of The Bulletin, published by the Property Owners’ Association (POA) of The Villages. (The group isn’t affiliated with the developer).

Even when a sinkhole is repaired (“remediated” is the technical term), it will sometimes reopen. Perhaps the most dramatic sinkhole to ever hit The Villages, in Buttonwood—just look at this photo—lurched open several months after remediation began. So did the sinkhole that killed Jeffrey Bush.

.....

A predictive map that was part of Kromhout’s 2017 report. Sumter County is outlined in white.

Conspicuously absent from RiskMeter’s top 10 list is Sumter County. That 2011 list, though, was based on sinkhole insurance claims, and scads of them were falsely reported in the years before 2011, when Florida lawmakers overhauled the abused system. A much better gauge of sinkhole risk followed two years later (just as The Villages was starting its four-year growth streak): The 2013 Hazard Mitigation Plan, created by the Florida Division of Emergency Management (DEM), which assigned Sumter a “medium” risk for sinkholes. Only eight other counties were given a higher level of risk.

But as DEM acknowledged, that 2013 assessment was “imprecise and poorly substantiated by available geologic data” because it was primarily based on citizen reports of sinkholes unverified by geologists. Enter Clint Kromhout of the Florida Geological Survey: In 2013, he and his team secured more than $1 million in federal funds to travel around Florida verifying those sinkholes and create a predictive map showing which areas have the most “relative vulnerability.” Among the many reporters Kromhout spoke to during the three-year study was Tampa Bay Times’ James L. Rosica, who noted, “The goal for the scale of the state map is at least the county level, but Kromhout said he hopes they will be able to get to a neighborhood-by-neighborhood detail.”

Veni, the karst expert I interviewed, calls Kromhout’s 2017 report “the most detailed, comprehensive analysis of sinkhole risk that I’m aware of.” (Kromhout declined to be interviewed for this story, as did a representative for The Villages.) Its long sought-after predictive map was included in the 2018 Hazard Mitigation Plan that came out in February.

That’s as detailed as the map gets. As the Sinkhole Report states, “Most importantly, the favorability map is not of sufficient detail to provide site specific information regarding sinkhole formation.” The Villages is primarily located in the northern part of Sumter County, which is almost entirely in the red zone.

How helpful is the Sinkhole Report? “I don’t think it is the prediction model that some hoped for (it would be very difficult to create one), but it does advance the science,” says Robert Brinkmann, a geology professor at Hofstra University who wrote Florida Sinkholes: Science and Policy and owns a house in Sinkhole Alley.

“The real challenge here is that the state doesn’t really fund much sinkhole research, particularly since real estate remains one of the driving economic engines in the state,” Brinkmann adds. “The federal government has not really funded any significant studies on the topic except for this modest one. Millions in federal dollars go every year to tsunamis, earthquakes, volcanoes and hurricanes, but little if any goes to studying sinkholes. Certainly [the former] are horrible, but they are one-time dramatic events. Sinkholes are a constant threat and much of the damage happens slowly over time. The annual property damage from sinkholes is staggering”—$300 million is a conservative estimate.

That damage could escalate as climate change intensifies. “As sea level rises in response to climate change, groundwater levels in near-coastal areas will also rise and result in increased flooding of sinkholes,” predicts Veni. “Studies on the potential degree of such flooding and its triggering new sinkhole collapses are just beginning.” He's in the preliminary stages of just such a study with a colleague in Florida.

Some Villagers are tempted to throw in the towel. “When we moved [to the Village of Glenbrook] in 2012 we thought we would be here for the rest of our days,” wrote a member of the "Talk of the Villages" web forum on March 5 after a sinkhole forced his neighbors to evacuate, but “now we’re considering moving again which is the last thing I wanted to do.” (Less than two months later, another eight families would evacuate their homes when a dozen sinkholes appeared in an Ocala neighborhood not far from Glenbrook, making national news.) Another Villager added ominously: “I am dreading when the rainy season starts”—May 27, on average for the area.

But the rainy season came early this year: On May 20, after a week of persistent rain, four sinkholes struck Calumet Grove, the Village that suffered seven sinkholes in February. Thunderstorms are expected to continue thanks to a sub-tropical storm system forming off the coast. One of the residents forced from his home in February, 80-year-old Frank Neumann, spoke to Villages-News. “Prior to Monday’s sinkhole activity, Neumann said he was hoping to have his home repaired and stay in the neighborhood he’s lived in for 14 years—largely because of the friendships he and his wife have formed there,” according to the site. “But as he stood in his front yard looking at the second wave of destruction to strike his property in 95 days, he said he wasn’t so sure that remaining in The Villages was a good idea.”

Remembering AIDS: The 30th Anniversary of the Epidemic

Smithsonian Magazine

To mark the 30th anniversary of the HIV and AIDS epidemic, the National Museum of American History recently opened three displays throughout the museum that include a panel from the famous AIDS Memorial Quilt, a collection of public health and other political and scientific responses from the early 1980s, and a selection of brochures, photos and other archival ephemera documenting the difficult reactions and tragic stigmas associated with the disease.

Located near the first-floor Archives Center, the two-case exhibit “Archiving the History of an Epidemic: HIV and AIDS, 1985-2009” recalls the early years when so many Americans at first tried to ignore or dismiss the onslaught of the disease. On June 5, 1981, the U.S. Centers for Disease Control in Atlanta reported that five young gay men had fallen ill with diseases usually seen only in elderly or immune-depressed patients. Within weeks, many more cases surfaced. By 2007, more than 575,000 deaths would be attributed to the disease. Today, the illness has become chronic and manageable with the use of effective multi-drug treatments.

I knew I was in for an overwhelming story of tragedy, but then among a sea of brochures, photos and touching quotes, a handful of collectible trading cards caught my eye. The attractive illustrations featured prominent individuals who had been affected by the disease.

There was football player Jerry Smith, the first professional athlete to die of the disease; young Ryan White, who contracted the illness through a blood transfusion; and the handsome leading man Rock Hudson, who never publicly revealed his homosexuality.

“These individuals represent the broad spectrum of AIDS sufferers, Rock Hudson, the 1950s ideal of the American male (who happened to be gay and closeted) and Ryan White, a young hemophiliac, who contracted AIDS from a blood transfusion,” the show’s curator Franklin Robinson told me. “We see the terrible loss of talented individuals from all professions and individuals whose lives were cut short before they perhaps realized their potential.  In a broad sense they represent that AIDS does not discriminate, young or old, gay or straight, whatever gender or race, anyone is capable of contracting AIDS.”

Called “AIDS Awareness Cards,” they were published in 1993 by Eclipse Enterprises of Forestville, California, and written by editor Catherine Yronwode. The illustrations were done by Charles Hiscock and Greg Loudon, and the cards were distributed in sets of twelve, packed with a condom to amplify the message of “safe sex,” a term that evolved along with the epidemic.

The cards included images of a young Princess Diana holding one of her children, as well as Elizabeth Taylor and Madonna.

“Through these women specifically,” says Robinson, “we see strong and prominent individuals who used their position and means within society to try and dispel the stigma of AIDS. Selflessly they took a stand of reaching out to the AIDS affected population with love and compassion when it was very unpopular. They illustrated that one can lead by example.”

When published, the cards received negative publicity. Some accused Eclipse of capitalizing on the tragedy of the disease. But the editor Catherine Yronwode defended them. In a 1993 Orlando Sentinel article she said, “If you take the time to read the cards, you will come away with a good understanding of the disease.” While 15 percent of the generated revenues were donated to charitable groups fighting the disease, Eclipse ceased producing the cards in 1994, says Robinson.

In a time when people should have been interested in learning about AIDS, HIV and the mysteries that surrounded them, it was a struggle to grab the attention of the young adult audience, Robinson says when asked why he selected the cards for the exhibit.

“I thought the cards were a unique and innovative way for viewers to take away the message that AIDS is not just a disease affecting gay men but in a sense belongs to everyone. I hope the cards inspire viewers to reflect that someone they may know or admire is, or has been, affected by AIDS and that everyone may play a part in battling this epidemic.”

The 30th Anniversary of HIV and AIDS is a three-part commemoration and includes displays in the Archives Center and in the “Science in American Life” exhibit. A panel from the AIDS Memorial Quilt is on view on the first floor in the Artifact Walls display cases.

AIDS trading card: Diana, Princess of Wales. (Archives Center Lesbian, Gay, Bisexual, Transgender Collection, National Museum of American History, Smithsonian Institution)

AIDS trading card: Madonna. (Archives Center Lesbian, Gay, Bisexual, Transgender Collection, National Museum of American History, Smithsonian Institution)

AIDS trading card: Earvin "Magic" Johnson. (Archives Center Lesbian, Gay, Bisexual, Transgender Collection, National Museum of American History, Smithsonian Institution)

AIDS trading card: Ryan White. (Archives Center Lesbian, Gay, Bisexual, Transgender Collection, National Museum of American History, Smithsonian Institution)

AIDS trading card: Rock Hudson. (Archives Center Lesbian, Gay, Bisexual, Transgender Collection, National Museum of American History, Smithsonian Institution)

AIDS trading card: Jerry Smith. (Archives Center Lesbian, Gay, Bisexual, Transgender Collection, National Museum of American History, Smithsonian Institution)

Superspreaders Caused Much of the 2014 Ebola Epidemic

Smithsonian Magazine

In 2014 and 2015, Ebola spread through West Africa like wildfire, affecting over 28,000 people in Guinea, Sierra Leone and Liberia and killing 11,310. But just how did the dangerous virus spread? A new study has a surprising answer, reports the BBC’s James Gallagher—the majority of cases were caused by a small minority of infected people.

A new paper published in the journal Proceedings of the National Academy of Sciences suggests that just three percent of people with Ebola were responsible for around 61 percent of cases. The study, which used statistical models to show how the disease was transmitted, found that age was the biggest predictor of whether an individual would spread the virus or not.

Researchers used data from a burial program conducted by the Red Cross that included GPS locations of where the bodies of 200 people who died of Ebola were collected. The data set also included information on their age, sex and time of burial. Using that data, researchers were able to infer how many people each infected person had passed the virus to. They found that people below 15 and above 45 years of age were more likely to spread the virus than those in the middle range.
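To get a feel for how a small minority can drive most of an outbreak, consider a toy simulation of an overdispersed "offspring distribution." The sketch below is purely illustrative, not the study's actual model or data; the reproduction number and dispersion values are hypothetical, chosen only to show how transmission concentrates among a few infectors.

```python
# Illustrative sketch only: a hypothetical overdispersed offspring
# distribution (negative binomial) shows how a small share of infected
# people can account for most secondary cases. Parameters are assumptions,
# not values from the PNAS study.
import numpy as np

rng = np.random.default_rng(seed=1)

R0 = 1.5           # assumed mean number of secondary cases per infection
k = 0.1            # assumed dispersion; small k means strong superspreading
n_cases = 200_000  # number of simulated infected individuals

# Negative binomial parameterized by mean R0 and dispersion k
p = k / (k + R0)
secondary = rng.negative_binomial(k, p, size=n_cases)

# Rank infectors by how many secondary cases they caused
sorted_secondary = np.sort(secondary)[::-1]
cumulative = np.cumsum(sorted_secondary) / sorted_secondary.sum()

for top_fraction in (0.03, 0.20):
    idx = int(top_fraction * n_cases) - 1
    print(f"Top {top_fraction:.0%} of infectors caused "
          f"{cumulative[idx]:.0%} of all secondary cases")
```

With parameters like these, the most infectious few percent of cases end up responsible for a disproportionately large share of onward transmission, which is the qualitative pattern the researchers report.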

This phenomenon, also known as “superspreading,” has been observed before. In 2015, an outbreak of MERS in South Korea occurred when a single patient infected at least 22 other people. And most are probably familiar with the story of Typhoid Mary, a superspreader who was herself immune to typhoid, but infected 51 people in a short period of time. Mary Mallon was then placed in a forced, decades-long quarantine.

As The Wall Street Journal’s Sumathi Reddy reports, scientists think that 20 percent of the population spreads disease more easily than the other 80 percent. However, the jury’s still out on exactly why. Steven Riley, one of the Ebola paper’s co-authors, tells Gallagher that he thinks the disease's spread was due to human behavior, perhaps the fact that the young and the old were taken care of by people in the middle age bracket.

One thing is clear: Superspreading can make the difference between a blip and a full-blown epidemic. Epidemiologists are getting better at analyzing data to determine who spreads disease. But given the short incubation period of many diseases—Ebola, for example, can incubate in as few as two days—it can be hard to stop contagion before the death toll begins to climb. Though nothing can replace the lives lost in epidemics, perhaps scientists can learn from these deaths to one day stop future outbreaks.

What’s Causing This Village’s Weird Sleeping Sickness Epidemic?

Smithsonian Magazine

In a little village called Kalachi, tucked away in the northern region of Kazakhstan, over 120 residents have been hit by a strange malady that has doctors and scientists baffled.

Without warning a person will inexplicably fall into a coma-like sleep they often won’t wake from for days. When they do come to, they’re often left with “debilitating symptoms – dizziness, nausea, blinding headaches and memory loss,” Joanna Lillis reports for the Guardian.  

The mystery illness was first officially recorded in the spring of 2013 and has affected about a fourth of the village’s population, with some experiencing repeat attacks. The two most recent cases emerged in early March, bringing the total number of incidents, according to Lillis, to 152.

Scientists, along with the government of Kazakhstan, have been scrambling to find a cause for the strange ailment. But despite some strong leads, they have yet to nail one down. Two likely culprits are radon and carbon monoxide poisoning. The symptoms of these problems closely resemble those experienced by Kalachi’s residents. Testing showed unusually high levels of both in some village homes, but still, local officials have ruled them out as a cause.

The scientists involved are determined to find an explanation, however. Thanks to a research coordination commission set up by the Kazakhstan prime minister, “by the end of last year over 20,000 laboratory and clinical tests had been conducted – on the air, soil, water, food, animals, building materials, and on the residents themselves,” writes Lillis.

Many residents and one Russian scientist interviewed by Newsweek think the cause of the illness may not be coming from Kalachi, however, but rather a site just outside the village. That’s where an old Soviet-era uranium mine has sat abandoned since the 1990s.

“In my opinion, a gas factor is at work here,” professor Leonid Rikhvanov from the Tomsk Polytechnic University in Russia told Newsweek. “Radon could be operating as a narcotic substance or an anesthetic. Currently, the underground space of the mine is flooded and gases are being squeezed to the surface.” 

The theory is as yet unproven, however—and in the meantime, authorities have chosen to take drastic measures against the sleeping sickness by offering to relocate locals to villages outside the perceived danger zone. Over 100 citizens have reportedly embraced the “voluntary relocation” already, which officials hope to complete by May.

There are many in Kalachi who don’t want to move and who have no plans to abandon the lives they have built there, despite warnings from Rikhvanov and others that more cases are likely to present themselves. But, as one resident told Lillis of the worrisome illness, “They say it affects the brain; they say it gives people headaches, but our headache now is where we’re being resettled.”

Here’s More About the Drug Behind Indiana’s HIV Epidemic

Smithsonian Magazine

An unprecedented HIV outbreak in Indiana has caused the governor to declare a health emergency and set up a short-term needle exchange. But what’s fueling the epidemic in Scott County, Ind.? Experts say it’s the illegal use of Opana, a powerful pain medication that has been called the “new scourge of rural America.”

Indiana public health officials tell WKMS that Opana, otherwise known as oxymorphone, is “an incredibly powerful and potent opiate.” The drug is usually prescribed to patients with pain that has resisted non-opioid painkillers or who have had insufficient pain relief with immediate-release opioids. But Opana has gained a whole other life as an underground drug in rural communities like Scott County.

In its illegal form, Opana is crushed and injected, defeating the drug’s extended release properties and creating a potent high that’s comparable to that of heroin or OxyContin. Before 2011, the drug was “relatively easy” to shoot up, notes USA Today’s Laura Ungar, but a crush-resistant formulation was created in late 2011 to prevent Opana abuse.

However, those precautions haven’t been enough, and abuse of the drug rose dramatically in 2012. In a 2013 statement, the FDA admitted that the reformulated version of Opana could still “be compromised when subjected to other forms of manipulation, such as cutting, grinding, or chewing.”

All of the people who have been identified as HIV positive in the current outbreak have admitted to intravenous drug use, ABC News reports. And while it’s still unclear whether the governor’s temporary needle exchange will quell the spread of HIV, which is also spread through sexual contact, Indiana isn’t the only state approving needle exchanges for the first time. This Thursday, Kentucky’s governor signed a bill that will allow health departments to create needle exchanges in the state.

Meningitis (Cerebrospinal Meningitis--epidemic) Quarantine Sign - McPherson County Health Department

National Museum of American History
White cardboard sign with black print. "MENINGITIS / (Cerebrospinal Meningitis--epidemic) / Isolation of patient for minimum of 2 weeks from onset. / Exposed children excluded from school for 14 days from / last contact. / McPherson County Health Department"

Outbreak! On the front lines of a measles epidemic

National Museum of American History

In 1904 C. A. Lindsley, the secretary of the Connecticut State Board of Health, complained that "measles is still prevailing in an epidemic form in too many towns in the state." Measles epidemics could, Lindsley insisted, be arrested if quarantines were simply enforced across the state. Unfortunately, "a remarkable and strange misunderstanding" of the state's quarantine regulations had led many health officers to mandate that only the primary or first case of measles in a community be quarantined. As a result, epidemics were spiraling out of control across the state.

Pink sign warning about Measles and Connecticut health regulations

In advocating the use of a strict quarantine, Lindsley was not proposing anything new. For centuries, quarantines have been used to contain a variety of contagious diseases. In fact, the term quarantine dates back to the 14th and 15th centuries when city leaders in Venice imposed a forty-day ("quaranta giorni") isolation period on ships entering the city's port. By the late 19th century, the imposition of quarantines was still regarded as one of the most effective methods of arresting epidemics of scarlet fever, whooping cough, and a range of other illnesses. But in calling for stricter and more widespread quarantines for a disease that many Americans "commonly regarded as a disease of small moment," Lindsley was at the forefront of a group of public health officials who now saw measles as a potentially life-threatening disease.

Just two years before the measles epidemic that shook the state of Connecticut in 1904, The New York Times had informed its readers that "statistics show that in children under two years of age, the mortality from measles equals 20 per cent, and under five, equals 12 per cent." Measles killed at a higher percentage rate than whooping cough and scarlet fever, two other "childhood" diseases that were commonly viewed as more dangerous than measles.

Because no effective vaccine existed (and would not exist until 1963), Lindsley urged municipalities in his state to use quarantine signs such as this one. But the tendency to view measles as a benign disease meant that quarantines were not, as Lindsley pointed out, rigorously enforced.

Brownish/white sign enforcing measles quarantine

In the absence of a strict quarantine, measles epidemics often erupted. While drug purveyors such as the manufacturer of this medicine often claimed that they could "cure" measles, the disease had no known cure and the results of these epidemics could be devastating. Between 1900 and 1910, an average of 774 deaths from measles occurred each year. Over the course of the next few decades this death rate slowly declined as health care improved, but measles did not totally disappear from America.

Cardboard box reading Munyon's Measles Remedy

Only in 1963 would measles be brought under control. That year, the development of a successful vaccine led several public health officers to speculate that widespread use of this vaccine could permanently eradicate measles by 1967. This ambitious idea reflected the growing confidence of the American medical profession that it could eradicate a range of infectious diseases. In many ways, this confidence was not misplaced, as the medical profession successfully used vaccination throughout the 1960s and 1970s to eradicate smallpox permanently.

Unfortunately, while smallpox vaccination led to the successful eradication of this disease by 1980, eradicating measles proved to be more difficult. Despite a major public health campaign targeting measles during the 1960s, the widespread tendency to underestimate the danger of measles meant that many parents and even some pediatricians were lax about vaccinating children. Further compounding this problem were racial and economic disparities which meant that poor children were less likely to be vaccinated than their white middle-class counterparts.

By the 1980s, measles outbreaks were on the rise again. Ironically, these outbreaks were, in part, the result of improved health measures. Whereas 19th century parents' experiences with measles and other potentially fatal diseases often motivated them to take aggressive action against infectious disease, parents in the late 20th century who lacked these experiences have been less likely to see vaccination as crucial for the health of their child.

Alexandra M. Lord, Ph.D., is chair of the History of Medicine and Science Division.


Uganda park rangers with cell phones may help stop next world influenza epidemic

Smithsonian Insider

Today, Marra is helping launch an Animal Mortality Monitoring Program in Africa intended to serve as an early warning system for emerging infectious diseases that can pass from animal populations into the human population.


The human obesity epidemic, the mismatch paradigm, and our modern "captive" environment

Smithsonian Libraries
In the distant past obesity in humans was rare and likely caused by metabolic dysregulation due to genetic or disease-related pathology. External factors precluded the ability of most people to overeat or under exert. Socio-cultural obesity came about due to the rareness of obesity and its difficulty to achieve. What is rare becomes valuable and what is difficult to achieve becomes a badge of prestige. The modern human obesity epidemic would appear to represent a third class of obesity: environmental obesity. Much like the captive environments which humans construct for the captive/companion animals in our care, the modern human environment has greatly decreased the challenges of life that would restrict food intake and enforce exertion. And like us, our captive/companion animal populations are also experiencing obesity epidemics. A further concern is that maternal obesity alters maternal signaling to offspring, in utero through the placenta and after birth through breast milk, in ways that perpetuate an enhanced vulnerability to obesity. Molecules such as leptin, produced by adipose tissue and placenta, have significant developmental effects on brain areas associated with feeding behavior. Leptin and other cytokines and growth factors are found in breast milk. These molecules have positive effects on gut maturation; their effects on metabolism and brain development are unclear. Placenta and brain also are hotspots for epigenetic regulation, and epigenetic changes may play significant roles in the later vulnerability to obesity and to the development of a diverse array of diseases, including heart disease, hypertension, and noninsulin-dependent diabetes. Am. J. Hum. Biol., 2012. (C) 2012 Wiley Periodicals, Inc.

The Confusing and At-Times Counterproductive 1980s Response to the AIDS Epidemic

Smithsonian Magazine

In 1981, an unknown epidemic was spreading across America. In June of that year, the Centers for Disease Control and Prevention's newsletter mentioned five cases of a strange pneumonia in Los Angeles. By July, 40 cases of a rare skin cancer were reported by doctors working in the gay communities of New York and San Francisco. By August, the Associated Press reported that two rare diseases, the skin cancer Kaposi's sarcoma and pneumocystis, a form of pneumonia caused by a parasitic organism, had infected over 100 gay men in America, killing over half of them. At the end of 1981, 121 men had died from the strange disease; in 1982, the disease was given a name; by 1984, two different scientists had isolated the virus causing it; in 1986, that virus was named HIV. By the end of the decade, in 1989, 27,408 people died from AIDS.

In the years since the start of the AIDS epidemic, medical research has given us a better understanding of HIV and AIDS and produced some remarkable breakthroughs unimagined in the 1980s: today, people living with HIV no longer face a death sentence, but rather have treatment options available. Still, to think of the AIDS epidemic in medical terms misses half of the story--the social aspect, which affected America's perception of HIV and AIDS just as much as, if not more than, medical research.

The two sides of the story are told through a collection of articles, pictures, posters and pamphlets in Surviving and Thriving: AIDS, Politics and Culture, a traveling exhibit and online adaptation curated by the National Library of Medicine that explores the rise of AIDS in the early 1980s, as well as the medical and social responses to the disease since. The human reaction to the AIDS epidemic often takes a back seat to the medical narrative, but the curators of Surviving and Thriving were careful to make sure that this did not happen--through a series of digital panels, as well as a digital gallery, readers can explore how the government and community groups talked about the disease.

At the beginning of the epidemic, response was largely limited to the communities which were most affected, especially the gay male community. “People with AIDS are really a driving force in responding to the epidemic and seeing how change is made,” says Jennifer Brier, a historian of politics and sexuality who curated the exhibit.

Michael Callen and Richard Berkowitz, two gay men living with AIDS, wrote How to Have Sex in an Epidemic, which introduced the idea of safe sex in 1982. Image courtesy of Richard Dworkin.

In 1982, Michael Callen and Richard Berkowitz, two gay men living with AIDS in New York City, published How to Have Sex in an Epidemic, which helped spread the idea that safe sex could be used as protection against the epidemic--an idea that had not yet taken hold in the medical community. The pamphlet was one of the first publications to propose that men use condoms when having sex with other men as protection against AIDS.

Poster from 1986, courtesy of the Health Education Resource Organization.

Condoms as protection against AIDS became a major theme for poster campaigns. The above poster, paid for by the Baltimore-based non-profit Health Education Resource Organization, shows how visuals attempted to appeal, at least at first, to the gay community. Due to widespread misinformation, however, many people believed that AIDS was a disease that affected only white gay communities. As a response to this, black gay and lesbian communities created posters like the one below, to show that AIDS didn't discriminate based on race.

Poster from the Black Gay and Lesbian Leadership Forum, in Los Angeles, 1985. Photo courtesy of the National Library of Medicine.

Many posters and education campaigns harnessed sexual imagery to convey the importance of safe sex, attempting to make safety sexy (as in the Safe Sex is Hot Sex campaign). That tactic, however, was not supported by governmental bodies--in fact, in 1987, Congress explicitly banned the use of federal funds for AIDS prevention and education campaigns that "[promoted] or [encouraged], directly or indirectly, homosexual activities" (the legislation was spearheaded by conservative senator Jesse Helms and signed into law by President Reagan).

Instead, federally-funded campaigns sought to address a large number of people from all backgrounds--male, female, homosexual or heterosexual. The America Responds to AIDS campaign, created by the CDC, ran from 1987 to 1996 and became a central part of the "everyone is at risk" message of AIDS prevention.

This poster spoke to parents about the challenges of talking to a teenager about AIDS, but stressed that the issue was relevant and important to young Americans. Courtesy of the National Library of Medicine.

The campaign was met with mixed feelings by AIDS workers. "The posters really do help ameliorate the fear and hatred of people with AIDS," Brier explains. "There’s a notion that everyone is at risk, and that’s important to talk about, but there’s also the reality that not everyone is at risk to the same extent." Some AIDS organizations, especially those providing services to communities at the highest risk of contracting HIV, saw the campaign as diverting money and attention away from the communities that needed it the most--leaving gay and minority communities to compete with one another for the little money that remained. As New York Times reporter Jayson Blair wrote in 2001, "Much of the government's $600 million AIDS-prevention budget was used...to combat the disease among college students, heterosexual women and others who faced a relatively low risk of contracting the disease."
(This linked column by Blair was later found to be plagiarized from reporting by the Wall Street Journal, but the point still holds.)

Beyond campaigns that tried to generalize the AIDS epidemic, a different approach used the fear of AIDS to try to effect change. These posters, collected under the section "Fear Mongering" in the exhibit's digital gallery, show ominous images of graves or caskets behind proclamations of danger.

"It was like this sort of scared straight model, like if you get scared enough, you really will do what is right," Brier says of the posters. "There were posters that focused on pleasure, or health, or positive things to get people to affect change in their behavior, but there were consistently posters that used the idea that fear could produce behavior change."

"A bad reputation isn't all you can get from sleeping around." Poster courtesy of Dallas County Health Department.

The above poster exemplifies the fear-mongering tactic: a large, visible slogan designed to provoke fear (and shame sexual behavior), while information on how to prevent the spread of AIDS is buried in small print at the bottom of the poster. This lack of information was typical of fear-mongering posters, which relied on catchy, scary headlines rather than facts about safe sex, clean needles or the disease itself.


"AIDS—even its name is deceptive." Poster from the AIDS Resource Center.

"The posters fed on people’s inability to understand how AIDS actually spread. It didn’t really ever mention ways to prevent the spread of HIV," Brier says. "Fear-mongering posters don’t talk about condoms, they don’t talk about clean needles, they don’t talk about ways to be healthy. They don’t have the solutions in them, they just have the fear."

This ominous image claims that "people are dying to know" facts about AIDS. Poster from the Pharmacists Planning Service.

Through exploring the exhibit, users get a sense of the different approaches public organizations took to spread information about AIDS. "It's a fundamental question of public health," says Brier. "Do you spread information by scaring people, do you do it by trying to tap into pleasure or do you do it by recognizing that people’s behavior isn’t just about their individual will but a whole different set of circumstances?"

Fearing a Smallpox Epidemic, Civil War Troops Tried to Self-Vaccinate

Smithsonian Magazine

At the Battle of Chancellorsville, fought this week in 1863, nearly 5,000 Confederate troops were unable to take their posts as a result of trying to protect themselves from smallpox.

And it wasn’t just the South. “Although they fought on opposite sides of the trenches, the Union and Confederate forces shared a common enemy: smallpox,” writes Carole Emberton for The New York Times.

Smallpox may not have been as virulent as measles, Emberton writes, but over the course of the war it killed almost forty per cent of the Union soldiers who contracted it, while measles—which many more soldiers caught—killed far fewer of its sufferers.

There was one defense against the illness: inoculation. Doctors from both sides, relying on existing medical knowledge, tried to find healthy children to inoculate, which at the time meant taking a small amount of pus from a sick person and injecting it into the well person.

The inoculated children would suffer a mild case of smallpox—as had the children of the Princess of Wales in the 1722 case that popularized inoculation—and thereafter be immune to smallpox. Then, their scabs would be used to produce what doctors called a “pure vaccine,” uninfected by blood-borne ailments like syphilis and gangrene that commonly affected soldiers.

But there was never enough for everyone. Fearing the “speckled monster,” Emberton writes, soldiers would try to use the pus and scabs of their sick comrades to self-inoculate. The method of delivery was grisly, writes Mariana Zapata for Slate. "With the doctor too busy or completely absent, soldiers resorted to performing vaccination with whatever they had at hand. Using pocket knives, clothespins and even rusty nails... they would cut themselves to make a deep wound, usually in the arm. They would then puncture their fellow soldier's pustule and coat their wound with the overflowing lymph."

To the soldiers, the risk of catching smallpox loomed larger than the risk of serious infection from this crude treatment. But besides the lack of sanitation, the big problem was that their comrades might well have had other ailments, or might not have had smallpox at all. “The resulting infections incapacitated thousands of soldiers for weeks and sometimes months,” Emberton writes.

Smallpox was just one note in a symphony of terrifying diseases that killed more Civil War soldiers than bullets, cannonballs and bayonets ever did. Although estimates of the war's death toll vary, even the most recent hold that about two of every three men who died were killed by disease.

That’s not hard to understand, given the conditions of the camps and the fact that the practice of doctors washing their hands hadn’t yet reached North America. There’s a reason the Civil War period is often referred to as the medical Middle Ages.

“Medicine in the United States was woefully behind Europe,” writes the Ohio State University department of history. “Harvard Medical School did not even own a single stethoscope or microscope until after the war. Most Civil War surgeons had never treated a gunshot wound and many had never performed surgery.” That changed over the course of the war, revolutionizing American medicine, Emberton writes--but it didn’t change anything for those who died along the way.

Genetic Sleuthing Clears “Patient Zero” of Blame for U.S. AIDS Epidemic

Smithsonian Magazine

For decades, the world thought that a Canadian man named Gaétan Dugas was the person who brought HIV to the United States, setting a deadly epidemic in motion by spreading the virus to hundreds of other men. The legend has loomed large in the early history of a disease that ravaged the gay community and has gone on to become a persistent public health threat. But now, more than 30 years after his death, it turns out that Dugas was not to blame. As Deborah Netburn reports for The Los Angeles Times, a new investigation of genetic and historical evidence has not only exonerated Dugas, but has also revealed more about how AIDS spread around the world in the 1980s.

In a new paper published in the journal Nature, a group of biologists, public health experts and historians describe how they used genetic testing to demonstrate that Dugas was not the first patient in the U.S. with AIDS. Instead, they found that in 1971 the virus jumped to New York from the Caribbean, where it was introduced from Zaire. By 1973, it hit San Francisco, which was years before Dugas is thought to have been sexually active.

Dugas, who was a flight attendant, later claimed to have had hundreds of sex partners, whom he met in underground gay bars and clubs in New York. Though medical practitioners never released his name to the public, Netburn writes, it became public in Randy Shilts' book And the Band Played On, a history of the first five years of the AIDS epidemic. Shilts portrayed Dugas as an amoral, sex-obsessed “Typhoid Mary.” And despite medical historians' efforts to expose the inaccuracies of that depiction, Dugas' name became inextricably associated with spreading the disease that took his life in 1984. That was due, in part, to his reported refusal to acknowledge that the disease could be spread via sexual contact--a refusal that Shilts used to paint Dugas as someone who infected people with HIV on purpose.

But regardless of how Dugas perceived AIDS, it now appears he could not have been the person who brought it to the U.S. Researchers got their hands on a blood serum sample from Dugas taken the year before his death and used it to assemble an HIV genome. They also studied serum samples of gay men who had blood taken in the late 1970s for a study on Hepatitis B. The samples showed that 6.6 percent of the New York men studied and 3.7 percent of the San Francisco men had developed antibodies to HIV.

Then the team sequenced 53 of the samples and reconstructed the full HIV genome from eight of them. Those genomes showed so much genetic diversity that the virus must already have been circulating in the U.S. for years, which suggests that Dugas was far from the first person to develop AIDS.


It turns out that a tragic misreading fueled Dugas’ reputation as “Patient Zero.” Although he was initially identified as the CDC’s 57th case of the then-mysterious disease, writes Netburn, he was later tagged in a CDC AIDS study with the letter “O,” identifying him as a patient “outside of California.” At some point that O was misread as a zero, and Shilts, feeling the idea of a patient zero was “catchy,” identified Dugas in his book.

Before Dugas died, the mechanisms by which HIV spread were still unknown and the disease was still thought to be some form of “gay cancer.” Dugas was just one of thousands of men forced to take their sex lives underground in an era of intense stigma against homosexuality. Many such men found a community in gay clubs and bathhouses where they could socialize with other gay men--the same locations where HIV began to spread with growing rapidity in the 1970s.

New York and San Francisco were among the few places where gay men could express their sexuality with any sense of openness. As Elizabeth Landau reports for CNN, a doctor named Alvin Friedman-Kien, an early researcher of the not-yet-named disease, met with a group of gay men in New York in 1981 to talk to them about health problems plaguing the gay community. He was met with resistance from men who refused to put their sexuality back in the closet. “They weren’t about to give up…their open new lifestyle,” he recalled.

As a man who infected other men with HIV, Dugas was certainly not unique—and he helped scientists make sense of the outbreak by identifying his sex partners and cooperating with public health officials during his illness. But he also paid a price for that openness, as medical historian Richard A. McKay writes. As paranoia about the mysterious virus grew within the gay community, Dugas, whose skin was marked with the cancer that was often the only visible indicator of AIDS, was discriminated against, shunned and harassed. And after his death, when he was identified as Patient Zero, his friends complained that Shilts had portrayed a one-dimensional villain instead of the strong, affectionate man they knew.

Today, the idea of a “Patient Zero” or index case is still used to model how epidemics spread. But given that an index case is only the first person known to have a condition in a certain population rather than the first person affected by it, the idea itself is limiting. In the case of AIDS, which wiped out an entire generation of gay men in America and has killed more than 35 million people since the 1980s, it is now clear that a Patient Zero may never be identified. But thanks to Dugas, now scientists know even more about the origins and early spread of the disease.

The Woman Who Stood Between America and an Epidemic of Birth Defects

Smithsonian Magazine

In 1960, America had a stroke of luck. That was when the application to begin mass-marketing the drug thalidomide in the United States landed on the desk of Frances Oldham Kelsey, a reviewer at the Food and Drug Administration. Today we know that the drug causes severe, devastating birth defects when taken by pregnant women for nausea. But at the time, thalidomide’s darker effects were only just becoming known.

Between 1957 and 1962, the sedative would leave thousands of infants in Canada, Great Britain and West Germany permanently and tragically disabled. The U.S., however, never had a crisis of thalidomide-linked birth defects on that magnitude. Why not?

What stood between the drug and the health of the American public was none other than Kelsey and the FDA. As a medical reviewer, Kelsey had the power to prevent a drug from going to market if she found the application to be lacking sufficient evidence for safety. After a thorough review, Kelsey rejected the application for thalidomide on the grounds that it lacked sufficient evidence of safety through rigorous clinical trials.

Today we take it for granted that the FDA wisely spurned an unsafe drug. But in many ways, Kelsey’s education and experience up to that point made her especially well-suited for her position as a medical reviewer—and, in particular, for the thalidomide application.

After completing a master’s degree in pharmacology at McGill University in her home country of Canada, Kelsey was encouraged by her graduate advisor to write to a Dr. Eugene Geiling at the University of Chicago to inquire about a research assistant position and to express her interest in obtaining a PhD. Geiling, a medical officer at the FDA known for his studies of the pituitary gland, wrote back offering Kelsey a research assistantship and a scholarship for doctoral study. In 1936, Kelsey joined Geiling at the University of Chicago.

That consequential step in Kelsey's career may have been due to a fortuitous error on Geiling's part. In her short memoir “Autobiographical Reflections,” Kelsey describes Geiling as “very conservative and old-fashioned,” noting that “he really did not hold too much with women as scientists.” This might explain why Geiling, in his response letter to Kelsey, addressed it to “Mr. Oldham”—believing her to be a man. Kelsey said she continued to wonder “if my name had been Elizabeth or Mary Jane, whether I would have gotten that first big step up.”

Kelsey was first introduced to the dangers of mass marketed unsafe pharmaceuticals in 1937, when the FDA enlisted Geiling to solve the mystery of Elixir of Sulfanilamide. Sulfanilamide effectively combated infections, but it came in a large and bitter pill that needed to be taken in large dosages. To make the drug more appealing, especially to children, manufacturers added it to a solvent with artificial raspberry flavor.

The problem was that the solvent they chose was diethylene glycol—commonly known as antifreeze. Between September and October, the drug killed 107 people. 

Geiling and his lab of graduate students, including Kelsey, set out to determine what exactly in the elixir was killing people: the solvent, the flavor or the sulfanilamide. Through a series of animal studies—which at the time were not required by federal law for a drug to go to market—Geiling and his lab were able to determine that it was the diethylene glycol that was the cause of death. 

The public outcry to this tragedy prompted Congress to pass the Federal Food, Drug, and Cosmetic Act of 1938, which added a New Drug section requiring manufacturers to present evidence that a drug was safe before going to market. Though this new law “provided for distribution of a new drug for testing purposes,” FDA historian John Swann says “the law did not provide in any explicit or detailed way how oversight of that testing should be conducted.” In other words, clinical trials continued to undergo little to no oversight. 

In 1962, President John F. Kennedy honored Kelsey for her work blocking the marketing of thalidomide. (Food and Drug Administration)

Kelsey graduated from medical school in 1950, and went on to work for the Journal of the American Medical Association before starting work as a medical reviewer at the FDA in 1960. As reviewer of New Drug Applications (NDA), she was one of three people charged with determining a drug’s safety before it could be made available for public consumption. Chemists reviewed the chemical makeup of the drug and how the manufacturer could guarantee its consistency, while pharmacologists reviewed animal trials showing that the drug was safe. 

Though this appears to be a rigorous and thorough process of checks and balances, Kelsey admitted to some weaknesses in her memoir, including the fact that many of the medical reviewers were part-time, underpaid, and sympathetic to the pharmaceutical industry. The most troubling deficiency in the process was the 60-day window for approving or rejecting drugs: if the 60th day passed, the drug would automatically go to market. She recalls that this happened at least once.

Fortunately, drug manufacturer Richardson-Merrell’s NDA for Kevadon—the U.S. trade name for thalidomide—was only the second NDA Kelsey received, meaning she didn’t yet have a backlog of reviews to get through. For Kelsey and the other reviewers, thalidomide did not pass muster. Not only were there pharmacological problems, but Kelsey found the clinical trials to be woefully insufficient in that the physician reports were too few and they were based largely on physician testimonials rather than sound scientific study. She rejected the application.

Reports of the side effect peripheral neuritis—painful inflammation of the peripheral nerves—were published in the December 1960 issue of the British Medical Journal. This raised an even bigger red flag for Kelsey: “the peripheral neuritis did not seem the sort of side effect that should come from a simple sleeping pill.” 

She asked for more information from Merrell, who responded with another application merely stating that thalidomide was at least safer than barbiturates. Kelsey then sent a letter directly to Merrell saying that she suspected they knew of the neurological toxicity that led to nerve inflammation but chose not to disclose it in their application. Merrell grew increasingly upset that Kelsey would not pass their drug, which had been used in over 40 other countries at this point.

If neurological toxicity developed in adults who took thalidomide, Kelsey wondered: What was happening to the fetus of a pregnant woman who took the drug? Her concern hit on what would be the most sinister effect of thalidomide in other countries. 

Kelsey had asked these questions before. After getting her Ph.D. in 1938, she stayed on with Geiling. During World War II, Geiling’s lab joined the widespread effort to find a treatment for malaria for soldiers in wartime. Kelsey worked on the metabolism of drugs in rabbits, particularly an enzyme in their livers that allowed them to easily break down quinine. What wasn’t clear was how this enzyme broke down quinine in pregnant rabbits and in rabbit embryos.

Kelsey found that pregnant rabbits could not as easily break down quinine and that the embryos could not break it down at all. Though there was already some work being done on the effects of pharmaceuticals on embryos, it was not yet a well-researched area. 

By November of 1961, physicians in Germany and Australia had independently discovered birth defects in infants whose mothers had taken thalidomide during early pregnancy. In embryos, thalidomide could cause critical damage to organ development—even just one pill could result in infant deformities. And since many doctors prescribed thalidomide for the off-label treatment of morning sickness, 10,000 infants all over the world were affected, and countless others died in utero. 

Merrell eventually withdrew the application on its own in April of 1962. But the drug had already been distributed to “more than 1200 physicians, about 15,000-20,000 patients—of whom over 600 were pregnant,” according to Swann. In the U.S., 17 cases of birth defects were reported, but as Swann says via email, “that could have been thousands had the FDA not insisted on the evidence of safety required under the law (despite ongoing pressure from the drug’s sponsor).”

In 1962, soon after Merrell withdrew their application and the horrors of the drug became internationally known, Congress passed the Kefauver-Harris Amendment. This key amendment required more oversight for clinical studies, including informed consent by patients in the studies and scientific evidence of the drug’s effectiveness, not just its safety. In the wake of its passage, President Kennedy awarded Kelsey the President’s Award for Distinguished Federal Civilian Service, making her the second woman to receive such a high civilian honor. 

In her memoir, Kelsey says that the honor did not belong to her alone. “I thought that I was accepting the medal on behalf of a lot of different federal workers,” she writes. “This was really a team effort.” She was quickly promoted to chief of the investigational drug branch in 1963, and four years later she became director of the Office of Scientific Investigations—a position she held for 40 years until she retired at the age of 90. She died in 2015 at the age of 101.

Kelsey spent the majority of her life in public service, and her story continues to stand out as a testament to the essential role of the FDA in maintaining drug safety.

Ricardo Echalar-Outbreak Epidemics in a Connected World

National Museum of Natural History
Volunteer facilitator training for Outbreak: Epidemics in a Connected World

Dreading the Worst When it Comes to Epidemics

Smithsonian Magazine

So far the swine flu has frightened far more people than it has infected, but fear of a disease can be just as potent as the sickness itself. Outbreaks of plague in medieval Europe led to the murder or exile of Jews who had nothing to do with its spread. In the 20th century, the specter of contagion was used to turn impoverished immigrants away from Ellis Island, demonize gay men and discourage women from getting jobs and even wearing shorter skirts. “So often epidemics end up as campaigns to capitalize on people’s fears or spread prejudice or encourage one or another kind of injustice,” says Philip Alcabes, a public health professor at Hunter College of the City University of New York and the author of a new book, “Dread: How Fear and Fantasy Have Fueled Epidemics From the Black Death to Avian Flu.”

To understand the history of epidemics as cultural forces, Alcabes, an epidemiologist by training and an AIDS expert, delved into both scientific literature and works of fiction ranging from Albert Camus’s “The Plague” to Michael Crichton’s “The Andromeda Strain.” The story that a society tells itself about a disease, he discovered, is just as important as the disease’s actual mechanism. Often these narratives reveal a cultural unease that looms larger than the sickness – sexual anxiety, for instance, or suspicion of foreigners.

Though in recent years America has largely been spared from killer epidemics, the terminology has spread to cover a variety of non-contagious phenomena. The obesity epidemic. The autism epidemic. The drunk driving epidemic. Alcabes shared his thoughts on the swine flu “epidemic,” and on the history and psychology of that fearsome word:

What is an epidemic? And how is it different from a plain old disease?

If you’re an epidemiologist there’s a very simple answer – an epidemic is more than the expected number of cases of a particular disease in a given place and time. That’s easy. But that doesn’t describe what epidemics mean to people. A little more expansive definition is that an epidemic is a disaster of some kind, or, to get still more expansive, an epidemic is a perceived disaster. I write at the end of the book about autism, and autism as an epidemic. There is much more autism among children today than there was a generation or a couple of generations ago. On the other hand, the preponderance of evidence does not suggest that there’s something happening that’s making more kids be born with autism. The increase in autism seems to happen as a combination of expanding diagnosis and changing diagnostic patterns, plus better awareness of the problem and more awareness of what can be done for autistic kids. So there you could say what’s going on is perceptual.

Is swine flu an epidemic?

Yes, sure. Why? Because people are talking about it as an epidemic. And an epidemiologist would say that, since we have never seen cases of this strain before, as soon as we have seen some cases it’s an epidemic.

Can we learn anything about what’s going on now from the swine flu “epidemic” of 1976?

I believe there is much to be learned from what happened in 1976. Health officials were too quick to assume that we were going to see a repeat of 1918, the so-called Spanish flu epidemic (which killed millions). In 1976, officials pulled the switch too soon and called for mass vaccinations against this particular flu strain. And they did it because they had been convinced by some bad history that there was a great likelihood of a very severe and widespread flu epidemic at that time. As a result of this mass vaccination program, some people died. They died from Guillain-Barré syndrome (an immune system disorder) and no flu was prevented because there was no outbreak. There was the usual outbreak of garden-variety seasonal influenza but not of the new strain. For me there’s a lesson there. I think responding to flu requires balancing sound public health measures against the need to have some foresight. What happened there was the sound measures were outstripped by the desire to predict in advance of the facts.

People used to see epidemics as the work of God?

In many ancient cultures, it was assumed what we now call epidemics were random acts of God or gods that couldn’t be explained. In fact, a kind of philosophical advance that the ancient Hebrews brought was that disaster happened because God got angry (with people). These were real attempts to explain what happened on the basis of people’s actions. The leading example is the ten plagues in Exodus. God smites the Egyptians with these plagues because they won’t let the Hebrews go. The idea was that when there are natural disasters it’s not a random eruption of the spirit world but a predictable response by an angry deity.

The plague of Florence as described by Giovanni Boccaccio. (Bettmann / Corbis)

You say the Black Death was the archetypal epidemic.

We think of epidemics in the pattern of the Black Death. It comes suddenly, without warning, and causes great harm. And then it goes away. There are certain really terrible disease disasters that we don’t call epidemics. Worldwide there are about 1.8 million deaths per year from tuberculosis but we don’t say there’s a tuberculosis epidemic. We don’t talk about that as an epidemic because TB does the same thing year in and year out. There’s something about the sameness of that, the predictability of it, that makes us not consider it an epidemic.

How did medieval epidemics help strengthen communities?

The era of the plague starts in Europe in the mid-1300s and goes to about the year 1700. One of the things that’s remarkable is that at the same time as there were these florid and violent responses that I write about -- the burning of the Jews and hounding people out of their homes and exiling them from the land -- there were also very cogent and thoughtful communitarian responses, like quarantine. Communities decided to protect themselves by preventing goods from coming in or people from coming in, which in essence were the beginnings of public health intervention.

In the 20th century, how did epidemics impact the status of marginalized ethnic groups like Jews in Europe and Irish immigrants and blacks in America?

One of the themes that threads through the history of thinking about epidemics is this idea of fear or suspicion of foreigners or outsiders, fears about people who don’t seem to fit in. The Black Death example is the Christian townspeople in Western Europe who seized on Jews as the cause. Now they basically knew Jews weren’t the cause of the plague, but in many places nonetheless they either ran the Jews out of town or beat them or burned them to death. It was an expression of some unconscious, or not-so-unconscious, fear that I think was really about the stability of society. Fortunately we don’t see so much burning at the stake anymore when there are epidemics. But there’s still an impulse to fix on foreigners and outsiders as being suspect, as being somehow responsible. With cholera in the mid-19th century, the suspects were Irish immigrants. There was an outbreak of plague in San Francisco in 1900 that started in Chinatown. The plans for what to do about the plague were tied up with anti-immigrant sentiments, which focused on Chinese-Americans but also included Japanese-Americans.

How did dread of epidemics influence women’s place in society?

There are scholarly papers in peer-reviewed medical journals that attribute tuberculosis (in the 1920s) to the new trend of young women’s independence. Instead of staying home and finding a husband, they were going out, getting jobs, and particularly wearing abbreviated clothing. They go out, catch a chill and one thing leads to another, the thinking went. Was there real science behind this? Yes and no. But it really reflected a set of prejudices about women. You see that set of prejudices more generally in the context of sexually transmitted diseases. There’s a general implication that sexual women are dangerous in the history of disease control in America.

What fears did the AIDS epidemic reveal?

AIDS touched on a really essential tension that had to do with modernity or the nature of modern life toward the last quarter of the 20th century. The public health profession was feeling like contagion had been conquered, or could be. In the 1970s smallpox was eradicated, polio vaccines had diminished what had been a terrible scourge among children, there was vaccination for measles. It was a hopeful moment. At the same time that there was great faith in the advances of modernity, there was a feeling that maybe bad things were going to happen (because of modernity). That’s a persistent theme in western history, that something we’re doing, something that our parents or our grandparents didn’t do having to do with piety or sex or diet, somehow means we’ll “reap the whirlwind.” Then AIDS comes, and people talk about homosexual men like they're getting their comeuppance. Jerry Falwell even used that term about gay men “reaping the whirlwind.” As if something about the sexual revolution, the post-Stonewall moment, when people were able to come out as gay, had threatened society and society was now being punished. The response to AIDS was fraught with all sorts of ideas about what society was like, and a lot of that was about sex and sexuality, but more generally it was about the sexual revolution, the idea of tolerance of homosexuality, which was still a pretty new thing in those days. And it allowed people to talk about sex.

Can the post-9/11 anthrax “epidemic” be seen as a social coping mechanism?

Living in New York in the fall of 2001, I was really struck by a contrast of (reactions). On the one hand, the World Trade Center had fallen down, 2,700 fellow New Yorkers had just died, but the mood in the city was this kind of “keep on keeping on” circumspection. A month afterward there was the postal anthrax event, and the response to that was such a dramatic contrast. There were five deaths, and that’s sad and terrible for the families of the people who died – but that’s five, not 2,700. Yet in response to anthrax, people would come up to me and say “I’m ironing my mail” or “I’m not opening my mail at all.” Buildings got evacuated whenever somebody saw some white powder. I mean, it was nutty. You would have thought there would have been a nutty response to two iconic towers getting knocked down by planes, which seemed like a science fiction scenario, a horror story scenario. And yet the craziness was in response to anthrax.

Why don’t you think we should bother planning a great deal for the next plague?

We should plan very carefully for the things we know about. For instance, it seems reasonable that if you don’t inspect food supplies for contamination, some food will be contaminated and there will be outbreaks of salmonellosis. That’s the planning I would like to see be done. What concerns me more is the kind of planning that “this might happen” and “it might lead to that” and “it might lead to a third thing” -- scenarios that seem like a stretch. It's kind of like speculation times speculation. We need more real public health planning and less “preparedness.”

Sarah Paige-Outbreak Epidemics in a Connected World

National Museum of Natural History
Volunteer facilitator training for Outbreak: Epidemics in a Connected World

John T Brooks-Outbreak: Epidemics in a Connected World Training

National Museum of Natural History
Volunteer Facilitator training for Outbreak: Epidemics in a Connected World