The Whole Gory Story: Vampires on Film

Smithsonian Magazine

With Halloween on the horizon, I had to check out the "Vampires on Film" lecture, courtesy of the Smithsonian Resident Associate Program. The speaker was movie maven and scholar Max Alvarez. It was a well-attended, three-hour tour of horror flicks that make for—more often than not—painfully bad cinema. Yet, after kicking off his lecture by decorating his podium with several heads of garlic, Alvarez lent gravitas to these movies, elevating them from mere midnight movie schlock to a study in cultural currency—meaning that vampire stories change and evolve with new images and metaphors for each generation viewing them.

In Western culture, tales of vampirism begin in the plague-addled Europe of the Middle Ages, where newly buried bodies were exhumed and those considered not sufficiently decomposed were desecrated—by way of beheading or a good ol’ stake through the heart—for fear that the undead would spread disease among the living. (Trick or Treat?)

What’s worse is that some persons were prematurely interred—hence, their “as yet not-dead bodies” were in fabulous condition—and they ultimately met excruciatingly violent ends. Hands-down, this was the scariest part of the lecture.

By the late 1800s, vampire stories appear in print and theatrical incarnations (such as the 1828 opera Der Vampyr and the 1872 novella Carmilla). But it is Bram Stoker’s 1897 novel Dracula that sets the gold standard for the genre and captures the imaginations of people across the globe. Like its folkloric antecedents, Dracula is a sign of the times, dealing with issues of sex (which was strictly repressed in Victorian society), xenophobia and, in lieu of plague, syphilis, the dreaded STI du jour.

It is Stoker’s vision of the vampire that first makes it to the silver screen, the earliest surviving adaptation being F.W. Murnau’s Nosferatu. But the one that set the world on fire, kicking off a craze, was Tod Browning’s 1931 film Dracula, starring Bela Lugosi. Like its literary inspiration, Dracula and its string of cinematic spinoffs dealt with those things you generally don’t bring up in polite conversation—namely human sexuality—and titillated audiences.

After a hiatus in the ’40s and ’50s, the genre was rekindled in the ’60s. With sex becoming less taboo, vampire movies had to start exploring new frontiers. Of note is the 1973 film Blood for Dracula, wherein the Count is exposed to impure blood and becomes gravely ill, as if the film were anticipating the AIDS epidemic that would sweep the world in the 1980s. Indeed, as a character in cinema, the vampire was evolving from a one-dimensional villain into a multifaceted character that could even be seen working for the forces of good (as in Blade or Underworld).

While the genre has lost much of the subtlety and gothic trappings of the classic horror films, vampires endure as fodder for high-octane action flicks, jam-packed with as much violence and gore as an R rating can withstand. However, they can also be seen in more playful fare. (Buffy the Vampire Slayer, anyone?)

What's your favorite vampire film? What interesting things do you see happening within the genre that keep it from going six feet under? Do you have high hopes for the upcoming film adaptation of the best-selling novel Twilight? And why do you think we so rarely see vampire stories told by way of animation?

Image from F.W. Murnau's Nosferatu (1922)

The Woman Who Stood Between America and an Epidemic of Birth Defects

Smithsonian Magazine

In 1960, America had a stroke of luck. That was when the application to begin mass-marketing the drug thalidomide in the United States landed on the desk of Frances Oldham Kelsey, a reviewer at the Food and Drug Administration. Today we know that the drug causes severe, devastating birth defects when taken for nausea by pregnant women. But at the time, thalidomide’s darker effects were just becoming known.

Between 1957 and 1962, the sedative would leave thousands of infants in Canada, Great Britain and West Germany permanently and tragically disabled. The U.S., however, never had a crisis of thalidomide-linked birth defects on that magnitude. Why not?

What stood between the drug and the health of the American public was none other than Kelsey and the FDA. As a medical reviewer, Kelsey had the power to keep a drug off the market if she found its application lacking. After a thorough review, she rejected the application for thalidomide on the grounds that it lacked sufficient evidence of safety from rigorous clinical trials.

Today we take it for granted that the FDA wisely spurned an unsafe drug. But in many ways, Kelsey’s education and experience up to that point made her especially well-suited for her position as a medical reviewer—and, in particular, for the thalidomide application.

After completing a master’s degree in pharmacology at McGill University in her home country of Canada, Kelsey was encouraged by her graduate advisor to write to a Dr. Eugene Geiling at the University of Chicago to inquire about a research assistant position and to express her interest in obtaining a PhD. Geiling, a medical officer at the FDA known for his studies of the pituitary gland, wrote back offering Kelsey a research assistantship and a scholarship for doctoral study. In 1936, Kelsey joined Geiling at the University of Chicago.

That consequential step in Kelsey's career may have been due to a fortuitous error on Geiling's part. In her short memoir “Autobiographical Reflections,” Kelsey describes Geiling as “very conservative and old-fashioned,” noting that “he really did not hold too much with women as scientists.” This might explain why Geiling addressed his response letter to Kelsey to “Mr. Oldham”—believing her to be a man. Kelsey said she continued to wonder “if my name had been Elizabeth or Mary Jane, whether I would have gotten that first big step up.”

Kelsey was first introduced to the dangers of mass-marketed unsafe pharmaceuticals in 1937, when the FDA enlisted Geiling to solve the mystery of Elixir Sulfanilamide. Sulfanilamide effectively combated infections, but it came as large, bitter pills that had to be taken in large doses. To make the drug more appealing, especially to children, the manufacturer added it to a solvent with artificial raspberry flavor.

The problem was that the solvent they chose was diethylene glycol—commonly known as antifreeze. Between September and October of that year, the drug killed 107 people.

Geiling and his lab of graduate students, including Kelsey, set out to determine what exactly in the elixir was killing people: the solvent, the flavor or the sulfanilamide. Through a series of animal studies—which at the time were not required by federal law for a drug to go to market—Geiling and his lab were able to determine that it was the diethylene glycol that was the cause of death. 

The public outcry over this tragedy prompted Congress to pass the Federal Food, Drug, and Cosmetic Act of 1938, which added a New Drug section requiring manufacturers to present evidence that a drug was safe before going to market. Though this new law “provided for distribution of a new drug for testing purposes,” FDA historian John Swann says, “the law did not provide in any explicit or detailed way how oversight of that testing should be conducted.” In other words, clinical trials continued to receive little to no oversight.

In 1962, President John F. Kennedy honored Kelsey for her work blocking the marketing of thalidomide. (Food and Drug Administration)

Kelsey graduated from medical school in 1950 and went on to work for the Journal of the American Medical Association before starting as a medical reviewer at the FDA in 1960. As a reviewer of New Drug Applications (NDAs), she was one of three people charged with determining a drug’s safety before it could be made available for public consumption. Chemists reviewed the chemical makeup of the drug and how the manufacturer could guarantee its consistency, while pharmacologists reviewed animal trials showing that the drug was safe.

Though this appears to be a rigorous and thorough process of checks and balances, Kelsey admitted to some weaknesses in her memoir, including the fact that many of the medical reviewers were part-time, underpaid, and sympathetic to the pharmaceutical industry. The most troubling deficiency in the process was the 60-day window for approving or rejecting drugs: if the 60th day passed, the drug would automatically go to market. She recalls that this happened at least once.

Fortunately, drug manufacturer Richardson-Merrell’s NDA for Kevadon—the U.S. trade name for thalidomide—was only the second NDA Kelsey received, meaning she didn’t yet have a backlog of reviews to get through. For Kelsey and the other reviewers, thalidomide did not pass muster. Not only were there pharmacological problems, but Kelsey found the clinical trials woefully insufficient: the physician reports were too few, and they were based largely on testimonials rather than sound scientific study. She rejected the application.

Reports of the side effect peripheral neuritis—painful inflammation of the peripheral nerves—were published in the December 1960 issue of the British Medical Journal. This raised an even bigger red flag for Kelsey: “the peripheral neuritis did not seem the sort of side effect that should come from a simple sleeping pill.” 

She asked Merrell for more information, and the company responded with another application merely stating that thalidomide was at least safer than barbiturates. Kelsey then sent a letter directly to Merrell saying that she suspected they knew of the neurological toxicity that led to nerve inflammation but had chosen not to disclose it in their application. Merrell grew increasingly upset that Kelsey would not pass their drug, which by this point had been used in over 40 other countries.

If neurological toxicity developed in adults who took thalidomide, Kelsey wondered: What was happening to the fetus of a pregnant woman who took the drug? Her concern hit on what would be the most sinister effect of thalidomide in other countries. 

Kelsey had asked these questions before. After getting her Ph.D. in 1938, she stayed on with Geiling. During World War II, Geiling’s lab joined the widespread wartime effort to find a malaria treatment for soldiers. Kelsey worked on the metabolism of drugs in rabbits, particularly an enzyme in their livers that allowed them to easily break down quinine. What wasn’t clear was how this enzyme broke down quinine in pregnant rabbits and in rabbit embryos.

Kelsey found that pregnant rabbits could not as easily break down quinine and that the embryos could not break it down at all. Though there was already some work being done on the effects of pharmaceuticals on embryos, it was not yet a well-researched area. 

By November of 1961, physicians in Germany and Australia had independently discovered birth defects in infants whose mothers had taken thalidomide during early pregnancy. In embryos, thalidomide could cause critical damage to organ development—even just one pill could result in infant deformities. And since many doctors prescribed thalidomide for the off-label treatment of morning sickness, 10,000 infants all over the world were affected, and countless others died in utero. 

Merrell eventually withdrew the application on its own in April of 1962. But the drug had already been distributed to “more than 1200 physicians, about 15,000-20,000 patients—of whom over 600 were pregnant,” according to Swann. In the U.S., 17 cases of birth defects were reported, but as Swann says via email, “that could have been thousands had the FDA not insisted on the evidence of safety required under the law (despite ongoing pressure from the drug’s sponsor).”

In 1962, soon after Merrell withdrew their application and the horrors of the drug became internationally known, Congress passed the Kefauver-Harris Amendment. This key amendment required more oversight for clinical studies, including informed consent by patients in the studies and scientific evidence of the drug’s effectiveness, not just its safety. In the wake of its passage, President Kennedy awarded Kelsey the President’s Award for Distinguished Federal Civilian Service, making her the second woman to receive such a high civilian honor. 

In her memoir, Kelsey says that the honor did not belong to her alone. “I thought that I was accepting the medal on behalf of a lot of different federal workers,” she writes. “This was really a team effort.” She was quickly promoted to chief of the investigational drug branch in 1963, and four years later she became director of the Office of Scientific Investigations—a position she held for nearly 40 years until she retired at the age of 90. She died in 2015, at the age of 101.

Kelsey spent the majority of her life in public service, and her story continues to stand out as a testament to the essential role of the FDA in maintaining drug safety.

The World's Megacities Are Making Dengue Deadlier

Smithsonian Magazine

While the world’s attention is focused on the Zika virus spreading through the Americas, large urban areas in Southeast Asia are fighting off outbreaks of dengue fever. The mosquito-borne illness causes high fever, rash and debilitating joint pain, and it can develop into a more severe and lethal form. An epidemic this past October swept through New Delhi, sickening more than 10,000 people and killing 41, overwhelming the city’s hospital capacity.

The two species of mosquito primarily responsible for transmitting dengue, Aedes aegypti and A. albopictus, live in close proximity to humans. Our homes are their homes. In urban areas, where most dengue transmission happens, recent housing booms have provided more places not only for humans to live, but also for these mosquitoes. The influx of people, increased construction and ongoing travel of humans and mosquitoes around the world have led to a 30-fold rise in urban dengue outbreaks between 1960 and 2010, according to the World Health Organization.

Fighting this problem will mean combining some of the world’s most basic public health measures, like plumbing and sanitation, with high-tech vaccines and mosquito control measures. The goal is to provide a better home for humans while kicking mosquitoes to the curb. It will be tough, says infectious disease expert Duane Gubler of the Duke-NUS Graduate Medical School in Singapore. But he believes that this dual focus may finally provide traction against the deadly disease.

“If you can decrease the mosquito population while increasing herd immunity, you can decrease transmission and prevent epidemics,” Gubler says.

An Aedes aegypti mosquito sucks blood out of a human. A. aegypti is the carrier of many diseases, including dengue, and is adapted to live among humans in dense cities. (James Gathany/CDC)

Urban centers have long been magnets for infectious disease. As soon as humanity started living in large cities, epidemics swept through the population, creating death and misery on a scale seldom seen.

Then, as now, epidemics required the confluence of a large pool of uninfected, non-immune people with suitable conditions for the pathogen to spread. For vector-borne diseases, that also meant the presence of the mosquito, tick or flea that helped to move the infection from person to person. Large cities placed all of these factors in close proximity, and the results were catastrophic. Early epidemics of plague and smallpox in ancient Rome, for instance, killed approximately half the population.

More than half of humanity now lives in cities, and that percentage is growing. As more people leave their agrarian pasts for the promise of the big city, many urban centers have boomed into mega-metropolises of more than 20 million people. This rapid influx of people has led to burgeoning slums in the world’s biggest cities, as well as new construction in middle- and upper-class neighborhoods. 

The emergence of the modern megalopolis shows that humans ultimately adapted well to their crowded surroundings, but the same has been true for our microscopic pathogens.

Dengue began as a disease of primates that was transmitted in the forests of Africa by mosquitoes. The virus adapted to humans, as did the A. aegypti mosquito, which passed the virus from host to host in its saliva. As humans moved to small villages, the mosquito and the viruses it carried moved with us, causing small outbreaks of dengue.

The African slave trade transported the mosquito, which laid its eggs in the water casks aboard ship, and diseases like dengue, malaria and yellow fever spread around the world. Many of the world’s first large cities were shipping hubs in warm, humid areas, making them conducive to the spread of tropical diseases.

Still, before World War II, outbreaks of dengue occurred only every 10 to 40 years and rarely caught the attention of physicians or public health officials, Gubler says. Then the effects of dengue and other mosquito-borne diseases on military personnel brought dengue back to the forefront, as did the post-war population boom in Southeast Asia and its accompanying rush of urbanization. This change transformed dengue from a tropical rarity into a major urban pathogen.

Initial investment in mosquito control programs slowed the transmission of dengue, but budget cuts in the 1970s and '80s forced health departments to scale back these programs. At the same time, rapid globalization moved people and pathogens around the world faster than ever before.

The world’s megacities also pose another type of risk. The dengue virus has four different subtypes, and infection with one type doesn’t make you immune to any of the others. It’s one of the factors making a dengue vaccine so hard to produce. Even worse, a second infection with the dengue virus isn’t just an inconvenience; it’s potentially deadly. Because the immune system has already seen a closely related virus, it overreacts when it responds to a second dengue subtype. The result is dengue hemorrhagic fever, in which that overreactive immune response causes severe internal bleeding and death.

Massive cities are more likely to have multiple subtypes of dengue circulating at the same time, increasing the chances of developing dengue hemorrhagic fever. The result is the explosive dengue outbreaks that now regularly strike tropical cities like New Delhi, São Paulo and Bangkok. Dengue is an annual problem in New Delhi, with cases climbing after the yearly monsoon season and peaking in early fall.

Exactly how many people are affected by these outbreaks isn’t clear because a large proportion of disease occurs in resource-poor settings where epidemiological surveillance is limited at best, says Narendra Arora, a pediatrician and infectious disease researcher with the INCLEN Trust in India. In addition, the symptoms of dengue closely match those of other tropical diseases like malaria and chikungunya.

The World Health Organization had estimated that 20,474 people in India are sickened by dengue each year, but a 2014 study in the American Journal of Tropical Medicine & Hygiene by Arora and Donald Shepard at Brandeis University showed that the number is likely closer to 6 million—nearly 300 times the WHO estimate.

“It showed we really don’t know how much dengue there is. We need to know more about how much of a problem it is,” says Carl-Johan Neiderud, a medical microbiologist at Uppsala University in Sweden.

The view from the Jama Masjid Mosque in New Delhi, India. New Delhi and its suburbs rank among the largest megacities in the world, with more than 25 million people living there. (Kiedrowski, R./Corbis)

Few countries have managed to control dengue permanently, but those with some success have focused on mosquito control.

Unfortunately, anti-malaria measures such as insecticide-treated bed nets aren’t effective against dengue because A. aegypti is active during the day, not at night like the malaria-carrying mosquitoes. A. aegypti is also quite content to live its entire life indoors, and it can breed in very small volumes of water. Its eggs can withstand desiccation for several months, making it easy for the species to survive temporary dry spells. That means standing water at construction sites and in slums provides mosquitoes with the perfect places to live and reproduce.

In the recent outbreak in New Delhi, news reports linked the construction of one new apartment complex with a large cluster of dengue cases. Scientists hypothesized that mosquitoes breeding in pools of water in the construction site were fueling dengue cases nearby.

Arora says these new construction sites are not a leading cause of the past year’s outbreak, though he acknowledges that they may have contributed. Instead, he says that inadequate and nonexistent sanitation in New Delhi’s many slums is a far larger contributor to outbreaks. Many of the workers on these projects arrive from other parts of India that see fewer dengue cases, so they are very likely to lack immunity to the dengue virus. They also tend to live in the slum areas, further exacerbating the problem. 

To Arora, going back to public health basics like improving plumbing and sanitation is the first step. He also cites increasing enforcement of an Indian law that prohibits standing water in residential properties and yards. Fines for violators have been encouraging residents to take sanitation issues more seriously and remove potential mosquito breeding grounds from their homes. Other measures, such as installing or fixing window screens and repairing doors and siding where mosquitoes can enter, will also help provide a barrier between humans and mosquitoes.

“It is not just the aesthetics of the place. A cleaner India will have a tremendous health impact,” Arora says.

Gubler cites Singapore as an example of effective dengue control. A combination of public education campaigns and larval and insect control measures have helped keep the city dengue-free for nearly 20 years. Although the countries around Singapore were succumbing to regular outbreaks, “Singapore remained a little island in a sea of dengue,” he says. “But you need political will and economic support for these programs to work. It’s a battle between economics and public health, and public health always loses.”

Clinical trials of new dengue vaccines are ongoing, and three candidates are approaching formal approval. Meanwhile, trials in Brazil and Florida are testing the effectiveness of genetically engineered sterile male mosquitoes, providing another new tool in the war on dengue. Gubler is optimistic: “For the first time in my career, we have the tools to control this disease.”

The human obesity epidemic, the mismatch paradigm, and our modern "captive" environment

Smithsonian Libraries

In the distant past obesity in humans was rare and likely caused by metabolic dysregulation due to genetic or disease-related pathology. External factors precluded the ability of most people to overeat or under-exert. Socio-cultural obesity came about due to the rareness of obesity and its difficulty to achieve: what is rare becomes valuable and what is difficult to achieve becomes a badge of prestige. The modern human obesity epidemic would appear to represent a third class of obesity: environmental obesity. Much like the captive environments which humans construct for the captive/companion animals in our care, the modern human environment has greatly decreased the challenges of life that would restrict food intake and enforce exertion. And like us, our captive/companion animal populations are also experiencing obesity epidemics. A further concern is that maternal obesity alters maternal signaling to offspring, in utero through the placenta and after birth through breast milk, in ways that perpetuate an enhanced vulnerability to obesity. Molecules such as leptin, produced by adipose tissue and placenta, have significant developmental effects on brain areas associated with feeding behavior. Leptin and other cytokines and growth factors are found in breast milk. These molecules have positive effects on gut maturation; their effects on metabolism and brain development are unclear. Placenta and brain also are hotspots for epigenetic regulation, and epigenetic changes may play significant roles in the later vulnerability to obesity and to the development of a diverse array of diseases, including heart disease, hypertension, and noninsulin-dependent diabetes. Am. J. Hum. Biol., 2012. © 2012 Wiley Periodicals, Inc.

The most radical thing about Stonewall wasn’t the uprising

National Museum of American History

The Stonewall uprising began June 28, 1969, in response to a police raid at The Stonewall Inn, a gay bar in New York, and has since been commemorated around the world with pride parades and other events. Curator Katherine Ott reflects on the significance of the uprising.

I’m a Stonewall skeptic. I don’t doubt that it happened, but I question how it has been used over the years. Because this is a big anniversary year, there is a compulsion to heroize the people who were there and elevate the event.

Those sweaty summer nights of rebellion were certainly important and unique and have reverberated for 50 years. However, an event like the Stonewall uprising was inevitable—young people with 1960s political impatience and righteous indignation had a lot of LGBTQ+ history to fuel them. Other protests and acts of resistance had already happened in places such as Philadelphia, Los Angeles, and San Francisco. Much of the staying power of Stonewall’s reputation rests upon the Pride marches that began on the first anniversary of the uprising.

Donation can from the Christopher Street Liberation Day March, the first Pride march, New York City, 1970. Gift of Mark Segal.

Stonewall’s outsized fame has a downside: it both skews understanding of LGBTQ+ history and misrepresents how historical change comes about. There is no universal LGBTQ+ history in which any one event is primary. The only commonality in LGBTQ+ life is the risk people take in being themselves.

Stonewall is often pointed to as the birth of the modern gay rights movement or the biggest news in LGBTQ+ history. But that is not accurate. For many gender-non-conforming people, Stonewall had little effect or held no interest. For many disabled LGBTQ+ people, change has been glacial—many people were institutionalized in the 1960s and still constitute a large percentage of those incarcerated. The largest psychiatric facilities today are prisons. In the 1960s, many people of color were putting their energy into civil rights work, antiwar activism, or the Chicano Movement. People living in small towns and rural areas outside of the metropoles of New York, San Francisco, or Chicago did not hear about what happened in New York City or take it up as a rallying cry.

Rainbow wheelchair button, 2016.

Some 12 years after Stonewall, the AIDS epidemic more broadly modernized the gay rights movement and propelled gay liberation by decimating and restructuring communities, creating solidarity, and necessitating out-of-the-box confrontations.

"AIDS: Bearing Angry Witness," by Jennifer Camper, printed in The Blade. HIV and AIDS have had a profound effect on communities, science, medicine, social services, and everyday behavior. Courtesy of John-Manuel Andriote Victory Deferred Collection, Archives Center.

We often think of history as if testosterone-fueled events such as battles, riots, and assassinations were the source of lasting change. Violent outbreaks are dramatic, and the pain that comes in their wake is attention-grabbing. But real change generally does not come about in a moment. It happens over time and is sustained by people who hold on to an idea and push it forward: the World War II soldiers who came out to each other and stayed out, the 1950s and ’60s journalists who mailed their newsletters in plain brown wrappers, the court cases, picketing, cafeteria rebellions, and everyone who showed up to challenge ignorance. Before Stonewall, there were dozens of legal actions around jobs, marriage, housing, and the right to be yourself. Violence may accompany change, but it does not sustain it.

Mattachine Review, May–June 1955. The cover story discusses one of the many police raids.
Picket signs carried by protestors at the White House and Independence Hall in Philadelphia, 1960s. Frank Kameny Collection.

For me, the reason to remember the Stonewall uprising is to recognize the daily acts of courage the rioters took that got them to the bar that night. It is the multiple, unremarkable moments of breathing in “Yo Soy, I am” that people on the margins take every day that are the watershed for change.

Katherine Ott is a curator in the Division of Medicine and Science. She has also blogged about objects she collected from the parents of Matthew Shepard and collecting LGBTQ+ objects of the past.

A display titled Illegal to Be You: Gay History Beyond Stonewall is currently on view at the museum.

Posted Date: 
Friday, June 21, 2019 - 10:30

There Aren’t Enough Patients for Ebola Drug’s First Clinical Trial

Smithsonian Magazine

A handful of Ebola treatments have been fast-tracked through the many trials needed for new drugs in order to help fight the epidemic in West Africa. However, the apparent waning of new infections in Liberia has halted one drug’s clinical trial, and the other treatments are running into the same problem—not enough patients, reports Andrew Pollack for the New York Times.

The drug developer, Chimerix, announced the study’s halt late last week. The plan had been to test the antiviral drug brincidofovir, and doses for 140 patients had been shipped to Liberia. But fewer than 10 patients had received the treatment since January 2, when the trial started, a company executive told the Times.

Brincidofovir was one of a handful of experimental drugs approved by the World Health Organization for testing during the epidemic. Other drugs are being tested: another antiviral called favipiravir (originally a flu drug) has started a clinical trial in Guinea. A third option, using transfusions of survivors’ blood, has been used before but never properly assessed for effectiveness. Armand Sprecher, with Doctors Without Borders, told the Times that blood plasma trials hadn’t yet started at their clinics. However, another group has been testing the treatment in Liberia and may soon start in Sierra Leone.

It may be the end of testing for brincidofovir. “I think for now our plan is not to pursue clinical trials,” says Chimerix’s chief executive, Michelle Berrey. “We’ll wait and see how the outbreak goes.”

Vaccines are also in the works to prevent infection in the first place. The first major vaccine trials began today, February 2, in Liberia, reports BBC News. According to Al Jazeera, the launch of the trial was accompanied by music:

In a densely populated neighbourhood of Monrovia, guests clapped, danced along and nodded as musicians sang lyrics on Sunday that explained the purpose and intent of the Ebola vaccination trial.

The singing was part of a campaign to overcome Liberians' reluctance to embrace the vaccines amid conspiracy theories.

If the waning numbers of new infections truly indicate an ebb in the epidemic, even the vaccine trials may face trouble continuing. But as Sprecher told the Times, "It’s more important to end the outbreak than to get the trial done."

These Are the World’s Most Dangerous Emerging Pathogens, According to WHO

Smithsonian Magazine

International officials recently gathered to discuss one of the biggest threats facing humanity—and this wasn’t the Paris climate talks. As Science’s Kai Kupferschmidt reports, the setting was Geneva, Switzerland and the task was the selection of a shortlist of the world’s most dangerous emerging pathogens. These diseases are considered by a World Health Organization (WHO) committee of clinicians and scientists to be the pathogens “likely to cause severe outbreaks in the near future, and for which few or no medical countermeasures exist.” Here’s the WHO’s list, and what you should know about these scary diseases:

Crimean Congo hemorrhagic fever

This tick-borne fever got its name from the Crimea, where it first emerged in 1944, and the Congo, where it spread in 1969. Now, it can be found all over the world, though it primarily occurs in Asia. The disease is often misnamed “Asian Ebola virus” for its fast-moving effects, which include enlargement of the liver, fever, aching muscles and vomiting.

Outbreak News Today’s Chris Whitehouse writes that CCHF is currently spreading across India, where agricultural workers are often exposed to diseased, tick-bearing animals. According to the WHO, outbreaks of the disease can have a fatality rate of up to 40 percent. There is no human vaccine for CCHF, though at least one candidate has been shown to be effective in animals.

Ebola virus disease

It’s no surprise to see Ebola virus disease on the list—it has been ravaging African countries for decades, with widespread outbreaks throughout West Africa and the recent resurgence in Liberia. Also known as Ebola hemorrhagic fever, the disease has an average fatality rate of 50 percent, though rates have reached 90 percent in some outbreaks.

Though it is still unclear exactly how the virus is transmitted, scientists believe that bats serve as a natural “reservoir” for Ebola, which is then transmitted through some form of contact. There are no currently licensed vaccines, but clinical trials for at least two are underway.

Marburg hemorrhagic fever

In 1967, a mysterious disease broke out in Europe, killing laboratory workers who had been exposed to monkeys from Uganda. The cause, Marburg virus, was named after the German city where it was first detected and is a filovirus—a member of the family of viruses that includes Ebola.

Marburg virus has broken out only sporadically since the 1960s, but occurs in people who have spent time in caves frequented by Rousettus bats. Marburg causes a rash, malaise and bleeding and is often misdiagnosed. There is no current vaccine or treatment.

Lassa fever 

First diagnosed in Lassa, Nigeria, Lassa fever can be difficult for doctors to diagnose and becomes symptomatic in only 20 percent of the people who become infected, according to the WHO. When it does strike, patients can move from mild effects like a slight fever to, in more severe cases, hemorrhaging, encephalitis and shock. But the fever’s most devastating and common complication is deafness. About a third of all Lassa patients lose some or all of their hearing.

Lassa fever is primarily found in West Africa and is contracted when people come into contact with the waste of infected Mastomys rats or the bodily fluids of those with the disease. Though the antiviral drug ribavirin may be effective in Lassa fever cases, there is no current vaccine.

MERS and SARS coronavirus diseases

Middle East Respiratory Syndrome (MERS) and Severe Acute Respiratory Syndrome (SARS) have had their fair share of media coverage. They’re members of the coronavirus family—viruses that usually cause upper respiratory illness. Though MERS transmission seems to come from infected camels, both diseases are easy to catch from infected people’s coughs or sneezes.

Both conditions emerged relatively recently. SARS broke out in Asia in 2003, but the global outbreak was contained and no cases have been reported since 2004. The news isn’t that great concerning MERS: The disease, which started in Saudi Arabia in 2012, broke out again in South Korea this year. The WHO reports that 36 percent of reported patients die. Health officials tell SciDevNet that it’s unlikely a vaccine will be developed anytime soon.

Nipah and Rift Valley fever

The final two entries on the WHO’s list are viruses from animals—Nipah virus infection and Rift Valley fever. Nipah was first identified in 1998, when Malaysian pig farmers fell ill. To stop the outbreak, the Malaysian government ordered the euthanasia of over a million pigs. Even so, the virus later showed up in Bangladesh and India. Nipah causes brain inflammation, convulsions and even personality changes.

Rift Valley fever originated with Kenyan sheep farmers in 1931 and has since been identified in outbreaks throughout Africa. The disease is spread by handling diseased animal tissue, drinking infected milk or being bitten by infected mosquitoes. However, the WHO has never documented a case of human-to-human transmission. The disease causes symptoms similar to meningitis and can be hard to detect in its early stages. Though most people get a milder version of the disease, others aren’t so lucky: around eight percent of patients develop ocular disease or brain inflammation and can eventually die. Neither Nipah nor Rift Valley fever has a currently approved human vaccine.

Though the diseases on this list were identified as the most likely to cause widespread epidemics, the WHO also designated three other diseases as “serious”: chikungunya, severe fever with thrombocytopenia syndrome, and Zika. Diseases like malaria and HIV/AIDS were not included because of already established disease control and research into treatment and prevention.

This 1,500-Year-Old Skeleton May Belong to the Man That Brought Leprosy to Britain

Smithsonian Magazine

In the early 1950s, workers digging for gravel uncovered skeletons of people interred in an Anglo-Saxon cemetery some fifteen centuries before. At the time, the team noted that the bones of one man in particular had joint damage and the narrow toe bones typically caused by leprosy. When researchers recently reanalyzed those same bones using modern techniques, they realized the man may have had the first known case of the disease in Britain. On top of that, other tests show that he was probably from Scandinavia, not Britain.

The researchers were able to gather some bacterial DNA from the bones and sequence it, reports Maev Kennedy for The Guardian. The genetic fingerprint they found was that of a leprosy strain belonging to the lineage 3I, which has been found at other burial sites in Scandinavia and southern Britain, but at later dates. The man likely died in the 5th or 6th century.

“The radiocarbon date confirms this is one of the earliest cases in the UK to have been successfully studied with modern biomolecular methods,” says Sonia Zakrzewski of the University of Southampton in a press release. “This is exciting both for archaeologists and for microbiologists. It helps us understand the spread of disease in the past, and also the evolution of different strains of disease, which might help us fight them in the future.”

The research team also analyzed elements in the man’s teeth. Specifically, they looked at several isotopes—an element’s atoms can have different numbers of neutrons, and each variation is a different isotope. They measured the ratio of oxygen isotopes, which reflect those found in the water he drank, and strontium isotopes in his enamel, which reflect the geology of his homeland, explains Maddie Stone for Vice. This analysis told the researchers that the man likely came from Scandinavia. He may have carried the disease to Britain from there. When he died, he was in his 20s, the researchers report. They published their findings in PLOS One.

The 3I leprosy strain is one of five strains found around the world. It gave rise not only to the leprosy of the British Isles but also to the strain found in the southern U.S. (where it’s often carried by armadillos) and the one still present in the U.K. today. However, the leprosy epidemic didn’t peak in Europe until the 13th century. If the man had seen a physician in his new country, the doctor wouldn’t have recognized the deformations and scaly skin of a leprosy infection. Perhaps he would have escaped the social stigma that later arose around the disease, too.

This man isn't the first person in the world to get leprosy, explains Stone. "There are a handful of cases worldwide that predate this young man, including several from second century BC Egypt, first century AD Israel, and 1st through 4th century AD Uzbekistan," she writes. But he is the first known case in Britain. 

The team’s project leader, Sarah Inskip of Leiden University, told Stone: “We plan to carry out similar studies on skeletons from different locations to build up a more complete picture of the origins and early spread of this disease.”

This Color Is Who I Am

Smithsonian Institution

Artist Frank Holliday's social circle in the 1980s was a who's who of New York City cool: Andy Warhol, Cyndi Lauper, RuPaul, Keith Haring, and even Madonna. But Frank's odyssey through the art world also placed him at the center of an epidemic that would shake the entire country. In honor of World AIDS Day, Sidedoor takes a look at America's early HIV/AIDS Crisis through the eyes of an artist whose life and work were changed by it forever. 

This episode features recordings from the "Visual Arts and the AIDS Epidemic" Oral History Project produced by the Smithsonian’s Archives of American Art.

This Exhibition Uses $586 to Tell the Story of American Eviction

Smithsonian Magazine

For his Pulitzer Prize-winning book, Evicted: Poverty and Profit in the American City, the sociologist Matthew Desmond followed eight families living in Milwaukee’s poorest neighborhoods in 2008 and 2009.

One of Desmond’s subjects, Lamar, who is a black single father, a Navy veteran and a double amputee, made $628 a month (roughly $7,500 a year). With his monthly rent at $550, he had just $2.19 budgeted per day to spend on his family.

When Lamar fell behind on his payments, he became one of the faces of an estimated 3.7 million Americans who have experienced an eviction, according to an analysis by Apartment List last year.

In the new exhibition Evicted, the National Building Museum in Washington, D.C., brings that story of American eviction to the forefront by turning Desmond’s book into an installation.

As Kriston Capps reports for CityLab, the house-like structure, erected with particleboard purchased at Home Depot, cost $586 to build—approximately the amount Lamar makes in one month. As Capps explains, the curatorial interpretation of Evicted “distills the policy analysis of Desmond’s book to three critical points: Incomes are stagnant, rents are rising, and the government is not filling the gap.”

The installation uses infographics from the National Low Income Housing Coalition and the Center on Budget and Policy Priorities, audio interviews, photographs, and excerpts from Evicted to drive those points home.

According to Apartment List’s findings, more than a quarter of renters whose income falls below $30,000 were unable to pay their rent in full at least once during the three consecutive months surveyed. The report also found that evictions disproportionately affect African Americans: about 12 percent of black respondents said they had faced an eviction, compared to just 5.4 percent of white respondents.

“What I want people to get out of this exhibit is an introduction to the affordable housing crisis and the eviction epidemic,” Desmond says in a promotional video. “For folks that have been evicted, I want them to recognize that they’re not alone, that their story is part of a larger pattern happening all across America; and for those of us that have never thought about eviction, I want them to realize what it’s doing to our families and our children and our communities and how it’s leaving a deep and jagged scar on the next generation.”

According to the National Low Income Housing Coalition, “no state has an adequate supply of affordable rental housing for the lowest income renters.” Eviction isn’t just directly causing homelessness, either. “Housing instability threatens all aspects of family life: health, jobs, school, and personal relationships,” the Building Museum’s website explains. And it makes it that much harder to rent in the future, since landlords are wary of past eviction records.

Unsurprisingly, eviction can also lead to mental health issues, like depression and stress, as sociologists at Rice University and Harvard University found in a 2015 study, the first to examine the effects of eviction from nationwide data.

On his end, Desmond is hard at work continuing to study evictions, now with a project called Eviction Lab, which, for the first time, is tracking formal evictions nationwide.

Desmond says the scope of the epidemic in America remains unclear. “[T]he estimates that we have are stunning, but they’re also too low,” he says in a recent interview with Fresh Air’s Terry Gross. There is no data on informal evictions, for instance, he says, like when a landlord pays a tenant to leave in order to rent out the apartment at a higher price, as has become common practice in places like New York City, or when a landlord threatens deportation.

As Desmond puts it in the video for the new exhibition, evictions are part of a larger American problem: “If you care about high healthcare costs, racial inequality, children's futures, fiscal responsibility, whatever your issue is, the lack of affordable housing sits at the root of that issue."

This Eyelash of a Moment

Smithsonian Center for Folklife and Cultural Heritage
Sonya Renee on the Red Hot stage.
Photo by Patricia Wakida

Standing tall in gold stacked sandals at the mic is Sonya Renee, formerly from the Black AIDS Institute, the nation’s only HIV/AIDS think-tank focused exclusively on the HIV epidemic in Black communities. Even in 100-degree heat, she has the ability to fill a circus-sized open tent with an enormous presence, and to electrify the audience with her self-awareness—she is bald and her beauty is fierce.

Renee is one of five District of Columbia spoken word poets who do triple duty as HIV/AIDS educators, activists, and artists in the community, and they are on heavy rotation at the Creativity and Crisis Red Hot stage. The company she keeps at the Festival is equally inspiring: Mary Bowman, JT Bullock, Regie Cabico, and Dwayne Lawson-Brown, all artists who stand up for what they believe in and write from the breath.

The stories they channel teach us about grief and gratitude, and the remorse and relief that come directly from personal experience. Stories of mothers who crash too early, or of a former sex worker client whose redemption after his death comes in the form of a brag: for handing out free condoms and perhaps preventing just one more HIV/AIDS test from reading positive. “Art is a universal medium, and any shared language is a phenomenal tool for helping us live unapologetically,” said Renee. Like many of the poets performing, she reminded the audience of the fragility and value of each waking day, and the power of poetry for people to gather, for spoken word to be heard and passed on, even if only in this “eyelash of a moment.”

Mary Bowman, Dwayne Lawson-Brown, and Regie Cabico perform from the Red Hot stage of the Creativity and Crisis program.
Photo by Patricia Wakida

Pilipino artist Regie Cabico feels that spoken word is an unprecedented American art form, much like jazz. “It’s also political theater, where the poet performs three-minute scenes, but without music, without scenery, without props.” Reflecting on the presence of poets and creativity in response to HIV/AIDS, he said, “I was twelve during the crisis/epidemic, and as a result growing up, I didn’t have any mentors—they had all died. I am just realizing what I’ve been robbed of twenty-five years later as a forty-two-year-old.”

According to poet Dwayne Lawson-Brown, “We are the ‘folk’ in folklife. We document life, which becomes a community thing we are all sharing together. Spoken word is storytelling into the new generation.”

Catch the Spoken Word poets every day in Week 2 on the Red Hot stage, in the Creativity and Crisis program area.

For those who can’t get enough spoken word, check out the poetry slam at the Good Hope and Naylor Corner stage in the Citified program on July 6, 3:30 to 7:00 p.m.

Patricia Wakida is a writer and historian based in Boyle Heights, a neighborhood of Los Angeles, California.

This Galentine's Day blog post is for you. You poetic, noble land-mermaid.

National Museum of American History

On February 13, women everywhere (we hope!) will be gathering together to celebrate Galentine's Day. First introduced in 2010 by the character Leslie Knope on the TV show Parks and Recreation, Galentine's Day is about "ladies celebrating ladies," be they friends, co-workers, family members, or personal heroes. What began as a fictional holiday for women to honor other women has crossed over into real life as more women learn about and celebrate this happy day. In honor of Galentine's Day, we have chosen some of our favorite gal pals in our collections. Below are some of the women and girls who could have had their own Galentine's Day celebration.

The Monterey Gals

Black and white posed photo of women, each with their hair piled on top of their head and high-collared blouses.

Postcard with typed words and handwritten words. Green postage stamp.

Before telephones were common forms of communication, real photo postcards were the rapid messaging tool of the day. The postmark indicates that Elsie sent this card from Monterey, Virginia, at 9:00 a.m. on 23 May, 1907, to a Miss Jay (maybe a nickname?) Yager in Bartow, West Virginia, some 30 miles away. With mail services often delivering twice a day, you could send a quick note in the morning to invite a friend for a late-night horseback ride as Elsie did in May 1907, "am going horse back riding to night [sic], come and go along 'Moon-light' you know." Enticing!

Whether Elsie or Jay is depicted among the group of young women photographed in an unidentified photographer's studio, we don't know. It sounds as though Elsie may have been a store clerk at Dunlevie Drug Store in Dunlevie (now Thornwood), West Virginia: "Do you ever go to Dunlevie, anymore(?)/ Would love you to. Come in and see me." Maybe then Jay could have gotten the scoop from her gal pal, "Rec'd card it was a rich one." When this postcard was written, senders were not allowed to write on the back of the postcard; it was to be for the address only. Our rebellious sender, Elsie, continued her message there anyway. With their spunk, contemporary hair and clothes, and tight friendship, we would have liked being friends with these gals!

The Seven Sutherland Sisters

Black and white photo of seven women with exceedingly long hair.

Women of late-19th-century America flaunted hair as their "crowning glory," the ultimate marker of feminine beauty, luxurious vitality, and even moral health. For the seven Sutherland sisters of Cambria, New York, luscious locks were never in short supply. Together, their womanly manes measured 37 feet, a fact that helped them become national celebrities and entrepreneurs.

Glass bottle labeled "hair grower" beside packaging featuring black and white image of seven women with long hair.

Sarah, Victoria, Isabella, Grace, Naomi, Dora, and Mary toured the country first as a musical act and later with the Barnum and Bailey Circus. Eventually, they took their show to the drug store, offering demonstrations and consultations to admirers and selling their father Reverend Fletcher Sutherland's Seven Sutherland Sisters hair and beauty products. Their success allowed them to retire from the road and build a mansion where they lived together back in New York. By the 1920s, however, the fashion for bobbed hair cut their sales short.

The Navy Nurses of Base Hospital No. 5

Black and white group photo of women wearing uniform coats and hats.

In May 1917, just a month after the United States officially entered World War I, four women set sail from New York Harbor amidst a flurry of noisily cheering crowds. Beulah Armor, Faye Fulton, Halberta Grosh, and Bertha Hamer were nurses with the Navy Nurse Corps and, along with hundreds of soldiers and sailors, they were heading to France well before the troops of the American Expeditionary Force could be fully mobilized to follow them. As young nurses in Philadelphia at the time of the war, the women joined a group of fellow medical professionals from Philadelphia to establish Navy Base Hospital No. 5 in Brest, France.

The hospital began operations in December 1917 and was quickly inundated with patients, including new soldiers arriving from the United States, wounded soldiers returning from the war front, civilians injured in German submarine attacks, and victims of the 1918 flu epidemic. There's nothing like a war zone to forge lasting relationships, and no doubt the nurses of Base Hospital No. 5 relied on each other to get them through the long, grueling war.

Dark navy blue coat with "USR" on the collar and two vertical rows of buttons.

With the end of the war in November 1918, and the closing of the Base Hospital in March 1919, the four women returned to Philadelphia. Fulton, Grosh, and Hamer continued working as professional nurses, while Armor married a fellow member of Base Hospital No. 5: a cook named Elwood Basler, who was also briefly a patient. Clearly the relationships formed among the nurses in Brest remained strong over the years, as the women got together in 1970 to donate objects from their time in the war. These objects serve as a lasting reminder of brave women facing difficult situations with the support of their friends and colleagues.

Takayo Tsubouchi Fischer and Jayce Tsenami

Black and white photo of two smiling children standing outside, with a very plain building behind them.

Screen, stage, and voice actress Takayo Fischer was one of 120,000 citizens or residents of Japanese ancestry who were forcibly removed from their homes in Western states and incarcerated in camps during World War II. The youngest of five girls, she grew up with a rich tradition of valuing female friends. The communal nature of the camps—where inmates ate common meals in central mess halls and used bathrooms and showers without individual stalls—broke down the traditional Japanese family structure. As families lost the opportunity to share private family meals or carve out family time, friends like Jayce Tsenami took on an even greater importance. Both Takayo and Jayce were inmates at the Jerome camp in Arkansas, where the wooded swampland of the Mississippi delta brought with it mosquitoes, poisonous snakes, malaria, and dysentery. The friendship they forged in camp helped sustain them through the long and difficult incarceration.

Mary Hill and the Ladies of Maltaville

Quilt with white background and simple shapes in a variety of colors, including flowers, leaves, and other motifs.

In 1847, the women of the Presbyterian Church of Maltaville, New York, honored their friend Mary Hill by making her an album quilt. Album quilts, also called friendship quilts, are made up of appliquéd and embroidered blocks which are joined together to form a quilt. The blocks often contain inked inscriptions with special meaning to the maker, including names, dates, places, or poems.

For Mary Hill's quilt, the women of the church made, joined, lined, and quilted 61 blocks. Each block is signed in ink and features motifs such as birds, flowers, hearts, and stars. At the center of the quilt is a large block with a wreath of flowering vines surrounding the inscription, "Presented to Mrs. Mary B. Hill as an expression of esteem by the Ladies of Maltaville." The quilt was clearly treasured by Mary Hill and her family, as it remained with them for almost 100 years until it was donated to the museum by her granddaughter in 1930.

Gertrud Friedemann and Eva Morgenroth Lande

Black and white portrait of a young girl in a plaid coat. Her hair is piled on top of her head, perhaps under a hat. Her hands are in a muff or hand warmer.

In the 1930s Gertrud Bejach Friedemann and her husband, the bacteriologist Ulrich Friedemann, took refuge in Great Britain and then the United States to avoid the terrors of Nazi Germany. They brought with them Gertrud's two children by her first marriage, Eva and Anton Morgenroth. Among their belongings was a small paper puzzle, called Zauberspiel (Magic Game), which Gertrud had played with as a child in Berlin.

Green piece of paper with rectangular windows cut into its top layer. One window reveals the number 73.

Many green sheets of rigid paper with rectangular windows cut into them, some revealing numbers.

Gertrud passed the game on to her children and it remained a favorite of theirs in their new home in America. In 1988 Eva gave her Zauberspiel to the Smithsonian. The game suggests not only the enduring fascination of mathematical recreations and the rich culture of early 20th century Berlin, but also the power of a small object to tie a mother and daughter who took refuge in the United States to the past they left behind.

Care to try Zauberspiel? Visit our collections record to learn more. But be warned: according to Eva, when family friend Albert Einstein tried his hand at the game he spent days puzzling over it to no avail.

Patri O'Gan is a project assistant in the Division of Armed Forces History. Shannon Perich is a curator in the Division of Culture and the Arts. Mallory Warner is a curatorial assistant and Rachel Anderson is a research and project assistant in the Division of Medicine and Science. Lucy Harvey is a program assistant in the Division of Armed Forces History. Madelyn Shaw is a curator in the Division of Home and Community Life. Peggy Kidwell is a curator in the Division of Medicine and Science.

Author(s): 
Patri O’Gan, Shannon Perich, Mallory Warner, Rachel Anderson, Lucy Harvey, Madelyn Shaw, and Peggy Kidwell
Posted Date: 
Monday, February 13, 2017 - 08:00

This Mockumentary Explains the Appeal of Skateboarding to Scared Parents

Smithsonian Magazine

City streets without skateboards seem almost incomprehensible in the 21st century, but in the 1960s skateboards were a relatively new phenomenon that raised eyebrows among parents who saw them as the dangerous tools of a reckless generation. Though CityLab’s Mark Byrnes writes that things have improved for Montreal skateboarders, the sport wasn’t always familiar to adults, who worried about its safety, its noise, and its impact on urban spaces. So, in 1966, a Canadian filmmaker made a mockumentary about what he called The Devil’s Toy, a look at the ultimate weapon in the battle between kids and adults.

The film was made by Claude Jutra, a director known for his award-winning films for the National Film Board of Canada. A Quebecois nationalist, he was a pioneer of what became known as “direct cinema”—documentary films that captured events in real-time without voiceovers, staging, or directorial meddling.

The Devil’s Toy is a notable exception to his low-key documentary style. Instead of just capturing skateboarding life among the kids of 1960s Montreal, it co-opts the scaremongering tone of parents and authorities concerned about the growing fad of skateboarding. “It was like a plague,” says the documentary’s doom-and-gloom narrator, who tracks the spread of the “epidemic from which no one was secure.”

Skating bans were the real epidemic of the mid-1960s: The Guardian’s Iain Borden writes that by 1965, numerous U.S. cities had outlawed the boards. A 1965 piece aired on the CBC’s Across Canada says that “the police are directing an organized campaign to stamp out these menaces.” The news piece, which focuses on “skurfing” (Canadian lingo for sidewalk surfing), is pretty tongue-in-cheek, too. Not every adult seemed to be convinced that skateboards were “the devil’s toy.”

This Painting Shows What It Might Look Like When Zika Infects a Cell

Smithsonian Magazine

Zika virus exploded onto the global stage last year when health officials began to suspect it could cause birth defects in babies. As with the Ebola epidemic in 2014, fear burgeoned quickly. The destruction wrought by the disease is profoundly unsettling, in part because the particles of contagion are invisible.

To make something visible is to get a better handle on it, to make it more manageable. In March of this year, Michael Rossmann of Purdue University in Indiana and his colleagues mapped what Meghan Rosen for Science News described as the "bumpy, golf ball-shaped structure" of Zika. With the structure deduced, scientists now have a starting point to learn how the virus works and whether it can be stopped. Researchers will look for points in the structure that might offer up a target for a drug.

In that vein, but with a more artistic twist, another scientist has painted an image of what it might look like when Zika infects a cell.

David S. Goodsell's watercolor depicts an area about 110 nanometers wide, reports Maggie Zackowitz for NPR. That's almost 1,000 times smaller than the width of a typical human hair. In the painting, a pink sphere representing the virus has been sliced in half to reveal tangles of the viral genetic material. Fleshy protuberances on the virus's surface grasp green towers embedded in a light green curve that seems to enclose a jumble of blue. The surface proteins of the virus are binding to receptors on the surface of a cell it will soon infect.
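As a quick back-of-the-envelope check of that scale comparison, here is a minimal sketch; the roughly 100-micrometer hair width is an assumed typical value, since real hairs vary from about 50 to 120 micrometers.

```python
# Back-of-the-envelope scale check for the painting's 110-nanometer scene.
# The ~100-micrometer hair width is an assumed typical value.
painting_width_nm = 110            # width of the painted scene, in nanometers
hair_width_nm = 100 * 1_000        # ~100 micrometers, in nanometers

ratio = hair_width_nm / painting_width_nm
print(f"A hair is about {ratio:.0f} times wider than the scene")  # ~909
```

At roughly 900-fold, "almost 1,000 times smaller" checks out.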

Deadly viruses never looked so beautiful as they do under Goodsell's brush. The molecular biologist, who holds joint appointments at the Scripps Research Institute in La Jolla, California, and Rutgers University in New Jersey, paints brightly colored and squishy-looking shapes resembling jellybeans, footballs and spaghetti that crowd and jumble together. As abstract images they are delightful, but Goodsell's work is also firmly footed in science.

The scientist-artist makes some educated guesses for his paintings. "Some of the objects and interactions are very well studied and others are not," he explains. "The science is still a growing field." But his expertise lets him wield the paintbrush with confidence.

Visualizing the microscopic biological world first intrigued Goodsell in graduate school, when he relied on techniques such as x-ray crystallography to deduce the folds, twists and contortions of proteins and nucleic acids.

Structure is key to giving molecules in cells their function, whether they are enzymes that cleave other molecules, RNA strands that instruct protein building or the fibers that support and shape tissues. Pockets in proteins offer up spots where other molecules can bind and catalyze or prevent reactions. When Rosalind Franklin succeeded in capturing the first picture of DNA using x-ray crystallography, James Watson and Francis Crick were quickly able to deduce how unzipping the double helix could provide a template for replication of genetic material.

"If you are standing outside an automobile and the hood is closed so you can't see the engine, you have no idea how the machine works," says Stephen K. Burley, a researcher who studies proteomics at Rutgers University. Cells themselves are tiny, complex machines, and understanding how they work or what parts and processes go awry under the influence of disease, requires a look under the hood.

That's why Goodsell needed to understand how molecules were shaped as well as how they fit together inside the cell.

Computer graphics were just breaking into the research lab scene in the mid-1980s and giving scientists like Goodsell, now 55, an unprecedented look at the molecules they studied. But even the best programs struggled to show all the intricacies of a single molecule. "Objects the size of a protein were a real challenge," he says. Visualizing multiple proteins and their place relative to cellular structures was beyond the hardware and software capabilities at the time.

"I said to myself: What would it look like if we could blow up a portion of the cell and see the molecules?" Goodsell says. Without the high-powered computer graphic capabilities of today, he turned, quite literally, to the drawing board to piece together all the bits of knowledge about structure he could and create that image of the crowded interior of a cell. His goal was "to get back to looking at the big picture of science," he says.

The images he creates are meant to be scientific illustrations, inspiring researchers and the general public to think about the structures that underlie chemical reactions and cells' functions.

Typically, Goodsell spends a few hours digging through scientific literature to learn everything researchers know about the topic he wants to illustrate. Then, he draws up a big pencil sketch based on what he has learned. Carbon paper helps him transfer that sketch to watercolor paper. The molecules inside cells are often smaller than the wavelength of light, so a true view of a molecular landscape would be colorless, but Goodsell adds color and shading to help people interpret his paintings. The result is detailed views of molecular machinery at work.

In an Ebola painting, for example, the virus looks like a huge worm rearing its head. The virus has stolen the components of a cell membrane from an infected cell, depicted in light purple, Goodsell writes for the online resource, the RCSB's Protein Data Bank (PDB). Turquoise broccoli-heads stuccoing the outside of that membrane are glycoproteins, which can latch on to the surface of a host cell and pull the viral particle close enough that its genetic material (in yellow, protected by the green nucleoprotein) can be shoved inside. Those glycoproteins have been a major target for drugs to combat the virus.

The painting won this year's Wellcome Image Awards, a competition that draws experts in scientific illustration and visualization from around the world.

The Ebola painting and many other images by Goodsell live at the PDB, under the supervision of Burley, the repository's director. The PDB holds more than 119,000 structures of proteins, RNA, DNA and other molecules. A few statistics demonstrate how important structure is for biologists: There are about 1.5 million downloads of detailed 3D structural information from the data bank every day. In the last four years, people from 191 of the 194 recognized independent states in the world have accessed the resource.

In July, Goodsell will post his 200th "Molecule of the Month," a series featuring his depictions of proteins and other molecules along with a written explanation of the structures' function and importance.

Goodsell's work helps to educate high school students and others about the structures behind disease-causing particles and health conditions in the news. For the so-called PDB-101 series, his molecules help students better understand the mechanisms behind type 2 diabetes or lead poisoning. He has an upcoming large-scale painting that will cover the life cycle of the HIV virus.

Even the experts can learn from Goodsell's illustrations. Early on, he recalls going around the institute to ask his colleagues how crowded they thought a cell was. The estimates he got back were far too dilute. Only when he pulled back to look at the big picture did it become obvious that cells are very dense and complex.

"I'm not aware of many other people operating the way [Goodsell] does," says Burley. Goodsell's work unites artistic interpretation and scientific knowledge. "He is able to tell more of the story of the 3D structure by hand than you can with computer graphics. That, I think, is the real beauty of his work."

Goodsell's work can be seen at the RCSB Protein Data Bank's "Molecule of the Month" series and on his website. His website also provides more detail about some of the images in this article.

This Pandemic Isn't the First Time the Hajj Has Been Disrupted for Muslims

Smithsonian Magazine

Saudi Arabia has urged Muslims to delay their plans for the hajj, amid speculation that the obligatory pilgrimage may be canceled this year due to the coronavirus.

Earlier this year, Saudi authorities halted travel to holy sites as part of the umrah, the “lesser pilgrimage” that takes place throughout the year.

Canceling the hajj, however, would mean a massive economic hit for the country and for many businesses globally, such as the hajj travel industry. Millions of Muslims visit the Saudi kingdom each year, and the pilgrimage has not been canceled since the founding of the kingdom in 1932.

But as a scholar of global Islam, I have encountered many instances in the more than 1,400-year history of the pilgrimage when its planning had to be altered due to armed conflicts, disease or just plain politics. Here are just a few.

Armed conflicts

One of the earliest significant interruptions of the hajj took place in A.D. 930, when a sect of Ismailis, a minority Shiite community, known as the Qarmatians raided Mecca because they believed the hajj to be a pagan ritual.

The Qarmatians were said to have killed scores of pilgrims and absconded with the black stone of the Kaaba—which Muslims believe was sent down from heaven. They took the stone to their stronghold in modern-day Bahrain.

The hajj was suspended until the Abbasids, a dynasty that ruled a vast empire stretching from North Africa across the Middle East to modern-day India from A.D. 750 to 1258, paid a ransom for the stone's return more than 20 years later.

Political disputes

Political disagreements and conflict have often meant that pilgrims from certain places were kept from performing hajj because of lack of protection along overland routes into the Hijaz, the region in the west of Saudi Arabia where both Mecca and Medina are located.

In A.D. 983, the rulers of Baghdad and Egypt were at war. The Fatimid rulers of Egypt claimed to be the true leaders of Islam and opposed the rule of the Abbasid dynasty in Iraq and Syria.

Their political tug-of-war kept various pilgrims from Mecca and Medina for eight years, until A.D. 991.

Then, during the fall of the Fatimids in A.D. 1168, Egyptians could not enter the Hijaz. It is also said that no one from Baghdad performed hajj for years after the city fell to Mongol invasion in A.D. 1258.

Many years later, Napoleon's military incursions, aimed at checking British colonial influence in the region, prevented many pilgrims from performing the hajj between A.D. 1798 and 1801.

Diseases and hajj

As in the present day, diseases and other natural calamities have also stood in the way of the pilgrimage.

There are reports that the first time an epidemic of any kind caused the hajj to be canceled was an outbreak of plague in A.D. 967. Drought and famine caused the Fatimid ruler to cancel overland hajj routes in A.D. 1048.

Cholera outbreaks in multiple years throughout the 19th century claimed thousands of pilgrims’ lives during the hajj. One cholera outbreak in the holy cities of Mecca and Medina in 1858 forced thousands of Egyptians to flee to Egypt’s Red Sea border, where they were quarantined before being allowed back in.

Indeed, for much of the 19th century and the beginning of the 20th century, cholera remained a “perennial threat” and caused frequent disruption to the annual hajj.

So did the plague. An outbreak of bubonic plague in India in 1831 claimed thousands of pilgrims’ lives on their way to perform hajj.

In fact, with so many outbreaks in such quick succession, the hajj was frequently interrupted throughout the mid-19th century.

Recent years

In more recent years, too, the pilgrimage has been disrupted for many similar reasons.

In 2012 and 2013 Saudi authorities encouraged the ill and the elderly not to undertake the pilgrimage amid concerns over Middle East Respiratory Syndrome, or MERS.

Contemporary geopolitics and human rights issues have also played a role in who was able to perform the pilgrimage.

In 2017, the 1.8 million Muslim citizens of Qatar were not able to perform the hajj following the decision by Saudi Arabia and three other Arab nations to sever diplomatic ties with the country over differences of opinion on various geopolitical issues.

The same year, some Shiite governments such as Iran leveled charges alleging that Shiites were not allowed to perform the pilgrimage by Sunni Saudi authorities.

In other cases, faithful Muslims have called for boycotts, citing Saudi Arabia’s human rights record.

While a decision to cancel the hajj will surely disappoint Muslims looking to perform the pilgrimage, many among them have been sharing online a relevant hadith—a tradition reporting the sayings and practice of the prophet Muhammad—that provides guidance about traveling during a time of an epidemic: “If you hear of an outbreak of plague in a land, do not enter it; but if the plague breaks out in a place while you are in it, do not leave that place.”

This Slo-Mo Sneeze Video Shows Just How Far Spray Clouds Can Spread

Smithsonian Magazine

A lot of things can cause a sneeze—from sickness to sex. But sneezing can be pretty gross. Sneezes eject particles of mucus and saliva, some contaminated with viruses and bacteria, at ten miles per hour, creating a giant cloud of potentially infectious mist. There's still much left to learn about how exactly that disgusting cloud moves. Most advice for avoiding sneeze clouds is largely educated guesswork.

Mathematical physicist Lydia Bourouiba, head of the Fluid Dynamics of Disease Transmission Laboratory at MIT, has spent her academic career sussing out the secrets of the sneeze, reports Rae Ellen Bichell at NPR. Her most recent contribution to schnoz science is a slow-motion video of sneezing, which she published in the New England Journal of Medicine.

The high contrast black and white video shows just how large a sneeze cloud can be. Understanding exactly where and how far vaporized mucus travels is important. “Respiratory infectious diseases still remain the leading infectious diseases in the world,” Bourouiba tells Bichell. “It's actually quite amazing that we can produce such a high-speed flow that contains all these ranges of sizes of droplets.”

Bourouiba’s analysis shows that standing just a few feet away from a sick patient doesn't put you outside the firing zone. Tiny droplets can hover in a room for several minutes and zip across an entire room in mere seconds.
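The "mere seconds" claim is easy to sanity-check against the ten-mile-per-hour ejection speed quoted above; in the rough sketch below, the eight-meter room is a hypothetical example, and real droplets decelerate as the cloud spreads.

```python
# Rough arithmetic behind "zip across an entire room in mere seconds."
# The 10 mph ejection speed comes from the article; the room length is
# a hypothetical example, and droplets slow as the cloud disperses.
MPH_TO_MS = 0.44704                     # meters per second per mph

ejection_speed = 10 * MPH_TO_MS         # ~4.5 m/s at the mouth and nose
room_length_m = 8.0                     # assumed room size

print(f"{room_length_m / ejection_speed:.1f} s to cross the room")  # ~1.8 s
```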

In an earlier study, based on a different set of sneeze videos, Bourouiba found that the droplets are not uniform, contradicting previous guesses about sneeze spew. Instead, as the droplets exit the mouth and nose, complicated physics take hold. A combination of the sneeze's force and turbulence produces a range of particle sizes, from fine lingering mists to larger spray drops. And even tiny drops, Bourouiba found, can harbor disease-causing agents.

Bourouiba says mapping the sneeze cloud could help hospitals and places facing epidemics figure out how to squelch the spread of diseases. Air temperature, humidity, room layout and ventilation could all be tweaked to reduce person-to-person transmission. For example, when someone sneezes on a plane, the airflow patterns actually facilitate the spread of spray to nearby passengers. But not everyone is just sitting by with a cringe. Raymond Wang won the 2015 Intel Science and Engineering Fair for his innovative airflow foils for plane interiors, which help prevent the spread of germs in the enclosed space.

“This is a major blind spot when designing public health control and prevention policies, particularly when urgent measures are needed during epidemics or pandemics,” Bourouiba says in a press release. “Our long term goal is to change that.”

This Startup Is Harvesting Wild Algae to Make Your Next Pair of Sneakers

Smithsonian Magazine

Rob Falken is an inventor with a mission: to put the planet first in everything he does. He grew up in Southern California and calls surfing “the lifeblood of my youth.” The sport, he says, made him sensitive to the natural world.

Falken began inventing surf-related products when he was 17, making a surfboard wax at his mother’s kitchen table. Since then, the material designer has developed products, including skateboards made from reclaimed wood, a biodegradable surfboard with a foam base derived from sugar cane plants, and a buoyant foam used in lightweight life vests used by tow-in, big-wave surfers.

In 2014, Falken found himself wanting to focus on the kind of flexible foam you typically associate with yoga mats and the soles of running shoes. Only he wanted to do it in a sustainable way, putting an abundant form of refuse to good use.

Toxic blue-green algae, which is also known as cyanobacteria, has reached epidemic levels in recent years, due to rising global temperatures, as well as runoff and waterway contamination from human processes such as large-scale agriculture and sanitation. In the oceans, large-scale algal blooms are often dubbed “red tides” due to the rusty hue of the algae. They impact everything from the health of marine mammals, such as manatees, to the business of commercial fishers and seaside resorts where guests expect pristine beaches and clear water.

With algal blooms, a toxin called domoic acid accumulates in shellfish and marine fish stocks, such as anchovies and sardines. When those fish are consumed by other marine life, domoic acid causes a devastating domino effect across the food web. For months last winter, blooms halted California’s Dungeness and rock crab season, costing crabbers an estimated $48 million. The same season, Washington State’s Department of Fish and Wildlife curtailed razor clamming due to elevated levels of domoic acid present in the bivalves.

After a month of experimenting, Falken found a way to make algae his primary ingredient. He quickly co-founded Bloom, a company that now manufactures the foam product.

The mobile harvester collects the explosive plant life that clogs waterways and saps water of oxygen that aquatic life so desperately needs. (Bloom)

Bloom’s mobile harvester collects algae biomass from waste streams in the United States and Asia, harvesting the explosive plant life that clogs waterways and saps water of oxygen that aquatic life so desperately needs. After converting the harvested algae into a polymer, Bloom can produce all sorts of foam-based products, from sneaker soles and car seat upholstery to surfboard traction pads. The algae foam traction pad is Bloom’s first commercial product, made by surfer Kelly Slater’s design firm.

Falken, now Bloom’s managing director, spoke with Smithsonian.com about his algae-harvesting solution.

How did the idea for Bloom come about?

I got interested in this because I have a background developing materials with an environmental focus. I found out that for algae blooms, the past three years have been the worst ever, compounding on one another. I'm not exaggerating when I say this is a crisis. Over the July 4th weekend, Florida lost millions in tourism dollars.

Plus, there are areas where the oxygen levels in the water are so depleted that there are manta ray mass die-offs. Manatees are dying from eating contaminated algae. These algae blooms are also bad for human health, impacting entire water systems that drain to the ocean from inland areas where there is massive cattle farming and sugarcane plant runoff.

So to circle back, in early 2014, I set out on a path to foam algae. Algae has been talked about a lot for biofuels, but to make biofuels, you need to genetically engineer enough materials, or basically grow it all in a lab. I tried to work with a bioplastics company already doing something similar to what I wanted to achieve, but the company, Cereplast, unfortunately went bankrupt before we could get our project off the ground.

I found another partner in the company Algix, which had simple but remarkable mobile algae harvesting systems that were successfully deployed to catfish farms throughout the South. Some of the harvesting systems were either underutilized or mothballed. I told them I had an idea to foam their material, and initially, they said it wouldn't work. But they sent me their materials to my specs, and after tinkering for just 30 days, I had a foam product. A few months later, Algix and my company Effekt joined to form Bloom.

How do you make your foam? How does your algae harvester work?

In general, we work with any type of blue-green algae. Blue-green algae is a polymer, so we basically vacuum it off a lake and dry it using our continuous solar drying process. Solar drying produces a charcoal-like flake, which we pulverize into powder. Once we have a pure powder—ours has no toxins—we make it into what is essentially a pellet, which we injection-mold into a panel and make a fiber out of it. We can dry anything with 40-plus percent protein content because that protein makes the plastic.

Bloom dries the algae, pulverizes it into a powder, and then turns the powder into pellets. (Bloom)

We really focus on the plastic side. Plastic is a chain of amino acids, which is the definition of a protein. We have a perfect solution that requires no arable land, no pesticides to grow, and a never-ending feedstock. We’re for profit, but we’re trying to make better solutions that put the planet first.

Can Bloom harvest from any body of water or just freshwater lakes and ponds?

We focus on polymerizing, and if we're working with saltwater algae, salt creates a challenge when converting to polymer. That said, our system doesn't care if it's working in salt or fresh water, or with algae as thick as cake. Algae is the largest carbon dioxide sink on the planet; we can use it all. We can roll our mobile harvesters up to brackish estuaries or pontoon them onto the ocean. As Algix found at those catfish farms, we can get into delicate habitats with no harm to the environment.

How is Bloom different from other solutions trying to combat toxic algae blooms?

There is no other solution—not like ours. One of the only things you can do in an ocean or lake is dump in copper sulfate and kill everything.

Our harvester uses a giant vacuum with a screen, which prevents fish and aquatic life from getting sucked up. Then 99 percent of filtered water goes back, and we’re left with blue-green algae we can dry and use to make foam.

Bloom's first commercial product is an algae foam traction pad for a surfboard, made by surfer Kelly Slater's design firm. (Bloom)

What can you make with your foam?

Surfers can use our foam for traction, and that’s how we ended up with our first major product, a four-piece flat pad for surfers to get better grip on their boards.

Do you have any patents for Bloom?

We have a patent on processing the polymers from algae. We also have 12 more patents in process, including some focusing on anti-microbial uses for our foam.

What’s next for Bloom?

Algix and Bloom’s mantra is to do the least amount of harm. We have really amazing technology and it is infinitely scalable because there is endless algae.

We project our foam will be in over two million pairs of shoes by early 2017 and 100 million pairs of shoes by 2018.

We think the consumer product category is easy because people like something they can buy; we’re making physical products people can connect with and use in their everyday lives. We can’t convert everybody to care about eco-friendly materials or products, but our material works.

This Was the First Major News Article on HIV/AIDS

Smithsonian Magazine

Thirty-six years ago, the terms “HIV” and “AIDS” hadn’t yet been coined. But what would later be known as the HIV virus was already at work in the bodies of men in New York and California, perplexing doctors who had no idea why their patients were dying. Then, in July 1981, the United States got its first look at the mysterious illness in the first major news story to cover the emerging disease. Decades later, it’s a fascinating glimpse into the early days of the AIDS epidemic.

Entitled “Rare Cancer Seen in 41 Homosexuals,” the article was penned by Lawrence K. Altman and appeared in the New York Times. At the time, gay men were dying of an unusual disease. They presented with purple spots on the skin, and their lymph nodes eventually became swollen before they died. It seemed to be cancer—but the symptoms matched a type usually only seen in very old people. The people who were dying at the time, however, were young and otherwise healthy. Doctors did not understand what was happening or whether the cancer was contagious.

Doctors later learned that this particular type of cancer, Kaposi’s Sarcoma, is an “AIDS-defining condition” that marks the transition of the HIV virus into its late stages. A month before the article was published in The New York Times, the Centers for Disease Control and Prevention had reported another set of strange symptoms— Pneumocystis carinii pneumonia that, like the cancer, was occurring in seemingly healthy gay men. But it was unclear if the conditions were linked or why they were happening.

“In hindsight, of course,” wrote Altman in 2011, “these announcements were the first official harbingers of AIDS…But at the time, we had little idea what we were dealing with.”

This led to confusion and, sometimes, panic as scientists tried to figure out what was going on. As Harold W. Jaffe writes in a commentary paper published in Nature Immunology, it was unclear at first whether the disease was new. Rumors began to spread of a "gay cancer"—despite the occurrence of new cases in people who had received blood transfusions, straight women and infants. There was little reliable information about what was going on within the gay community, Harold Levine, a New Yorker who lived through these early days of the epidemic, tells New York Magazine’s Tim Murphy. Levine says he heard about a case of “gay cancer” from friends. "It was a few months before I heard about a second case, then the floodgates opened and it was all we could talk about," he says.

Even after HIV was identified as the cause of AIDS in 1984, stigma about homosexuality and intravenous drug use colored the public’s perception of the disease. Many gay people hid their health struggles, and it took years for President Ronald Reagan to publicly acknowledge HIV/AIDS. Meanwhile, as Smithsonian.com reports, the false identification of flight attendant Gaétan Dugas as “patient zero” spread the rumor that he was responsible for bringing the disease to the United States. But last year, decades after his death from HIV/AIDS, genetic research cleared him of these claims.

Today, the concept of “gay cancer” has been replaced with extensive knowledge about HIV/AIDS, which is not limited to homosexual men and is no longer a death sentence for many patients. According to the World Health Organization, over 35 million people have died of HIV/AIDS thus far, and as of the end of 2015, there were nearly 37 million people living with HIV.

There’s no cure—yet. And stigma is still considered a major roadblock to getting effective treatment to people at risk of or infected with HIV. The first glimpse of the infection’s deadly consequences is a poignant document of how confusing the epidemic was during its early days—and a reminder of just how far we’ve come.

Thomas Duncan, Dallas' Ebola Patient, Has Died

Smithsonian Magazine

Thomas Eric Duncan, the first person diagnosed with Ebola in the United States, has died, the Associated Press is reporting. Duncan's death brings a sad end to a medical struggle that, just yesterday, seemed to be starting to turn around.

On September 20th, Duncan flew into the United States from Liberia, a country that has so far seen 931 laboratory-confirmed cases of the virus, and thousands of suspected cases. A week later, Duncan was admitted to the Texas Health Presbyterian Hospital in Dallas, where he was diagnosed with Ebola, the first time the virus had ever been diagnosed in the U.S., said the CDC in a press conference last week.

Before his death this morning, things had been looking up for Duncan: his temperature and blood pressure had returned to normal levels, and the diarrhea associated with Ebola had waned, says the New York Times. A few days ago, Duncan had been put on an experimental broad-spectrum antiviral medication, says Time.

Before he was admitted to the hospital but while he was contagious, Duncan had been in contact with 48 people, says the Times. Those people are now under surveillance by the CDC. Duncan's family members are under mandated isolation, though as journalist David Dobbs noted last week, the family does not appear to be receiving the kind of support from health care officials that they might need.

Though Ebola is still overwhelmingly an issue for West African nations, it does appear to be spreading within Western countries. In Spain, says the BBC, investigations are ongoing to figure out how a Spanish nurse contracted the disease—the first time the disease has ever been seen to spread outside of Africa.

If Ebola continues to spread—though the chance of a widespread outbreak of the disease in the West remains incredibly low—the total impact of the disease could push as high as $32.6 billion by next year, according to a new estimate by the World Bank. As CTV reports:

It is far from certain that the epidemic will be contained by the end of the year, so the report estimated the economic costs of two scenarios as the battle against the disease continues. The report estimated that the economic impact could top $9 billion if the disease is rapidly contained in the three most severely affected countries, but could reach $32.6 billion if it takes a long time to contain Ebola in the three countries and it spreads to neighboring nations.

Through Positive Eyes

Smithsonian Center for Folklife and Cultural Heritage

The photograph depicts a scene that could easily be ignored as mundane or ordinary. Through the image, we are invited to look out of a window onto a set of swings that are gently moving, although there is no indication of anyone who might have set them into motion.

The photographer behind this remarkable artwork is one of twelve HIV-positive participants living in the District of Columbia, united for a photo documentary project called Through Positive Eyes. Elisabeth Nails, co-producer of the project, explained, “The project participant saw beauty in this, captured it, and showed us it was beautiful. It’s an angelic moment…it’s as if the swings are made of lace.”

Through Positive Eyes is based on the belief that challenging stigma against people living with HIV/AIDS is the most effective method for combating the epidemic—and that art is a powerful way to do this. The project is in the process of creating an international album of personal photo essays created by people living with HIV/AIDS, drawn from ten-day workshops in major cities around the globe. The participants are chosen through collaborations with local organizations and, once selected, undergo intensive training designed for those with no previous experience with photography. Armed with new skills and techniques—as well as small, high-resolution digital cameras—they are then set loose to document their lives in any way they choose.

The most recent workshop is taking place right now at the Hirshhorn Museum ArtLab+, with exhibition and program components presented as part of the Creativity and Crisis program. Based on recent statistics from the Department of Health and Human Services, 1.2 million people in the U.S. are living with HIV/AIDS, and 1 in 5 of them are unaware they are infected. Additionally, the southern states of the U.S. have some of the highest percentages of people who are HIV positive; sometimes as high as 14 to 18 percent in certain populations. “Groups that were not historically affected in early years are getting it,” says Elisabeth Nails. “It’s across the board, affecting every group, class, and culture.” In fact, a growing number of Americans being diagnosed as HIV positive today are heterosexual African American and Latina women in the fifteen-to-twenty-five age bracket.

It also takes a tremendous amount of courage to allow the world to see life through the eyes of those infected with the HIV virus. That courage and vulnerability, emotions that all humans experience, are what tether the participants’ photographs to their audiences. “Artists have this gift of being storytellers,” says Nails. “They provide us with a human level of what it could be like to live with this virus.”

Learn more about Through Positive Eyes this weekend in the Creativity and Crisis program area:

  • An exhibition of photographs from the South African Through Positive Eyes workshops is on display on the program site near the Healing Arts tent.
  • Photographs from the D.C. workshops, which have been taking place simultaneously with the Folklife Festival, will be displayed in the Creativity and Crisis program area all day on Saturday, July 7, 2012. Each participant has selected the two photographs that best represent them and is asking the public to help choose a final “signature” image. Come out to the Festival site and vote for your favorites.
  • Self Expressions through Body Maps and Photography at the Giving Voice stage, 4:15 to 5:00 pm on Saturday, July 7, 2012.
  • A final culminating program for the workshop, with all of the Through Positive Eyes—Washington, D.C. staff and participants, will be presented in the Hirshhorn Museum’s Ring Auditorium, 4 to 5:30 p.m. on Sunday, July 8, 2012. FREE and open to the public.

Patricia Wakida is a writer and historian based in Boyle Heights, a neighborhood of Los Angeles, California.

To Fight Deadly Dengue Fever in Humans, Create Dengue-Resistant Mosquitoes

Smithsonian Magazine

There’s a reason this tropical disease is known as “breakbone fever”: To its victims, that's how it feels. Dengue fever can cause such severe muscle and joint pain that it can be excruciating for an infected person to even move. It can also cause burning fever, delirium, internal bleeding and even death as the body attempts to fight off the disease. There is no effective treatment, and there won’t be one anytime soon.

Nevertheless, new research has identified a hope for stemming the epidemic—and it lies in genetic engineering.

Dengue virus, which is passed on by the same Aedes aegypti mosquito that spreads Zika, has been plaguing humans since at least the late 1700s. But in the past few decades, skyrocketing human population and increased urbanization—particularly in warm, moist regions like South America, Southeast Asia and West Africa—have fueled a growing number of cases. Like the Zika virus, dengue causes no symptoms in the majority of those who contract it (roughly three-quarters). But nearly 100 million people annually do develop at least some of its dangerous and excruciating symptoms—and roughly 20,000 of those die each year.

Even if you do survive dengue fever, you aren’t out of the woods yet. In fact, overcoming the disease once actually makes you more likely to die if you contract a different strain later. That’s because the various types of the virus appear so similar on the surface that the immune system will often respond using the same antibodies it developed to fight the last bout. But these are ineffective against the new strain. Moreover, the immune system’s efforts to fight the virus can attack the body instead—causing hemorrhaging, seizures and even death.

So far, preventing the spread of dengue has mostly taken the form of old-fashioned mosquito warfare: nets, insecticide and draining still water, where mosquitoes like to breed. In 2015, researchers finally developed a partially effective dengue virus vaccine, which was green-lighted in three countries. But the vaccine only reduced chances of getting the virus by 60 percent in clinical trials, and because of the risk of developing antibodies, some experts think it may only be safe for people who have already survived an infection.

Today the vaccine is only being used in limited quantities in the Philippines. "There is really an urgent need for developing new methods for control," says George Dimopoulos, a Johns Hopkins University entomologist who studies mosquito-borne diseases like malaria and dengue.

Instead of focusing on how people get infected with dengue, Dimopoulos has turned his efforts to how mosquitoes themselves contract the virus. Usually, the virus makes its home in a mosquito after the insect bites an infected human; it rarely passes between mosquitoes. So theoretically, by figuring out how to block that infection from ever occurring, you could effectively eliminate dengue virus, Dimopoulos says.

In a study published today in the journal PLOS Neglected Tropical Diseases, lead author Dimopoulos explained how that would work. Using genetic engineering, he and his team manipulated two genes that help control the immune system of the Aedes aegypti mosquito, which most commonly spreads dengue. The manipulated genes caused the mosquitoes' immune systems to become more active when the bugs fed on blood, which is when they contract dengue virus. This stimulation made the mosquitoes significantly more resistant to the different types of dengue virus.

"This impressive body of work is an important step forward in understanding mosquito-[dengue virus] immunology," says University of Melbourne dengue researcher Lauren Carrington, who was not involved in the study.

However, Dimopoulos says this breakthrough is just the first step. While the mosquitoes in his study became roughly 85 percent more resistant to some types of dengue virus, other types were much less affected by the genetic engineering. Furthermore, the manipulation didn't seem to create any significant resistance to the related Zika and Chikungunya viruses that Aedes aegypti also spread.

Dimopoulos hopes to fine-tune the method to make it more effective. While genetic engineering comes laden with controversy, he points out that his technique doesn't introduce any foreign genes into the mosquitoes; it simply manipulates the ones they already have. Eventually, he hopes to create mosquitoes that will be resistant to multiple tropical diseases. He also wants to take advantage of "gene drive" technology, which boosts the chance that a particular gene is passed to offspring, to allow the genetically modified mosquitoes to quickly become dominant in any environment they're released into.

This isn’t the first time researchers have played with mosquitoes’ genes in an attempt to halt the spread of disease. The British biotechnology company Oxitec has worked to modify the genome of the Aedes aegypti mosquitoes to make males that produce dead offspring after mating. Brazil has already partnered with the company to release billions of these mosquitoes into the country, in hopes of suppressing the population of disease-spreading mosquitoes. The company has also worked to get approval to release its mosquitoes in other places, including India, the Cayman Islands and the Florida Keys, where Zika fears drove voters to approve a trial in a ballot measure last year.

Oxitec's methods are effective in the short term, Dimopoulos says. But eliminating the mosquito population from an area will not make it mosquito-free permanently, because mosquitoes from other areas will eventually fill the empty niche left behind. Authorities will be forced to regularly release more genetically modified mosquitoes to keep their population numbers suppressed, Dimopoulos notes—a costly method that would appeal to biotech companies like Oxitec.

Replacing the wild mosquitoes with live but resistant mosquitoes, however, will act as a lasting barrier to spreading tropical diseases, Dimopoulos says. Before we get there, though, he says he wants to work on upping the resistance of the mosquitoes to dengue, as well as making them resistant to other types of tropical diseases. Then, he’ll need to do trials in greenhouses and on islands to see if the resistance works outside the lab.

He doesn't expect any widespread releases of mosquitoes for another decade, but points out that 10 years is a small wait overall. "It's not going to happen quickly," Dimopoulos says, "but we have to remember that these diseases have been with us for a very long time."

There's no humane way to test in the lab whether or not humans will contract dengue less often from these mosquitoes, Dimopoulos says. As a result, we'll only know for sure how effective the gene manipulation is once the mosquitoes have been released. But even if they don't work as well outside the lab, Dimopoulos has no regrets about blazing new trails to combat tropical illnesses.

"The fight against these diseases is like a war," Dimopoulos says. "You can't win it with one weapon."

Too Much Tech Could Be Causing Nearsightedness…But Not in the Way You Might Think

Smithsonian Magazine

Myopia, the blurry vision we know as nearsightedness, is reaching epidemic proportions—it could affect a third of the world’s population by decade’s end. But is the condition caused by the rise of computers and mobile devices that strain the world’s eyes? It turns out that tech can cause nearsightedness...but not in the way you might think.

Scientists are increasingly linking myopia with time spent indoors, reports Ellie Dolgin for Nature. She notes that scientists have long been on the hunt for the cause of myopia, which has been linked to higher education levels, genetics and book work over the years. But though researchers have been unable to find a link between specific computing or reading behaviors and myopia, says Dolgin, they did find a connection between eyesight and the amount of time spent indoors.

As we spend more time indoors consuming technology, it appears that our susceptibility to myopia rises. But Dolgin reports that there’s a way to protect your eyes from the condition:

Based on epidemiological studies, Ian Morgan, a myopia researcher at the Australian National University in Canberra, estimates that children need to spend around three hours per day under light levels of at least 10,000 lux to be protected against myopia. This is about the level experienced by someone under a shady tree, wearing sunglasses, on a bright summer day. (An overcast day can provide less than 10,000 lux and a well-lit office or classroom is usually no more than 500 lux.) Three or more hours of daily outdoor time is already the norm for children in Morgan's native Australia, where only around 30% of 17-year-olds are myopic. But in many parts of the world — including the United States, Europe and East Asia — children are often outside for only one or two hours.
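To make Morgan's estimate concrete, here is a minimal sketch of the arithmetic it implies, using the lux figures from the quote above; the daily schedule is invented for illustration.

```python
# Tallying "protective" outdoor time against Morgan's rough benchmark:
# ~3 hours per day at 10,000 lux or more. The light levels come from
# the quoted article; the sample schedule is a made-up example.
THRESHOLD_LUX = 10_000

def protective_hours(schedule):
    """Sum hours spent at or above the protective light level.

    schedule: list of (lux, hours) pairs describing a child's day.
    """
    return sum(hours for lux, hours in schedule if lux >= THRESHOLD_LUX)

day = [(500, 7.0),        # well-lit classroom
       (10_000, 1.5)]     # outdoors under a shady tree
print(protective_hours(day))   # 1.5 -- half of the ~3 h benchmark
```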

That insight could help put a stop to the growing tendency toward myopia. In the United States, myopia prevalence grew 66 percent between 1971 and 2004. And though the National Eye Institute estimates that 33 percent of Americans have myopia, the number is much higher in children—and in countries like China, nearsightedness rates are as high as 86 percent in some cities. Dolgin notes that it’s even worse in Seoul, where more than 96 percent of 19-year-old men have myopia.

Research on how light affects myopia is still ongoing, and there’s fierce debate about not just how to get kids outside, but how to supervise them once they’re there. And though it’s not clear how long it will take for science to focus the world’s vision, a new pair of glasses might help you focus on your work—these experimental eyeglasses use neurofeedback to get you back on task.

Tracking Down the Origins of Cystic Fibrosis in Ancient Europe

Smithsonian Magazine

Imagine the thrill of discovery when more than 10 years of research on the origin of a common genetic disease, cystic fibrosis (CF), results in tracing it to a group of distinct but mysterious Europeans who lived about 5,000 years ago.

CF is the most common, potentially lethal, inherited disease among Caucasians—about one in 40 carry the so-called F508del mutation. Typically only beneficial mutations, which provide a survival advantage, spread widely through a population.
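A little Hardy-Weinberg arithmetic shows how striking that carrier rate is. The sketch below assumes random mating and counts only F508del (real CF involves many CFTR mutations), so treat the numbers as illustrative.

```python
# Rough Hardy-Weinberg arithmetic for the "one in 40" carrier figure.
# Assumes random mating and considers only the F508del mutation.
carrier_freq = 1 / 40

# An affected child requires two carrier parents, each passing on the
# mutation with probability 1/2: (1/40)^2 * 1/4.
p_affected_birth = carrier_freq ** 2 * 0.25

print(f"~1 affected birth in {1 / p_affected_birth:,.0f}")  # ~6,400
```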

CF hinders the release of digestive enzymes from the pancreas, triggering malnutrition; it causes lung disease that is eventually fatal; and it produces levels of salt in sweat so high they can be life-threatening.

Depending on the mutation a patient carries, they may experience some or all symptoms of cystic fibrosis. (Blausen.com staff (2014), CC BY-SA)

In recent years, scientists have revealed many aspects of this deadly lung disease, leading to routine early diagnosis in screened babies, better treatments and longer lives. On the other hand, the scientific community hasn't been able to figure out when, where and why the mutation became so common. Collaborating with an extraordinary team of European scientists, including David Barton in Ireland and Milan Macek in the Czech Republic, and in particular a group of brilliant geneticists in Brest, France, led by Emmanuelle Génin and Claude Férec, we believe that we now know where and when the original mutation arose, and in which ancient tribe of people.

We share these findings in an article in the European Journal of Human Genetics which represents the culmination of 20 years’ work involving nine countries.

What is cystic fibrosis?

My quest to determine how CF arose and why it's so common began soon after scientists discovered the CFTR gene causing the disease in 1989. The most common disease-causing mutation of that gene was named F508del. Two copies of the mutation—one inherited from the mother and the other from the father—cause the lethal disease, but inheriting just a single copy causes no symptoms and makes the person a "carrier."

I had been employed at the University of Wisconsin since 1977 as a physician-scientist focusing on the early diagnosis of CF through newborn screening. Before the gene discovery, we identified babies at high risk for CF using a blood test that measured levels of a protein called immunoreactive trypsinogen (IRT). High levels of IRT suggested the baby had CF. When I learned of the gene discovery, I was convinced that it would be a game-changer for both screening test development and epidemiological research.

That’s because with the gene we could offer parents a more informative test. We could tell them not just whether their child had CF, but also whether they carried two copies of a CFTR mutation, which caused disease, or just one copy which made them a carrier.

Parents carrying one good copy of the CF gene (R) and one bad copy of the mutated CF gene (r) are called carriers. When both parents transmit a bad copy of the CF gene to their offspring, the child will suffer from cystic fibrosis. Children who inherit just one bad copy will be carriers like their parents and can transmit the gene to their children. (Cburnett, CC BY-SA)
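The odds in that caption follow from enumerating the four equally likely combinations of parental gene copies, as in this small sketch (R for the good copy, r for the mutated one, matching the caption's notation):

```python
# Enumerate the Rr x Rr cross described in the caption above.
from collections import Counter
from itertools import product

mother = ("R", "r")   # carrier: one good copy, one mutated copy
father = ("R", "r")

outcomes = Counter("".join(sorted(pair)) for pair in product(mother, father))
# 'RR' = unaffected, 'Rr' = carrier, 'rr' = cystic fibrosis
for genotype, count in sorted(outcomes.items()):
    print(f"{genotype}: {count}/4")
```

On average, half the children of two carriers will be carriers themselves, and one in four will have the disease.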

One might ask what the connection is between studying CF newborn screening and learning about the disease's origin. The answer lies in how our research team in Wisconsin transformed a biochemical screening test using the IRT marker into a two-tiered method called IRT/DNA.

Because about 90 percent of CF patients in the U.S. and Europe have at least one F508del mutation, we began analyzing newborn blood for its presence whenever the IRT level was high. But when this two-step IRT/DNA screening is done, not only are patients with the disease diagnosed, but tenfold more infants who are genetic carriers of the disease are identified as well.
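In outline, the two-tier decision logic reads something like the sketch below; the IRT cutoff and the result wording are hypothetical, since real screening programs tune cutoffs to their own populations and assays.

```python
# Hedged sketch of two-tiered IRT/DNA newborn screening, as described
# above: DNA analysis is performed only when the IRT level is high.
# The cutoff and the result labels are hypothetical placeholders.
IRT_CUTOFF = 60.0   # hypothetical threshold, ng/mL

def irt_dna_screen(irt_level, f508del_copies):
    """Return a screening result for one newborn blood sample."""
    if irt_level < IRT_CUTOFF:
        return "screen negative"                  # DNA tier not run
    if f508del_copies == 2:
        return "presumptive CF: refer for sweat test"
    if f508del_copies == 1:
        return "carrier identified: offer genetic counseling"
    return "high IRT, no F508del: follow up per protocol"

print(irt_dna_screen(85.0, 1))   # carrier identified: offer genetic counseling
```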

As preconception-, prenatal- and neonatal screening for CF have proliferated during the past two decades, the many thousands of individuals who discovered they were F508del carriers and their concerned parents often raised questions about the origin and significance of carrying this mutation themselves or in their children. Would they suffer with one copy? Was there a health benefit? It has been frustrating for a pediatrician specializing in CF to have no answer for them.

The challenge of finding origin of the CF mutation

I wanted to zero in on when this genetic mutation first started appearing. Pinpointing this period would allow us to understand how it could have evolved to provide a benefit—at least initially—to the people in Europe who had it. To expand my research, I decided to take a sabbatical and train in epidemiology while taking courses in 1993 at the London School of Hygiene and Tropical Medicine.

The timing was perfect because the field of ancient DNA research was starting to blossom. New breakthrough techniques like the Polymerase Chain Reaction made it possible to study the DNA of mummies and other human archaeological specimens from prehistoric burials. For example, early studies were performed on the DNA from the 5,000-year-old Tyrolean Iceman, which later became known as Ötzi.

A typical prehistoric burial in a crouched fetal position. (Philip Farrell, CC BY-SA)

I decided that we might be able to discover the origin of CF by analyzing the DNA in the teeth of Iron Age people buried between 700 and 100 B.C. in cemeteries throughout Europe.

Using this strategy, I teamed up with archaeologists and anthropologists such as Maria Teschler-Nicola at the Natural History Museum in Vienna, who provided access to 32 skeletons buried around 350 B.C. near Vienna. Geneticists in France collected DNA from the ancient molars and analyzed the DNA. To our surprise, we discovered the presence of the F508del mutation in DNA from three of 32 skeletons.

This discovery of F508del in Central European Iron Age burials radiocarbon-dated to 350 B.C. suggested to us that the original CF mutation may have arisen even earlier. But obtaining Bronze Age and Neolithic specimens for such direct studies proved difficult: fewer burials are available, skeletons are not as well preserved, and each cemetery represents merely a single tribe or village. So rather than depend on ancient DNA, we shifted our strategy to examining the genes of modern humans to figure out when the mutation first arose.

Why would a harmful mutation spread?

To find the origin of CF in modern patients, we knew we needed to learn more about the signature mutation—F508del—in people who are carriers or have the disease.

This tiny mutation deletes one amino acid from the protein's 1,480-amino-acid chain, changing the shape of a protein on the cell surface that moves chloride in and out of the cell. People who carry two mutated copies, one from the mother and one from the father, are plagued with thick, sticky mucus in their lungs, pancreas and other organs. The mucus in their lungs allows bacteria to thrive, destroying the tissue and eventually causing the lungs to fail. In the pancreas, the thick secretions prevent the gland from delivering the enzymes the body needs to digest food.
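For readers who like to see the mechanics, the sketch below is a toy illustration of what a single-residue deletion such as F508del does to a protein chain. The sequence here is invented; the real CFTR protein is 1,480 amino acids long, and F508del removes the phenylalanine (single-letter code F) at position 508:

```python
# Toy illustration only: the sequence below is invented. In real CFTR,
# a 1,480-residue chain, F508del deletes the phenylalanine at position 508.
protein = "MKTAYIAKQR" + "F" + "PLSSVQEASG"  # pretend this F sits at position 508
pos = 11                                     # 1-based position of that F here
mutant = protein[:pos - 1] + protein[pos:]   # splice out exactly one residue
print(len(protein), "->", len(mutant))       # 21 -> 20: one residue shorter
print(mutant)                                # MKTAYIAKQRPLSSVQEASG
```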

So why would such a harmful mutation continue to be transmitted from generation to generation?

The Natural History Museum in Vienna, Austria, houses a large collection of Iron Age and Bronze Age skeletons which are curated by Dr. Maria Teschler-Nicola. These collections were the source of teeth and bones for investigation of ancient DNA and studies on ‘The Ancient Origin of Cystic Fibrosis.’ (Philip Farrell, CC BY-ND)

A mutation as harmful as F508del could never have survived through people with two copies of the mutated CFTR gene, because they likely died soon after birth. On the other hand, those with one mutated copy may have had a survival advantage, as predicted by Darwin’s “survival of the fittest” theory.

Perhaps the best example of a mutation favoring survival under stressful environmental conditions can be found in Africa, where fatal malaria has been endemic for centuries. The parasite that causes malaria infects red blood cells, whose major constituent is the oxygen-carrying protein hemoglobin. Individuals who carry only normal hemoglobin genes are vulnerable to this mosquito-borne disease. But those who carry one copy of the mutated “hemoglobin S” gene are protected from severe malaria. However, two copies of the hemoglobin S gene cause sickle cell disease, which can be fatal.

Here there is a clear advantage to carrying one mutant gene; in fact, about one in 10 Africans carries a single copy. Thus, for many centuries an environmental factor has favored the survival of individuals carrying a single copy of the sickle hemoglobin mutation.
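As a back-of-the-envelope check on what that carrier frequency implies, here is a short Python calculation using the Hardy-Weinberg relationship. It is illustrative only, assuming random mating and taking the rounded one-in-10 carrier figure at face value:

```python
import math

# Back-of-the-envelope Hardy-Weinberg arithmetic (illustrative only).
carrier_freq = 0.10  # "about one in 10 Africans carries a single copy"

# Carriers are heterozygotes: 2*q*(1 - q) = carrier_freq.
# Rearranged: 2q^2 - 2q + carrier_freq = 0; take the smaller root.
q = (2 - math.sqrt(4 - 8 * carrier_freq)) / 4
print(f"allele frequency q ~ {q:.3f}")             # ~0.053
print(f"affected births q^2 ~ 1 in {1 / q**2:.0f}")  # ~1 in 359
```

Under those assumptions, roughly one birth in 360 would be affected by sickle cell disease, yet the protection carriers enjoy against malaria keeps the allele common.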

Individuals who carry two copies of the sickle cell gene suffer from sickle cell anemia, in which the blood cells become rigid sickle shapes and get stuck in the blood vessels, causing pain. Normal red blood cells are flexible discs that slide easily through vessels. (Designua/Shutterstock.com)

Similarly, we wondered whether there was a health benefit to carrying a single copy of this specific CF mutation during exposure to environmentally stressful conditions. Perhaps, we reasoned, that is why the F508del mutation became common among Caucasian Europeans and Europe-derived populations.

Clues from modern DNA

To figure out why a single mutated F508del gene kept being transmitted from generation to generation, we first had to determine when and where the mutation arose; only then could we look for the benefit it conferred.

We obtained DNA samples from 190 CF patients bearing F508del and from their parents, who lived in geographically distinct European populations from Ireland to Greece, plus a Germany-derived population in the U.S. We then identified a collection of genetic markers, essentially short sequences of DNA, within the CF gene and at flanking locations on the chromosome. By examining how these markers were shared among the chromosomes carrying F508del, we were able to estimate the age of the most recent common ancestor.

Next, using rigorous computer analyses, we estimated the age of the CF mutation in each population in the various countries.
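To give a flavor of how such marker data can date a mutation, here is a minimal Python sketch of one classic approach, a recombination clock: chromosomes carrying the mutation initially share the founder's marker alleles, and recombination erodes that sharing at a known rate each generation. This is not the actual analysis from our study, and the input numbers are invented for illustration:

```python
import math

# Minimal sketch of a "recombination clock" for dating a founder mutation.
# Not the study's actual method; all inputs below are hypothetical.

def allele_age_generations(p_disease: float, p_normal: float,
                           recomb_fraction: float) -> float:
    """Estimate generations since the founding mutation.

    p_disease: frequency of the ancestral marker allele on mutation-bearing
    chromosomes; p_normal: its frequency on normal chromosomes;
    recomb_fraction: recombination fraction between marker and mutation.
    """
    # Excess allele sharing attributable to common ancestry.
    delta = (p_disease - p_normal) / (1.0 - p_normal)
    # Sharing decays as (1 - theta)**G each generation; solve for G.
    return math.log(delta) / math.log(1.0 - recomb_fraction)

# Hypothetical inputs: 64% sharing on CF chromosomes, 40% background,
# marker about 0.5 centimorgans from the mutation (theta ~ 0.005).
g = allele_age_generations(0.64, 0.40, 0.005)
print(f"~{g:.0f} generations, ~{25 * g:,.0f} years at 25 years/generation")
# ~183 generations, ~4,570 years
```

The farther a marker sits from the mutation, the faster recombination shuffles it off the founder haplotype, so combining the sharing observed at many markers pins down the age.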

Two copies of the sickle cell gene cause the disease. But carrying one copy reduces the risk of malaria. The gene is widespread among people who live in regions of the world (red) where malaria is endemic. (ellepigrafica)

We determined that the most recent common ancestor lived between 4,600 and 4,725 years ago and that the mutation arose in southwestern Europe, probably in settlements along the Atlantic Ocean, perhaps in the region of present-day France or Portugal. We believe the mutation spread quickly from there to Britain and Ireland, and later to central and southeastern European populations such as Greece, where F508del was introduced only about 1,000 years ago.

Who spread the CF mutation throughout Europe?

Thus, our newly published data suggest that the F508del mutation arose in the early Bronze Age and spread from west to southeast Europe during ancient migrations.

Moreover, taking the archaeological record into account, our results allow us to introduce a novel concept: the Bell Beaker folk were probably the migrating population responsible for the early dissemination of F508del in prehistoric Europe. They appeared somewhere in Western Europe during the transition from the Late Neolithic period, which began around 4000 B.C., to the Early Bronze Age in the third millennium B.C. They were distinguished by their ceramic beakers, their pioneering copper and bronze metallurgy north of the Alps and their great mobility. All studies, in fact, show that they migrated extensively, traveling all over Western Europe.

Distribution of Bell Beaker sites throughout Europe. (DieKraft via Wikimedia Commons)

Over approximately 1,000 years, a network of small families and elite tribes spread their culture from west to east, into regions that correspond closely to the present-day European Union, where the highest incidence of CF is found. Their migrations are linked to the advent of Western and Central European metallurgy, as they manufactured and traded metal goods, especially weapons, while traveling over long distances. It has also been speculated that their travels were motivated by the building of marriage networks. Most relevant to our study is evidence that they migrated in a direction, and over a time period, that fit well with our results. Recent genomic data suggest that both migration and cultural transmission played a major role in the diffusion of the “Beaker Complex,” leading to a “profound demographic transformation” of Britain and elsewhere after 2400 B.C.

Determining when F508del was first introduced in Europe and where it arose should provide new insights about the high prevalence of carriers, and about whether the mutation confers an evolutionary advantage. For instance, Bronze Age Europeans, while migrating extensively, were apparently spared exposure to endemic infectious diseases or epidemics; thus, protection from an infectious disease, as with the sickle cell mutation, seems an unlikely explanation.

As more information on Bronze Age people and their practices during migrations becomes available through archaeological and genomics research, more clues about the environmental factors that favored people with this gene variant should emerge. Then we may be able to answer patients’ and parents’ questions about why they have a CFTR mutation in their family and what advantage it may endow.

Examples of tools and ceramics created by the Bell Beaker people. (Benutzer:Thomas Ihle via German Wikipedia, CC BY-SA)