
English Mass Grave Sheds New Light on the Horrors of the Black Death

Smithsonian Magazine

The Black Death is among the most traumatic epidemics in recorded history. The disease swept through 14th-century Europe, killing tens of millions of people. Now, a newly discovered burial pit at the site of a former abbey in the English countryside could shed new light on how people outside of major cities were devastated by the plague, Haroon Siddique reports for The Guardian.

Historians estimate that nearly half of England’s population was killed by the plague in the mid-14th century, but until now the only burial sites researchers knew of with evidence of the Black Death were in London. Recently, however, archaeologists working at the ruins of a medieval abbey in the countryside north of the capital came across a mass grave dating to 1349, a year after the plague first reached England, writes Siddique.

"The finding of a previously unknown and completely unexpected mass burial dating to this period in a quiet corner of rural Lincolnshire is thus far unique, and sheds light into the real difficulties faced by a small community ill-prepared to face such a devastating threat," Hugh Willmott, a researcher at the University of Sheffield's Department of Archaeology, says in a statement.

The grave contained 48 skeletons, 27 of which were children. After the archaeologists uncovered it, they were able to recover tooth samples from some of the remains, which were then sent to McMaster University in Canada for DNA analysis. The analysis found traces of DNA from Yersinia pestis, the bacterium responsible for the plague, suggesting that these individuals fell victim to the Black Death, the BBC reports.

An archaeologist examining the remains of plague victims buried at Thornton Abbey, north of London. (University of Sheffield)

The origins of the pit may be gruesome, but finding a mass burial like this is rare in England. Most people at the time were buried in individual graves in their local parish, as communities tried to maintain some semblance of normal rites despite the high death rates, Oliver Moody reports for The Times. Even though these bodies were buried all together, they were laid out in even rows, suggesting that the mourners still took care with the bodies of the deceased.

“You only get graves like this when the normal system of burial has broken down,” Willmott tells Moody. “Whether the priest [was] dead or the gravediggers had died, we don’t know. This clearly was a community that was hit very hard and had to rely on the monastery for help.”

In addition to the human remains, Willmott and his colleagues uncovered little trinkets and remnants of the things these people may have carried while alive. One intriguing find was a small amulet in the shape of a T, which people at the time may have believed could cure certain diseases, Moody reports.

As the excavation continues, Willmott and his team hope to uncover more details about these people’s lives from the objects in the mass grave and from further genetic clues gathered from the remains. With more insight into how these people lived, historians can finally start to figure out how communities in the countryside coped with the devastating disease.

A pendant found at the site shaped like a "Tau Cross," which may have been believed to cure sickness. (University of Sheffield)

Measles Outbreaks Are on the Rise in the U.S.

Smithsonian Magazine

Cases of measles fell dramatically in the U.S. after the vaccine’s introduction, though numbers seem to be creeping up in recent years. Photo: 2over0

Prior to the release of the measles vaccine in 1963, hundreds of thousands of people in the U.S. contracted the potentially deadly respiratory illness each year. Since the mid-1990s, cases have declined sharply, with just 37 cases of the viral disease occurring in 2004. Now, however, the disease seems to be making a very slight rebound. In 2011, 222 people in the U.S. contracted measles. Then, in 2012, cases fell again to 54. But 2013 seems to be another measles-prone year, with 118 cases reported so far, many of them clustered among Brooklyn’s Orthodox Jewish communities. The Wall Street Journal reports:

In March, New York City health authorities saw a sudden rise in measles cases in several densely populated Orthodox Jewish communities.

The disease quickly spread. Among the 58 measles cases reported thus far, a child contracted pneumonia and two pregnant women were hospitalized, according to the New York City Department of Health and Mental Hygiene. One of the women had a miscarriage.

Like many of the cases in the U.S. reported in recent years, the Brooklyn outbreaks seem to have originated from someone contracting the disease abroad and carrying it home—in this case, from London. Now, Orthodox Jewish communities are on the alert, and a push is underway to make sure all children receive their MMR vaccination to protect against the disease.

Developing countries are no strangers to the disease. In Pakistan, around 25,000 people have contracted measles this year, and 154 children have died. In such places, vaccines are often unavailable or prohibitively expensive (not, as in some American circles, avoided because of erroneous concerns about the MMR vaccine causing autism). Of the roughly 8,500 measles cases in the European Union over the past year, around 80 percent occurred in people who had not been vaccinated. In the Brooklyn outbreak, all of the cases originated in unvaccinated people.

As the Wall Street Journal points out in another story on the toll of anti-vaccine activism, choosing to skip vaccines jeopardizes the health of the community since diseases such as measles are highly contagious.

More from Smithsonian.com:

Without Vaccines, Hundreds of Children in Pakistan Have Died from a Measles Outbreak 
The Black Death Never Left – And It Might Defeat Our Best Defenses 

How a Farming Project in Brazil Turned Into a Social and Ecological Tragedy

Smithsonian Magazine

It's a tale of displaced workers, disease epidemics and gruesome deaths that will haunt ecologists and sociologists for decades to come. This is what went wrong in the Brazilian state of Rondônia, where farmers and indigenous peoples are still paying the price for a combination of poor government planning and limited knowledge about rainforest ecology.

In this week's episode of Generation Anthropocene, producer Mike Osborne gets the incredible story of Rondônia from Bill Durham, an anthropologist and human ecologist at Stanford. He studies the ways human populations have adapted to their environments, and the reasons those same populations often seem to wreak havoc on the natural world around them.

According to Durham, the story of Rondônia kicks into gear in the late 1970s, when tens of thousands of agricultural workers found themselves out of jobs due to technological advances on farms. To address the issue, the Brazilian government looked to the untapped resources of the Amazon.

"Here you have this area that’s the largest piece of unbroken tropical rainforest left in the Americas and it’s the center of your country. It’s not incorporated in the national economy. It’s not being very productive, and Brazil saw this as a potential solution," Durham says.

With funding support from the World Bank, the government set up a program to settle people in the rainforest, clearing land and building roads in a specific pattern that, in theory, would allow them to farm commercial crops such as coffee while keeping some of the rainforest untouched and preserving the welfare of indigenous people nearby. 

The hitch? No one had tested the soil to see if it could support the crops being grown. When a million people tried to take part in the resettlement program, they quickly found out that their farms were not as productive as hoped. From there, the vast social and ecological experiment turned into a nightmare.

More land was cleared and in some places cattle ranchers moved in, creating conflict between settlers and the tribes of the region, some of whom practice headhunting for survival and social status. The fringes of the cleared areas also created the perfect breeding ground for the mosquito that transmits malaria, which quickly infected up to 40 percent of the migrants. At the same time, indigenous groups were being exposed for the first time to diseases such as measles and chicken pox.

To find out what happened next in Rondônia, listen to the full interview with Durham in the audio clip above.

Buckingham Palace Remembers Princess Diana With New Exhibit

Smithsonian Magazine

Twenty years after her death, a new exhibit at London's Buckingham Palace remembers the life and legacy of Princess Diana. The Palace display, a recreation of Diana's sitting room in Kensington Palace, feels like a time capsule—full of personal memorabilia and the princess' distinctive tastes.

“It was chosen to reflect an aspect of Diana, Princess of Wales’ official duties," curator Sally Goodsir tells the Associated Press. In her years as princess, Diana became beloved for her commitment to helping the less fortunate, including working actively with people suffering from HIV and AIDS at a time when many people were still afraid to even touch them.

Many of the objects on display were selected by Diana's sons, Prince Harry and Prince William, the AP reports. The brothers highlight her love of music—including her diverse collection of cassette tapes, ranging from R&B musicians Lionel Richie and George Michael to Luciano Pavarotti, her favorite opera singer.

Some of the 200 gifts presented by world leaders and notable people to Queen Elizabeth II on display (Royal Collection Trust)

The exhibit is part of the annual opening of Buckingham Palace to the public each summer while Elizabeth II vacations at her estate in Scotland. The openings began in 1993 to raise money to restore Windsor Castle after a devastating fire there the year before.

Also on display this summer are more than 200 gifts presented to Elizabeth II during her 65-year reign, ranging from a portrait made of woven banana leaves from Rwanda to a badge worn by British astronaut Tim Peake in space.

Not on display are any of the dozens of live animals that have been presented as gifts to the queen over the years, ranging from an elephant from Cameroon to sloths, toucans and even a giant armadillo from Brazil. Those animals have been returned to their native countries to be cared for, the Guardian notes.

Democratic Republic of Congo Approves Ebola Vaccine

Smithsonian Magazine

Three years ago, 49 people died of Ebola in the Democratic Republic of Congo—and in the unrelated outbreak that devastated West Africa between 2014 and 2016, over 11,000 deaths were recorded. So it’s no wonder that news of Ebola’s return to Congo set off alarm bells for health officials, who are now watching to make sure an outbreak does not become an epidemic.

But now, reports NPR’s Michaeleen Doucleff, there’s a new tool available in the fight against the deadly virus: the Ebola vaccine. And the Democratic Republic of Congo has agreed to use it.

The highly effective vaccine, rVSV-ZEBOV, was recently tested in Guinea. When the trial started in 2015, outbreaks of the virus were still happening in the region. According to the World Health Organization, none of the 5,837 people who were vaccinated contracted Ebola, while some people in the same area who did not receive the vaccine did. Ira Longini, a biostatistician who helped test the vaccine, tells Doucleff that while the observed efficacy was 100 percent during the trial, the vaccine is likely between 70 and 100 percent effective.
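
For readers curious how an efficacy figure like that is derived, here is a minimal sketch of the standard attack-rate calculation in Python. The counts below are hypothetical placeholders, not the trial's actual data (the trial recorded zero cases among the vaccinated, which is why the observed efficacy was 100 percent).

    # Minimal sketch: vaccine efficacy from attack rates.
    # The counts below are HYPOTHETICAL, not the rVSV-ZEBOV trial data.
    def vaccine_efficacy(cases_vaccinated, n_vaccinated, cases_control, n_control):
        """Efficacy = 1 - (attack rate among vaccinated / attack rate among controls)."""
        return 1 - (cases_vaccinated / n_vaccinated) / (cases_control / n_control)

    # Hypothetical example: 2 cases among 5,000 vaccinated vs. 20 among 5,000 controls.
    print(f"efficacy: {vaccine_efficacy(2, 5000, 20, 5000):.0%}")  # -> efficacy: 90%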

As Smithsonian.com reported earlier this month, Ebola returned to the Democratic Republic of Congo in late April when a group of people in a remote area were stricken with hemorrhagic fever. According to the most recent World Health Organization update, there have been a total of two confirmed, three probable and 12 suspected cases thus far.

Gavi, the Vaccine Alliance, a public-private global health partnership that focuses on immunizations in poor countries, committed to purchasing the vaccine before it was licensed, Nature’s Erika Check Hayden reported last year. Merck, the vaccine’s manufacturer, provided a stockpile of 300,000 doses of the vaccine.

Saving all of those vaccines for a rainy day seems to have worked: Now, the vaccine is available for use where it’s needed. However, the vaccine is still technically experimental and, Reuters reports, will only be used if someone outside the known chain of transmission is identified as having Ebola.

The known cases occurred in an extremely remote, forested area, and it’s still unclear whether the logistics of organizing a vaccination campaign and transporting the precious immunizations will be workable. Still, the existence of the vaccine, and the willingness to deploy it if necessary, is a relief: until the vaccine was developed, the only way to fight the disease was to isolate the infected from everyone else.

How Armadillos Can Spread Leprosy

Smithsonian Magazine

Last week, officials in eastern Florida announced the emergence of three new cases of leprosy—the ancient, highly stigmatized disease once handled by isolation—in the last five months. And two of those cases have been linked to contact with the armored, strangely cute critters common across the American South: armadillos.

Armadillos are the only animals besides humans known to host the leprosy bacillus. In 2011, the New England Journal of Medicine published an article formally linking the creatures to human leprosy cases—people and armadillos tested in the study shared the exact same strain of the disease.

So, what’s unique about armadillos that makes them good carriers? Likely a combination of body temperature and the fragile nature of the disease. As the New York Times reports, leprosy is a “wimp of a pathogen." It’s so fragile that it dies quickly outside of the body and is notoriously difficult to grow in lab conditions. But with a body temperature of just 90 degrees Fahrenheit, one hypothesis suggests, the armadillo presents a kind of Goldilocks condition for the disease—not too hot, not too cold. Bacterial transmission to people can occur when we handle or eat the animal.

But before you start to worry about epidemics or making armadillo eradication plans, find comfort in this: Though Hansen’s disease, as it is clinically known, affects some 250,000 people worldwide each year, only about 150 to 250 of those cases occur in Americans. Even more reassuring: up to 95 percent of the population is genetically unsusceptible to contracting it. And these days, it is highly treatable and not nearly as contagious as once believed.

As for armadillos, the risk of transmission to humans is low. Only the nine-banded armadillo is known to carry the disease. And most people in the U.S. who come down with the chronic bacterial disease get it from other people while traveling outside the country.

And it looks like armadillos are the real victims here. Scientists believe that we actually transmitted leprosy to them about 400 to 500 years ago. Today, up to 20 percent of some armadillo populations are thought to be infected. At least, according to one researcher at the National Hansen’s Disease Program in Baton Rouge, the critters rarely live long enough to be seriously affected by the disease’s symptoms.

Experts say the easiest way to avoid contagion is to simply avoid unnecessary contact with the critters. And, of course, they advise not to go hunting, skinning or eating them (which is a rule the armadillos would probably appreciate, too).

Seriously, Just Stay in Bed: Fever-Reducing Pills May Boost Flu Transmission

Smithsonian Magazine

For those of us in good health, the flu feels like more of an annoyance than a threat—a few days of being stuffed up, some aches and pains, maybe a fever. But for the old, the young and the immune-compromised, the flu can be deadly. Flu-related mortality ebbs and flows from year to year, ranging from 1.4 to 16.7 deaths per 100,000 people per year in the United States, according to a 2010 CDC report. This lingering threat, taken in context, makes a new study out of McMaster University even more troubling.

In their report, the scientists tried to figure out the effect of fever-fighting medications like aspirin or ibuprofen on flu transmission.

Previous experimental research, says Science magazine, had shown that “lowering your body temperature may make the virus replicate faster and increase the risk that you transmit it to others.” Your body turns up the heat to try to cook the flu virus out. By stymieing the fever with medication, you're also making it easier on the virus. Having more virus flowing through your system, the scientists suggest, makes you more likely to transmit it to those you meet.

The scientists extrapolated from this potential for a higher transmission rate to calculate the effect on the wider population, says the CBC:

"We put together a chain — how many people have influenza, how many of them take these anti-fever drugs, how much does that increase the amount of virus they give off, how much does that increase the chance that they’re going to affect somebody else, how much does that increase the overall size of the seasonal flu epidemic," said Ben Bolker, professor of math and biology.

...After crunching numbers, the researchers realized that avoiding medications with ibuprofen, acetaminophen and acetylsalicylic acid and just staying home could save many lives. Their research estimates as many as 1,000 lives across North America each year could be saved.

The work is still preliminary, says Science, and the researchers tried to keep their calculations conservative: “they didn't even include the effect that after an aspirin, people may be more likely to go out and meet people, increasing chances of spreading disease.”
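
To make the chain of multiplications Bolker describes concrete, here is a minimal sketch in Python; every parameter value below is an illustrative assumption, not a figure from the McMaster study.

    # Minimal sketch of the multiplicative chain described above.
    # Every value here is an illustrative assumption, not the study's estimate.
    flu_cases = 30_000_000        # assumed seasonal flu infections in North America
    frac_medicated = 0.5          # assumed fraction who take fever-reducing drugs
    transmission_boost = 0.05     # assumed extra transmission from suppressed fevers
    case_fatality = 0.001         # assumed deaths per infection

    extra_cases = flu_cases * frac_medicated * transmission_boost
    extra_deaths = extra_cases * case_fatality
    print(f"extra cases: {extra_cases:,.0f}, extra deaths: {extra_deaths:,.0f}")
    # With these made-up inputs: 750,000 extra cases and 750 extra deaths,
    # the same order of magnitude as the study's estimate of ~1,000 lives.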

Taking anti-fever medications may make you feel less sickly, but they may also make you more contagious. The best bet, then, it seems: whether you pop the pills or skip them altogether, just stay in bed.

Tribal Force, the First Comic to Feature a Team of Native American Superheroes, Is Returning

Smithsonian Magazine

Superheroes have always had a diversity problem, which is sort of ironic, really. In a world where people can fly, shoot lasers, sling webs, change shape or wield any other of a seemingly endless string of powers, the representation of actual human diversity—physical, cultural, racial, sexual—is often left out of the mix.

Since the chiseled white jaws of Superman and Captain America first started gracing comics, the industry has made some efforts to address this. Starting in the 1960s, the formation of the X-Men brought issues of racism to the comics, while other series have dealt with issues of sexuality. But there still were (and are) gaps.

In 1996, writer Jon Proudstar and artist Ryan Huna Smith created Tribal Force, a comic centered on the first band of superheroes made up entirely of Native Americans. The comic charted the path of five people who used the powers granted to them by the god Thunderbird to protect native land from a powerful, high-tech government, and was incredibly well received. (Its accolades include a slot in the Smithsonian National Museum of the American Indian's Comic Art Indigène exhibit.) But it had only a short run. Proudstar and Smith split, and the comic went dark.

After nearly two decades, Tribal Force is coming back, says High Country News. In an interview, Proudstar explores his goals for Tribal Force, including exploring Native American cultural histories and stories, and tackling serious issues faced today by those living on reservations.

HCN: If the members of Tribal Force were here today, what would they be most upset with?

Proudstar: Tribal Force looks at the same issues that rez kids have to deal with. When I was younger, I remember thinking, "We'll always be poor, struggling, seeing relatives being arrested." That was kind of crushing. But I educated myself by reading a lot, and in broadening my horizons, I realized that things will change – and that you can change them.

The first issue I'm dealing with in the book is the epidemic of child molestation on Indian reservations. Seven out of 10 girls – it's a huge cancer. Gabe has fetal alcohol syndrome … and he's into weed and drinking, and struggles with learning what it truly is to be a warrior. A lot of kids misinterpret what a warrior is. It has nothing to do with war. A warrior takes care of his village, makes sure the old ones are taken care of, and that the children are safe.

But for the most part, it's a comic book. There's action and aliens, and weird stuff.

Tribal Force wasn't the first comic to include Native American heroes, but many of those early attempts, from the cringe-inducing Red Warrior to Marvel's Red Wolf, suffer from ham-handed racial stereotyping.

Africa's Ongoing Ebola Outbreak Is the Worst the World Has Ever Seen

Smithsonian Magazine

It started in February, when the Ebola virus began spreading through the forested regions of southeastern Guinea, in West Africa. By late March, 29 people had died, and 49 were infected. The death toll climbed for months, though by spring the outbreak seemed to be waning. But in June the virus' spread picked up speed again, and now, says the BBC, the ongoing Ebola outbreak affecting Guinea, Liberia and Sierra Leone is “the biggest and most deadly Ebola outbreak the world has seen.”

As of yesterday, the World Health Organization says that there have been 544 laboratory-confirmed cases of the disease; 291 of those people have died. In total, 759 people are suspected to have been infected with the disease, with 467 deaths.

The lack of a structured health care system in the region is making it harder to fight the disease's spread, says Maggie Fox for NBC News. In one instance, she says, medical “team members arrived in at least one village to find it deserted, and the body of an Ebola victim left unattended in a house.”

“It’s not hard to imagine what happened, but it makes it impossible to track down people who might have been infected and get them to hospitals for what care can be provided, and to prevent them from infecting others.”

The Ebola virus was only identified in 1976, says Fox, but this outbreak's climbing death toll and large geographic spread make it the worst yet seen.

Ebola is an incurable disease, and the death rate is high. The only way to stop the disease's spread, says the BBC, is to quarantine the infected. Yet, virologist W. Ian Lipkin tells National Geographic, stopping its spread is easier than for some other diseases, such as those caused by airborne viruses.

This is not a highly transmissible disease, where the number of people who can be infected by a single individual is high. You have to come into very close contact with blood, organs, or bodily fluids of infected animals, including people. If you educate people properly and isolate those who are potentially infected, it should be something you can bring under control.

Before the most recent cases, the most deadly outbreak of Ebola was in the Congo, says the Guardian. Some are attributing the higher death toll of this outbreak to better surveillance, though, rather than a truly higher number of cases.

1830 Violet Alexander's "Flowering Tree" Appliqued Quilt

National Museum of American History

A quilted inscription at the base of the flowering tree on this quilt reads “Violet E. L. Alexander / June 10 / 1830.” The central focus of this quilt, a flowering “Tree of Life” motif, is appliquéd on a 40-inch square of white cotton. Other motifs of palm trees, flowers, and long-tailed birds are appliquéd on white cotton triangles to fill out the center section. This is framed by 3-inch and 7-inch borders that are made of roller-printed floral and geometric stripes. The two borders are separated by a 3¾-inch plain white border. The corner motifs and some parts of the central tree are cut from block-printed cotton produced at the Bannister Hall print works near Preston, England.

The quilting pattern, 8 stitches per inch, consists of diagonal lines, ¼-inch apart, over the entire center and on the printed borders. Clamshell quilting is found on the plain white border. The fine quilting and use of costly chintz fabrics printed in England make it a typical example of a medallion quilt, popular in the early nineteenth century, and often found in the American South.

Violet Elizabeth was the daughter of William Bain Alexander and Violet Davidson. Violet was born January 9, 1812. She was one of fourteen children (seven girls and seven boys) who grew up on a prosperous estate in Mecklenburg County, North Carolina. On December 27, 1831, she married Dr. Isaac Wilson, who both farmed and maintained an extensive medical practice. The couple had six children, five sons and one daughter. Two sons lost their lives in the Civil War, two others farmed in the county, and another practiced medicine. Violet died at age 33 of erysipelas, a bacterial infection, during an epidemic in 1845. This quilt was made just prior to her marriage. According to information from the donor, Dr. John E. S. Davidson, the quilt may have been made by his mother, Jane Henderson (Mrs. Edward Constantine Davidson), a friend or relative of Violet.

Note: The name Violet appears and reappears in the family. She may have gone by the name “Elizabeth,” as some sources cite.

Pat Yourself on the Back, America: The U.S. Is Not Freaking Out About Ebola (For the Most Part)

Smithsonian Magazine

The breathless expositions from some politicians and media outlets might leave the impression that Ebola is a direct threat to the health and safety of Americans. At any moment someone might get off a plane and infect us all—a doomsday epidemic of an incurable disease!!!

But even if some people are reacting with irrational fear (or calculated political maneuvers), a new poll from the Washington Post and ABC News suggests that, when it comes to Ebola, Americans have (mostly) got their heads on straight. Good job!

According to the poll, most Americans, some 63 percent, are either “not too worried” or “not worried at all” that they or someone in their immediate family might catch the virus. It's unlikely that the other 37 percent of Americans are planning to go to West Africa soon, or have a family member living there now—so they probably shouldn't be worried, either. But a 63 percent cool-and-collected citizenry is not too shabby.

The poll largely echoes an earlier one conducted by the Pew Research Center that found that 32 percent of those polled are either “very worried” or “somewhat worried” that they or someone they know will be exposed to the virus.

At odds with the broad scientific consensus, however, 60 percent of those polled are also either “very concerned” or “somewhat concerned” that there could be a widespread Ebola outbreak in the U.S. So, there's some PR work for health professionals to do yet. Overall, though, says the Washington Post, the poll suggests that “fears among Americans about the Ebola virus appear to be waning.”

Maybe if we can all calm down on this side of the Atlantic, focus can get back to where it should be: West Africa, where the outbreak just hit 13,703 official cases.

Probiotics Exist Thanks to a Man Who Drank Cholera

Smithsonian Magazine

Sometimes it seems like everything in the dairy aisle at the supermarket wants to fix your guts. If you put on a blindfold and picked something off the shelf at random, the chances are pretty good that whatever you grabbed would have the word “probiotic” emblazoned somewhere on the packaging. And it’s all thanks to a man who once drank a glass of cholera for science.

Ilya Metchnikoff, a researcher at the Pasteur Institute in Paris, was obsessed with figuring out how the immune system works. Back in the late 19th century, the accepted theory was that white blood cells actually aided bodily infections by creating an environment friendly to invading microbes and helping them spread. But by comparing the immune responses of animals like starfish to the human immune system, Metchnikoff showed that white blood cells fight on the front lines against infection, writes Lina Zeldovich for Nautilus. His discovery shattered traditional conventions of medical science and won him the 1908 Nobel Prize.

Metchnikoff discovered all sorts of things that now form the foundations of our understanding of the human body, but during his life, much of his work was considered radical. “A lot of the things he did were very prescient,” Siamon Gordon, professor emeritus of cellular pathology at the University of Oxford, told Zeldovich. “Right now several of his ‘crazy’ ideas are absolutely mainstream.”

Which brings us to 1892. A cholera epidemic was sweeping through France, and Metchnikoff was struggling to understand why the disease struck some people and not others. To find out, he sucked down a drink full of cholera. He never got sick, so he let a volunteer drink some too. When that volunteer also failed to get sick, Metchnikoff offered the drink to a second test subject. That man, however, didn't fare so well. He got cholera and nearly died.

From there, Metchnikoff went to the lab. Zeldovich writes:

When Metchnikoff took his experiments into the petri dish to find out what caused such a marked difference, he discovered that some microbes hindered the cholera growth while others stimulated it. He then proposed that the bacteria of the human intestinal flora played a part in disease prevention. And, he reasoned, if swallowing a pathogenic bacterial culture sickened you, then swallowing a beneficial one would make you healthier. Therefore, he decided, the proper alteration of the intestinal flora could help battle diseases that had plagued humans for centuries.

Once again, however, Metchnikoff ran up against mainstream science. A popular theory of the time was that the large intestine was a reservoir for noxious bacteria and was itself the source of most stomach problems. At least one surgeon recommended that people suffering from digestive issues have the whole thing removed. But Metchnikoff was convinced by his work that gut problems could be healed by restoring balance to a person’s microbiome. He began experimenting with different microbial cultures, especially one that was popular for yogurt-making in Eastern Europe, and discovered that some types of microbes did in fact aid people with stomach problems.

Though Metchnikoff's theories never gained prominence during his lifetime, writes Zeldovich, there was one notable exception: a small company in Barcelona that started marketing yogurt as medicine in 1910. A few years later, the company expanded to the United States, where it was branded “Dannon.”

Metchnikoff died in 1916, long before he could see his fringe ideas become the foundations for a mainstream juggernaut. Probiotics research now feeds a multibillion-dollar industry, and supermarket shelves are packed with cultured milk products like yogurt and kefir. But despite all the hype and branding, yogurts do contain some bacteria that are good for our bodies. And it’s thanks to a man brave enough to drink some cholera that we can reap their benefits today.

Four American Cities Voted for Taxes on Soda Last Night

Smithsonian Magazine

The presidential election was at the top of the ticket yesterday, but it wasn't the only item on the ballot. In an effort to help combat rising obesity and diabetes rates, four cities voted to subject sodas and other sugary drinks to new taxes.

As of last night, three cities in California’s Bay Area (Oakland, San Francisco and Albany) joined Boulder, Colorado in a small but vocal group of cities experimenting with raising taxes on non-alcoholic, sugar-sweetened beverages like sodas, energy drinks, sweetened tea and sports drinks, Rachel Becker reports for The Verge. Soon, drink distributors in the three Californian cities will have to pay a new tax of one cent per ounce on these drinks, while those in Boulder will pay a steeper two cents per ounce.

That may not seem like much at first blush, but it has the potential to add up over time. While the average soda-drinker might see the price of their drink go up by a few cents at the corner store, those pennies go far. According to Becker, these cities estimate the new taxes will bring in millions of dollars of annual revenue in the coming years, while potentially discouraging people from reaching for sugary drinks when they are feeling parched.
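
To see how those pennies accumulate, here is a quick back-of-the-envelope sketch in Python using the tax rates from the measures; the container sizes are my example values, not figures from the ballot measures themselves.

    # Quick arithmetic on the per-ounce soda taxes.
    # Rates come from the ballot measures; container sizes are example values.
    rates = {"Bay Area cities": 0.01, "Boulder": 0.02}   # dollars per ounce
    containers = {"12 oz can": 12, "20 oz bottle": 20, "2-liter bottle": 67.6}

    for city, rate in rates.items():
        for name, ounces in containers.items():
            print(f"{city}: {name} adds ${rate * ounces:.2f} in tax")
    # e.g., a 2-liter bottle carries about 68 cents of tax at 1 cent per ounce.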

“This night goes to every single person I spoke to who told me their story about diabetes,” Joyce Ganthavorn, who spent the last year advocating for the tax in San Francisco and Oakland, tells Farida Jhabvala Romero for KQED Radio. “This victory goes out to them.”

That’s not to say these were easy fights: beverage industry giants like the Coca-Cola Company, PepsiCo, Inc. and Dr Pepper Snapple Group, Inc. have spent tens of millions of dollars in recent years fighting these kinds of ballot measures, and this was no exception. While advocates for the new taxes poured more than $20 million into the fight, groups backed by retailers and the beverage industry spent at least $30 million on fighting these taxes at the ballot box, Mike Esterl reports for the Wall Street Journal.

“I think they see this as very important for their future, and they are trying to make it clear to other cities and other states that might consider similar types of taxes that they are going to fight hard,” Jason McDaniel, a political science professor at San Francisco State University, tells Romero.

These cities aren’t the first to pass a tax on sugary drinks: In 2014, Berkeley, California became the first city in the nation to start taxing sodas more, with Philadelphia, Pennsylvania following suit earlier this year. However, with the beverage tax scheduled to hit the City of Brotherly Love on January 1, 2017, the beverage industry is pushing back and suing to keep it from being implemented, Becker reports.

“We respect the decision of voters in these cities. Our energy remains squarely focused on reducing the sugar consumed from beverages—engaging with prominent public health and community organizations to change behavior,’’ representatives of the American Beverage Association tell Esterl.

A tax on soda is far from a silver bullet in the fight against the obesity epidemic. The jury is still out on how much of an impact raising prices on sugary drinks has on obesity rates in the long term, and the beverage industry is continuing to pour millions into protecting its products. However, with other cities continuing to propose and consider levying taxes on sugary beverages, these are likely not the last soda taxes to come.

Swine Flu: Worst Case Scenario

Smithsonian Magazine

On Monday, the President's Council of Advisors on Science and Technology released a report assessing the U.S. preparations for the H1N1 flu virus (a.k.a. swine flu), which is expected to soon make a resurgence in this country. But despite the conclusion that the nation is on track in this area ("The preparations are the best ever for an influenza pandemic," PCAST co-chair Eric Lander said), media reports are focusing on the worst case scenario outlined in the report:

Infected: 150 million
Symptomatic: 120 million
Needing medical attention: 90 million
Needing hospital care: 1.8 million
Needing intensive care unit facilities: 300,000
Deaths: 90,000

However, this is only one scenario, and the flu season could end up being no worse than usual (the low-end estimate is about 30,000 deaths, which is an average flu season). And H1N1 is not expected to bring anything like the 1918-1919 flu pandemic that killed 50 million to 100 million people worldwide.
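
For a sense of the internal ratios in that worst-case scenario, here is a quick arithmetic sketch in Python; the division is mine, derived only from the figures quoted above, not from the report itself.

    # Ratios implied by the worst-case numbers above; my arithmetic,
    # derived from the report's figures as quoted, not from the report itself.
    infected, symptomatic = 150e6, 120e6
    hospitalized, deaths = 1.8e6, 90e3

    print(f"symptomatic share of infected: {symptomatic / infected:.0%}")       # 80%
    print(f"hospitalized share of symptomatic: {hospitalized / symptomatic:.1%}")  # 1.5%
    print(f"deaths per symptomatic case: {deaths / symptomatic:.3%}")           # ~0.075%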

The H1N1 virus, though, is unlike the regular flu viruses we have been infected with lately, and few people will have any immunity against it. This means there is some reason to worry, especially if the virus spreads quickly in September before vaccination can take place (the vaccination program is not expected to begin until mid-October). "This potential mismatch in timing could significantly diminish the usefulness of vaccination for mitigating the epidemic and could place many at risk of serious disease," PCAST wrote. Thus, one of their main recommendations in the report is to accelerate production of the initial batch of the vaccine and quickly vaccinate 40 million of the most vulnerable Americans (based on age and disease).

Behavior will also matter, the report notes. Individuals should, of course, be certain to wash their hands frequently and stay home when sick. And workplaces could be encouraged to liberalize their rules to make it easier for people to stay home.

I hope that when the swine flu reappears in the coming weeks we will avoid the panic that occurred earlier this year when the virus first emerged. There's no need, for the moment, to run to the store and buy face masks, and certainly no reason to avoid eating pork or to lock up Afghanistan's sole pig again.

In the meantime, here are a couple of trusted flu resources:

Centers for Disease Control and Prevention

World Health Organization

Dogs Are Being Trained to Sniff Out COVID-19

Smithsonian Magazine

Dogs are being enlisted in the fight against the novel coronavirus. Researchers at the University of Pennsylvania are testing a pack of eight Labrador retrievers to find out if their sensitive snouts can detect the pandemic virus by scent, Karin Brulliard reports for the Washington Post.

Humans have trained our canine friends' finely tuned noses to sniff out other deadly diseases, including malaria, diabetes, some cancers and Parkinson’s disease, reported Ian Tucker for the Guardian in 2018. Other research has shown that viruses give off a particular smell, Cynthia Otto, director of the Working Dog Center at UPenn’s School of Veterinary Medicine, tells the Post.

If the dogs’ 300 million scent receptors can be trained to smell the novel coronavirus, the animals could eventually be used in public places such as airports, businesses or hospitals to quickly and easily screen large numbers of people. And because this diagnosis by dog would depend on the smell given off by people infected with COVID-19, it should have no problem picking out asymptomatic carriers.

The yellow, black and chocolate labs will be trained for three weeks using a process called odor imprinting. Miss M., Poncho and six other dogs will be exposed to COVID-19 positive saliva or urine collected from hospitals and then rewarded with food when they pick out the correct samples, according to a statement from UPenn. When the dogs have the scent, they’ll be tested to see if they can pick out COVID-19 positive people.

“We don’t know that this will be the odor of the virus, per se, or the response to the virus, or a combination,” Otto, who is leading the project, tells the Post. “But the dogs don’t care what the odor is. … What they learn is that there’s something different about this sample than there is about that sample.”

Dogs are also being trained for this purpose in the United Kingdom by the charity Medical Detection Dogs in collaboration with Durham University and the London School of Hygiene and Tropical Medicine, reports the BBC.

"This would help prevent the re-emergence of the disease after we have brought the present epidemic under control," Steve Lindsay, public health entomologist at Durham University, tells the BBC.

The U.K. trial expects to start collecting COVID-19 positive samples in the coming weeks and will train its dogs shortly thereafter, per the Post. If the trial is successful, the group aims to distribute six dogs for screening in U.K. airports.

“Each individual dog can screen up to 250 people per hour,” James Logan, epidemiologist at Durham University and collaborator on the project, tells the Post. “We are simultaneously working on a model to scale it up so it can be deployed in other countries at ports of entry, including airports.”

Otto tells the Post that the trial could inspire an electronic sensor for detecting COVID-19, which might be able to rapidly test thousands of people. But if the dogs’ olfactory prowess can’t be replicated, then the ability to scale up could be limited by another issue: the U.S.’s shortage of detection dogs.

Howardena Pindell Gets Her First Major Museum Survey

Smithsonian Magazine

Howardena Pindell, the multidisciplinary artist and activist for social and political change, has finally gotten her first major museum survey.

As Jason Foumberg reports for The Art Newspaper, the Museum of Contemporary Art Chicago is highlighting the span of Pindell’s groundbreaking career in the recently opened “Howardena Pindell: What Remains to be Seen,” which runs through May 20.

Pindell was born in 1943 in Philadelphia and studied painting at Boston University and Yale University. She worked for 12 years at the Museum of Modern Art in New York, and later as a professor at Stony Brook University, all the while showing her own work extensively.

Having grown up at a time when the South was still lawfully segregated, Pindell found racism an inescapable part of her existence. Foumberg writes that her efforts, both inside and outside of her artwork, reflect that, with a focus on homelessness, the AIDS epidemic, racism and apartheid.

Pindell has also worked tirelessly to improve equality in the art world. She helped lead a protest against a 1979 show by white artist Donald Newman that drew fire for its racist framing. She has also advocated for equal gender representation in galleries.

The new exhibition spans Pindell’s decades-long career. Among the work on view is "Free, White and 21," a 12-minute video that Pindell recorded several months after a car accident left her with partial memory loss in 1979. The Museum of Modern Art writes that the work came out of "her need to heal and to vent." In the video, she appears as herself and as a white woman, delivering a deadpan account of the racism she experienced coming of age as a black woman in America. 

In 2014, writing for Hyperallergic about the show "Howardena Pindell: Paintings, 1974–1980," on view at New York's Garth Greenan Gallery, the critic John Yau praised the rage that courses through her paintings and drawings. Through the layers of acrylic paint and the hundreds of tiny paper dots, made with a hole-punch and applied to a canvas, he writes, "Pindell’s rage became paintings in which dissonance and anarchy were submerged, but not hidden."

Naomi Beckwith, co-curator of “Howardena Pindell: What Remains to be Seen,” echoes that sentiment in her interview with Foumberg. “Howardena was among the first to take formal experiments and use them as the language of politics," Beckwith says. "I want viewers to walk away with the sense that the history of art is always malleable. Howardena is one of those people who can tell a very different story about what art does in our world.”

South Carolina - History and Heritage

Smithsonian Magazine

Before the Europeans started arriving in the 16th century, some 30 native tribes lived on the land that now comprises South Carolina. Smallpox and other diseases carried by the Europeans decimated the native population. Some tribes were wiped out completely. Today, the Catawba, Pee Dee, Chicora, Edisto, Santee, and Chicora-Waccamaw tribes are all still present in South Carolina, as are many descendents of the Cherokee.

Spaniards explored the South Carolina coast as early as 1514, and Hernando de Soto met the Queen of Cofitachiqui in 1540 when he crossed the central part of the state in search of gold. In 1566, the Spanish built a fort on Parris Island. A decade later, they abandoned it in favor of St. Augustine, Florida, and South Carolina was left to the native tribes until 1670, when the English established a settlement at Albemarle Point on the Ashley River.

Many of those first permanent settlers had moved to the colony from Barbados, and South Carolina grew to closely resemble the plantation economy of the West Indies, particularly in the importation and dependence on large numbers of African slaves.

By the 1750s, rice and indigo had made the planters and merchants of the South Carolina Lowcountry the wealthiest men in what would become the United States. White Protestant immigrants continued to pour in, settling in the interior, joined by German, Scots-Irish and Welsh settlers relocating from colonies farther north.

In the Sea Islands along the coast of South Carolina, Georgia, and Florida, a unique culture, Gullah, was evolving among African slaves brought to work the rice fields and their descendants. The Sea Island slaves were the first to be emancipated following the Civil War, and the language, traditions, and customs of the Gullah culture have survived the centuries.

As the tensions leading to the American Revolution rose, South Carolina was a colony divided between those seeking independence and those loyal to the Crown. In 1776, South Carolina became one of the 13 original colonies to declare independence from Britain. Ever since then, the politics of the state have been distinguished by a strong preference for independence and states' rights.

In 1860, the state was the first to secede from the Union. And the first shots of the Civil War rang out over Charleston Harbor on April 12, 1861. Although few of the war's major battles were fought in South Carolina, some 20 percent of the state's white males died in the conflict.

The post-war economy, based to a large extent on sharecropping, made little progress for many decades. The textile industry, which had expanded dramatically after the war, suffered a major blow when a boll weevil infestation devastated cotton fields in the 1920s. Meanwhile, the impoverished state maintained policies of discrimination and segregation that led many African Americans to seek better lives and opportunities in the North.

Since World War II and the Civil Rights movement of the 1960s, South Carolina has bounced back, both politically and economically. Today, agriculture and manufacturing are vital industries, as is an economic engine that draws on the state's history, rich culture, and natural beauty—tourism.

How 13th-Century “Mermaid Bones” Came to Be Displayed in a Japanese Temple

Smithsonian Magazine

In Japan, mermaids are not the conventionally attractive creatures depicted in Disney movies. Called ningyo, Wu Mingren at Ancient Origins writes, the fish-like creatures vary in appearance and are often said to have pointy teeth and, sometimes, menacing horns. They are also purported to have mystical abilities.

Today, the “bones” of a 13th-century ningyo are on display at Ryuguji temple in Fukuoka, reports Shinjiro Sadamatsu at The Asahi Shimbun.

But how did its bones get there?

According to legend, on April 14, 1222, a mermaid washed ashore in Hakata Bay, on the Japanese island of Kyushu. After a shaman declared the mermaid a good omen for the nation, its bones were buried at the Ukimido temple, which people took to calling Ryūgū-jō, the name in Japanese folklore for the undersea palace of the dragon god.

Many believe that what Japanese fishermen and seamen perceived to be mermaids, or ningyo, were actually dugong. Dugong are large sea mammals that live in warm coastal waters; they are related to (and resemble) the manatee. They generally travel alone or in pairs and can remain underwater for up to six minutes at a time.

It’s possible that the Ryuguji temple bones came from a finless porpoise (Neophocaena phocaenoides). These creatures have no dorsal fin (hence their name). Finless porpoises swim off the coast of Japan and in the area of Fukuoka Prefecture; if one washed ashore in 1222, it’s not a stretch to think the locals could have mistaken it for a mermaid.

During the Edo period, between 1772 and 1781, the bones of the temple’s mermaid were removed from their resting place, and visitors to the temple were able to partake of water in which the mermaid bones had been soaked. At the time, people claimed that water steeped with the bones could protect them from epidemics.

Today, six of its bones remain at the temple, which is now officially called the Ryuguji temple. The bones can be seen by appointment, and they appear smooth and glossy, writes Sadamatsu, a look achieved by centuries of handling.

When asked whether the bones are actually from a mermaid, Yoshihito Wakai, the vice director of the Toba Aquarium, demurs. He tells Sadamatsu, “I cannot say anything definitively. I think it’s better to keep a legend a legend.”

Ryuguji temple is not the only holy place in Japan to have a mermaid relic. One of the oldest-known mermaid shrines in Japan is at Fujinomiya, near Mount Fuji, reports Atlas Obscura. The temple at Tenshou-Kyousha has a mermaid mummy purported to be over 1,400 years old. The mermaid was once a fisherman who, according to local mythology, was transformed into a beast because he dared to fish in protected waters. The punishment made the mermaid see the error of its ways, and it asked a prince to display its remains as a lesson—and a warning—to others.

More than 150 Years Later, Canada Exonerates Six Indigenous Chiefs Hanged in 1864

Smithsonian Magazine

In 1864, five chiefs of Canada’s indigenous Tsilhqot’in people were called in to peace talks with the gold commissioner of the Colony of British Columbia. A fierce conflict had been raging between the Tsilhqot’in and white settlers who were building a road through Tsilhqot’in territory—and without Tsilhqot’in permission—to a creek laden with gold. The chiefs believed that the talks were a gesture of reconciliation, but when they arrived at the gold commissioner’s camp, they were promptly arrested, declared guilty of the murder of 14 settlers and hanged. A sixth Tsilhqot’in chief was later executed while trying to offer reparations.

For more than 150 years, this act of deception has been remembered by the Tsilhqot’in people as a deeply painful chapter of their history. Last Monday, John Paul Tasker of the CBC reports, Canadian Prime Minister Justin Trudeau acknowledged the long-standing wounds in an apology for the executions of the chiefs, posthumously absolving them of any wrongdoing.

“Today, we come together in the presence of the Tsilhqot'in chiefs, to fully acknowledge the actions of past governments, committed against the Tsilhqot'in people, and to express the government of Canada's profound regret for those actions,” Trudeau said in the House of Commons, where six modern-day Tsilhqot'in chiefs were invited to witness the apology, according to Tasker.

"As settlers came to the land in the rush for gold, no consideration was given to the needs of the Tsilhqot'in people who were there first,” Trudeau added. “No agreement was made to access their land. No consent was sought.”

The Tsilhqot'in had other reasons to oppose the encroachment of white settlers on their territory. Months before the conflict, remembered as the Chilcotin War of 1864, the Chilcotin Uprising and the Bute Inlet Massacre, the Tsilhqot'in numbers had been halved by a devastating smallpox epidemic, according to Tristin Hopper of the National Post. The disease was thought to have been spread by two white travelers, and it killed some 800 Tsilhqot'in.

Believing that they faced an existential threat, Tsilhqot'in chiefs hatched a plan of attack. In the early hours of an April morning, 24 Tsilhqot'in men ambushed and killed 14 road workers as they slept in their tents. The Tsilhqot'in never denied that they had perpetrated the killings, but as the chiefs reportedly said after their arrest, they “meant war, not murder.”

Trudeau repeated this now-famed quotation during his House of Commons speech. “They acted as leaders of a proud and independent nation facing the threat of another nation,” the prime minister added, according to Meagan Flynn of the Washington Post.

Today’s Tsilhqot'in leaders welcomed Trudeau’s apology and exoneration of their ancestors. “We have always been proud of the sacrifices made by our chiefs, who are heroes to our people, and continue to inspire and guide the work of the future,” Chief Alphonse told reporters at the House of Commons, according to Tasker of the CBC. “Today, Canada has finally acknowledged that our warriors did no wrong.”

After Trudeau’s speech, the six Tsilhqot'in performed a drum ceremony for the assembled politicians. They wore black vests, which they turned inside out partway through the ceremony to reveal the garments' bright red lining. It was a poignant, hopeful gesture; red, to the Tsilhqot'in, symbolizes rebirth and renewal.

21st Century Cures Act Tackles Postpartum Depression

Smithsonian Magazine

This afternoon, the 21st Century Cures Act was signed into law by President Obama. The $6.3 billion package funds a broad range of initiatives, including the Cancer Moonshot, the fight against the opioid epidemic, FDA drug approval efforts and mental health treatments. Among these many provisions, however, the bill also addresses a topic that has received little attention over the years: postpartum depression.

This mental health condition is part of a wider problem of maternal depression that occurs both before and after childbirth. Postpartum depression afflicts up to one in seven mothers after their child's birth, but only around half of those women are ever diagnosed. Proposed by Representative Katherine M. Clark, the Bringing Postpartum Depression Out of the Shadows Act provides $5 million per year from 2018 to 2022 for states to develop screening and treatment programs for mothers.

"Women are falling through the cracks and not getting treatment, even when they're crying out for help," Joy Burkhard, founder of the National Coalition for Maternal Mental Health, tells Annamarya Scaccia at Broadly. "It's the fault of our medical system for not catching the problem."

It's not easy to diagnose and can easily be confused with the so-called "baby blues"—a week or so of mild depression, worry and fatigue that afflicts roughly 80 percent of mothers in the weeks after they give birth. But without treatment, postpartum depression can last for months or years, impacting the mother's and child's quality of life.

Women with postpartum depression often have difficulty following a breastfeeding schedule. They sometimes don't form an emotional attachment to their child. They could even consider hurting themselves or their baby. 

“As a mom of three boys, I know how rewarding, as well as how overwhelming and exhausting, a new baby can be,” Clark tells Caroline Bologna at The Huffington Post. “Moms comprise fewer than a fifth of Congress, so it’s especially important for us to bring these perspectives into policymaking. I introduced this bill because our moms need to know they matter ― that we, as a nation, value them and will fight for the health and success of their families.”

The grants will go towards programs similar to the Massachusetts Child Psychiatry Access Project (MCPAP) for Moms, a Massachusetts state-funded program launched in 2014 to provide training and tool kits for recognizing PPD. The program also established three call centers across the state available for doctors with queries about psychiatric support services.

“The first thing we do is we go to a practice and we provide training. We teach them about the screening tools, we teach them about how to manage depression,” Dr. Nancy Byatt, a psychiatrist at UMass Medical School who started the program, tells Emily Riemer at WCVB5.

The bill also fights the stigma of mental illness, which forces countless people into isolation. The hope is that with improved screening and treatment programs, fewer mothers will be left to battle this illness on their own.

Editor's Note, Dec 15, 2016: This article has been corrected to clarify that postpartum depression afflicts mothers only after their child's birth. Depression during pregnancy is also common, and the newly implemented screening is aimed at detecting and treating both postpartum depression and related conditions.

How Vaccines, a Collective Triumph of Modern Medicine, Conquered the World's Diseases

Smithsonian Magazine

Tucked away in a cabinet on the fifth floor of the National Museum of American History are rows of tiny bottles, boxes and needles. Acrid whiffs of evaporating medicine hint at their purpose.

These are the instruments that brought down polio, smallpox and diphtheria—diseases that killed thousands annually over the past two centuries. By the end of the 20th century, however, mass vaccination programs had eradicated these diseases or brought them under control, both in the United States and abroad.

In the late 19th century, when James Flint (1838-1919), the Smithsonian’s first curator of Materia Medica (medical substances), began the collection, vaccines and serums were at the cutting edge of modern medicine. Flint collected some of the first vaccine products manufactured in America.

In the 1920s, Flint’s successor, Charles Whitebread, curated the Smithsonian’s first exhibition on vaccines to showcase the recent medical advances at the time and to help educate Americans about the power of vaccines and serums in arresting epidemics in their communities. And today, the American History Museum continues that effort, helping to explain the role and importance of vaccines in the nation's history.

Whitebread worked closely with pharmaceutical companies to acquire their latest products. Under his direction, the collection grew to about 100 specimens including the influenza and typhus vaccines developed during World War II. Following in his footsteps, curators today collect vaccines, syringes and serums from pharmaceutical companies, druggists, physicians and public health organizations, making the collection one of the largest and most complete in the country.

Some of the oldest objects in the collection include a patent model for a vaccinator that dates to the mid-1860s and a mid-19th-century scab carrier. (Yes, a scab!)

This small gold-plated case—not much bigger than a quarter—was used by a doctor to carry a fresh scab (or two) “picked” from a recent smallpox vaccination. The scab was still virulent: when a small piece was inserted under the skin of another individual, it caused a mild infection, enough to confer immunity. This rudimentary method helped to protect against smallpox. Alongside these crude relics from the early years of vaccination are some of the latest flu vaccines, developed during the swine flu pandemic of 2009.

Most of the objects are from the United States, but because diseases do not respect national borders, curators have also collected objects associated with global campaigns to control or eradicate disease. The collection includes, for example, artifacts from the successful 1966 to 1980 campaign to eradicate smallpox. These objects range from posters recommending vaccination to postage stamps and samples of the vaccines and needles used by health care workers in the field. A sampling of the museum's medical collections was recently photographed by Smithsonian magazine's Brendan McCabe.

Thomas Duncan, Dallas' Ebola Patient, Has Died

Smithsonian Magazine

Thomas Eric Duncan, the first person diagnosed with Ebola in the United States, has died, the Associated Press is reporting. Duncan's death brings a sad end to a medical struggle that, just yesterday, seemed to be starting to turn around.

On September 20th, Duncan flew into the United States from Liberia, a country that has so far seen 931 laboratory-confirmed cases of the virus and thousands of suspected cases. A week later, Duncan was admitted to Texas Health Presbyterian Hospital in Dallas, where he was diagnosed with Ebola, the first time the virus had ever been diagnosed in the U.S., the CDC said in a press conference last week.

Before his death this morning, things had been looking up for Duncan: his temperature and blood pressure had returned to normal levels, and the diarrhea associated with Ebola had waned, says the New York Times. A few days ago, Duncan had been put on an experimental broad-spectrum antiviral medication, says Time.

Before he was admitted to the hospital but while he was contagious, Duncan had been in contact with 48 people, says the Times. Those people are now under surveillance by the CDC. Duncan's family members are under mandated isolation, though as journalist David Dobbs noted last week, the family does not appear to be receiving the kind of support from health care officials that they might need.

Though Ebola is still overwhelmingly an issue for West African nations, it does appear to be spreading within Western nations. In Spain, says the BBC, investigations are ongoing to figure out how a Spanish nurse contracted the disease—the first time it has been seen to spread outside of Africa.

If Ebola continues to spread—though the likelihood of a widespread outbreak in the West remains incredibly low—the disease's total economic impact could climb as high as $32.6 billion by next year, according to a new estimate by the World Bank. As CTV reports:

It is far from certain that the epidemic will be contained by the end of the year, so the report estimated the economic costs of two scenarios as the battle against the disease continues. The report estimated that the economic impact could top $9 billion if the disease is rapidly contained in the three most severely affected countries, but could reach $32.6 billion if it takes a long time to contain Ebola in the three countries and it spreads to neighboring nations.

A Quarter of the World’s Saiga Antelope Are Dead

Smithsonian Magazine

Spirited, slightly weird-looking and instantly recognizable, saiga antelope find safety in numbers during their spectacular mass migrations. But since the early 2000s, they’ve been considered critically endangered. Now, the fragile antelopes are doing something else en masse: dying. As the BBC’s Victoria Gill reports, a quarter of the world’s saiga population is thought to have died in Mongolia.

It’s devastating news for a species whose existence is already under threat. Scientists tell Gill that ovine rinderpest, a disease also known as sheep plague or Peste des Petits Ruminants (PPR), is to blame. According to the Food and Agriculture Organization of the United Nations, the highly contagious viral disease can affect up to 90 percent of a herd and kill up to 70 percent of the animals that contract it. Its symptoms include fevers, stomach problems and pneumonia, among others. It spreads through close contact between animals—and among free-ranging creatures like antelope, which are not managed by farmers or keepers, it can rage unchecked.

Saiga numbers are already so low that the IUCN considers the entire species critically endangered. Though a population of at least one million is thought to have existed as late as 1994, their numbers have since dwindled, driven down by hunters who poached the animals for their horns, which are sold in Asian countries for medicinal use. As The New York Times’ Erica Goode reports, only 50,000 Mongolian saiga are thought to survive today.

This isn’t the first mass die-off the saiga have suffered. In 2015, nearly half of the global population—over 120,000 animals—died over the course of just two weeks. Though the cause was initially a mystery, scientists and conservationists now believe a bacterial infection was to blame. Altogether, 95 percent of the animals have been lost in just a decade.

How can the potentially disastrous epidemic be halted? As Gill reports, animal carcasses are being burned to prevent PPR from spreading. But the animals that do survive could be left weak and susceptible to other diseases, and conservationists worry that the species could now be doomed. That’s horrible news not just for the antelope but for the grassland ecosystem where they live: other animals could catch PPR, and endangered snow leopards, which rely on saiga for food, could suffer too. The race is on to eradicate PPR and save these strange-looking antelopes from extinction.

Human Diseases May Have Doomed the Neanderthals

Smithsonian Magazine

In the last decade, researchers have realized that the interactions between ancient humans and Neanderthals were much more complicated than previously believed. Not only did Homo sapiens compete with Neanderthals for resources, we extensively interbred with our hominid cousins, an inter-species hookup that gave some modern humans one to four percent of Neanderthal DNA. A new study shows that humans likely gave Neanderthals something too: tropical diseases.

The study, published in the American Journal of Physical Anthropology, suggests that waves of ancient humans traveling out of Africa and into the Neanderthals’ stronghold in Europe probably passed along pathogens like tuberculosis, herpes, tapeworms and the bacterium that causes stomach ulcers.

“Humans migrating out of Africa would have been a significant reservoir of tropical diseases,” study author Charlotte Houldcroft of Cambridge University’s Division of Biological Anthropology says in a press release. “For the Neanderthal population of Eurasia, adapted to that geographical infectious disease environment, exposure to new pathogens carried out of Africa may have been catastrophic.”

It was long assumed that many infectious diseases evolved after the development of agriculture, which allowed humans to crowd together in cities and put them in regular contact with domestic animals. But recent studies of infectious disease genomes reveal that many pathogens emerged tens of thousands, or even millions, of years earlier. Though the researchers found no direct evidence of disease transmission between humans and Neanderthals, the paper argues that these revised timelines make it highly likely that humans carried the diseases with them when they migrated into Neanderthal territory.

Melissa Hogenboom at the BBC points out that researchers once thought Helicobacter pylori, the bug that causes stomach ulcers, appeared about 8,000 years ago, soon after the beginning of agriculture. But H. pylori’s genome reveals the species is at least 88,000 years old. Likewise, a study of herpes simplex 2, the virus that causes genital herpes, shows it was transmitted to humans from an unknown hominid 1.2 million years ago.

Unlike the disease transfer from Europeans to Native Americans, which triggered massive epidemics like smallpox that killed millions of people in a short period of time, the exchange between humans and Neanderthals was likely far more localized, Houldcroft says. Because hunter-gatherers lived in small bands of about 15 to 30 people, infectious diseases would have struck one isolated band at a time, gradually weakening the population’s overall health.

“Our hypothesis is basically that each band of Neanderthals had its own personal disaster and over time you lose more and more groups,” she tells Hogenboom. “I don't think we'll ever find a [single] theory of what killed the Neanderthals, but there is increasing evidence that lots of things happened over a period of a few thousand years that cumulatively killed [them] off.”
