
One of Nature’s Most Extreme Dads, the Darwin’s Frog, Is Going Extinct

Smithsonian Magazine

A Darwin’s frog daddy, of the southerly species. Photo by Claudio Soto-Azat

In 1834, Charles Darwin discovered a strange animal during his exploration of Chile’s southern coast. The creature, a small frog, was shaped like a leaf with a pointed nose, but appeared puffed up as if it had been blown full of air, like a balloon. As it turned out, those fat male frogs hadn’t been gorging themselves on too many mosquitoes, but instead were enacting duties that earn them distinction as one of nature’s best dads. They were incubating several of their squirming babies in their vocal sacs.

These peculiar animals, known as Darwin’s frogs, are today divided into two species, one that occurs in northern Chile, and another that lives in southern Chile and Argentina. When a female Darwin’s frog lays her eggs, her mate keeps a careful watch until the tadpoles hatch. The eager dad then swallows his young, allowing the babies to safely grow within his vocal sac until they turn into frogs and are ready to strike out on their own. Here, you can see a dutiful papa frog seemingly vomit up his living young:

Northerly Darwin’s frogs, however, have not been spotted in the wild since 1980. Researchers are nearly certain the species is extinct. Meanwhile, their southerly cousins are in steep decline and seem to be heading down extinction’s death row as well. For once, it seems that humans are not entirely to blame for these biodiversity disasters (unlike the western black rhino, which bit the dust a couple years ago after enduring decades of poaching for its valuable but medicinally worthless horn, used as an ingredient in traditional Chinese medicine). Instead, the deadly amphibian chytrid fungus, researchers report today in PLoS One, is likely to blame.

The chytrid fungus has popped up in amphibians in North and South America, Europe and Australia. The fungus infects the animals’ skin, preventing them from absorbing water and other nutrients. The fungus can rapidly decimate amphibian populations it comes into contact with, and has been called (pdf) “the worst infectious disease ever recorded among vertebrates in terms of the number of species impacted, and its propensity to drive them to extinction” by the International Union for Conservation of Nature.

To identify chytrid as the likely culprit behind the Darwin’s frogs’ disappearance and decline, researchers from Chile, the UK and Germany conducted a bit of historical sleuthing. They dug up hundreds of archived specimens of Darwin’s frogs and closely related species dating from 1835 until 1989, and then tested them all for fungal spores (the problematic form of chytrid fungus was first recorded in the 1930s and reached epidemic status around 1993, but researchers aren’t certain when it first emerged). They also took around 800 skin swabs between 2008 and 2012 from 26 populations of still-living southern Darwin’s frogs and other similar frog species that live nearby.

Leaf look-alike. Photo by Claudio Soto-Azat

Six of the old museum specimens, all collected between 1970 and 1978–just before the northern Darwin’s frog’s disappearance–tested positive for the disease. More than 12 percent of the living frogs tested positive for the fungal spores. In places where the Darwin’s frog has gone extinct or is experiencing drastic declines, however, rates of infection jumped to 30 percent in other amphibian species. Although these events don’t prove that the fungus killed the northern Darwin’s frogs and is now wiping out the southern species, the researchers strongly suspect that is the case.

Despite evidence that the disease has spread throughout the Darwin’s frog’s range, the researchers are not giving up hope of saving one of the world’s greatest dads from extinction. “We may have already lost one species, the Northern Darwin’s frog, but we cannot risk losing the other one,” Claudio Soto-Azat, the study’s lead author, said in a statement. “There is still time to protect this incredible species.”

Snapshot: Deer Isle

Smithsonian Magazine

Origins: People have lived on Deer Isle and its dozens of rocky surrounding islands since at least 11,000 B.C. Around 8,000 B.C., a culture arose that made sophisticated tools, traded by land and sea, and made extensive use of the islands' rich clam and mussel beds. Lore, if not the archaeological record, suggests that Vikings explored the islands in the 11th century A.D. By the 16th century, several Algonquin-speaking groups had settled the area, most of whose members left or fell to disease or battle after the first white settlers arrived in 1762.

The appeal: Lobstering, rather than tourism, remains Deer Isle's primary economic engine. And thanks to the Haystack Mountain School of Crafts, which draws artists from all over the world—dozens of whom have made the island their home—art may be the second-largest industry. Stonington, the island's largest town, reflects that balance with an old-fashioned harbor crowded with lobster boats and a main street dotted with galleries. Perhaps because of this balance, Deer Isle remains a place to enjoy natural beauty rather than a tourist Mecca filled with t-shirt shops and noise. The air, cooled by the Atlantic and filtered by dense woods of white pine and birch, energizes visitors who hike its many trails or explore its coves and islands by kayak or sailboat, as well as those who choose to simply sit and enjoy the quiet. Bald eagles, osprey, a panoply of duck species and other water birds make frequent appearances. Harbour porpoises are also known to summer here. The bracing air (and chilly waters) rouse big appetites for the local bounty. Deer Isle is known around the world for its sweet Maine lobsters and fat clams. Natives and veteran visitors seek out succulent rock and peekytoe crabs. In recent decades, organic farms and dairies have added to the feast. Sheep and goat farming, practiced here since the late-18th century, continue to provide fresh cheese, wool and meat. And of course, wild Maine blueberries are everywhere during the summer.

Interesting historical facts: Deer Isle granite was used in the Manhattan Bridge, the Boston Museum of Fine Arts, and John F. Kennedy's grave at Arlington National Cemetery, among other notable sites. The Defender, which won the 1895 America's Cup, was crewed entirely by Deer Isle residents.

Famous sons or daughters: Buckminster Fuller, inventor of the geodesic dome, and famed park architect Frederick Law Olmsted spent summers on Deer Isle.

Deer Isle was also home to the woman considered to be Maine's oldest. Born in 1800, Salome Sellers, a direct descendant of the Mayflower settlers and stoic matriarch of an island family, lived through two wars and several epidemics. She died in 1909 at 108. Her farm house is now a museum.

Who goes there? Deer Isle has about 3,000 year-round residents. Perhaps twice that number visit between May and October. In addition to the scions of families that have been summering here since the industrial revolution, Deer Isle draws nature-loving vacationers from all over the world, as well as hundreds of artists and art-lovers who support Haystack, the island's 40 or so galleries, and the Opera House, which produces live performances and serves as the islands' only movie theater. Unlike many holiday destinations, the pace on Deer Isle is resolutely mellow and friendly. At the entrance to the Island Country Club, the sign says, "Public Welcome." Visitors to Deer Isle are happy to leave the cocktail-party circuit to Kennebunkport and the honky-tonk bar scene to other points south.

Then & Now? In 1792, Nathaniel Scott started a ferry service to bring people to and from the mainland. The Scott family ran the ferry until 1939, when the suspension bridge that still connects Deer Isle to the rest of Maine was completed.

Siobhan Roth is a regular Smithsonian.com contributor.

Image by Stacey Cramp. Wild sweet peas take root beyond the waterline along a rocky Deer Isle beach. A walk along any path in summer can double as a harvesting session for sweet peas, purple lupines, and other flowers, as well as rose hips, raspberries, a never-ending abundance of blackberries, and of course, wild blueberries.

Image by Patricia Roth. The east side of Deer Isle is called Sunshine and is home to beautiful vacation houses, as well as some of the country's largest lobster-holding tanks. Sylvester Cove is in Sunset, on the island's western side, which is also home to the Island Country Club, where the roadside sign proclaims "public welcome."

Image by Stacey Cramp. The line for coffee at the Harbor View Store on the Stonington waterfront forms at 4 a.m., and by dawn, most of Deer Isle's lobster boats are miles from shore, the lobstermen already hauling the first of the day's traps. In summer, the workday can end by early afternoon. During winter, though, 16-hour days are common.

Image by Deer Isle Historical Society. Historical photo of Deer Isle pier

Image by Deer Isle Historical Society. Salome Sellers

As Temperatures Rise, Malaria Will Invade Higher Elevations

Smithsonian Magazine

Temperatures and environmental conditions are changing, causing the spread of disease to shift. How those changes and shifts will play out, however, is the subject of debate. It’s impossible to build a computer model that perfectly mimics the real world and can thus predict, say, where mid-latitude regions will become warm enough for tropical diseases to thrive or wet enough to enhance the spread of water-borne pathogens. But research does suggest that—similar to shifts in animal and plant distributions as climate changes—some places will see rates of certain diseases drop, while others will see an increase or introduction of those diseases.

Shifting patterns of disease do not apply only by latitude, however. Just as the distribution of desert cacti is slowly creeping into Arizona's hills and lowland insects are moving into the mountains of Borneo as the climate warms, diseases can also broaden their distributions by reaching higher and higher elevations. And according to a new study published by American, British, Ethiopian and Colombian researchers in Science, it’s already happening.

The authors of the study turned their attention specifically to malaria, which infects an estimated 300 million people each year. Malaria might be particularly susceptible to changes in distribution due to warmer temperatures, they explain, because the Anopheles mosquitoes that carry the malaria parasite can only live in warm environments.

The researchers focused on the highlands of western Colombia (50 to 2,700 meters) and central Ethiopia (1,600 to 2,500 meters), which historically have been cool year-round but have fluctuated between warmer and cooler seasons in recent years. To see how malaria might or might not have been affected by those climate variations, they compared records of malaria incidence from 1990 to 2005 in Colombia, and from 1993 to 2005 in Ethiopia, with temperature data from each of those years.


In warmer years, they found, malaria incidence did indeed occur at significantly higher elevations than in the cooler years. In Ethiopia’s Debre Zeit region, for example, an increase of 1ºC corresponded to an average of more than 2,100 additional cases during the transmission season, from September to December.

"This is indisputable evidence of a climate effect," said Mercedes Pascual, a theoretical ecologist at the University of Michigan and co-author of the study, in a statement.

She and her colleagues predict that these results would also apply to other countries and regions that suffer from malaria, although studies will have to be undertaken in those places to confirm that assumption. "The main implication is that with warmer temperatures, we expect to see a higher number of people exposed to the risk of malaria in tropical highland areas like these," Pascual added.

A permanent 1ºC temperature change in Ethiopia could mean three million more malaria cases per year in people under 15 years old alone, the authors estimate. Around 43 percent of the country's population currently lives in rural areas historically protected from malaria due to their elevations of 1,600 to 2,400 meters, but which now fall within the potential danger zone for hosting the disease as climate warms.

"Our latest research suggests that with progressive global warming, malaria will creep up the mountains and spread to new high-altitude areas,” said Menno Bouma, a clinical lecturer at the London School of Hygiene & Tropical Medicine and co-author of the study. “And because these populations lack protective immunity, they will be particularly vulnerable to severe morbidity and mortality."

Malaria’s shifting distribution is certainly a cause for alarm. According to the United Nations, the disease causes around 2 million deaths annually—most of them children—and places a significant burden on countries, keeping poor regions poor by reducing worker productivity and thus economic growth.

The study authors point out that their research is a heads-up about what will likely become an even greater problem in the future. They note that nonprofits, governments, and other groups interested in curbing the spread of malaria will need to establish intervention methods in places where they were not needed before, including at higher altitudes. Mapping where malaria may strike under different regimes of climate change "should further contribute to the early warning of epidemics and assist global malaria elimination,” they write.

The Whole Gory Story: Vampires on Film

Smithsonian Magazine

With Halloween on the horizon, I had to check out the "Vampires on Film" lecture, courtesy of the Smithsonian Resident Associate Program. The speaker was movie maven and scholar Max Alvarez. It was a well-attended, three-hour tour of horror flicks that make for—more often than not—painfully bad cinema. Yet, after kicking off his lecture by decorating his podium with several heads of garlic, Alvarez lent a gravitas to these movies, elevating them from mere midnight movie schlock to a study in cultural currency—meaning that vampire stories change and evolve with new images and metaphors for each generation viewing them.

In Western culture, tales of vampirism begin in the plague-addled Europe of the middle ages where newly buried bodies were exhumed and those considered not sufficiently decomposed were desecrated—by way of beheading or a good ol’ stake through the heart—for fear that the undead would spread disease among the living. (Trick or Treat?)

What’s worse is that some persons were prematurely interred—hence, their "as yet not-dead bodies" were in fabulous condition—and they ultimately met excruciatingly violent ends. Hands-down, this was the scariest part of the lecture.

By the late 1800s vampire stories are seen in print and theatrical incarnations (such as the 1828 opera Der Vampyr and the 1872 novella Carmilla). But it is Bram Stoker’s 1897 novel Dracula that sets the gold standard for the genre and captures the imaginations of people across the globe. Like its folkloric antecedents, Dracula is a sign of the times, dealing with issues of sex (which was strictly repressed in Victorian society), xenophobia and, in lieu of plague, syphilis, the dreaded STI du jour.

It is Stoker’s vision of the vampire that first makes it to the silver screen, the earliest surviving adaptation being F.W. Murnau’s Nosferatu, but the one that set the world on fire was Tod Browning’s 1931 film Dracula—starring Bela Lugosi—which kicked off a craze. Like its literary inspiration, Dracula and its string of cinematic spinoffs dealt with those things that you generally don’t bring up in polite conversation—namely human sexuality—and titillated audiences.

After a hiatus in the 40s and 50s, the genre was rekindled in the 60s. With sex becoming less taboo, vampire movies had to start exploring new frontiers. Of note is the 1973 film Blood for Dracula wherein the Count is exposed to impure blood and becomes gravely ill, as if the film were anticipating the AIDS epidemic that would sweep the world in the 1980s. Indeed, as a character in cinema, the vampire was evolving from a one-dimensional villain into a multifaceted character that could even be seen working for the forces of good (such as in Blade or Underworld).

While the genre has lost much of the subtlety and gothic trappings of the classic horror films, vampires endure as fodder for high octane action flicks, jam-packed with as much violence and gore as an R rating can withstand. However, they can also be seen in more playful fare as well. (Buffy the Vampire Slayer anyone?)

What's your favorite vampire film? What interesting things do you see happening within the genre that keep it from going six feet under? Do you have high hopes for the upcoming film adaptation of the best-selling novel, Twilight? And why do you think we so rarely see vampire stories told by way of animation?

Vaccine Week: A History of Vaccine Backlash

Smithsonian Magazine

In light of President Obama’s declaration that the outbreak of the H1N1 virus is a national emergency, Surprising Science is setting this week aside to discuss the history and science of vaccines and their importance in battling diseases, including swine flu. See Monday’s post for part 1, A Brief History and How Vaccines Work, and yesterday’s post for part 2, Success Stories.

It’s kind of startling that the idea of vaccines ever caught on. There is an amazing amount of trust needed: A person—often a complete stranger—is injecting you with a foreign substance. You have to trust that the substance is really what you’ve been told it is, that it has been sufficiently tested and is safe, and that it will work as advertised and not hurt you.

Despite this, most people trust the doctors, science and government and do get vaccinated. A small percentage, however, choose not to be vaccinated (or not to have their children vaccinated). And it’s been this way almost since Edward Jenner first began vaccinating people against smallpox (see the illustration).

Decades after Jenner’s discovery, the British government got involved in vaccination by passing a law in 1840 that provided free smallpox vaccinations to the poor. But later efforts didn’t go over so well. An 1853 law required all infants to be vaccinated in the first three months of life and threatened parents who did not vaccinate their children with a fine or imprisonment. Riots soon broke out in several towns. In London, an Anti-Vaccination League was founded. In 1867, after the law was extended to children up to age 14, the Anti-Compulsory Vaccination League was founded. Opposition now focused on the law’s threat to personal liberty. (“As parliament, instead of guarding the liberty of the subject, has invaded this liberty by rendering good health a crime…parliament is deserving of public condemnation.”)

In the late 19th century, anti-vaccination movements spread across Europe and into the United States, where they succeeded in repealing compulsory vaccination laws in several Western and Midwestern states.

But despite the controversy, protests and pamphlets, the doctors, science and governments eradicated smallpox from the United States by 1950 and from the entire world by 1980.

Along the way, though, anti-vaccination sentiments have resulted in serious harm. For example, when the majority of the residents of Stockholm, Sweden refused vaccination for smallpox in the early 1870s, they were left vulnerable to the disease. The city experienced a major epidemic in 1874, after which vaccination was again popular.

Efforts to eradicate polio—a disease now confined to just a few countries—were derailed in Nigeria by a 2004 rumor that the vaccine “contained birth control drugs as part of a secret western plot to reduce population growth in the Muslim world.” Polio is on the rise again in Nigeria, and more than 100 children have been left paralyzed by the disease this year.

And in places like Europe, Australia and the United States, in communities where parents have stopped vaccinating their children for fear that common childhood immunization causes autism (a fear that is completely unfounded), diseases that had become rare—like measles and pertussis—are making a comeback, as Wired magazine notes in their November issue:

“I used to say that the tide would turn when children started to die. Well, children have started to die,” Offit says, frowning as he ticks off recent fatal cases of meningitis in unvaccinated children in Pennsylvania and Minnesota. “So now I’ve changed it to ‘when enough children start to die.’ Because obviously, we’re not there yet.”

The anti-vaccination movement ebbs and flows over time, with fear of disease fighting mistrust of doctors, science and government. Which will win? If history is any guide: neither. But doctors, science and government will all need to work together to find a way to protect public health. And then, perhaps, they will find more vaccine success stories along the way.

Tomorrow—Vaccine Week, Day 4: Swine Flu Edition

China's Art, From Museum Exhibits to Rock Concerts, Moves Online During Coronavirus Outbreak

Smithsonian Magazine

The outbreak of a novel coronavirus has caused weeks of anxiety and quarantine in China. People are staying home to limit the spread of the illness, recently named COVID-19. Venues that normally draw large crowds have shut their doors indefinitely, and events like concerts and an international art fair have been canceled.

But the country’s ban on public gatherings hasn’t completely shuttered China’s cultural landscape. Instead, the action is increasingly moving online. From museum exhibitions to live concerts, the country’s art scene is connecting communities in the digital sphere.

In January, the Chinese government issued a letter directing museums to “enrich the people’s spiritual and cultural life during the epidemic [with] cloud exhibitions” that display previously planned gallery programming, reports Caroline Goldstein for artnet News. At that point, two museum openings in China had been postponed, and Hong Kong had closed all public institutions.

Now, sites including the Chongqing China Three Gorges Museum, the Chongqing Natural History Museum and the National Museum in Beijing have all opted to increase their digital offerings. Some sites, like the Forbidden City’s Palace Museum, are only accessible from mainland China, according to Maggie Hiufu Wong of CNN. But about 100 online exhibits can be accessed from anywhere via China’s National Cultural Heritage Administration website.

An extensive lineup of special exhibitions had been planned for the Forbidden City’s 600th anniversary. One of those, focused on the Spring Festival, is accessible online in Chinese, as is a 3-D tour of the Forbidden City complex. The terracotta warriors of Emperor Qinshihuang’s Mausoleum Site Museum in Xi’an and the Nanjing Massacre Memorial Hall are among the other museums available for virtual visits.

Live concerts similarly shut down by measures to reduce the spread of the virus are also moving online. A legendary punk rock venue called VOX Livehouse came up with the idea of livestreaming a concert, reports Hyperallergic’s Krish Raghav. The concert hall is located in Wuhan, arguably the center of Chinese punk-rock culture—and the city where the new coronavirus was first identified.

VOX’s initial “live-streamed music festival” has sparked a nationwide trend of similar events. As Hyperallergic reports, musicians, record labels, venues and clubs alike are organizing “bedroom music festivals” and livestreamed club nights featuring pop, techno, punk and experimental improvisation.

“It’s like going to a karaoke parlor or being in a mosh pit without leaving your house,” singer He Fan of Beijing band Birdstriking tells Hyperallergic.

He Fan’s band performed an acoustic set for a livestream event called “Strawberry Z,” which derives its name from China’s biggest annual outdoor music festival, Strawberry. The event, called “I’m at Home, Too” in Chinese, is a five-day music festival hosted on the short video app Bilibili. As the video plays, viewers can participate by contributing to the stream of comments floating onscreen. Bilibili has offered 100,000 free memberships to people living in quarantine in the hope of connecting people and alleviating boredom and anxiety caused by the spread of COVID-19.

“Some artists have also been invited to livestream their lives while staying at home during the outbreak such as cooking, exercising, playing games and many other fun ways to kill time,” says a Bilibili spokesman to Variety’s Patrick Frater. “The cooking segments will be streaming during the evening around dinnertime.”

A Strange Case of Dancing Mania Struck Germany Six Centuries Ago Today

Smithsonian Magazine

Six hundred and forty-two years ago today, citizens in the German city of Aachen started to pour out of their houses and into the streets, where they began to writhe and whirl uncontrollably. This was the first major outbreak of dancing plague, or choreomania, and it would spread across Europe in the next several years.

To this day, experts aren't sure what caused the frenzy, which could drive those who danced to exhaustion. The outbreak in Germany was called St. John's dance, but it wasn't the first appearance of the mania or the last, according to The Black Death and The Dancing Mania, originally published in 1888. In the book, Justus Friedrich Karl Hecker imaginatively describes the spectacle of St. John's dance as follows:

They formed circles hand in hand, and appearing to have lost all control over their senses, continued dancing, regardless of the bystanders, for hours together, in wild delirium, until at length they fell to the ground in a state of exhaustion. They then complained of extreme oppression, and groaned as if in the agonies of death, until they were swathed in cloths bound tightly round their waists, upon which they again recovered, and remained free from complaint until the next attack.

The "disease" spread to Liege, Utrecht, Tongres and other towns in the Netherlands and Belgium, up and down the Rhine river. In other times and other forms the mania started to be called St. Vitus' dance. During the Middle Ages, the church held that the dancers had been possessed by the devil or perhaps cursed by a saint. Called Tarantism in Italy, it was believed the dancing was either brought on by the bite of a spider or a way to work out the poisons the arachnid had injected.

More modern interpretations have blamed a toxin produced by a fungus that grew on rye. Ergot poisoning, or ergotism, could bring on hallucinations, spasms and delusions thanks to the psychoactive chemicals produced by the fungus Claviceps purpurea, writes Steven Gilbert for the Toxipedia.

But not all of the regions affected by the strange compulsion to dance would have been home to people who consumed rye, points out Robert E. Bartholomew in an article for the July/August 2000 issue of Skeptical Inquirer. Furthermore, the outbreaks didn't always happen during the wet season, when the fungus would have grown.

St. Vitus' dance later came to mean Sydenham chorea, a disorder that struck children and did cause involuntary tremors in the arms, legs and face. However those twitches were not the kind of dancing described in the outbreaks of dancing mania.

Another notable epidemic broke out in the city of Strasbourg in 1518. It started in July when a woman called Frau Troffea began to dance. Within a month, 400 people joined in the madness. This plague in particular was probably worsened by apparently well-meaning officials who thought that the victims just needed to dance it out and shake it off. They set aside guild halls for the dancers, hired professional pipe and drum players and dancers to keep people inspired, writes John Waller for BBC.com.

Madness is ultimately what some experts think caused such a bizarre phenomenon. Waller explains that in 1518, the people of Strasbourg were struggling to deal with famine, disease and the belief that supernatural forces could force them to dance. In 1374, the region near the Rhine was suffering from the aftermath of another, true plague: the Black Death. Waller argues that the dancers were under extreme psychological distress and were able to enter a trance state—something they would need to dance for such a long period of time. He blames the dancing mania on a kind of mass hysteria.

Bartholomew disagrees. He points out that records from the time claim that the dancers were often from other regions. They were religious pilgrims, he posits. He writes:

The behavior of these dancers was described as strange, because while exhibiting actions that were part of the Christian tradition, and paying homage to Jesus, Mary, and various saints at chapels and shrines, other elements were foreign. Radulphus de Rivo’s chronicle Decani Tongrensis states that “in their songs they uttered the names of devils never before heard of . . . this strange sect.” Petrus de Herenthal writes in Vita Gregorii XI: “There came to Aachen . . . a curious sect.” The Chronicon Belgicum Magnum describes the participants as “a sect of dancers.”

Once the first dancers started their strange ritual, other people perhaps joined in, claiming to be overwhelmed by a compulsion. Societal prohibitions against such unrestrained behavior could then be cast aside.

Ultimately, the cause of choreomania seems to be a mystery, but it will never cease to be a fascinating part of European history.

What a 6,000-Year-Old Knee Can Teach Us About Arthritis

Smithsonian Magazine

The human joint is a wonderfully flexible and durable evolutionary innovation, but like any good machine, it eventually wears down. And in many people, this wearing is thought to cause arthritis.

Pain from arthritis strikes some 54.4 million U.S. adults, and is "one of the most common chronic conditions in the nation," according to the Centers for Disease Control website. The disease causes stiffness, swelling and pain in the joints and has been found in humans for thousands of years. (Scientists even identified evidence of arthritis in Nefertari's mummified knees.) But researchers have long assumed that arthritis rates have spiked in recent years as people live longer and populations grow heavier. Now, as Mitch Leslie reports for Science, a study of ancient knees has finally provided evidence to support the trend, and suggests that arthritis may not be an inevitable fate of old age.

To tease out the history of arthritis, Harvard University biologist Ian Wallace studied skeletons of middle-aged and elderly people from various periods of American history, including specimens from Native Americans up to 6,000 years old. He thought that perhaps in the early days of humanity—when walking was the main way to get around and many people spent their lives hunting, farming or fighting—the rates of arthritis would actually be fairly high due to the joint stress from all this activity.

But this wasn't the case.  

Instead, it appears that osteoarthritis of the knees affects far more Americans today than even just a few decades ago, Leslie reports. And even after controlling for weight and age, the results suggest that those factors alone can't explain the rise in the disorder. Strikingly, the rate of osteoarthritis has more than doubled among Americans just since 1940. Wallace and his team published their results earlier this month in the journal Proceedings of the National Academy of Sciences.

“We were able to show, for the first time, that this pervasive cause of pain is actually twice as common today than even in the recent past," Wallace says in a statement. "But the even bigger surprise is that it’s not just because people are living longer or getting fatter, but for other reasons likely related to our modern environments.”

The study doesn't draw any conclusions about why this spike has occurred, but study co-author Daniel Lieberman suggests that the epidemic of sitting in modern-day America could be affecting how our joints are formed and maintained, leading to more arthritis, Richard Harris reports for NPR. Changing diets and the rising rates of injuries from sports among children and adults could also play a role.

Though the cause is still unknown, the study's results suggest that the disease may not be as inevitable as once believed. “We should think of this as a partly preventable disease," Lieberman says in a statement.

Today, there is no true "cure" for arthritis, only management of pain, such as taking medications, wearing splints and losing weight. In 2003, Americans spent some $80.8 billion on diagnosis and treatment of the disease. But researchers hope to eventually stem the flow of that money. The latest study gives hope that with continued testing of treatments and ways to prevent osteoarthritis, we can eventually beat this ancient ailment.

UN Report Finds Finland Is the Happiest Country in the World

Smithsonian Magazine

Good cheer might abound in Naples, Florida, but as a whole, the United States is lagging behind comparably wealthy nations when it comes to its residents’ happiness. As Maggie Astor reports for the New York Times, the U.S. ranked 18th out of 156 countries surveyed in the World Happiness Report of 2018. The top spot went to Finland.

The World Happiness Report is produced by the United Nations Sustainable Development Solutions Network, and it draws on data from Gallup International surveys conducted between 2015 and 2017. The surveys asked thousands of people across the globe to place themselves on a ladder with steps numbered from zero to 10, with 10 representing the best possible life—a method known as the Cantril scale.

Finland scored an average of 7.632. Other Nordic nations also ranked high on the list of happiest countries; after Finland, the top nine spots were occupied by Norway, Denmark, Iceland, Switzerland, the Netherlands, Canada, New Zealand, Sweden and Australia.

The report evaluates six variables: GDP (or gross domestic product) per capita, social support, healthy life expectancy, freedom to make life choices, freedom from corruption and generosity. Most of the top 10 countries are social democracies, which “believe that what makes people happy is solid social support systems, good public services, and even paying a significant amount in taxes for that,” Jeffrey D. Sachs, director of the Center for Sustainable Development at Columbia University and an editor of the report, tells Astor. This political philosophy, he adds, is very different from that of the United States.

Though the economy in America is strong, its place in the ranking fell four spots from last year’s report. In an interview with Patrick Collinson of the Guardian, Sachs explained that “America’s subjective wellbeing is being systematically undermined by three interrelated epidemic diseases, notably obesity, substance abuse (especially opioid addiction) and depression.”

Burundi placed last in the ranking, with an average score of 2.905. Second from last was the Central African Republic. Both countries are plagued by political instability and violence. Though most of the bottom ten spots are occupied by African nations, Togo is one of this year’s biggest gainers: the country ranked last in 2015, but rose 18 places in the 2018 report.

One of the major themes of this year’s report was the intersection of migration and happiness, and countries were also ranked based on the happiness of their immigrants. Strikingly, the authors of the report found that immigrant happiness scores were almost identical to the scores of the population at large. Finland, for example, also came first in the ranking of immigrant happiness, followed by Denmark, Norway and Iceland.

“The closeness of the two rankings shows that the happiness of immigrants depends predominantly on the quality of life where they now live, illustrating a general pattern of convergence,” the authors of the report write.

The authors also considered a Gallup index that measured how accepting countries are of migrants. A higher value for migrant acceptance was linked to greater happiness among both immigrants and native residents “by almost equal amounts,” the report says.

“Happiness can change, and does change, according to the quality of the society in which people live,” the authors of the report add. “The countries with the happiest immigrants are not the richest countries, but instead the countries with a more balanced set of social and institutional supports for better lives.”

Revamp Your Christmas Playlist with These Unsung American Carols

Smithsonian Magazine

Elizabeth Mitchell’s new album for Smithsonian Folkways, The Sounding Joy, features new renditions of traditional American Christmas carols. Cover artwork by Brian Selznick

Elizabeth Mitchell’s The Sounding Joy, released by Smithsonian Folkways for this holiday season, features new recordings of traditional American carols rescued from obscurity by the late Ruth Crawford Seeger (Pete Seeger’s stepmother) in her 1953 songbook, American Folk Songs for Christmas. These simple devotionals evoke, as Ruth Seeger put it, the “old-time American Christmas. . .not of Santa Claus and tinseled trees but of homespun worship and festivity.”

“That’s what we did in our house,” says Ruth’s daughter, Peggy Seeger, who is featured on the album, along with Joan Osborne and Natalie Merchant. We spoke with Peggy about her contribution to the recording as well as her memories of her mother and Christmastime.

Which tracks did you record on The Sounding Joy?

I was asked to do “Christmas in the Morning,” and I chose to do “Mother’s Child” because it was one that I sing a lot in concerts and I absolutely love the tune. But I didn’t care for the original words, “a child of god,” so I changed it to “I’m a mother’s child,” which any religion can sing.

So it was important to you that these songs appeal to all faiths?

Oh, yes, absolutely, definitely.

How did it feel to return to these songs?

I love them. The collection is very interesting because my mother was the daughter of a Methodist minister, and she was pretty atheistic. My father was a combination of an agnostic and an atheist. And I’m very surprised that so many of the songs mention God and the Lord. These are terms that I kind of tried to avoid. Now that I live in England, which is very multicultural, I avoid them even more than I would in the United States.

My mother had a real ear for picking songs. She got an awful lot of these, most of them off of the Library of Congress recordings. She brought home these 16-inch aluminum records and listened to them with a thorn needle—I’m talking about the mid-1940s, early ’50s, and the only way you could listen to those records was with a thorn needle because a steel needle would ruin the tracks. It was our job, the children’s job, to keep the needle sharp using a sparkler. You’d put the needle into a little clamp and then you whizzed a wheel around it that put sandpaper on it, and that sharpened it again.

We heard these songs in the house as she was transcribing them, from a very early age. Grew up with them. I know them all. I always loved her accompaniments. They’re not easy to play, actually. To play and sing these songs with her accompaniments needs a lot of concentration. It’s not just ump-chump-chump-ump-chump-chump, and it’s not just chords with the left hand. There’s a lot of contrapuntal countermelody going on there.

Why are these songs still relevant? What can modern audiences gain from this recording?

They have choruses that a lot of people can sing. A lot of repeated words. And for many people now, religious or not religious, Christmas is a time to get together. Having some new songs to sing at Christmas is a very nice idea. . . . Many of the songs sprang out of people singing together. That’s why there’s so much repetition. Often you have to repeat it for people to learn it and catch up with it, and for them to be able to feel themselves singing together, feel the edges of the room, as it were.

Do you celebrate Christmas?

Not anymore. . . . I’ve kind of lost interest in Christmas, with the horrifying commercialization. I don’t want to go into the stores anymore at Christmastime. I don’t want to hear all of the Christmas songs which you hear over and over ’til you are sick of them. . . .

The best Christmas I ever had was when I was about 7. It was a sad time for some people because there was an epidemic of polio in Washington, D.C., so we didn’t go into town to get presents. We stayed home and made presents for each other in the house. My brother, who was 9, got a little carpentry set before Christmas so he could make little cradles for our dolls. My mother taught me how to crochet and I crocheted things for my sisters’ dolls. My mother loved Christmas. She adored it.

This 1,500-Year-Old Skeleton May Belong to the Man That Brought Leprosy to Britain

Smithsonian Magazine

In the early 1950s, workers digging for gravel uncovered skeletons of people interred in an Anglo-Saxon cemetery a millennium and a half before. At the time, the team noted that the bones of one man in particular had joint damage and the narrow toe bones typically caused by leprosy. When researchers recently reanalyzed those same bones using modern techniques, they realized the man may have had the first case of the disease in Britain. On top of that, other tests show that he was probably from Scandinavia, not Britain.

The researchers were able to gather some bacterial DNA from the bones and sequence it, reports Maev Kennedy for The Guardian. The genetic fingerprint they found was that of a leprosy strain belonging to lineage 3I, which has been found at other burial sites in Scandinavia and southern Britain, but at later dates. The man likely died in the 5th or 6th century.

“The radiocarbon date confirms this is one of the earliest cases in the UK to have been successfully studied with modern biomolecular methods,” says Sonia Zakrzewski of the University of Southampton in a press release. “This is exciting both for archaeologists and for microbiologists. It helps us understand the spread of disease in the past, and also the evolution of different strains of disease, which might help us fight them in the future.”

The research team also analyzed elements in the man’s teeth. Specifically, they looked at several isotopes (an element’s atoms can have different numbers of neutrons, and each variation is a different isotope). They measured the ratio of oxygen isotopes, which reflects that of the water he drank, and the strontium isotopes in his enamel, which reflect the geology of his homeland, explains Maddie Stone for Vice. This analysis told the researchers that the man likely came from Scandinavia. He may have carried the disease to Britain from there. When he died, he was in his 20s, the researchers report. They published their findings in PLOS One.

The 3I leprosy strain is one of five strains found around the world. It not only gave rise to the leprosy of the British Isles, but also to the strains found in the southern U.S. (where it’s often carried by armadillos) and in the U.K. even today. However, the leprosy epidemic didn’t peak in Europe until the 13th century. If the man had seen a physician in his new country, they wouldn’t have recognized the deformations and scaly skin of a leprosy infection. Perhaps he would have escaped the social stigma that later arose around the disease, too.

This man isn't the first person in the world to get leprosy, explains Stone. "There are a handful of cases worldwide that predate this young man, including several from second century BC Egypt, first century AD Israel, and 1st through 4th century AD Uzbekistan," she writes. But he is the first known case in Britain. 

The team’s project leader, Sarah Inskip of Leiden University told Stone: “We plan to carry out similar studies on skeletons from different locations to build up a more complete picture of the origins and early spread of this disease.” 

Florida Authorities Investigate a Disorder Affecting Panthers' Ability to Walk

Smithsonian Magazine

A mysterious affliction is crippling Florida’s panthers, leaving some members of the endangered species unable to walk without stumbling or toppling over.

As the Florida Fish and Wildlife Conservation Commission (FWC) announced Monday, the disorder—believed to affect the big cats’ ability to coordinate their back legs—has struck at least nine panthers and two bobcats to date. According to a press release, trail camera footage captured in Collier, Lee and Sarasota counties shows eight panthers (mainly juveniles) and one adult bobcat struggling to walk to varying degrees. Another panther photographed in Charlotte County could also be affected.

The FWC further confirmed the presence of neurological damage in one panther and one bobcat examined after dying of unrelated causes. According to the Washington Post’s Morgan Krakow, the bobcat sustained injuries during a fight and was subsequently hit by a car, while the panther was euthanized after she was struck by a vehicle and contracted an infection.

Neither animal tested positive for feline leukemia or commonly seen infectious diseases, but as spokeswoman Michelle Kerr of the FWC’s Fish and Wildlife Research Institute notes, “We wouldn’t say infectious diseases are ruled out completely.”

Krakow writes that potential explanations for the big cats’ condition range from infection to nutritional deficiencies, exposure to heavy metals, and toxins such as rat poison and toxic algae. It’s possible the panthers contracted a disease by preying on infected animals or drinking contaminated water, but it remains too early to know for certain.

“While the number of animals exhibiting these symptoms is relatively few, we are increasing monitoring efforts to determine the full scope of the issue,” Gil McRae, director of the Fish and Wildlife Research Institute, explains in the statement. “Numerous diseases and possible causes have been ruled out; a definitive cause has not yet been determined.”

According to Joshua Sokol of the New York Times, the agency first learned about the disorder when a local submitted video footage of an affected kitten in 2018. A review of photographs from the previous year yielded another instance of the ailment, but reports only started ramping up recently. “It was not until 2019 that additional reports have been received, suggesting that this is a broader issue,” spokeswoman Carli Segelson says to the Times.

According to the U.S. Fish and Wildlife Service (FWS), Florida’s panther population was dangerously low during the 1970s and ‘80s, when just 20 to 30 of the big cats roamed the state. Thanks to heightened conservation efforts, including the introduction of gene pool-diversifying Texas cougars in the 1990s, this number has risen steadily. As Amber Crooks, environmental policy coordinator for the nonprofit Conservancy of South Florida, tells the Miami Herald’s David Goodhue, around 120 to 230 panthers now live across Florida. Still, Crooks notes, “The population is already facing many … threats”—among others, urban development, cars, habitat loss and territorial disputes—“so this [new disorder] is concerning.”

To gain a better understanding of the mysterious crippling condition, the FWC is deploying extra trail cameras, consulting with federal authorities and experts, and appealing to the public. In particular, Sokol reports for the Times, researchers are hoping to confirm whether the disorder is limited to several counties along the state’s Gulf Coast or indicative of a more widespread problem. Locals can submit video footage of potentially affected animals through an online portal or via email at Panther.Sightings@MyFWC.com.

Speaking with the Post’s Krakow, Samantha Wisely, a wildlife ecologist at the University of Florida, says authorities will need to investigate multiple potential explanations for the epidemic.

“When you don’t have a good sense of what it is,” she concludes, “you really want to cast your net widely.”

America's Long-Overdue Opioid Revolution Is Finally Here

Smithsonian Magazine

A bunion, you may have the misfortune to know, is a bony growth that forms at the base of your big toe. When that bump begins to irritate the rest of your foot, it has to go.

Wincing would be the correct reaction here. On the pain scale, a bunionectomy doesn’t compare to having a limb sawn off; nor is it particularly medically risky. But since it “involves shaving off extra bone and cutting the big toe in half and pinning it back together,” says David Soergel, chief medical officer of the pharmaceutical company Trevena Inc, “it’s actually a very painful surgery.” That wince-worthy quality makes it the perfect surgery on which to test cutting-edge new pain relievers—such as Oliceridine, Trevena’s newest and most promising opioid compound.

For more than 200 years, doctors have soothed their patients’ pain with morphine, the drug isolated from the opium poppy and named after Morpheus, Greek god of dreams. And morphine has generally lived up to its reputation as an effective painkiller. But because of how it works on the central nervous system, morphine also has a host of notorious side effects, from nausea to life-threatening respiratory depression to addiction. So in 2014, Soergel and his team were on the hunt for a safer—and more effective—painkiller. The hope was that Oliceridine could provide equal or better pain relief than morphine, while reducing those nasty side effects.

In the trial, 330 bunionectomy patients received either Oliceridine, morphine or a placebo after their surgery. Those that received either drug reported pain relief within minutes (as opposed to hours for the poor souls who had only been given the placebo). But while patients given 4 mg of morphine reported that it took about a half hour for them to feel any relief, those given 4 mg of Oliceridine reported an average of just two minutes. Oliceridine, which was designed to take advantage of researchers’ new understanding of the underlying neuroscience of opioids, ultimately proved to be about three times as potent a painkiller as morphine. Even better, peer-reviewed studies showed that it was far less likely to cause dangerous side effects.

The result “could be a substantial advance in opioid pharmacotherapy,” Soergel and his colleagues reported in the journal PAIN in June of that year. They expanded on the drug's potential in the abstract of another study, presented in October 2016 at the annual meeting of the American Society of Anesthesiologists. "This novel mechanism of action could lead to ... rapid, effective analgesia with improved safety and tolerability,” the team wrote.

Today, Oliceridine is the only opioid compound of its kind to be tested on humans. It’s now in Phase III clinical trials, with results due in early 2017; if all goes well, it could be brought to market within the next few years, according to Trevena co-founder Jonathan Violin. The drug’s potential is great. Oliceridine—and other compounds like it—could be just the first of a host of medicines with all the powerful pain relief of morphine, but far fewer devastating side effects. And they’ve all been made possible by our new understanding of the neuroscience behind these compounds. “This could be the first in what you might think of as a new class of opioids,” says Violin.

And the truth is, it’s about time. 

A tablet of Vicodin, one of the many prescription opioids on the market today. (Norma Jean Gargasz / Alamy)

A Revolution on Hold

There are few elements of medicine that haven’t progressed since the 19th century. Today, physicians work in antiseptic operating rooms and wield antibiotics to fight infection, rather than bone saws for the removal of gangrenous limbs. Modern anesthesia is a sophisticated medical concoction, compared to chloroform on a rag or a shot of whiskey. But when it comes to treating severe pain, we still rely on the same go-to substance we’ve been using since at least 3400 B.C.E.: opium.

There’s a reason we’ve been so loyal to this flower: It works. Since antiquity, humans have utilized the power of the opium poppy to ease pain, treat disease and generate euphoria. The Sumerian civilization knew the poppy as hul gil, or “plant of joy” more than 5,000 years ago; there are visual hints of the poppy in Greek artifacts stretching back to 1500 B.C.E.  Roman physicians in the first and second centuries C.E. recommended opium mixed with wine prior to the amputation of limbs. In 1784, British surgeon James Moore recorded the first known use of opium to ease pain after surgery. 

In 1805, German pharmacist Friedrich Serturner changed the game by isolating morphine from opium. Other developments in that century would expand on that success, improving the delivery and distillation of this potent compound. In the 1850s, the development of the hypodermic syringe allowed exact dosages of morphine to be delivered directly into a patient’s bloodstream, which would be key for field hospital amputations during the American Civil War. In the 1890s, morphine was expanded into an array of morphine-like medications known collectively as the opioids.

Taken together, this suite of medications—which are today available as pills, injections, lollipops and patches—has revolutionized the treatment of pain. But the relief they bring is not without cost. Morphine would also prove to have a dark side. Even in the 19th century, addiction among soldiers was reportedly prevalent enough to earn the moniker “the soldier’s disease.”

Today, morphine addiction is America’s disease. In the U.S., the overprescription and abuse of opioid medications has led to a growing addiction crisis. Since 1999, the number of fatal opioid overdoses in the U.S. has quadrupled. So has the number of prescriptions written for opioid pain medications. According to Debra Houry, director of the National Center for Injury Prevention and Control at the CDC, 249 million opioid prescriptions were written in 2013—enough for every American adult to have their own bottle. And many who become addicted to these medications move on to a cheaper and more dangerous black market drug: heroin.

In the 1890s, Bayer pharmaceuticals began marketing heroin—which is made by applying the chemical process of acetylation to morphine—as a supposedly more potent and less addictive alternative to morphine. Heroin would prove to be about two to four times more potent than morphine, but claims that it was less addictive would prove unfounded in dramatic fashion. In 2015, the American Society of Addiction Medicine estimated almost 600,000 Americans were addicted to heroin. According to the Surgeon General’s report on addiction released in November, more than 28,000 Americans died from the use of prescription opioids or heroin in 2014.

What can be done about this epidemic of highly addictive, often fatal pain-killers? The obvious answer, you might think, would be to ditch opioids. The problem is, there is only so much pain a patient can be expected to bear, and so far, only opioids have been available to relieve it.

But that may be about to change. New research into the underlying molecular mechanisms of opioids has made possible the discovery of new compounds that might just allow for the relief of pain without some of the worst side effects of traditional opioids. (Science writer Bethany Brookshire recently wrote about some of these new compounds for Science News.) If this research bears fruit, morphine may soon go the way of the 19th century bone saw—making way for a revolution of new drugs that don't cause physical dependence, and on which it is impossible to overdose. Drugs for which the risk of addiction will be negligible, or even disappear entirely.

If they pan out.

Laura Bohn in her laboratory at the Scripps Research Institute. (Jeremy Pyle / TSRI Outreach)

The Double Door

Traditional opioids—including morphine, the potent synthetic fentanyl and the Vicodin you get from your dentist—all work by binding to opioid receptors in the nervous system. These receptors come in three flavors: mu, delta and kappa. It’s at the mu-opioid receptor that opioids work their magic, activating a cascade of cellular signaling that triggers their pain-relieving effects. In the language of neuroscience, opioids are mu-receptor “agonists,” as opposed to “antagonists,” which are compounds that bind to a receptor and block it, preventing cellular signaling. When an opioid binds with the mu-opioid receptor, it ultimately turns down the volume on the nerves communicating pain. This, of course, is the desired effect.

Unfortunately, that’s not all it does. Opioids also trigger the release of the neurotransmitter dopamine, which causes euphoria and can lead to addiction. These compounds also inhibit nerve cells from firing more generally, including in parts of the brain that regulate breathing—which can be dangerous. Take too much of an opioid and you stop breathing and die; that’s what it is to overdose. The CDC estimates that 91 Americans die every day from an opioid overdose. The side effects go on, from constipation to nausea to the rapid development of tolerance so that ever higher doses are needed for the same effect.

For a long time, it was thought that this was just the package deal. That to achieve relief from pain, you had to live with the side effects, since they were the result of mu-opioid receptor signaling. Then came Laura Bohn, who set the stage for a new science of pain relief.

In 1999, Bohn was a post-doctoral researcher in the Marc Caron lab at Duke University studying how the opioid receptor functioned in mice. This was basic research at the time—that is, it wasn’t undertaken as part of a plan to develop new pain drugs. Rather, she says, it was the kind of science for science’s sake that gets eyed for budget cuts. “You remember from the 1980s, all the politicians would say, ‘Putting a mouse on a hot plate, how can this help?’” says Bohn, who is now a pharmacologist at the Scripps Research Institute in Jupiter, Florida. “Well, this is how that helps.”

At the time, researchers knew that there were two proteins involved in opioid receptor signaling: the G-protein, and another called beta-arrestin. To explore the function of beta-arrestin, Bohn took a group of beta-arrestin “knockout” mice—animals who had been genetically manipulated so their bodies contained no beta-arrestin—and gave them morphine alongside a control group of regular mice. It was well known how mice reacted to morphine, so any different response in the knockout mice would provide clues to the role of beta-arrestin.

When you give them morphine, normal mice tend to run around in apparent glee. The knockout mice did not. “When we started treating the animals with morphine, it was just really obvious the difference between the wild types and the ones that lacked the beta-arrestin,” Bohn says. “Obvious to the point where a six-year-old child walked into the lab and said, ‘those mice are different from the other mice.’” Later research showed even more promising signs: The knockout mice showed less constipation and respiratory depression when given morphine, and the morphine proved more potent at relieving pain.

Suddenly, it appeared that the double-edged-sword hypothesis wasn’t necessarily true. The effects of opioids, it seemed, didn’t have to be a package deal—you could spin off some desired effects, and leave others. As Trevena’s Violin puts it: “In the absence of beta-arrestin, morphine was a better drug.”

The key discovery was that opioid “receptors are not on/off switches,” explains Bohn. “It’s not the ‘lock and key,’ where the key goes in and turns the lock and it just opens.” Instead, the receptor is like a double garden gate that can open onto two pathways, the G-protein and beta-arrestin paths. Use morphine to unlock the gate, and it swings open as one unit onto both paths. Change the gate itself so the beta-arrestin side remains locked—as in Bohn’s knockout mice—and you can open just the G-protein path and reap the crucial benefits of morphine with fewer side effects.

It may not always be the case, Bohn says, that side effects and desired effects will be split neatly into beta-arrestin and G-protein signaling at every receptor. But “these are things we have to learn," she says. "It kind of calls us back to basic research and really understanding the physiology.”

The problem is, you can’t change the mu-receptor gate itself in humans; that would require genetic manipulation before birth. What was needed, therefore, was a different set of keys: New drugs, G-protein “biased-agonists,” which would open only the G-protein side of the gate, and leave the Pandora’s box of harmful side effects safely locked. In 2004, Bohn began looking for those keys; she would be joined in 2008 by the folks at Trevena. “They took this toward a drug development path and I took it toward an academic path,” Bohn says. “I think we are all kind of coming around and seeing that yes, there is some promise to this.”

A misleading advertisement for Vicodin, published in 1992. (North Carolina Medical Journal, Vol. 53)

The New Morphine(s)

In terms of getting onto the market and into patients' prescriptions, Oliceridine is currently leaps and bounds ahead of its competitors. But it isn’t the only drug showing promise. Another compound, known as PZM21, appears to depress respiration—meaning to slow or impede breathing—to a lesser degree than even Oliceridine in rodents, according to work published in the journal Nature in September. There are also indications that it could be less rewarding, i.e., less addictive than traditional opioids.

Like Oliceridine, PZM21 is a biased-agonist opioid compound, but it has a different chemical structure. Scientists are still unclear what about that difference in structure accounts for the different effects of the two compounds, according to pharmacologist Brian Shoichet of the University of California at San Francisco, one of the authors of the Nature study. “Quite apart from clinical use, PZM21, [Oliceridine] and others, are tool molecules that can help us understand the biology of addiction,” he says. “Coupled to the right pharmacology, it could really expand our opportunities for discovering very new molecules conferring very new biological effects.”

Other lines of inquiry go beyond Bohn's biased-agonist approach. At the University of Maryland School of Pharmacy, researcher Andrew Coop has spent more than a decade working on a synthetic opioid called UMB425, taking precisely the opposite approach from researchers working on biased-agonists like Oliceridine and PZM21. Rather than designing a drug that is more and more selective in order to hit a specific pathway, he asks, “how about going the other way and hitting a second target that modulates it?” This approach—using one drug to hit multiple receptors—is known as polypharmacology. The result is a drug that, in rodents at least, relieves pain better than morphine with less development of tolerance.

And that's just the tip of the painkiller revolution. Another example of the polypharmacology approach is the work of Stephen Husbands, a medicinal chemist at Bath University. His compound, BU08028, is structurally similar to buprenorphine, a drug used to treat opioid use disorders. It acts at both the mu-opioid receptor and the nociceptin receptor, which is related to the opioid receptors. In monkeys, Husbands showed BU08028 relieves pain without causing dependence, addiction or depression of breathing.  

New pain drugs could be only the beginning. Many receptors in the brain—including the dopamine, serotonin and cannabinoid receptors—can also be targeted using the biased-agonist approach, perhaps yielding better antidepressants or other drugs. Trevena is already studying a compound that acts as a biased-agonist at the delta-opioid receptor as a possible migraine headache medication, according to Violin. Previous drugs that targeted the delta-receptor caused seizures, but Trevena’s compound does not (the theory is that the seizures were being caused by beta-arrestin signaling). 

Coop, who hopes to test UMB425 in primates and one day in humans, says all that competition is a good thing. “It’s good to have all these different mechanisms moving forward,” he says. “It enhances our chances that one of these will actually be able to make it.”

A Dose of Caution

The potential for these next-generation opioids is great. But in drug development, nothing is guaranteed. Oliceridine could hit some unforeseen problem in clinical trials; UMB425 could prove too addictive or too toxic in humans. A black market chemist could synthesize one of these new compounds and cause a regulatory backlash. (That’s no abstract concern: Last year, the DEA announced its intention to temporarily place the active components of the Kratom plant into the restrictive Schedule I, following reports of people using the plant to treat pain or opioid addiction. That could impede research on mitragynine pseudoindoxyl, another promising new opioid based on compounds found in Kratom.)

Given some dubious industry promises about addiction and pain drugs in the past, Bohn is especially wary of claiming too much, too soon. “I am very conservative on this because I think we have to be very careful not to repeat the problems of the past and oversell an opiate and say it won’t be addictive—as certain companies did,” she says. Her philosophy going forward is to assume all these drugs will have some risk for addiction, and to treat them with caution. At the same time, even if addiction remains a hazard, drugs that eliminate other side effects will still represent a huge step forward.

Yet Bohn’s approach raises a crucial question: Can addiction ever be fully mitigated—or will painkillers always come with the risk of dark consequences? Decoupling the two certainly seems scientifically possible, says Coop, given the current models of biased-agonism and polypharmacology. But addiction is a many-faceted beast, and there could always be new components that are not yet understood. There may be no magic bullet, Coop concedes. “There have been several false dawns with respect to separating the desired from the unwanted effects of opioids," he says, "and the current approaches may again not translate to treating people in the clinic.”

A little excitement is warranted, in other words, but don’t consign morphine to the hall of medical curiosities just yet. “I think we should proceed carefully, but also realize the tremendous opportunity,” Bohn says. “This is a real opportunity in pharmaceutical development.” 

Editor's note, January 16, 2017: Due to an editing error, the photo caption initially stated that the Vicodin advertisement featured in the North Carolina Medical Journal was published in 1940. Actually, the journal began publication in that year.

Stop Calling it Bird Flu: World Health Organization Calls for More Care in Naming Diseases

Smithsonian Magazine

It’s hard to give something a name, whether that's a baby, a pet or a disease you've discovered. And now the World Health Organization is urging both doctors and members of the media to think hard about how they label diseases, and to avoid some common names like “swine flu” and “Middle East respiratory syndrome” that could unintentionally harm some communities through association.

Diseases are often given common names when they are first reported to the media, and early ones tend to stick. While this might seem obvious, what a disease is called matters most to the people (or animals) who are directly affected, says Dr. Keiji Fukuda, Assistant Director-General for Health Security at the WHO in a press release:

We’ve seen certain disease names provoke a backlash against members of particular religious or ethnic communities, create unjustified barriers to travel, commerce and trade, and trigger needless slaughtering of food animals. This can have serious consequences for peoples’ lives and livelihoods.

The WHO now says that in naming new diseases, doctors should stick to the symptoms. The new guidelines specify avoiding names that include geographic locations, people’s names, animal species and cultural or occupational references, to name a few. The WHO also recommends avoiding “terms that incite undue fear” such as “unknown,” “epidemic” and “fatal.”

It's not that disease names are unchangeable once they stick — the word “hemorrhagic” disappeared from early coverage of the Ebola virus after doctors found that it wasn’t a common symptom, writes Carina Storrs for CNN. But naming a disease accurately without sensationalizing it could be tricky. While Dr. Robert Bristow, medical director of emergency management at New York Presbyterian Hospital, thinks the new guidelines are a good step forward, he says doctors need to strike a balance between accurately naming a new disease and not overloading the public with technical terms. Storrs writes:

The WHO suggests that dubbing a disease should involve mention of the symptoms associated with that disease, and, if appropriate, the pathogen (such as the virus or bacteria) responsible for the disease and the season associated with it.

One example? Swine flu could go by A(H1N1)pdm09, a name the WHO put forth in 2009.

Bristow thinks this might be too much information. Just "new flu" could be enough, along with something such as "highly transmissible" to indicate that it is not your ordinary seasonal flu.

While the new naming conventions won’t replace the International Classification of Diseases system, the WHO says they will provide guidance for naming future diseases.

h/t International Business Times

Why Google Flu Trends Can't Track the Flu (Yet)

Smithsonian Magazine

In 2008, Google announced an intriguing new service called Google Flu Trends. Engineers at the company had observed that certain search queries (such as those including the words "fever" or "cough") seemed to spike every flu season. Their idea was to use the frequency of these searches to calculate nationwide flu rates faster than could be done with conventional data (which generally takes a few weeks to collect and analyze), letting people know when to take extra precautions to avoid getting the virus.

Media outlets (this reporter included) rushed to congratulate Google on such an insightful, innovative and disruptive use of big data. The only problem? Google Flu Trends hasn't performed very well.

The service has consistently overestimated flu rates, when compared to conventional data collected afterward by the CDC, estimating the incidence of flu to be higher than it actually was for 100 out of 108 weeks between August 2011 and September 2013. In January 2013, when national flu rates peaked but Google Flu Trends estimates were twice as high as the real data, its inaccuracy finally started garnering press coverage.

The most common explanation for the discrepancy has been that Google hasn't taken into account the uptick in flu-related queries that occur as a result of the media-driven flu hysteria that occurs every winter. But this week in Science, a group of social scientists led by David Lazer propose an alternate explanation: that Google's own tweaks to its search algorithm are to blame.

It's admittedly hard for outsiders to analyze Google Flu Trends, because the company doesn't make public the specific search terms it uses as raw data, or the particular algorithm it uses to convert the frequency of these terms into flu assessments. But the researchers did their best to infer the terms by using Google Correlate, a service that allows you to look at the rates of particular search terms over time.

When the researchers did this for a variety of flu-related queries over the past few years, they found that a couple key searches (those for flu treatments, and those asking how to differentiate the flu from the cold) tracked more closely with Google Flu Trends' estimates than with actual flu rates, especially when Google overestimated the prevalence of the ailment. These particular searches, it seems, could be a huge part of the inaccuracy problem.
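
To make the flavor of that analysis concrete, here is a minimal sketch of the comparison, assuming three made-up weekly series (a candidate query's search share, Google Flu Trends estimates and CDC flu rates). It illustrates the general approach only; it is not the researchers' actual code or data.

import numpy as np

# Hypothetical weekly series (illustrative only, not real data): the relative
# volume of one candidate search query, the Google Flu Trends estimate and the
# CDC-reported influenza-like-illness rate over the same eight weeks.
query_share = np.array([0.8, 1.1, 1.9, 3.2, 4.0, 3.1, 1.7, 0.9])
gft_estimate = np.array([1.0, 1.4, 2.6, 4.5, 5.8, 4.2, 2.1, 1.1])
cdc_rate = np.array([0.9, 1.2, 2.0, 3.0, 3.4, 2.8, 1.6, 0.8])

def pearson(a, b):
    # Pearson correlation coefficient between two equal-length series.
    return np.corrcoef(a, b)[0, 1]

# A query that tracks the model's output more closely than the CDC ground
# truth is a candidate driver of the overestimate.
print("query vs. Google Flu Trends:", round(pearson(query_share, gft_estimate), 3))
print("query vs. CDC flu rate:     ", round(pearson(query_share, cdc_rate), 3))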

There's another good reason to suspect this might be the case. In 2011, as part of one of its regular search algorithm tweaks, Google began recommending related search terms for many queries (including listing a search for flu treatments after someone Googled many flu-related terms) and in 2012, the company began providing potential diagnoses in response to symptoms in searches (including listing both "flu" and "cold" after a search that included the phrase "sore throat," for instance, perhaps prompting a user to search for how to distinguish between the two). These tweaks, the researchers argue, likely artificially drove up the rates of the searches they identified as responsible for Google's overestimates.

Of course, if this hypothesis were true, it wouldn't mean Google Flu Trends is inevitably doomed to inaccuracy, just that it needs to be updated to take into account the search engine's constant changes. But Lazer and the other researchers argue that tracking the flu from big data is a particularly difficult problem.

A huge proportion of the search terms that correlate with CDC data on flu rates, it turns out, are caused not by people getting the flu, but by a third factor that affects both searching patterns and flu transmission: winter. In fact, the developers of Google Flu Trends reported coming across particular terms—those related to high school basketball, for instance—that were correlated with flu rates over time but clearly had nothing to do with the virus.

Over time, Google engineers manually removed many terms that correlate with flu searches but have nothing to do with flu, but their model was clearly still too dependent on non-flu seasonal search trends—part of the reason why Google Flu Trends failed to reflect the 2009 epidemic of H1N1, which happened during summer. Especially in its earlier versions, Google Flu Trends was "part flu detector, part winter detector," the authors of the Science paper write.

But all of this can be a lesson for the use of big data in projects like Google Flu Trends, rather than a blanket indictment of it, the researchers say. If properly updated to take into account tweaks to Google's own algorithm, and rigorously analyzed to remove purely seasonal factors, it could be useful in documenting nationwide flu rates—especially when combined with conventional data.

As a test, the researchers created a model that combined Google Flu Trends data (which is essentially real-time, but potentially inaccurate) with two-week old CDC data (which is dated, because it takes time to collect, but could still be somewhat indicative of current flu rates). Their hybrid matched the actual and current flu data much more closely than Google Flu Trends alone, and presented a way of getting this information much faster than waiting two weeks for the conventional data. 
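
As a rough sketch of how such a hybrid might be assembled, the snippet below fits an ordinary least-squares "nowcast" that predicts the current week's flu rate from two inputs: the timely but noisy search-based estimate and the CDC figure from two weeks earlier. The numbers are hypothetical and the model is far simpler than the one in the paper; treat it only as an illustration of the idea.

import numpy as np

# Hypothetical weekly data (illustrative only): search-based estimates for the
# current week, CDC rates reported two weeks earlier, and the true current rates.
search_estimate = np.array([1.0, 1.4, 2.6, 4.5, 5.8, 4.2, 2.1, 1.1])
cdc_two_weeks_ago = np.array([0.7, 0.9, 1.2, 2.0, 3.0, 3.4, 2.8, 1.6])
true_current_rate = np.array([0.9, 1.2, 2.0, 3.0, 3.4, 2.8, 1.6, 0.8])

# Design matrix: intercept, real-time search signal, lagged CDC signal.
X = np.column_stack([np.ones_like(search_estimate), search_estimate, cdc_two_weeks_ago])

# Ordinary least-squares fit of the hybrid nowcast.
coef, *_ = np.linalg.lstsq(X, true_current_rate, rcond=None)
nowcast = X @ coef

print("fitted weights (intercept, search, lagged CDC):", np.round(coef, 3))
print("hybrid nowcast:", np.round(nowcast, 2))
print("actual rates:  ", true_current_rate)

Re-fitting those weights as each new batch of CDC data arrives would be one simple way to keep such a model in step with changes in the underlying search behavior.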

"Our analysis of Google Flu demonstrates that the best results come from combining information and techniques from both sources," Ryan Kennedy, a University of Houston political science professor and co-author, said in a press statement. "Instead of talking about a 'big data revolution,' we should be discussing an 'all data revolution.'"

How to Make Science Fiction Become Fact, in Three Steps

Smithsonian Magazine

While speakers at the first day of Smithsonian magazine’s fourth annual “Future is Here” festival shared their thoughts on subjects as diverse as computer programming, the Zika virus, human space exploration, the future of the internet and the state of global fisheries, they all shared a common thread: there’s hope. Never give up—even if you have to wait a long time.

“Who will be the next President of the United States?” Smithsonian’s editor-in-chief Michael Caruso asked a Magic 8 Ball as he opened the day of TED-style talks on Saturday. “The future is notoriously difficult to predict. But never before has the distance between imagination and reality been so close, and the predictions scientists are making aren’t wild fantasies.”

Smithsonian magazine's editor-in-chief Michael Caruso kicks off the day. (Richard Greenhouse Photography)

Caruso welcomed a roster of visionaries including Nicholas Negroponte, co-founder of the MIT Media Lab; Martine Rothblatt, founder of Sirius Radio and United Therapeutics; Vint Cerf, Google’s “chief internet evangelist” and co-developer of modern internet connection protocols; and former NASA astronaut Tony Antonelli, who helps Lockheed Martin shape its human spaceflight initiatives. Two of Jacques-Yves Cousteau’s granddaughters, Céline and Alexandra Cousteau, also took the stage to talk about their respective work in the Amazon and with the world’s oceans.

Sisyphean perseverance emerged as the theme of the day, encouraging those despairing visionaries out there, eager for the day when technology (hopefully) makes their ideas possible.

Rothblatt, obsessed with all things space for most of her life, said her whole focus shifted after her daughter Jenesis was diagnosed in 1994 with life-threatening and incurable pulmonary arterial hypertension (PAH). She founded United Therapeutics in 1996 after doing a deep-dive into potential treatments and convincing Burroughs Wellcome (and later GlaxoSmithKline) to allow her to license a compound, treprostinil, they’d shelved in favor of an easier-to-manufacture drug. 

Rothblatt founded United Therapeutics in 1996 after her daughter Jenesis was diagnosed with life-threatening pulmonary arterial hypertension. (Richard Greenhouse Photography)

With no background in biotech, Rothblatt pursued a PhD in medical ethics even as she worked, at great personal cost and expense, with pharmaceutical scientists to develop treprostinil into a drug. The Food and Drug Administration (FDA) ultimately approved the drug, Remodulin, in 2002.

“I gave one doctor the money he said he needed to make it, and he finally produced half a gram,” Rothblatt told the audience. “But we needed dozens of grams for animal studies and, ultimately, hundreds of kilos to help people across the country. So we put the pedal to the metal.”

Today, Rothblatt’s company, United Therapeutics, annually produces enough drugs for tens of thousands of patients, including her daughter, who can now live out their lives beyond the three-year life expectancy once given at diagnosis.

“We’ve never turned away a patient who can’t pay,” she said. “We will give that medicine to them for free. It hasn’t stopped us from being a successful pharmaceutical company—we’ve found that doing the right thing helps you do the best thing.”

Actor William Shatner appeared as a surprise guest. (Richard Greenhouse Photography)

In a special appearance, actor William Shatner said that though science fiction can lay the groundwork for the future, progress is not always made with computer wizardry and bubbling test tubes. He spoke about recently witnessing an unusual and unexpected experiment in progress.

“We write and we think about all these highfalutin futuristic things that are going to take place, but buried in the basement of a small building in Philadelphia there are dogs sniffing for cancer in vials of blood,” he said. “It has nothing to do with the future as imagined by a show called 'Star Trek.'”

Vint Cerf, Google's "chief internet evangelist," made some predictions about the "internet of things." (Richard Greenhouse Photography)

Google’s Vint Cerf described how the genesis of the internet was, at heart, a bottom-up enterprise. The internet was originally built to satisfy a military defense agency that needed a cost-effective communications network compatible with a range of computer brands, Cerf said, and four decades of evolution shed some light on what is yet to come.

“The thing you carry in your pocket once took an entire van to do,” Cerf said, holding up a cell phone. “Now we’re faced with a new invasion, devices you wouldn’t expect to be part of the internet environment. I used to tell jokes that every lightbulb will have its own IP address. Well, now I can’t joke about that.”

In the current day, between 3 and 3.5 billion people use three to five devices every day, Cerf said, for a global total of 10 to 15 billion devices. Looking into a future where an “internet of things” connects humans and a host of objects, it’s completely reasonable, Cerf said, to predict that by 2036, the planet will have 8 to 10 billion users, and the average person will use or interact with around 100 devices per day, from phones to tablets to embedded sensors. That adds up to one trillion devices.

“We need to get smarter about how we use our resources,” Cerf said. “How we gather our data can really make a difference.”

To that end, he described Google’s ongoing projects using innovative sensing, from contact lenses that can measure a diabetic’s glucose level, to ingestible nanobots to diagnose disease from inside the body. Like the trucks used to test out network connectivity in the 1970s, Cerf suggested today’s cutting-edge technology only has room to shrink.

“3D printers today are large and clunky, but over time those printers could make smaller and smaller stuff,” Cerf said. “Maybe one day the 3D printers can print even smaller printers, eventually printing at the molecular level.”

And, of course, Google is working on making sure the internet works in space, too.

Alexandra Cousteau, an environmental advocate and granddaughter of Jacques-Yves Cousteau, spoke about the world's oceans. (Richard Greenhouse Photography)

In the year of the 40th anniversary of the Viking mission to Mars, Lockheed Martin’s Antonelli said today’s space missions are paving the way for the next steps, including an asteroid retrieval program and the Orion spacecraft, which will eventually take humans to Mars. (People took selfies all day with a quarter-scale replica of the Orion at the festival.)

In addition to the current missions surveying Mars, including the Mars Reconnaissance Orbiter, which takes its own surveys of the Martian surface and relays messages between Earth and the Martian rovers, there’s also Maven, a Martian atmospheric observatory. Farther afield, Juno will arrive at Jupiter this summer to map that planet’s atmosphere and magnetic and gravitational fields.

Osiris-Rex (Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer) will launch this fall destined for the asteroid Bennu, Antonelli said. Close enough to reach, large enough to land upon, and old enough that it reflects the early composition of the solar system, Bennu is thought to hold the molecular ancestors of life on Earth, but also whizzes scarily close to our planet on a regular basis. The samples from the Osiris-Rex mission will help scientists plan for a possible impact intervention mission, and also help aspiring asteroid miners know what resources they might find.

Even as new space missions pop up one after another, it’s today’s students who will one day take the next big steps into space.

“Keep in mind, that the first person to go to Mars is in school today,” Antonelli said. “Well, maybe not today, since it’s a Saturday,” he added. 

A New Report Identifies 30 Technologies That Will Save Lives in the Next 15 Years

Smithsonian Magazine

President Obama wasn't the only head of state visiting Ethiopia this summer. In early July, the United Nations brought global leaders to Addis Ababa for the third International Conference on Financing for Development. The goal of the meeting was to outline what the UN calls Sustainable Development Goals—a series of financial, social and technological targets that it wants countries in the developing world to hit by 2030.

At the conference, the United States Agency for International Development (USAID), the Government of Norway, the Bill and Melinda Gates Foundation and global health nonprofit PATH released "Reimagining Global Health," a report outlining 30 innovations that will save lives in the next 15 years. The team spent a year analyzing current and future technology, by reaching out to all the partners they work with in the world of international health. They received 500 nominations from entrepreneurs, scientists and other experts in nearly 50 countries, which a panel of 60 health experts reviewed and whittled down to a short list of easy-to-use technologies that they felt could reduce child mortality, improve maternal health and reproductive rights, and combat both infectious and noncommunicable diseases.

By 2030, USAID, the Gates Foundation and PATH want to reduce the global maternal mortality rate to less than 70 per 100,000 live births; end preventable deaths of newborns and children under five years old; reduce premature mortality from noncommunicable diseases by a third; ensure universal access to sexual and reproductive health care services; end the epidemics of AIDS, TB, malaria and neglected tropical diseases; and combat other infectious diseases.

The groups want to consolidate investments from philanthropic organizations like the Gates Foundation and from government groups to go to the highest-value projects, so that their products and services are cheap and accessible. “Strengthening the capacity of low- and middle-income countries to identify, develop, adapt, produce, regulate, assess, and share innovations is critical for a robust innovation pipeline,” said Amie Batson, Chief Strategy Officer at PATH, in an email.

Making communities healthier also makes them more financially resilient. Former U.S. Treasury Secretary Lawrence Summers, who also contributed to the report, says that by investing in health technology now, we can save significant money and lives globally down the road. “With the right investments, we could reach grand convergence in just one generation, averting 10 million deaths every year by 2035. But today’s health tools alone won’t get us there,” says Summers in the report.

Here are eight of the 30 new drugs, diagnostics, devices and services poised to help the developing world:

Easily transportable vaccines can help fight communicable diseases. (Gabe Bienczycki)

Chlorhexidine for Umbilical Cord Care

In the developed world, medical professionals clean babies' umbilical cords shortly after birth. But in the developing world, hundreds of thousands of newborns die each year from infections related to lack of antiseptic at delivery. If $81 million was spent on introducing chlorhexidine in home settings in the developing world in the next 15 years, the authors of the report estimate that more than 1 million neonatal lives could be saved, resulting in a 9 percent reduction in deaths due to sepsis.

Uterine Balloon Tamponades

One of the biggest causes of maternal death is postpartum hemorrhage, which can be stopped or slowed by inserting an inflatable tamponade into the uterus. Because of cost and lack of training, the devices haven't been used in the developing world. The report highlights one easy-to-use, low-cost option, called Every Second Matters for Mothers and Babies. Basically, a condom is attached to a catheter that's inflated with water through a syringe and a one-way valve. By investing $27 million in these devices, the group estimates that 169,000 mothers' lives could be saved in the next decade and a half.

Neonatal Resuscitators

Low-cost neonatal resuscitators could help the one in 10 babies who have trouble breathing at birth. They've been hard to bring into the developing world because of cost, so these groups are working to identify and develop cheap, reusable and easy-to-use options, including ones that health care workers can operate by hand.

Antiretrovirals for HIV That Can Be Injected Every Two Months

HIV is virulent and widespread in sub-Saharan Africa, so, to try to slow the spread, these groups are looking at long-lasting drugs that could be injected into HIV patients every two months to treat symptoms and slow the virus' progression to AIDS. These options could prove more effective than easily forgotten daily pills.

Single-dose Antimalarial Drug

Malaria treatment is tricky for a lot of reasons, but one of the big ones is that the Plasmodium parasites that cause it, transmitted to humans by mosquitoes, are developing resistance to existing drugs. A single-dose antimalarial drug, OZ439, knocks out resistant strains before the parasites have time to develop resistance to the new treatment. This solution will be critical to the 200 million people fighting malaria each year.

Portable Eye Scanners

For the 300 million people who experience it worldwide, visual impairment can have a huge impact on quality of life. But most eye troubles can be treated. The authors of the report note that, with little training, people in remote villages where eye doctors are few and far between could use portable, user-friendly eye scanners, like the 3nethraClassic, to diagnose cataracts, glaucoma and other conditions.

mHealth Innovations

Many non-communicable diseases, such as diabetes and heart disease, can be managed with diet and exercise. This often involves patients changing their habits and routines, but these life changes can be hard to track and stick to, especially if frequent checkups with doctors aren't an option. In the next 15 years, the report speculates that low-cost mobile phones will be leveraged to track behavioral changes. Doctors could send texts and patients could report to networks that hold them accountable.

Injectable Contraceptives

Last year, PATH developed Sayana Press, an injectable contraceptive that lasts for three months. Unlike other contraceptives of this type, this one is designed for home use. The single-use shots can be distributed to individuals, who can discreetly administer them on their own.

America’s First Great Global Warming Debate

Smithsonian Magazine

As the tumultuous century was drawing to a close, the conservative Yale grad challenged the sitting vice president’s ideas about global warming. The vice president, a cerebral Southerner, was planning his own run for the presidency, and the fiery Connecticut native was eager to denounce the opposition party.

The date was 1799, not 1999—and the opposing voices in America’s first great debate about the link between human activity and rising temperature readings were not Al Gore and George W. Bush, but Thomas Jefferson and Noah Webster.

As a gentleman farmer in Virginia, Jefferson had long been obsessed with the weather; in fact, on July 1, 1776, just as he was finishing his work on the Declaration of Independence, he began keeping a temperature diary. Jefferson would take two readings a day for the next 50 years. He would also crunch the numbers every which way, calculating various averages such as the mean temperature each month and each year.

In his 1787 book, Notes on the State of Virginia, Jefferson launched into a discussion of the climate of both his home state and America as a whole. Near the end of a brief chapter addressing wind currents, rain and temperature, he presented a series of tentative conclusions: “A change in our climate…is taking place very sensibly. Both heats and colds are become much more moderate within the memory of the middle-aged. Snows are less frequent and less deep….The elderly inform me the earth used to be covered with snow about three months in every year. The rivers, which then seldom failed to freeze over in the course of the winter, scarcely ever do so now.” Concerned about the destructive effects of this warming trend, Jefferson noted how “an unfortunate fluctuation between heat and cold” in the spring has been “very fatal to fruits.”

Jefferson was affirming the long-standing conventional wisdom of the day. For more than two millennia, people had lamented that deforestation had resulted in rising temperatures. A slew of prominent writers, from the great ancient naturalists Theophrastus and Pliny the Elder to such Enlightenment heavyweights as the Comte de Buffon and David Hume, had alluded to Europe’s warming trend.

A contemporary authority, Samuel Williams, the author of a 1794 magnum opus, The Natural and Civil History of Vermont, had studied temperature readings at several points in the 18th century from his home state and half a dozen other locales throughout North America, including South Carolina, Maryland and Quebec. Citing this empirical data, Williams claimed that the leveling of trees and the clearing of lands had caused the earth to become warmer and drier. “[Climate] change…instead of being so slow and gradual, as to be a matter of doubt,” he argued, “is so rapid and constant, that it is the subject of common observation and experience. It has been observed in every part of the United States; but is most of all sensible and apparent in a new country, which is suddenly changing from a state of vast uncultivated wilderness, to that of numerous settlements.”

In his 1787 book, Notes on the State of Virginia, Thomas Jefferson launched into a discussion of the climate of both his home state and America as a whole. (The Granger Collection, New York)

Concerned about the destructive effects of a warming trend outlined in his book, Jefferson noted how "an unfortunate fluctuation between heat and cold" in the spring has been "very fatal to fruits." (Bettmann / Corbis)

Noah Webster disputed the "popular opinion that the temperature of the winter season, in northern latitudes, has suffered a material change" in a speech. Webster focused on the numbers—and his opponents' lack of hard data on the subject of global warming. (The Granger Collection, New York)

This opinion had been uttered for so long that it was widely accepted as a given—until Webster. Today Webster is best known as the author of the American Dictionary of the English Language (1828), but his “great book” was actually his retirement project. He was a pioneering journalist who edited American Minerva, New York City’s first daily newspaper, in the 1790s, and he weighed in on the major public policy issues of the day, cranking out essays on behalf of the Constitution, a 700-page treatise on epidemics and a condemnation of slavery. He would also serve in the state legislatures of both Connecticut and Massachusetts. Webster disputed the “popular opinion that the temperature of the winter season, in northern latitudes, has suffered a material change” in a speech before the newly established Connecticut Academy of Arts and Sciences in 1799. Several years later, Webster delivered a second address on the topic. The two speeches were published together in 1810 under the title “On the Supposed Change in the Temperature of Winter.”

With the thermometer still a relatively recent invention—the Polish inventor Daniel Fahrenheit didn’t develop his eponymous scale until 1724—conclusions about weather patterns before the mid-18th century were based largely on anecdotes. In the first two-thirds of his 1799 speech, Webster attacked Williams, a pastor who helped found the University of Vermont, for his faulty interpretations of literary texts such as the Bible and Virgil’s Georgics. Challenging Williams’ assumption—derived from his close examination of the Book of Job—that winters in Palestine were no longer as cold as they used to be, Webster declared, “I am really surprised to observe on what a slight foundation, a divine and philosopher has erected this theory.” But Webster, while acknowledging that the Bible may well not have been “a series of facts,” tried to spin the weather imagery in ancient texts his own way. Citing passages from Horace and Pliny, Webster asserted that “we then have the data to ascertain the ancient climate of Italy with great precision.”

To settle the scientific debate, Webster offered more than just literary exegesis. In examining “the cold of American winters,” Webster focused on the numbers—and his opponents’ lack of hard data (Jefferson recorded his own temperature readings in a private diary). “Mr. Jefferson,” Webster stated, “seems to have no authority for his opinions but the observations of elderly and middle-aged people.” Webster saved most of his ammunition for Williams, who had written the more extensive brief, replete with an array of temperature readings. Williams’ central contention, that America’s temperature had risen by 10 or 12 degrees in the prior century and a half, Webster asserted, just didn’t make any sense. “The mean temperature of Vermont,” he writes, “is now 43 degrees…If we suppose the winter only to have changed, and deduct one half the supposed abatement, still the result forbids us to believe the hypothesis. If we suppose the heat of summer to have lessened in the same proportion…the summers formerly must have been intolerable; no animal could have subsisted under ten degrees of heat beyond our present summer temperature. On whichever side we turn our eyes, we meet with insurmountable difficulties.”


Webster concluded by rejecting the crude warming theory of Jefferson and Williams in favor of a more subtle rendering of the data. The conversion of forests to fields, he acknowledged, has led to some microclimatic changes—namely, more windiness and more variation in winter conditions. But while snow doesn’t stay on the ground as long, that doesn’t necessarily mean the country as a whole gets less snowfall each winter: “We have, in the cultivated districts, deep snow today, and none tomorrow; but the same quantity of snow falling in the woods, lies there till spring….This will explain all the appearances of the seasons without resorting to the unphilosophical hypothesis of a general increase in heat.”

Webster’s words essentially ended the controversy. While Jefferson continued to compile and crunch temperature data after his retirement from the presidency, he never again made the case for global warming. Neither did Williams, who died a few years after the publication of Webster’s article. Webster’s position was considered unimpeachable. In 1850, the acclaimed German naturalist Alexander von Humboldt declared that “statements frequently advanced, although unsupported by measurements, that…the destruction of many forests on both sides of the Alleghenys has rendered the climate more equable…are now generally discredited.”

And there the matter rested until the second half of the 20th century, when scientists began to understand the impact of greenhouse gases on the environment. The second great global warming debate poses a different set of scientific questions from those raised in the late 18th century, and this time the science clearly supports the idea that human activity (including clearing and burning forests) can increase temperatures. But it is Webster’s papers, with their careful analysis of the data, that have stood the test of time. Kenneth Thompson, a modern environmental scientist from the University of California at Davis, praises “the force and erudition” of Webster’s arguments and labels his contribution to climatology “a tour de force.”

Joshua Kendall is the author of The Forgotten Founding Father: Noah Webster’s Obsession and the Creation of an American Culture (Putnam, 2011).

In Hawaii, Old Buses Are Being Turned Into Homeless Shelters

Smithsonian Magazine

When we think of Hawaii, most of us probably picture surfers, shaved ice and sleek beach resorts. But the 50th state has one of the highest rates of homelessness in America. Due in large part to high rent, displacement from development and income inequality, Hawaii has some 7,000 people without a roof over their heads.

Now, architects at the Honolulu-based firm Group 70 International have come up with a creative response to the homelessness problem: turn a fleet of retired city buses into temporary mobile shelters.

“Homelessness is a growing epidemic,” says Ma Ry Kim, the architect at the helm of the project. “We’re in a desperate situation.”

Kim and her friend Jun Yang, executive director of Honolulu’s Office of Housing, came up with the idea after attending a disheartening meeting of Hawaii’s legislature. Homelessness was discussed but few solutions were offered.

“[Jun] just said, ‘I have this dream, there’s all these buses sitting at the depot, do you think there’s anything we can do with them?’” Kim recalls. “I just said ‘sure.’”

The buses, while still functional, have too many miles on them for the city of Honolulu to keep using. The architects envision converting them into a variety of spaces to serve the needs of the homeless population. Some buses will be sleeping quarters, with origami-inspired beds that fold away when not in use. Others will be outfitted with showers to serve the homeless population’s hygiene needs. The buses will be able to go to locations on the island of Oahu where they are most needed, either separately or as a fleet. The entire project is being done with donated materials, including the buses themselves, and volunteer manpower. Members of the U.S. Navy have pitched in, as have local builders and volunteers for Habitat for Humanity. The first two buses are scheduled to be finished by the end of the summer.

The blueprint for the shower-equipped hygiene bus comes from the San Francisco program Lava Mae, which put its first shower bus on the streets of the Mission District in July 2014. Kim hopes to “pay it forward” by sharing her group’s foldable sleeping bus designs with other cities.

“The next city can adopt it and add their piece or two,” Kim says. “There are retired buses everywhere. The missing part is the instruction manual on how to do this.”

The project comes on the heels of recent controversy about new laws preventing the homeless from sleeping in public. Proponents say the laws, which make it illegal to sit or sleep on Waikiki sidewalks, are a compassionate way of getting the homeless off the streets and into shelters. Critics say the laws are merely criminalizing homelessness and making life more difficult for Hawaii’s most disadvantaged population in order to make tourists feel more comfortable.

The needs of the homeless are varied. While a small percentage of the homeless are chronically on the streets, most are people experiencing difficult transitions—the loss of a house to foreclosure, flight from domestic violence, displacement by a natural disaster. Increasingly, designers and architects are looking to fill these needs with creative design-based solutions.

In Hong Kong, the architecture and design group Affect-T created temporary bamboo dwellings for refugees and disaster victims. The dwellings are meant to sit inside warehouses or other sheltered spaces. Light and easy to transport and construct, the dwellings could be a model for temporary shelters anywhere in the world.

The Italian firm ZO-loft Architecture and Design built a prototype for a rolling shelter called the Wheely. The temporary abode looks like a large can lid, and opens on either side to unveil two polyester resin tents. The internal frame provides space for hanging belongings, and the tents, which stretch out like Slinky toys, can be closed at the end for privacy and protection from weather. Inventor Paul Elkin came up with a similar solution—a tiny shelter on wheels that unfolds to reveal a larger sleeping space.  

But temporary shelters don’t solve the problem of chronic homelessness. It’s increasingly understood that simply giving homeless people homes—a philosophy called Housing First—is more effective than trying to deal with the underlying causes of their homelessness while they’re still living in shelters. Housing First is also cost effective, since people with homes end up needing fewer social supports and are less likely to end up in prisons or emergency rooms. 

A number of cities are tapping into the mania for tiny houses as a more permanent partial solution. In Portland, Dignity Village is a permanent community of some 60 people living in 10-by-12-foot houses near the airport. The houses were built mostly with donated or salvaged materials, and residents share communal kitchens and bathrooms. The village was originally an illegal tent encampment, but the city granted the community land, which ensures houses are built to city code. Residents say the village grants them not just shelter and safety, but also privacy and autonomy. Unlike in homeless shelters, residents have a permanent spot and are allowed to live with partners and pets. Similar villages exist across the Pacific Northwest and California, with more springing up in other parts of the country. 

With homelessness on the rise in America—a recent U.S. Conference of Mayors survey of 25 cities showed homelessness had increased in nearly half over the past year—we'll certainly be in need of more design-inspired solutions, tiny, rolling, and otherwise. 

The most radical thing about Stonewall wasn’t the uprising

National Museum of American History

The Stonewall uprising began June 28, 1969, in response to a police raid at The Stonewall Inn, a gay bar in New York, and has since been commemorated around the world with pride parades and other events. Curator Katherine Ott reflects on the significance of the uprising.

I’m a Stonewall skeptic. I don’t doubt that it happened, but I question how it has been used over the years. Because this is a big anniversary year, there is a compulsion to heroize the people who were there and elevate the event.

Those sweaty summer nights of rebellion were certainly important and unique and have reverberated for 50 years. However, an event like the Stonewall uprising was inevitable—young people with 1960s political impatience and righteous indignation had a lot of LGBTQ+ history to fuel them. Other protest and resistance had already happened in places such as Philadelphia, Los Angeles, and San Francisco. Much of the staying power of Stonewall’s reputation rests upon the Pride marches that began on the first anniversary of the uprising.

Donation can from the Christopher Street Liberation Day March, the first Pride march, New York City, 1970. Gift of Mark Segal.

Stonewall’s outsized fame has a downside—skewing both understanding of LGBTQ+ history and misrepresenting how historical change comes about. There is no universal LGBTQ+ history in which any one event is primary. The only commonality in LGBTQ+ life is the risk people take in being themselves.

Stonewall is often pointed to as the birth of the modern gay rights movement or the biggest news in LGBTQ+ history. But that is not accurate. For many gender-non-conforming people, Stonewall had little effect or held no interest. For many disabled LGBTQ+ people, change has been glacial—many people were institutionalized in the 1960s and still constitute a large percentage of those incarcerated. The largest psychiatric facilities today are prisons. In the 1960s, many people of color were putting their energy into civil rights work, antiwar activism, or the Chicano Movement. People living in small towns and rural areas outside of the metropoles of New York, San Francisco, or Chicago did not hear about what happened in New York City or take it up as a rallying cry.

Rainbow wheelchair button, 2016.

Some 12 years after Stonewall, the AIDS epidemic more broadly modernized the gay rights movement and propelled gay liberation by decimating and restructuring communities, creating solidarity, and necessitating out-of-the-box confrontations.

"AIDS: Bearing Angry Witness," by Jennifer Camper, printed in The Blade. HIV and AIDS have had a profound effect on communities, science, medicine, social services, and everyday behavior. Courtesy of John-Manuel Andriote Victory Deferred Collection, Archives Center.

We often think of history with testosterone-fueled events such as battles, riots, and assassinations being the source of lasting change. Violent outbreaks are dramatic and the pain that comes in their wake is attention-grabbing. But real change generally does not come about in a moment. It happens over time and is sustained by people who hold on to an idea and push it forward: the World War II soldiers who came out to each other and stayed out, the 1950s and ’60s journalists who mailed their newsletters in plain brown wrappers, the court cases, picketing, cafeteria rebellions, and everyone who showed up to challenge ignorance. Before Stonewall, there were dozens of legal actions around jobs, marriage, housing, and the right to be yourself. Violence may accompany change, but it does not sustain it.

Mattachine Review, May–June, 1955. The cover story discusses one of the many police raids.
Picket signs carried by protestors at the White House and Independence Hall in Philadelphia, 1960s. Frank Kameny Collection.

For me the reason to remember the Stonewall uprising is in recognition of the daily acts of courage the rioters took that got them to the bar that night. It is the multiple, unremarkable moments of inbreathing “Yo Soy, I am” that people on the margins take every day that are the watershed for change.

Katherine Ott is a curator in the Division of Medicine and Science. She has also blogged about objects she collected from the parents of Matthew Shepard and collecting LGBTQ+ objects of the past.

A display titled Illegal to Be You: Gay History Beyond Stonewall is currently on view at the museum.

Posted Date: 
Friday, June 21, 2019 - 10:30

Portrait Gallery's Hide/Seek Uncovers an Intricate Visual History of Gay Relationships

Smithsonian Magazine

It's hard to consider a large pile of Jolly Rancher-type candies as a form of portraiture. And yet, in the corner of the National Portrait Gallery's new show "Hide/Seek: Difference and Desire in American Portraiture" is a tidy spill of sweets in Technicolor cellophane. You can't miss it, nor should you—it's one of the few opportunities you'll ever have to not only touch the art, but to eat it. (Minding the nearby choking hazard warning signs, of course.) But the sheer whimsy of it all is quickly undercut upon realizing that the piece is a memorial to Ross Laycock, partner and lover of the artist Felix Gonzalez-Torres. Laycock died of AIDS in 1991.

But what does a pile of candy really communicate about a human being? Minimalist art isn't always easy to read, so careful consideration has to be paid to what visual elements are there before you. To display the work of art, the museum had a number of guidelines they were required to follow. "There has to be an assortment of colors," says Hide/Seek curator David Ward, "and it has to weigh 175 pounds—Ross’s weight when healthy—at the start of the installation." As viewers pass by and eat the candy, they enjoy the sweetness of the relationship Gonzalez-Torres and Laycock shared.

The piece was created at a point in time when much of America—including the nation's leaders—was ignoring the AIDS epidemic, and the dwindling pile of candy is also a symbol for the dissolution of gay communities in the wake of this disease. Furthermore, the piece can be arranged in one of two ways: a mound in a corner or a rectangle on the floor. "The mound in the corner is simply a way of collecting or organizing it so it's not just a lump that gets spread out on the floor in a misshapen mass," Ward explains. "But organizing it flat suggests two things: either it's a bed or it's a grave. This makes it more powerful in a way, but we didn't have the space to install it like that."

But artwork that speaks to how AIDS impacted gay communities is only a facet of Hide/Seek. As a whole, the show reveals how American artists have explored human sexuality. Those who approach the show thinking that gay culture is a recent development may be surprised to find that it has been hidden in plain view for decades. It's all a matter of knowing how to crack the visual codes that artists hid in their work. "This is a show about oblique glances," says Ward. "It's a show about subversion."

For a preview of the show, be sure to check out the gallery below as well as Blake Gopnik's Washington Post review. Hide/Seek: Difference and Desire in American Portraiture will be on view at the National Portrait Gallery until February 13, 2011.

Walt Whitman by Thomas Cowperthwaite Eakins. National Portrait Gallery, Smithsonian Institution. “Walt Whitman is the founding spirit of this show,” says Ward. During the Civil War, Whitman, whose poetry collection Leaves of Grass contains themes of free love, worked as a nurse in the Patent Office Building, which is now the National Portrait Gallery. Thomas Eakins took this photograph in 1891, a year before the poet’s death.

Salutat by Thomas Cowperthwaite Eakins. National Portrait Gallery, Smithsonian Institution. In the late 19th century, sporting events that glorified masculinity rose in popularity. College football, rowing and boxing celebrated the fit and healthy physique of the athlete. Here, Eakins plays with social norms by portraying a scantily clad boxer instead of a nude female as the object of an all-male crowd’s gaze. The boxer is the 22-year-old featherweight Billy Smith, who was a close, devoted friend to the artist.

Painting No. 47, Berlin by Marsden Hartley. National Portrait Gallery, Smithsonian Institution. In this 1917 canvas, Marsden Hartley memorializes a man he fell in love with, a German soldier named Karl von Freyburg, who was killed during World War I. “Gays and lesbians were particularly attuned to abstraction because of the care with which they had to present themselves in society,” says Ward. “Their lives had to be coded to hide themselves from repressive or hostile forces, yet they also had to leave keys both to assert their identity and to link up with other members of the community.” Von Freyburg’s initials, his age at death, and his position in the cavalry unit are all cautiously hidden in this abstraction, Painting No. 47, Berlin.

Self Portrait by Romaine Brooks. National Portrait Gallery, Smithsonian Institution. Romaine Brooks was both an artist and patron of the arts. In this 1923 self-portrait, she depicts herself in hyper-masculine clothing. “I think the element of cross-dressing has had an appeal in the lesbian community,” Ward says. “Brooks abandons a stereotypically female look for a combination of items that would signal how she was crossing gender and sexual lines.”

Janet Flanner by Berenice Abbott. National Portrait Gallery, Smithsonian Institution. Janet Flanner was an American living in Paris with her lover Solita Solano, and together they traveled in the most fashionable gay social circles. Flanner wrote a regular column for the New Yorker that gave readers a coded glimpse of the Parisian “in crowd.” In this 1923 portrait, Flanner’s masks are a symbol of the multiple disguises that she wears, one for private life and one for public life.

Marsden Hartley by George Platt Lynes. National Portrait Gallery, Smithsonian Institution. This 1942 portrait, taken by photographer George Platt Lynes, captures artist Marsden Hartley mourning the death of another man he admired. A shadowy man haunts the background, alluding to the loves of Hartley’s life that were lost and unspoken.

Robert Mapplethorpe Self-Portrait by Robert Mapplethorpe. National Portrait Gallery, Smithsonian Institution. Stricken with AIDS, Robert Mapplethorpe casts himself in this 1988 self-portrait as the figure of death. “What he is doing,” Ward says, “is refusing to accept our pity. He is refusing to be defined by us: poor gay man, poor dying gay man. He is also dying with dignity, turning himself into the King of Death. He is owning his status. And what he is telling us is that we are all going to die. We are all mortal and this is the fate that awaits us all. And I also think he is making a statement that he is going to survive after death because of his work as an artist. He is transcending death through art.”

Unfinished Painting by Keith Haring. National Portrait Gallery, Smithsonian Institution. As AIDS raged through gay communities across the United States beginning in the 1980s, Haring’s devastating 1989 canvas, entitled Unfinished Painting, mourns the loss of so many. Haring himself died from AIDS on February 16, 1990, a year in which the disease took an incredible toll: 18,447 deaths.

Camouflage Self-Portrait by Andy Warhol. National Portrait Gallery, Smithsonian Institution. In this 1986 canvas, Andy Warhol plays with the concept of camouflage and the idea that portraiture is a means of masking oneself. Here he is hidden, yet in plain sight.

Ellen DeGeneres, Kauai, Hawaii by Annie Leibovitz. National Portrait Gallery, Smithsonian Institution. When Ellen DeGeneres publicly acknowledged her lesbianism in 1997, it was a landmark event. Besides defying Hollywood’s convention of rarely acknowledging homosexuality publicly, coming out gave her a degree of control over her life. “For me,” DeGeneres said in a 1997 interview with Diane Sawyer, “this has been the most freeing experience, because people can’t hurt me anymore.”

To Fight Deadly Dengue Fever in Humans, Create Dengue-Resistant Mosquitoes

Smithsonian Magazine

There’s a reason this tropical disease is known as “breakbone fever”: To its victims, that's how it feels. Dengue fever can cause such severe muscle and joint pain that it can be excruciating for an infected person to even move. It can also cause burning fever, delirium, internal bleeding and even death as the body attempts to fight off the disease. There is no effective treatment, and there won’t be one anytime soon.

Nevertheless, new research identifies a hope for stemming the epidemic—and it lies in genetic engineering.

Dengue virus, which is passed on by the same Aedes aegypti mosquito that spreads Zika, has been plaguing humans since at least the late 1700s. But in the past few decades, skyrocketing human population and increased urbanization—particularly in warm, moist regions like South America, Southeast Asia and West Africa—have fueled a growing number of cases. Like the Zika virus, dengue causes no symptoms in the majority of those who contract it (roughly three-quarters). But nearly 100 million people annually do develop at least some of its dangerous and excruciating symptoms—and roughly 20,000 of those die each year.

Even if you do survive dengue fever, you aren’t out of the woods yet. In fact, overcoming the disease once actually makes you more likely to die if you contract a different strain later. That’s because the various types of the virus appear so similar on the surface that the immune system will often respond using the same antibodies it developed to fight the last bout. But these are ineffective against the new strain. Moreover, the immune system’s efforts to fight the virus can attack the body instead—causing hemorrhaging, seizures and even death.

So far, preventing the spread of dengue has mostly taken the form of old-fashioned mosquito warfare: nets, insecticide and draining the still water where mosquitoes like to breed. In 2015, researchers finally developed a partially effective dengue virus vaccine, which was green-lighted in three countries. But the vaccine only reduced the chances of contracting the virus by 60 percent in clinical trials, and because of the risk posed by those cross-strain antibodies, some experts think it may only be safe for people who have already survived an infection.

Today the vaccine is only being used in limited quantities in the Philippines. "There is really an urgent need for developing new methods for control," says George Dimopoulos, a Johns Hopkins University entomologist who studies mosquito-borne diseases like malaria and dengue.

Instead of focusing on how people get infected with dengue, Dimopoulos has turned his efforts to how mosquitoes themselves contract the virus. Usually, the virus makes its home in a mosquito after the insect bites an infected human; it rarely passes between mosquitoes. So theoretically, by figuring out how to block that infection from ever occurring, you could effectively eliminate dengue virus, Dimopoulos says.

In a study published today in the journal PLOS Neglected Tropical Diseases, lead author Dimopoulos explained how that would work. Using genetic engineering, he and his team manipulated two genes that help control the immune system of the Aedes aegypti mosquito, which most commonly spreads dengue. The manipulated genes caused the mosquitoes' immune systems to become more active when the bugs fed on blood, which is when they contract dengue virus. This stimulation made the mosquitoes significantly more resistant to the different types of dengue virus.

"This impressive body of work is an important step forward in understanding mosquito-[dengue virus] immunology," says University of Melbourne dengue researcher Lauren Carrington, who was not involved in the study.

However, Dimopoulos says this breakthrough is just the first step. While the mosquitoes in his study became roughly 85 percent more resistant to some types of dengue virus, other types were much less affected by the genetic engineering. Furthermore, the manipulation didn't seem to create any significant resistance to the related Zika and Chikungunya viruses that Aedes aegypti also spread.

Dimopoulos hopes to fine-tune the method to make it more effective. While genetic engineering comes laden with controversy, he points out that his technique doesn't introduce any foreign genes into the mosquitoes; it simply manipulates the ones they already have. Eventually, he hopes to create mosquitoes that will be resistant to multiple tropical diseases. He also wants to take advantage of "gene drive" technology, which increases the chances that a particular gene will be passed to offspring, to allow the genetically modified mosquitoes to quickly become dominant in any environment they're released into.
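To see why a gene drive spreads so quickly, consider a toy population-genetics sketch. This is my illustration, not part of the study: under ordinary Mendelian inheritance a heterozygous mosquito passes an engineered gene to about half its offspring, while a drive that copies itself onto the partner chromosome pushes that share toward 100 percent. The conversion rate and starting frequency below are invented numbers for demonstration only.

```python
# Toy illustration (not from the study): how a "gene drive" can spread an
# engineered allele through a population much faster than ordinary
# Mendelian inheritance, assuming random mating and no fitness cost.

def next_gen_freq(p, conversion=0.9):
    """Frequency of the engineered allele in the next generation.

    p          -- current allele frequency
    conversion -- fraction of heterozygotes in which the drive copies itself
                  onto the wild-type chromosome (0 = ordinary Mendelian
                  inheritance; ~0.9 or more for an efficient drive)
    """
    drive_homo = p * p            # carriers with two engineered copies
    hetero = 2 * p * (1 - p)      # carriers with one engineered copy
    # Heterozygotes normally transmit the allele half the time; the drive
    # converts the wild-type copy with probability `conversion`.
    return drive_homo + hetero * (0.5 + 0.5 * conversion)

p = 0.05  # start with engineered mosquitoes at 5 percent of the population
for gen in range(10):
    print(f"generation {gen}: engineered allele at {p:.1%}")
    p = next_gen_freq(p)
```

Run with conversion set to 0, the frequency stays flat, which is the Mendelian baseline; with an efficient drive it climbs toward fixation within a handful of generations, which is the behavior Dimopoulos hopes to exploit.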

This isn’t the first time researchers have played with mosquitoes’ genes in an attempt to halt the spread of disease. The British biotechnology company Oxitec has worked to modify the genome of the Aedes aegypti mosquitoes to make males that produce dead offspring after mating. Brazil has already partnered with the company to release billions of these mosquitoes into the country, in hopes of suppressing the population of disease-spreading mosquitoes. The company has also worked to get approval to release its mosquitoes in other places, including India, the Cayman Islands and the Florida Keys, where Zika fears drove voters to approve a trial in a ballot measure last year.

Oxitec's methods are effective in the short term, Dimopoulos says. But eliminating the mosquito population from an area will not make it mosquito-free permanently, because mosquitoes from other areas will eventually fill the empty niche left behind. Authorities will be forced to regularly release more genetically modified mosquitoes to keep their population numbers suppressed, Dimopoulos notes—a costly method that would appeal to biotech companies like Oxitec.

Replacing the wild mosquitoes with live but resistant mosquitoes, however, will act as a lasting barrier to spreading tropical diseases, Dimopoulos says. Before we get there, though, he says he wants to work on upping the resistance of the mosquitoes to dengue, as well as making them resistant to other types of tropical diseases. Then, he’ll need to do trials in greenhouses and on islands to see if the resistance works outside the lab.

He doesn't expect any widespread releases of mosquitoes for another decade, but points out that 10 years is a small wait overall. "It's not going to happen quickly," Dimopoulos says, "but we have to remember that these diseases have been with us for a very long time."

There's no humane way to test in the lab whether or not humans will contract dengue less often from these mosquitoes, Dimopoulos says. As a result, we'll only know for sure how effective the gene manipulation is once the mosquitoes have been released. But even if they don't work as well outside the lab, Dimopoulos has no regrets about blazing new trails to combat tropical illnesses.

"The fight against these diseases is like a war," Dimopoulos says. "You can't win it with one weapon."

Have Scientists Found a Way to Pop the Filter Bubble?

Smithsonian Magazine

We like to believe that every visit to Google is a search for knowledge, or, at least, useful information. Sure, but it's also an act of narcissism.

Each time we retrieve search results, we pull out a virtual mirror that reflects who we are in the Web world. It's what Eli Pariser aptly described as the "filter bubble" in his 2011 book, The Filter Bubble: What the Internet Is Hiding From You.

Pariser laid out the thinking behind algorithmic personalization. By meticulously tracking our every click, Google (and now Facebook and a growing number of other websites) can make pretty good guesses, based on past behavior, about what we want to know. This means that two people doing exactly the same search can end up with very different results.

We're fed what we seem to want, and since we're more likely to click on stuff within our comfort zone--including ads--Google, and others, are motivated to keep sharpening their targeting. As a result, the bubbles we live in are shrinking.
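As a minimal sketch of how click-driven personalization can split two people's results for the same query, consider the toy re-ranker below. The data, domains and scoring are invented for illustration; no search engine publishes its actual ranking code.

```python
# Hypothetical sketch, not any search engine's real code: re-rank the same
# base results differently for two users based on the sites each has
# clicked before -- the mechanism behind the "filter bubble."
from collections import Counter

def personalize(results, click_history, boost=2.0):
    """Sort results so domains the user clicked before float to the top."""
    clicks = Counter(click_history)
    return sorted(results,
                  key=lambda r: clicks[r["domain"]] * boost + r["base_score"],
                  reverse=True)

results = [
    {"title": "Climate report", "domain": "nature.com", "base_score": 1.0},
    {"title": "Climate debate", "domain": "blog-a.com", "base_score": 0.9},
    {"title": "Climate op-ed",  "domain": "blog-b.com", "base_score": 0.8},
]

alice_clicks = ["blog-a.com", "blog-a.com"]   # Alice reads blog A a lot
bob_clicks = ["blog-b.com"]                   # Bob prefers blog B

# The same query, two different orderings of the "truth."
print([r["title"] for r in personalize(results, alice_clicks)])
print([r["title"] for r in personalize(results, bob_clicks)])
```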

There's a price for all this precision, as Pariser pointed out in an interview with Brain Pickings' Maria Popova:

"Personalization is sort of privacy turned inside out: it’s not the problem of controlling what the world knows about you, it’s the problem of what you get to see of the world."

The bigger picture

So we're trapped in a maze of our own making, right?

Not necessarily, thanks to a team of scientists who say they may have come up with a way to escape the constraints of algorithms. As the MIT Technology Review reported recently, Eduardo Graells-Garrido at the Universitat Pompeu Fabra in Barcelona and Mounia Lalmas and Daniel Quercia at Yahoo Labs have developed what they call a "recommendation engine," designed to expose people to opposing views.

One key, say the researchers, is that those views come from people with whom we share other interests. That seems to make us more receptive to opinions we'd otherwise likely dismiss as folly. The other is to present opposing views in a visual way that makes them feel less foreign.

To that end, the scientists used the model of a word cloud, which allowed study participants to see which subjects they tended to tweet about most often and, in a visually engaging way, to browse content from others whose own word clouds mentioned many of the same topics.

But what if some of that content reflected a very different political view? Would people instinctively reject it?

To put their theory to a proper test, the researchers connected people on opposite sides of an issue that evokes deeply personal feelings--abortion. They focused on thousands of active Twitter users in Chile who had included hashtags such as #prolife and #prochoice in their tweets, creating word clouds for them based on terms they used most frequently.

Then, they provided study participants with tweets from people who had many of the same terms in their word clouds, but who also held the opposite view on abortion. The researchers found that because people seemed to feel a connection to those who had similar word clouds, they were more interested in their comments. And that tended to expose them to a much wider range of opinions and ideas than they would have otherwise experienced.

In short, the researchers used what people had in common to make them more open to discussing ways in which they differed. They had, their paper concluded, found "an indirect way to connect dissimilar people."
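A rough sketch of that core idea in code: represent each user as a term-frequency "word cloud," then measure how much two users' vocabularies overlap, so that people with opposing hashtags but similar clouds can be paired. This is a simplified illustration, not the researchers' actual recommendation engine; the sample users, tweets and scoring are invented.

```python
# Toy sketch of the approach described above (not the published system):
# build a term-frequency "word cloud" per user, then score overlap so
# users with opposite stances but shared vocabulary can be matched.
import math
from collections import Counter

def word_cloud(tweets):
    """Term frequencies over a user's tweets, ignoring very short words."""
    terms = Counter()
    for tweet in tweets:
        terms.update(w.lower() for w in tweet.split() if len(w) > 3)
    return terms

def similarity(a, b):
    """Cosine similarity between two term-frequency profiles."""
    shared = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in shared)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

users = {
    "ana":   {"stance": "#prochoice",
              "tweets": ["marcha por la salud y la familia hoy",
                         "familia y salud primero"]},
    "berta": {"stance": "#prolife",
              "tweets": ["la salud de la familia es todo",
                         "hoy marcha con la familia"]},
}

clouds = {name: word_cloud(data["tweets"]) for name, data in users.items()}
# Opposite stances, but a high overlap score makes them candidates for
# seeing each other's tweets.
print(similarity(clouds["ana"], clouds["berta"]))
```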

So, there's hope yet.

Madness to the method

Here are other recent developments in the sometimes bizarre world of algorithms.

  • Nothing like automated "Warm personal regards": This was probably inevitable. Google has just received a patent for software that would keep such close track of your social media behavior that it will be able to provide you with a choice of possible reactions to whatever comments or queries come your way on Facebook or Twitter. If, for instance, a friend gets a new job, the software would suggest a response, presumably something such as "Congratulations." That's right, you wouldn't have to waste any of your brain power. The algorithm will do it for you.
  • Phone it in: Researchers at the University of Helsinki have developed algorithms for determining how people get around--walking, driving or taking the bus or subway--by tracking the accelerometer signals of their cell phones. That allows them to analyze the frequency of their stops and starts. The researchers say it could be a powerful tool in helping planners understand how people move around in their cities. (A toy sketch of the idea follows this list.)
  • All the news that fits: Facebook has tweaked its "news feed" algorithms so that more actual news will start showing up there. The idea is to give greater exposure to links to articles from news organizations on Facebook feeds--which will help make the social media giant more relevant to what's going on in the world besides friends' birthdays. The speculation is that this is an effort by Facebook to challenge Twitter's dominance in generating buzz around current events.
  • What does she have to say about the Chicago Cubs?: An Israeli computer scientist has created an algorithm that can analyze huge volumes of electronic data about past events from sources as diverse as the New York Times' archive to Twitter feeds and predict what might happen in the future. Most notably, the scientist, named Kira Radinsky, has used her system to predict the first cholera epidemic in Cuba in many decades and the protests leading up to the Arab Spring.
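Here is the promised toy sketch of transport-mode detection from stop-and-start frequency. It is a hypothetical heuristic, not the Helsinki group's published algorithm; the thresholds, and the assumption that speed has already been estimated from the accelerometer data, are simplifications for illustration.

```python
# Hypothetical sketch: classify a trip by counting how often the rider
# stops and starts, on the rough intuition that buses and cars halt in
# bursts while walking produces slower, steadier motion.

def count_stops(speeds, stop_threshold=0.5):
    """Count transitions from moving to (nearly) stopped, given speed
    estimates in m/s derived from integrated accelerometer readings."""
    stops = 0
    moving = False
    for v in speeds:
        if v > stop_threshold:
            moving = True
        elif moving:          # was moving, now below threshold -> one stop
            stops += 1
            moving = False
    return stops

def guess_mode(speeds, duration_min):
    stops_per_min = count_stops(speeds) / max(duration_min, 1)
    avg_speed = sum(speeds) / len(speeds)
    if avg_speed < 2:
        return "walking"
    return "bus/tram" if stops_per_min > 0.5 else "driving"

# A 10-minute trip with frequent halts and a ~7 m/s average reads as transit.
trip = ([8.0] * 50 + [0.0] * 10) * 6
print(guess_mode(trip, duration_min=10))
```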

Video bonus: Here's the TED talk that made Eli Pariser and his concept of the filter bubble famous.

Video bonus bonus: There are algorithms for everything these days and, if you believe Sheldon of "The Big Bang Theory," that includes making friends.

More from Smithsonian.com

How Big Data Has Changed Dating

Think You're Doing a Good Job? Not If the Algorithms Say You're Not
