
12 kids who helped a doubting public accept the smallpox vaccine

National Museum of American History

Each year in August, National Immunization Awareness Month provides an opportunity to highlight the value of immunization across the lifespan. Activities focus on encouraging all people to protect their health by being vaccinated against infectious diseases. — Centers for Disease Control

Thirty-five years have passed since the 33rd World Health Assembly declared the world free of smallpox, an infectious disease that had plagued humankind for most of written history. This momentous achievement was the result of a massive global eradication campaign begun in the late 1960s, but its real beginnings can be traced back much further—to a medical discovery made in the English countryside, which spread across the Atlantic and to the small towns of the new republic. The following is a small piece of evidence of this long and rich history.

"He is slaid. Milton 25th October 1809. The twelve children whose names are written on the back of this card were vaccinated by Doctor Amos Holbrook..."

This unassuming 3 x 5 inch card in the collections at the National Museum of American History attests to a remarkable event that took place over two hundred years ago in a small town outside Boston. On October 25, 1809, in Milton, Massachusetts, twelve children were released from quarantine after fifteen days of close observation for any sign of smallpox infection. This may not sound unusual for a time when smallpox epidemics were a part of life, but these children had been purposefully inoculated with virulent smallpox matter in order to make a public test of a new medical discovery—vaccination.

The discovery had been made over a decade earlier by Edward Jenner, a country doctor in Gloucester, England. In 1798 he published a pamphlet entitled An Inquiry into the Causes and Effects of the Variolae vaccinae, a disease discovered in some of the western counties of England, particularly Gloucestershire and Known by the Name of Cow Pox. The booklet described his successful experiments using inoculation with cowpox to provide protection from the more serious disease smallpox. Jenner's method was named "vaccination," referring to the medical term for cowpox, Variolae vaccinae, and the Latin vacca, meaning "cow." Vaccination provided a potentially much safer alternative to the older practice of variolation, in which immunity was conferred by deliberately infecting a person with a small dose of smallpox.

As word of the vaccine's effectiveness spread, Jenner supplied cowpox vaccine matter to doctors throughout England. In 1800 vaccine material reached the United States through Benjamin Waterhouse, a professor at Harvard Medical School. Acceptance of vaccination did not come easily, and many members of the medical profession and the church opposed a method that introduced an animal disease into humans. In 1802 Waterhouse felt obliged to extol the virtues of the cow in an attempt to persuade the Boston Board of Health to set aside its objections to the "contemptible origin" of the vaccine. "The earth maintains not a more clean, placid, healthy, and useful animal than the Cow," he appealed. "She is peculiarly the poor man's riches and support. From her is drawn, night and morning, the food for his ruddy children; […] every part of her has its particular uses in commerce and medicine. On these accounts she is an [sic] useful, though invisible wheel in the great machine of state."

Satirical cartoon

Whatever their attitudes toward cows may have been, in 1809 the citizens of the town of Milton, Massachusetts, became part of the first municipal effort in the United States to offer free vaccination to all inhabitants. Over three hundred persons were inoculated during a three-day campaign in July. Following this program, the town leaders took an unusual step—they decided to hold a public demonstration to prove without a doubt that cowpox vaccine offered protection from smallpox. On October 9, 1809, twelve children, selected from those vaccinated in July, were inoculated with fresh, virulent smallpox matter by Dr. Amos Holbrook and witnessed by eighteen town members. The children were confined to a single home for fifteen days and on October 25 were discharged with no sign of smallpox infection.

Each child received a personalized certificate pronouncing them a living testament to the "never failing power of the mild preventative the Cow Pox," "a blessing great as it is singular in its kind." Several other small certificates were produced to commemorate this remarkable demonstration, including the one now in the museum's collection. The names of the twelve children subjected to the vaccine test are inscribed on the back of the card:

"Joshua Briggs, Samuel Alden, Thomas Street Briggs, Benjamin Church Briggs, Martin Briggs, George Briggs, Charles Briggs, John Smith, Catharine Bent, Suzanna Bent, Ruth Porter Horton, Mary Ann Belcher"

Milton's councilmen published a detailed account of the vaccination experiment and sent a copy to the officers of every town in the state, as well as to Governor Christopher Gore, a proponent of vaccination. In 1810 the State of Massachusetts passed the Cow Pox Act, directing every town, district, or plantation within the Commonwealth to provide for the vaccination of its inhabitants.

 

The world is now free of smallpox—a remarkable global achievement that owes a small debt to the citizens of a little town in New England in the early years of our republic.

Diane Wendt is a curator in the Division of Medicine and Science at the National Museum of American History. She has previously blogged about what it was like to survive rabies 100 years ago.

Posted Date: 
Friday, August 28, 2015 - 08:00

Your Tweets Can Predict When You’ll Get the Flu

Smithsonian Magazine

Simply by looking at geo-tagged tweets, an algorithm can track the spread of flu and predict which users are going to get sick. Image via Adam Sadilek, University of Rochester

In 1854, in response to a devastating cholera epidemic that was sweeping through London, British doctor John Snow introduced an idea that would revolutionize the field of public health: the epidemiological map. By recording instances of cholera in different neighborhoods of the city and plotting them on a map based on patients’ residences, he discovered that a single contaminated water pump was responsible for a great deal of the infections.

The map persuaded him—and, eventually, the public authorities—that the miasma theory of disease (which claimed that diseases spread via noxious gases) was false, and that the germ theory (which correctly claimed that microorganisms were to blame) was true. They put a lock on the handle of the pump responsible for the outbreak, signaling a paradigm shift that permanently changed how we deal with infectious diseases and sanitation.

The mapping technology is quite different, as is the disease, but there’s a certain similarity between Snow’s map and a new project conducted by a group of researchers led by Henry Kautz of the University of Rochester. By creating algorithms that can spot flu trends and make predictions based on keywords in publicly available geotagged tweets, they’re taking a new approach to studying the transmission of disease—one that could change the way we study and track the movement of diseases in society.

“We can think of people as sensors that are looking at the world around them and then reporting what they are seeing and experiencing on social media,” Kautz explains. “This allows us to do detailed measurements on a population scale, and doesn’t require active user participation.”

In other words, when we tweet that we’ve just been laid low by a painful cough and a fever, we’re unwittingly providing rich data for an enormous public health experiment, information that researchers can use to track the movement of diseases like flu in high resolution and real time.

Kautz's project, called SocialHealth, has used tweets and other social media to track a range of public health issues. Recently, the team began using tweets to monitor instances of food poisoning at New York City restaurants: they logged everyone who had posted geotagged tweets from a restaurant, then followed those users' tweets for the next 72 hours, checking for mentions of vomiting, diarrhea, abdominal pain, fever or chills. In doing so, they detected 480 likely instances of food poisoning.

But as the season changes, it’s their work tracking the influenza virus that’s most eye-opening. Google Flu Trends has similarly sought to use Google searches to track the movement of flu, but that model greatly overestimated last year’s outbreak, perhaps because media coverage of flu prompted people to start making flu-related queries. Twitter analysis represents a new dataset with a few advantages—higher geographic resolution and the ability to capture the movement of a user over time—that could yield better predictions.

To start their flu-tracking project, the SocialHealth researchers looked specifically at New York, collecting around 16 million geotagged public tweets per month from 600,000 users over three months. Below is a time-lapse of one New York Twitter day, with different colors representing different frequencies of tweets at that location (blue and green mean fewer tweets, orange and red mean more):

To make use of all this data, Kautz's team developed an algorithm that determines whether each tweet represents a report of flu-like symptoms. Previously, other researchers had done this simply by searching for keywords in tweets (“sick,” for example), but that approach leads to false positives: far more users tweet that they’re sick of homework than that they’re feeling sick.

To account for this, his team’s algorithm looks for three words in a row (instead of one), and considers how often the particular sequence is indicative of an illness, based on a set of tweets they’d manually labelled. The phrase “sick of flu,” for instance, is strongly correlated with illness, whereas “sick and tired” is less so. Some particular words—headache, fever, coughing—are strongly linked with illness no matter what three-word sequence they’re part of.
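The scoring idea described above can be sketched in a few lines. This is my own illustration, not the SocialHealth code: the phrase weights and the fixed boost for strong symptom words are invented for the example, standing in for values the researchers learned from their manually labelled tweets.

```python
# Illustrative trigram weights: how often each three-word phrase
# indicated genuine illness in a (hypothetical) labelled set of tweets.
TRIGRAM_WEIGHTS = {
    ("sick", "of", "flu"): 0.9,
    ("sick", "and", "tired"): 0.1,
    ("home", "with", "fever"): 0.8,
}

# Some words signal illness regardless of the surrounding phrase.
STRONG_WORDS = {"headache", "fever", "coughing"}

def illness_score(tweet: str) -> float:
    """Return a crude illness score for a tweet: the weight of its
    strongest three-word phrase, boosted if a strongly illness-linked
    word appears anywhere in the text."""
    words = tweet.lower().split()
    score = 0.0
    # Slide a three-word window across the tweet.
    for i in range(len(words) - 2):
        trigram = tuple(words[i:i + 3])
        score = max(score, TRIGRAM_WEIGHTS.get(trigram, 0.0))
    # Context-independent symptom words get a fixed (illustrative) boost.
    if STRONG_WORDS & set(words):
        score = max(score, 0.8)
    return score

print(illness_score("so sick of flu right now"))    # high score
print(illness_score("sick and tired of homework"))  # low score
```

A tweet like "sick and tired of homework" scores low because its strongest trigram is weakly linked to illness, which is exactly the false positive the single-keyword approach cannot filter out.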

Once these millions of tweets were coded, the researchers could do a few intriguing things with them. For starters, they looked at changes in flu-related tweets over time and compared them with flu levels reported by the CDC, confirming that the tweets accurately captured the overall trend in flu rates. Unlike CDC data, however, the Twitter signal is available in near real time, rather than a week or two after the fact.

But they also went deeper, looking at the interactions between different users—as represented by two users tweeting from the same location (the GPS resolution is about half a city block) within the same hour—to model how likely it is that a healthy person would become sick after coming into contact with someone with the flu. Obviously, two people tweeting from the same block 40 minutes apart didn’t necessarily meet in person, but the odds of them having met are slightly higher than two random users.
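A minimal sketch of this co-location idea, under stated assumptions: tweets are bucketed by a coarse grid cell (roughly half a city block, per the GPS resolution mentioned above) and by hour, and any two users sharing a bucket count as a possible encounter. The grid size and the sample records are illustrative, not taken from the study.

```python
from collections import defaultdict
from itertools import combinations

# (user, latitude, longitude, hour) records; coordinates are fictional.
tweets = [
    ("alice", 40.7410, -73.9897, 14),
    ("bob",   40.7411, -73.9896, 14),  # same block and hour as alice
    ("carol", 40.7520, -73.9770, 14),  # same hour, different block
]

def bucket(lat, lon, hour, cell=0.0005):
    """Snap a coordinate to a coarse grid cell (~half a city block
    in degrees latitude) plus an hour bucket."""
    return (round(lat / cell), round(lon / cell), hour)

# Group users by shared (cell, hour) bucket.
by_bucket = defaultdict(set)
for user, lat, lon, hour in tweets:
    by_bucket[bucket(lat, lon, hour)].add(user)

# Every pair of users in the same bucket is a candidate encounter.
encounters = set()
for users in by_bucket.values():
    encounters.update(combinations(sorted(users), 2))

print(encounters)  # {('alice', 'bob')}
```

As the article notes, two users in the same bucket did not necessarily meet; the point is only that their odds of having met are higher than for two random users, which is enough signal at scale.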

As a result, when you look at a large enough dataset of interactions, a picture of transmission emerges. They found that if a healthy user encounters 40 other users who report themselves as sick with flu symptoms, his or her odds of getting flu symptoms the next day increase from less than one percent to 20 percent. With 60 interactions, that number rises to 50 percent.
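To make those quoted figures concrete, here is a hypothetical helper that linearly interpolates between them (under one percent at zero encounters, 20 percent at 40, 50 percent at 60 or more). The piecewise-linear curve is my simplification for illustration; it is not the researchers' actual statistical model.

```python
def estimated_risk(sick_encounters: int) -> float:
    """Estimate next-day flu risk from the number of co-located
    encounters with self-reported sick users, interpolating between
    the three data points quoted in the article."""
    points = [(0, 0.01), (40, 0.20), (60, 0.50)]
    # At or beyond the last quoted point, cap at its risk level.
    if sick_encounters >= points[-1][0]:
        return points[-1][1]
    # Otherwise interpolate linearly within the enclosing segment.
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= sick_encounters <= x1:
            return y0 + (y1 - y0) * (sick_encounters - x0) / (x1 - x0)
    return points[0][1]

print(estimated_risk(0))   # 0.01
print(estimated_risk(40))  # about 0.20
print(estimated_risk(60))  # 0.50
```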

The team also looked at interactions on Twitter itself, isolating pairs of users who follow each other and calling them “friendships.” Even though many Twitter relationships exist only on the Web, some correspond to real-life interactions, and they found that a user with ten friends who report themselves as sick is 28 percent more likely to become sick the next day. In total, using both of these types of interactions, their algorithm was able to predict whether a healthy person would get sick (and tweet about it) with 90 percent accuracy.

We’re still in the early stages of this research, and there are plenty of limitations: Most people still don’t use Twitter (yes, really) and even if they do, they might not tweet about getting sick.

But if this sort of system could be developed further, it’s easy to imagine all sorts of applications. Your smartphone could automatically warn you, for instance, if you’d spent too much time in the places occupied by people with the flu, prompting you to go home to stop putting yourself in the path of infection. An entire city’s residents could even be warned if it were on the verge of an outbreak.

A full 150 years after John Snow’s disease-mapping breakthrough, it’s clear that there are still aspects of disease transmission we don’t fully understand. Now, as then, mapping the data could help yield the answers.

What Did Independence Day Mean to Southerners About to Secede?

Smithsonian Magazine

In the cooling evening air, Charleston, South Carolina's notable citizens filed into Hibernian Hall on Meeting Street for the traditional banquet to close their Fourth of July festivities. The year was 1860, and the host, as always, was the ’76 Association, a society formed by elite Charlestonians in 1810 to pay homage to the Declaration of Independence.

The guest of honor was one of the city’s most beloved figures, William Porcher Miles, Charleston’s representative in the U.S. Congress in Washington. A former professor of mathematics at the College of Charleston, Miles had won his city’s heart with his heroic efforts as a volunteer nurse to combat an epidemic of yellow fever on the coast of Virginia. He was not a planter, and not even a slaveholder, but he believed in the Constitution and in the slave master’s rights sealed by that compact—and he had come to believe that America was best split into two.

Miles wasn't happy when, amid the clinking of glasses, a poem approved by the ’76 Association was read out loud in the hall:

The day, when dissevered from Union we be,
In darkness will break, o’er the land and the sea;

The Genius of Liberty, mantled with gloom,
Will despairingly weep o’er America’s doom…

It was just a poem, mere words, sounded with a muted note of elegy. But there was no such thing as “mere words” in the blistering heat of this Charleston summer, with war about to erupt. Words, in 1860, were weapons. And these particular words struck a blow at an equation that secessionists like Miles had labored to forge between their cause and the broader American cause of freedom. This verse presented a quite different idea—the notion, heretical to the secessionist, that the sacred principle of liberty was bound up with Union, with the bonds linking together all of the states, and all of the people of the nation, from Maine to Texas.

So it went for Charleston in this year, beset with a complicated, even excruciating welter of emotions on the question of secession. As determined as so many in Charleston were to defend their way of life, based on slavery, under sharp challenge from the North, still there was room for nostalgic feeling for the Union and for the ideals set forth in the Declaration.     

Independence Day in Charleston had begun as customary, with a blast of cannon fire from the Citadel Green at three o’clock in the morning. Roused from their slumber, Charlestonians made ready for a day of parades by militia units in colorful uniform. In the 102-degree heat, the men of the German Artillery, sweltering in their brass-mounted helmets, could only be pitied.

Surely, the town’s secessionists thought, it would be a fine occasion to trumpet their ripening movement. They would celebrate Independence indeed—the coming liberation of the South from the clutches of the nefarious Union. As odd, even bizarre, as this might seem today, Charleston’s secessionists sincerely felt they were acting in a hallowed American tradition. They saw themselves as rebels against tyranny, just like their forefathers who had defeated the British to win America’s freedom some 80 years before. In this instance, the oppressor was the Yankee Abolitionist in league with the devious Washington politician, together plotting to snatch from the South the constitutional right of an American, any American, to hold property in slaves.

By the summer of 1860, these self-styled revolutionaries seemed to be winning their improbable campaign. Back in the spring, at the Democratic National Convention, held in Charleston that year, Charlestonians packed the galleries and cheered wildly when radical Southern Democrats walked out of Institute Hall in protest over the refusal of Northern Democrats to agree to a party plank giving the slaveholder an unimpeded right to operate in western territories like Kansas and Nebraska. The rebel delegates proceeded to establish their own separate “Seceding Convention,” as The Charleston Mercury called this rump group. In its comment hailing the uprising, The Mercury, a daily bugle call for secession, declared that, “The events of yesterday will probably be the most important which have taken place since the Revolution of 1776. The last party, pretending to be a National party, has broken up; and the antagonism of the two sections of the Union has nothing to arrest its fierce collisions.” A Northern reporter strolling the moonlit streets wrote of the occasion that “there was a Fourth of July feeling in Charleston last night—a jubilee …. In all her history, Charleston had never enjoyed herself so hugely.”

In this electric atmosphere, public expressions in favor of the Union could scarcely, and maybe not safely, be heard. An abolitionist in Charleston risked being tarred and feathered. Horace Greeley’s New York Tribune, America’s largest paper by circulation and a standard-bearer for abolition, was banned in the city.

It was all the more remarkable, then, that the poem confessing to despair over the Union’s impending collapse was read for all to hear at the banquet at Hibernian Hall on July 4. Rep. Miles could hardly let a handwringing cry for Union stand unchallenged. He held his tongue at the banquet, but five nights later, at a political meeting of town folk held at the Charleston Theatre, up the street from Hibernian Hall, he gave his constituents a tongue lashing. “I am sick at heart of the endless talk and bluster of the South. If we are in earnest, let us act,” he declared. “The question is with you. It is for you to decide—you, the descendants of the men of ’76.”

His words, and many more like them, would win the summer of 1860 for his camp. Charleston’s passion was for rebellion—and the banquet poem turned out to be a last spasm of sentiment for the Union. Repulsed by such feelings, the Charleston merchant Robert Newman Gourdin, a close friend of Miles, organized rich Charlestonians into a Society of Earnest Men for the purpose of promoting and financing the secession cause. When an Atlanta newspaper mocked Charleston’s insurgents as all talk, no action, a member of the group responded in The Mercury that the Earnest Men would “spot the traitors to the South, who may require some hemp ere long.”

True to their identification of their undertaking with the American Revolution, the secessionists also formed a new crop of militia units known as Minute Men, after the bands that gathered renown in colonial Massachusetts for taking on the British redcoats. Recruits swore an oath, adapted from the last line of Jefferson’s Declaration of Independence, to “solemnly pledge, OUR LIVES, OUR FORTUNES, and our sacred HONOR, to sustain Southern Constitutional equality in the Union, or failing that, to establish our independence out of it.”

In November, with the election to the presidency of Abraham Lincoln, the candidate of the antislavery Republican Party, Charleston went all in for secession. Federal officeholders in the city, including the federal district court judge, resigned their positions, spurring The Mercury to proclaim that “the tea has been thrown overboard—the revolution of 1860 has been initiated.”

Charleston’s “patriotic” uprising ended in ruin—ruin for the dream of secession; ruin for the owner of human chattel, with the Constitution amended to abolish slavery; ruin for the city itself, large parts of which were destroyed by federal shells during the Civil War. The triumph, won by blood, was for the idea expressed ever so faintly by the men of ‘76 at Charleston’s July Fourth celebration of 1860, and made definitive by the war—the idea that liberty, and American-ness, too, were inextricably and forever tied to union.  

Paul Starobin is the author of Madness Rules the Hour: Charleston, 1860 and the Mania for War (PublicAffairs, 2017). He lives in Orleans, Massachusetts.

Creating the Cadet Nurse Corps for World War II

National Museum of American History

“Wartime nursing is different,” The American Journal of Nursing soberly noted in 1943. As nurses well knew, wars always created a shortage of qualified nurses, both on the home front and in the military. Recognizing that addressing these shortages would require “all the imagination and administrative skill” of their profession, American nurses began to discuss and debate how best to address the growing shortage of nurses even before the United States entered World War II.

Light blue and white striped uniform top with collar and four buttons, red details on shoulders.

As American nurses embarked upon this discussion, the federal government was initiating steps not only to “step up recruitment of student nurses” but also to “educate…and better prepare graduate nurses.” By 1943, the United States Public Health Service had already funneled $5.7 million into nursing education in an attempt to address what it believed would be a pending shortage of trained nurses. But this sum was, as Public Health Service officials knew, insufficient to address the problem.

Poster with image of young woman in military uniform

In an attempt to solve this problem once and for all, Frances Payne Bolton, a United States Representative from Ohio, called for an innovative program to resolve the nation’s shortage of nurses. Backed by over $150 million in federal funds, the Cadet Nurse Corps program was signed into law in 1943. Under this program, federal funds were used both to provide scholarships and stipends directly to students and to improve facilities at nursing schools, many of which had been deemed sub-standard. In a surprising twist in a nation that was still ruled by Jim Crow, dispersal of these funds was to be uniform, with funds being provided to all nursing students, regardless of their race or ethnicity, and to all nursing schools, including those that served primarily or even solely minority students.  

Following passage of the Bolton Act, a massive recruitment campaign was launched. Targeting women who were high school graduates between the ages of 17 and 35, the campaign used ads, films, radio programs, billboards, and recruiting posters to encourage women to join the Cadet Nurse Corps. Recruitment materials underscored the benefits of the program: free tuition, coverage of book fees and uniform costs, and even a stipend to cover any ancillary costs. In exchange for this financial assistance, nursing students were required to complete their education in 30 months and then to work as civilian or military nurses for the duration of the war. The recruitment campaign was an unqualified success, with the program enrolling its target number of recruits each year it was in operation.

Two photos of details of the above uniform. Stripes. Badge with red circle and white cross with text "Cadet Nurse." Button with anchor and medical symbol.

Across the country, nursing schools underwent a radical transformation as federal funds helped schools update and modernize their equipment and facilities. Because nursing schools that served minority populations were more likely to have large numbers of students in need of financial assistance, and because these nursing schools were less likely to have a strong endowment that they could use to improve their facilities, the Cadet Nurse Corps program had an especially dramatic impact on minority access to nursing education. At some nursing schools, such as the Sage Memorial Nursing School, which served predominantly Navajo students, a significant number of students joined the Cadet Nurse Corps. Looking back at their experiences, the women in the Cadet Nurse Corps who studied at Sage remembered that the stipends they received from the government to study nursing “made them relatively rich in an area that was desperately poor.” Twenty-one African American nursing schools also benefited substantially from this program, as did 38 nursing programs that accepted both African American and white students.

Between 1943 and 1948, when the program was terminated, just over 124,000 women enrolled in the Cadet Nurse Corps program. For many of these women, the program helped propel them into a profession and into the American middle class. Nursing schools were also transformed as federal funds were used to build modern facilities and ensure that laboratory equipment was state of the art.

More broadly, the Cadet Nurse Corps program ensured that Americans, whether they were enrolled in the military or serving on the home front, had access to the nursing care that they needed throughout and after the war years.

Gray cap with no brim

Alexandra M. Lord, Ph.D., is chair of the History of Medicine and Science Division. She has also blogged about the history of measles. For National Nurses Week (May 6-12), you may want to read about a Civil War nurse in Washington, D.C., midwives on horseback, or stories from the frontline of a measles epidemic.

Posted Date: 
Thursday, May 5, 2016 - 08:00

Billy Graham, the Evangelical Pastor Who Preached to Millions, Has Died at 99

Smithsonian Magazine

Billy Graham, the charismatic Christian evangelist who preached to millions of people and was known affectionately as “America’s pastor,” has died at the age of 99.

Jeremy Blume, a spokesman for the Billy Graham Evangelistic Association, confirmed Graham’s death to Laurie Goodstein of the New York Times on Wednesday. Graham had battled a number of illnesses in his later life, including prostate cancer, Parkinson’s disease and hydrocephalus, an accumulation of fluid in the brain.

The man who reportedly preached to some 215 million people in more than 185 countries and territories was born outside of Charlotte, North Carolina, in 1918. His parents were Reformed Presbyterians, but Graham was initially an “unenthusiastic Christian,” as Goodstein puts it, preferring history books and baseball to the Bible. That changed in 1934, when Graham encountered the itinerant preacher Reverend Mordecai Ham and decided to make a personal commitment to God.

Graham gravitated toward the Southern Baptist denomination. He wanted to become “a Bible-waving preacher like the ones who came through Charlotte in pursuit of lost souls,” Tom Gjelten writes for NPR. According to Graham biographer William Martin, the budding preacher took to locking himself in a tool shed or canoeing out to isolated spots on a river, practicing his sermons on oil cans and alligators.

While attending the evangelical Wheaton College in Illinois in the early 1940s, Graham met his wife, Ruth McCue Bell. He subsequently led a Baptist congregation in Chicago, and in the mid 1940s became the chief preacher for the Youth for Christ rallies. But his career truly began to burgeon in 1949, after he held what he referred to as a “crusade” in a 6,000-seat tent in Los Angeles.

More than 350,000 people are said to have come to see the handsome young preacher deliver his sermons over an eight-week period. His success was no doubt spurred by the newspaper publisher William Randolph Hearst, who, impressed by Graham’s anti-communist rhetoric, told his employees to “puff Graham,” according to Gjelten.

Graham wasn’t the first popular evangelist in America, but he distinguished himself with a unique ambition and a prescient embrace of technology. The Billy Graham Evangelistic Association, which he founded in 1950, produced radio and television programs, allowing Graham to reach millions of followers around the globe. His religious rallies, which featured musicians and choirs, could fill stadiums; more than 2 million people came to see Graham at New York City’s Madison Square Garden in 1957.

Though Graham initially permitted segregated seating at his “crusades,” he soon demanded that all of his followers be treated equally. He was not an active civil rights campaigner, but he invited Martin Luther King Jr. to offer a prayer at his Madison Square Garden crusade and spoke out against the 1963 bombing of a predominantly black church in Birmingham, Alabama.

“We should have been leading the way to racial justice but we failed,” he said, according to Daniel Burke of CNN. “Let's confess it, let's admit it, and let's do something about it.”

Graham’s support of the Civil Rights Movement drew the ire of the Ku Klux Klan and Southern segregationists. He also angered fundamentalist leaders because he embraced Christians of other denominations, inviting Catholic and liberal Protestant ministers to share his stage.

Billy Graham, Jr., by James Pease Blair, 1958 (printed later). Gelatin silver print (National Portrait Gallery, Smithsonian Institution; gift of James P. Blair ©1958, James P. Blair)

Graham’s fame brought him into close association with several presidents, including Lyndon Johnson, George W. Bush and Bill Clinton. He endorsed the 1968 presidential campaign of Richard Nixon, with whom he became friends. Their relationship was strained, however, in the wake of the Watergate scandal.

“[Graham] recognized then that he had probably been used, that he had misunderstood something of the president’s character,” biographer William Martin told NPR’s Gjelten. “That was a terrible blow to him and caused him to withdraw from the political arena.”

Graham carefully guarded his image as a man of the church. Throughout his life, he was known for adhering to the so-called "Billy Graham rule," refusing to "travel, meet, or eat alone" with women other than his wife. He first adopted the rule in 1948 and maintained it throughout his life (with the exception of a lunch with Hillary Clinton), even, according to his grandson Will, employing "two nurses, for accountability purposes" for his care in his final years.

When it came to gender roles, Graham's daughter Anne Graham Lotz told NPR in 2011 that her father's views evolved over time. While she says her father was initially critical of her decision to practice ministry, once he attended one of her classes, he gave her his full support. He reportedly used to say that Anne was the best preacher in the family.

In the later years of Graham's career, he made efforts to steer clear of incendiary topics—like homosexuality and abortion—that fueled other evangelical preachers. But he did court controversy in 1993 when he suggested that the AIDS epidemic was a "judgment of God"—a statement for which he later apologized.

His association with Nixon also drew Graham into a scandal in 2002, when the National Archives released tapes from Nixon's White House. One tape from 1972 captured the preacher telling the president that American Jews had a “stranglehold” on the media, and that Nixon “might be able to do something” about it if he was elected to a second term. In the wake of the tape’s release, Graham apologized to Jewish leaders and said that he had long “sought to build bridges between Jews and Christians.”

During a candid interview with Laurie Goodstein of the New York Times in 2005, Graham reflected on his regrets, including that tape. When asked about politics, the then-86-year-old affirmed that he no longer wanted to be vocal in that sphere. "I feel I have only a short time to go, and I have to leave that to the younger people," he said. "If I get on these other subjects, it divides the audience on an issue that is not the issue I'm promoting. I'm just promoting the gospel."

Horrific Tales of Potatoes That Caused Mass Sickness and Even Death

Smithsonian Magazine

It was the second day of autumn term at a small boys’ school in South London in 1979. Without warning, 78 schoolboys and a handful of monitors simultaneously fell ill. Symptoms included vomiting, diarrhea, abdominal pain and, in severe cases, depression of the central nervous system. Several patients were comatose with episodes of convulsive twitching and violent fits of fever. In many patients, there were signs of peripheral circulatory collapse. Within five days of the initial outbreak, all patients recovered in full, though some hallucinated for several days, Mary McMillan and J.C. Thompson report in the Quarterly Journal of Medicine. But what could cause such a sudden and mysterious illness?

Turns out, a bag of potatoes left in storage from the previous summer term.

After careful analysis of the sequence of events, the onset of symptoms was pinpointed to about four to 14 hours after the boys had eaten boiled potatoes containing a high concentration of solanine, a glycoalkaloid toxin first isolated in 1820 in the berries of the European black nightshade. Nightshade is the term for over 2,800 species of plants in the scientific family Solanaceae. Eggplants, tomatoes and some berries are common members of the nightshade family—many of them contain highly toxic alkaloids. 

That said, the potato is the most common cause of solanine poisoning in humans. But how do you know when solanine is present in a potato? The tuber turns green.

Though the green color that forms on the skin of a potato is actually chlorophyll, which isn’t toxic at all (it’s the plant’s response to light exposure), the presence of chlorophyll indicates concentrations of solanine. The nerve toxin is produced in the green part of the potato (the leaves, the stem, and any green spots on the skin). The reason it exists? It’s a part of the plant’s defense against insects, disease and other predators.

If you eat enough of the green stuff, it can cause vomiting, diarrhea, headaches and paralysis of the central nervous system (as evidenced by the incident above), and in rare cases the poisoning can cause coma—even death. Studies have recorded illnesses caused by a range of 30 to 50 mg of solanine per 100 grams of potato, but symptoms vary depending on the ratio of toxin to body weight and the individual’s tolerance of the alkaloid. The following cases, recorded in various medical journals, include some of the most severe examples of solanine poisoning (many of which resulted in death):

1899: After eating cooked potatoes containing 0.24 mg of solanine per gram of potato, 56 German soldiers experienced solanine poisoning. Though all recovered, in a few cases, jaundice and partial paralysis were observed. 

1918:  In Glasgow, Scotland, 61 people from 18 separate households were affected at once by a bad batch of potatoes. The following day, a five-year-old boy died of strangulation of the bowel following extreme retching and vomiting. According to “An Investigation of Solanine Poisoning” by S. G. Willimott, PhD, B.Sc. published in 1933, the case was investigated by scientists, R. W. Harris and T. Cockburn, who concluded in their article, “Alleged Poisoning By Potatoes” (1918),  that the poisoning was the result of eating potatoes which contained five or six times the amount of solanine found in normal potatoes. Willimott cites this particular occurrence as an example of the toxin’s prevalence: “A review of the literature reveals the fact that authentic cases of solanine poisoning are not so rare as authorities appear to believe.”

1922: In the autumn of this year, a serious epidemic broke out in Germany that was traced to an abnormal concentration of solanine in the potato crop. 

1925: Seven members of a family were poisoned by greened potatoes. Two of them died. According to reports, symptoms included vomiting and extreme exhaustion, but no convulsions like those of the schoolboys in London. Breathing was rapid and labored until consciousness was lost a few hours before death. 

1948: A case of solanine poisoning involving the potato’s nightshade relative, the black nightshade berry, was recorded in the article “A Fatal Case of Solanine Poisoning,” published in the British Medical Journal. On August 13 of that year, a 9-year-old girl with a habit of snacking on the berries that grew along the railroad tracks by her house was admitted to the hospital with symptoms of vomiting, abdominal pain and distressed breathing. She died two days later. An autopsy found hemorrhages in the mucosa of her stomach and the middle section of her small intestine. The stomach contained about one pint of dark brown fluid. 

1952: According to the British Medical Journal, solanine poisoning is most common during times of food shortage. In the face of starvation, there have been accounts of large groups eating older potatoes with a higher concentration of the toxin. In North Korea during the war years of 1952-1953, entire communities were forced to eat rotting potatoes. In one area alone, 382 people were affected, of whom 52 were hospitalized and 22 died. The most severe cases died of heart failure within 24 hours of potato consumption. Some of the less severe symptoms included irregular pulses, enlargement of the heart, and bluing of the lips and ears. Those who displayed these ailments died within 5 or 10 days. Authors John Emsley and Peter Fell explain in their book Was It Something You Ate?: Food Intolerance: What Causes It and How to Avoid It: ”In the final stages there was sometimes a state of high excitability with shaking attacks and death was due to respiratory failure.” 

1983: Sixty-one of 109 school children and staff in Alberta, Canada, fell ill within five minutes of eating baked potato. Forty-four percent of those affected noted a green tinge and a bitter taste in the potatoes.

Not to worry, though: fatal cases of solanine poisoning are very rare these days. Most commercial varieties of potatoes are screened for solanine, but any potato will build up the toxin to dangerous levels if exposed to light or stored improperly. Often, the highest concentrations of solanine are in the peel, just below the surface, and in the sprouted “eyes”—parts typically removed in cooking preparation—though researchers such as Warren have argued that even boiling dissolves only a little of the alkaloid. Emsley and Fell continue:

Most people can easily cope with the solanine in the average portion of potato and show no symptoms of poisoning because the body can break it down rapidly and excrete the products in the urine. But if the level of solanine is as high as 40 mg per 100 g of potato, symptoms include diarrhea…even coma.
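The concentration figures quoted in these cases lend themselves to a quick back-of-envelope check. Below is a minimal sketch (emphatically not medical guidance): the `risk_note` helper and its wording are my own invention, simply placing a measured solanine concentration against the 30 to 50 mg per 100 g range the studies above associate with illness.

```python
# Back-of-envelope comparison of a solanine concentration against the
# 30-50 mg per 100 g range that the studies cited above link to illness.
# The constant and phrasing are illustrative only; actual risk also depends
# on body weight and individual tolerance, as the article notes.

ILLNESS_RANGE_MG_PER_100G = (30, 50)

def risk_note(mg_per_100g: float) -> str:
    """Place a solanine concentration relative to the reported illness range."""
    lo, hi = ILLNESS_RANGE_MG_PER_100G
    if mg_per_100g < lo:
        return "below the reported illness range"
    if mg_per_100g <= hi:
        return "within the reported illness range"
    return "above the reported illness range"

# The 1899 soldiers' potatoes held 0.24 mg per gram, i.e. 24 mg per 100 g:
print(risk_note(24))   # below the reported illness range
# Emsley and Fell's 40 mg per 100 g figure:
print(risk_note(40))   # within the reported illness range
```

Note that the 1899 soldiers fell ill at a concentration just below the quoted range, which is consistent with the article's point that tolerance varies from person to person.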

The best way to prevent solanine poisoning is to store tubers in a cool, dark place and remove the skin before consumption. A general rule for avoiding illnesses like the ones described above? Green and sprouted? Throw it out.

Why Are Native Groups Protesting Catholicism's Newest Saint?

Smithsonian Magazine

Sometimes a saint may be all too human.

Junipero Serra, the missionary who brought Catholicism to California, is set to be canonized this week on the occasion of Pope Francis’ visit to the United States in a Spanish-language ceremony expected to draw tens of thousands of worshippers. But some Native American groups think the event is cause for outcry, not celebration.

Serra’s story is the West Coast incarnation of some of the founding myths of the United States. Just as the stories of Columbus, Pocahontas and the Puritans are incomplete without including the fate of displaced and mistreated native populations, so too is that of the settlement of the Pacific Coast.

The mission system lasted over 60 years and was integral to Spain’s colonization of the recently conquered California land mass. Serra’s canonization, meanwhile, is stirring up controversy about whether the system he founded was holy or horrible. Between 1769 and 1784, Serra formed nine Spanish missions. Many were massive in size; Mission San Luis Rey had 60,000 head of cattle at one point. Each mission was a closed Catholic community that offered native nations, like the Kumeyaay, Chumash and Cahuilla, Spanish citizenship and education in exchange for their conversion, labor and permanent residency.

The mission system lasted long after Serra’s death—21 missions were formed before newly independent Mexico abandoned the project in 1833. The missionaries’ promises to entrust mission lands to the native people who built and turned them into self-sustaining communities were broken: Most of the land was “secularized” and distributed to non-native owners. Though many mission lands were eventually regained by the Catholic church, they were never returned to the people who built them.

To Serra’s supporters, the missions were forces for good, spreading Catholicism, settling the state and building beautiful sanctuaries. But for many others, Serra’s legacy is much darker than the whitewashed plaster of California’s iconic missions.

Jeffrey M. Burns, a Serra scholar who directs the University of San Diego’s Frances G. Harpst Center for Catholic Thought and Culture, says that Serra and his fellow missionaries measured success in terms of souls saved. “Serra offered the native people membership in the missions in exchange for eternal life,” says Burns. “He would have seen everything at the mission as the native people’s property, something he was holding in trust for them. It may not have worked out that way, but that’s how he understood it.”

Though native peoples could hypothetically decide whether to enter the missions, some were lured in when the missions needed more manual labor. Others felt they had no choice—as animals imported from Europe changed the ecosystem and diseases threatened the native populations, missions became a more attractive (but by no means ideal) option.

Mission life exacted a high cost from native peoples, Serra biographer and University of California, Riverside history professor Steven W. Hackel says. As they farmed, labored and went to church, “Indians were expected to give up most of the important aspects of their culture in return for what the missionaries promised them was salvation,” says Hackel. Confined inside the missions among a diverse group of mission-bound Native Americans, indigenous people were encouraged to abandon both their cultural practices and traditional farming techniques.

“Indians who challenged the mission’s authority were flogged,” says Hackel. The Indians’ “spiritual fathers,” he continues, “punished them as children even when they were adults.” Those who tried to escape were hunted down by Spanish soldiers and forced to return. Crowded missions were also hot spots for diseases like pneumonia and diphtheria. One missionary wrote that an epidemic of measles “has cleaned out the missions and filled the cemeteries.” According to the Huntington Library’s Early California Population Project, 71,000 burials were performed in California’s missions between 1769 and 1850. And the University of California’s Calisphere notes that though there were an estimated 300,000 native people living in the area before Spanish colonization, only 30,000 remained by 1860.

“There were no easy answers” for Native Californians, says Burns, who notes that converts had to weigh their survival against a mission system that “didn’t have cultural sensitivity.” And according to some tribal leaders, modern-day conversations about Serra are no better.

Though Pope Francis asked forgiveness in July for mission-era crimes against native peoples, some see the canonization of the system’s figurehead as a slap in the face. The Pope is “evidently unaware of the deadly toll and devastating effect that the Catholic Mission system had on our nations and peoples here in California,” wrote Robert Smith, chairman of the Pala Band of Mission Indians and the Southern California Tribal Chairmen’s Association, in a letter of protest.

“Neither the missions nor Serra’s methods are worthy of secular or state pride,” wrote Valentin Lopez, chairman of the Amah Mutsun Tribal Band in an open letter urging California governor Jerry Brown to protest the canonization. Nonetheless, missions still have plenty of visibility in California: Serra’s name can be found on everything from high schools to highways. A fourth-grade project on the missions has been part of the state curriculum for decades, and the mission system’s legacy is present in California’s architecture, statuary and even sports teams—San Diego Padres, anyone?

“The missions were an unmitigated disaster for the Indians of California,” says Andrew Galvan, museum curator at Mission Dolores in San Francisco. “There’s no denying that.” But Galvan, whose Ohlone ancestors were forcibly baptized and brought to live in the missions, also sees a silver lining in Serra’s canonization. “This negativity is an opportunity for transformation,” he says.

Galvan, who served Serra’s canonization cause, doesn’t see a contradiction between admiring the man who brought Catholicism to California and condemning the system that he helped found. Instead, he is alarmed at church and museum officials’ furthering of what he calls “the mission myth”—a romanticized version of mission life that erases the struggles and contributions of Native Californians.

“There’s an opportunity to tell the true story now,” says Galvan—the story of a man “on fire” with missionary zeal and at the helm of a system that had fatal consequences for Native Californians. He calls on the Catholic Church to go beyond canonizing Serra and begin to reweave native contributions and stories into the story of the missions. “They’re Indian missions,” he insists. “They’re our places. Indian people are still here.”

Why Your Next Favorite Fictional Protagonist Might Be on the Autism Spectrum

Smithsonian Magazine

Is autism cool?

It is in literature, as novels featuring characters on the autism spectrum become so frequent that they’ve spawned a new genre: “autism lit,” or “aut lit.”

Many of the works put a positive spin on autism. These autistic characters have abilities as well as disabilities; they exist not only as mirrors or catalysts to help others solve their problems, but as active agents with inner lives.

The Curious Incident of the Dog in the Night-Time, first published in 2003, did more than any other book to give life to this genre. Christopher Boone, the narrator, is a 15-year-old autistic savant; that is, he can perform computer-like math in his head. He also has trouble with language and social interactions, the two primary symptoms of autism. Still, he’s shown to have an inner life that includes many opinions, as well as hopes for the future. Perhaps of greatest importance is that he has the ability to pursue his goal of solving the mystery of who killed his neighbor’s dog.

A successful book that breaks new ground will breed many imitations. Back in the late 1970s, Robin Cook’s Coma introduced the medical thriller to the world. And so Curious Incident has been followed by a wide range of novels, including the pseudo-science fiction novel The Speed of Dark (2005); fiction-that-reads-as-memoir, such as Daniel Isn’t Talking (2006) and Tilt: Every Family Spins On Its Own Axis (2003); young adult novels such as Mindblind (2010); and the light-hearted The Rosie Project (2013) and its sequel, The Rosie Effect (2014).

Of particular interest is M is for Autism (2015), the moving result of a collaboration of young students at Limpsfield Grange, a school for autistic girls.  Boys are diagnosed with autism four times more often than girls, and the face of autism is almost always that of a young boy.  This novella looks at some of the special issues that young women face, and by doing so it’s an exception in the genre.

Back to our young men, though: Somewhere on the journey from Curious to Rosie a transformation occurred.  The smart, but anti-social and clueless Christopher Boone morphed into the smart and somewhat clueless but also charming husband and father Don Tillman.  Don is a professor of genetics in The Rosie Project and an equally successful professor in New York in the sequel.

On this same literary journey, the perceived limitations of these autistic characters have been turned either into strengths, or into obstacles that, once overcome, become strengths. For example, many of these fictional beings “miss social cues” (a stereotype, but like all stereotypes based in some reality), and therefore don’t dissemble or manipulate the way that neurotypical people do.

Lou Arrendale, the hero of The Speed of Dark, is a thoughtful young man with a superior moral sense. He lives in a not-too-distant future when autism can be cured in infancy. Lou was born just a little too late for that, but now science has invented a way to “fix” autism in adulthood, and Lou has to decide whether he wants to give up the advantages of his condition for the sake of fitting into society’s mold.  It’s difficult to imagine a character debating this question 20 years ago, let alone 50.

Mindblind is a contemporary young adult novel; no scientific advances here.  But Nathaniel Clark, the hero and narrator, not only drives the action, he ends up being a rock star, at least in his own social circle.

Perhaps the most powerful statement, though, is uttered by the heroine’s therapist in M is for Autism: “You are a wonderful teenage girl.  And you have autism.  The truth is, you will need some support and guidance with life’s inevitable ups and downs but you can have a glorious, fulfilled life, M, and this is the truth, too.”  In other words, even without medical intervention or a touch of wishful thinking, there’s no reason for people on the spectrum to give up on their future.

This positive prediction wouldn’t be made about Boo Radley, the recluse from To Kill a Mockingbird.  Rumors surround Boo: he eats raw squirrels; he drools most of the time.  Though these are indeed rumors, from what we do learn about Boo, he may well be autistic.  Regarded as a shadowy, sinister figure, Boo ends up saving Scout’s and Jem’s lives, but his “reward” is to have his brave act go unrecognized.  We last see him as Scout leads him by the hand back to his lonely existence.

Autism lit is not without controversy:  Many readers object to the prevalence of the autistic savant.  And in fact, most of these protagonists are gifted: Christopher Boone, for example, is about to sit for his A levels in math, a heady accomplishment in England, where the book takes place. Nathaniel Clark is graduating college (with a double major, he reminds us more than once) at the age of 14.

In reality, savant skills are as rare in the autism spectrum community as they are in the neurotypical one.  Those who dislike the novels for this reason cite the 1988 film Rain Man in which Dustin Hoffman plays Raymond Babbit, who can memorize a thick phone book in one night.  As novelist and cultural observer Greg Olear wrote, “Thirty years later, the belief persists that autistics can reliably count a pile of toothpicks at a glance. This is a powerful negative stereotype that autistic children (and their parents) must overcome.”

But there doesn’t seem to be any stopping “autism lit,” exploitative or not. In fact, the fascination with the autism spectrum and fiction has launched yet another literary trend: the “retroactive diagnosis.” Some readers now believe that Mr. Darcy of Pride and Prejudice is on the spectrum; that’s the explanation for his reserve. Some readers suspect that the narrator of Hermann Hesse’s Steppenwolf falls into this category as well. The word “autism” didn’t exist, the theory goes, before World War II, and that’s the explanation for why Austen and Hesse didn’t label their characters themselves.

I’m not ready to jump onto this bandwagon. Calling Mr. Darcy autistic is a way of granting stature to people truly on the spectrum who don’t need your literary charity, thank you very much. But there are worse alternatives.  (The retroactive diagnoses might apply to Boo Radley.)

In the world outside victimhood, we remain in the midst of an unexplained epidemic of autism spectrum disorders; some sources say that as many as 1 in 68 children are being diagnosed with the condition. And even with the onslaught of fictional characters on the spectrum, much of the story of autism remains untold.

There’s a saying that has been variously attributed to Temple Grandin, the autistic professor of animal science and advocate for the humane treatment of livestock, as well as to the autism advocate and author Stephen Shore, which has become one of those aphorisms that belong to the world: “If you’ve met one person with autism, you’ve met one person with autism.”

Since each story is different, we can expect the category of autism lit to swell, ideally with more portrayals of people on the spectrum who have jobs, partners, and most of all, purpose.

Donna Levin’s latest novel, There’s More Than One Way Home, was published in May of this year by Chickadee Prince Books. Her papers are part of the Howard Gotlieb Archival Research Center at Boston University and her novels are part of the “California novels” collection in the California State Library. 

How to Make Science Fiction Become Fact, in Three Steps

Smithsonian Magazine

While speakers at the first day of Smithsonian magazine’s fourth annual “Future is Here” festival shared their thoughts on subjects as diverse as computer programming, the Zika virus, human space exploration, the future of the internet and the state of global fisheries, they all struck a common note: there’s hope. Never give up—even if you have to wait a long time.

“Who will be the next President of the United States?” Smithsonian’s editor-in-chief Michael Caruso asked a Magic 8 Ball as he opened the day of TED-style talks on Saturday. “The future is notoriously difficult to predict. But never before has the distance between imagination and reality been so close, and the predictions scientists are making aren’t wild fantasies.”

Smithsonian magazine's editor-in-chief Michael Caruso kicks off the day. (Richard Greenhouse Photography)

Caruso welcomed a roster of visionaries including Nicholas Negroponte, co-founder of the MIT Media Lab; Martine Rothblatt, founder of Sirius Radio and United Therapeutics; Vint Cerf, Google’s “chief internet evangelist” and co-developer of modern internet connection protocols; and former NASA astronaut Tony Antonelli, who helps Lockheed Martin shape its human spaceflight initiatives. Two of Jacques-Yves Cousteau’s granddaughters, Céline and Alexandra Cousteau, also took the stage to talk about their respective work in the Amazon and with the world’s oceans.

Sisyphean perseverance emerged as the theme of the day, encouraging those despairing visionaries out there, eager for the day when technology (hopefully) makes their ideas possible.

Rothblatt, obsessed with all things space for most of her life, said her whole focus shifted after her daughter Jenesis was diagnosed in 1994 with life-threatening and incurable pulmonary arterial hypertension (PAH). She founded United Therapeutics in 1996 after doing a deep-dive into potential treatments and convincing Burroughs Wellcome (and later GlaxoSmithKline) to allow her to license a compound, treprostinil, they’d shelved in favor of an easier-to-manufacture drug. 

Rothblatt founded United Therapeutics in 1996 after her daughter Jenesis was diagnosed with life-threatening pulmonary arterial hypertension. (Richard Greenhouse Photography)

With no background in biotech, Rothblatt pursued a PhD in medical ethics even as she worked, at great personal cost and expense, with pharmaceutical scientists to develop treprostinil into a drug. The Food and Drug Administration (FDA) ultimately approved the drug, Remodulin, in 2002.

“I gave one doctor the money he said he needed to make it, and he finally produced half a gram,” Rothblatt told the audience. “But we needed dozens of grams for animal studies and, ultimately, hundreds of kilos to help people across the country. So we put the pedal to the metal.”

Today, Rothblatt’s company, United Therapeutics, annually produces enough drugs for tens of thousands of patients, including her daughter, who can now live out their lives beyond the three-year life expectancy once given at diagnosis.

“We’ve never turned away a patient who can’t pay,” she said. “We will give that medicine to them for free. It hasn’t stopped us from being a successful pharmaceutical company—we’ve found that doing the right thing helps you do the best thing.”

Actor William Shatner appeared as a surprise guest. (Richard Greenhouse Photography)

In a special appearance, actor William Shatner said that though science fiction can lay the groundwork for the future, progress is not always made with computer wizardry and bubbling test tubes. He spoke about recently witnessing an unusual and unexpected experiment in progress.

“We write and we think about all these highfalutin futuristic things that are going to take place, but buried in the basement of a small building in Philadelphia there are dogs sniffing for cancer in vials of blood,” he said. “It has nothing to do with the future as imagined by a show called 'Star Trek.'”

Vint Cerf, Google's "chief internet evangelist," made some predictions about the "internet of things." (Richard Greenhouse Photography)

Google’s Vint Cerf described how the genesis of the internet was, at heart, a bottom-up enterprise. The network was built to satisfy a military defense agency that needed a cost-effective communications system compatible with a range of computer brands, and Cerf said four decades of its evolution shed some light on what is yet to come.

“The thing you carry in your pocket once took an entire van to do,” Cerf said, holding up a cell phone. “Now we’re faced with a new invasion, devices you wouldn’t expect to be part of the internet environment. I used to tell jokes that every lightbulb will have its own IP address. Well, now I can’t joke about that.”

In the current day, between 3 and 3.5 billion people use three to five devices every day, Cerf said, for a global total of 10 to 15 billion devices. Looking into a future where an “internet of things” connects humans and a host of objects, it’s completely reasonable, Cerf said, to predict that by 2036, the planet will have 8 to 10 billion users, and the average person will use or interact with around 100 devices per day, from phones to tablets to embedded sensors. That adds up to one trillion devices.
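Cerf's projection is straightforward multiplication: users times devices per person. The figures below come from his talk as reported above; the helper function and the round numbers chosen for the endpoints are my own illustration.

```python
# Sketch of the back-of-envelope device projection described above.
# Figures are from Cerf's talk; the helper is purely illustrative.

def total_devices(users: float, devices_per_user: float) -> float:
    """Estimate the global device count as users times devices per person."""
    return users * devices_per_user

# Today: roughly 3 to 3.5 billion people with three to five devices each,
# which brackets the quoted 10-15 billion total.
print(total_devices(3e9, 3), total_devices(3.5e9, 5))

# 2036: 8-10 billion users averaging about 100 devices each,
# i.e. on the order of one trillion devices.
print(total_devices(10e9, 100))
```

Taking the upper endpoint, 10 billion users at 100 devices each gives 10^12, matching the trillion-device figure in the talk.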

“We need to get smarter about how we use our resources,” Cerf said. “How we gather our data can really make a difference.”

To that end, he described Google’s ongoing projects using innovative sensing, from contact lenses that can measure a diabetic’s glucose level, to ingestible nanobots to diagnose disease from inside the body. Like the trucks used to test out network connectivity in the 1970s, Cerf suggested today’s cutting-edge technology only has room to shrink.

“3D printers today are large and clunky, but over time those printers could make smaller and smaller stuff,” Cerf said. “Maybe one day the 3D printers can print even smaller printers, eventually printing at the molecular level.”

And, of course, Google is working on making sure the internet works in space, too.

Alexandra Cousteau, an environmental advocate and granddaughter of Jacques-Yves Cousteau, spoke about the world's oceans. (Richard Greenhouse Photography)

In the year of the 40th anniversary of the Viking mission to Mars, Lockheed Martin’s Antonelli said today’s space missions are paving the way for the next steps, including an asteroid retrieval program and the Orion spacecraft, which will eventually take humans to Mars. (People took selfies all day with a quarter-scale replica of the Orion at the festival.)

In addition to the current missions surveying Mars, including the Mars Reconnaissance Orbiter, which takes its own surveys of the Martian surface as well as relays messages between Earth and the Martian rovers, there’s also Maven, a Martian atmospheric observatory, and Juno, which will arrive at Jupiter this summer to map the planet’s atmosphere and magnetic and gravitational fields.

Osiris-Rex (Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer) will launch this fall destined for the asteroid Bennu, Antonelli said. Close enough to reach, large enough to land upon, and old enough that it reflects the early composition of the solar system, Bennu is thought to hold the molecular ancestors of life on Earth, but also whizzes scarily close to our planet on a regular basis. The samples from the Osiris-Rex mission will help scientists plan for a possible impact intervention mission, and also help aspiring asteroid miners know what resources they might find.

Despite the fact that new space missions are popping up one after another, it’s today’s students who will one day be making the next big steps into space.

“Keep in mind, that the first person to go to Mars is in school today,” Antonelli said. “Well, maybe not today, since it’s a Saturday,” he added. 

In Hawaii, Old Buses Are Being Turned Into Homeless Shelters

Smithsonian Magazine

When we think of Hawaii, most of us probably picture surfers, shaved ice and sleek beach resorts. But the 50th state has one of the highest rates of homelessness in America. Due in large part to high rent, displacement from development and income inequality, Hawaii has some 7,000 people without a roof over their heads.

Now, architects at the Honolulu-based firm Group 70 International have come up with a creative response to the homelessness problem: turn a fleet of retired city buses into temporary mobile shelters.

“Homelessness is a growing epidemic,” says Ma Ry Kim, the architect at the helm of the project. “We’re in a desperate situation.”

Kim and her friend Jun Yang, executive director of Honolulu’s Office of Housing, came up with the idea after attending a disheartening meeting of Hawaii’s legislature. Homelessness was discussed but few solutions were offered.

“[Jun] just said, ‘I have this dream, there’s all these buses sitting at the depot, do you think there’s anything we can do with them?’” Kim recalls. “I just said ‘sure.’”

The buses, while still functional, have racked up too many miles for the city of Honolulu to keep using them. The architects envision converting them into a variety of spaces to serve the needs of the homeless population. Some buses will be sleeping quarters, with origami-inspired beds that fold away when not in use. Others will be outfitted with showers to serve the homeless population’s hygiene needs. The buses will be able to go to locations on the island of Oahu where they are most needed, either separately or as a fleet. The entire project is being done with donated materials, including the buses themselves, and volunteer manpower. Members of the U.S. Navy have pitched in, as have local builders and volunteers for Habitat for Humanity. The first two buses are scheduled to be finished by the end of the summer.

The blueprint for the shower-equipped hygiene bus comes from the San Francisco program Lava Mae, which put its first shower bus on the streets of the Mission District in July 2014. Kim hopes to “pay it forward” by sharing her group’s foldable sleeping bus designs with other cities.

“The next city can adopt it and add their piece or two,” Kim says. “There are retired buses everywhere. The missing part is the instruction manual on how to do this.”

The project comes on the heels of recent controversy about new laws preventing the homeless from sleeping in public. Proponents say the laws, which make it illegal to sit or sleep on Waikiki sidewalks, are a compassionate way of getting the homeless off the streets and into shelters. Critics say the laws are merely criminalizing homelessness and making life more difficult for Hawaii’s most disadvantaged population in order to make tourists feel more comfortable.

The needs of the homeless are varied. While a small percentage of the homeless are chronically on the streets, most are people experiencing difficult transitions—a loss of a house due to foreclosure, fleeing domestic violence, displacement by natural disaster. Increasingly, designers and architects are looking to fill these needs with creative design-based solutions.

In Hong Kong, the architecture and design group Affect-T created temporary bamboo dwellings for refugees and disaster victims. The dwellings are meant to sit inside warehouses or other sheltered spaces. Light and easy to transport and construct, the dwellings could be a model for temporary shelters anywhere in the world.

The Italian firm ZO-loft Architecture and Design built a prototype for a rolling shelter called the Wheely. The temporary abode looks like a large can lid, and opens on either side to unveil two polyester resin tents. The internal frame provides space for hanging belongings, and the tents, which stretch out like Slinky toys, can be closed at the end for privacy and protection from weather. Inventor Paul Elkin came up with a similar solution—a tiny shelter on wheels that unfolds to reveal a larger sleeping space.  

But temporary shelters don’t solve the problem of chronic homelessness. It’s increasingly understood that simply giving homeless people homes—a philosophy called Housing First—is more effective than trying to deal with the underlying causes of their homelessness while they’re still living in shelters. Housing First is also cost effective, since people with homes end up needing fewer social supports and are less likely to end up in prisons or emergency rooms. 

A number of cities are tapping into the mania for tiny houses as a more permanent partial solution. In Portland, Dignity Village is a permanent community of some 60 people living in 10-by-12-foot houses near the airport. The houses were built mostly with donated or salvaged materials, and residents share communal kitchens and bathrooms. The village was originally an illegal tent encampment, but the city granted the community land, which ensures houses are built to city code. Residents say the village grants them not just shelter and safety, but also privacy and autonomy. Unlike in homeless shelters, residents have a permanent spot and are allowed to live with partners and pets. Similar villages exist across the Pacific Northwest and California, with more springing up in other parts of the country. 

With homelessness on the rise in America—a recent U.S. Conference of Mayors survey of 25 cities showed homelessness had increased in nearly half over the past year—we'll certainly be in need of more design-inspired solutions, tiny, rolling, and otherwise. 

Vinegar-Like Acid Rain May Have Fallen During Earth’s Worst Extinction

Smithsonian Magazine

Roughly a quarter of a billion years ago, an apocalypse struck the Earth. Known as the Great Dying, it claimed more lives than any other mass extinction known to science, including the one that did in the non-avian dinosaurs 65 million years ago. Over 90 percent of all species on the planet were wiped out, from armor-clad trilobites in the oceans to giant reptiles on land. The host of strange creatures vanished, giving way to the ancestors of modern flora and fauna.

What caused the cataclysm has long been a subject of debate—theories range from an asteroid impact to methane-belching microbes. The most popular scenario starts with volcanoes in present-day Siberia, which erupted at about the right time to have kicked off a cascade of problems, including climate change. Now a team of researchers has found physical evidence that extremely caustic acid rain created by these massive eruptions could have played a part in the loss of life.

“For the first time, we can say that soils from this time had an acidity similar to that of vinegar,” says Mark Sephton, a geologist at Imperial College London whose team will publish the finding in February in the journal Geology.

Sephton and his colleagues examined traces of ancient soils in rock strata that date back to the extinction, which occurred at the end of the Permian period around 250 million years ago. At this time, all of the world’s landmasses were fused into the supercontinent Pangaea. The rocks, unearthed in what is now Northern Italy, contained a particularly intriguing substance: vanillin, the same molecule that gives vanilla its flavor and aroma.

Mark Sephton and study co-author Cindy Looy investigate the Permian-Triassic boundary in Italy's Butterloch Canyon. (Courtesy of Mark Sephton)

Vanillin is naturally produced by plants and is found in wood. But it shouldn’t survive long on its own in the ground, where bacteria release enzymes that break it down. Finding significant amounts preserved for hundreds of millions of years was even more surprising.

“It’s certainly unusual,” says Tim Bugg, a biological chemist at the University of Warwick who was not involved in the study. “To see vanillin accumulate probably suggests a lack of bacterial degradation activity.”

To explain the lethargy of the bacteria, the researchers turned to the dairy industry for inspiration. Milk producers often flavor their beverages by adding a dash of vanilla. Experiments have shown that acidifying milk protects the additive and prolongs the flavor, because the low pH deactivates the enzymes that would otherwise target vanillin.

Soil bacteria activity out in the wild could be similarly sensitive to acid, which would also explain why the Italian rocks contained relatively low amounts of a chemical called vanillic acid that tends to be produced by vanillin-munching bacteria. “Our data fits the idea that acid rain caused the microbes to cease functioning,” says Henk Visscher, a paleoecologist at Utrecht University in the Netherlands and a member of Sephton’s team.

Studies of acid rain produced in the 20th century, primarily by fossil-fuel-burning power plants, have shown that it can disrupt ecosystems. The poisonous precipitation strips nutrients out of the soil and damages plants. A loss of vegetation could have led to widespread erosion, Sephton speculates, and a shortage of food that made life difficult for creatures higher on the food chain.

A light micrograph image shows the tissue damage done to a spruce leaf by acid rain. (Science Photo Library/Corbis)

The finding is welcome news for Benjamin Black, now a geologist at the University of California, Berkeley. While at MIT he helped create a computer simulation that estimated the amount and severity of acid rain that could have been produced by the Siberian eruptions. “My hope when I was making that prediction was that we would find ways to test it,” says Black.

Published in 2013, the model suggested that carbon dioxide belched out by the eruptions could have lowered the pH of rain to about 4, the acidity of tomato juice. Add in sulfur dioxide, another common volcanic emission, and the acidity could have worsened a hundred-fold—the Northern Hemisphere could have been scoured by bursts of rain as acidic as undiluted lemon juice.
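The hundred-fold figure follows directly from the logarithmic pH scale: pH is the negative base-10 logarithm of hydrogen-ion concentration, so each one-unit drop in pH is a tenfold increase in acidity, and falling from pH 4 (tomato juice) to roughly pH 2 (undiluted lemon juice) multiplies the acidity by 100. A minimal sketch of that arithmetic:

```python
# pH is the negative base-10 log of hydrogen-ion concentration (mol/L),
# so acidity grows tenfold for every one-unit drop in pH.
def h_concentration(ph: float) -> float:
    return 10 ** -ph

# Compare the model's CO2-only rain (pH ~4) to rain worsened by sulfur
# dioxide (pH ~2):
factor = h_concentration(2) / h_concentration(4)
print(round(factor))  # 100 -- a hundred-fold increase in acidity
```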

“It can’t be a coincidence that vanillin turns up exactly at this time,” says Greg Retallack, a paleobotanist at the University of Oregon who was not involved in the research. But he cautions that this new and unfamiliar approach to studying ancient soils must be carefully scrutinized. Retallack also questions whether sulfur dioxide emissions from the Siberian volcanoes could have had such a global impact. The pollutant typically forms heavy aerosol particles that rain out of the sky, limiting how far it can travel.

The severe acid rain proposed by Sephton’s team could instead have been the work of a smaller eruption close to the site studied, suggests Retallack. Another possibility is that, in certain conditions, microbes can produce sulfuric acid and acidify their environments all by themselves. In either case, the plunge in soil pH would have been limited to the region.

Bolstering the case for a worldwide acid rain epidemic may require looking farther afield. Traces of ancient soils dating back to the Great Dying have turned up not only in Italy but also in places such as China and Greenland. Future studies could test whether these rocks also contain a hint of vanilla.

Can This Toilet Save Millions of Lives?

Smithsonian Magazine

Globally, you might say that there is one household item that separates the haves from the have-nots. Of the more than 7 billion people populating the Earth, 2.5 billion don't have access to a toilet. In these regions, where clean water is scarce, easily preventable diseases, such as typhoid and cholera, are full-blown epidemics. Each year, as many as 1.5 million children die because of poor sanitation.

For these impoverished communities, concentrated mostly in parts of South Asia and Africa, sewage plants simply aren't an option. Families are often forced to use contaminant-ridden alternatives like latrine pits (essentially a dug-out hole in the ground) or simply resort to defecating out in the open. So for toilets to be practical, they need to be not only self-powered and waterless, but also affordable for families that make as little as a dollar a day. To that end, the Bill and Melinda Gates Foundation launched the "Reinvent the Toilet Challenge," a competition which, in 2012, awarded a team of researchers $100,000 to develop a prototype capable of solving one of the most dire health crises in the developing world.

The winners, a group of engineers working out of the California Institute of Technology, have now embarked on a crucial trial run of their design. In December, a couple of test toilets were shipped to India and installed at public restroom facilities at Mahatma Gandhi University in Kerala and in Ahmedabad. In March, the prototype in Kerala will be moved to Delhi, where it will be demonstrated at a toilet fair.

Interestingly enough, the concept the Caltech team ultimately came up with isn't waterless. In fact, it operates just like a conventional toilet. "We went with a conventional flush toilet because, after testing different designs, we found people generally prefer those," says Clement Cid, a PhD student who worked on the project. "This is true even in the developing societies."

In practice, the toilet system—a self-contained combination of a toilet and a sewage system—works similarly to what's found in the small-scale septic tank sewage systems popular in rural areas of the United States. A simple flush and the feces is sent to a holding chamber where it's put through a high-tech sanitation process that eradicates infectious, disease-causing bacteria.

The challenge now is to figure out how to bring down the $1,200 price tag of what's essentially a portable, self-contained sewage treatment system. (Caltech)

The most noticeable difference between this new design and standard toilets is the addition of a roof-mounted photovoltaic panel. The panel powers the whole sanitation process by supplying energy to a biochemical reactor located beneath the floor that's engineered to purify the waste through the use of electrodes. As feces and urine pass through this chamber, an electrochemical reaction between the anode and cathode (think batteries) breaks down the matter into separate components, such as hydrogen, fertilizer and clean water. Another mechanism filters the waste, diverting the hydrogen toward a compartment that stores it as energy in fuel cells. The fertilizer is collected for farming purposes, while the remaining water is pumped back into a reservoir so it can be reused.

“It's an entirely closed-loop system,” Cid explains. “And whereas septic toilets treat the waste only partially, the water that we recycle is totally safe, without any contamination.”

The challenge now is to figure out how to sufficiently bring down the $1,200 price tag of what's essentially a portable, self-contained sewage treatment system. The figure doesn't include other expenses, such as maintenance and repairs should the toilet break down. When connected to the grid, operating the toilet runs about 11 cents a day, more than twice the foundation's stated goal of delivering a technology that costs 5 cents a day. Though it still doesn't sound like a lot, imagine spending 10 percent of your income just to use a toilet. Hooking the system up to a rechargeable battery would raise that number even more since the energy storage units would need to be replaced every so often.

The team's goal, for now, is to devise a method for manufacturing electrodes that function at the same efficiency, but at half the price. Much of this, Cid says, would involve cutting deals to source the materials locally. The team is also exploring a redesign that would make the toilet system more compact, requiring fewer materials. Another possibility is figuring out a way to tap into the fuel cells, a potential source of energy.

"We've built a top of the line BMW and the goal is to provide a very low end Tata Nano car," says Michael Hoffman, an engineering professor who leads the project. "We are currently exploring manufacturing options. Next week, I will be visiting potential manufacturing partners in China."

One approach, which the engineers have discussed with the Bill and Melinda Gates Foundation, lies on the business end. The proposed strategy involves initially marketing the commodes to middle and upper-middle class families in Asian countries, who tend to be receptive to the idea and also able to afford installation. The expectation is that as mass production gradually ramps up, manufacturing costs are driven down.

But the solar-powered toilet isn't without its critics. In an editorial published by the New York Times, Jason Kass, an environmental engineer and founder of an organization called Toilets For People, points out some of the flaws inherent to efforts that seek to apply highly-sophisticated technologies to the problems of people with scarce resources.

He writes:

Just imagine the fate of a high-tech toilet in one of these communities. What happens if the unique membrane systems get clogged? Or if the supercritical water vessel or the hydrothermal carbonization tank leaks, or worse, explodes? Or what if one of the impoverished residents realizes the device is worth more than a year's earnings and decides to steal it? If the many failed development projects of the past 60 years have taught us anything, it's that complicated, imported solutions do not work.

Treehugger's managing editor Lloyd Alter slams the latest version of the toilet as an entirely misguided effort that, above all, disseminates some of the West's most mistaken ideas on sanitation. These include sitting on a toilet rather than squatting, a choice that can cause more strain in the bowels, and placing toilets inside of washrooms. He contrasts this with toilets in Japan, which are more sanitary since they're located in a separate room. And as Kass mentions, servicing such a complicated system would require trained specialists that these poor communities cannot afford.

Nonetheless, Hoffman believes that with the way new technologies tend to progress, these kinds of toilets will be practical in the long run. He uses Apple as an example. When Apple first introduced the touchscreen smartphone just seven years ago, it cost at least $600. Early this week, software developers from Mozilla unveiled a version for developing markets that debuts at just $25. "The costs were once prohibitive for the poor, but now are attainable," he adds.

The practicality of solar-powered toilets in impoverished communities that severely lack resources should become more clear within the next few years. In 2015, the Caltech team plans to test newer prototypes in small communities in five countries, most likely India, China, Thailand, Cambodia and Peru. Mass scale production of at least 1,000 toilets is slated for as early as 2016.

Our Top Ten Stories of 2016

Smithsonian Magazine

It’s not the first time Americans have taken to social media to celebrate the end of a uniquely horrible year—though by some accounts 2016 does seem to have been especially difficult. Yet the top stories on Smithsonian.com prove there’s reason to hope. We’ve provided continuous coverage of the Institution’s newest museum, the National Museum of African American History and Culture, and brought a historical perspective to the 2016 election (such as with this story about Susan B. Anthony’s grave). Whether you’re revisiting the site’s best work on history and science, or just want to brush up for end-of-the-year trivia, here are the 10 most-read stories from 2016.

1. The True Story of the Free State of Jones

Newton Knight probably isn’t a household name outside of Mississippi, but the 2016 film Free State of Jones brought his story to a wider audience. Knight was one of a group of white Southerners who waged a guerilla war against Confederate troops, founding a free state in Jones County. Eventually Knight went on to marry his grandfather’s former slave, Rachel, and have children with her. But Knight’s legacy in Mississippi is far from universally acclaimed, showing the complicated history of race relations in the South. To get the story, author Richard Grant braved spiders, snakes, and the complicated feelings of residents of Jones County.

2. Deep in the Swamps, Archaeologists Are Finding How Fugitive Slaves Kept Their Freedom

The Great Dismal Swamp once spread across 2,000 square miles of Virginia and North Carolina, and it was a place of hope despite its name. Archaeologists tromping across the sodden wildlife refuge have found traces of cabins, tools, clay pipes and weapons—all evidence of the runaway slaves and Native Americans who once lived there in free communities. The story revealed a new side of slavery, one in which African-Americans were featured as their own redeemers, and it was shared widely, including by the Southern Poverty Law Center.

3. A Secret Tunnel Found in Mexico May Finally Solve the Mysteries of Teotihuacán

When archaeologist Sergio Gómez happened upon a lengthy tunnel beneath the Temple of the Plumed Serpent in Teotihuacán (a Mesoamerican city at the edge of the Mexican Plateau), he hoped it might illuminate the history of the mysterious ruins. His discovery has produced dozens of relics and even an underground room whose ceiling is studded with glowing rocks that look like stars. To capture the experience of being inside the tunnels, writer Matthew Shaer, a former staff writer for the magazine, ventured into the dark, narrow tunnels being held up with scaffolding; there had been two partial collapses already.

4. How Tuberculosis Shaped Victorian Fashion

Tuberculosis was an epidemic in 19th-century Europe, with profound and sometimes surprising impacts on society—including on fashion. With victims becoming pale and wasting away before dying, the disease actually enhanced aspects already thought of as beautiful in women: sparkling eyes and rosy cheeks from fever, delicate skin and thinness.

5. What’s the Difference Between England, Britain and the U.K.?

Following the decision of U.K. voters to leave the European Union, a fair number of questions arose concerning what, exactly, counted as the United Kingdom. This story dives into the complicated history of the sovereign state of the United Kingdom, versus the British Isles, versus the larger Commonwealth Realm. Reporter Erin Blakemore and editor Jackie Mansky found graphics that broke the divisions down to cover the story without having to speculate how the split would play out—a lucky decision since there’s still no consensus on what the future will hold.

6. How 43 Giant, Crumbling Presidential Heads Ended up in a Virginia Field

What began as an American-themed sculpture park, filled with busts of 43 presidents, quickly turned into something out of a horror film. The tourist attraction known as “Virginia’s Presidents Park” went bust in 2010 after years of lackluster attendance. Today the sculptures are stashed on a private farm.

7. Newly Discovered Letters Bring New Insight Into the Life of a Civil War Soldier

In 2015, a postal worker in Michigan received a mysterious collection of letters, which turned out to have survived since the Civil War. The letters detail a young Union soldier’s experience in the Civil War, providing new insight into the lives of young men who enlisted for the war. The story continued to unfold when we uncovered the identity of the person who sent the letters to Michigan, a story you can read about here.

8. The White House Was, in Fact, Built by Slaves

Remember when First Lady Michelle Obama, in her speech at the DNC, exhorted her fellow Americans to celebrate the country’s progress from slave labor to an African-American family living in the White House? Her claim that the presidential mansion was built by slaves was true; Congress even put together a research task force in 2005 to explore the subject. This article further explores the dark history behind the White House.

9. Understanding the Controversy Behind the Dakota Access Pipeline

Throughout the fall and winter of 2016, protests against the Dakota Access oil pipeline have spurred numerous discussions over U.S. energy policy and Native American rights. This story is a primer on the pipeline and the surrounding political issues, though it might be helpful to get an update on the most recent developments.

10. Inside America’s Auschwitz

Louisiana’s Whitney Plantation is, amazingly, the country’s first slavery museum. Following a 15-year restoration effort, the museum now includes the plantation home, an overseer’s home, a blacksmith’s shop and replica slave cabins. Unlike rosy narratives such as Gone With the Wind, this museum is meant to emphasize the brutality and horror of life for slaves and leave visitors with the conclusion that racial injustices didn’t disappear at the end of the Civil War.

To Fight Deadly Dengue Fever in Humans, Create Dengue-Resistant Mosquitoes

Smithsonian Magazine

There’s a reason this tropical disease is known as “breakbone fever”: To its victims, that's how it feels. Dengue fever can cause such severe muscle and joint pain that it can be excruciating for an infected person to even move. It can also cause burning fever, delirium, internal bleeding and even death as the body attempts to fight off the disease. There is no effective treatment, and there won’t be one anytime soon.

Nevertheless, new research has identified a hope for stemming the epidemic—and it lies in genetic engineering.

Dengue virus, which is passed on by the same Aedes aegypti mosquito that spreads Zika, has been plaguing humans since at least the late 1700s. But in the past few decades, skyrocketing human population and increased urbanization—particularly in warm, moist regions like South America, Southeast Asia and West Africa—have fueled a growing number of cases. Like the Zika virus, dengue has no symptoms for the majority of those who contract it (roughly three-quarters). But nearly 100 million people annually do develop at least some of its dangerous and excruciating symptoms—and roughly 20,000 of those die each year.

Even if you do survive dengue fever, you aren’t out of the woods yet. In fact, overcoming the disease once actually makes you more likely to die if you contract a different strain later. That’s because the various types of the virus appear so similar on the surface that the immune system will often respond using the same antibodies it developed to fight the last bout. But these are ineffective against the new strain. Moreover, the immune system’s efforts to fight the virus can attack the body instead—causing hemorrhaging, seizures and even death.

So far, preventing the spread of dengue has mostly taken the form of old-fashioned mosquito warfare: nets, insecticide and draining still water, where mosquitoes like to breed. In 2015, researchers finally developed a partially effective dengue virus vaccine, which was green-lighted in three countries. But the vaccine only reduced chances of getting the virus by 60 percent in clinical trials, and because of the risk of developing antibodies, some experts think it may only be safe for people who have already survived an infection.

Today the vaccine is only being used in limited quantities in the Philippines. "There is really an urgent need for developing new methods for control," says George Dimopoulos, a Johns Hopkins University entomologist who studies mosquito-borne diseases like malaria and dengue.

Instead of focusing on how people get infected with dengue, Dimopoulos has turned his efforts to how mosquitoes themselves contract the virus. Usually, the virus makes its home in a mosquito after the insect bites an infected human; it rarely passes between mosquitoes. So theoretically, by figuring out how to block that infection from ever occurring, you could effectively eliminate dengue virus, Dimopoulos says.

In a study published today in the journal PLOS Neglected Tropical Diseases, lead author Dimopoulos explained how that would work. Using genetic engineering, he and his team manipulated two genes that help control the immune system of the Aedes aegypti mosquito, which most commonly spreads dengue. The manipulated genes caused the mosquitoes' immune systems to become more active when the bugs fed on blood, which is when they contract dengue virus. This stimulation made the mosquitoes significantly more resistant to the different types of dengue virus.

"This impressive body of work is an important step forward in understanding mosquito-[dengue virus] immunology," says University of Melbourne dengue researcher Lauren Carrington, who was not involved in the study.

However, Dimopoulos says this breakthrough is just the first step. While the mosquitoes in his study became roughly 85 percent more resistant to some types of dengue virus, other types were much less affected by the genetic engineering. Furthermore, the manipulation didn't seem to create any significant resistance to the related Zika and Chikungunya viruses that Aedes aegypti also spreads.

Dimopoulos hopes to fine-tune the method to make it more effective. While genetic engineering comes laden with controversy, he points out that his technique doesn't introduce any foreign genes into the mosquitoes; it simply manipulates the ones they already have. Eventually, he hopes to create mosquitoes that will be resistant to multiple tropical diseases. He also wants to take advantage of "gene drive" technology, which enhances the chances of a certain gene to be passed to offspring, to allow the genetically modified mosquitoes to quickly become dominant in any environment they're released into.

This isn’t the first time researchers have played with mosquitoes’ genes in an attempt to halt the spread of disease. The British biotechnology company Oxitec has worked to modify the genome of the Aedes aegypti mosquitoes to make males that produce dead offspring after mating. Brazil has already partnered with the company to release billions of these mosquitoes into the country, in hopes of suppressing the population of disease-spreading mosquitoes. The company has also worked to get approval to release its mosquitoes in other places, including India, the Cayman Islands and the Florida Keys, where Zika fears drove voters to approve a trial in a ballot measure last year.

Oxitec's methods are effective in the short term, Dimopoulos says. But eliminating the mosquito population from an area will not make it mosquito-free permanently, because mosquitoes from other areas will eventually fill the empty niche left behind. Authorities will be forced to regularly release more genetically modified mosquitoes to keep their population numbers suppressed, Dimopoulos notes—a costly method that would appeal to biotech companies like Oxitec.

Replacing the wild mosquitoes with live but resistant mosquitoes, however, will act as a lasting barrier to spreading tropical diseases, Dimopoulos says. Before we get there, though, he says he wants to work on upping the resistance of the mosquitoes to dengue, as well as making them resistant to other types of tropical diseases. Then, he’ll need to do trials in greenhouses and on islands to see if the resistance works outside the lab.

He doesn't expect any widespread releases of mosquitoes for another decade, but points out that 10 years is a small wait overall. "It's not going to happen quickly," Dimopoulos says, "but we have to remember that these diseases have been with us for a very long time."

There's no humane way to test in the lab whether or not humans will contract dengue less often from these mosquitoes, Dimopoulos says. As a result, we'll only know for sure how effective the gene manipulation is once the mosquitoes have been released. But even if they don't work as well outside the lab, Dimopoulos has no regrets about blazing new trails to combat tropical illnesses.

"The fight against these diseases is like a war," Dimopoulos says. "You can't win it with one weapon."

The Inventor of Air

Smithsonian Magazine

Joseph Priestley is best known for discovering oxygen, but Steven Johnson, author of a new biography of Priestley titled The Invention of Air, points out that his contributions were much larger: he was the first ecosystems thinker, almost 200 years ahead of his time. Priestley was best friends with Benjamin Franklin, wrote about major scientific discoveries in popular literature, and was highly revered by George Washington, Thomas Jefferson and John Adams.

Johnson’s previous books have covered everything from the impact of popular culture on neuroscience to the 19th-century cholera epidemic in London. Smithsonian’s Bruce Hathaway spoke with Johnson about his discoveries in The Invention of Air.

People who recognize the name Joseph Priestley think of him as the discoverer of oxygen. But you say that emphasis completely misses his most important contributions, one of which was the discovery of how plants sustain other life on earth.

The work with oxygen is the one thing I knew about him. And it’s the first line in his biographies everywhere you look. But it’s not entirely true. He wasn’t really the first. Carl Scheele was probably the first. And Priestley was messed up in his understanding of oxygen. Ultimately it was Antoine Lavoisier, in part building on Priestley’s thinking, who got it right about oxygen. It’s possible that if Priestley had not been so much of a polymath, he would have fully understood oxygen. But Priestley was not a systematic thinker. He was a great experimentalist and was incredibly clever at devising these experiments and coming up with novel data for people to wrestle with. But he was never particularly gifted at taking the crazy things that he would discover and turning them into a systematic theory of the world. He was interested in finding these weird puzzles and letting other people solve them.

I think one of the things we have to recognize is that there are two kinds of minds in revolutionary science, science that changes the world. There are people who are really good at exploding the existing paradigm, and then there are people who, once the old paradigms are exploded, are good at sorting. Priestley was the former, not the latter. Science needs both kinds of minds.

And you say that Priestley’s great discovery [of oxygen?] was quite a coincidence?

There were a bunch of interesting accidents in Priestley’s scientific life. One of the big ones was that he happened to move in next to a brewery. He was ever inquisitive, so he went over to check out what they were doing. He noticed that there were interesting gases coming up from the big vats of beer they were brewing, so he asked these guys if he could do some science experiments. What an incredible image: a weird scientist wanting to do experiments over beer.

And due to that fiddling, Priestley invented soda water?

Yes. Just by pouring water back and forth over these vats he noticed that it had a delightful fizzy flavor. So he got interested in gas in part because of this. Priestley’s brother said that when Joseph was about 11 years old he trapped spiders and mice in little jars and waited to see how long it would take for them to die. So Priestley had long known that if you take a closed, sealed vessel and put an animal in there, after a certain amount of time it will use up all the air and die. But it wasn’t understood why that was happening or what was happening. Were they adding something to the air that was poisoning it? Were they taking something from the air? No one knew.

Steven Johnson, author of a new biography of Joseph Priestley titled The Invention of Air. (Nina Subin)

Joseph Priestley suffocating mice and spiders just sounds sadistic. How did any scientific good come out of that?

Priestley had another idea which, as far as we know, no one had really looked into: What happens with a plant in that jar? How long would it take for a plant to die? The assumption was that the plant would die; a plant’s another kind of organism. So he took this little sprig of mint out of his garden—and basically all of his nature experiments were done with things that were just around the house, a laundry sink he’d borrowed from his wife and glasses he’d get out of the kitchen. So he puts this mint plant in, isolates it, and sits around, and it doesn’t die. It just keeps growing and growing, and he thinks, hmm, this is interesting.

How did Franklin get involved?

Once [Priestley] decided he’s got something, one of the first people he writes to is Franklin. We don’t have the letter that he writes to Franklin, but we have the letter that Franklin writes back. It’s one of these wonderful things because you have really direct evidence of this conversation that changed the way we think about the world. What Franklin does is he takes the experiment from this very local problem to the global level, in a brilliant way.

It seems from the historical record that Franklin really contributed this insight to Priestley’s little experiment. What Franklin says is that this sounds like a rational system, and it’s probably one that exists all across the planet. There must be some way for the Earth to continue to heal itself, to purify the atmosphere. He says it is probably something that is happening everywhere, and plants are probably cleaning up the air for us so that we can breathe clean air.

You write that Priestley’s thinking about religion had a major influence on Jefferson. How so?

Priestley did not believe in the divinity of Jesus. He didn’t believe Jesus was God’s son, and didn’t believe in the Holy Ghost and all that. That’s the founding precept of Unitarianism: that there is one God and there’s a wonderful voice of God’s vision on earth, but that that person is not the son of God. Priestley felt that, rather than worshiping shrines and saints and resurrections, the clearest evidence of God’s work on earth was this tremendous advance of the Enlightenment.

What did Priestley see as the most important part of Christianity, and how did his views have such a major effect on Jefferson’s?

He thought the essence of Christ’s message was progressive, in the sense of doing unto others as you would have them do unto you. He was an early opponent of slavery and things like that. He was a great believer in tolerance. These views had a huge impact on Jefferson. Jefferson famously created the Jefferson Bible, in which he went through the Bible and eliminated all the parts that were basically supernatural and not Christ’s moral system. And he did that based almost entirely on Priestley’s book, An History of the Corruptions of Christianity.

Because of Priestley’s religious writings and his political views, such as his support of the French and American revolutions, mobs destroyed Priestley’s house and would have killed him if they’d had the chance. So he emigrated to America. How was he received here?

He was greeted as a hero. He had tea with Washington a couple of times, and Adams and Jefferson referred to Priestley 52 times (to Franklin only five times and to Washington only three) in the famous exchange of letters at the ends of their lives. The founding fathers’ intellectual makeup was such that it was impossible for them to imagine separating the insights and understandings of science and technology—they were also very, very interested in technology—from their views of society and their politics. They understood that all those things were connected in all kinds of immensely interesting ways.

You write that Priestley’s views were important to the founding fathers. How so?

In some ways their vision of progress and their belief in the possibility for change, for improvement of human society, had come out of the progress that they had seen and that Priestley had celebrated so much in his writings on scientific and technological advancements over the preceding century and a half. So the idea was that if we can understand so much about the world and about electricity and about air and all these different new fields, why can’t we apply that same kind of rational, empirical method to the organization of human society? One of the messages of this book is that this kind of thinking wasn’t just a dalliance that the founding fathers had on the side but rather that their worldviews were thoroughly infused with the march of science, and that their political views came partly out of that tradition.

Are Megacities Friend or Foe in the Fight Against Climate Change?

Smithsonian Magazine

It’s the age of the city. Today more than half the world’s population can be found in cities, and megacities—those with populations of 10 million or more—are on the rise. The world’s largest megacity, Tokyo-Yokohama, joins two cities and multiple Japanese prefectures to cover 5,200 square miles, and it houses a population of some 37.5 million individuals.

For many people, cities offer economic, educational or social opportunities not available in smaller, rural environs. Urban lifestyles can also have environmental advantages when public transportation replaces long trips in gas-guzzlers and residents are tucked into smaller, more efficient housing. But are modern megacities a blessing or a burden when it comes to climate change? While greenhouse gas emissions on a per capita basis can be lower in dense urban areas, cities are still responsible for 70 percent of emissions worldwide, according to the United Nations Human Settlements Programme.

"[They] are the source of the problem and the source of the solution," says Patricia Romero Lankao, who leads the Urban Futures initiative at the National Center for Atmospheric Research in Colorado. Cities are poised to tackle the problems of climate change because they have economies of scale that promote efficiency, as well as research centers, grassroots movements and opportunities for invention and innovation, Romero Lankao notes. “It’s like every human being—we have good and bad, and cities are the same.”

Combating climate change is ultimately in a city's best interest—sheer size can’t protect megacities from direct impacts such as rising temperatures and extreme events like hurricanes and droughts. Already warmer than the countryside because of the urban heat island effect, cities in tropical and sub-tropical areas—such as Mumbai, Bangkok and Lagos—could soon become too hot to handle, especially for those who cannot afford air conditioning. “Cities may become more uncomfortable spaces to live in the future,” says Alex de Sherbinin, a geographer at Columbia University’s Center for International Earth Science Information Network.

Coastal megacities face additional threats from sea level rise, which is caused by ocean water expanding as it heats up, plus the additional volume from melting glaciers and ice sheets. Though the rise has been slow, already coastal flooding has increased, particularly during hurricanes and storms. “That’s why you see a city like New York was at risk from [Hurricane] Sandy,” notes Romero Lankao. By 2100, between 0.2 and 4.6 percent of the global population—potentially hundreds of millions of people—could experience annual floods, according to a study published earlier this year in the Proceedings of the National Academy of Sciences.

Image by John Van Hasselt/Corbis. Flooding is not uncommon in Jakarta, Indonesia (pop: 29,959,000), the second-largest megacity in the world. But it could get even worse. The country’s National Council on Climate Change warned in 2013 that, unless action is taken soon, half the city could be underwater by 2030 due to climate change.

Image by FRANCIS R. MALASIG/epa/Corbis. The megacity of Manila in the Philippines (pop: 22,710,000) regularly floods during monsoon rains and extreme weather events, such as 2009's Typhoon Ketsana, which dumped a month's worth of rain in less than a day. But Manila faces other challenges that may be exacerbated by climate change, including declining aquifers and high levels of air pollution.

Image by Markus Hanke/www.MarkusHanke.de/Corbis. Most of the lakes and rivers around Shanghai, China (pop: 22,650,000) are already heavily polluted, and the most recent Intergovernmental Panel on Climate Change report warns that the city is likely to face decreased water availability for its growing populace.

Image by Kevin Downs/Corbis. Hurricane Sandy gave the New York metro area (pop: 20,661,000) a wake-up call regarding the dangers of extreme weather events. The storm flooded large swaths of coastal communities and drowned tunnels and subway stations. Two years later, the region is still recovering, but Sandy has spurred the development of one of the world’s most ambitious plans for adapting to climate change.

Image by Kenneth Garrett/National Geographic Society/Corbis. Climate change has brought more heat, drought and flooding to Mexico City (pop: 20,300,000), a megacity already stressed by rapid growth, pollution and the overexploitation of its water resources. But the city has begun to tackle its challenges by modernizing its water treatment system and working to improve air quality and energy efficiency.

Image by Frederic Soltan/Corbis. More than half the residents of Mumbai, India (pop: 17,672,000) live in slums, often located in low-lying areas prone to flooding. A devastating flood in 2005 killed some 5,000 people, and since then the government has worked to improve river flow and flood defense systems. However, there are gaps around the slums that leave them still vulnerable.

Image by GEORGE ESIRI/X00996/Reuters/Corbis. Lagos, Nigeria (pop: 12,549,000) is one of the world’s fastest-growing megacities. It has already experienced flooding in its low-lying slums. An artificial island, Eko Atlantic, is now being constructed to provide sanctuary from rising seas—but only for those wealthy enough to afford it.

Image by Antonio Lacerda/epa/Corbis. Rio de Janeiro, Brazil (pop: 11,723,000) is experiencing higher temperatures and extreme rain events that have spawned flooding and landslides. The city has also been dealing with epidemics of dengue fever, a mosquito-borne disease that could spread as climate change increases temperatures and precipitation in some places.

Then there are the indirect effects. Weather-related disasters, such as drought, floods and hurricanes, temporarily displaced some 20.6 million people in 2013. When such events strike rural regions, they can send thousands of people streaming into cities, where supply systems are usually more reliable, de Sherbinin notes. But that can add pressure on a megacity if water, food, electricity and other resources are already limited. Some worry that such shortages could even spark violence and rebellion in cities on the edge.

Some projects meant to alleviate the consequences of climate change have already had unintended effects, driving even more people into the world’s megacities. More than 300,000 people have been moved to make way for China’s South-North Water Transfer Project, for example, which is meant to lessen water shortages in the arid north of the country. And that’s on top of more than a million people who had to make way for the Three Gorges Dam.

“There’s recognition that action needs to be taken, but it’s not always effective,” says de Sherbinin. Particularly in poorer nations, “they’re not going to be all that concerned about the slum dwellers who are going to be affected by major events.”

How a megacity manages the challenges presented by growth and climate change may matter more than size or even wealth, says Romero Lankao. Money certainly helps—Tokyo has far more resources for climate change adaptation than somewhere like Lagos. But other factors, such as social networks and government response, can also help a megacity prepare for climate threats.

“The best adaptation is mitigation,” Romero Lankao says. But megacities can take action in other ways to reduce their vulnerabilities, especially among the poorest. Bangladesh, for instance, has worked over the last few decades to improve the country’s disaster preparedness, reducing tropical cyclone-related deaths from hundreds of thousands in a single storm to less than 200 after a 2013 typhoon.

Climate change carries plenty of uncertainty, even for megacities. “There will be surprises,” Romero Lankao says. But inaction could come at too high a price, she warns. “If we don’t act now, we will lament our lack of action later.”

Have Scientists Found a Way to Pop the Filter Bubble?

Smithsonian Magazine

We like to believe that every visit to Google is a search for knowledge, or, at least, useful information. Sure, but it's also an act of narcissism.

Each time we retrieve search results, we pull out a virtual mirror that reflects who we are in the Web world. It's what Eli Pariser aptly described as the "filter bubble" in his 2011 book, The Filter Bubble: What the Internet Is Hiding From You.

Pariser laid out the thinking behind algorithmic personalization. By meticulously tracking our every click, Google—and now Facebook and a growing number of other websites—can, based on past behavior, make pretty good guesses about what we want to know. This means that two people doing exactly the same search can end up with very different results.

We're fed what we seem to want, and since we're more likely to click on stuff within our comfort zone—including ads—Google and others are motivated to keep sharpening their targeting. As a result, the bubbles we live in are shrinking.

There's a price for all this precision, as Pariser pointed out in an interview with Brain Pickings' Maria Popova:

"Personalization is sort of privacy turned inside out: it’s not the problem of controlling what the world knows about you, it’s the problem of what you get to see of the world."

The bigger picture

So we're trapped in a maze of our own making, right?

Not necessarily, thanks to a team of scientists who say they may have come up with a way to escape the constraints of algorithms. As the MIT Technology Review reported recently, Eduardo Graells-Garrido at the Universitat Pompeu Fabra in Barcelona and Mounia Lalmas and Daniel Quercia at Yahoo Labs have developed what they call a "recommendation engine," designed to expose people to opposing views.

One key, say the researchers, is that those views come from people with whom we share other interests. That seems to make us more receptive to opinions we'd otherwise likely dismiss as folly. The other is to present opposing views in a visual way that makes them feel less foreign.

To that end, the scientists used the model of a word cloud, which allowed study participants both to see what subjects they tended to tweet about most often, and also to have access—in a visually engaging way—to content from others whose own word clouds mentioned many of the same topics.

But what if some of that content reflected a very different political view? Would people instinctively reject it?

To put their theory to a proper test, the researchers connected people on opposite sides of an issue that evokes deeply personal feelings—abortion. They focused on thousands of active Twitter users in Chile who had included hashtags such as #prolife and #prochoice in their tweets, creating word clouds for them based on terms they used most frequently.

Then, they provided study participants with tweets from people who had many of the same terms in their word clouds, but who also held the opposite view on abortion. The researchers found that because people seemed to feel a connection to those who had similar word clouds, they were more interested in their comments. And that tended to expose them to a much wider range of opinions and ideas than they would have otherwise experienced.
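The matching step described above can be sketched as a simple term-frequency comparison: build each user's "word cloud" from their most frequent terms, then score the overlap between clouds. This is a hypothetical illustration of the general technique, not the researchers' actual code; the function names and the similarity threshold are invented.

```python
from collections import Counter
import math

def word_cloud(tweets, top_n=50):
    # A user's "word cloud": their most frequently used terms
    counts = Counter(word.lower() for tweet in tweets for word in tweet.split())
    return dict(counts.most_common(top_n))

def similarity(cloud_a, cloud_b):
    # Cosine similarity between two term-frequency clouds
    shared = set(cloud_a) & set(cloud_b)
    dot = sum(cloud_a[t] * cloud_b[t] for t in shared)
    norm_a = math.sqrt(sum(v * v for v in cloud_a.values()))
    norm_b = math.sqrt(sum(v * v for v in cloud_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommend(user_cloud, candidate_clouds, threshold=0.3):
    # Suggest users who share vocabulary, regardless of their stance
    return [name for name, cloud in candidate_clouds.items()
            if similarity(user_cloud, cloud) >= threshold]
```

On this toy model, two users tweeting #prolife and #prochoice would still be matched if the rest of their vocabularies overlap heavily, which is the property the study exploited.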

In short, the researchers used what people had in common to make them more open to discussing ways in which they differed. They had, their paper concluded, found "an indirect way to connect dissimilar people."

So, there's hope yet.

Madness to the method

Here are other recent developments in the sometimes bizarre world of algorithms.

  • Nothing like automated "Warm personal regards": This was probably inevitable. Google has just received a patent for software that would keep such close track of your social media behavior that it will be able to provide you with a choice of possible reactions to whatever comments or queries come your way on Facebook or Twitter. If, for instance, a friend gets a new job, the software would suggest a response, presumably something such as "Congratulations." That's right, you wouldn't have to waste any of your brain power. The algorithm will do it for you.
  • Phone it in: Researchers at the University of Helsinki have developed algorithms for determining how people get around--walking, driving or taking the bus or subway--by tracking the accelerometer signals of their cell phones. That allows them to analyze the frequency of their stops and starts. The researchers say it could be a powerful tool in helping planners understand how people move around in their cities.
  • All the news that fits: Facebook has tweaked its "news feed" algorithms so that more actual news will start showing up there. The idea is to give greater exposure to links to articles from news organizations on Facebook feeds—which will help make the social media giant more relevant to what's going on in the world besides friends' birthdays. The speculation is that this is an effort by Facebook to challenge Twitter's dominance in generating buzz around current events.
  • What does she have to say about the Chicago Cubs?: An Israeli computer scientist has created an algorithm that can analyze huge volumes of electronic data about past events from sources as diverse as the New York Times' archive to Twitter feeds and predict what might happen in the future. Most notably, the scientist, named Kira Radinsky, has used her system to predict the first cholera epidemic in Cuba in many decades and the protests leading up to the Arab Spring.
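The transport-mode idea in the second item above can be caricatured in a few lines: walking produces large periodic jolts in the accelerometer trace, while vehicles are smoother, with buses and subways showing frequent stop-and-go phases. This toy classifier is purely illustrative; the thresholds are invented, and the actual research relies on far more careful signal processing.

```python
def classify_mode(accel_magnitudes):
    """Toy transport-mode guess from accelerometer magnitudes (m/s^2),
    sampled at regular intervals. Thresholds are illustrative only."""
    n = len(accel_magnitudes)
    mean = sum(accel_magnitudes) / n
    variance = sum((a - mean) ** 2 for a in accel_magnitudes) / n
    if variance > 4.0:          # strong periodic jolts: footsteps
        return "walking"
    near_zero = sum(1 for a in accel_magnitudes if a < 0.2)
    if near_zero / n > 0.3:     # frequent stops and starts
        return "bus_or_subway"
    return "driving"
```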

Video bonus: Here's the TED talk that made Eli Pariser and his concept of the filter bubble famous.

Video bonus bonus: There are algorithms for everything these days and, to believe Sheldon of "The Big Bang Theory," that includes making friends.

More from Smithsonian.com

How Big Data Has Changed Dating

Think You're Doing a Good Job? Not If the Algorithms Say You're Not

As Glaciers Retreat, They Give up the Bodies and Artifacts They Swallowed

Smithsonian Magazine

The 5,300-year-old body of Ötzi, the Stone Age human dubbed "The Iceman," is perhaps one of the most famous mummies to emerge from ice. But with glaciers around the world melting, many more bodies — some relatively new, others ancient — are now emerging. Global warming is giving back many once thought lost forever.

Take the soldiers who died during "The White War," a years-long campaign in the Italian front of World War I, later fictionalized by Ernest Hemingway in A Farewell to Arms. This month marks a century since Italy joined the war, and bodies and artifacts from that time are now surfacing. For Vice, Leander Roet writes:

The battle was fought at high altitude, with special weapons and infrastructure like ice-trenches and cable transports. Often the sides would use mortar fire to try and incur avalanches—‘the white death’—on each other’s camps, claiming thousands of lives.

Now, thanks largely to decades of global warming, the Presena glacier running through the battleground is slowly melting away. And with that melting the remains of the White War are slowly emerging. Remarkably well-kept artifacts have been streaming down with the melting water of the glacier since the early 90s: A love letter dated from 1918, to a certain Maria that was never sent. An ode to an old friend, scribbled down in a diary. A love note picturing a sleeping woman, signed, in Czech, “Your Abandoned Wife.”

The meltwater exposes bodies mummified by the cold as well, still wearing their uniforms. In September 2013, the local community of Peio found two young Austrian men.

“The first thing I thought of were their mothers,” Franco Nicolis from the local Archeological Heritage Office told Laura Spinney at the Telegraph. “They feel contemporary. They come out of the ice just as they went in. In all likelihood the soldiers’ mothers never discovered their sons’ fate.”

On the other side of the world, glaciers in the Argentinian Andes have relinquished their grip on a different set of bodies: Incan children sacrificed five hundred years ago, and a young pilot who crashed just a few decades ago.

"It took me a very long time to acknowledge he might be dead," the pilot's mother said, reported Stephen Messenger for Treehugger in 2011. "Now we have a body. I can visit my son at his burial site and grieve like any mother has a right to do."

A different plane carrying 52 passengers crashed into an Alaskan glacier in 1952. An Alaska National Guard helicopter crew found the wreckage in 2012.

But many finds are too ancient to offer comfort to relatives. Instead, they are becoming valuable resources for researchers.

“The ice is a time machine,” Lars Pilö, an archaeologist, told Andrew Curry in a 2013 article for Archaeology. “When you’re really lucky, the artifacts are exposed for the first time since they were lost.” Global warming has created a boom for this kind of archaeology, Curry writes. Melting glaciers have released centuries-old moss, Roman coins, an Iron Age horse and even ancient forests.

Curry reports:

On one hand, it exposes artifacts and sites that have been preserved in ice for millennia, offering archaeologists a chance to study them. On the other hand, from the moment the ice at such sites melts, the pressure to find, document, and conserve the exposed artifacts is tremendous. “The next 50 years will be decisive,” says Albert Hafner, an archaeologist at the University of Bern who has excavated melting sites in the Alps. “If you don’t do it now they will be lost.”

However, the retreat of ice and the slow thaw of these bodies does present a worrisome, if slim, danger. Researchers found that a 30,000-year-old virus trapped in permafrost was still viable enough to infect amoebas. Some fear that other pathogens able to infect humans may be lurking in the bits of the world still locked in ice and frost. The worst case would be something like smallpox, for which people have no natural immunity. Fortunately, Michael Lane of the CDC, who worked on smallpox eradication programs, feels this possibility isn’t a strong one, reports Geoff Manaugh for Gizmodo.

"No one feels there's a serious chance that global warming will melt the permafrost and unleash an epidemic," he told Manaugh. But melting glaciers certainly will unleash more bodies and artifacts.

A Ban on Salamanders Is Just Part of the Fight Against This Deadly Fungus

Smithsonian Magazine

Species of all types are disappearing around the globe, but no group may be more threatened than amphibians. One recent analysis found that 43 percent of amphibian species are on the decline and nearly a third are officially threatened. Scientists have also counted 168 species that have gone extinct in the wild, and more than half of those extinctions have occurred in the last few decades.

One big factor has been Batrachochytrium dendrobatidis, a fungal disease also known as chytrid that was virtually unknown two decades ago. Since its discovery, scientists have witnessed mass die-offs of amphibians, especially frogs, around the world, sometimes happening overnight.

Now, a related fungal disease is spreading among salamanders, B. salamandrivorans, or Bsal, and scientists are racing to apply what they have learned about chytrid to prevent this new threat from devastating amphibians in North America. 

Amphibians are an integral part of the ecosystem, providing a link between the aquatic and terrestrial worlds, Karen Lips, who studies the animals at the University of Maryland College Park, said this week at the 2016 meeting of the American Association for the Advancement of Science (AAAS) in Washington, D.C.

Amphibians are key predators of insects—many of which can transmit diseases such as Zika and dengue to humans—and they serve as meals for other creatures. When frogs disappear, “there are big impacts on pretty much all aspects of the ecosystem,” from water quality to snake abundance, says Lips, who has seen the effects of chytrid on amphibians in Panama.

The animals have also become key in research on limb regeneration. That makes amphibian declines, which may be even worse than reported, especially worrisome, Lips says. So researchers around the world are jumping in to find out as much as they can about the attacking fungi.

“The discovery of these two diseases has changed the way we think about pathogens,” says Ana Longo, of the University of Maryland College Park and the Smithsonian Conservation Biology Institute. When chytrid first appeared, scientists were reluctant to believe that a single pathogen could be so dangerous to more than a single species.

While studies have since shown that it's possible, scientists have also discovered that there are several kinds of Batrachochytrium. Some appear to be endemic in certain regions, such as Brazil, Switzerland and Korea, and amphibians there are able to tolerate the fungus.

But two other versions have spread widely, largely due to the pet trade. These invasive fungi are mostly responsible for the mass die-offs of frogs and other amphibians in the wild. 

Scientists sample chytrid fungus on a dart frog in French Guiana. (Quentin Martinez/Biosphoto/Corbis)

Scientists have also recognized that the chytrid epidemic began decades earlier than they thought. By studying amphibians in natural history collections, they have been able to see that declines in some species, such as the Yosemite toad, occurred around the same time as the arrival of chytrid in a particular region.

“Museums are giving us a view of the past that may help us interpret the status of present-day populations,” says Vance Vredenburg, an amphibian ecologist at San Francisco State University.

One big takeaway so far is that the fungus may not actually doom all frogs, as scientists once feared. Many factors can interact to determine whether a population—or an entire species—survives. For instance, while chytrid thrives in cooler climates, the local climate and ecology can influence the spread of the disease and amphibian susceptibility.

Interactions with the other microbes living on an animal’s skin may also play a role, along with the response of its immune system. Some researchers are now working on probiotics that might help a frog fight off a chytrid infection. And zoos, including the Smithsonian National Zoo, are raising animals that have gone extinct in the wild, such as the Panamanian golden frog, with plans to eventually reestablish lost populations once they figure out how to control the fungus. 

The Panamanian golden frog. (courtesy Brian Gratwicke)

Such efforts are giving scientists a head start for tackling Bsal, a disease that was first officially described in 2013. Thought to be native to Asia, this fungus arrived in the Netherlands via the pet trade and spread through Europe from there. The disease has not yet been found in North America, but it could be a huge problem if it makes the leap across the Atlantic.

“The threat of the new salamander-eating chytrid fungus is something we should all be very concerned about, because the Appalachian region is the world’s major biodiversity hot spot for salamanders,” says Brian Gratwicke, a conservation biologist at the National Zoo. “We have a responsibility to do everything we can to preserve them as an important feature of the continent’s biodiversity.”

The U.S. Geological Survey has developed a rapid-response plan for handling suspicious salamander deaths, and herpetologists would love to see any dead salamanders people find. The National Zoo has also teamed up with a citizen-science project, the Amphibian Survival Alliance, to test pet salamanders for the fungus. In the meantime, researchers are hoping to apply the lessons they are learning about chytrid biology to Bsal.

But for now, the best way to keep U.S. salamanders safe is to keep Bsal out of the country. To that end, the U.S. Fish and Wildlife Service implemented a ban earlier this year on the import and interstate trade of 201 salamander species that could transmit Bsal.

“We know that there’s no treatment,” Lips said, “so it’s pretty obvious that the only thing that is going to give us any amount of time to come up with a solution or treatment … is to keep it out as long as possible.”

Will a New Mosquito Emoji Create Some Buzz About Insect-borne Diseases?

Smithsonian Magazine

Mosquitoes are coming. The Unicode Consortium has just announced that alongside your smiling face – or perhaps crying face – emoji you’ll soon be able to add a mosquito.

The mosquito emoji will join the rabble of emoji wildlife including butterflies, bees, whales and rabbits.

We see a strong case that the addition of the much-maligned mosquito to your emoji toolbox could help health authorities battle the health risks associated with these bloodsucking pests.

It may be small but it could make all the difference in battling mosquito-borne disease outbreaks. (https://emojipedia.org/mosquito/)

Given it is the most dangerous animal on the planet, the mosquito is more than deserving of an emoji. But will it make a difference to the way the science behind mosquito research is communicated? Could it influence how the community engages with public health messages of local authorities? Will more people wear insect repellent because of the mosquito emoji?

We won’t know for sure until the mosquito is released.

Where did the mosquito emoji idea come from?

A staggering sixty million emoji are shared on Facebook each day!

We’ve needed a mosquito emoji for a while now (although the blood-filled syringe has been a useful substitute). While the idea was heavily promoted last year by the Johns Hopkins Center for Communication Programs and the Bill & Melinda Gates Foundation, it was one of us, an Australian virologist, who played a critical role in the emoji’s development by submitting the original proposal in June 2016.

The idea arose during the Zika virus epidemic in South America, when the mosquito-borne infection was triggering many questions and few answers. While the emoji doesn’t represent a specific mosquito species, it captures the distinctive shape of a mosquito.

Time spent outdoors may be the perfect opportunity to employ the mozzie emoji. (jiulliano)

How might a mosquito emoji make a difference?

The mosquito emoji will give health professionals and academics a more relatable way to communicate health risks and new research using social media.

Surveillance programs across the world routinely monitor mosquitoes. Local health authorities could simply tweet a string of mosquito emoji to indicate the relative mosquito risk in an area. Adding in the new microbe emoji (currently in the form of a generic green microscopic shape) could even indicate the presence of mosquito-borne viruses such as dengue virus, West Nile virus or Ross River virus.

Emoji could remind us to tip out, drain or cover backyard water-holding containers that may be a source of mosquitoes following rain. Weather monitoring services or health authorities could simply add the mosquito emoji in alerts featuring a string of storm clouds and water droplets.

More than likely, it’ll be used by the public to punctuate those summer tweets complaining of bites and bumps following backyard BBQs.

Social media is changing public health

Social media will continue to play a role in public health campaigns. Whether promoting better nutrition, encouraging exercise or addressing concerns over vaccine coverage, Twitter, Facebook, Instagram and whatever platform comes next will remain important parts of the communication tool kits of local health authorities.

Smartphones have already been identified as tools for surveillance of mosquito-borne disease outbreaks.

The addition of a mosquito emoji, together with concise public health messaging, may increase the chances a message hits home, maybe even changing behaviour and reducing the risk of bites.

Simple communication works

The usefulness of emoji as a communication tool has been shown in several fields of research. Emoji can accurately express emotional associations with commercial products, reflect state of mind in cancer patients and aid communication with sick young patients.

Applying these examples to the mosquito emoji, we predict it may aid citizen science – for example, if the community can signal how bad nuisance-biting mosquitoes are in their area. Perhaps this mobile surveillance network could help pick up the introduction of exotic mosquitoes such as the Asian Tiger Mosquito, a species often first detected because of reports by the community. Measuring a rise in mosquito emoji use may identify regions under attack by mosquitoes.
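A citizen-science signal of this kind could be as simple as counting mosquito-emoji mentions per region over time. The sketch below is a minimal, hypothetical illustration of that idea: the posts, region labels and tally function are invented for the example and are not drawn from any real social media feed or API.

```python
# Hypothetical sketch: estimating relative mosquito activity by tallying
# mosquito emoji (U+1F99F) in a stream of public posts, grouped by region.
from collections import Counter

MOSQUITO = "\U0001F99F"  # the mosquito emoji

def emoji_counts_by_region(posts):
    """Tally mosquito-emoji occurrences per region as a crude activity signal."""
    counts = Counter()
    for region, text in posts:
        counts[region] += text.count(MOSQUITO)
    return counts

# Invented example posts standing in for a real feed.
posts = [
    ("Sydney", f"Backyard BBQ ruined again {MOSQUITO}{MOSQUITO}{MOSQUITO}"),
    ("Sydney", f"So itchy {MOSQUITO}"),
    ("Perth", "Lovely evening, no bites!"),
]
print(emoji_counts_by_region(posts))  # Counter({'Sydney': 4, 'Perth': 0})
```

A real system would of course need to handle sampling bias (emoji use tracks smartphone habits, not just mosquitoes), but the counting itself is this simple.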

Big corporations have already identified the usefulness of emoji, and fork out serious cash for hashtag-customized emoji. If branded emoji work for commercial enterprises, why not for public health and why not a mosquito? A simple image may provide a critical reminder to put on insect repellent, sleep under a bed net or get appropriately vaccinated for mosquito-borne diseases such as Japanese encephalitis or Yellow Fever.

It is increasingly difficult to escape our social media streams, and emoji use shows no sign of waning. Health authorities should embrace these tiny visual prompts to better engage the community with key health messages.

The mosquito emoji may pave the way for more medically important arthropods: perhaps tick, flea, louse and bed bug emoji will be on their way soon. Perhaps even viruses and bacteria.

From the middle of 2018, we look forward to watching the creative ways researchers, health workers and the general public incorporate the mosquito emoji into their communications.

A New Report Identifies 30 Technologies That Will Save Lives in the Next 15 Years

Smithsonian Magazine

President Obama wasn't the only head of state visiting Ethiopia this summer. In early July, the United Nations brought global leaders to Addis Ababa for the third annual International Conference on Financing for Development. The goal of the meeting was to outline what the UN calls Sustainable Development Goals—a series of financial, social and technological targets that they want countries in the developing world to hit by 2030.

At the conference, the United States Agency for International Development (USAID), the Government of Norway, the Bill and Melinda Gates Foundation and global health nonprofit PATH released "Reimagining Global Health," a report outlining 30 innovations that will save lives in the next 15 years. The team spent a year analyzing current and future technology, by reaching out to all the partners they work with in the world of international health. They received 500 nominations from entrepreneurs, scientists and other experts in nearly 50 countries, which a panel of 60 health experts reviewed and whittled down to a short list of easy-to-use technologies that they felt could reduce child mortality, improve maternal health and reproductive rights, and combat both infectious and noncommunicable diseases.

By 2030, USAID, the Gates Foundation and PATH want to reduce the global maternal mortality rate to less than 70 per 100,000 live births; end preventable deaths of newborns and children under five years old; reduce premature mortality from noncommunicable diseases by a third; ensure universal access to sexual and reproductive health care services; end the epidemics of AIDS, TB, malaria and neglected tropical diseases; and combat other infectious diseases.

The groups want to consolidate investments from philanthropic organizations like the Gates Foundation and from government groups toward the highest-value projects, so that their products and services are cheap and accessible. “Strengthening the capacity of low- and middle-income countries to identify, develop, adapt, produce, regulate, assess, and share innovations is critical for a robust innovation pipeline,” Amie Batson, Chief Strategy Officer at PATH, said in an email.

Making communities healthier also makes them more financially resilient. Former U.S. Treasury Secretary Lawrence Summers, who also contributed to the report, says that by investing in health technology now, globally we can save significant money and lives down the road. “With the right investments, we could reach grand convergence in just one generation, averting 10 million deaths every year by 2035. But today’s health tools alone won’t get us there,” says Summers in the report.

Here are eight of the 30 new drugs, diagnostics, devices and services poised to help the developing world:

Easily-transportable vaccines can help treat communicable diseases. (Gabe Bienczycki)

Chlorhexidine for Umbilical Cord Care

In the developed world, medical professionals clean babies' umbilical cords shortly after birth. But in the developing world, hundreds of thousands of newborns die each year from infections related to lack of antiseptic at delivery. If $81 million was spent on introducing chlorhexidine in home settings in the developing world in the next 15 years, the authors of the report estimate that more than 1 million neonatal lives could be saved, resulting in a 9 percent reduction in deaths due to sepsis.

Uterine Balloon Tamponades

One of the biggest causes of maternal death is postpartum hemorrhage, which can be stopped or slowed by inserting an inflatable tamponade into the uterus. Because of cost and lack of training, the devices haven't been used in the developing world. The report highlights one easy-to-use, low-cost option, called Every Second Matters for Mothers and Babies. Basically, a condom is attached to a catheter that's inflated with water through a syringe and a one-way valve. By investing $27 million in these devices, the group estimates that 169,000 mothers' lives could be saved in the next decade and a half.

Neonatal Resuscitators

Low-cost neonatal resuscitators could help the one in 10 babies who have trouble breathing at birth. They've been hard to bring into the developing world because of cost, so these groups are working to identify and develop cheap, reusable and easy-to-use options, including ones that health care workers can operate by hand.

Antiretrovirals for HIV That Can Be Injected Every Two Months

HIV is virulent and widespread in sub-Saharan Africa, so, to try to slow the spread, these groups are looking at long-lasting drugs that could be injected into HIV patients every two months to treat symptoms and slow the virus' progression to AIDS. These options could prove more effective than easily forgotten daily pills.

Single-dose Antimalarial Drug

Malaria treatment is tricky for a lot of reasons, but one of the big ones is that the Plasmodium parasites that cause it, transmitted to humans by mosquitoes, are developing resistance to existing drugs. A single-dose antimalarial drug, OZ439, knocks out resistant strains before they have a chance to adapt. This could be critical for the roughly 200 million people who contract malaria each year.

Portable Eye Scanners

For the 300 million people who experience it worldwide, visual impairment can have a huge impact on quality of life. But most eye troubles can be treated. The authors of the report acknowledge that, with little training, people in remote villages where eye doctors are few and far between could use portable, user-friendly eye scanners, like the 3nethra Classic, to diagnose cataracts, glaucoma and other conditions.

mHealth Innovations

Many non-communicable diseases, such as diabetes and heart disease, can be managed with diet and exercise. This often involves patients changing their habits and routines, but these life changes can be hard to track and stick to, especially if frequent checkups with doctors aren't an option. In the next 15 years, the report speculates that low-cost mobile phones will be leveraged to track behavioral changes. Doctors could send texts and patients could report to networks that hold them accountable.

Injectable Contraceptives

Last year, PATH developed Sayana Press, an injectable contraceptive that lasts for three months. Unlike other contraceptives of this type, this one is designed for home use. The single-use shots can be distributed to individuals, who can discreetly administer them on their own.

Why Google Flu Trends Can't Track the Flu (Yet)

Smithsonian Magazine

In 2008, Google announced an intriguing new service called Google Flu Trends. Engineers at the company had observed that certain search queries (such as those including the words "fever" or "cough") seemed to spike every flu season. Their idea was to use the frequency of these searches to calculate nationwide flu rates faster than could be done with conventional data (which generally takes a few weeks to collect and analyze), letting people know when to take extra precautions to avoid getting the virus.

Media outlets (this reporter included) rushed to congratulate Google on such an insightful, innovative and disruptive use of big data. The only problem? Google Flu Trends hasn't performed very well.

The service has consistently overestimated flu rates, when compared to conventional data collected afterward by the CDC, estimating the incidence of flu to be higher than it actually was for 100 out of 108 weeks between August 2011 and September 2013. In January 2013, when national flu rates peaked but Google Flu Trends estimates were twice as high as the real data, its inaccuracy finally started garnering press coverage.

The most common explanation for the discrepancy has been that Google hasn't taken into account the uptick in flu-related queries that occur as a result of the media-driven flu hysteria that occurs every winter. But this week in Science, a group of social scientists led by David Lazer propose an alternate explanation: that Google's own tweaks to its search algorithm are to blame.

It's admittedly hard for outsiders to analyze Google Flu Trends, because the company doesn't make public the specific search terms it uses as raw data, or the particular algorithm it uses to convert the frequency of these terms into flu assessments. But the researchers did their best to infer the terms by using Google Correlate, a service that allows you to look at the rates of particular search terms over time.

When the researchers did this for a variety of flu-related queries over the past few years, they found that a couple key searches (those for flu treatments, and those asking how to differentiate the flu from the cold) tracked more closely with Google Flu Trends' estimates than with actual flu rates, especially when Google overestimated the prevalence of the ailment. These particular searches, it seems, could be a huge part of the inaccuracy problem.

There's another good reason to suspect this might be the case. In 2011, as part of one of its regular search algorithm tweaks, Google began recommending related search terms for many queries (including listing a search for flu treatments after someone Googled many flu-related terms) and in 2012, the company began providing potential diagnoses in response to symptoms in searches (including listing both "flu" and "cold" after a search that included the phrase "sore throat," for instance, perhaps prompting a user to search for how to distinguish between the two). These tweaks, the researchers argue, likely artificially drove up the rates of the searches they identified as responsible for Google's overestimates.

Of course, if this hypothesis were true, it wouldn't mean Google Flu Trends is inevitably doomed to inaccuracy, just that it needs to be updated to take into account the search engine's constant changes. But Lazer and the other researchers argue that tracking the flu from big data is a particularly difficult problem.

A huge proportion of the search terms that correlate with CDC data on flu rates, it turns out, are caused not by people getting the flu, but by a third factor that affects both searching patterns and flu transmission: winter. In fact, the developers of Google Flu Trends reported coming across particular terms—those related to high school basketball, for instance—that were correlated with flu rates over time but clearly had nothing to do with the virus.

Over time, Google engineers manually removed many terms that correlate with flu searches but have nothing to do with flu, but their model was clearly still too dependent on non-flu seasonal search trends—part of the reason why Google Flu Trends failed to reflect the 2009 epidemic of H1N1, which happened during summer. Especially in its earlier versions, Google Flu Trends was "part flu detector, part winter detector," the authors of the Science paper write.

But all of this can be a lesson for the use of big data in projects like Google Flu Trends, rather than a blanket indictment of it, the researchers say. If properly updated to take into account tweaks to Google's own algorithm, and rigorously analyzed to remove purely seasonal factors, it could be useful in documenting nationwide flu rates—especially when combined with conventional data.

As a test, the researchers created a model that combined Google Flu Trends data (which is essentially real-time, but potentially inaccurate) with two-week old CDC data (which is dated, because it takes time to collect, but could still be somewhat indicative of current flu rates). Their hybrid matched the actual and current flu data much more closely than Google Flu Trends alone, and presented a way of getting this information much faster than waiting two weeks for the conventional data. 
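The hybrid approach described above can be caricatured as a small regression: predict this week's flu rate from the real-time (but biased) search signal plus the most recent lagged CDC figure. The sketch below uses entirely synthetic data and a plain least-squares fit; it is only a minimal illustration of the idea, not the authors' actual model.

```python
# Minimal sketch: a "nowcast" that combines a noisy real-time signal
# (Google Flu Trends-style estimates) with accurate but two-week-lagged
# CDC-style data via ordinary least squares. All numbers are synthetic.
import numpy as np

def fit_hybrid(gft, cdc):
    """Fit y_t = a*gft_t + b*cdc_{t-2} + c on historical weeks."""
    X = np.column_stack([gft[2:], cdc[:-2], np.ones(len(gft) - 2)])
    y = cdc[2:]  # the ground truth available in hindsight
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def nowcast(coef, gft_now, cdc_lagged):
    a, b, c = coef
    return a * gft_now + b * cdc_lagged + c

# Synthetic history: a smooth "true" flu rate, plus a search signal that
# systematically overestimates it (mimicking GFT's known bias).
rng = np.random.default_rng(0)
true_rate = np.sin(np.linspace(0, 3, 40)) + 2
cdc = true_rate
gft = 2 * true_rate + rng.normal(0, 0.1, 40)

coef = fit_hybrid(gft, cdc)
estimate = nowcast(coef, gft[-1], cdc[-3])  # current week, using 2-week-old CDC
print(round(float(estimate), 2), round(float(cdc[-1]), 2))
```

Even this toy version shows the point: the regression learns to discount the biased real-time signal while still exploiting its timeliness, which the lagged data alone cannot provide.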

"Our analysis of Google Flu demonstrates that the best results come from combining information and techniques from both sources," Ryan Kennedy, a University of Houston political science professor and co-author, said in a press statement. "Instead of talking about a 'big data revolution,' we should be discussing an 'all data revolution.'"

Art to Zoo: The Survival Game after Columbus: Pigs, Weeds, and Other Players (1991)

SI Center for Learning and Digital Access
This issue examines the colonization of the Americas, focusing on diseases from Europe and the population growth of European domesticated animals. In the lesson, students consider the fight for survival when two worlds meet.

Card Commemorating a Test of the Effectiveness of Vaccination on Twelve Children in Milton, Massachusetts

National Museum of American History
Thirty-five years have passed since the 33rd World Health Assembly declared the world free of smallpox, an infectious disease that had plagued humankind for most of written history. This momentous achievement was the result of a massive global eradication campaign begun in the late 1960s, but its real beginnings can be traced back much further—to a medical discovery made in the English countryside, which spread across the Atlantic and to the small towns of the new republic. This commemorative vaccination card is a small piece of evidence of this long and rich history.

This unassuming 3 x 5 inch card in the collections at the National Museum of American History attests to a remarkable event that took place over two hundred years ago in a small town outside Boston. On October 25, 1809, in Milton, Massachusetts, twelve children were released from quarantine after fifteen days of close observation for any sign of smallpox infection. This may not sound unusual for a time when smallpox epidemics were a part of life, but these children had been purposefully inoculated with virulent smallpox matter in order to make a public test of a new medical discovery—vaccination.

The discovery had been made over a decade earlier by Edward Jenner, a country doctor in Gloucester, England. In 1798 he published a pamphlet entitled An Inquiry into the Causes and Effects of the Variolae vaccinae, a disease discovered in some of the western counties of England, particularly Gloucestershire, and Known by the Name of Cow Pox. The booklet described his successful experiments using inoculation with cowpox to provide protection from the more serious disease smallpox. Jenner's method was named "vaccination," referring to the medical term for cowpox, Variolae vaccinae, and the Latin vacca, meaning "cow." Vaccination provided a potentially much safer alternative to the older practice of variolation, in which immunity was conferred by deliberately infecting a person with a small dose of smallpox.

As word of the vaccine's effectiveness spread, Jenner supplied cowpox vaccine matter to doctors throughout England. In 1800 vaccine material reached the United States through Benjamin Waterhouse, a professor at Harvard Medical School. Acceptance of vaccination did not come easily, and many members of the medical profession and the church opposed a method that introduced an animal disease into humans. In 1802 Waterhouse felt obliged to extol the virtues of the cow in an attempt to persuade the Boston Board of Health to set aside its objections to the "contemptible origin" of the vaccine. "The earth maintains not a more clean, placid, healthy, and useful animal than the Cow," he appealed. "She is peculiarly the poor man's riches and support. From her is drawn, night and morning, the food for his ruddy children; […] every part of her has its particular uses in commerce and medicine. On these accounts she is an [sic] useful, though invisible wheel in the great machine of state."

Whatever their attitudes toward cows may have been, in 1809 the citizens of the town of Milton, Massachusetts, became part of the first municipal effort in the United States to offer free vaccination to all inhabitants. Over three hundred persons were inoculated during a three-day campaign in July. Following this program, the town leaders took an unusual step—they decided to hold a public demonstration to prove without a doubt that cowpox vaccine offered protection from smallpox. On October 9, 1809, twelve children, selected from those vaccinated in July, were inoculated with fresh, virulent smallpox matter by Dr. Amos Holbrook, and the procedure was witnessed by eighteen town members. The children were confined to a single home for fifteen days and on October 25 were discharged with no sign of smallpox infection.

Each child received a personalized certificate pronouncing them a living testament to the "never failing power of the mild preventative the Cow Pox," "a blessing great as it is singular in its kind." Several other small certificates were produced to commemorate this remarkable demonstration, including the one now in the museum's collection. The names of the twelve children subjected to the vaccine test are inscribed on the back of the card:

"Joshua Briggs, Samuel Alden, Thomas Street Briggs, Benjamin Church Briggs, Martin Briggs, George Briggs, Charles Briggs, John Smith, Catharine Bent, Suzanna Bent, Ruth Porter Horton, Mary Ann Belcher"

Milton's councilmen published a detailed account of the vaccination experiment and sent a copy to the officers of every town in the state, as well as to Governor Christopher Gore, a proponent of vaccination. In 1810 the State of Massachusetts passed the Cow Pox Act directing every town, district, or plantation, within the Commonwealth, to provide for the vaccination of their inhabitants.

The world is now free of smallpox—a remarkable global achievement that owes a small debt to the citizens of a little town in New England in the early years of our republic.

Front of card: He is slain. Milton 25th October 1809. The twelve children whose names are written on the back of this card were vaccinated by Doctr. Amos Holbrook at the town innoculation in July last. They were tested by smallpox inoculation on the 10th Inst. and discharged this day from the Hospital after offering to the world in the presence of most respectable witnesses who honored Milton with their attendance on that occassion, an additional proof of the never failing power of that mild preventative the Cowpock, against Smallpox infection. A blessing as great as it is singular in its kind, whereby the hearts of man ought to be eleveated in praise to the Allmighty Giver. (Signed) Oliver Houghton, Chairman of the Committee for Vaccination.

Back of card: Joshua Briggs, Samuel Alden, Thomas Street Briggs, Benjamin Church Briggs, Martin Briggs, George Briggs, Charles Briggs, John Smith, Catharine Bent, Susanna Bent, Ruth Porter Horton, Mary Ann Belcher