
1,800 Studies Later, Scientists Conclude Homeopathy Doesn’t Work

Smithsonian Magazine

Perhaps you remember when scientists debunked homeopathy in 2002. Or 2010. Or 2014. But now a major Australian study analyzing over 1,800 papers has shown that homeopathy, the alternative treatment that relies on super-diluted substances and the principle of “like cures like,” is completely ineffective.

After assessing more than 1,800 studies on homeopathy, Australia’s National Health and Medical Research Council was only able to find 225 that were rigorous enough to analyze. And a systematic review of these studies revealed “no good quality evidence to support the claim that homeopathy is effective in treating health conditions.”

The Australian study, which is the first position statement relying on such an extensive review of medical literature, strikes the latest blow at a 200-year-old alternative treatment developed by a German physician with “no interest in detailed pathology, and none in conventional diagnosis and treatment.” The Washington Post reports that the study’s authors are concerned that people who continue to choose homeopathic remedies over proven medicine face real health risks—including the nearly 4 million Americans who use homeopathic “medicines.”

The head of the National Health and Medical Research Council told the Guardian that he hopes the findings will lead to changes in Australia’s health insurance and pharmacy systems. But he also said that “there will be a tail of people who won’t respond to this report, and who will say it’s all a conspiracy of the establishment.”

News of the Australian study comes on the heels of newly released National Health Interview Survey data showing a “small but significant” increase in the use of homeopathy during 2012. And recently, a Canadian homeopathic college came under fire for taking an anti-vaccination stance and promoting homeopathic “nosodes” as an alternative to vaccines.

But will the not-so-new news that homeopathy is ineffective keep consumers from wasting their money on the complementary therapy? If the growing homeopathic industry is any indication, the answer is probably no.

Study of Pendentive Figure for Electricity, Dome of the Manufactures and Liberal Arts Building, World's Columbian Exposition, Chicago, IL

Cooper Hewitt, Smithsonian Design Museum
Outline of a pendentive with woman in green shirt and red skirt. Left arm raised, left hand holding telephone receiver to ear; right arm at side. Behind, to right, a ticker, from which come two strands of tape that are wrapped about figure and then end in a waste-basket to left. Gold background.

The Morse Code, Study for "Electricity as Applied to Commerce," Manufactures and Liberal Arts Building, World's Columbian Exposition, Chicago, IL

Cooper Hewitt, Smithsonian Design Museum
Draped figure of a woman seated on a stool, seen from below level of ground line, turned half-way toward left. Across knees a folio, to left corner of which hand is held, while right hand rests on a telegraph key. Center line drawn through figure.

The Saddest Movie in the World

Smithsonian Magazine

In 1979, director Franco Zeffirelli remade a 1931 Oscar-winning film called The Champ, about a washed-up boxer trying to mount a comeback in the ring. Zeffirelli’s version got tepid reviews. The Rotten Tomatoes website gives it only a 38 percent approval rating. But The Champ did succeed in launching the acting career of 9-year-old Ricky Schroder, who was cast as the son of the boxer. At the movie’s climax, the boxer, played by Jon Voight, dies in front of his young son. “Champ, wake up!” sobs an inconsolable T.J., played by Schroder. The performance would win him a Golden Globe Award.

It would also make a lasting contribution to science. The final scene of The Champ has become a must-see in psychology laboratories around the world when scientists want to make people sad.

The Champ has been used in experiments to see if depressed people are more likely to cry than non-depressed people (they aren’t). It has helped determine whether people are more likely to spend money when they are sad (they are) and whether older people are more sensitive to grief than younger people (older people did report more sadness when they watched the scene). Dutch scientists used the scene when they studied the effect of sadness on people with binge eating disorders (sadness didn’t increase eating).

The story of how a mediocre movie became a good tool for scientists dates back to 1988, when Robert Levenson, a psychology professor at the University of California, Berkeley, and his graduate student, James Gross, started soliciting movie recommendations from colleagues, film critics, video store employees and movie buffs. They were trying to identify short film clips that could reliably elicit a strong emotional response in laboratory settings.

It was a harder job than the researchers expected. Instead of months, the project ended up taking years. “Everybody thinks it’s easy,” Levenson says.

Levenson and Gross, now a professor at Stanford, ended up evaluating more than 250 films and film clips. They edited the best ones into segments a few minutes long and selected 78 contenders. They screened selections of clips before groups of undergraduates, eventually surveying nearly 500 viewers on their emotional responses to what they saw on-screen.

Some film scenes were rejected because they elicited a mixture of emotions, maybe anger and sadness from a scene depicting an act of injustice, or disgust and amusement from a bathroom comedy gag. The psychologists wanted to be able to produce one predominant, intense emotion at a time. They knew that if they could do it, creating a list of films proven to generate discrete emotions in a laboratory setting would be enormously useful.
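
To make that selection criterion concrete, here is a minimal sketch, with invented clip names, ratings and threshold (none of this is the actual Gross-Levenson data or scoring method), of how one might flag clips whose viewer ratings show a single predominant emotion:

```python
# Hypothetical sketch of the selection criterion: keep a clip only if one emotion
# clearly dominates viewers' mean ratings. Clip names, numbers, and the margin
# are invented for illustration, not data from Gross & Levenson (1995).

mean_ratings = {
    "sad clip": {"sadness": 5.7, "anger": 1.2, "amusement": 0.3},
    "bathroom gag": {"disgust": 4.1, "amusement": 3.8, "sadness": 0.2},
}

def predominant_emotion(ratings, margin=2.0):
    """Return the dominant emotion if it beats the runner-up by `margin`, else None."""
    ranked = sorted(ratings.items(), key=lambda kv: kv[1], reverse=True)
    (top_emotion, top_score), (_, runner_up) = ranked[0], ranked[1]
    return top_emotion if top_score - runner_up >= margin else None

for clip, ratings in mean_ratings.items():
    emotion = predominant_emotion(ratings)
    print(clip, "->", emotion or "mixed emotions; rejected")
```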

Scientists testing emotions in research subjects have resorted to a variety of techniques, including playing emotional music, exposing volunteers to hydrogen sulfide (“fart spray”) to generate disgust or asking subjects to read a series of depressing statements like “I have too many bad things in my life” or “I want to go to sleep and never wake up.” They’ve rewarded test subjects with money or cookies to study happiness or made them perform tedious and frustrating tasks to study anger.

“In the old days, we used to be able to induce fear by giving people electric shocks,” Levenson says.

Ethical concerns now put more constraints on how scientists can elicit negative emotions. Sadness is especially difficult. How do you induce a feeling of loss or failure in the laboratory without resorting to deception or making a test subject feel miserable?

“You can’t tell them something horrible has happened to their family, or tell them they have some terrible disease,” says William Frey II, a University of Minnesota neuroscientist who has studied the composition of tears.

But as Gross says, “films have this really unusual status.” People willingly pay money to see tearjerkers—and walk out of the theater with no apparent ill effect. As a result, “there’s an ethical exemption” to making someone emotional with a film, Gross says.

Image by Mary Evans / Ronald Grant / Everett Collection. The Champ is about a washed-up boxer, played by Jon Voight shown here in the center of the photo, trying to mount a comeback in the ring. (original image)

Image by MGM / The Kobal Collection. The Champ has been used in experiments to see if depressed people are more likely to cry than non-depressed people. (original image)

Image by United Artists / Courtesy Everett Collection. In 1988, Robert Levenson, a psychology professor at the University of California, Berkeley, and his graduate student, James Gross, solicited movie recommendations to find the saddest movie scene. They found The Champ produced more sadness than the death of Bambi's mom. (original image)

Image by MGM / Courtesy Everett Collection. The list of films Levenson and Gross developed is widely used by emotion researchers. Of the 16 movie clips they identified, The Champ may be the one that has been used the most by researchers. (original image)

In 1995, Gross and Levenson published the results of their test screenings. They came up with a list of 16 short film clips able to elicit a single emotion, such as anger, fear or surprise. Their recommendation for inducing disgust was a short film showing an amputation. Their top-rated film clip for amusement was the fake orgasm scene from When Harry Met Sally. And then there’s the two-minute, 51-second clip of Schroder weeping over his father’s dead body in The Champ, which Levenson and Gross found produced more sadness in laboratory subjects than the death of Bambi’s mom.

“I still feel sad when I see that boy crying his heart out,” Gross says.

“It’s wonderful for our purposes,” Levenson says. “The theme of irrevocable loss, it’s all compressed into that two or three minutes.”

Researchers are using the tool to study not just what sadness is, but how it makes us behave. Do we cry more, do we eat more, do we smoke more, do we spend more when we’re sad? Since Gross and Levenson gave The Champ two thumbs-up as the saddest movie scene they could find, their research has been cited in more than 300 scientific articles. The movie has been used to test the ability of computers to recognize emotions by analyzing people’s heart rate, temperature and other physiological measures. It has helped show that depressed smokers take more puffs when they are sad.

In a recent study, neuroscientist Noam Sobel at the Weizmann Institute of Science in Israel showed the film clip to women to collect tears for an experiment testing the sexual arousal of men exposed to weepy women. They found that when men sniffed tear-filled vials or tear-soaked cotton pads, their testosterone levels fell, they were less likely to rate pictures of women’s faces as attractive, and the parts of their brains that normally light up in MRI scans during sexual arousal were less active.

Other researchers kept test subjects up all night and then showed them clips from The Champ and When Harry Met Sally. Sleep deprivation made people look about as expressive, the team found, as a zombie.

“I found it very sad. I find most people do,” says Jared Minkel of Duke University, who ran the sleep-deprivation study. “The Champ seems to be very effective in eliciting fairly pure feeling states of sadness and associated cognitive and behavioral changes.”

Other films have been used to produce sadness in the lab. When he needed to collect tears from test subjects in the early 1980s, Frey says he relied on a film called All Mine to Give, about a pioneer family in which the father and mother die and the children are divided up and sent to the homes of strangers.

“Just the sound of the music and I would start crying,” Frey says.

But Levenson says he believes the list of films he developed with Gross is the most widely used by emotion researchers. And of the 16 movie clips they identified, The Champ may be the one that has been used the most by researchers.

“I think sadness is a particularly attractive emotion for people to try to understand,” Gross says.

Richard Chin is a journalist from St. Paul, Minnesota.

The 16 Short Film Clips and the Emotions They Evoked:

Amusement: When Harry Met Sally and Robin Williams Live

Anger: My Bodyguard and Cry Freedom

Contentment: Footage of waves and a beach scene

Disgust: Pink Flamingos and an amputation scene

Fear: The Shining and Silence of the Lambs

Neutral: Abstract shapes and color bars

Sadness: The Champ and Bambi

Surprise: Capricorn One and Sea of Love

Source: Emotion Elicitation Using Films, by James J. Gross and Robert W. Levenson, in Cognition and Emotion (1995)

Jaw muscles of old world squirrels

Smithsonian Libraries
The jaw, suprahyoid, and extrinsic tongue muscles were studied in 11 genera, belonging to five tribes, of Old World squirrels. Significant variation in most of the adductor muscles is evident. The most primitive state of sciuromorphy is seen in the African tree squirrels Paraxerus and Funisciurus, especially as reflected in the anterior deep masseter. A derived state of sciuromorphy is found in five genera of Old World squirrels and perhaps evolved independently in each. Reduction of the temporalis muscle was observed in three genera, distantly related to one another. A unique arrangement of the superficial masseter is reported in the Asian giant tree squirrels, Ratufa. The arrangement of the masseter in the African pygmy squirrel, Myosciurus, is very similar to that of the South American pygmy squirrel, Sciurillus. We present hypotheses about the functional significance of these differences. In the derived state of sciuromorphy, which is found in three cases in squirrels that feed extensively on hard fruits, the anterior deep masseter is well positioned to increase the strength of the power stroke of the incisor bite. Among the pygmy squirrels, the position of the anterior deep masseter suggests that it plays a more significant role in molar chewing. (C) 1996 Wiley-Liss, Inc.

World's Oldest Fish Hooks Discovered in Okinawa

Smithsonian Magazine

Japan has long been on the cutting edge of technology, and that rang true even tens of thousands of years ago. Researchers on the island of Okinawa have unearthed a pair of 23,000-year-old fish hooks, the oldest ever discovered. The find, detailed in the Proceedings of the National Academy of Sciences, comes from Sakitari Cave on the southern coast of the island.

According to Michael Price at Science, the hooks are made from snail shell and were used by fishermen who seasonally occupied the limestone cavern in order to exploit the migration of crabs and freshwater snails. One of the hooks is finished and the other is incomplete. Radiocarbon dating of charcoal discovered in the same layer as the hooks places them between 22,380 and 22,770 years old.

The hooks are older than previously discovered examples, including a 16,000-year-old barb discovered on Timor and an 18,000-year-old hook discovered in Papua New Guinea, reports Emiko Jozuka at CNN.

But the hooks have more significance than just their age. Previously, researchers believed Okinawa was too resource-poor for Paleolithic people to live on. But the hooks show that early modern humans had the technology to survive on Okinawa and other remote islands in the northern Pacific, and that advanced maritime technology was not just confined to the islands around Australia.

Kate Lyons at The Guardian reports that researchers have been excavating three areas of the cave since 2009 and have found beads, tools and the charred remains of birds, mammals, frogs and eels, indicating that early people found enough to eat on the island. In fact, people thrived there: remains of freshwater crabs show that the human inhabitants waited until the autumn crab migration, when the crabs are, as the scientists note, “the most delicious,” before consuming them, meaning they were not struggling to find food.

The research also indicates humans may have inhabited Okinawa much longer than previously thought, and bones show people were able to catch fish from almost the beginning. “We found fish and human bones that dated back some 30,000 to 35,000 years,” Masaki Fujita, study co-author and curator at the Okinawa Prefectural Museum and Art Museum, tells Jozuka. “We don’t know what kind of tools were used to catch these fish, but we’re hoping to find some even older fishing tools.”

Where to See the World’s Biggest Spiders

Smithsonian Magazine

Currently, more than 46,000 spider species stretch their eight legs in habitats across the world, in every country and continent except Antarctica. And those are only the ones scientists have been able to find and name so far—many more are likely still out there, lurking under leaves and rocks and, for Halloween’s sake, perhaps under a bed or two.

Although some people find these creatures terrifying—a spooky symbol of haunted houses and Halloween frights—we owe a lot to our arachnid friends. Not only have they been around for about 350 million years (trumping our puny 200,000-year modern human existence), but spiders also make it possible for us to eat and to live more comfortable lives.

“If spiders disappeared, we would face famine,” Norman Platnick, a spider expert at New York’s American Museum of Natural History, told the Washington Post in 2014. “Spiders are primary controllers of insects. Without spiders, all of our crops would be consumed by those pests.”

For that matter, so would we. Because spiders munch on insects, they save us from bites.

“Without spiders’ existence and abundance on the planet, life on earth would probably be a less hospitable place for people because the biting flies and mosquitoes of the world would be so populous,” Cat Urban, manager of the invertebrate live animal programs at the Natural History Museum of Los Angeles County, told Smithsonian.com.

The Los Angeles Natural History Museum has a new take on the butterfly pavilion, instead operating a spider pavilion, where guests can walk through an open-air space to get nice and cozy with, and hopefully challenge, their arachnophobia. Right now, the spider pavilion has some of the largest orb weavers in the world, several species native to the region, tarantulas, jumping spiders and wolf spiders. “The purpose of the pavilion is to bring people closer to understanding just how interesting spiders are and their importance in the landscape,” Urban said. She also noted that spiders are remarkably sophisticated silk producers, something scientists are studying and learning to mimic to create better, stronger and lighter products for human use.

If you find the orb weavers impressive (or spine-tingling), you can find even bigger arachnids around the world. Here are a few places to see the biggest:

Tiny capsules, national service: The draft during World War I

National Museum of American History

After maintaining neutrality for three years, the United States entered World War I on April 6, 1917. Expecting around a million enlistees but receiving only 73,000 volunteers for military service, Congress and President Woodrow Wilson realized other methods were required to call up a large military force. By July 20, Wilson would enact a military draft lottery. Secretary of War Newton D. Baker was in charge of administering this new conscription act, which could have resulted in a riotous backlash as it had in the New York Draft Riots during the Civil War. It didn't, and Baker's implementation of the process may help explain why.

Black and white portrait of a man. He looks at the camera and wears glasses.

President Wilson's Selective Service Act of 1917 differed from the Civil War's conscription act of 1863 in that those who were drafted could neither purchase an exemption nor hire a substitute to take their places. Exemptions and substitutions during the Civil War were unpopular with many, as only the wealthy could afford to evade military service. With the option of substitution off the table, the Selective Service Act was more acceptable to many during the Great War.

Black and white photo of one man visiting another in hospital. One man lies in bed, looking up at the visitor. The other looks down, chatting, holding a hat.

While Baker's job made him central to the war effort, he had often identified as a pacifist. In a 1961 biography by C. H. Cramer, Baker is quoted in remarks to the Reserve Officers Association in 1916 as saying, "I am a pacifist. I am a pacifist in my hope; I am a pacifist in my prayers; I am a pacifist in my belief that God made man for better things than that a civilization should always be under the blight of this increasingly deadly destruction which war leaves us."

In addition to avoiding the option for substitution, Baker used another strategy to establish a feeling of fairness around the implementation of the Selective Service Act: local draft boards. According to the U.S. National Archives, which holds a collection of World War I draft registration cards, "The local boards were charged with the registration, determination of order and serial numbers, classification, call and entrainment of draftees." The serial numbers were printed on small pieces of paper and inserted into capsules.

 

This is when the draft process may have begun to resemble a state lottery. The capsules were placed into a large glass bowl and mixed thoroughly using a ladle. The gelatin capsules helped reduce the probability of disorder during selection.

On left, an American penny. On right, a small, red capsule. Pill-shaped. Appears glossy or maybe sticky. Very small in comparison to penny.

Small pill-shaped object and even smaller object next to it.

Small slip of paper with a number on it. In a gloved hand. The glove is black.

Black and white photo. In front of what appear to be tall storage cabinets with small, numbered drawers, a blindfolded man picks something out of a glass bowl. Other men watch.

On July 20, 1917, Baker was selected to draw the first capsule for the draft. He drew it at 9:30 a.m., and it held the number 258. The drawing would last into the early hours of the following morning; in this first drawing, 10,500 numbers were drawn.
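
The drawing amounts to sampling serial numbers without replacement. As a purely illustrative sketch (the 10,500 count matches the 1917 drawing, but the procedure and output below are a simulation, not the historical sequence):

```python
import random

# Illustrative simulation of a draft-style lottery: one capsule per serial number,
# the bowl is "mixed" by shuffling, and capsules are drawn without replacement.
# The total of 10,500 matches the 1917 drawing; the resulting order is random.
serial_numbers = list(range(1, 10_501))
random.shuffle(serial_numbers)  # mixing the capsules in the bowl

print("First capsule drawn holds number:", serial_numbers[0])
print("Total capsules drawn:", len(serial_numbers))
```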

Black and white photo of a glass bowl on a wooden table. It has high walls with no curve and a wide opening on top. Inside, a few small pill-shaped capsules.

 

Document written on typewriter. It describes each scene in a film. It begins with President Wilson speaking.

Typewritten document listing scenes in a movie titled "Made in America."

The Selective Service System is still with us today, as every male over the age of 18 can attest. In recent years, as women have officially been allowed in combat roles, Congress has debated including women in Selective Service, but no official change has been made to date.

Three capsules. Two are large, one red, one blue. One is small. It is dark red.

Annika Lundeberg completed a summer 2017 internship in the Division of Armed Forces History. She is a junior History and Nordic Studies major at St. Olaf College.

Learn more about the drawings in this digitized report.

Author(s): 
intern Annika Lundeberg
Posted Date: 
Wednesday, November 8, 2017 - 07:00


The World Won’t End in 2012

Smithsonian Magazine

Someone is always predicting the end of the world, it seems. The latest popular theory says that the world will end on December 21, 2012, when the Mayan calendar will reach the end of its 5,126-year cycle. That alone is fairly nuts, as USA Today wrote two years ago:

"For the ancient Maya, it was a huge celebration to make it to the end of a whole cycle," says Sandra Noble, executive director of the Foundation for the Advancement of Mesoamerican Studies in Crystal River, Fla. To render Dec. 21, 2012, as a doomsday or moment of cosmic shifting, she says, is "a complete fabrication and a chance for a lot of people to cash in."

But the theory has gotten even crazier since then, as astronomer Neil deGrasse Tyson notes in the video clip above. There are tales of an alignment between the Earth, Sun and the galaxy that will end in great catastrophe. There is Nibiru, a.k.a. Planet X, which will supposedly come close enough to Earth to knock the planet off its axis, with resulting calamity. (NASA has a great page debunking Nibiru.) And there's even more.

I had thought that end-of-the-world predictions and cults were a 20th-century invention until I read recently about some dating to the early 1800s. It doesn't matter that prediction of the future is impossible in an Einsteinian universe (that would be the one we live in). There will be people crazy enough to make this stuff up and others gullible enough to believe it. Don't be one of them.

So, when 12/21/12 comes along, don't despair. Instead, let's celebrate the end of the Mayan calendar cycle. Who's bringing the beer?

NASA Announces World's New Lightning Hotspot

Smithsonian Magazine

In 1997, NASA launched the Tropical Rainfall Measuring Mission Observatory, expecting the little satellite to last for three years. But the mission didn’t close up shop until 2015, providing researchers years of climatic data, including rainfall and more. Scientists have crunched the numbers from one little gadget aboard the satellite, the Lightning Imaging Sensor, and recently announced that the Earth has a new top location for lightning: Lake Maracaibo, at the foot of the Andes Mountains in northwest Venezuela.

Maracaibo unseats the Congo Basin as the planet’s flash center. According to a press release from NASA, Lake Maracaibo has been on their radar (literally) for years, but until now, no one had crunched the 16 years' worth of data.

According to the study, which will be published in the Bulletin of the American Meteorological Society, in a single year each square kilometer of Maracaibo experiences an average of 232.52 lightning flashes. The thunderstorms over the lake are so frequent that sailors in the Caribbean used the flashes as a lighthouse in colonial times, and according to a Spanish poem, the lightning once thwarted an attack by English pirates. Named for the river entering at the southwest edge of the lake, the storms are known locally as Catatumbo lightning, the Never-Ending Storm of Catatumbo, or the Lighthouse of Catatumbo, and they are so regular and spectacular that boats take tourists out to see them.

Why so much lightning? As cool breezes from the nearby mountains flow down the slopes of the Andes, they converge with the warm, moist lake air. This mingling sets off nocturnal thunderstorms roughly 297 nights per year, with a pyrotechnic peak in September.

The location of the lightning, however, is unexpected. “One of the most interesting aspects was to discover that the place with the most lightning on Earth is over water, and during the night,” lead author Dr. Rachel I. Albrecht of the University of São Paulo in Brazil tells the American Meteorological Society. This runs counter to the global trend: lightning most commonly occurs over land in the afternoon.

The study also reveals that, of the top 500 lightning hotspots, most are located in Africa, which hosts 283 sites. Asia claims second place with 87 sites, followed by South America with 67, North America with 53 and Oceania with 10. Six of the top ten spots are in Africa near Lake Victoria and other bodies of water in the East African Rift Valley, where climate patterns similar to Lake Maracaibo’s produce fantastic storms.
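
As a quick check on the arithmetic of that breakdown (counts as quoted above), the regional tallies do account for all 500 hotspots:

```python
# Continental breakdown of the top 500 lightning hotspots, as quoted above.
hotspots_by_region = {
    "Africa": 283,
    "Asia": 87,
    "South America": 67,
    "North America": 53,
    "Oceania": 10,
}

total = sum(hotspots_by_region.values())
assert total == 500, f"expected 500 hotspots, got {total}"

for region, count in hotspots_by_region.items():
    print(f"{region}: {count} sites ({count / total:.0%} of the top 500)")
```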

It’s unlikely Maracaibo will be unseated anytime soon, but researchers will continue to count flashes with the new Geostationary Lightning Mapper, which will fly aboard the GOES-R mission, a 20-year climate observation satellite scheduled to launch in October 2016.

Here are the world's top ten lightning hotspots, each listed with the average lightning flashes per square kilometer per year:

1 Lake Maracaibo, Venezuela, 232.52

2 Kabare, Dem. Rep. of Congo, 205.31

3 Kampene, Dem. Rep. of Congo, 176.71

4 Caceres, Colombia, 172.29

5 Sake, Dem. Rep. of Congo, 143.21

6 Dagar, Pakistan, 143.11

7 El Tarra, Colombia, 138.61

8 Nguti, Cameroon, 129.58

9 Butembo, Dem. Rep. of Congo, 129.50

10 Boende, Dem. Rep. of Congo, 127.52

Hybridization in large-bodied New World primates

Smithsonian Libraries
Well-documented cases of natural hybridization among primates are not common. In New World primates, natural hybridization has been reported only for small-bodied species, but no genotypic data have ever been gathered that confirm these reports. Here we present genetic evidence of hybridization of two large-bodied species of neotropical primates that diverged 3 MYA. We used species-diagnostic mitochondrial and microsatellite loci and the Y chromosome Sry gene to determine the hybrid status of 36 individuals collected from an area of sympatry in Tabasco, Mexico. Thirteen individuals were hybrids. We show that hybridization and subsequent backcrosses are directionally biased and that the only likely cross between parental species produces fertile hybrid females, but fails to produce viable or fertile males. This system can be used as a model to study gene interchange between primate species that have not achieved complete reproductive isolation.

Creating the Cadet Nurse Corps for World War II

National Museum of American History

“Wartime nursing is different,” The American Journal of Nursing soberly noted in 1943. As nurses well knew, wars always created a shortage of qualified nurses both on the home front and in the military. Recognizing that resolving these shortages would require “all the imagination and administrative skill” of their profession, American nurses began to discuss and debate how best to address the growing shortage of nurses even before the United States entered World War II.

Light blue and white striped uniform top with collar and four buttons, red details on shoulders.

As American nurses embarked upon this discussion, the federal government was initiating steps not only to “step up recruitment of student nurses” but also to “educate…and better prepare graduate nurses.” By 1943, the United States Public Health Service had already funneled $5.7 million into nursing education in an attempt to address what they believed would be a pending shortage of trained nurses. But this funding was, as Public Health Service officials knew, insufficient to address the problem.

Poster with image of young woman in military uniform

In an attempt to solve this problem once and for all, Frances Payne Bolton, a United States Representative from Ohio, called for an innovative program to resolve the nation’s shortage of nurses. Backed by over $150 million in federal funds, the Cadet Nurse Corps program was signed into law in 1943. Under this program, federal funds were used both to provide scholarships and stipends directly to students and to improve facilities at nursing schools, many of which had been deemed sub-standard. In a surprising twist in a nation that was still ruled by Jim Crow, dispersal of these funds was to be uniform, with funds being provided to all nursing students, regardless of their race or ethnicity, and to all nursing schools, including those that served primarily or even solely minority students.  

Following passage of the Bolton Act, a massive recruitment campaign was launched. Targeting women who were high-school graduates between the ages of 17 and 35, the campaign used ads, films, radio programs, billboards, and recruiting posters to encourage women to join the Cadet Nurse Corps. Recruitment materials underscored the benefits of the program: free tuition, coverage of book fees and uniform costs, and even a stipend to cover any ancillary costs. In exchange for this financial assistance, nursing students were required to complete their education in 30 months and to then work as civilian or military nurses throughout the duration of the war. The recruitment campaign was an unconditional success, with the program enrolling its target number of recruits each year it was in operation.

Two photos of details of the above uniform. Stripes. Badge with red circle and white cross with text "Cadet Nurse." Button with anchor and medical symbol.

Across the country, nursing schools underwent a radical transformation as federal funds helped schools update and modernize their equipment and facilities. Because nursing schools that served minority populations were more likely to have large numbers of students in need of financial assistance, and because these nursing schools were less likely to have a strong endowment that they could use to improve their facilities, the Cadet Nurse Corps program had an especially dramatic impact on minority access to nursing education. At some nursing schools, such as the Sage Memorial Nursing School, which served predominantly Navajo students, a significant number of students joined the Cadet Nurse Corps. Looking back at their experiences, the women in the Cadet Nurse Corps who studied at Sage remembered that the stipends they received from the government to study nursing “made them relatively rich in an area that was desperately poor.” Twenty-one African American nursing schools also benefited substantially from this program, as did 38 nursing programs that accepted both African American and white students.

Between 1943 and 1948, when the program was terminated, just over 124,000 women enrolled in the Cadet Nurse Corps program. For many of these women, the program helped propel them into a profession and into the American middle class. Nursing schools were also transformed as federal funds were used to build modern facilities and ensure that laboratory equipment was state of the art.

More broadly, the Cadet Nurse Corps program ensured that Americans, whether they were enrolled in the military or serving on the home front, had access to the nursing care that they needed throughout and after the war years.

Gray cap with no brim

Alexandra M. Lord, Ph.D., is chair of the History of Medicine and Science Division. She has also blogged about the history of measles. For National Nurses Week (May 6-12), you may want to read about a Civil War nurse in Washington, D.C., midwives on horseback, or stories from the frontline of a measles epidemic

Posted Date: 
Thursday, May 5, 2016 - 08:00

Who Will Save the World’s Chocolate?

Smithsonian Magazine

Climate change is threatening the world’s supply of chocolate, reports Grist’s Tove Danovich, and some of the largest food companies are banding together to protect its future.

During recent years, in the midst of a global chocolate shortage, rivals Mars Inc. and Hershey raced to decipher the cocoa plant's genetic code. Now, Mars is releasing its findings to the public—as well as its competitors. Genomic data may reveal how particular strains of cocoa resist disease and, possibly, survive a changing climate.

This kind of cooperation, or "pre-competitive research," is growing more common in the food business, Danovich writes. In 2011, Nestlé, Kellogg and several other large food companies worked with TI Food and Nutrition, a Dutch company, to study the microbes that foster gut health. Mars also partnered with dozens of groups to create a food safety center in Huairou, China, which opened in September. 

“It is our belief that it is in everyone’s best interest for the market to be safe,” Harold Schmitz, Mars’ chief science officer, tells Food Business News. So, what’s drawn these competing businesses together? As Danovich explains, problems with food safety can dissuade people from buying and weaken consumer trust in a particular type of food. When the whole system is safer, everyone stands to gain something—especially the manufacturers. 

Ultimately, the bottom line is what matters to these companies. Their collaborations, however, may mean that the world gets a long-lasting supply of chocolate. That’s good too: for the corporations that sell it, for the millions of farmers who grow it, and for anybody with a sweet tooth.

Drapery Study for Figure of "The World" in "Temptation of St. Anthony"

Cooper Hewitt, Smithsonian Design Museum
Standing female figure, facing right, with a bare right shoulder. Drapery shaded; figure lightly outlined.

This Is the World's Tallest Tropical Tree

Smithsonian Magazine

A yellow meranti in the Malaysian state of Sabah on the island of Borneo is now the world’s tallest tropical tree. Earlier this year, local climber Unding Jami of the Southeast Asia Rainforest Research Partnership made it to the top and dropped down a tape measure, confirming that the tree stretches nearly 330 feet from the ground to the top of its canopy.

“It was a scary climb, so windy, because the nearest trees are very distant. But honestly the view from the top was incredible. I don't know what to say other than it was very, very, very amazing,” Jami says in a press release.

The tree, named Menara, the Malay word for tower, weighs in at around 180,000 pounds, roughly the equivalent of a fully loaded Boeing 737-800. Just 5 percent of that mass is contained in its crown. The other 95 percent is found in its thick, straight trunk.

Researchers conducting Lidar surveys of the forests in the region had first spotted the tree in their scans. In August 2018, a team trekked to the site to collect 3D images and drone footage of the behemoth.

The scientists say that analysis of the tree’s structure indicates it could grow even bigger. But wind may be a constraint, so they doubt it or other trees will grow much taller. Still, John C. Cannon at Mongabay reports that Menara’s location is well suited to tall trees: the state of Sabah lies south of the typhoon belt, and its island location means it doesn’t get the massive, violent storms that form over larger landmasses.

It’s probable that if taller tropical trees are out there, they would be discovered in the same area, the Danum Valley, a conservation area where logging is prohibited and where the trees have some measure of protection.

Menara isn’t the first tree to hold the world’s tallest tropical title to come from Sabah. In 2016, the previous record holder, a 293.6-foot yellow meranti was measured in Sabah’s Maliau Basin Conservation Area. Prior to that, the record came from a yellow meranti in Sabah’s Tawau Hills National Park.

The record may be surpassed sooner than you think. Cannon at Mongabay reports that ecologist Greg Asner of Arizona State University, who found one of the previous tallest trees, has tweeted that he believes he’s discovered a monster meranti, though he has yet to confirm its height.

Which tree is the biggest is not what excites researchers the most. “It’s the science telling us these trees do exist, they are reaching heights we have perhaps never anticipated and there will be other tall trees out there that haven’t been discovered yet,” Doreen Boyd from the University of Nottingham, who led the Lidar study, says in an interview with the BBC. “It tells us that we do need to protect these trees.”

While yellow meranti trees do face pressure from loggers on the island of Borneo, the Forestry Department has extended protections in the Danum Valley. The state of Sabah, meanwhile, has pledged to protect 30 percent of its land area by 2025, most of which is covered by tropical forests.

In case you were wondering, the world’s tallest tree, Hyperion, was discovered in Redwood National Park in California in 2006 and is 379.7 feet tall.

How Humans Helped Ants Invade the World

Smithsonian Magazine

If you’ve never been stung by a fire ant, consider yourself lucky. Known for their fearlessness and painful, venom-laden butt pinches, these wee warriors can easily take down a chicken, kitten and occasionally even a human (usually by anaphylactic shock). It’s no wonder that the appearance of floating rafts teeming with these horrors was considered a “terrifying threat” to the Gulf Coast in the wake of Hurricane Cindy.

In addition to inciting fear, fire ants have also been particularly successful at spreading around the world. Since tropical fire ants rode Spanish trade ships to new continents in the 16th century, the tenacious critters have taken hold across the Southern United States and reached as far as Taiwan and Australia. And once they invade, they can significantly reshape their new environments—sometimes in catastrophic ways.

What qualities have made them so successful? That was the question that drove Cléo Bertelsmeier, an ecologist at the University of Lausanne in Switzerland, to chart the global spread of ants for a study published last week in the journal Nature Ecology and Evolution. Her study documents how the history of ant migration has largely been driven by waves of human globalization—and asks how we might be able to predict the next great ant invasion.

Ants are far more than just a nuisance for picnics and pantries, Bertelsmeier points out. "Invasive ants are really a huge problem for biodiversity," she says. Besides displacing native species, invasive ants can also cause harm by eating valuable agricultural crops, attacking people and even shorting out electrical systems.

"I think ants globally really are one of the bigger and more problematic invasive taxa," says Andrew Suarez, a University of Illinois at Urbana-Champaign entomologist who has long studied invasive ants. He points to aggressive and durable fire ants as a prime example of a harmful invasive ant genus. Their aggression in colonizing new areas and attacking rival insects helps them push out native insects and even nesting birds and reptiles.

While prior research has traced the paths of some invasive ant species, Bertelsmeier wanted to find out whether there was a pattern to when, and how extensively, certain ant species spread over time. She took to scouring various public databases covering the more than 13,000 known ant species for information on the 241 ant species that have been identified as "aliens," or introduced to environments they aren't native to.

Among those 241 species, Bertelsmeier categorized ants into four different groups based on how well they seemed to take to invading foreign environments. Some alien ant species had barely spread beyond their native ranges, while others had spread throughout a continent. A few ant species managed to gain footholds around the world in relatively low numbers. The final, most effective group—which includes fire ants—has been able to spread globally with verve.
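
A minimal sketch of that kind of bucketing, using invented species records and thresholds chosen purely for illustration (the paper's actual criteria are not reproduced here):

```python
# Hypothetical illustration of sorting alien ant species into four spread groups.
# Species records and thresholds are invented for the example, not taken from the study.

def spread_category(native_region, regions_recorded):
    """Bucket a species by how far beyond its native region it has been recorded."""
    alien = [r for r in regions_recorded if r != native_region]
    if not alien:
        return "barely spread beyond native range"
    continents = {region.split("/")[0] for region in alien}
    if len(continents) == 1:
        return "spread within one continent"
    if len(alien) <= 5:
        return "scattered global footholds"
    return "widespread global invader"

records = {
    "Solenopsis invicta (fire ant)": (
        "South America/Argentina",
        ["South America/Argentina", "North America/USA", "North America/Mexico",
         "Asia/Taiwan", "Asia/China", "Asia/Japan", "Oceania/Australia"],
    ),
    "hypothetical species A": ("Africa/Ghana", ["Africa/Ghana", "Africa/Kenya"]),
}

for species, (native, recorded) in records.items():
    print(species, "->", spread_category(native, recorded))
```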

Bertelsmeier was able to identify a handful of traits that were associated most strongly with ants that were exceptional invaders. Those included body size, number of queens, how their colonies are organized and other traits.

It turns out that the best invaders tend to be smaller ant species, with multiple queens who bring worker ants along with them to found new colonies instead of going it alone. Other helpful factors include the ability to settle in ecologically disturbed habitats—often those that have been shaped by humans—and the ability to build new nests in many different kinds of environments. Cooperation, hardiness and versatility: these are the traits that help make groups like fire ants and Argentine ants ruthless invaders.

For the 36 species that she managed to find enough historical data on, Bertelsmeier was also able to track when exactly these alien species typically spread. Unsurprisingly to her, the ant invasions of the last 200 years correlated with the two peaks of human globalization, from the Industrial Revolution and age of European colonization to the Great Depression, and then the global post-war boom starting in the mid-20th century until today. Wherever people went, it seemed, ants followed.

"Human activities have left a fingerprint on the distribution of these alien species," Bertelsmeier says.

"I this is a pretty amazing study," says Suarez, who was not involved in the research. He is particularly impressed, he says, by the amount of data Bertelsmeier was able to collect for the study by scouring public databases and collecting data from many separate studies done over time, and sees it as a useful resource for future research on invasive ants worldwide. "That's something that people have been trying to do for a long time."

Next, Bertelsmeier plans to focus on different countries that have harbored invasive ants and those that haven't, to see what factors make one place more appealing than another. Meanwhile, Suarez says he hopes to see more research expanding on this study that could help scientists predict which ant species are most at risk of causing harm as invaders, and how likely they are to spread in the first place.

In the meantime, if you see a floating raft of fire ants, run far, far away.

A Minecraft World Built for Exploring Chemicals

Smithsonian Magazine

Toss out those ball and stick models of chemical compounds and grab your game controller: Chemistry has finally entered the realm of gaming. Thanks to chemistry students at the United Kingdom’s University of Hull and the Royal Society of Chemistry, now there’s an entire Minecraft world devoted to teaching the basics of biochemistry.

“I got tinkering and started to think of ways of incorporating chemical structures into Minecraft,” MolCraft project leader Mark Lorch tells Emiko Jozuka for Motherboard.  

When you load up MolCraft (Molecules in Minecraft), you appear in a central hall adorned with teleporters that take you to rooms filled with enormous 3D models of chemicals like myoglobin (a protein found in muscle tissue) and asparagine (one of the most common amino acids on Earth). From there, you can fly around the structure and look at it from any angle, whether it’s taking in the entire chemical or zooming up close for a look at the bonds that hold its atoms together.

But there is more to the game than just zipping around different molecules: MolCraft is centered on a scavenger hunt. Treasure chests are scattered throughout the game, holding everything from short quizzes to iron swords that mark a student's progress through the game—like the discovery of the iron atom hiding in the heart of the twisting ribbons that make up myoglobin.

The scavenger hunt might one day be a way to measure students' progress in the game, allowing teachers to judge whether their students explored enough of the game’s nooks and crannies, Alexandra Ossola reports for Popular Science.

As Lorch and Joel Mills, another of MolCraft’s project leaders, write for The Conversation:

You don’t have to be interested in biochemistry and its implications to appreciate that proteins are beautiful wonders of nature, just as you can appreciate the elegant design of a car without knowing how it works. The difference is that you can see wonderfully designed cars all the time. But where could you marvel at the structure of a protein?

MolCraft isn’t the first Minecraft world built with the classroom in mind: others have recreated medieval villages and designed geological maps of the U.K. Players have even built fully operational hard drives in the video game world (one can even store up to one kilobyte of data).

Right now, Lorch and Mills are trying to get schools in the U.K. to integrate MolCraft into their studies. But in the meantime, anyone interested in taking a spin around MolCraft can download it for free, play online at the University of Hull's server, or even make their own MolCraft-inspired Minecraft worlds using downloaded models from the game. 

The Most Loved and Hated Novel About World War I

Smithsonian Magazine

On December 5, 1930, just over 12 years after the end of World War I, German moviegoers flocked to Berlin’s Mozart Hall to see one of Hollywood’s latest films. But during the movie, a cadre of 150 Nazi Brownshirts, nearly all too young to have fought in World War I, were led into the theater by propagandist Joseph Goebbels. Spewing anti-Semitic invective at the screen, they repeatedly shouted “Judenfilm!” as they tossed stink bombs from the balcony, threw sneezing powder in the air, and released white mice into the theater. A somewhat shocking turn of events, considering the movie was the highly anticipated adaptation of countryman Erich Maria Remarque’s novel All Quiet on the Western Front, the blockbuster novel that had transfixed the nation months earlier.

First serialized in 1928 in the German newspaper Vossische Zeitung, the book was published on January 31, 1929, and instantly became a literary juggernaut. In Germany, the initial print run sold out on release day, and some 20,000 copies moved off the shelves in the first few weeks on its way to more than a million books sold by year’s end. Abroad, All Quiet on the Western Front was a big hit as well, selling 600,000 copies in both Britain and France, and 200,000 in America. The film rights were snatched up by Universal Pictures for a record $40,000 and the motion picture went into production immediately.

All Quiet on the Western Front is, as most American high school students know, the story of a company of volunteer German soldiers stationed behind the front lines in the last weeks of World War I. Based on Remarque’s time as an infantryman, it’s the first-person account of Paul Baumer, who joins the cause with a group of his classmates.

It’s a gritty, pull-no-punches look at the horrors of war. Limbs are lost, horses are destroyed, starving soldiers root through garbage for food, the troops are ravaged by poison gas and artillery bombs, and few make it out alive. Baumer himself dies on a tranquil day shortly before the Armistice is signed. Apolitical in terms of policy and strategy, Remarque’s anti-war masterpiece tapped into the global sorrow following a conflict that led to more than 37 million casualties between 1914 and 1918. The humanity of All Quiet on the Western Front was captured in The New York Times review as “a document of men who—however else their lives were disrupted—could endure war simply as war.”

Joseph Goebbels was the Minister of Propaganda in Nazi Germany from 1933 to 1945. (Wikimedia Commons)

Ironically, it was this very humanity, and relentless political agnosticism, that made Goebbels see the All Quiet on the Western Front film as a threat to the Nazi ideology. A few weeks prior to the December screening, the National Socialist German Workers’ Party surprised the nation on election day, garnering 6.4 million votes, 18 percent of the total. It was a stunning victory for Adolf Hitler that gave his party 107 seats in the Reichstag and made the Nazis the second-largest political party in Germany. His leading campaign message, to unite Germany and make it strong again, resonated with voters in the midst of the Great Depression. Hitler, believing that treasonous Jewish-Marxist revolutionaries at home were to blame for Germany’s defeat in the Great War, proposed tearing up the Treaty of Versailles and ending war reparations to the Allies. This “stab-in-the-back” theory was historical nonsense, but it allowed workaday Germans to place blame elsewhere for the conflict that took an estimated 3 million lives, military and civilian—an easy sell that undermined the Weimar Republic.

All Quiet on the Western Front may have been the first runaway international bestseller, but its utter lack of pro-German propaganda and honest, downbeat look at war made the book a Nazi target. As Hitler’s power grew, Remarque’s critically acclaimed novel (which would be nominated for the Nobel Peace Prize in 1931) became a proxy for Nazi rage over its portrayal of German infantrymen as dispirited and disillusioned. Hitler refused to believe Teutonic soldiers could be anything but a magnificent fighting force, a nationalistic historical rewrite that took hold amongst the battered German citizenry. 

“One of the great legacies of World War I is that as soon as the Armistice is signed, the enemy is war itself, not the Germans, Russians, or French. The book captures it and becomes the definitive anti-war statement of the Great War,” says Dr. Thomas Doherty, professor of American Studies at Brandeis and the author of Hollywood and Hitler, 1933-39. “The movie has the same depressing tone, the hero doesn’t achieve battlefield glory. He dies in the famous scene reaching for the butterfly. It’s an extraordinary film, the first must-see of the early sound era not starring Al Jolson. Unfortunately, the premiere was an animating moment in the history of Nazism, reclaiming the World War I memory not as meaningless slaughter, as Remarque says, but as a glorious noble German enterprise.”

Image by © John Springer Collection/Corbis. Sick and injured soldiers are cared for in a church in a scene from the 1930 film All Quiet on the Western Front. (original image)

Image by © John Springer Collection/Corbis. Soldiers take refuge in trenches in a movie scene. (original image)

Image by © John Springer Collection/Corbis. Paul Baumer (played by Lew Ayres) is assisted by fellow soldiers after being wounded. (original image)

The $1.25-million film had actually quietly debuted in Germany on December 4 under heavy police presence. According to a Variety reporter, when the lights came up, the audience was too rattled or moved to disapprove or applaud. However, Goebbels correctly guessed that the theater would let its guard down during the December 5 showing. His surprise mob attack went far beyond the realm of boyhood fraternity pranks like mice and sneezing powder. The projectors were shut down and in the chaos, savage beatings were handed down to moviegoers believed to be Jewish. (Also in attendance: future Nazi filmmaker—and occasional drinking buddy/confidant of Remarque—Leni Riefenstahl.)

Goebbels, a tiny man with a clubfoot, had been unfit to fight in World War I and his physical rejection consumed him. His hatred of All Quiet on the Western Front was both a personal vendetta and one of the first major public displays of Nazi thuggery. The main goal was simply to create chaos, to terrorize moviegoers, to rally support against the film. “Within ten minutes, the cinema was a madhouse,” Goebbels gloated in his diary that night. “The police are powerless. The embittered masses are violently against the Jews.”

Goebbels would lead torch-wielding hooligans for the next few days as other riots broke out. In Vienna, 1,500 police surrounded the Apollo Theater and withstood a mob of several thousand Nazis trying to disrupt the movie, but vandalism and violence still erupted in the streets. Other disturbances, like one on December 9 in Berlin’s West End district, were more subdued. The New York Times described it as “fairly polite rioting, the sort one could take one’s best girl to see”—scary only in that it proved others were heeding the Nazi call.

Carl Laemmle, president of Universal Studios, and Erich Maria Remarque, at a Berlin Hotel in 1930. (© Hulton-Deutsch Collection/Corbis)

By week’s end, the Supreme Board of Censors in Germany had reversed its original decision and banned All Quiet on the Western Front, even though Universal Pictures had already revised the film, sanitizing the trench warfare scenes and removing dialogue blaming the Kaiser for the war. Universal founder Carl Laemmle, a Jewish emigre from Germany, was shocked at the movie’s controversial reception. He sent a cable to Berlin newspapers, which ran as an ad, basically saying that the film was not anti-German and that it portrayed a universal war experience. (His point was made in Poland, where All Quiet on the Western Front was banned for being pro-German.) Laemmle’s efforts were fruitless; the Nazi intimidation tactics worked. Perhaps the most insidious part of the damage done was emboldening the Brownshirts to go after people where they lived. As Doherty eloquently puts it in his book:

“Whether in the cathedral-like expanse of a grand motion picture palace or a cozy seat at the neighborhood Bijou, the movie theater was a privileged zone of safety and fantasy—a place to escape, to dream, to float free from the worries of the world beyond the Art Deco lobby, a world that, in the first cold winter of the Great Depression, was harder and harder to keep at bay. All the more reason to view the Nazi-instigated violence as the desecration of a sacred space.” 

Throughout, Remarque stayed relatively quiet, a habit he would later come to regret. He’d been recruited by Laemmle to write the screenplay, and as the legend goes, to play Baumer, but neither came to fruition. In his biography The Last Romantic, author Hilton Tims says Remarque was visited by a Nazi emissary prior to the premiere, who asked him to confirm that the publishers had sold the film rights without his consent. The idea was he’d been swindled by Jews, which Goebbels could use as propaganda, in exchange for protection from the Nazis. Remarque declined.

Nazis salute their leader in Berlin's Opera Plaza during a book burning on May 10, 1933, in which some 25,000 volumes were reduced to ashes. (National Archives and Records Administration)

On the night of May 10, 1933, four months after the Nazis had come to power in Germany, Nazis raided bookstores and libraries, stampeding by torchlight to ritually hurl the books of more than 150 authors onto flaming pyres of gas-soaked logs. Students screamed into the night, condemning each writer as some 25,000 books were incinerated. Goebbels would call it “the cleansing of the German spirit.”

Remarque, neither Communist nor Jew, had been in Berlin on January 31, 1933, the day Hitler was appointed chancellor. He was tipped off that the Nazis were gunning for him and drove through the darkness to escape. On that May evening, Remarque was ensconced in his palatial Swiss home. By year’s end, the Nazis had made it a crime to own All Quiet on the Western Front or its sequel-of-a-sort, The Road Back. All private copies had to be turned over to the Gestapo.

Remarque would finish his trilogy with Three Comrades, the tale of three German soldiers who open an auto body shop and all fall for the same dying woman. Like The Road Back, it sold well and was adapted into a milquetoast film, albeit the only movie with F. Scott Fitzgerald credited as a screenwriter. Concerned about his safety in Switzerland, Remarque sailed to America in 1939, where he was reunited with one of his many paramours, an actress he had met in the South of France, Marlene Dietrich. Although married, for the second time, to the dancer and actress Jutta Ilse Zambona, Remarque had countless affairs. From barmaids and prostitutes to Hollywood royalty like Greta Garbo, Hedy Lamarr, Luise Rainer and Maureen O’Sullivan (long rumored to have aborted his only child), Remarque had an insatiable sexual appetite.

As World War II raged on, Remarque lived the high life, unaware of his family’s tragic suffering. His brother-in-law became a prisoner of war, and his father’s second wife committed suicide, but it was what befell his youngest sister that haunted Remarque for the rest of his life. In September 1943, Elfriede, a fashionable dressmaker living in Dresden, was turned in by her landlady and arrested by the Gestapo for “defeatist talk” and “subversion of military strength.” She was sentenced to death in a sham trial “as a dishonorable subversive propagandist for our enemies.” On December 12, Elfriede was beheaded by guillotine.

Records of the judge’s summation at trial were destroyed in an air raid during Elfriede’s incarceration, but according to Tims, in pronouncing the sentence the judge allegedly declared: “We have sentenced you to death because we cannot apprehend your brother. You must suffer for your brother.” Remarque would dedicate his 1952 novel Spark of Life to Elfriede; in a final twist of the knife, the dedication was omitted from the German edition, a snub chalked up to those who still saw him as a traitor.

As for the book and film that started his career and ended his relationship with his native country, they went on to be stunning successes. An estimated 30 to 40 million copies of All Quiet on the Western Front have been sold since it was first published in 1929, and the film won Academy Awards for Best Director and Best Production. It is still regarded as one of the best war movies ever made.

The World Hit "Peak Chicken" in 2006

Smithsonian Magazine

The world may not be as close to peak oil as once believed, but peak food, it seems, has already passed.

Energy experts warned in the late 20th century that the world would soon use up its supply of oil, and that production rates were about to plateau. That gloomy prophecy fell flat when oil production accelerated in the last decade, buying us a sort of contract extension on our energy use habits. However, according to research recently published in Ecology and Society, production of the world’s most important food sources has maxed out and could begin dropping—even as the Earth’s human population continues to grow.  

Ralf Seppelt, a scientist with the Helmholtz Centre for Environmental Research in Germany, and several colleagues looked at production rates for 27 renewable and nonrenewable resources. They used data collected from several international organizations, including the Food and Agriculture Organization and the International Union for Conservation of Nature, and analyzed yield rates and totals over a period of time—from 1961 to about 2010 in most cases. For renewable resources like crops and livestock, the team identified peak production as the point when acceleration in gains maxed out and was followed by a clear deceleration.
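The method can be pictured with a small back-of-the-envelope calculation: treat each crop’s annual production as a time series, look at the year-over-year gains, and call the “peak” the year in which those gains are largest before they begin a sustained slide. The Python sketch below is only an illustration of that idea with invented numbers, not the authors’ actual analysis or data.

```python
# Minimal sketch of the "peak production" idea described above:
# production keeps rising, but the year-over-year gain tops out and
# then shrinks. Numbers are invented for illustration; this is not
# the Seppelt et al. analysis or data.

def peak_year(years, production):
    """Return (year, gain) for the largest year-over-year gain."""
    gains = [production[i + 1] - production[i] for i in range(len(production) - 1)]
    best = max(range(len(gains)), key=lambda i: gains[i])
    return years[best + 1], gains[best]

# Hypothetical yield series (million tonnes): still growing, but decelerating.
years = list(range(1961, 1971))
production = [50, 54, 59, 65, 72, 80, 86, 91, 95, 98]

year, gain = peak_year(years, production)
print(f"Largest annual gain ({gain} million tonnes) came in {year}")
```

Run on the invented series above, the largest gain lands in 1966; every gain after that year is smaller, even though total production never falls.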

While annual production is still increasing in all the food resources analyzed—except for wild-caught fish—the rate of acceleration for most of them has been slowing for at least several years. The research team concluded that peak production of the world’s most important crops and livestock products came and went between 5 and 30 years ago. For instance, peak corn came in 1985, peak rice in 1988, peak poultry eggs in 1993, and peak milk and peak wheat both in 2004. The world saw peak cassava and peak chicken in 2006 and peak soy in 2009. This trajectory is troubling, because it means production will eventually plateau and, in some cases, even start to decline.

“Just nine or ten plant species feed the world,” says Seppelt. “But we found there’s a peak for all these resources. Even renewable resources won’t last forever.” While fertilizing soils can help maintain high yields, peak nitrogen—an important fertilizer—occurred in 1983, the study says.

Converting forest, prairie and marsh into farmland may be partially offsetting the per-acre productivity decline in many crops—though this process cannot go on forever. Seppelt and his colleagues found that acceleration of farmland conversion peaked in 1950. What's more, trees support biodiversity and serve as a sponge for atmospheric carbon, so losing more of the world’s forests to agriculture would be a global disaster.

The world reached peak wheat in 2004—just seven years before the global population hit 7 billion. (TODD KOROL/Reuters/Corbis)

All this might not be a problem if the human population were also stabilizing. Though the population growth rate has already peaked, growth is not decelerating especially fast, and by 2050 there will probably be 9 billion of us and counting. Compounding the increased numbers is the fact that Asian population giants China and India are adopting diets heavier in meat—like the one that the western world has enjoyed for decades.

“It’s a bizarre and uncomfortable place to be in as an American, saying, 'If everyone acted like us, we’d all be screwed,'” says Jonathan Foley, director of the California Academy of Sciences. The trouble is that for every pound of beef produced, a cow may have eaten many pounds of nutritious grain and legumes. Other livestock species are more efficient at converting energy into flesh, but raising animals for meat or dairy is generally far more resource-intensive than growing crops for direct human use.

“[U]sing highly productive cropland to produce animal feed, no matter how efficiently, represents a net drain on the world’s potential food supply,” Foley wrote in a paper published in Nature in 2011. Almost four years later, he still believes that future food security will depend largely on a reduction of global meat consumption. Foley has calculated that the Earth would need to produce two times the food it does now to support projected future consumption rates—something that may be impossible, given the results of the study by Seppelt and his colleagues.

“That trajectory [of needing to double our food production] is not a given but more of a warning,” he says. In a way, Foley says, this is good news: “It means we will have to change how we eat and use food.” One of the biggest—and perhaps easiest—gaps to close in food production is in the waste stream. Foley notes that 30 to 40 percent of food grown globally for direct human consumption winds up uneaten. In developing nations, he says, this waste tends to occur before food reaches the retail market and could be addressed with improvements to local harvest and transport systems. In developed nations, waste tends to occur after food reaches consumers. Addressing this is largely a matter of individual awareness, says Foley. He points out that a great deal of packaged food is discarded because it has passed the sell-by date, which is not a reliable indicator of spoilage.

While Seppelt recognizes that the peak oil crisis never panned out, he agrees that deferring peak food production may not be possible: “For food production there are less options for increasing efficiency,” he says. “We don’t believe peak production can be shifted into the future.” Instead the best chance of increasing yields is looking for regions and crops that have not yet been pushed to their limits. 

Prawn farms have been carved out of coastal mangrove forests in Borneo. (Frans Lanting/Corbis)

One food source that has not yet peaked is aquaculture, or the farming of fish and shellfish. Yield gains are still accelerating, though the environmental costs of the global aquaculture industry could be huge if major farms continue to operate as they do today. Tropical shrimp production has been implicated in severe watershed pollution and coastal wetland destruction. In colder waters, salmon farms—mostly in Chile, northern Europe and Canada—also cause waste problems and have dented the numbers of local wild fish. Fish farms also rely on intensive harvest of feed fish, like sardines and anchovies, to grow captive species like salmon, yellowtail and tuna. Not only is this use of one edible resource to produce another considered wasteful, but some fear it could cause a collapse of feed fish populations. This, in turn, might mean the end of many aquaculture operations.   

Casson Trenor, a sustainable fisheries proponent and author in San Francisco, argues that the world’s wealthier people must eat less fish and literally share the ocean’s protein sources with the poor. He says 1.5 billion impoverished people who depend on seafood don't have any alternatives.

“These people are going to get hit first [when wild seafood supplies run short], and it’s not like they can just go to the store and buy beef instead,” Trenor says. He expects world protein shortages could spur desperation and violence. “It’s hard to maintain a peaceful society when there isn’t enough food to go around,” he says.

Foley foresees similar unrest. “But we probably won’t feel the impacts in the U.S.,” he says. “We tend to be pretty immune to instability [in the food economy].” He expects that food shortages and riots in poorer nations will be a part of the transformation process as the globe shifts to a more sustainable diet. 

How Mastiffs Became the World’s Top Dogs

Smithsonian Magazine

With its shaggy ruff and enormous stature, the mastiff is the most adorable giant to thrive in the thin air of the Tibetan Plateau, where the average elevation is around 15,000 feet. But just how did the dogs get so good at mountain living? It appears they got help from their cousins.

Usually it takes a long while for an animal to evolve the capacity to live in a hostile new environment. But mastiffs in the lowlands of China made a sudden transition to the plateau, says geneticist Zhen Wang at the Shanghai Institutes for Biological Sciences. Unlike yaks and snow leopards, which gradually made their home at high elevation over tens of thousands of years, the mastiffs made huge adaptive strides all at once. Wang suspected the dogs had found an evolutionary shortcut by breeding with another, better-suited canine species, a phenomenon called adaptive introgression.

To test his theory, Wang analyzed Tibetan mastiff genes, searching for ones that are associated with high-altitude success but are normally absent in mastiffs living closer to sea level. He and his colleagues also checked the genomes of 49 other canids known to live near the plateau, including wolves, dogs and jackals. The scientists found special versions of two genes that could confer a high-altitude edge and were shared exclusively by Tibetan mastiffs and grey wolves.

Both of the gene varieties work in tandem to cope with low oxygen levels. Typically, when an animal travels to high altitude, its body almost immediately begins to produce extra hemoglobin—the protein in red blood cells that carries oxygen. But that change thickens the blood, increasing the risk of clots and stroke in the long run. One of the special traits pinpointed by the researchers is a novel version of a gene called HBB that boosts the ability of hemoglobin to carry oxygen, making it more efficient. The other is a variation of a gene called EPAS1 that spurs blood vessel growth even as it puts a brake on overall hemoglobin concentration, preventing the body from cranking out dangerous amounts of it in response to low oxygen.

As recently as 24,000 years ago, the mastiffs of the Tibetan highlands bred with grey wolves, animals that were already well adapted to that demanding environment. The implications of the study, Wang says, might surprise Darwin, because the work shows that survival of the fittest sometimes means borrowing a gene or two from another species.

Galaxy of Knowledge

SI Center for Learning and Digital Access
Gateway to Smithsonian Institution Libraries' online collection. Collections are divided thematically: American Discovery, Art and Design, Industry and Technology, and Mosaic of Science. Includes lectures, podcasts, digital versions of rare books, bibliographies, online collections, and online exhibits on a wide range of topics.

Voyages

SI Center for Learning and Digital Access
Online exhibit recording journeys to new places, mental explorations, and new creative thoughts. Background information and items from the Smithsonian collection document physical journeys as seen by early explorers, journeys of the mind where scientists expand our conception of the universe, and journeys of imagination led by artists, writers, and artisans.

Biomedical Science Studies Are Shockingly Hard to Reproduce

Smithsonian Magazine

It's hard to argue against the power of science. From studies that evaluate the latest dietary trend to experiments that illuminate predictors of happiness, people have come to increasingly look at scientific results as concrete, reliable facts that can govern how we think and act.

But over the past several years, a growing contingent of scientists has begun to question the accepted veracity of published research—even after it's cleared the hurdles of peer review and appears in widely respected journals. The problem is a pervasive inability to replicate a large proportion of the results across numerous disciplines.

In 2005, for instance, John Ioannidis, a professor of medicine at Stanford University, used several simulations to show that scientific claims are more likely to be false than true. And this past summer Brian Nosek, a professor of psychology at the University of Virginia, attempted to replicate the findings of 100 psychology studies and found that only 39 percent of the results held up under rigorous re-testing.

“There are multiple lines of evidence, both theoretical and empirical, that have begun to bring the reproducibility of a substantial segment of scientific literature into question,” says Ioannidis. “We are getting millions of papers that go nowhere.”

These preliminary findings have spawned the creation of an entirely new field called meta-research—the scientific study of science.

This week, the biology arm of the Public Library of Science (PLOS), a nonprofit publisher and advocacy organization, launched a new section solely dedicated to meta-research. The section will explore issues such as transparency in research, methodological standards, sources of bias, data sharing, funding and incentive structures.

To kick things off, Ioannidis and his colleagues evaluated a random sample of 441 biomedical articles published between 2000 and 2014. They checked whether these papers provided public access to raw data and experimental protocols, were replicated in subsequent studies, had their results integrated into systematic reviews of a subject area and included documentation of funding sources and other potential conflicts of interest.

Their results were troubling to say the least. For instance, only one study provided full experimental protocols, and zero studies provided directly available raw data.

“These are two basic pillars of reproducibility,” says Ioannidis. “Unless data and the full protocol are available, one cannot really reproduce anything.” After all, without that key information, how can another team know exactly what to do and how their results differ from those in the original experiment?

The team also found that the claims of just eight of the surveyed articles were later confirmed by subsequent studies. And even though many of the studies claimed to have novel findings, the results of only 16 articles were included in later review articles, which serve as a litmus test for the true impact of a study on a particular subject.

“The numbers that we get are pretty scary," says Ioannidis. “But you can see that as a baseline of where we are now, and there is plenty of room for improvement.”

However, not all the results were discouraging. The percentage of articles without a conflict of interest statement decreased from 94.4 percent in 2000 to 34.6 percent in 2014—likely a result of a growing awareness of the pernicious effects of bias on research outcomes.

In a second meta-research study, a German team analyzed how the loss of animal subjects during pre-clinical trials might contribute to the widespread inability to translate laboratory findings into useful clinical drugs.

Research animals might vanish from a study randomly—for instance, because the animal died—or through subtly biased actions, like being removed from the trial to eliminate data that undermines the expected results. The team demonstrated that the biased removal of animal subjects can skew results and significantly increase the likelihood of a false positive—when a new drug is thought to work but actually does not.
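To see why that kind of selective attrition matters, consider a toy simulation, not drawn from the German team’s paper: a drug with no real effect is compared with a control group, and in some runs the two worst-responding treated animals are quietly dropped. All parameters and the cutoff below are illustrative assumptions.

```python
# Toy Monte Carlo showing how biased removal of animal subjects can
# inflate apparent effects of a drug that does nothing. Illustrative
# sketch only; parameters and cutoff are invented, and this is not the
# model used in the study described above.
import random
import statistics

def trial(n=10, drop_worst_treated=0):
    """One trial of a no-effect drug; returns treated-minus-control mean difference."""
    control = [random.gauss(0, 1) for _ in range(n)]
    treated = [random.gauss(0, 1) for _ in range(n)]
    if drop_worst_treated:
        # Biased attrition: quietly discard the lowest-responding treated animals.
        treated = sorted(treated)[drop_worst_treated:]
    return statistics.mean(treated) - statistics.mean(control)

random.seed(1)
fair = [trial() for _ in range(5000)]
biased = [trial(drop_worst_treated=2) for _ in range(5000)]

cutoff = 0.8  # arbitrary "the drug looks like it works" threshold
rate = lambda diffs: sum(d > cutoff for d in diffs) / len(diffs)
print(f"apparent-effect rate without biased attrition: {rate(fair):.1%}")
print(f"apparent-effect rate with biased attrition:    {rate(biased):.1%}")
```

Dropping just two low responders per trial shifts the treated group’s average upward, so the share of runs that cross the cutoff rises even though the drug has no effect at all—the false-positive inflation the researchers describe.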

In a separate analysis of pre-clinical studies on stroke and cancer, the same researchers found that most papers did not adequately report the loss of animal subjects, and that the positive effects of many drugs being tested may be greatly overestimated.

So why is this crisis in transparency and reproducibility happening in the first place?

While some issues may lie in conscious or unconscious research biases, it's likely that most studies that reach publication are one of a kind due to the current incentive structure in science.  

In the cutthroat world of academia, the primary measure of success is the number of studies a researcher gets in prestigious journals. As a result, scientists are under pressure to spend the majority of their time obtaining the kinds of breakthrough results that are most likely to get published.

“While we value reproducibility in concept, we don't really value it in practice,” says Nosek, who is also co-director of the Center for Open Science, a nonprofit technology startup that works to foster transparency and reproducibility in scientific research.

“The real incentives driving my behavior as a scientist are to innovate, make new discoveries and break new ground—not to repeat what others have done. That's the boring part of science.”

Scientists also see few incentives to provide the information necessary for others to replicate their work, which is one of the primary reasons why the claims of so many studies remain unverified.

“I am not rewarded for making my data available or spelling out my methodology in any more depth than what is required to get into a publication,” says Nosek.

Many journals do ask scientists to provide a detailed explanation of their methods and to share data, but these policies are rarely enforced and there are no universal publication standards.

“If I knew there were never going to be any cops on the roads, would I always stick to the speed limit? No—it's human nature,” says Ivan Oransky, co-founder of Retraction Watch, an organization that promotes accountability and transparency by tracking retractions in scientific literature. “If you know nobody is going to sanction you, then you are not going to share data.”

Those scientists who want to conduct replication work and are able to obtain experimental details are then unlikely to find funding from public agencies like the NIH, which primarily judge grant applications on novelty and innovation.

“The odds are clearly against replication,” says Ioannidis.

That's where the emerging field of meta-research can step in. Organizations like the Center for Open Science and the Meta-Research Innovation Center at Stanford (METRICS) are working to help realign the reward system and set stringent universal standards that will encourage more widespread transparency and reproducibility practices.

“If the funding levels or promotion depended on what happened to your prior research—if it was replicable, if people could make sense of it, if people could translate it to something useful rather than just how many papers did you publish—that would be a very strong incentive toward changing research to become more reproducible,” says Ioannidis, who is co-director of METRICS.

“I am hopeful that these indicators will improve,” he adds. “And for some of them, there is no other possibility but to go up, because we start from zero.”
