Will We Ever Know Why Nazi Leader Rudolf Hess Flew to Scotland in the Middle of World War II?

Smithsonian Magazine

On the night of May 10, 1941, a Scottish farmer named David McLean found a German Messerschmitt airplane ablaze in his field and a parachutist who identified himself as Captain Alfred Horn. McLean's mum was soon serving him a cup of tea by the cottage fireside, but their surprise guest was no ordinary Luftwaffe pilot. Incredibly, he was Rudolf Hess, a longtime Hitler loyalist, to say the least. Hess joined the Nazi Party in 1920, stood with his friend Adolf Hitler at the Beer Hall Putsch, and served in Landsberg prison—where he took dictation for much of Mein Kampf. As deputy Fuhrer, Hess was positioned behind only Hermann Goering in the succession hierarchy of the Nazi regime that had Europe firmly under the heel of its jackboot.

Hess's appearance on Scottish soil, a self-described mission of peace just weeks before Hitler would launch his ill-fated invasion of the Soviet Union, was one of the war's strangest incidents. The search for explanations began on the morning after and has rolled on now for 75 years, spawning theories both intriguing (World War II might have ended differently) and bizarre (the man wasn't Hess at all but a body double). The truth is likely as interesting as any of the fantasies—but it's still not entirely certain what happened 75 years ago.

Image by Wikimedia Commons. The fuselage from Hess' plane, now on view at the Imperial War Museum

Image by Wikimedia Commons. A photo of Hess' plane where it crashed in Scotland

The Hess flight was remarkable in itself. He left an airfield near Munich in a small Messerschmitt fighter-bomber a little before 6 p.m., flying up the Rhine and across the North Sea. Hess displayed considerable skill by navigating such a course alone, using only charts and maps, on a foggy dark night over largely unfamiliar terrain—all while avoiding being shot down by British air defenses. By 10:30, Hess was over Scotland, out of fuel, and forced to bail out just 12 miles from his destination.

That unlikely site was Dungavel House, home of the Duke of Hamilton. Hess hoped to make contact with one of the highly placed British figures who, unlike Churchill, were willing to make peace with the Nazis on Hitler's terms. Hess believed that Hamilton headed a faction of such people and immediately asked his captors to be taken to him. But Hess was misinformed. Hamilton, who wasn't home that night but on duty commanding an RAF air base, was committed to his country and to its fight against Germany.    

The unlikely envoy's mission quickly took a turn for the worse. When Hess was granted a meeting with Hamilton the next day, his pleas fell on deaf ears. Worse for Hess, he denied from the start that Hitler knew anything of his mission, which meant that the British afforded him none of the diplomatic respect to which he thought he'd be entitled. Instead he was imprisoned, and by the night of June 16, the obvious failure of his mission left Hess so mentally shattered that he attempted suicide by hurling himself down a flight of stairs.

Hess spent the war in British hands, confined in various locales including (briefly) the Tower of London and a military hospital at which he was even allowed guarded drives in the country. He was visited frequently by intelligence officers eager for secrets and by psychiatrists eager to plumb the Nazi mind—which in Hess's case increasingly showed serious signs of mental illness. The psychiatric examinations were rooted less in concern for Hess's mental health than in the hope that this fanatically devoted Nazi could provide them valuable insights about how the criminals ruling Germany, including Hitler himself, thought.

Hess was transferred back to Germany for the post-war trials at Nuremberg in October 1945, where he escaped the hangman but was sentenced to life in prison. He spent the rest of his long life, 46 years, as Prisoner Number 7 in Spandau, where he lingered long after the other Nazis were freed. Hess was the facility's only prisoner for more than 20 years, his term ending only when the 93-year-old was found hanging from a lamp cord in a garden building in August 1987. The suicide was denounced as a murder by those, including Hess's own son, who suspected he'd been silenced.

But Hess's death didn't end the questions. Had he really come alone? Had someone sent him to Scotland or had someone sent for him?

News of Hess's flight was a bombshell in Berlin, and Nazi authorities quickly moved to disassociate him from the regime. The German public was quickly told that Hess suffered from mental disturbance and hallucinations.

Joseph Goebbels, the Nazi propagandist who knew much about such tactics, feared that the British would use Hess as part of a devastating campaign targeting German morale. He worried in his private diary on May 14 that the German public was “rightly asking how such a fool could be second to the Fuhrer.” 

But the furor gradually died down. Though Hess held a powerful title, his actual influence in the Nazi hierarchy had waned dramatically by 1941, so much so that some have speculated that his flight was born of hopes to regain Hitler's favor by delivering him an agreement with the British. Instead his departure simply consolidated the power of his ambitious and manipulative former deputy Martin Bormann.

Yet a persistent theory has suggested that Hess's ill-fated peace mission was actually carried out with Hitler's knowledge—and the understanding that he'd be disavowed as insane if it failed.

In 2011, Matthias Uhl of the German Historical Institute Moscow unearthed some purported evidence for this claim. Hess's adjutant, Karlheinz Pintsch, had handed Hitler an explanatory letter from Hess on the morning after the flight, and Uhl discovered a report featuring Pintsch's description of that encounter in the State Archive of the Russian Federation.

Pintsch claimed that Hitler received his report calmly. The flight occurred “by prior arrangement with the English,” Pintsch wrote, adding that Hess was tasked to “use all means at his disposal to achieve, if not a German military alliance with England against Russia, at least the neutralization of England.”

This version aligns well with Soviet claims dating back to Stalin himself that British intelligence services had been in touch with Hess and duped him into the flight. In fact, they may align too well, for the statement was produced during the decade when Pintsch was an often-tortured Soviet prisoner and its language smacks of Cold War propaganda terminology—suggesting the Soviets coerced the version from Pintsch.

Indeed other witnesses reported a very different reaction from Hitler. Inner circle Nazi Albert Speer, waiting outside Hitler's office during the meeting, described the Nazi leader's reaction as “an inarticulate, almost animal out-cry” of rage.  “What bothered him was that Churchill might use the incident to pretend to Germany's allies that Hitler was extending a peace feeler,” Speer wrote in Inside the Third Reich. “'Who will believe me when I say that Hess did not fly there in my name, that the whole thing is not some sort of intrigue behind the backs of my allies? Japan might even alter her policy because of this,'” he quotes Hitler, while also noting Hitler's hope that Hess might luckily crash and die in the North Sea.

Speer discussed the flight with Hess himself 25 years later when both were incarcerated in Spandau. “Hess assured me in all seriousness that the idea had been inspired in him in a dream by supernatural forces,” he said. “‘We will guarantee England her empire; in return she will give us a free hand in Europe.’ That was the message he took to England—without managing to deliver it. It had also been one of Hitler's recurrent formulas before and occasionally even during the war.”

British historian Peter Padfield explores the “British duped Hess” theory in Hess, Hitler & Churchill. As with much of the Hess affair, definitive evidence is lacking, but a few tantalizing possibilities exist. Padfield has unearthed intriguing nuggets from period sources: the diary of a well-placed Czech exile who'd viewed a report suggesting an English trap, reports of Soviet spies who'd uncovered now-untraceable evidence of the same. In 2010, the son of a Finnish intelligence agent who'd been on Britain's payroll claimed that his father was involved in the plot.

The official records that have been made available, perhaps not surprisingly, reveal no such role for the British intelligence services. The most plausible motivation for such a plot, were it ever to have existed, was that the British hoped it would convince Hitler to scrap or at least postpone an invasion of Britain; a peace settlement would make such a drastic and dangerous step unnecessary and free him to focus on the battle against his most hated enemy—the Soviet Union.

MI5 files declassified in 2004 suggest that Hess did have his adviser Albrecht Haushofer pen a letter to Hamilton in 1940, suggesting that a meeting at a neutral site could advance secret peace talks. British intelligence intercepted that letter, investigated Hamilton for involvement in a pro-peace Nazi plot (and exonerated him), and seriously considered the possibility of replying to set up a double-cross.

But they dismissed the scheme and simply let the matter drop without ever knowing that Hess was the man behind the communication, the official files suggest.

However, those files are far from complete. Some of the intelligence files on the Hess affair are known to have been 'weeded,' or destroyed. Whatever information they held is lost—but other classified files remain and have yet to be released.

Earlier this week, the Duke of Hamilton's son, James Douglas-Hamilton, called for the British government to release its remaining classified documents concerning the affair.

Conspiracy theorists suspect that the documents could contain not only transcripts of interrogations but also correspondence between Hess and other figures, including George VI. But Douglas-Hamilton, who has written his own book on the Hess affair, suspects they won't embarrass prominent Britons who really did want to deal with Hess, but rather will likely confirm the standard story.

“The evidence shows Britain had an honorable record in fighting the Third Reich and did not swerve from that position,” he told The Scotsman. “Excessive secrecy with regard to the release of relevant material has, and can serve to, obscure that reality.”

In recent years a few other secret files have emerged. In 2013 a U.S. auction house offered an astounding folder of documents, still marked top secret, some 300 pages that appear to have been authored by Hess himself during his wartime captivity and carried with him to the Trial of the Major War Criminals in Nuremberg. They had been missing ever since.

The files are shrouded in Hollywood-style intrigue: who got their hands on them, and how exactly, and why did they then simply give them away to the current seller for nothing via an anonymous phone call? But the papers themselves tend to dispel mysteries rather than raise them, and that’s assuming that the contents are genuine. The auction house made some scans and transcripts of them public for the sale, and it’s unclear if they ever changed hands. In one of the digitized documents, Hess described his interview with Hamilton on the morning after his flight in a passage that perhaps provides the best window into the workings of the mind that conceived this unusual attempt.

“The British cannot continue the war without coming to terms with Germany…By my coming to England, the British Government can now declare that they are able to have talks…convinced that the offer by the Fuhrer is genuine,” the files note.  

But the rulers of Great Britain were convinced of no such thing. Former Foreign Secretary Lord Simon, the highest-placed person known to have met Hess, interviewed him on June 10 a few days before his first suicide attempt. "Hess has come on his own initiative,” Simon wrote of the meeting. “He has not flown over on the orders, or with the permission or previous knowledge, of Hitler. It is a venture of his own.”

With that, Hess was simply locked up for the rest of his long days, though Winston Churchill, writing in The Grand Alliance, professed at least some distress at his fate.

“Whatever may be the moral guilt of a German who stood near to Hitler, Hess had, in my view, atoned for this by his completely devoted and frantic deed of lunatic benevolence,” he wrote. “He came to us of his own free will, and, though without authority, had something of the quality of an envoy. He was a medical and not a criminal case, and should be so regarded.”

RELATED: During his captivity Hess often suspected that his meals were being poisoned. Incredibly, food packets that he wrapped and sealed at Nuremberg for future analysis have been sitting in a Maryland basement for 70 years.

Will This $15 Device Protect Against School Shootings?

Smithsonian Magazine

Courtesy of Flickr user SingSkateRockLuv

In the tragic aftermath of the Newtown school massacre, as is the case every time there’s a school shooting, Americans debated what should be done to ensure the safety of innocent schoolchildren. Gun control advocates are pushing to limit access to deadly weapons by imposing tougher firearm regulations, while the National Rifle Association suggests that armed security guards be stationed at every school in the country.

A group of students at Benjamin Banneker Academic High School in Washington D.C. has responded differently. The students have taken it upon themselves to come up with a device that prevents armed intruders from breaking into a classroom. Their invention, the DeadStop, is lightweight, shaped like a small, cup-sized plastic cylinder and easily slips over the common large hydraulic hinge known as a “door closer” in just seconds.

“So many kids and adults were killed (at Sandy Hook). So we got together and we wanted to know how we could stop intruders from entering our school,” Deonté Antrom, a junior at Benjamin Banneker, said in an interview published on NBCNews.com.

Credit: Benjamin Banneker Academic High School

The school, like many others across the nation, is equipped with doors that cannot be locked from the inside, in order to comply with building code regulations that allow for unobstructed campus-wide evacuations in case of a fire and other disasters. The DeadStop was designed as a workaround, preserving that need for a quick exit in an emergency while also enabling the class to secure itself inside the room when needed.

The design team of ten students, led by math teacher John Mahoney, started out with a prototype made of polyvinyl chloride (PVC) tubing typically found in hardware stores and used a nail to keep the device fastened in place. The flaw with that early concept was that it was not rigid enough to keep the door tightly sealed, so the students are currently developing another version built from metal that would enable the device to work like a clamp.

“The device we have is detachable. It will just be in the teacher’s desk and when there is an announcement that there is a shooter in the building, they will be able to take it out and simply install it on the hinge,” Anjreyev Harvey, another junior on the team, told NBC News. “And how we have it designed, no matter how much the shooter shoots through the glass, or shoots at the hinge, he won’t be able to open (the door).”

Doors that lock from the inside can also be used by mischievous students to lock teachers out of their own classrooms, another reason why they are not typically installed. The DeadStop, by contrast, is portable enough to be slipped into a bag or stored elsewhere, so it can conveniently be kept in the teacher’s possession at all times.

The DeadStop is similar to another device called the Jamblock. Invented by Pittsburgh schoolteacher Bob Ploskunak, the Jamblock is designed to easily slip under the door and jam any attempts by gunmen to force themselves in. The lock is already being used by schools in two local districts and, like DeadStop, is garnering attention.

Students at Benjamin Banneker Academic High School hope to patent and release a final version of the DeadStop that costs no more than $15. To make this possible, the Massachusetts Institute of Technology has awarded the students a $6,600 grant as part of the Lemelson-MIT InvenTeams program, which was created to inspire and motivate high school students to “cultivate their creativity and experience invention.”

The team will demonstrate its invention at MIT in June 2014.

Will Statues of a Doctor Who Experimented on Enslaved People Come Down Next?

Smithsonian Magazine

Confederate generals are not the only statues causing public outrage in the United States. On Saturday, protesters gathered in New York City’s Central Park to call for the removal of a monument to James Marion Sims—the “father of gynaecology”—a doctor who bought, sold and experimented on slaves.

There are two other Sims statues on state-owned property. One is in Columbia, South Carolina, and the other in Montgomery, Alabama. In an interview with MSNBC, Steve Benjamin, the mayor of Columbia, recently agreed that the local Sims statue should come down “at some point”. Now the New York Academy of Medicine has reissued a statement supporting the removal of Sims’ effigy from Central Park.

Over the past five decades, a small army of academics—including social historians, feminists, African American scholars and bioethicists—has reached a consensus that Sims’ medical research on enslaved patients was dangerous, exploitative and deeply unethical—even by the standards of his times. And doctors at the Medical University of South Carolina, in Sims’ home state, have publicly acknowledged Sims’ overt medical racism.

The ongoing removal of statues that celebrate the Confederacy and other forms of white supremacy is an opportunity to also correct the problem of Sims’ troubling presence on the symbolic landscape of America’s past.

James Marion Sims (R. O'Brien/Wikimedia Commons)

It is common knowledge that Sims was a slave owner during the years he practiced medicine in Montgomery, Alabama. It is also well known that he performed dangerous experiments on enslaved women, men and babies. These experiments were so dangerous that even his friends and fellow doctors told him that he was going too far.

The evidence of Sims’ medical malpractice is apparent from the extensive published case notes of the procedures he performed and from his autobiography, The Story of My Life. In his autobiography, Sims revealed that the most “memorable era” of his life was between 1844 and 1849, during which he recollected that “there was never a time that I could not, at any day, have had a subject for operation.”

In those same years, he doubled the size of his private hospital for enslaved patients, “ransacking country around” Montgomery for incurable cases of vesico-vaginal fistula (an abnormal tract between the bladder and vagina). Enslaved women were particularly prone to this side effect of childbirth, due to the coercive “breeding” practices of slave-owners and widespread sexual exploitation. For Sims’ fistula patients, the memory of these years would have been unbearable, as they were subjected to repeated surgery, without anesthesia.

Sims is a typical example of a slave-owning, slave-trading, racist medical researcher, of whom there was an abundance in antebellum America. Medical experiments on the enslaved were commonplace throughout the era of slavery. Sims, however, proved particularly shrewd in having positioned his medical practice and backyard private hospital at the heart of Montgomery’s booming slave-trading district.

Sims’ practice in Montgomery, Alabama (Stephen Kenny, Author provided)

Sims’ hospital and medical research thus directly serviced the slave trade. He attempted to patch up the chronically sick so that they could continue to labor, reproduce, or be sold at a profit by their owners.

The latest chapter of Sims’ legacy is still unfolding, with an important new academic study about to be published. Further, Sims’ public history reveals much about patterns of racism, paternalism and sexism—as well as changing attitudes towards slavery, doctors, patients and disease—in the eras of Jim Crow segregation, eugenics, World War II, civil rights and beyond.

Two historians of American medicine, Vanessa Northington Gamble and Susan Reverby, who battled long and hard to bring to light the truths of the Tuskegee syphilis experiment and secure a presidential apology and compensation for the study’s victims, have argued for the Sims monuments to be reconfigured, perhaps removing his likeness and incorporating the stories of his enslaved research subjects.

As the history of patients is still in its infancy and very few statues commemorate the participants in medical trials, this debate may be a stimulus for more inclusive and considerate memorialization. And it may prove useful in drawing attention to the ethics of research today, a time of rapid developments in biomedicine.

Accompanying the removal or reconfiguring of the Sims monuments, the history of medicine in the age of slavery and Jim Crow deserves a thorough re-evaluation, as there remain countless other untold stories of exploited and oppressed sufferers to be brought to light and included on history’s balance-sheet.

Will Microneedle Patches Be the Future of Birth Control?

Smithsonian Magazine

In the seemingly cluttered world of less-than-ideal contraceptive options, researchers are developing one that is more reliable, simpler to use, and looks a lot like a spiky Band-Aid.

In a study published in Science Advances today, researchers led by Wei Li, a postdoctoral fellow at Georgia Tech, describe a new contraceptive patch with biodegradable microneedles that release hormones under the skin. Building on burgeoning microneedle technology, the device has needles that separate from their backing within a minute and stay embedded beneath the skin, releasing hormones for over a month.

Scientists at Georgia Institute of Technology and the University of Michigan are collaborating on the project, and it is funded by USAID through a grant to the nonprofit humanitarian development organization FHI 360.

The working prototype contains 100 microneedles, which measure hundreds of micrometers in length and are made out of a biodegradable polymer. The user presses the patch into her skin and lets it rest for about a minute. Once inserted, the fluids between her skin cells trigger a reaction in chemical compounds at the base of the microneedles, causing small carbon-dioxide bubbles and water to form. These bubbles weaken the needle’s connection to the backing, and the water further helps the backing to dissolve. This makes it much quicker and easier to remove the backing from the microneedles than is possible in patches without a fizzing mechanism.

Microscope images show effervescent microneedles on a contraceptive skin patch. When applied to the skin, effervescent bubbles quickly separate the microneedles from the patch so that the patch can be removed after one minute. (Wei Li/Georgia Tech)

Once the microneedles enter the skin, they slowly dissolve, releasing the hormone stored inside into the bloodstream. In animal testing, the hormone concentration remained high enough to be effective for more than 30 days, signaling it may be effective as a long-term contraceptive.

Though the scientists refer to the spikes as “microneedles,” the patch was designed to be painless and the needles undetectable after insertion.

“If we've designed it right, your experience should be that of pressing a patch to the skin,” says Mark Prausnitz, a professor of chemical and biomolecular engineering at Georgia Tech who co-authored the study. “We’ve designed it so that the experience is nothing like a hypodermic needle.”

Microneedling tools are already a trend in cosmetics, used to decrease acne scars and alleviate wrinkles and dark spots. The use of microneedles is also becoming increasingly viable as a way to deliver drugs and pharmaceuticals like insulin and vaccines. Many of these inventions are still undergoing development and testing, and several companies have filed patents for microneedle patches.

These patches are promising because, compared to typical injections, they can be less painful, easier to use and produce no biohazardous waste. Though most other microneedle patches immediately release their drug into the body, the needles in the new contraceptive patch do so slowly over the course of many days. And the effervescent backing allows the needles to break off more quickly, so users need to apply the patch for only about a minute, rather than the 20 minutes some other designs require.

The microneedles, shown here under a microscope, are less than one millimeter tall. (Wei Li/Georgia Tech)

Encased in the microneedles is a dose of levonorgestrel (LNG), the drug most frequently used in intrauterine devices (IUDs) and other forms of contraceptive implants. Though scientists don’t yet know how this delivery method will affect a woman’s body, Prausnitz expects to see similar side effects as other contraceptive devices using LNG.

“We're not innovating in terms of the drug itself,” he says. “We're using a really tried and true drug that's probably been in hundreds of millions of women and has been safe and effective.”

The researchers aim to improve upon existing contraceptives by building one that is long-acting, and easy and painless to apply at home. According to a study published in the journal The Lancet last year, 44 percent of pregnancies worldwide between 2010 and 2014 were unintended. By providing another reliable and accessible contraceptive option, researchers hope to help reduce this number.

“Even with all the choices that exist today, [contraceptives] are not doing what's needed for everybody,” Prausnitz says. “What motivates us is that if we can figure out the science, there can be some good that comes of it.”

The team has so far tested the hormone delivery on rats and a placebo patch on human subjects. The researchers have also conducted interviews and surveys with women of reproductive age in the U.S., India and Nigeria and found the patch was well-received conceptually by these women and physically by the test subjects. Only 10 percent of the subjects who tested the placebo patches reported feeling pain initially, and none were in pain after an hour. None exhibited tenderness or swelling, though some still experienced redness of the skin after a full day.

“Alternative approaches to delivering contraceptives beyond the once-a-day oral pill stand to transform the user experience and maximize patient adherence,” Giovanni Traverso, a gastroenterologist and professor in MIT’s department of mechanical engineering, writes in an email. Traverso, who was not involved in the research, has developed a pill that, after being swallowed, opens in a person’s small intestine, allowing microneedles inside to inject drugs into the bloodstream. “As a community we are enthusiastic about the potential of microneedle patches for extended release of a broad range of drugs, but certainly the impact for contraception is significant.”

The device likely won’t be ready for clinical trial for another two to three years, and it would be several more years until it could be FDA approved and marketable. In that time, researchers will be increasing the quantity of LNG carried in the rat-sized patches tenfold to make them usable in humans. Their challenge is to increase the capacity of the needles without making them too large and painful.

Another crucial next step is to prolong the length of the hormone release. Ideally, they will be able to create a patch that needs to be changed only every three or six months, rather than every month. Reducing the number of patches women have to buy could significantly decrease the overall expense.

“USAID certainly has the mission to bring this kind of patch to developing countries and making it accessible, which means the cost has to be right,” Prausnitz says. “They've made it very clear to us that the target needs to be that a patch must be competitive with the cost of other contraceptive methods.”

If they succeed, scientists may be able to create a product that gives women around the world a much-needed new contraceptive option.

Will Hound Hunting in California Be Banned?

Smithsonian Magazine

This bear has been chased up a tree by a pack of hounds in the California wilderness but appears unconcerned about its predicament. The bear was not shot. Photo by Matt Elyash, California Department of Fish and Game photographer.  

Dog versus bear: An ancient duet of nature? Or an artificial battle royale staged by sport hunters?

Advocates and critics each flaunt the opposing characterizations—but either way, hound hunting can be simply defined: the pursuit of a large mammal using a pack of trained dogs that, often, chase the quarry up a tree. Many times, the human hunter, who often locates his dogs by following the signal emitted from their radio collars, shoots the animal out of the branches. Other times, the hunt ends without a gunshot as the houndsman, satisfied only by the chase, leashes his dogs and leads them away, leaving the quarry—very often a black bear, other times a cougar or bobcat—alive in the treetop. Still other times, the pursued animal may fail to make it up a tree and get mauled by the dogs.

This is hound hunting.

In England, foxes have long been the target animal of the sport as highbrow hunters on horseback follow their bawling hounds to the eventual death of the fox. Such hunting has been banned in the United Kingdom, though hunters seem to be thumbing their noses at the law; they continue mounting their steeds and trailing their hounds—“at least as much as ever,” according to one hunter quoted by The Telegraph. And in America, hound hunting was romanticized in such literature as The Bear, by William Faulkner, and Where the Red Fern Grows, by Wilson Rawls.

But state by state, the practice—call it a sport, a tradition, a hobby, a way of life—is becoming illegal as people sympathetic to the well-being of wild animals campaign to abolish hound hunting. Of the 32 American states that permit black bear hunting, 14—including Montana, Colorado, Oregon, Pennsylvania and Washington—prohibit hunters from using dogs to chase the animals. Now, California could be looking at a statewide ban. Senate Bill 1221, introduced earlier this year by Senator Ted Lieu (D-Torrance), will ban the use of hounds while hunting bears and bobcats if Governor Jerry Brown signs the bill.

The ban would not affect bird hunters who rely on retrievers to recover ducks and other fowl, researchers who hire houndsmen to assist in treeing study animals, and wildlife officials who conduct depredation hunts of bears and mountain lions deemed dangerous to the public or their property.

Hunters are up in arms and have been protesting at public gatherings. Josh Brones is among those leading the defense of the sport. As the president of the California Houndsmen for Conservation, Brones says that hound hunting does not usually involve killing the bear and, what’s more, brings to life an ancient and natural drama between black bears and canine predators. During an interview, Brones said hound hunting is rather like a game of “hike-and-seek.” In these pursuits, the bear leads the hounds through the woods, often for many miles, before climbing a tree. The houndsman, slower but just as dogged as his hounds, eventually arrives, shoots some shaky video of the bear to post on YouTube and finally departs. Hunters sometimes call this activity catch-and-release—and even many wildlife researchers rely on it.

Brones, like many houndsmen, almost never kills bears, he says.

“In my 28 years of hunting with hounds, I have only killed four, and the last one was more than a decade ago,” he said. “I don’t even take a weapon when hunting for bear.”

Fitted with radio collars, these hounds are bawling and ready for the bear hunt. Photo courtesy of Flickr user Cowgirl Jules.

Brones maintains that catch-and-release hunting is not stressful to the bear. Though hunting publications frequently characterize bear hunting as the most epic of adrenaline rushes (just Google hunting bears adrenaline rush), Brones says black bears themselves do not experience particularly increased adrenaline levels when chased by dogs. Rather, by fleeing for miles through the woods, bears—as well as other large game—are answering to basic instincts; they are not afraid—just running, he explained to me. He also described treed black bears yawning and nodding off to sleep in the cozy crook of a tree, indifferent to the dogs below. Department of Fish and Game warden Patrick Foy similarly described treed mountain lions, which are sometimes pursued via hounds by researchers, as appearing “like they don’t have a care in the world.” Foy said, too, that a chase covering several miles of rough terrain is not especially hard on many large wild animals—just a walk in the woods, really.

“For a bear, six miles is nothing,” Foy said.

Some biologists, however, assert that hound hunting has considerable impacts on wildlife. Rick Hopkins, a conservation ecologist in San Jose, California, said in an interview that he participated in a long-term study more than 20 years ago in which he helped catch and radio-collar 30 Bay Area mountain lions. In three of the chases, a cougar was caught and viciously attacked by the dogs. He says he knows, too, of cases in which a research hunt led to a cougar kitten getting killed by the hounds.

“Even in research hunts, which are carefully controlled,” dogs catch and maul the quarry, he said. “And I can guarantee that in less controlled hunts, bear cubs get caught.”

Hopkins went on to say, “It’s absolutely silly to suggest that it’s OK to run animals to exhaustion and chase them up a tree, and think that they’re fine.”

To the sport’s many opponents, hound hunting looks like little more than brazen wildlife harassment. Jennifer Fearing, the California director of the Humane Society of the United States, recently told the press, “It’s just reckless wildlife abuse. Even if [you] don’t intend to kill the bear, there isn’t such a thing as benign catch-and-release hound hunting.” Fearing noted that many public parks prohibit unleashed pet dogs.

“And yet we allow this narrow field of people to not only run their dogs off-leash but with the express purpose of chasing wildlife,” she said.

Brones says bears are very rarely injured by dogs, and he says he doesn’t know of any incidents in which cubs were attacked, though this (incredibly graphic, so be forewarned) video shows it happening. While such tooth-and-claw combat may be rare, no one seems really to know how often it occurs. Hunters are regularly separated from their dogs for lengths of time (that’s why they use radio collars), and the dogs may show extreme aggression toward the pursued animal (they often mob dead bears that have been shot from a tree). And for every dog-and-bear fight videoed and posted online, other similar skirmishes likely go unseen or undocumented. In one case described by an official with the Haven Humane Society in a recent letter to Senator Lieu, an injured bear fleeing from hounds happened to enter the city limits of Redding, California, where it climbed a tree. The official tranquilized the bear, discovered that it bore severe dog bites and euthanized the animal.

A houndsman-hunter takes aim at a black bear. Hunters maintain that bears, like this one, are not stressed or bothered when chased into trees. Photo courtesy of Flickr user Cowgirl Jules.

Hounds on the chase almost certainly scare and disturb nontarget wildlife. One European study (Grignolio et al. 2010) found that roe deer, though not the subject of hound hunts, would shift to less desirable habitat during the boar hunting season, where food was less abundant but where regulations precluded hunters and their hounds from entering. And in a July 2006 report (PDF) from the Pennsylvania Game Commission’s Bureau of Wildlife Management, wildlife biologist Mark Ternent wrote, “Pursuit with hounds also may impose stress, disrupt reproduction, and alter foraging effectiveness of bears or other wildlife. Family groups may become separated, or cubs occasionally killed by hounds. However, several studies have concluded that most biological impacts from hound hunting are minimal (Allen 1984, Massopust and Anderson 1984), and the issue of hound hunting is largely social.”

As a species, black bears are not considered threatened. Scientists believe that there are about 30,000 in California, some 300,000 in the United States, and as many as 725,000 across their entire North American range, from Mexico to Alaska. Every year, licensed bear hunters in California take no more than 1,700—a quota set by the Department of Fish and Game. Half or fewer of these are currently taken with the assistance of dogs—and it’s almost certain that in California, even if houndsmen are soon banned from unleashing their dogs onto a scent trail, the bear hunt will still go on.

The dogs will just have to stay home.

Weigh in in the comment box below: Is hound hunting of bears, bobcats, mountain lions and other animals a fair chase? Or a sport whose time must end?

Will Commercial Airplanes Have Parachutes Someday?

Smithsonian Magazine

Statistically speaking, the chance of dying in a plane crash is one in about 11 million. Yet, despite significant safety advances that make the likelihood of such a nightmare scenario ever more remote, there’s always that looming fear. But what if passenger airplanes were equipped with parachutes that, during an emergency, allowed them to float safely towards a soft landing?

Ballistic Recovery Systems is one of the few companies to show that such an idea is indeed plausible. Since 1998, the Saint Paul, Minnesota-based firm has outfitted several small, lightweight aircraft with backup parachutes designed to support as much as 4,000 pounds. Tucked in the rear of the fuselage, the BRS system is activated simply by pulling a red lever that releases a rocket-launched capsule containing a large canopy chute. Once deployed, the suspension lines expand at a controlled rate, allowing the canopy to open fully as the plane's speed slows.

For inventor and BRS founder Boris Popov, adapting something that's used mainly by skydivers and military personnel for flying objects that are several times heavier meant that he had to first come up with a much wider design. He then had to reduce the parachute's bulk and weight without sacrificing structural integrity. His $16,000 rescue parachutes, found in personal aircraft like Cessnas and the entire line of Cirrus planes, are made of an ultra-lightweight composite material that is five times stronger than steel, but 100 times lighter. The 30-pound parachute is then condensed into a compact package using an 11-ton hydraulic press. The "ballistics" part comes in the form of a rocket motor with about a pound of explosive material, enough to blast the parachute through a fiberglass panel in the rear of the airplane so that the canopy can deploy within seconds. At last count, the company claims their technology has saved nearly 300 lives.

Inevitably, the question becomes whether the technology can be applied to larger commercial aircraft, such as Boeings and Airbus models, to assuage the fears of the billions of airline passengers that travel every year. Well, Popov believes it's definitely doable if the public wills it to happen.

By Popov's calculations, every pound of descending weight requires about a square foot of parachute material for such a system to work. A passenger-loaded Boeing 757 can weigh as much as 250,000 pounds and cruises at around 500 miles per hour. Safely lowering a plane of this size and weight would mean employing multiple BRS parachutes (as many as 21 for a jumbo-sized 735,000-pound Boeing 747). One approach to making this more feasible is to engineer an aircraft that can separate into smaller segments. That way, only the passenger cabin would need to be slowed by parachute during a freefall. Under this scenario, the wings and other components would detach to shed weight quickly.
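To make the scale of that rule of thumb concrete, here is a minimal back-of-the-envelope sketch. The one-square-foot-per-pound figure, the aircraft weights and the 21-chute count come from the paragraph above; the flat-circle canopy approximation and the Python helper itself are illustrative assumptions, not BRS engineering data.

```python
import math

# Popov's quoted rule of thumb: roughly one square foot of canopy
# per pound of descending weight (an approximation, not a BRS spec).
SQ_FT_PER_POUND = 1.0

def total_canopy_area(weight_lb: float) -> float:
    """Canopy area (sq ft) implied by the rule of thumb for a given weight."""
    return weight_lb * SQ_FT_PER_POUND

def flat_circle_diameter(area_sq_ft: float) -> float:
    """Diameter (ft) of a flat circular canopy with the given area."""
    return 2.0 * math.sqrt(area_sq_ft / math.pi)

# Figures quoted in the article.
boeing_757_lb = 250_000   # loaded weight cited above
boeing_747_lb = 735_000   # jumbo weight cited above
chutes_for_747 = 21       # number of chutes cited above

print(f"757: ~{total_canopy_area(boeing_757_lb):,.0f} sq ft of canopy in total")

per_chute = total_canopy_area(boeing_747_lb) / chutes_for_747
print(f"747: ~{per_chute:,.0f} sq ft per chute across {chutes_for_747} chutes, "
      f"each a flat circle roughly {flat_circle_diameter(per_chute):.0f} ft across")
```

By this crude estimate, each of the 21 canopies for a 747 would have to cover roughly 35,000 square feet, which helps explain why the scenarios below lean on shedding the wings and other components rather than lowering the whole airframe.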

It’s an idea that a team of researchers at the Scientific Research Institute of Parachute Design and Production (NII Parachutostroyeniya) in Russia has been exploring for some time. One conceptual blueprint even involves an aircraft designed to automatically shear off its wings using automated blades while the passenger-carrying sections would break off into parachute-equipped survival pods. In a special BBC report, the institute’s chief designer Viktor Lyalin explains that this type of system would “drastically reduce speed and avoid human casualties during take-off and landing accidents.”

Implementing such an extreme safety measure, however, may not even be practical considering that aviation experts still question the effectiveness of using parachutes. For instance, a spokesman for the UK Civil Aviation Authority tells the BBC that even in the incredibly unlikely scenario that an airplane stalls in mid-air, there probably wouldn't be enough time for a parachute to deploy as the plane is moving at high speeds. And since most fatal accidents occur during the takeoff or approach and landing phase of the flight, a scenario where a parachute might make a difference is rather remote.

Unfazed by skeptics, BRS is working, for now, to further develop the technology to a point where it can be used in private jets and other larger aircraft that seat up to 20 passengers.

CEO Robert Nelson tells The Wall Street Transcript, "...when you start talking about military applications or getting into personal light jets, that is, the small jets that people can afford to own and operate, or even into a bigger class of airplane, when you get up to a higher weight and more passengers, those are areas that we believe the product will work for future applications."

Wide Load

National Air and Space Museum
Felt tip pen drawing on paper. Wide load, 1/30/71.

The spring of 1962 was a busy time for the men and women of the National Aeronautics and Space Administration. On February 20, John H. Glenn became the first American to orbit the earth. For the first time since the launch of Sputnik 1 on October 4, 1957, the U.S. was positioned to match and exceed Soviet achievements in space. NASA was an agency with a mission -- to meet President John F. Kennedy's challenge of sending human beings to the moon and returning them safely to earth by the end of the decade. Within a year, three more Mercury astronauts would fly into orbit. Plans were falling into place for a follow-on series of two-man Gemini missions that would set the stage for the Apollo voyages to the moon.

In early March 1962, artist Bruce Stevenson brought his large portrait of Alan Shepard, the first American to fly in space, to NASA headquarters.(1) James E. Webb, the administrator of NASA, assumed that the artist was interested in painting a similar portrait of all seven of the Mercury astronauts. Instead, Webb voiced his preference for a group portrait that would emphasize "…the team effort and the togetherness that has characterized the first group of astronauts to be trained by this nation." More important, the episode convinced the administrator that "…we should consider in a deliberate way just what NASA should do in the field of fine arts to commemorate the …historic events" of the American space program.(2)

In addition to portraits, Webb wanted to encourage artists to capture the excitement and deeper meaning of space flight. He imagined "a nighttime scene showing the great amount of activity involved in the preparation of and countdown for launching," as well as paintings that portrayed activities in space. "The important thing," he concluded, "is to develop a policy on how we intend to treat this matter now and in the next several years and then to get down to the specifics of how we intend to implement this policy…." The first step, he suggested, was to consult with experts in the field, including the director of the National Gallery of Art, and the members of the Fine Arts Commission, the arbiters of architectural and artistic taste who passed judgment on the appearance of official buildings and monuments in the nation's capital.

Webb's memo of March 16, 1962 was the birth certificate of the NASA art program. Shelby Thompson, the director of the agency's Office of Educational Programs and Services, assigned James Dean, a young artist working as a special assistant in his office, to the project. On June 19, 1962 Thompson met with the Fine Arts Commission, requesting advice as to how "…NASA should develop a basis for use of paintings and sculptures to depict significant historical events and other activities in our program."(3)

David E. Finley, the chairman and former director of the National Gallery of Art, applauded the idea, and suggested that the agency should study the experience of the U.S. Air Force, which had amassed some 800 paintings since establishing an art program in 1954. He also introduced Thompson to Hereward Lester Cooke, curator of paintings at the National Gallery of Art.

An imposing bear of a man standing over six feet tall, Lester Cooke was a graduate of Yale and Oxford, with a Princeton PhD. The son of a physics professor and a veteran of the U.S. Army Air Forces, he was both fascinated by science and felt a personal connection to flight. On a professional level, Cooke had directed American participation in international art competitions and produced articles and illustrations for the National Geographic Magazine. He jumped at the chance to advise NASA on its art program.

While initially cautious with regard to the time the project might require of one of his chief curators, John Walker, director of the National Gallery, quickly became one of the most vocal supporters of the NASA art initiative. Certain that "the present space exploration effort by the United States will probably rank among the more important events in the history of mankind," Walker believed that "every possible method of documentation …be used." Artists should be expected "…not only to record the physical appearance of the strange new world which space technology is creating, but to edit, select and probe for the inner meaning and emotional impact of events which may change the destiny of our race." He urged quick action so that "the full flavor of the achievement …not be lost," and hoped that "the past held captive" in any paintings resulting from the effort "will prove to future generations that America produced not only scientists and engineers capable of shaping the destiny of our age, but also artists worthy to keep them company."(4)

Gordon Cooper, the last Mercury astronaut to fly, was scheduled to ride an Atlas rocket into orbit on May 15, 1963. That event would provide the ideal occasion for a test run of the plan Cooke and Dean evolved to launch the art program. In mid-February, Cooke provided Thompson with a list of the artists who should be invited to travel to Cape Canaveral to record their impressions of the event. Andrew Wyeth, whom the curator identified as "the top artist in the U.S. today," headed the list. When the time came, however, Andrew Wyeth did not go to the Cape for the Cooper launch, but his son Jamie would participate in the program during the Gemini and Apollo years.

The list of invited artists also included Peter Hurd, Andrew Wyeth's brother-in-law, who had served as a wartime artist with the Army Air Force; George Weymouth, whom Wyeth regarded as "the best of his pupils"; and John McCoy, another Wyeth associate. Cooke regarded the next man on the list, Robert McCall, who had been running the Air Force art program, as "America's top aero-space illustrator." Paul Calle and Robert Shore had both painted for the Air Force program. Mitchell Jamieson, who had run a unit of the Navy art program during WW II, rounded out the program. Alfred Blaustein was the only artist to turn down the invitation.

The procedures that would remain in place for more than a decade were given a trial run in the spring of 1963. The artists received an $800 commission, which had to cover any expenses incurred while visiting a NASA facility where they could paint whatever interested them. In return, they would present their finished pieces, and all of their sketches, to the space agency. The experiment was a success, and what might have been a one-time effort to dispatch artists to witness and record the Gordon Cooper flight provided the basis for an on-going, if small-scale, program. By the end of 1970, Jim Dean and Lester Cooke had dispatched 38 artists to Mercury, Gemini and Apollo launches and to other NASA facilities.

The art program became everything that Jim Webb had hoped it would be. NASA artists produced stunning works of art that documented the agency's step-by-step progress on the way to the moon. The early fruits of the program were presented in a lavishly illustrated book, Eyewitness to Space (New York: Abrams, 1971). Works from the collection illustrated NASA publications and were the basis for educational film strips aimed at school children. In 1965 and again in 1969 the National Gallery of Art mounted two major exhibitions of work from the NASA collection. The USIA sent a selection of NASA paintings overseas, while the Smithsonian Institution Traveling Exhibition Service created two exhibitions of NASA art that toured the nation.

"Since we …began," Dean noted in a reflection on the tenth anniversary of the program, the art initiative had resulted in a long string of positive "press interviews and reports, congressional inquiries, columns in the Congressional Record, [and] White House reports." The NASA effort, he continued, had directly inspired other government art programs. "The Department of the Interior (at least two programs), the Environmental Protection Agency, the Department of the Army and even the Veterans Administration have, or are starting, art programs." While he could not take all of the credit, Dean insisted that "our success has encouraged other agencies to get involved and they have succeeded, too."(5)

For all of that, he noted, it was still necessary to "defend" the role of art in the space agency. Dean, with the assistance of Lester Cooke, had been a one-man show, handling the complex logistics of the program, receiving and cataloguing works of art, hanging them himself in museums or on office walls, and struggling to find adequate storage space. In January 1974, a NASA supervisor went so far as to comment that: "Mr. Dean is far too valuable in other areas to spend his time on the relatively menial …jobs he is often burdened with in connection with the art program."(6) Dean placed a much higher value on the art collection, and immediately recommended that NASA officials either devote additional resources to the program, or get out of the art business and turn the existing collection over to the National Air and Space Museum, "where it can be properly cared for."(7)

In January 1974 a new building for the National Air and Space Museum (NASM) was taking shape right across the street from NASA headquarters. Discussions regarding areas of cooperation were already underway between NASA officials and museum director Michael Collins, who had flown to the moon as a member of the Apollo 11 crew. Before the end of the year, the space agency had transferred its art collection to the NASM. Mike Collins succeeded in luring Jim Dean to the museum, as well.

The museum already maintained a small art collection, including portraits of aerospace heroes, an assortment of 18th and 19th century prints illustrating the early history of the balloon, an eclectic assortment of works portraying aspects of the history of aviation and a few recent prizes, including several Norman Rockwell paintings of NASA activity. With the acquisition of the NASA art, the museum was in possession of one of the world's great collections of art exploring aerospace themes. Jim Dean would continue to build the NASM collection as the museum's first curator of art. Following his retirement in 1980, other curators would follow in his footsteps, continuing to strengthen the role of art at the NASM. Over three decades after its arrival, however, the NASA art accession of 2,091 works still constitutes almost half of the NASM art collection.

(1) Stevenson's portrait is now in the collection of the National Air and Space Museum (1981-627)

(2) James E. Webb to Hiden Cox, March 16, 1962, memorandum in the NASA art historical collection, Aeronautics Division, National Air and Space Museum. Webb's preference for a group portrait of the astronauts was apparently not heeded. In the end, Stevenson painted an individual portrait of John Glenn, which is also in the NASM collection (1963-398).

(3) Shelby Thompson, memorandum for the record, July 6, 1962, NASA art historical collection, NASA, Aeronautics Division.

(4) John Walker draft of a talk, March 5, 1965, copy in NASA Art historical collection, NASM Aeronautics Division.

(5) James Dean, memorandum for the record, August 6, 1973, NASA art history collection, NASM Aeronautics Division.

(6) Director of Planning and Media Development to Assistant Administrator for Public Affairs, January 24, 1974, NASA art history collection, NASM Aeronautics Division.

(7) James Dean to the Assistant Administrator for Public Affairs, January 24, 1974, copy in NASA Art history Collection, Aeronautics Division, NASM.

Tom D. Crouch

Senior Curator, Aeronautics

National Air and Space Museum

Smithsonian Institution

July 26, 2007

Wide Load

National Air and Space Museum
Wide Load. The first stage of an Atlas booster is in transit on Cape Kennedy. This rear view perspective includes the "wide load" sign at the bottom of the structure. The lush green landscape is broken on the right by a white rectangular sign with a red arrow pointing to the right and the words "US AIR FORCE" above it. To the left is a yellow diamond street sign.

In March 1962, James Webb, Administrator of the National Aeronautics and Space Administration, suggested that artists be enlisted to document the historic effort to send the first human beings to the moon. John Walker, director of the National Gallery of Art, was among those who applauded the idea, urging that artists be encouraged "…not only to record the physical appearance of the strange new world which space technology is creating, but to edit, select and probe for the inner meaning and emotional impact of events which may change the destiny of our race."

Working together, James Dean, a young artist employed by the NASA Public Affairs office, and Dr. H. Lester Cooke, curator of paintings at the National Gallery of Art, created a program that dispatched artists to NASA facilities with an invitation to paint whatever interested them. The result was an extraordinary collection of works of art proving, as one observer noted, "that America produced not only scientists and engineers capable of shaping the destiny of our age, but also artists worthy to keep them company." Transferred to the National Air and Space Museum in 1975, the NASA art collection remains one of the most important elements of what has become perhaps the world's finest collection of aerospace themed art.

The spring of 1962 was a busy time for the men and women of the National Aeronautics and Space Administration. On February 20, John H. Glenn became the first American to orbit the earth. For the first time since the launch of Sputnik 1 on October 4, 1957, the U.S. was positioned to match and exceed Soviet achievements in space. NASA was an agency with a mission -- to meet President John F. Kennedy's challenge of sending human beings to the moon and returning them safely to earth by the end of the decade. Within a year, three more Mercury astronauts would fly into orbit. Plans were falling into place for a follow-on series of two-man Gemini missions that would set the stage for the Apollo voyages to the moon.

In early March 1962, artist Bruce Stevenson brought his large portrait of Alan Shepard, the first American to fly in space, to NASA headquarters.(1) James E. Webb, the administrator of NASA, assumed that the artist was interested in painting a similar portrait of all seven of the Mercury astronauts. Instead, Webb voiced his preference for a group portrait that would emphasize "…the team effort and the togetherness that has characterized the first group of astronauts to be trained by this nation." More important, the episode convinced the administrator that "…we should consider in a deliberate way just what NASA should do in the field of fine arts to commemorate the …historic events" of the American space program.(2)

In addition to portraits, Webb wanted to encourage artists to capture the excitement and deeper meaning of space flight. He imagined "a nighttime scene showing the great amount of activity involved in the preparation of and countdown for launching," as well as paintings that portrayed activities in space. "The important thing," he concluded, "is to develop a policy on how we intend to treat this matter now and in the next several years and then to get down to the specifics of how we intend to implement this policy…." The first step, he suggested, was to consult with experts in the field, including the director of the National Gallery of Art, and the members of the Fine Arts Commission, the arbiters of architectural and artistic taste who passed judgment on the appearance of official buildings and monuments in the nation's capital.

Webb's memo of March 16, 1962 was the birth certificate of the NASA art program. Shelby Thompson, the director of the agency's Office of Educational Programs and Services, assigned James Dean, a young artist working as a special assistant in his office, to the project. On June 19, 1962 Thompson met with the Fine Arts Commission, requesting advice as to how "…NASA should develop a basis for use of paintings and sculptures to depict significant historical events and other activities in our program."(3)

David E. Finley, the chairman and former director of the National Gallery of Art, applauded the idea, and suggested that the agency should study the experience of the U.S. Air Force, which had amassed some 800 paintings since establishing an art program in 1954. He also introduced Thompson to Hereward Lester Cooke, curator of paintings at the National Gallery of Art.

An imposing bear of a man standing over six feet tall, Lester Cooke was a graduate of Yale and Oxford, with a Princeton PhD. The son of a physics professor and a veteran of the U.S. Army Air Forces, he was fascinated by science and felt a personal connection to flight. On a professional level, Cooke had directed American participation in international art competitions and produced articles and illustrations for the National Geographic Magazine. He jumped at the chance to advise NASA on its art program.

While initially cautious with regard to the time the project might require of one of his chief curators, John Walker, director of the National Gallery, quickly became one of the most vocal supporters of the NASA art initiative. Certain that "the present space exploration effort by the United States will probably rank among the more important events in the history of mankind," Walker believed that "every possible method of documentation …be used." Artists should be expected "…not only to record the physical appearance of the strange new world which space technology is creating, but to edit, select and probe for the inner meaning and emotional impact of events which may change the destiny of our race." He urged quick action so that "the full flavor of the achievement …not be lost," and hoped that "the past held captive" in any paintings resulting from the effort "will prove to future generations that America produced not only scientists and engineers capable of shaping the destiny of our age, but also artists worthy to keep them company."(4)

Gordon Cooper, the last Mercury astronaut to fly, was scheduled to ride an Atlas rocket into orbit on May 15, 1963. That event would provide the ideal occasion for a test run of the plan Cooke and Dean developed to launch the art program. In mid-February, Cooke provided Thompson with a list of the artists who should be invited to travel to Cape Canaveral to record their impressions of the event. Andrew Wyeth, whom the curator identified as "the top artist in the U.S. today," headed the list. When the time came, however, Andrew Wyeth did not go to the Cape for the Cooper launch, but his son Jamie would participate in the program during the Gemini and Apollo years.

The list of invited artists also included Peter Hurd, Andrew Wyeth's brother-in-law, who had served as a wartime artist with the Army Air Force; George Weymouth, whom Wyeth regarded as "the best of his pupils"; and John McCoy, another Wyeth associate. Cooke regarded the next man on the list, Robert McCall, who had been running the Air Force art program, as "America's top aero-space illustrator." Paul Calle and Robert Shore had both painted for the Air Force program. Mitchell Jamieson, who had run a unit of the Navy art program during WW II, rounded out the program. Alfred Blaustein was the only artist to turn down the invitation.

The procedures that would remain in place for more than a decade were given a trial run in the spring of 1963. The artists received an $800 commission, which had to cover any expenses incurred while visiting a NASA facility where they could paint whatever interested them. In return, they would present their finished pieces, and all of their sketches, to the space agency. The experiment was a success, and what might have been a one-time effort to dispatch artists to witness and record the Gordon Cooper flight provided the basis for an on-going, if small-scale, program. By the end of 1970, Jim Dean and Lester Cooke had dispatched 38 artists to Mercury, Gemini and Apollo launches and to other NASA facilities.

The art program became everything that Jim Webb had hoped it would be. NASA artists produced stunning works of art that documented the agency's step-by-step progress on the way to the moon. The early fruits of the program were presented in a lavishly illustrated book, Eyewitness to Space (New York: Abrams, 1971). Works from the collection illustrated NASA publications and were the basis for educational film strips aimed at school children. In 1965 and again in 1969 the National Gallery of Art mounted two major exhibitions of work from the NASA collection. The USIA sent a selection of NASA paintings overseas, while the Smithsonian Institution Traveling Exhibition Service created two exhibitions of NASA art that toured the nation.

"Since we …began," Dean noted in a reflection on the tenth anniversary of the program, the art initiative had resulted in a long string of positive "press interviews and reports, congressional inquiries, columns in the Congressional Record, [and] White House reports." The NASA effort, he continued, had directly inspired other government art programs. "The Department of the Interior (at least two programs), the Environmental Protection Agency, the Department of the Army and even the Veterans Administration have, or are starting, art programs." While he could not take all of the credit, Dean insisted that "our success has encouraged other agencies to get involved and they have succeeded, too."(5)

For all of that, he noted, it was still necessary to "defend" the role of art in the space agency. Dean, with the assistance of Lester Cooke, had been a one-man show, handling the complex logistics of the program, receiving and cataloguing works of art, hanging them himself in museums or on office walls, and struggling to find adequate storage space. In January 1974, a NASA supervisor went so far as to comment that: "Mr. Dean is far too valuable in other areas to spend his time on the relatively menial …jobs he is often burdened with in connection with the art program."(6) Dean placed a much higher value on the art collection, and immediately recommended that NASA officials either devote additional resources to the program, or get out of the art business and turn the existing collection over to the National Air and Space Museum, "where it can be properly cared for."(7)

In January 1974 a new building for the National Air and Space Museum (NASM) was taking shape right across the street from NASA headquarters. Discussions regarding areas of cooperation were already underway between NASA officials and museum director Michael Collins, who had flown to the moon as a member of the Apollo 11 crew. Before the end of the year, the space agency had transferred its art collection to the NASM. Mike Collins succeeded in luring Jim Dean to the museum, as well.

The museum already maintained a small art collection, including portraits of aerospace heroes, an assortment of 18th and 19th century prints illustrating the early history of the balloon, an eclectic array of works portraying aspects of the history of aviation and a few recent prizes, including several Norman Rockwell paintings of NASA activity. With the acquisition of the NASA art, the museum was in possession of one of the world's great collections of art exploring aerospace themes. Jim Dean would continue to build the NASM collection as the museum's first curator of art. Following his retirement in 1980, other curators would follow in his footsteps, continuing to strengthen the role of art at the NASM. Over three decades after its arrival, however, the NASA art accession of 2,091 works still constitutes almost half of the NASM art collection.

(1) Stevenson's portrait is now in the collection of the National Air and Space Museum (1981-627).

(2) James E. Webb to Hiden Cox, March 16, 1962, memorandum in the NASA art history collection, NASM Aeronautics Division. Webb's preference for a group portrait of the astronauts was apparently not heeded. In the end, Stevenson painted an individual portrait of John Glenn, which is also in the NASM collection (1963-398).

(3) Shelby Thompson, memorandum for the record, July 6, 1962, NASA art history collection, NASM Aeronautics Division.

(4) John Walker draft of a talk, March 5, 1965, copy in the NASA art history collection, NASM Aeronautics Division.

(5) James Dean, memorandum for the record, August 6, 1973, NASA art history collection, NASM Aeronautics Division.

(6) Director of Planning and Media Development to Assistant Administrator for Public Affairs, January 24, 1974, NASA art history collection, NASM Aeronautics Division.

(7) James Dean to the Assistant Administrator for Public Affairs, January 24, 1974, copy in the NASA art history collection, NASM Aeronautics Division.

Tom D. Crouch

Senior Curator, Aeronautics

National Air and Space Museum

Smithsonian Institution

July 26, 2007

Why the Next Silicon Valley Will Be in the Middle East

Smithsonian Magazine

During the Renaissance, Florence was a wellspring of novel thinking. By mid-20th century, Bell Labs in New Jersey was rolling in patents. And, today, California’s Silicon Valley is teeming with entrepreneurial spirit.

So, where will the next hub of invention be?

Christopher M. Schroeder, an internet entrepreneur and venture investor, predicts that with increased access to technology and the connectivity that follows there will be many centers of innovation springing up worldwide, in cities large and small. In his new book, Startup Rising, he makes a strong case for the Middle East, where a surprising number of young men and women are starting tech companies and where global corporations, such as Google, Yahoo and Cisco, are investing.

This story, at least for you, starts with you attending the “Celebration of Entrepreneurship” in Dubai in 2010. What was this event like?

I was at the “Celebration of Entrepreneurship” because [I am part] of this group of American CEOs and Arab CEOs who are really trying to get to know and understand each other. This was one of the first large gatherings of startups in the Arab world, from North Africa to Yemen. 

You get to this incredibly beautiful hotel in this spectacular city of Dubai that didn’t exist for all intents and purposes 15 years ago, and you would have felt as at home as if you were at any tech gathering or conference in Silicon Valley or anywhere else. It was a modern facility with people hustling and bustling, checking their mobile devices, connecting with each other, going from event to event. It was utterly familiar in what was a totally unfamiliar setting.

You argue that a new narrative is playing out in the Middle East. What is this new narrative, and how does it differ from the one that most Americans associate with the region?

I think when Americans think about the Middle East they are really thinking about political instability and sectarian violence. If you are old enough, that narrative might have started with the Iran Hostage circumstance, and certainly for all of us September 11 had a certain narrative.

But, there are other narratives going on. Where people have access to technology, they have access to communication and they have the ability to see how everyone else is living and doing things and can connect and collaborate. You have this capability of seeing opportunity and of seeing that you can make things happen, and it all can be done unbelievably affordably.

I think it is because we have such a single narrative in our minds about the region that sometimes it escapes our understanding. Of course, it is going to happen in the Middle East the way it has happened in India, Latin America, the way it has happened in Eastern Europe, the way it happens whenever anybody has access to technology.

What effect has the Arab Spring had on entrepreneurship in the region?

I went to this gathering in Dubai in 2010. So, it was shortly after the young man lit himself on fire in Tunisia, but it was three months before things really heated up in Cairo. It is no surprise to me that the Arab uprisings happened when they happened, and it is no surprise to me that that which has driven people to want a new expression in politics and society also drives them to want a new creative expression in art, in music and in building businesses.

To be an entrepreneur, you have to be a little crazy, to believe you can build something that was never there before. I think in the Arab uprisings, there were a lot of people that said, “Holy cow, if Mubarak can fall, anything can happen. Maybe I can really build a business where it was never built before.” But, secondly, I think a lot of them very movingly feel that in building a business they are actually building a better society, that they are solving problems with technology in their day-to-day lives. It could be traffic, it could be crime, it could be education, and it could be creating jobs. The Arab uprising really pushed people to feel like what they were doing was not only great for themselves but also actually great for their communities, their countries and the region.

Investors and entrepreneurs are always, as you know, asking about the next “Silicon Valley.” So, is the Middle East it?

Every so often a geographic location becomes something that really changes the global dynamics. But, I think the wonder and the awesomeness of technology today is that we are going to be seeing hubs of technology and innovation all over the world. That isn’t to say that being in an ecosystem where you have a lot of smart people and people who inspire you around you doesn’t matter. You may see more of it in some great centers where people like to live and therefore great talent want to aggregate. But, I think around the world you are just going to be seeing ecosystems of innovation pop up on a regular basis in multiple locations because people can connect better and better with technology.

I saw unbelievable entrepreneurs and innovators in Egypt. I saw unbelievable entrepreneurs in Amman, Jordan, because I think the government and the young people there are really focusing on it. And, at the same time, I have seen them in Beirut and other places as well. I think the idea of there being one hub that rules it all is just not going to be as much in the calculus. Silicon Valley is the exception and not the rule.

Which heavyweight tech companies are investing in the region, and how?

A lot of the major tech companies, like Microsoft, Cisco and Intel, have been in the region for a long time. The Arab world has 350 million people. A lot of growth is happening in mobile and other technologies. But what I loved and was very excited by is that some of these players and newer ones like Google not only are building their services there, but they are actually embracing the ecosystem and helping entrepreneurs to develop.

For example, Google sponsored one of the largest startup competitions in Egypt. They literally hired a bus to travel up and down the country to encourage entrepreneurs not just from Alexandria and Cairo but all around the country and gave a huge award of money. In the last six or nine months, LinkedIn and PayPal have opened up operations in the Middle East. They view their jobs as not only selling and marketing and developing their services but as really doing what they can to educate the markets about the use of e-commerce and about how to find great talent and employees.

Can you tell me about Internet, cell phone and smart phone penetration in these countries?

It ranges. Mobile penetration almost in every country certainly exceeds 50 percent. In many of these countries, like Egypt for example, it is literally over 100 percent, which means that people have more than one mobile phone. What’s exciting is that in many respects the Middle East, like other great emerging markets, has never known a world of landlines. So, they are native mobile users and thinking about how to use technology in a mobile environment.

Smartphone penetration in the [Persian] Gulf region is quite high. It is over 50 or 60 percent in some countries and probably less in a place like Egypt, where the proportion is more like 20 percent. But almost everyone I spoke to in the mobile community expects smartphones to have 50 percent penetration in Egypt in the next three years. As Marc Andreessen wrote in the foreword of my book, the world will have 5 billion smartphones in the next eight to ten years. I think in the Middle East you are going to see 50, 60 or 70 percent smartphone penetration within that time.

Is that 50 percent smartphone penetration a number that you’ve seen to be an indicator in other parts of the world? Once you hit and surpass 50 percent, is there a guaranteed spike in innovation?

I don’t think there is any question that if you look at Asia, if you look at parts of Latin America and Eastern Europe, that as greater and greater technology is available not only did you see a rise in middle class and economic output, but more and more companies that are being driven by and innovated around technology. I think there is definitely precedent for it. 

When you dug into specific statistics about Internet use, what were the biggest surprises?

I would not have told you before I got into the data that the number one per capita YouTube consumer on Earth is Saudi Arabia, that the largest plurality of people watching video on YouTube in Saudi Arabia is women and the largest category of videos that they are watching is education. You stop to think about it and it makes perfect sense. If you are in a society where it isn’t easy to get an education in certain areas or the quality of education may not be everything that it could be, and at your fingertips is the ability to be able to get access to any class anywhere in the world, as more of that is starting to get translated into Arabic, it all really kind of fits. It doesn’t seem that surprising anymore.

You have interviewed hundreds of entrepreneurs in the Middle East. How would you describe them? What are the demographics of this population?

The younger generation, 20s, early 30s, has never known a world without technology and therefore is very comfortable using it and being mobile first in terms of its innovation. A lot of the young people I met had exposure at some point to western education or the West, but hardly a majority of them. 

Probably the biggest thing that hit me like a two-by-four, and in hindsight should have seemed obvious, is that at every event I went to anywhere between 35 and 40 percent of the participants were women. Again, I think a lot of the narrative in the West is to think, well, how can women be participating in this in the Middle East? The fact of the matter is I saw more women on average at a Middle East gathering than I would see on average at a Silicon Valley gathering.

You divide the entrepreneurs into three types: the Improvisers, the Problem Solvers and the Global Players. Can you explain what you mean by each?

Improvisers are taking something that is tried and true and successful elsewhere in the world and saying, how can I make this a success in the Middle East? One of the first companies that was a perfect example of this is a company named Maktoob—the Yahoo! of the Middle East that got bought by Yahoo! for almost $200 million. If you get into the Maktoob experience, it is not just Yahoo! It is not just an Arab putting in Arabic that which is in English. There are lots of sensitivities about the Arab world—cultural things and television shows, music, that is unique.

Anyone who has been to Cairo or any major city in the Middle East knows that the street traffic is mind-blowing. So, of course, a bunch of young Problem Solvers said, “Okay, that’s unacceptable. There are alternative routes. We can figure this out. We are going to create a crowdshare to be used so that people can do the best they can to navigate traffic.” There is no cab dispatching service in many cities in the Middle East so young people have built Uber-like abilities to allow you to find a cab that is near you, which of course helps you in bad traffic and, with GPS, makes you feel safer.

The Global Players are folks who realize the world is one click away so why be limited by any one market. Amr Ramadan from Alexandria, Egypt, was pitching this beautiful weather app, WeatherHD [at a startup competition]. The data it had was interesting. The user interface was interesting. The visuals of it were fantastic. As he was talking about it, I looked down at my iPad and realized I downloaded it six months earlier. I had no idea that it was 7 young people at the time—now it’s like 50—in Alexandria, Egypt, who built it. There are lots of folks who are building solutions that they think are not only interesting for a regional context. There is a wonderful woman from Beirut, Hind Hobeika, who was a college swimmer. She has invented these goggles that are almost like Google Glass; they are heart and breath monitors that are visually in your goggles. That’s not a Middle East-only solution. Any swimmer or trainer anywhere in the world would kill for these. She has manufacturing happening in Asia and distribution happening in the fall in the United States.

What measures are being taken to support entrepreneurs and help ensure their success?

The King of Jordan has helped create and put a lot of weight behind one of the great incubators in Jordan called Oasis500. That has spawned other companies, activities, competitions and gatherings. You have these amazing gatherings. They can be as large as thousands of people, at an ArabNet gathering, or hundreds of people at a mix-and-mentor gathering by Wamda.com. There are startup weekends that happen everywhere from major cities like Amman to Nazareth. There is this bottom-up movement of young people helping young people and seeking out mentors and building connectivity as well as raising capital and the other tactical necessities. It’s viral. It’s everywhere.

Of the hundreds of entrepreneurs you interviewed, whose story sticks with you the most?

Ala’ Alsallal was raised in a refugee camp in Amman and got affiliated with Ruwwad, a totally indigenous, of-the-community youth center that Aramex and Fadi Ghandour [its founder] helped create. He got exposure to computers, which just blew him away, and also got to see mentors and other business people. He got a vision.

With his natural drive and that experience, Ala’ was able to effectively start, out of a scrappy office made with his family, Jamalon, the Amazon of the Middle East, which has a real shot at being the number one online book seller in the region. He eventually got a little bit of money from Oasis500. He just got another round recently. He must be 27 years old or something. To see him come from literally a refugee community with almost no vision of a future to taking advantage of the resources is very hopeful.

Why a Tanzanian Village Chased Six Elephants Off a Cliff

Smithsonian Magazine

Illegal wildlife trade frequently makes headlines these days, but it's not the only potential danger to large animals. Animals like elephants or tigers can also be killed by locals who have no interest in poaching those species' body parts. Instead, such killings often stem from retaliation for a crop raid or a cattle attack. 

In 2009, a particularly egregious retaliatory killing took place near Mount Kilimanjaro, in Tanzania. A group of villagers attacked a herd of elephants, killing half a dozen animals. Researchers recently recounted the incident: "A large crowd of villagers surrounded a herd of elephants and chased them, with the aid of torches, motorcycles, fire, and noise, towards a cliff, killing six of them." While this event stands out for the high number of animals killed, the team adds that "we also learned about several other incidents in which elephants had been speared or found dead without indications of ivory poaching."

So why were the villagers doing this? The researchers, a team from Norway, set out to answer that question by interviewing around 60 locals and asking them about their interpretation of the situation.

Most reported resentment toward elephants due to frequent crop raids; government documents confirmed a significant number of crops had indeed been damaged or destroyed by elephants in the recent past. Elephants also sometimes destroyed water pipes.

At the time the villagers retaliated, the region was also experiencing a drought, making both the elephants and the people all the more desperate for viable crops and dependable sources of water. The government, however, provided little if any help, according to the interviewees. Eventually, the villagers reached their breaking point. As one told the researchers: "We became very furious and said let the government choose either people or elephants. Our village is not a wildlife corridor." 

The villagers, the team concluded, felt "marginalized and disempowered by conservation practices" and saw violence as their only option for taking control of the situation. This case study, although extreme, is not isolated, the team points out. Implementing conservation without taking local communities into account fails the animals it tries to protect and potentially harms the people who have to live with them, the researchers conclude. 

Why We Should Teach Music History Backwards

Smithsonian Magazine

The problem with music history is it’s almost always presented in the wrong direction: forward, from the beginning of something to the end. History would be more meaningful if it were taught backwards.

Think about it: how does one discover and fall in love with the music of the likes of the Black Keys? Is it through first investigating Charley Patton and then working one's way through Son House, the Yardbirds, Led Zeppelin and Lynyrd Skynyrd till finally reaching the Ohio-based blues-rock band? Not if you're under 35, because by the time you began listening to music, the Black Keys were already part of your world. Once hooked, you love them so much that you read every interview to find out who influenced them. That's how you and other true fans find out about the backwards progression to North Mississippi Allstars, R.L. Burnside, Mississippi Fred McDowell, and then finally back to Charley Patton.

For their part, the Beatles and Rolling Stones sent music lovers scouring for recordings by Buddy Holly, Carl Perkins, Chuck Berry and Muddy Waters in the dusty back bins of the local department store. Holly and Perkins in turn led to Elvis Presley, who led to Bill Monroe and Hank Williams. Berry and Waters led to Howlin' Wolf, who led to Robert Johnson, and then once again, back to Charley Patton.

That’s how we learn about music: backwards, always backwards. We don’t start our investigations at some arbitrarily chosen point in the past; we begin where we are, from our current burning passion. This is the most effective kind of learning, driven by emotion rather than obligation. If learning is best done this way, shouldn’t music history writing and teaching be done in the same backwards direction?

Obvious problems present themselves. In the history of Western narrative, stories have always been told in the forward direction—with such rare exceptions as playwright Harold Pinter's Betrayal,  “Seinfeld”’s riff on Pinter, and the noir thriller Memento, written by Christopher and Jonathan Nolan. Authors want to give us the earliest incident first and the subsequent incidents later, the cause first and then the effect. But when it comes to cultural history, we already know the effect, because we’re living with it. What we’re curious about is the cause.

The solution to this conundrum is the flashback, a common device in modern fiction. Within each flashback scene, the action and dialogue move forward—even the most sophisticated readers aren’t ready for backwards dialogue. But through the skillful manipulation of such scenes, writers and teachers can lead readers and students backwards through history, reinforcing the audience’s natural inclination.

How might this work? Suppose we were teaching a class of high school students about American music. Where would we begin? We might start with Sam Smith, the Brit-soul singer with the buttoned-up white shirt, three-piece blue suit and close-cropped hair, singing his signature song, “Stay with Me.” When that song, its album, In the Lonely Hour, and the singer swept four of this year’s biggest Grammy Awards—Best Record, Best Song, Best Pop Vocal Album and Best New Artist—the natural reaction is to ask, “Where did this come from?”

It’s not that Smith is merely copying the past, for he and his producers/co-writers have honed the R&B ballad tradition to a new leanness: the simple drum thump and half-note piano chords allow Smith’s honeyed tenor to remain so conversational that it feels like we’re eavesdropping on his mumbled plea to a departing lover. But Smith is not inventing this sound from scratch either, and the curious young listener is going to want to know what he borrowed. (Curious listeners may be a minority of all listeners, but they’re a significant minority—and it’s for them that music critics write.) Smith is transforming arena-rock anthems by setting their clarion melodies in hymn-like arrangements. With “Stay with Me,” the rock source material (“I Won’t Back Down”) was so obvious that Smith had to share writing credits with Tom Petty and Jeff Lynne.

So we critics must lead those listeners backwards through history. We don’t have to go very far to hear Smith confessing his debt to Mary J. Blige. “I remember holding her Breakthrough album,” Smith confesses in an interview snippet on Blige's newest record, London Sessions. “Holding it in my hands, in my car, listening to it on repeat. To me she was this untouchable goddess.” Smith repays that debt by co-writing four of the new disc's dozen songs with Blige, including the first single, “Therapy,” an obvious allusion to “Rehab” by another Brit-soul singer, the late Amy Winehouse.

Blige sounds revitalized on The London Sessions, as if working with Smith and his British colleagues had returned her to the days of 2005’s The Breakthrough, when her collaborations with rappers such as Ghostface Killah, Nas and Jay-Z allowed her to refashion R&B by replacing maximalist arrangements with minimalist beats and romantic sentiment with streetwise skepticism. But let’s go backwards even further and find out where Blige found her sound.

If her attitude and backing tracks came out of the hip-hop scene in the Bronx, where she was born, the vibrancy of her big mezzo was inspired by gospel-soul singers such as Aretha Franklin, Chaka Khan and Anita Baker.

Blige recorded songs made famous by all three of those role models early in her career, and got her start singing in churches in Georgia and in Yonkers, where she spent her troubled childhood. Like Blige, Franklin was a church soloist and a child-abuse victim, according to Respect, the new biography by David Ritz. That dramatic combination of deep wounds and yearning for redemption marks both singers.

Following our historical trail backwards, we find ourselves in 1956 at Detroit's New Bethel Baptist Church, where the 14-year-old Franklin is singing hymns from her new gospel album. She has been touring with her famous preacher father C.L. Franklin and such gospel stars as Sam Cooke, Clara Ward and Inez Andrews, and the teenage prodigy already displays the robust warmth and piercing urgency of those role models. But she also hints at something extra, a cutting edge that comes not from the buttery bounty of the “Gospel Queen” Mahalia Jackson but from the guitar-playing gospel renegade: Sister Rosetta Tharpe.

So we go back even further and find ourselves at New York’s Carnegie Hall on December 23, 1938, as the 23-year-old Tharpe performs in the legendary “From Spirituals to Swing” concert organized by John Hammond, who would later sign Franklin to Columbia Records and produce her early albums. This show introduces white New York audiences to the genius of African-American artists such as Tharpe, Count Basie, Joe Turner, James P. Johnson and Big Bill Broonzy, and kicks off the boogie-woogie craze with appearances by pianists Meade Lux Lewis, Pete Johnson and Albert Ammons. Ammons accompanies Tharpe on her two songs, and she steals the show. When she sings her recent hit, “Rock Me,” the lyrics may be asking God to rock her in the bosom of Abraham, but her voice and guitar are hinting at another kind of rocking.

They are also hinting at how easily a love song to God can be turned into a love song for a more earthly creature and how that porous boundary will inspire Franklin, Cooke, Blige, Winehouse, Smith and much of the rest of Anglo-American music for the next 77 years.

If we had tried to tell this story forward, we would have lost most of our audience once they encountered Tharpe’s old-fashioned dresses, twangy guitar and sanctified lyrics. But by telling the story backwards, we were able to lead our listeners from their existing enthusiasm for Smith to newfound excitement over Blige and then Franklin. When our reverse historical journey finally reached Tharpe, our fellow travelers were primed to embrace a spectacular talent they may never have bothered with coming from any other direction.

Why We Have a Civic Responsibility to Protect Cultural Treasures During Wartime

Smithsonian Magazine

Sometime in the mid-6th century A.D., an unknown artist sculpted a beautiful figure standing nearly six feet tall out of the limestone in a man-made cave in northern China. Commissioned by a Buddhist emperor of the Northern Qi dynasty, the figure was a bodhisattva, representing an enlightened human being who delayed his own entry to paradise to help others achieve their own spiritual development. It joined an array of other sculptures, forming an underground temple of Buddhist iconography and signaling the regime’s desire for divine guidance and protection.

But neither enlightenment nor protection prevailed when in 1909 looters, encouraged by civil strife and lawlessness in China, started to cut and remove statues and sculpted heads from the temple cave and sell the treasures on the art market. The standing bodhisattva came to Paris in 1914, in the possession of Chinese immigrant and art dealer C.T. Loo and Swiss poet, collector and antiquities aficionado Charles Vignier. Two years later, they sold the piece to financier Eugene Meyer, who almost immediately offered to exhibit it at the Metropolitan Museum of Art in New York. He and his journalist wife Agnes owned and loaned it for decades. The Meyers eventually bought the Washington Post and supported civic, educational and cultural causes. Agnes Meyer donated the statue to the Smithsonian’s Freer Gallery of Art in 1968. A few years ago, the standing bodhisattva helped anchor an exhibition, "Echos of the Past," organized by the Smithsonian and the University of Chicago, that included the statue’s appearance in a digital reconstruction of the original Xiangtangshan cave before it was looted.

We know a lot about the sculpture from what we call provenance research—tracking the record of ownership of an artwork. It’s good practice, prescribed in the museum community to ensure that works are legally acquired. Museums generally operate according to a 1970 Unesco treaty that says that artworks illicitly obtained should be returned to their rightful owners. The U.S. and several other nations also seek to recover artwork looted during the Nazi era and return it as well—a practice initiated by the now well-known “Monuments Men”—and women.

While museums are sometimes criticized for holding onto items acquired from other nations, their goal has been to preserve, exhibit and learn from them. It’s a noble, worthwhile and civic idea—that we of today might gain insight from understanding the past, and even be inspired by our heritage and that of others. Civic leaders generally support cultural heritage preservation and education as worthy social goals, though convincing politicians and officials that such efforts merit support from public coffers is not always easy. But actions undertaken in different parts of the world to destroy such heritage bring the basic mission of museums into strong relief.

The Taliban’s blowing up of the Bamiyan Buddhas in 2001 was a shock, as has been the burning of medieval manuscripts in the libraries of Timbuktu and ISIS thugs taking sledgehammers to Akkadian and Assyrian sculptures in the Mosul museum. These heinous acts, condemned around the world, point to the material obliteration of history, of people’s diversity and often of a society’s complex, multifaceted and nuanced identity.

Extremists say that these objects have no value, but they cynically loot and sell what they can carry off, using such treasures to help finance further destruction. Cultural heritage, whether in the tangible form of monuments, mosques, temples, churches and collections or in the more intangible form of living customs, beliefs and practices is under attack as a strategic pillar of extremist warfare. It is a war on civilization itself—whether that be Islamic, Jewish, Christian, Hindu or Buddhist, eastern, western or indigenous.

Image by © Jamal Saidi/Reuters/Corbis. Assistant Director of the Iraq Museum, Donny Youkhanna, shows the head of a statue of a man from an Assyrian winged bull, damaged by thieves who used a chainsaw to cut the head from the bull's stone body at an archaeological site in Khorsabad, located north of Mosul, in 1996. (original image)

Image by © S. SABAWOON/epa/Corbis. Afghan women pass by the scene where one of the two colossal statues of Buddha carved into the sandstone cliffs was demolished by the Taliban in March 2001, in Bamiyan, Afghanistan. The monumental statues were carved from the cliff in the early 6th and 7th centuries AD. (original image)

Image by © David Honl/ZUMA Press/Corbis. The site of the ancient Buddha Statues of Bamiyan, on the outskirts of Bamiyan, Afghanistan. The statues were destroyed by the Taliban in March 2001. (original image)

Image by © M.A.PUSHPA KUMARA/epa/Corbis. Workers engaged in the final stages of one of the world's tallest seated granite images of the Buddha are seen at the Rambodagalle temple at Rideegama near Kurunegala, Sri Lanka, in September 2014. The structure, 67.5 feet high and designed on the lines of a Bamiyan Buddha image in Afghanistan destroyed by the Taliban, is taking shape not only as a symbol of Buddhism but as a sign of unity among the different communities and religions in Sri Lanka. (original image)

Image by © Joe Penney/Reuters/Corbis. Librarian Aboubakar Yaro examines an Islamic manuscript from the 17th century at the Djenne Library of Manuscripts, in Djenne, Mali, September 2012. Djenne is thought to have at least 10,000 manuscripts held in private collections, dating from the 14th to 20th centuries. (original image)

Image by Freer Gallery of Art, Gift of Eugene and Agnes E. Meyer. In 1909, encouraged by civil strife and lawlessness in China, looters started to cut and remove statues like this mid-6th century standing Bodhisattva from the temple cave and sell the treasures on the art market. (original image)

Image by Freer/Sackler Gallery. The Boddhisattva anchored a 2011 exhibition, "Echos of the Past: The Buddhist Cave Temples of Xiangtangshan," organized by the Smithsonian and the University of Chicago, which included a digital reconstruction of the original location where looters had removed the artworks in 1909. (original image)

Image by © Corbis. A c.1814 illustration details the fire damage to the Senate and House wings following the attempted burning of the U.S. Capitol by the British. (original image)

Image by © MATTES Rene/Hemis/Corbis. The Old Bridge of the city of Mostar in Bosnia-Herzegovina was destroyed in fighting between Croats and Muslims in the 1990s. In 2004 it was rebuilt, again serving to recognize a shared history. (original image)

Image by © Andrew Aitchison/In Pictures/Corbis. The Kigali Memorial Centre, located on a site where 250,000 of the victims of the 1994 genocide in Rwanda were buried in mass graves, opened in 2004 on the 10th anniversary of the atrocity. (original image)

Image by © STRINGER/Reuters/Corbis. A man stands alone in a courtyard during a night vigil to honor former South African President Nelson Mandela, near B Section of Robben Island Maximum Security Prison off the coast of Cape Town in December 2013. (original image)

Image by © NIC BOTHMA/epa/Corbis. The art installation "Sunstar" by artist Christopher Swift on Signal Hill above the city of Cape Town, South Africa, is a 24-meter, eight-pointed star constructed from the original fence that once surrounded Robben Island where former president Nelson Mandela was incarcerated for 27 years. (original image)

Image by © Frank Schumann/dpa/Corbis. An estimated 1.5 million people were murdered at Auschwitz, a concentration camp that was liberated by Soviet troops on January 27, 1945, and turned into a memorial site and museum in 1947. (original image)

Image by © Frank Schumann/dpa/Corbis. The barbed wire fence and watch towers of the Auschwitz-Birkenau concentration camp stand covered in mist in Oswiecim, Poland. The camp became a memorial site and museum in 1947 and has been a UNESCO heritage site since 2007. (original image)

One might be tempted to say, sacking and looting are the heritage of humankind in their own right—think the destruction of Solomon’s temple, the pillaging of Rome, the ransacking of Baghdad by the Mongols and the exploits of Conquistadors among the Aztecs and Incas. There are, of course, more modern examples.

Last year we celebrated the bicentennial of the Star Spangled Banner, held in the Smithsonian’s collection. The flag flew over Baltimore weeks after the British burned the U.S. Capitol, the White House and other public buildings in an effort to dispirit the young nation’s citizenry. Often, in modern warfare the scale of bombing and destruction by weaponry can make valued cultural heritage a casualty of inadvertent destruction.

The U.S. faced heavy criticism for the fire-bombing of the architecturally significant city of Dresden during World War II, but President Franklin Roosevelt and General Dwight Eisenhower recognized the need to try to protect heritage in the midst of the Allied invasion of Europe. Still, there are times when a key decision makes a difference. Kyoto, home to much of Japanese imperial tradition and its most treasured sites, was high on the target list for the dropping of the atomic bomb. But U.S. Secretary of War Henry Stimson, even in an all-out war, recognized its cultural importance and vetoed that idea.

Cultural heritage, while targeted for destruction in war, can also be used to help heal after conflict and to reconcile people with their former enemies and their past. As Japan was recovering from the war and under U.S. occupation, it was no less a warrior than General Douglas MacArthur who supported the efforts of Japanese authorities to preserve their cultural treasures. In post-World War II Europe, Auschwitz, the largest concentration camp, became a memorial and museum to recognize and draw understanding from the Nazi effort to exterminate the Jewish people. The 1954 Hague Convention, recognizing the value of heritage, demonstrated world-wide condemnation for the deliberate destruction of cultural property in armed conflict and military occupation, and a 1972 Unesco convention formalized an international regime for recognizing world heritage sites.

In the U.S. in the 1980s, American Indians and their culture, a century earlier marked by the government for destruction and assimilation, were celebrated with a national museum at the foot of the U.S. Capitol. In the 1990s, Robben Island, once the home of the infamous prison housing Nelson Mandela and his compatriots fighting against apartheid, was turned into a museum for the new South Africa. Both prisoners and guards became docents, educating visitors about the era, and a site that had once drastically divided a population helped to bring it together. In Bosnia-Herzegovina, the Mostar Bridge, commissioned by Suleiman the Magnificent, had been destroyed in fighting between Croats and Muslims. The bridge was more than a roadway; it was a symbol of connection between the two communities, and wiping it out served to divide them in conflict. In 2004 it was rebuilt, again serving to recognize a shared history.

The same year, the Kigali Genocide Memorial Centre and museum opened in Rwanda, at the site of mass graves of victims of that genocide, and provided a means to encourage all citizens of that country, Hutu and Tutsi alike, to avoid the racism and intolerance that led to that national tragedy. Not only museums and memorials, but heritage encapsulated in living traditions that once divided people can be used to bring them together. Unesco’s Slave Route project focused on how the African diaspora illustrated the perseverance of people and their cultures while enduring a most odious practice. The Smithsonian, working with Yo-Yo Ma, the Aga Khan and Rajeev Sethi, demonstrated how conflicts, forced migration and exploitation along the historic Silk Road were surmounted, and resulted in complex and creative cultural expressions in art, music, cuisine, fashion and ideas that connected people around the globe.

Cultural heritage teaches us things. It embodies knowledge of particular times about architecture, engineering, design, social structure, economy, craftsmanship and religious beliefs. It offers an appreciation of history, and lets us understand something about the way in which people lived. But heritage is not only about the past. Heritage is either forgotten and obscured, or articulated and valued in the present. It symbolizes how people think of themselves and others, including their predecessors and neighbors today. In that sense, cultural heritage teaches us about tolerance and respect for a diverse humanity. Saving heritage saves us from the foibles of arrogance, intolerance, prejudice toward and persecution of our fellow human beings. It reminds us of our better nature and like the standing bodhisattva, helps us all live in a more humane world.

The discussion continues in a program, “Cultural Heritage: Conflict and Reconciliation,” organized at the Smithsonian with the University of Chicago at the Freer Gallery’s Meyer Auditorium on April 17. The session features Irina Bokova, Director General of UNESCO; Emily Rafferty, the President of the Metropolitan Museum of Art; Mounir Bouchenaki, Director of the Arab Regional Centre for World Heritage; and Richard Kurin, interviewed by David Rubenstein, Smithsonian Regent, University of Chicago Trustee and co-founder of The Carlyle Group. The event will be available via webcast.

Why These Early Images of American Slavery Have Led to a Lawsuit Against Harvard

Smithsonian Magazine

There is an image of a man most Americans have probably seen that has come to represent the institution of slavery. He’s bone-thin, big-eyed and shirtless. Without context, he personifies the nameless, storyless mass of people brought over to this country in bondage. But the man in the image has a name, Renty, as does his daughter, Delia, who also appears in a series of mid-19th-century daguerreotypes. We also know they were forced to strip naked and pose for the images commissioned by Harvard biologist and racial theorist Louis Agassiz in 1850 to “prove” the racial inferiority of black people.

Recently, Collin Binkley at the Associated Press reports, their story has opened up a new conversation on race and history. This week, Tamara Lanier, a resident of Norwich, Connecticut, filed a suit in Massachusetts state court saying she is a direct descendant of Renty and accusing Harvard of “wrongful seizure, possession and expropriation” of the images of Renty and Delia. The suit asks the university to acknowledge Lanier’s link to Renty and Delia, pay damages, and turn over the images; it also calls upon the university to acknowledge and condemn Agassiz’s racist actions.

Harvard has yet to comment on the case, stating it has not yet been served with papers, Scott Jaschik at Inside Higher Ed reports.

“It is unprecedented in terms of legal theory and reclaiming property that was wrongfully taken,” one of Lanier’s lawyers, Benjamin Crump, says in an interview with Anemona Hartocollis of The New York Times. “Renty’s descendants may be the first descendants of slave ancestors to be able to get their property rights.”

According to Che R. Applewhaite and Molly C. McCafferty at The Harvard Crimson, Agassiz commissioned the images after touring a plantation in South Carolina, looking for enslaved people who were “racially pure”—aka born in Africa—to support his theory of polygenism, the now debunked idea that different human racial groups don’t share the same ancient ancestry. Renty and Delia were two of the subjects selected for the project.

At some point, the images were filed away, but in 1976, a researcher re-discovered the photos in storage. They were recognized to be among the oldest, if not the oldest, images of enslaved people in North America. Since then, the historic images have become almost iconic, appearing in documentaries, on book covers and on conference banners. The Harvard Peabody Museum, which currently holds the now-fragile daguerreotypes, tells The Harvard Crimson that the images are currently in the public domain, and the museum does not charge for usage rights. It does, however, charge $15 for high-resolution images of the daguerreotypes, which are requested about 10 times a year.

Lanier, a retired chief probation officer for the State of Connecticut, became aware of the images when she began researching her ancestry in 2010. She sent Harvard a letter in 2011 detailing her possible connections.

Lanier had grown up hearing family oral history about an ancestor named Renty Taylor or “Papa Renty,” and through her research she believes she has connected her family to the man in the photograph, and by extension to his daughter Delia.

Lanier’s genealogical case is a hard one to prove. Records of enslaved families sometimes include people not affiliated by blood. And a handwritten slave inventory list from 1834 that Lanier believes connects her to Renty is not definitive evidence, reports Hartocollis of the New York Times, since it’s not clear if two enslaved men on the plantation called “Big Renty” and “Renty” are related.

Then there is intellectual property law. Photographs are usually the property of the photographer, though Lanier’s suit claims that since the images were taken by Agassiz without the consent of Renty and Delia, he had no right to transfer them to Harvard and they should belong to their next of kin.

The current suit was inspired, in part, by a 2017 conference Lanier attended on the associations between academia and slavery, where Renty’s image was projected above the speakers.

Author Ta-Nehisi Coates, who also attended the conference, tells Hartocollis he understands how Lanier must have felt. “That photograph is like a hostage photograph,” he says. “This is an enslaved black man with no choice being forced to participate in white supremacist propaganda — that’s what that photograph was taken for."

If Lanier were to win, Crump, her lawyer, suggested at a press conference, they would take the images on a tour across the U.S. before loaning them to museums.

Why School Should Be More Like Summer Camp

Smithsonian Magazine

In 2004, hedge fund analyst Salman Khan began tutoring his 12-year-old cousin, Nadia, in some basic math concepts. Since he lived in Boston and she in New Orleans, they spoke by telephone, and he used Yahoo! Doodle to work through specific problems.

As other family members requested his services, Khan began to post simple video lectures on YouTube. Khan realized he was on to something when strangers began leaving comments, thanking him for explaining things like systems of equations and geometry in a way that finally made sense.

In 2009, Khan quit his lucrative job to put all his efforts into Khan Academy. He founded the nonprofit with a lofty goal in mind: to provide a free, world-class education for anyone, anywhere.

Students from 234 countries and territories have logged on to Khan’s site to watch any number of his 3,400 video lectures on topics in math, science, computer science, economics and history. Teachers in some 15,000 classrooms now incorporate his lessons and software into their instruction.

In his new book The One World Schoolhouse, Khan totally reimagines education. He diagnoses the problems with our century-old model for education and envisions schools that better prepare students for today’s world.

Secretary Wayne Clough will interview Khan tomorrow about his refreshing ideas for education reform as part of a Smithsonian Associates event at the National Museum of the American Indian.

What does the school of the future look like, as you see it?

We can define it by what it is, or maybe by what it is not. You won’t have bells ringing every 50 minutes. You won’t have a state-mandated curriculum where all the students and all the teachers are all going at the same pace. Students are not going to be in these rooms where all the desks are pointed at the chalkboard and there is somebody lecturing at them.

What I imagine is much more open, collaborative workspaces. I imagine the students come in, and they work with their mentors. Their mentors will be both students, possibly older students or students who have shown maturity, and the master teachers. They will set goals. Based on those goals that they are trying to achieve, they have a rough allocation of how they might want to be spending their time. One day a student might want to go deep on trigonometry. Then, he or she might spend two weeks researching some problem in biology or writing a short story.

Both teachers and student mentors will be able to keep track and say, “Look, it’s great that you’ve spent the last month working on your novel. We think that is a really important life experience. But we think you need to invest a little bit more time in your core math skills.”

Students will build a portfolio of their creative works; it will serve as their academic credentials to show, “Look, I really do know geometry, or I really do have a basic understanding of American History.” It will also include an evaluation as a peer mentor. How good was the student at helping other people? At explaining things? At first it sounds like a very pie-in-the-sky, touchy-feely thing, but this is actually what employers care about.

So you don’t believe in letter grades?

For me, letter grades are a very superficial thing. An “A” can make it look like there was rigor when there wasn’t any. What does an “A” mean? It depends on how hard or rigorous the assessments were. It gives you very little information. They allow us to assess people, realize they have gaps in their knowledge and then just push them forward, guaranteeing that at some point they are going to get frustrated and kind of fall off the bus.

You call for the end of summer vacation. Why?

We want students to learn! Right now, students are spending nine months stressed, going through drills, memorizing things before an exam and then forgetting it. Then, they go to summer vacation. Some of the most affluent or motivated kids might be able to pull off having a very creative summer vacation, but most don’t. For most, it is just kind of lost time.

When people say, “Summer vacation, those are my best memories. That is when I actually got to do creative things. That is when we actually got to travel,” I say, yeah, exactly, that is what the whole year should be like. Make school year-round, but also make it much more like a creative summer camp.

What is the biggest obstacle to reaching this school of the future?

It is very hard to de-program the model that we grew up in. To some degree, our notion of school is adults scheduling every hour of a child’s time. You have to de-program that in the leaders of the first one or two or ten schools. But, I think all of this can be done over the next five or ten years.

As you say, we take the traditional school model for granted—teachers lecturing for 40-90 minute class periods devoted to separate subjects and then assigning homework. But, how and when did this take root?

The Prussians came up with it. To their credit, they said, “We want to have everyone educated.” How do we get everyone educated? Well, it was the late 1700s, early 1800s. Assembly line factories were producing things fairly inexpensively and in reasonable quality, so the Prussians said let’s see if we can industrial revolutionize teaching.

Before that, you would have the master teacher work with one student or small group of students at a time. They said, “Well, how do we get that to scale? We put these students in age-based cohorts and move them at the same pace.”

In the mid-1800s, the model got brought over to the U.S., with a very egalitarian motive: Let’s have universal public education and do it reasonably cost effectively. The Land Grant universities come about, so university was much more accessible. We start to have textbooks, but we need to standardize what a high school diploma means, so we understand what students are coming to the universities with or are entering the job market with. That is when you had the Committee of Ten say there will be primary school and secondary school. In secondary school, you will learn algebra and then geometry and then trigonometry. You will learn physics near the end, and you will learn earth science near the beginning.

As someone with three degrees from MIT and an MBA from Harvard, you have had success within this system. But, what, in your mind, are its biggest flaws?

The biggest flaw is the dearth of time for creativity. This is probably hitting the affluent more than anyone else, strangely enough. I actually felt like I was lucky growing up. My mom was a single mom. We didn’t have a lot of money, so I didn’t take any classes. I was what they used to call in the ‘80s a “latchkey kid.” I would come home, and my mom wouldn’t come home for a couple of hours. I essentially had the afternoons at my disposal.

Frankly, most of my peers, their kids are completely booked. From morning until nighttime they are either in school or some type of soccer or piano practice or they are doing homework, and then they go to sleep. There is no breathing room at all for a child of any age to say, let me create something. Let me invent a new game. Let me just play.

You have created a library of over 3,000 videos explaining everything from basic trigonometry to the laws of thermodynamics to the Cuban Missile Crisis and Obamacare. What is the key to an effective video lecture—one that will get through to students?

The tone should be respectful. Respectful means not talking down and not talking above. You have to view the viewer as someone who is just like you, someone who is smart and capable of knowing the information, but who just doesn’t know it right now.

Make sure that you cover all of the details. Make sure you cover all the whys. Make sure you draw all the connections. These are things I never had the luxury to do in my schooling. I never had the time or luxury to think, why do I carry a “1” when I add? The class was moving on. But now I do have time. This is my job. My value-add is to think about those and to try to give a little bit more of that intuition and texture. If it can be a little bit quirky and funny, I think it connects with people even more.

How are teachers incorporating your videos and software into their instruction?

The simplest way is teachers writing on the top of the chalkboard on Day 1: if you are ever stuck on anything in this classroom, this site called Khan Academy might help you. There are a lot of supplemental learners—people who are taking a chemistry class at their high school or university and using Khan Academy as a tutor.

The next level is flipping the classroom. When Khan Academy was just ramping up and I was still doing this as a hobby, I would get these emails from teachers saying that they didn’t have to give these lectures anymore. They could say, “We are covering systems of equations or we are covering meiosis. Here is a Khan Academy video that you might want to watch before our next class.” Then, they could use class time to actually do problem solving with students and work directly with them. They essentially had “flipped” the classroom. What used to be homework—the problem solving—was now in the classroom; what used to be class work—the lectures—was now happening at home.

The deepest [application] is the classrooms where the students really are all learning at their own pace. Teachers have the students working on the Khan Academy exercises and videos at their own time and pace, and then the teachers get data and can intervene when appropriate. The class time is being used for interventions with or between students or open-ended projects.

Really, we want to see who is pushing the envelope the most, see if it is working and then why it is working, and then try to share those practices with other teachers.

How does this new type of school level the playing field for all students?

Historically, whenever someone has talked about solutions for the underserved, they would always think about cheap approximations to what the rich had. But any child who has access to the Khan Academy site now has access to the same resources that Bill Gates’ kids are using.

The good thing is, especially in the developed world, computers and broadband are already fairly common. Even in the developing world, things are getting cheap enough that they are starting to become practical, especially on mobile platforms. At minimum, students now have access to this interactive tutoring. Ideally, they will also be able to supercharge what is happening in their classrooms. They would be able to have access to differentiated instruction. This is what kings’ children had. Not even Bill Gates’ children have this personalized attention in their schools. We are saying there is now a way for teachers to give personalized attention to students in a scalable way.

This interview series focuses on big thinkers. Without knowing whom I will interview next, only that he or she will be a big thinker in their field, what question do you have for my next interview subject?

What surprising change in society is around the corner that no one sees coming?

From my last interviewee, Steven Johnson, author of Future Perfect, which claims that the key to progress is peer networks, as opposed to top-down, hierarchical structures: When you look back on all your big thoughts, what is the biggest thing that you missed? What was the biggest hole in your thinking?

When I started this, Wikipedia and these things already existed. I was a 100 percent believer in the peer networks, and I still am. But I assumed for something like this dream of Khan Academy, we were going to have to get millions of people, or at least thousands or hundreds of people, making content. The shocking thing for me was how scalable even one person could be in this domain.

Why Procrastination is Good for You

Smithsonian Magazine

Sometimes life seems to happen at warp speed. But, decisions, says Frank Partnoy, should not. When the financial market crashed in 2008, the former investment banker and corporate lawyer, now a professor of finance and law and co-director of the Center for Corporate and Securities Law at the University of San Diego, turned his attention to literature on decision-making.

“Much recent research about decisions helps us understand what we should do or how we should do it, but it says little about when,” he says.

In his new book, Wait: The Art and Science of Delay, Partnoy claims that when faced with a decision, we should assess how long we have to make it, and then wait until the last possible moment to do so. Should we take his advice on how to “manage delay,” we will live happier lives.

It is not surprising that the author of a book titled Wait is a self-described procrastinator. In what ways do you procrastinate?

I procrastinate in just about every possible way and always have, since my earliest memories, going back to when I first started going to elementary school and had these arguments with my mother about making my bed.

My mom would ask me to make my bed before going to school. I would say, no, because I didn’t see the point of making my bed if I was just going to sleep in it again that night. She would say, well, we have guests coming over at 6 o’clock, and they might come upstairs and look at your room. I said, I would make my bed when we know they are here. I want to see a car in the driveway. I want to hear a knock on the door. I know it will take me about one minute to make my bed so at 5:59, if they are here, I will make my bed.

I procrastinated all through college and law school. When I went to work at Morgan Stanley, I was delighted to find that although the pace of the trading floor is frenetic and people are very fast, there were lots of incredibly successful mentors of procrastination.

Now, I am an academic. As an academic, procrastination is practically a job requirement. If I were to say I would be submitting an academic paper by September 1, and I submitted it in August, people would question my character.

It has certainly been drilled into us that procrastination is a bad thing. Yet, you argue that we should embrace it. Why?

Historically, for human beings, procrastination has not been regarded as a bad thing. The Greeks and Romans generally regarded procrastination very highly. The wisest leaders embraced procrastination and would basically sit around and think and not do anything unless they absolutely had to.

The idea that procrastination is bad really started in the Puritanical era with Jonathan Edwards’s sermon against procrastination and then the American embrace of “a stitch in time saves nine,” and this sort of work ethic that required immediate and diligent action.

But if you look at recent studies, managing delay is an important tool for human beings. People are more successful and happier when they manage delay. Procrastination is just a universal state of being for humans. We will always have more things to do than we can possibly do, so we will always be imposing some sort of unwarranted delay on some tasks. The question is not whether we are procrastinating, it is whether we are procrastinating well.

When does it cross from good to bad?

Some scientists have argued that there are two kinds of procrastination: active procrastination and passive procrastination. Active procrastination means you realize that you are unduly delaying mowing the lawn or cleaning your closet, but you are doing something that is more valuable instead. Passive procrastination is just sitting around on your sofa not doing anything. That clearly is a problem.

What made you want to take a closer look at the timing of decisions?

I interviewed a number of former senior executives at Lehman Brothers and discovered a remarkable story. Lehman Brothers had arranged for a decision-making class in the fall of 2005 for its senior executives. It brought four dozen executives to the Palace Hotel on Madison Avenue and brought in leading decision researchers, including Max Bazerman from Harvard and Mahzarin Banaji, a well-known psychologist. For the capstone lecture, they brought in Malcolm Gladwell, who had just published Blink, a book that speaks to the benefits of making instantaneous decisions and that Gladwell sums up as “a book about those first two seconds.” Lehman’s president Joe Gregory embraced this notion of going with your gut and deciding quickly, and he passed copies of Blink out on the trading floor.

The executives took this class and then hurriedly marched back to their headquarters and proceeded to make the worst snap decisions in the history of financial markets. I wanted to explore what was wrong with that lesson and to create something that would be the course that Wall Street should have taken and hopefully will take.

You looked beyond business to decision-making in sports, comedy, medicine, military strategy, even dating. What did you find?

I was so surprised to find that this two-step process that I learned from arguing with my mother about making my bed is actually a process that is used by successful decision makers in all aspects of life and in all sorts of time frames. It is used by professional athletes at the level of milliseconds. It is used by the military at the level of minutes. It is used by professional dating services at the level of about an hour.

Question one is: what is the longest amount of time I can take before doing this? What time world am I living in? Step two is, delay the response or the decision until the very last possible moment. If it is a year, wait 364 days. If it’s an hour, wait 59 minutes.

For example, a professional tennis player has about 500 milliseconds to return a serve. A tennis court is 78 feet baseline-to-baseline, and professional tennis serves come in at well over 100 miles per hour. Most of us would say that a professional tennis player is better than an amateur because they are so fast. But, in fact, what I found and what the studies of superfast athletes show is that they are better because they are slow. They are able to perfect their stroke and response to free up as much time as possible between the actual service of the ball and the last possible millisecond when they have to return it.

The international dating service It’s Just Lunch advocates that clients not look at photos, because photos lead to snap reactions that just take milliseconds. It asks that they consciously not make judgments about a person when they first meet them. Instead, they tell clients to go to lunch, wait until the last possible moment, and then at the end of lunch just answer one question: Would I like to go out on a second date with this person? In the same way it frees up time for a tennis player to wait a few extra milliseconds, someone on a date will make a better decision if they free up extra minutes to observe and process information.

What else surprised you?

Most people are taught that you should apologize right away. But I was surprised to find that, in most cases, delayed apologies are more effective. If you’ve wronged a spouse or partner or colleague in some substantive, intentional way, they will want time to process information about what you’ve done. If you acknowledge what you did, and delay the apology, then the wronged party has a chance to tell you how they feel in response, and your apology is much more meaningful.

Do you have any practical advice for how people can learn to better manage delay?

Just take a breath. Take more pauses. Stare off into the distance. Ask yourself the first question of this two-step process: What is the maximum amount of time I have available to respond? When I get emails now, instead of responding right away, I ask myself this. It might seem rude, and it did feel rude at first. But the reality is if you respond to every email instantaneously you are going to make your life much more difficult. If the email really doesn’t have to be responded to for a week, I simply cut the information out of the email and paste it into my calendar for one week from today. I free up time today that I can spend on something else, and I’ll be unconsciously working on the question asked in the email for a week.

[Editor’s Note: It took him three hours to respond to an email of mine. He wrote, rather tongue-in-cheek, “so sorry for the delay!”]

How do we stand to benefit from your message?

If we are going to resolve long-term issues like climate change and sustainability, and if we are going to preserve the innovative focus of private institutions, I think we need a shift in mindset away from snap reactions toward delay. Innovation goes at a glacial pace and should go at a glacial pace.

Epiphany stories are generally not true. Isaac Newton did not have an apple fall on his head. Thomas Edison didn’t suddenly discover the light bulb. Tim Berners-Lee didn’t suddenly invent the World Wide Web. If we are going to be able to resolve long-term problems, we need to create new structures where groups of people are given long periods of time without time pressure and can think in a think tank like way. We will give them a real deadline so they can’t just dither, but I think we need to press our decision-making framework out of the 24-hour news cycle and out of the election cycle into a longer-term time frame of maybe a decade.

What is your next big question?

I am intrigued by epistemology and the question of how we know what we know and the limitations on knowledge. There is an idea circling around the back of my brain. But I am going to take the medicine I advise other people to take, and wait. Let it sit and brew.

This interview series focuses on big thinkers. Without knowing whom I will interview next, only that he or she will be a big thinker in their field, what question do you have for my next interview subject?

I would like to know how your subject knows what they know. What is it about their research and experience and background that leads them to a degree of certainty about their views? With what degree of confidence do they hold that idea? Is it 100 percent? Is it 99 percent? Is it 90 percent?

From my last interviewee, evolutionary biologist Sergey Gavrilets: What would you like to have more opportunity to do or more time to do if you had the chance?

I would like to have more time to play golf, actually. I often have my best creative breakthroughs, to the extent I have them at all, on the golf course—when I have a period of five hours to be around grass and trees with a straightforward but maddening task to occupy me.

Why Polaroid Inspired Both Steve Jobs and Andy Warhol

Smithsonian Magazine

Few companies can claim to have altered the path of an entire medium, but that’s exactly what Polaroid did to photography in the 1950s, ’60s and ’70s. Founded by Edwin H. Land in 1937, Polaroid was the Apple of its day, and Land was the original Steve Jobs. The idea factory churned out iconic products such as the SX-70, the one-step instant camera that now resides in the Smithsonian Cooper-Hewitt, National Design Museum in New York City.

In his new book, “Instant: The Story of Polaroid,” Christopher Bonanos of New York chronicles the rise and fall of the company and details how it changed the way we save memories.

What made you want to write a book about Polaroid?

In 1983, when I was 14, I got my first camera, an old one from the ’50s that I bought in a junk shop. I started using it and there is something bewitching and strange about a picture you see right away. I used it on and off through college and beyond. Then in 2008, when Polaroid announced the very end of instant film production, there was a show going on at the Whitney [Museum of American Art] on Robert Mapplethorpe’s Polaroids. I wrote a little story for New York about this sort of moment when the medium was going away but it was also being celebrated in fine arts. I called up a bunch of Polaroid artists, people like Chuck Close who work in Polaroid film, and they were really angry about having this material taken away from them. It led me to discover that there was a Polaroid cult out there of artists, enthusiasts and people who just love this old way of making pictures.

Your description of Edwin Land was reminiscent of Steve Jobs. In terms of innovation and design, was Polaroid the Apple of its day?

Land and Jobs were both just obsessed with making a product perfect. They both worked like crazy. They both really believed in locating a company at the spot where science and technology meet fine arts. And maybe most important of all they both felt that if you make a fantastic product that the world has never seen before, then the marketing and the selling will take care of itself. Land once said, “Marketing is what you do if your product is no good.”

Thirty years later they asked Jobs how much market research he was doing on whatever the Apple product was at the moment and he said, “We didn’t do any. None. It’s not the consumer’s job to know what he wants.” It’s the same philosophy. Land was one of Jobs’ first heroes and they met a few times in Cambridge. When Land was sort of nudged out of Polaroid and into retirement in 1982, Jobs was interviewed not too long after that and he said “That’s the dumbest thing I’ve ever heard. This man is a national treasure.”

Land made some pretty remarkable predictions for the future. He predicted cell phone photography and Instagram.

He may not have specifically seen exactly the device you have in your hand, but he came pretty close. There’s a fantastic film of Land from 1970 where he’s explaining his vision of the future of photography as he saw it when he started the business in 1937. He said we’re a long way from a camera that will be like the telephone, something you use every day like your pencil or your eyeglasses. Then what he does is he reaches into his breast pocket, pulls out a wallet and says, “It would be like a wallet.” The thing is black and about 7 inches long and 3 inches wide, and he holds it up in front of his eyes vertically, and it looks for all the world like he’s got a cell phone in his hand. Really, the thing he wanted was almost no impediment between the photographer and having the picture available to you. In the early days of Polaroid you had to pull tabs and throw switches and things to make the processing procedure work. His goal all along had been: you click, it does everything, and then you just see your picture. Effortless. A cell phone is about as close as you’re going to get to that.

Why did famous photographers such as Ansel Adams and Walker Evans like using Polaroids so much?

Different people liked it for different reasons. Adams loved Polaroid because he was such a technician in black and white that he could really see what he was doing on the spot. If he was hauling a camera up into Yellowstone on his back or in his station wagon, it was extremely valuable to him to be able to see a picture on the spot. Other people liked it for other reasons. Andy Warhol liked the intimacy and that you could see what you got right away. Other people were impatient especially when they were learning. Mapplethorpe learned to shoot with a Polaroid camera because he was both unwilling to wait for the lab and also because a lot of his photos were so explicit that it was not a good idea to send them to the lab.

Image by Princeton Architectural Press / Danny Kim. Edwin Land felt the SX-70 was his ultimate achievement. It was also a fantastic business success. (original image)

Image by Image given as gift to David Bias and Anne Bowerman. A test photo of Land taken on March 13, 1944. (original image)

Image by Princeton Architectural Press / Danny Kim. The Model 95 went on sale in November 1948, and outsold even Land's optimistic projections. (original image)

Image by Princeton Architectural Press / Danny Kim. Polaroid film package redesigns, before and after. (original image)

Image by Bill Ray. Andy Warhol liked the intimacy of Polaroid. You could get up close to people and you could see what you shot instantly. (original image)

Image by Photography courtesy the artist and The Pace Gallery. Chuck Close used the 20x24 Polaroid camera to produce immense images of his own face, including the breakthrough 1979 work Self-Portrait/Composite/Nine Parts. (original image)

Image by David Hockney No. 1 U.S. Trust. The very first SX-70 color print. The man in the photo is engineer Leonard Dionne, and his colleague Al Bellows snapped the photo. (original image)

Image by Princeton Architectural Press / Danny Kim. The Swinger, introduced in 1965 and aimed at teenagers, sold like crazy, even though its photos were small and black-and-white-only. (original image)

Image by Princeton Architectural Press / Danny Kim. The lawsuit between Polaroid and Kodak over their competing instant-camera lines was the biggest patent case of its time. (original image)

Image by The Impossible Project. The Impossible Project's first efforts in developing Polaroid film. (original image)

Image by Jamie Livingston. Jamie Livingston's simple project—a single Polaroid picture every day, with no retakes, even if a better one presented itself—ran for more than 6,000 days, from 1979 to his death, in October 1997. This one's from March 30 of his final year. (original image)

Image by Ellen McDermott. Christopher Bonanos of New York chronicles the rise and fall of Polaroid. (original image)

What do you consider the most iconic photographs ever taken with a Polaroid?

The Warhol portraits that you see in galleries and museums all the time of Liza Minnelli and Elizabeth Taylor are based on those silk screens, which are in turn based on Polaroid photos he shot of all these people. That was his work process. He would take about 50 portraits of anybody he was going to do a painting of and work from those to make silk screens. A number of the Ansel Adams landscapes of Northern California, the ones you see of Yosemite and other famous scenes, were also often shot on large-format, professional-grade Polaroid film. There’s that one portrait, “El Capitan Winter Sunrise,” from 1968 that is like nothing else. It’s a fantastic demonstration of what you can do with the right camera and a sheet of Polaroid film.

Describe the rivalry between Kodak and Polaroid that resulted in the biggest settlement ever paid out.

They had this uneasy dance for most of their lives because Kodak was, in the beginning, Polaroid’s first big customer and for many years supplied certain components of Polaroid film. Then they sort of had a falling out in the late ’60s because Kodak realized that it had been supporting not a company that was complementary to its business but somebody who was increasingly taking market share. Kodak had also heard the first inklings of SX-70, which was going to be a blockbuster if it worked, and they suddenly thought, “Are we giving away the game here?” When SX-70 came around Kodak had a big program going to produce its own instant camera and film, which came around four years later. In 1976, Kodak introduced its instant photography line. A week and a half later Polaroid sued them for patent infringement.

They spent 14-and-a-half years in court and when the settlement came in Polaroid vs. Kodak, Polaroid won. Kodak not only had to pay the largest fine ever paid out, which was nearly a billion dollars, but also had to buy back all those cameras. If you had a Kodak instant camera in the ’80s you got a letter saying Kodak will send you a check or a couple shares of stock. The total in the end was $925 million that Kodak had to pay Polaroid and it stood as the largest ever settlement paid out in a patent case until last month when Samsung was ordered to pay Apple $1.049 billion in damages. [Samsung is appealing the decision.]

Land felt as though Kodak had come along with a clumsier, less elegant version of exactly what he’d done without advancing the game and he was a little offended. He once said, “I expected more of Eastman.” In Apple vs. Samsung, a great deal of what was driving things at the beginning was that Jobs was disgusted with Android for exactly the same reasons. It was precisely the same competitive instincts shot through with outrage at the mediocrity of it all.

What started the downfall of Polaroid?

There are a lot of different threads that sort of come together. It’s little stumbles that turn into a snowball effect. Land didn’t put a good successor in place or, more accurately, he didn’t have a succession plan in place. His successors did some things right and some things wrong, but what was missing in the time after Land’s leadership was a big idea. They did a pretty good job of coming up with products that enhanced the technology they already had but they never quite figured out what the next thing was going to be. There were big research projects within Polaroid to work on digital cameras, to work on ink-jet printers and other technologies. A combination of conservatism and entrenched habits and a little fear of what the future without film would look like economically all snowballed together to sort of bind up the company in one business model that it had been building for a long time.

What is “The Impossible Project” and how do they hope to bring Polaroid back?

The current Polaroid is alive; they are trying to make interesting little products again. It’s a much smaller worldview than they once had.

Then there is “The Impossible Project.” When Polaroid quit the film business in 2008, Dr. Florian Kaps, André Bosman and Marwan Saba dived in and bought the tooling in the very last factory before it was torn down. They have spent a couple of years trying to make film and, when they introduced it in 2010, it was definitely a beta test. The first-generation film was very problematic. They weren’t able to use the old formulas because they couldn’t get the chemicals anymore; those companies went out of business. Each batch since then has gotten better, and last month they introduced the first film that actually behaves like Polaroid 600 film did. It looks like it’s supposed to. It’s easy to shoot and it is marvelous. They really finally got it to where it needed to be.

Why People Turn to Lemurs and Other Endangered Animals for Dinner in Madagascar

Smithsonian Magazine

Madagascar is home to many unique and threatened mammals, such as lemurs and small hedgehog-like creatures called tenrecs. Most people wouldn’t think of consuming one of these animals, but for many in Madagascar, bushmeat is on the menu. Scientists assumed that people turned to wild meat just to survive, but two new studies that examine the entire supply chain for this meat have found that consumption of wild mammals in Madagascar is common and far more open a practice than anyone had suspected.

“One of the issues that’s maybe stymied progress [in thwarting the bushmeat trade] is that it always felt like there was a fight between: Are these people starving? Or are they just rich and they want to eat bushmeat as a luxury good?” says the studies’ lead author Kim Reuter, a biologist previously of Temple University and now at Conservation International in Nairobi. “But I want people to see that the reality is less homogenous, in that these are normal people” eating these animals.

In many cases, ordinary people are buying wild meat when they have some extra money, and the commercial part of the bushmeat trade is out in the open and easy to find, Reuter and her colleagues report in PLOS One and an upcoming paper in Environmental Conservation.

A cook prepares wild bat for a restaurant in Madagascar. (Kim Reuter)

Reuter and her colleagues interviewed people in cities and rural towns across northern Madagascar, including in the capital, Antananarivo, in May through August 2013. At every fifth house, the scientists knocked and asked the head of the household about their meat preferences and meat consumption during the last three days, as well as over their lifetime.

The study area covered a cross-section of northern Madagascar, ranging from urban to rural and including many ethnic and religious groups. Some 83 percent of those surveyed said they held taboos against eating certain kinds of meat. These taboos varied by religion, tribe, family and region. Muslims, for example, are not supposed to eat any forest animals, including bushmeat. And families often have taboos against eating specific animals, such as lemurs or tenrecs, which some believe to be associated with bad agricultural harvests.  

Reuter’s team heard other reasons for avoiding bushmeat, as well. “We're in this village in the middle of nowhere,” she recalls, “and this old guy would just tell us, ‘Oh, I don’t eat any lemurs anymore. It’s bad for my cholesterol.’”

Still, 78 percent of people surveyed had eaten wild meat in their lifetimes, and 31 percent had eaten it in the previous six to eight months.

Those surveyed gave different reasons for eating different mammals. For example, they often ate carnivores like the cat-like fossa because the animals ate human food or were threatening farm animals. Lemurs and tenrecs, in contrast, tended to be consumed for subsistence, and bats and wild pig were eaten when people had income to spend.

A smaller study, from 2014, had estimated that 98 percent of wild meat in Madagascar was obtained informally, through hunting, bartering or gifting. But Reuter’s team found that in rural areas, about 30 percent of the bat and lemur meat was purchased. And urban residents, their survey showed, purchased 56 percent of the bat meat they ate and 62 percent of their wild pig meat in markets or restaurants. The commercial trade in urban areas was concentrated in a few well-known market stalls and restaurants. Reuter also saw packaged, frozen wild pig available in some supermarkets.

In Madagascar, some market stalls openly sell bushmeat, such as wild pig. (Haley Randell)

These markets and restaurants were not hard to find. “Once we started asking,” says Reuter, “everyone was like, ‘Of course, that place down the street, didn’t you know?’” She had even eaten at one restaurant without noticing that bushmeat was on the menu.

“This type of comprehensive study is really important,” says Drew Cronin, a conservation biologist at Drexel University who studies the bushmeat market in Equatorial Guinea in Africa. “It's hard to target conservation planning unless you've been out there and have the on-the-ground knowledge.”

This new trove of information about wild meat eating suggests that better enforcement of the law could help to conserve the rare fauna of Madagascar, says Reuter. Hunting is currently limited by law, but she says none of the hunters she met had a permit to hunt because the rules are overly complex and not well-communicated. Outlawing all hunting wouldn’t be a great option, however, because some people do need bushmeat to survive, she says. Conservation efforts might be better spent on targeting the commercial trade in bushmeat at markets and restaurants.

In addition, says Cronin, “Education and outreach is pretty much always positive. The only drawback is, it's a long game.”

During her research, Reuter also noticed that some bat, wild pig and tenrec meat was priced high enough that it’s probably aimed at the tourist market. She suggests educating tourists and adopting a voluntary labeling scheme for meat that has been obtained legally, such as from wild pigs that threatened livestock.

“I believe that if we don’t act on this now,” she says, “it doesn’t matter what research we do. There won’t be much bushmeat left in 10 years to study.”

Why Museums Should Be a Safe Space to Discuss Why #BlackLivesMatter

Smithsonian Magazine

The deputy director of the Smithsonian's National Museum of African American History and Culture had a problem. At the April 25 symposium “History, Rebellion, and Reconciliation,” her panel was a no-show. A law professor and two writers were running late and had yet to appear.

So to fill the gap, Kinshasha Holman Conwill called upon “Brother Ellis" and with some heavy coaxing, she convinced Rex Ellis, the museum's director of curatorial affairs, to sing a duet—a rendition of Bernice Johnson Reagon's “Ella’s Song.” 

“We, who believe in freedom, cannot rest until it comes,” they sang. “Until the killing of a black man, a black woman’s son, is as important as the killing of a white man, a white woman’s son.”

That move, in many ways, defined the spirit of the day-long symposium. The event featured speakers that ranged from the award-winning director Ava DuVernay (Selma) to the Pittsburgh-based emcee and community activist Jasiri X, and pastor Osagyefo Sekou to Black Alliance for Just Immigration executive director Opal Tometi.

Topics titled “Making Revolution Irresistible” and “Ferguson: What Does This Moment Mean for America?” proved even timelier than organizers could possibly have imagined. Earlier that week, 25-year-old Freddie Gray of Baltimore had died in police custody, and the city was experiencing a good deal more rebellion than reconciliation. Just hours after the symposium ended, a message on the scoreboard at Baltimore’s Camden Yards noted a plea from the city’s mayor and police department that fans remain in the ballpark until further notice “due to an ongoing public safety issue.” By Monday, after Gray's funeral, violence erupted in the city with looting, fires and injuries. By Tuesday, the governor of Maryland had called in the National Guard.

Back at the conference, Lonnie Bunch, the museum's founding director, told about 115 attendees that the developments in Baltimore were the latest in a series of events that has sparked a national conversation.

“Ferguson. Cleveland. Staten Island. North Charleston. Baltimore. All these places have been seared into our consciousness. Yet this violence, this loss of innocence, and loss of life is not just an issue in the African American community,” he said. “It casts a shadow on native communities, on Latino communities. It casts a shadow on almost every corner of the American experience.”

It was somewhat of a refrain at the symposium that museums can provide “safe,” or even “sacred” spaces, within which visitors could wrestle with difficult and complex topics. Just two days before the event, someone had asked Bunch why his museum—just 18 months before opening its new building on the Mall—would engage in such a controversial issue.

“Well he didn’t really say it that way. He said, ‘Are you crazy?’” Bunch said. “I guess the answer is, yeah. I am. In some ways, isn’t that our job? Our job is to be an educational institution that uses history and culture not only to look back, not only to help us understand today, but to point us towards what we can become.”

By providing that Janus-like context of looking simultaneously forward and backward, the Smithsonian is well positioned to host conversations on topics like race and fairness, said the Institution's acting secretary Al Horvath. “It’s been said that the Smithsonian is in the forever business, and that’s true. It’s a privilege to be the guardians of many of America’s greatest treasures,” he said. “The Smithsonian is definitely also in the now business. We are using our convening power to address issues of the day.”

In his previous role as vice president of Colonial Williamsburg’s Historic Area, Ellis, who sang the duet, observed something about the aura of a church on the grounds which made visitors “less fidgety, less anxious, and less playful.” Something about the sacred space suggested to people that they were in a different sort of place and that they had to “upgrade” their behavior, he said. “I think that happens in the museum setting.”

People used to call museums “cathedrals,” Bunch said in an interview. He previously directed the Chicago Historical Society and held curatorial positions at the California African American Museum and the Smithsonian’s National Museum of American History. Religion is treated differently in Chicago—which is “comfortable with the political, cultural and business communities coming together to discuss issues”—than it is in Washington, D.C., or Los Angeles, he said. Bunch hopes to bring more of that Chicago model to the Mall, and he noted the museum's program at 19th Street Baptist Church. “That allows us to really amplify the possibilities of what we can do here in D.C.,” he said.

There’s evidence it may already be changing at least some minds. Two-thirds of the way into the program, the symposium’s Twitter hashtag had already attracted more than 20 million Tweets—the largest number the museum has ever received. Among those messages were a couple from a user who self-identifies as a Northern Virginia activist and rap artist. “Great symposium, lots to unpack… surprised how radical it all was in a public space,” he Tweeted. “I’m used to many of the topics covered in today’s … symposium in private, was weird and refreshing to hear such radical stuff in public.”

But however “safe” museum spaces are, they aren’t without their challenges. Some people perceive museums—including the Smithsonian—as spaces likelier to engage in conservative conversations than grassroots ones, says Ellis, who hopes to show visitors that the museum can address both history and contemporary grassroots issues.

Why Marquis de Lafayette Is Still America's Best Friend

Smithsonian Magazine

In her new book, Lafayette in the Somewhat United States, writer Sarah Vowell tells the story of the American Revolution through the life and experiences of Marquis de Lafayette, the French aristocrat who joined the Continental Army as a teenager, convinced King Louis XVI to ally with the rebels, and became a close friend of George Washington.

Lafayette symbolizes many things for Vowell: the ideals of democratic government, the hard reality of those democracies, the tremendous debt early Americans owed to France and the importance of friendship. Like her previous books, such as Assassination Vacation, Lafayette strikes witty blows against the stodgy sorts of U.S. history taught in classrooms. It's less a history book than a collection of stories. I spoke with her last week about her work, her opinion of Lafayette, why she doesn't consider herself a historian, and what she admires about the hit Broadway musical Hamilton.

The interview was edited and condensed.

Why did you decide to write a book about Marquis de Lafayette?

That question always stumps me. There are so many answers to that. I lived near Union Square in New York City for about 10 years. There's a statue of Lafayette in the square and it's right next to the sidewalk, so I walked by him pretty much every day. He was one of my neighbors so I was always thinking about him. And also, I had written a shorter piece a number of years ago about Lafayette's return trip to America in 1824.

Was that the story that appeared on This American Life?

Yes, yeah. It was for a show about reunions and that piece was a very kind of sentimental journey, literally, about how he came back in 1824. He was invited by President Monroe, he stays for over a year and the whole country goes berserk for him. It's just Lafayette mania. Two-thirds of the population of New York City meets his ship. Every night is a party in his honor. And I guess the reason that story attracted me was because of the consensus that the whole country embraced him. By 1824, the Civil War is pretty much a foregone conclusion. But because he was a Frenchman and because he was the last living general from Washington's army, the whole country—north and south, left and right—he belonged to everyone and that seemed so exotic to me.

So Lafayette comes back to America in 1824, just shy of 50 years after the revolution. Eighty thousand people meet him at New York Harbor. It's an enormous crowd.

Totally. Yes. Only 4,000 met The Beatles in 1964.

So why was Lafayette universally beloved when he returned?

I think there are a few reasons. He is, basically, the most obvious personification of America's alliance with France in the war. And Americans back then were still grateful for French money and gunpowder and soldiers and sailors. The help from the French government was the deciding factor in the revolution. Lafayette was the most swashbuckling symbol of that. There was also, then and now, a great reverence and almost a religious love for George Washington. Lafayette had served with Washington and became his de facto adopted son—Lafayette was an orphan and Washington had no biological children of his own—so their relationship was very close. And so, he was so identified with Washington.

The visit also coincided with the presidential election of 1824, which is basically the first election when Americans had to vote for a non-founding father. There was this nostalgia, this kind of national moment of reflection about how the country had to continue on without its fathers. Lafayette's secretary kept a diary during that whole trip. He marveled that these newspapers would be full of bile about presidential candidates, then Lafayette would show up, and the day's paper would be all like, "We 'heart' Lafayette." Those two things are related a little bit, nostalgia and reverence for that very singular past and nervousness about the future.

And what happened? Why don't we feel that way anymore?

Well, he has been a little bit forgotten, but I think you could say that about many, many figures in American history. I think the forgetting of Lafayette is just a symptom of the larger cultural amnesia. When I was starting my research on this book, there was this survey done by the American Revolution Center that said most adult Americans didn't know what century the Revolution was fought in. They thought the Civil War came first. They didn't know the Bill of Rights was part of the Constitution. So yes, Lafayette is a little bit forgotten, but so are a lot of other things more important than him.

You mention in the book this idea that Lafayette is no longer a person. His name is a bunch of places now.

The most practical effect of his visit in the 1820s was that everything started getting named after him. When I was at Valley Forge, I was with this friend of mine who had lived in Brooklyn. There was a monument to the generals who had been at Valley Forge: Lafayette was one of them, and General Greene and DeKalb. And I remember my friend just calling it "that big monument thing with all the Brooklyn streets." A lot of these people just become street names. It's natural that these people leave behind their names and their stories are forgotten, I suppose. But for me, every time I would walk, say, past the statue of Lafayette down towards Gansevoort Street, the whole city came alive. If there's any practical effect of learning about this stuff, it just makes the world more alive and interesting. And it certainly makes walking around certain cities on the eastern seaboard more fascinating.

Let's rewind five decades. Lafayette crosses the Atlantic in 1777, at age 17. He abandons his pregnant wife—

That was unfortunate.

He leaves behind a comfortable aristocratic life. His family doesn't even know what he's doing and it's all to fight in someone else's war.

Right.

Why?

When you put it like that it does not seem like a good idea.

Plenty of 19-year-olds have bad ideas.

Oh, for sure. I would distrust one who only made good decisions. There are a few reasons for his decision to fight. Lafayette married quite young. He's a teenager. He's the richest orphan in France, and he's kind of pounced upon by this very rich and powerful family, and then he marries their daughter. His father-in-law wants him to get a cushy, boring job at the French court and be a proper gentleman, but Lafayette is the descendant of soldiers. His ancestors are soldiers going back to the Middle Ages. One of his ancestors fought with Joan of Arc. His father, who died when Lafayette was almost two years old, was killed by the British in battle during the Seven Years' War.

There's a grudge there.

That's one reason he's pretty gung ho to fight the British in America. He wants to be a soldier like his father before him and all the fathers before that. He's just one of many European soldiers who flocked to the American theater of war to volunteer with the rebels, some of them not for particularly idealistic reasons, but because they were out of a job. The defense industry in Europe was downsizing. Lafayette is one of these Frenchmen who are coming over to fight.

The other thing is, he got bitten by the Enlightenment bug and was enamored with ideals about liberty and equality. The letters he writes to his poor, knocked-up wife while he's crossing the ocean are incredibly idealistic. He says that the happiness of America will be bound up with the happiness of mankind, and then we'll establish a republic of virtue and honesty and tolerance and justice. He's laying it on a little bit thick because he has just abandoned her. But it's still very stirring, and I do think he believed it.

So after all of your research, after writing this book, spending a lot of time trying to get into his head, how do you feel about Lafayette? Do you like him?

Do I like him? Yes, I do like him. I am very fond of him. He's a very sentimental person. I think part of that was his youth, maybe his being an orphan. Jefferson complained of his canine appetite for affection. Lafayette has this puppy-dog quality.

He was kind of a suck-up.

Yeah, he was. But I like puppy dogs. And when push came to shove, Lafayette got the job done. For all of his French panache, he really did roll up his sleeves and set to work on behalf of the Americans. Maybe it was bound up with his lust for glory.

Washington was constantly dealing with desertion crises. His soldiers are deserting him in droves throughout the whole war. And who can blame them? They're not getting paid. They're not getting fed. There's frequently no water. A lot of them don't have shoes. It's a really crummy job. But then this kid shows up like a football player asking his coach to put him in the game.

In his first battle, the Battle of Brandywine, he's wounded and barely notices because he's so busy trying to rally all the patriot soldiers to stand and fight. He never turns down an assignment. He's always ready to get in the game. And then, when he goes back home to Paris after the war, he's constantly helping the American ministers, Jefferson and Monroe, with boring economic stuff. There's not much glory in that. But Lafayette lobbied to get the whalers of Nantucket a contract to sell their whale oil to the city of Paris. That's real, boring, grownup friendship. And then to thank him, the whole island pooled all their milk and sent him a giant wheel of cheese. What was your question?

Do you like him?

Yes, I do like him. The thing I like about nonfiction is you get to write about people. The older I get, I feel I have more empathy for people's failings because I've had so much more experience with my own. Yes, he was an impetuous person. But generally, I think he was well intentioned. And he also really did believe in these things that I believe in. So, yes. Is he a guy that I want to have a beer with?

Would you?

Yeah, of course. Who wouldn't want to meet him?

In this book, you describe yourself as "a historian adjacent narrative nonfiction wise guy." Self-deprecation aside, how does that—

I don't think of that as self-deprecation. You're thinking of that as self-deprecation in the sense that a proper historian is above me on some hierarchy. I don't think that way at all.

I meant that, in the book, it's played a little bit as a joke. You're teasing yourself, right?

I am, but I'm also teasing Sam Adams, because he says, ["If we do not beat them this fall will not the faithful Historian record it as our own Fault?"] I don't think of myself as an historian and I don't like being called one. And I also don't like being called a humorist. I don't think that's right, partly because my books are full of bummers. I reserve the right to be a total drag. I just consider myself a writer. That's one reason I don't have footnotes. I don't have chapters. I just want to get as far away from the stench of the textbook as I can. I inject myself and my opinions and my personal anecdotes into these things in a way that is not historian-y.

Given how you describe your work, and the empathy you've developed towards peoples' flaws, what can you write about that historians can't?

For one thing, empathy can be really educational. If you're trying to look at something from someone else's point of view, you learn about the situation. You might not agree. But as I go on, I become maybe more objective because of this. Ultimately, there's something shocking about the truth.

I'll give you an example. My last book was about the American takeover of Hawaii in the 19th century. It's the story of how native Hawaiians lost their country. It's a big part of their lives and it's a huge part of their culture. And if you go back to the historical record, there are kind of two narratives. There's the narrative of the missionary boys and their descendants, how these New Englanders took over these islands. Then there's the native version of those events, which is necessarily and understandably upset about all of that.

You're trying to parse complicated histories. There's one line early in the Lafayette book that seems related to this: "In the United States there was no simpler, more agreeable time." Why do you think it's so hard for us to recognize dysfunction within our own history? And where does this temptation to just indulge nostalgia come from?

I don't know. I just loathe that idea of the good old days. Immoral behavior is human nature. So I don't know why there's this human tendency to be nostalgic about the supposedly superior morals of previous generations.

Why is it so difficult to recognize and acknowledge the role that dysfunction has played?

I think it has to do with this country. History is taught not as a series of chronological events, but as adventures in American exceptionalism. When I was growing up, I was taught America never lost a war because "America is God's chosen nation." I started kindergarten the year the helicopters were pulling out of Saigon.

It's funny, one reason why Americans loved Lafayette was because of how much he loved them. In 1824 or 1825, he's speaking before the joint houses of Congress and he says, "America will save the world." What European thinks that? We love to think about ourselves as helpful and good.

As saviors?

Yeah. And sometimes, the historical record doesn't back that up. That's true of every country. But unlike every other country, we have all of these documents that say we're supposed to be better, that say all men are created equal. All of the great accomplishments in American history have this dark backside. I feel very reverential of the Civil Rights Movement. But then you think, well, why was that necessary? Or all of these great amendments we're so proud of. It's like, oh, everyone can vote? I thought we already said that.

So how do you—

Let me say one more thing. You know that scene in Dazed and Confused where the history teacher tells the class that when you're celebrating the Fourth of July, you're celebrating a bunch of like old white guys who didn't want to pay their taxes? I'm not one of those people. I don't think it's all horrors and genocide and injustice. I do think it's still valuable to celebrate those founding ideals. And there are some days that the idea that all men are created equal, that's the only thing I believe in. I think those ideals are still worth getting worked up about.

Just because Jefferson owned slaves, I don't think that completely refutes the Declaration. I think you have to talk about both things. I'm not completely pessimistic about it. That's what I love about nonfiction: if you just keep going back to the truth, it's the most useful and it's the most interesting. I don't want to be a naysayer or a "yaysayer." I want to like say them both together. What would that word be?

Ehhsayer?

Yeah, kind of.

So what's next? Do you have plans for another book?

It's what I do for a living so I would hope so. I have a few ideas floating around but I was actually so late.

With this one?

Yeah. And I still haven't recovered. My books, I think they seem breezy to read. I write them that way purposely. But it's incredibly time consuming to put all that together and edit out the informational clutter. I just hate jargon and pretentious obfuscation. This book, which seems like a nice romp through the Revolutionary War, was actually tedious and life sucking to put together. So, yes, I'll write another book when I get over writing this one.

Have you seen Lin-Manuel Miranda's Hamilton musical [which features a rapping, dancing Marquis de Lafayette]?

I have.

What did you think of it?

I mean, what's not to like?

Well, it's not about Lafayette.

No, it's not about Lafayette. That is my one complaint about Hamilton. It has too much Hamilton sometimes. The thing I loved about it most, honestly, was aesthetic. It so perfectly utilized every aspect of theater. It just milked the meaning out of everything. And the nonstop force of the narrative and the rhythm is so effusive and hilarious. I love how alive it is and how alive the people onstage are.

Daveed Diggs!

Daveed Diggs, yes. Daveed Diggs and his hair. He has so much swagger and joie de vivre. I do love how funny it is. But I also like how it doesn't run away from all of these people and their foibles and how they didn't get along.

What would happen if you and Lin-Manuel Miranda went head-to-head, high school debate style?

I'm glad it's high school debate style and not a rap battle because I'm pretty sure he would kick my ass.

Hamilton versus Lafayette. The battle of American heroes. Who wins?

That's the thing. You don't have to choose. I mean, basically, it's going to be Washington. That's even one of the songs, "It's good to have Washington on your side," I think. They each have their contributions. I mean, probably, ultimately, the banking system is more important day-to-day.

We're lucky we don't have to choose.

It'd be a pretty interesting choice to have to make. But, obviously I hope I never have to debate that guy.

The musical is very concerned with the legacies of historical figures. We talked a bit about this already, the idea of what Lafayette has become. What do you think his legacy is today, aside from the statues and the colleges and the towns? What does he represent?

More than anything, he represents the power and necessity and joys of friendship. I think of him as America's best friend. The lesson of the Revolutionary War in general, and of Lafayette in particular, is the importance of alliance and cooperation. A lot of my book is about how much bickering was going on, but I still call it the "somewhat United States" because the founders were united enough. Britain loses because Britain was alone. America wins because America has France. It's easier to win a war when you're not in it alone. And it's easier to live your life when you're not in it alone.

The friendship among those men is one of their more enduring legacies. It's why we call them, we think of them, we lump them together as "the Founding Fathers." They didn't really get along, and maybe they didn't even like each other a lot of the time, but they were in it together.

Why Lie Detector Tests Can't Be Trusted

Smithsonian Magazine

Francis Gary Powers had his first polygraph experience right after signing up as a pilot for the CIA’s U-2 program in January 1956. In his memoir, Powers described being called into a room where he was confronted with the question,

“Any objection to taking a lie detector test?” Though I had a great many, I didn’t voice them, shaking my head. If this was a condition of the job, I’d do it. But I didn’t like it. … I had never felt so completely exposed, as if there was no privacy whatsoever. If at that moment someone had handed me a petition banning polygraphs forever from the face of the earth, I would gladly have signed it. When I was asked the last question and the straps were taken off, I vowed that never again, no matter what the circumstances, would I undergo such an insult to my integrity.

Yet Powers would later take another polygraph test, with even higher stakes.

Powers’ case would be an uncommon one, but the polygraph was considered an essential tool in that period, for reasons that had little to do with getting to the truth. The polygraph was more of an attempted answer to a central Cold War conundrum: How could Americans fulfill their pledges to oppose an allegedly totalitarian enemy without becoming totalitarian themselves?

To square this particular circle, federal agencies, first and foremost the CIA, began using a controversial technology developed by psychologists in the early 20th century, and then refined and applied by the police and private businesses since the 1920s. Polygraph measurements—derived from changes in blood pressure, breathing depth, and skin conductivity of an electric current—have never been proved to be reliable indicators of deception. Not only is genuine emotional turmoil hard to reproduce in laboratory studies, but such emotional responses are not uniform among humans and can be imitated by countermeasures (such as pinching yourself before giving a response). In large screening tests, significant numbers of “false positives” (innocent people being labeled deceptive) are unavoidable.
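For a sense of scale, here is a minimal, purely illustrative calculation showing why mass screening with an imperfect test inevitably flags many innocent people. Every number below is an assumption chosen for demonstration, not a figure from polygraph research or agency records:

```python
# Back-of-the-envelope screening arithmetic. All numbers are assumptions
# chosen purely for illustration, not figures from polygraph studies,
# CIA records, or the article above.

def screening_outcomes(population, base_rate, sensitivity, specificity):
    """Return (true positives, false positives) for one round of screening."""
    deceptive = population * base_rate               # people actually lying
    truthful = population - deceptive                # people telling the truth
    true_positives = deceptive * sensitivity         # liars correctly flagged
    false_positives = truthful * (1 - specificity)   # honest people wrongly flagged
    return true_positives, false_positives

# Hypothetical: 10,000 employees screened, 0.1% are actual spies,
# and the test is assumed to be "90% accurate" in both directions.
tp, fp = screening_outcomes(10_000, 0.001, 0.90, 0.90)
print(f"Spies caught: {tp:.0f}, innocent people flagged: {fp:.0f}")
```

Under those assumed numbers, roughly nine actual deceivers are caught while nearly a thousand truthful employees are wrongly flagged; the rarer real spies are, the worse that ratio becomes, no matter how the test is tuned.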

In addition, the question of whether deception during a polygraph test indicates a person is unsuitable for employment transcends merely technical issues. In the final analysis, American security agencies never arrived at a definition of what personal characteristics a model employee should have. Instead, the polygraph provided reasons for dismissing a person as a security risk or denying him or her employment.

Leonarde Keeler was the first American to receive a patent for a polygraph. His patent, granted on January 13, 1931, described the machine as an "apparatus for recording arterial blood pressure." (U.S. Patent 1,788,434)

Bureaucratic usefulness, rather than any scientific validity, goes a long way toward explaining why the polygraph became a standard instrument of the American national security state. The case of Powers and his history with polygraphs is instructive.

From 1956 to 1960, 24 U-2 flights over the USSR yielded invaluable strategic intelligence on Soviet military capabilities. But on May 1, 1960, disaster struck when Powers’ plane was shot down over Sverdlovsk (today called Yekaterinburg). American authorities issued a cover story about a weather balloon gone astray and were caught flat-footed when Nikita Khrushchev presented to the world the remnants of the plane, and then the pilot himself. Powers had miraculously survived and was subsequently put on trial in Moscow and sentenced to 10 years in prison for espionage. In February 1962, he was exchanged for Soviet KGB colonel Vilyam Fisher (alias Rudolf Ivanovich Abel).

Powers returned home a hero under suspicion. Unbeknownst to him and the American public, doubts about his truthfulness arose due to National Security Agency intercepts of Soviet responses to the U-2 flights. Tracked radar signals indicated that Powers’ plane had dropped below its regular altitude of 65,000 feet, making it vulnerable to surface-to-air missile attacks. But Powers vehemently denied that he had allowed the plane to descend. The CIA, fearing for its then-stellar reputation with the American public, insisted on Powers’ innocence as well.

CIA director John McCone set up a board of inquiry under a federal judge, E. Barrett Prettyman, to prepare a statement for public consumption. The document highlighted that medical tests, a background check, and an interrogation had confirmed that Powers “appeared to be truthful, frank, straightforward. … He volunteered with some vehemence that, although he disliked the process of the polygraph, he would like to undergo a polygraph test. That test was subsequently duly administered by an expert. … [Powers] displayed no indications of deviation from the truth in the course of the examination.”

Contrast this with Powers’ own version of his treatment: Getting frustrated by “doubts about my responses, … I finally reacted angrily, bellowing: ‘If you don’t believe me, I’ll be glad to take a lie detector test!’ … Even before the words were out of my mouth, I regretted saying them. ‘Would you be willing to take a lie detector test on everything you have testified here?’ … I knew that I had been trapped.”

Francis Gary Powers holds a model of a U-2 spy plane as he testifies before the Senate Armed Services Committee. Powers' plane was shot down by the Soviets, and he was tried and convicted of spying in the USSR. (Bettmann/Getty Images)

Since shortly after its creation in 1947, the CIA has used the polygraph as part of its personnel security procedures to ascertain the truthfulness of job applicants and employees and to confirm the bona fides of agents. At the height of McCarthyism, utilizing a machine known by the public as a “lie detector” made sense, especially for a brand-new agency that had to be staffed quickly. To its proponents, the polygraph represented a promise of objectivity and fairness along with effective deterrence of spies and traitors. As a CIA inspector general report from 1963 emphasized, “We do not and could not aspire to total security. Our open society has an inherent resistance to police-state measures.”

When challenged by Congress, which investigated federal polygraph use repeatedly beginning in the mid-1960s, the CIA defended the polygraph aggressively. In 1980, the Director of Central Intelligence’s Security Committee insisted: “The utility of the polygraph interview as part of security processing has been demonstrated by empirical means. … These practical results, plus more than thirty years’ experience, make the use of the polygraph in security screening truly unique and indispensable.”

Yet internally, CIA bureaucrats admitted that the practice of sorting out job applicants and employees based on their test results was questionable at best. Even after decades of polygraph practice, the CIA could not define what exactly it meant by elusive terms such as “routine” and “voluntary” in its polygraph program. A 1974 list of questions from polygraph examiners to the general counsel included the following query: “What can a polygraph officer say in response to the question: ‘Do I have to take this test to get a job with the Agency?’ or ‘What happens if I don’t take the test?’” The relevance of the evidence produced during most polygraph tests was also unclear. “The precise yardstick for the measuring of security reliability of an individual continued to be elusive,” an internal CIA history on personnel security concluded in 1973.

Up until his death in a helicopter accident in 1977, Powers insisted that he had acted as a loyal American under trying circumstances. No definite account of the incident has been established yet. We also don’t know what data Powers’ polygraph test produced. However, it is reasonable to conclude that the Kennedy administration found it advisable to assure the public of Powers’ truthfulness, and that announcing that Powers had passed a polygraph test was part of their public relations strategy.

Powers’ experience highlights three ambiguous characteristics of polygraph use by the CIA for purposes of “national security.” First, the claim by polygraph proponents that the test could be a witness for the defense, exonerating loyal citizens, often turned out to be less than clear-cut. Second, while the polygraph relied on the rhetoric of voluntarism, in reality the pressure to take the test often mocked the idea of a free decision. Third, polygraph exams often served to provide official cover rather than revealing the truth of events.

Other questions haunted the polygraph throughout the Cold War, and the often-traumatic experience of the test provoked fierce protests from Americans across ideological lines. Journalists Joseph and Stewart Alsop, two otherwise unrelenting Cold War boosters, compared the polygraph to the embrace of an octopus whose “electric tentacles” produced an “overwhelming impulse to tell all … in order to appease the octopus machine.” Even former chief of CIA counterintelligence James Olson called polygraph exams “an awful but necessary ordeal. We all hate them. … A polygraph examination … is rude, intrusive, and sometimes humiliating. … It’s a grueling process.” Whether the sheer unpleasantness of the exam did more to deter potential traitors, or kept otherwise upstanding citizens from joining the agency, is impossible to determine.

Ultimately, there is the question of whether the polygraph ever caught Soviet spies. Certainly no major communist spy was ever caught by the machine, and the most damaging one, Aldrich Ames, passed two routine polygraph exams after he had delivered deadly information about U.S. activities in the Soviet Union to his handlers.

While the Ames case almost fatally damaged the polygraph’s reputation, the technology’s use was revived in the wake of the 9/11 attacks and the subsequent wars in Afghanistan and Iraq because, once again, it gave the appearance of a scientific way to test such elusive values as loyalty in the inherently risky work of screening employees and counterintelligence. As the history of the polygraph makes clear, American policy makers place great trust in technological fixes to thorny political problems—even though they themselves question those fixes privately.

John Baesler is a professor of history at Saginaw Valley State University and the author of Clearer Than Truth: The Polygraph and the American Cold War.

Why Kendrick Lamar’s Pulitzer Win Is History-Making

Smithsonian Magazine

The Pulitzer board handed out its first prize for music in 1943, and for the next seven decades, the award was given exclusively to artists working in the genres of classical music and, in more recent years, jazz. But yesterday, Kendrick Lamar bucked that trend. The Compton-born rapper known for his searing rhymes was awarded a Pulitzer for his 2017 album DAMN., making him the first hip-hop artist to ever win the coveted prize.

As Joe Coscarelli reports for the New York Times, the Pulitzer board described DAMN. as “a virtuosic song collection unified by its vernacular authenticity and rhythmic dynamism that offers affecting vignettes capturing the complexity of modern African-American life.”

After the announcements were made, Dana Canedy, the administrator of the prizes, told Coscarelli that the “time was right” for Lamar’s historic win.

Since the 2012 release of good kid, m.A.A.d city, Lamar’s first major label album, he has been widely hailed as one of this generation's most important and talented artists. With scorching candor, his deeply reflective lyrics range from the intimate to the political, broaching topics like police brutality, gun violence, and the pressing burdens that come with newfound fame.

But in spite of his critical and commercial successes (as Randall Roberts of the Los Angeles Times points out, Lamar is not only the first rapper to win a Pulitzer, but also the first winner to boast a platinum or No. 1 album), Lamar has never picked up one of the industry’s most significant awards: the Grammy for Album of the Year. He has been nominated three times—for good kid, m.A.A.d city, 2015’s To Pimp a Butterfly, and DAMN.

“Lamar has been at the top of his game for years and his own industry has refused to award him,” Ira Madison of The Daily Beast writes, “so it was certainly a shock that the Pulitzer board would.”

The Pulitzer has historically recognized a narrow breadth of musical genres—and artists. According to Constance Grady of Vox, it took until the late 1990s for the award to be granted to a jazz musician, when Wynton Marsalis' three-hour oratorio on slavery and escape, “Blood on the Fields,” was honored in 1997. Some three decades earlier, Duke Ellington had been denied that honor: the Pulitzer jury recommended granting him the prize in 1965, but the board opted against honoring anyone that year, a decision widely viewed as a refusal to critically acknowledge a genre of music born out of the African American experience. “I’m hardly surprised that my kind of music is still without, let us say, official honor at home,” Ellington said in a September 1965 interview with New York Times Magazine titled “This Cat Needs No Pulitzer Prize.”

Dwandalyn Reece, curator of music and performing arts at the Smithsonian’s National Museum of African American History and Culture, tells Smithsonian.com that Lamar’s history-making Pulitzer win is an important “recognition of the cultural, musical influence of hip-hop and rap” and “a recognition of African American music traditions.”

Reece also points out that the Pulitzer’s long-standing neglect of popular music genres reflects an entrenched value system that has elevated Western classical music and, now to some extent, jazz as paragons of refinement and prestige.

But that value system, it seems, is changing.

"People don't make music to have awards, but they want their music to be recognized and valued," Reece says. "Awards [can be] an acknowledgement of the artistry and influence and presence of African American artists and the musical traditions they celebrate, in a world that has really silenced them historically in so many areas."

Lamar's work, which is steeped in the African American experience, “captures the essence of what this genre of music is all about: the reflection of real life and authenticity," Reece says.

She pauses for a moment, and then adds, “He's really stellar.”

Why Jon Batiste Is the Perfect Choice to Be the “Late Night” Bandleader

Smithsonian Magazine

It’s a rare talent that can get a crowd of adults on their feet, singing along to “If You’re Happy And You Know It,” just moments after impressing that same crowd with an original jazz composition.

But that's just what Jonathan Batiste, who will soon debut as bandleader on the highly anticipated “Late Show With Stephen Colbert,” accomplished this summer at the Newport Jazz Festival. Lyrical passages, flowing from the piano, gave way to a boisterous New Orleans party, which then transitioned into Batiste grabbing a melodica and leading his band, Pied Piper style, into the crowd to perform that ridiculous, but joyful, children’s song. When the musicians segued into “On the Sunny Side of the Street,” the crowd erupted spontaneously.

This radiant charisma and uncanny ability to collapse the distance between a jazz band and a skeptical, uninitiated audience make the 30-year-old artist the ideal figure to bring new life to late-night television.

“I’m from New Orleans, which is all about direct engagement out in the street with all the parades and Mardi Gras Indians and jazz funerals,” Batiste said in an interview conducted at Newport. “I’m trying to take that and put it into my generation, a group that doesn’t have enough joy and celebration in their lives. I like the energy the crowd gives you and I want to feel it by being at the center of it. Sometimes even being on stage is too far away.”

Batiste, drummer Joe Saylor and alto saxophonist Eddie Barbash—soon to be the core of the band on Colbert's new show—met when they were all students at the Juilliard School. To counter the ivory-tower syndrome of academia, the band began taking their instruments onto subway cars in 2010-2011. At first the other riders avoided eye contact for fear of being asked for money, but when the musicians kept playing without passing the hat, the listeners relaxed and then got swept up as familiar tunes were turned inside out into ebullient reinventions. Batiste realized that jazz could connect with non-jazz audiences if it met them halfway.

“It’s all about making the moment have an energy that people want to share,” he explained. “In a live performance, it’s a collaboration with the audience; you ride the ebb and flow of the crowd's energy. On television, you don’t have that. So the question is, ‘How do I make a moment that if I were at home watching it on TV I would want to be there too?’ You have to send that energy out there through the cameras and have faith that it's engaging the audience.”

How, in other words, do you turn millions of widely dispersed TV viewers into the delirious dancers at the Newport Jazz Festival or the startled riders on a New York subway car? Not by memorizing a song or a routine but by trusting in one’s instincts as an improviser. Only if you’re creating something new in the moment, he argued, can you maintain an energy level high enough to command an audience's wandering attention. Batiste got a taste of this in the supporting role of the pianist in the fictional trumpeter Delmond Lambreaux’s band on the HBO series “Treme.” But the true epiphany came during his first appearance on “The Colbert Report” in 2014.

“If you check out that first interview,” said Batiste, “you can see the energy flowing between us. Halfway through the interview, he threw away the cue cards and came up close to my face and there was really a back and forth. It was one of the most fun interviews I’ve ever done.”

“Stephen did his interviews in character, where he basically pretended he’s a total idiot. [Colbert will abandon that persona on his new show.] A lot of people didn’t know how to respond to that; maybe they didn’t know he’s in character or maybe they didn’t know how to respond to a character. But I could tell he was asking me these really deep questions but framing them as if he were an idiot, so I responded to the deepness rather than idiocy. Once he threw the cue cards away, we were improvising.”

And improvisation, Batiste insisted, is essentially the same whether it's happening in music, comedy, dance or daily life. Whether you’re a jazz pianist, a stand-up comic or a parent trying to shepherd three kids to a store, you have a general goal in mind but you’re making up the details as you go—the only difference is the materials employed: notes, words or parental instinct. Batiste believes that if you really are creating something new in front of people, they will respond whether they are jazz fans or not, young kids or jaded adults.

“We performed on the subway to reach people who might not otherwise have access to this music,” Batiste added. “The subway in New York is a great social experiment; there are so many races and ways of life sitting together on each car. I guess that’s similar to TV, where you have millions of people of all races and cultures, and they may not have access to jazz either, because it’s certainly not on TV now. And what I learned from the subway is that if you want to reach across whatever separates us as people, you have to be totally in the moment.”

Why John Dillinger’s Relatives Want to Exhume His Body

Smithsonian Magazine

After the notorious bank robber John Dillinger was shot to death by federal agents in 1934, thousands of spectators converged at his funeral, some of them swiping flowers and dirt from the grave as souvenirs. Worried that the situation might escalate to grave robbing, Dillinger’s family went to great lengths to ensure that his body remained firmly in the ground, encasing his remains under layers of concrete and iron.

So it came as a surprise when reports surfaced earlier this week that the Indiana State Department of Health had issued a permit to Dillinger’s living relatives, allowing them to exhume the criminal’s body. Though the reasons for the planned exhumation were not immediately clear, Vanessa Romo of NPR now reports that Dillinger’s niece and nephew have indicated that they suspect the body interred under Dillinger’s headstone may not belong to their outlaw uncle.

Separate affidavits signed by Mike Thompson and his sister, Carol Thompson, cite multiple pieces of “evidence” fueling their suspicions that it was not Dillinger who was gunned down outside Chicago’s Biograph Theater on July 22, 1934. The eye color, ear shape and fingerprints of the man who was killed that day do not match Dillinger’s, according to the documents. The affidavits also claim that the deceased had a heart condition—though the siblings do not “elaborate on why the heart condition supports their theory that the man wasn't Dillinger,” the Associated Press notes.

The newly issued permit allows the body to be disinterred from Indiana’s Crown Hill Cemetery and restored to its grave by September 16. The affidavits state that Dillinger's relatives are seeking to have the remains re-examined with forensic analysis and, possibly, DNA testing, according to the AP, which also reports that the exhumation will be chronicled for an upcoming History Channel documentary.

Dillinger and his gang of criminals shocked and dazzled the nation with their bold heists and dramatic prison escapes. They robbed multiple banks across the Midwest, raided police arsenals and killed 10 men. But during the fallow years of the Great Depression, when Americans were feeling vanquished by widespread poverty, Dillinger was seen as something of a rebel hero who took what he wanted from the banks.

This is hardly the first time that questions have been raised about his fate.

The outlaw was killed after seeing the Clark Gable film Manhattan Melodrama with several companions—one of whom, a brothel madam who went by the name Anna Sage, was colluding with the FBI. When Dillinger realized that the authorities were closing in on him, he grabbed a pistol from his trouser pocket and ran toward an alley. As he tried to escape, he was shot three times and killed.

A common theory posits that federal agents accidentally shot a Dillinger look-a-like named Jimmy Lawrence, whose name Dillinger had in fact been using as he gallivanted around Chicago. In their affidavits, Mike Thompson and Carol Thompson say it is “critical” to find out if Dillinger did in fact live beyond the date of his reported death—and, if the rumors should prove true, to find out “where he lived, whether he had children, and whether any such children or grandchildren are living today.”

But the FBI dismisses this idea as a “conspiracy theory” based purely on “circumstantial evidence,” noting that the dead man’s fingerprints were taken immediately after the shooting and during an autopsy—and were a positive match for Dillinger’s both times. Bill Helmer, co-author of Dillinger: The Untold Story, tells Dawn Mitchell and Holly V. Hays of the Indianapolis Star that he, too, believes the look-a-like theory is “total nonsense.” Not all of Dillinger’s surviving relatives support the move to exhume his body, either.

"I don't believe in desecrating the dead,” Jeff Scalf, Dillinger’s great nephew, says in an interview with Alyssa Raymond of NBC affiliate WTHR. “I think it's been 85 years. It doesn't matter."

Why Humans Are the Only Primates Capable of Talking

Smithsonian Magazine

Compared to humans, most primates produce a limited range of vocalizations: At one end of the spectrum, there’s the Calabar angwantibo, an arboreal West African critter capable of offering up just two distinct calls. At the other end, there’s the bonobo, a skilled chatterbox known to voice at least 38 different calls.

A new study published in Frontiers in Neuroscience suggests these variations can’t be attributed simply to inadequate vocal anatomy. Like their hominid cousins, non-human primates possess a functional larynx and vocal tract. The crux of the matter, according to lead author Jacob Dunn, a zoologist at Anglia Ruskin University in Cambridge, is brainpower.

“The primate vocal tract is ‘speech ready,’ but ... most species don’t have the neural control to make the complex sounds that comprise human speech,” Dunn writes for The Conversation.

Dunn and co-author Jeroen Smaers of New York’s Stony Brook University ranked 34 primate species according to vocal ability, as represented by the number of distinct calls the animals produce. The pair then analyzed these rankings in relation to existing studies of the respective species’ brains.

Primates with varied vocalization patterns tended to have larger cortical association areas (neural regions responsible for responding to sensory input) as well as larger brainstem nuclei involved in controlling the tongue muscles, Victoria Gill reports for BBC News.

These findings, according to a press release, reveal a positive correlation between relative size of cortical association areas and primates’ range of distinct vocalizations. In layman’s terms, speech ability comes down to neural networks, not vocal anatomy. Primates whose sound-producing brain regions are larger can produce a wider variety of calls than those with relatively smaller brain regions.
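As a rough sketch of the kind of analysis described above, one could test whether species' rank by call repertoire tracks their rank by relative size of the cortical association areas. The example below uses invented placeholder values (none of these figures come from Dunn and Smaers' paper) and SciPy's standard Spearman rank-correlation test:

```python
# A minimal sketch of a rank-correlation analysis like the one described
# above. The species values are invented placeholders, not data from the
# Dunn and Smaers study.
from scipy.stats import spearmanr

# (species, number of distinct calls, relative cortical association area size)
species_data = [
    ("Calabar angwantibo",  2, 0.8),   # real species, made-up brain value
    ("Hypothetical sp. B",  8, 1.1),
    ("Hypothetical sp. C", 15, 1.4),
    ("Hypothetical sp. D", 22, 1.6),
    ("Bonobo",             38, 2.0),   # real species, made-up brain value
]

calls = [row[1] for row in species_data]
cortex = [row[2] for row in species_data]

# Spearman's rho compares the two rankings rather than the raw values.
rho, p_value = spearmanr(calls, cortex)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```

A positive rho would indicate that species with larger association areas tend to have larger call repertoires, in line with the correlation the study reports; the published analysis itself may of course use different statistical methods.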

Dunn and Smaers’ research offers insights on the evolution of speech, Gill notes. Instead of attributing speech skills to humans’ allegedly superior intelligence, the study suggests that speech evolved in conjunction with the rewiring of human brains.

As mankind placed increasing importance on vocal communication, neural regions evolved to fit these needs. Apes, on the other hand, adapted to fit different priorities, retaining an anatomical capacity for vocalization but failing to develop the accompanying neural characteristics needed for speech.

In an interview with Gill, Durham University zoologist Zanna Clay, who was not involved in the study, described the new findings as “interesting,” but added that scientists still lack a basic understanding of how primates use and interpret vocalizations.

Clay, co-author of a 2015 study on bonobo communication, previously told BBC News’ Jonathan Webb that bonobos release identical squeaking sounds, or “peeps,” during disparate situations such as feeding and traveling.

“On their own, [the peeps] don't tie so strongly to one meaning," Clay said.

Within a certain context, however, peeps relay different meanings—perhaps related to the situation at hand or placement in a sequence of vocalizations. This suggests that bonobos are capable of understanding “structural flexibility,” or the use of a single vocal signal in multiple contexts. This phenomenon was previously believed to be a uniquely human ability, Webb writes.

“We do not even really understand how the primates themselves classify their own vocal repertoires,” Clay tells Gill. “This needs to come first before correlations are made. We know that many primates and other animals can escape the constraints of a relatively fixed vocal system by combining calls together in different ways to create different meanings. The extent to which call combinations might map on to [brain anatomy] would be a promising avenue to explore."
