
Why Holograms Will Probably Never Be as Cool as They Were in "Star Wars"

Smithsonian Magazine

Stereoscopes entertained every Victorian home with their ability to produce three-dimensional pictures. Typewriters and later fax machines were once essential for business practices. Photo printers and video rentals came and went from high streets.

When innovative technologies like these come to the end of their lives, we have various ways of remembering them. It might be through rediscovery – hipster subculture popularizing retro technologies like valve radios or vinyl, for example. Or it might be by fitting the technology into a narrative of progress, such as the way we laugh at the brick-sized mobile phones of 30 years ago next to the sleek smartphones of today.

These stories sometimes simplify reality but they have their uses: they let companies align themselves with continual improvement and justify planned obsolescence. Even museums of science and technology tend to chronicle advances rather than document dead-ends or unachieved hopes.

But some technologies are more problematic: their expectations have failed to materialize, or have retreated into an indefinite future. Sir Clive Sinclair’s C5 electric trike was a good example. Invisible in traffic, exposed to weather and excluded from pedestrian and cycle spaces, it satisfied no one. It has not been revived as retro-tech, and fits uncomfortably into a story of transport improvement. We risk forgetting it altogether.

When we are talking about a single product like the C5, that is one thing. But in some cases we are talking about a whole genre of innovation. Take the hologram, for instance.

The hologram was conceived by Hungarian engineer Dennis Gabor some 70 years ago. It was breathlessly reported in the media from the early 1960s, winning Gabor the Nobel Prize in Physics in 1971, and hologram exhibitions attracted audiences of tens of thousands during the 1980s. Today, tens of millions of people have heard of them, but mostly through science fiction, computer gaming or social media. None of those representations bear much resemblance to the real thing.

When I first began researching the history of the field, my raw materials were mostly typical fodder for historians: unpublished documents and interviews. I had to hunt for them in neglected boxes in the homes, garages and memories of retired engineers, artists and entrepreneurs. The companies, universities and research labs that had once kept the relevant records and equipment had often lost track of them. The reasons were not difficult to trace.

The future that never came

Holography had been conceived by Gabor as an improvement for electron microscopes, but after a decade its British developers publicly dubbed it an impractical white elephant. At the same time, American and Soviet researchers were quietly developing a Cold War application: holographic image processing, a way of bypassing inadequate electronic computers, showed real potential, but it could not be publicly acknowledged.

Instead, the engineering industry publicized the technology as “lensless 3D photography” in the 1960s, predicting that traditional photography would be replaced and that holographic television and home movies were imminent. Companies and government-sponsored labs pitched in, eager to explore the rich potential of the field, generating 1,000 PhDs, 7,000 patents and 20,000 papers. But by the end of the decade, none of these applications were any closer to materializing.

From the 1970s, artists and artisans began taking up holograms as an art form and home attraction, leading to a wave of public exhibitions and a cottage industry. Entrepreneurs flocked to the field, attracted by expectations of guaranteed progress and profits. Physicist Stephen Benton of Polaroid Corporation and later MIT expressed his faith: “A satisfying and effective three-dimensional image”, he said, “is not a technological speculation, it is a historical inevitability”.

Not much had emerged a decade later, though unexpected new potential niches sprang up. Holograms were touted for magazine illustrations and billboards, for instance. And finally there was a commercial success – holographic security patches on credit cards and bank notes.

Ultimately, however, this is a story of failed endeavor. Holography has not replaced photography. Holograms do not dominate advertising or home entertainment. There is no way of generating a holographic image that behaves like the image of Princess Leia projected by R2-D2 in Star Wars, or Star Trek’s holographic doctor. So pervasive are cultural expectations even now that it is almost obligatory to follow such statements with “… yet”.

Preserving disappointment

Holography is a field of innovation where art, science, popular culture, consumerism and cultural confidence intermingled, and it was shaped as much by its audiences as by its creators. Yet it doesn’t fit the kind of stories of progress that we tend to tell. You could say the same about 3D cinema and television or the health benefits of radioactivity, for example.

When a technology does not deliver on its potential, museums are less interested in holding exhibitions, and universities and other institutions less interested in devoting space to collections. When the people who keep such materials in their garages die, the collections are likely to end up in landfill. As the Malian writer Amadou Hampâté Bâ observed: “When an old person dies, a library burns”. Yet it is important that we remember these endeavors.

Technologies like holograms were created and consumed by an exceptional range of social groups, from classified scientists to countercultural explorers. Most lived that technological faith, and many gained insights from sharing frustrating or secret experiences of innovation.

It is left to us historians to hold together these stories of unsuccessful fields, and arguably that is not sufficient. By remembering our endeavors with holograms or 3D cinema or radioactive therapy, we may help future generations understand how technologies make society tick. For that vital reason, preserving them needs to be more of a priority.

Why Freshwater Dolphins Are Some of the World’s Most Endangered Mammals

Smithsonian Magazine

Flipper, dolphin tattoos and the performing dolphins of SeaWorld all share one thing in common: the ocean. But although sea-loving dolphins dominate the popular imagination, lesser-known sentient cetaceans do, in fact, exist outside of salty waters.

They are the river dolphins, comprising several species that are specially adapted to inhabit freshwater bodies around the world. Habitats include the Indus, Ganges, Brahmaputra, Mekong and Irrawaddy rivers in Asia, as well as South America’s Amazon river system.

River dolphins never rose to Flipper-esque fame, probably in part due to their rarity. While bottlenose dolphins are regularly sighted off the majority of the world’s coasts and are a staple of aquariums and zoos, all of the world’s freshwater dolphins are currently listed as either critically endangered or endangered. One, the Yangtze River dolphin from China, is almost certainly already extinct, as it hasn’t been spotted for about a decade.

All told, freshwater dolphins are one of the world’s most endangered groups of mammals. One of the problems standing in the way of saving the freshwater dolphins, however, is an overall lack of knowledge about them.

When the Yangtze River dolphin disappeared, it happened so quickly that researchers didn’t even have time to figure out what exactly caused its decline and eventual extinction. They suspected a combination of factors—including giant hydropower dams, ship traffic, pollution and accidental capture in fishing nets—played roles, but without scientific study before the species was lost they couldn’t be sure which of these things, if any, was most detrimental.


In an effort to prevent history from repeating itself, researchers from Scotland, Pakistan and Tanzania teamed up to study one of the surviving species of river dolphins: the Indus River dolphin. That species calls the Indus River—which mainly flows through Pakistan—home. As of 1990, the Indus River dolphin’s range had shrunk by 80 percent, and the authors wanted to know why.

They undertook a number of activities to figure this out. They compiled historical dolphin sightings along the river, conducted interviews with older fishermen living in the dolphin’s former ranges, surveyed previous studies published in the scientific literature and assembled data about major construction events along the river.

They found that humans, not surprisingly, were the extreme home-wreckers behind the dolphins’ decline. From 1886 to 1971, a series of 17 gated, largely impassable dams were built along the river, essentially splitting the dolphin’s habitat into 17 disjointed sections. Some of those sections are regularly drained for agriculture, leaving them almost completely dry for months on end. In most fragments, the dolphins disappeared within 50 years following dam construction. Today, they can be found in just six of those sections.

The length of the river fragment that the dolphins lived in proved to be one of the most important factors for predicting whether they would still be around 50 to 100 years after those barriers were built. Likewise, the more water that flowed through those habitats, the better the chances the dolphins could make ends meet.

This finding “underlines the great importance of maintaining large sections of intact river habitat to sustain tropical aquatic biodiversity,” the researchers write in PLoS One.

Unexpectedly, some human activities that seem like obvious extinction culprits in fact played little if any role. Although more than 90 percent of industrial and municipal effluent that Pakistan dumps into its rivers is untreated, the authors point out that by the 1980s—the time Pakistan ramped up its industry and agriculture to the point that pollution was a major problem—the dolphins had already been missing from those river sections for years.

Likewise, until 2010, most fishing in the river took place in side channels rarely used by dolphins, meaning collisions with boats and entanglement in nets probably didn’t play a major role in the dolphins’ decline. 

No plans are in motion to restore the Indus River to a healthy state, and the authors point out that this ecosystem serves as a warning to other nations that are considering damming Himalayan, Southeast Asian and other global rivers. “Hundreds of new dams and water developments are planned or are under construction in many of the world's rivers, and large losses of aquatic biodiversity can be expected," Gill Braulik, lead author of the study, said in a release.

As for the Indus River dolphins, their long-term survival is questionable. The authors’ model predicted that 100 years after being isolated by the dams, dolphin populations only have a 37 percent probability of survival. In other words, so long as the dams remain, the dolphins will probably never be completely free from the threat of extinction.
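To make the shape of that prediction concrete, here is a minimal sketch of a logistic persistence model in the spirit of the one described above, in which survival odds rise with fragment length and water flow. The function and coefficients are hypothetical placeholders for illustration; they are not taken from Braulik and colleagues' PLoS One study.

```python
import math

def persistence_probability(fragment_km: float, flow_fraction: float) -> float:
    """Illustrative probability that a dolphin subpopulation persists
    roughly 100 years after damming, given the length of its river
    fragment (km) and the fraction of historical water flow retained.

    NOTE: the coefficients below are invented for this sketch,
    not values from the study.
    """
    logit = -3.0 + 0.015 * fragment_km + 2.5 * flow_fraction
    return 1.0 / (1.0 + math.exp(-logit))

# Longer, wetter fragments fare better than short, heavily drained ones:
print(round(persistence_probability(60, 0.2), 2))   # short, drained: low odds
print(round(persistence_probability(250, 0.8), 2))  # long, flowing: much higher
```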

At the same time, people and the larger environment will continue to suffer, too. As the authors write, “The amount of habitat fragmentation and level of water withdrawals from rivers in Pakistan is extreme, negatively affecting human communities, eroding the delta, destroying fisheries and concentrating pollutants.” So the decline of Indus dolphins may also be a harbinger of worse things to come.  


Why Every Food Lover Should Visit the Twin Cities

Smithsonian Magazine

Let's talk about the sweet potatoes at Young Joni. How they're blackened like campfire marshmallows, the insides all gooey and sweet. How they're spiked with gochugaru and topped with barely-there ruffles of bonito flakes. And, underneath it all, clinging to the plate, an enlightened schmear of crème fraîche and smoky charred scallions.

And, sure, let's talk about how the mushrooms are freakishly juicy — water-balloon juicy — because they're confited in olive oil before they hit the grill. Or how my favorite of Minnesota's embarrassment of lakes is the miniature one made of chestnut-miso butter pooled beneath those plump mushrooms.

We could talk this way about a lot of what's coming off the wood fire at this handsome Korean-ish pizza-and-other-stuff restaurant in the artsy, low-slung Minneapolis neighborhood of Northeast. But I'm inclined not to belabor the thesaurus-taxing explications and dutiful prepositions of the professional food describer (this thing atop that one, and a dollop of something else) and just say it directly: this stuff is really good. Get here and eat it if you can. Even if that means strapping on a pair of cross-country skis and braving the whiteout of a freak spring blizzard, as it did for some undeterred Young Joni devotees just before I visited in late April.

"I want you to walk in here and feel like the restaurant is giving you a big hug," said Ann Kim, chef-proprietor of the two-year-old establishment, who also runs Pizzeria Lola and Hello Pizza, in Southwest Minneapolis. Call it Korean-Midwestern hygge. Call it the embrace of fire and spice by an often-freezing city newly hip to the multidimensional tastes of its increasingly diverse populace. Call it the embodiment of quirky, cosmopolitan Minneapolis, St. Paul's ever-so-slightly showier younger sibling. Whatever it is, it's working. The place was packed to the wood-beamed rafters. Guests ordered the amatriciana pizza, a meat-heavy pie called the Yolo, and another topped with fennel sausage, mozzarella, onion, and a dusting of fennel pollen.

Kim grew up in the suburb of Apple Valley in the late 1970s when, it's fair to say, the full spectrum of the Asian pantry had not yet permeated the markets or mindshare of America's Casserole Belt. With her parents working, her grandmother ran and fed the household.

"Every November, we'd help her make enough kimchi to last the year," Kim said. "The only vessel we had that was big enough was our plastic kiddie pool. She'd let the cabbage brine in there, and then, in summer, my sister and I would clean out the pool and swim in it again."

Another pizza served at Young Joni comes topped with arugula and Korean barbecue, which Kim served at Lola as a lark years ago. "For some people, their first experience with Korean food is on top of a pizza pie — I love that."

**********

In 1850, the Swedish novelist Fredrika Bremer toured the territory that eight years later would become a state and declared prophetically: "What a glorious new Scandinavia might not Minnesota become!"

And so, over the next century or so, it kind of did. Swedes and Danes and Norwegians joined Germans, Italians, and other settlers. The power of St. Anthony Falls was harnessed, and the flour-milling industry blossomed on the shores of the Mississippi River. Minneapolis and its next-door neighbor, St. Paul, grew large and prosperous, and everyone agreed, in their Midwestern, non-braggadocious way, that they were pretty nice places to live if you didn't mind the winter. Hubert Humphrey and Walter Mondale's presence on the national stage gave the Twin Cities a reputation as a bastion of liberalism, even as they remained mostly white.

The Stone Arch Bridge over St. Anthony Falls, in Minneapolis. (Christopher Testani)

But in more recent decades, the demographics have shifted. The Twin Cities have benefited from a transformative influx of immigrants from Mexico, Korea, and Vietnam, among other countries. Hmong refugees from Laos and Thailand began arriving in the mid-1970s. Today, there are thriving populations of Somalis, Liberians, and Ethiopians, and a dynamic South Asian community. The state's foreign-born population has more than doubled since the early 1990s.

Sitting at Young Joni's bar, I was joined by Cameron Gainer, an artist and publisher of a literary arts-and-culture quarterly called the Third Rail. Gainer came to town a decade ago from New York, when his wife, Olga Viso, took over as executive director of the Walker Art Center.

"Back then, it was difficult to find anywhere to go after 8:30," Gainer said. "I'd tell people where we'd moved and they'd say, "Oh, Milwaukee's great!"" Now, he explained, living here feels like being at the center of something rapidly expanding and evolving: a vibrant creative class; a community of engaged artists, architects, and chefs. An American city like no other.

From left: The North Loop neighborhood, in Minneapolis; menswear shop Askov Finlayson; the famed Grain Belt sign by the Mississippi River. (Christopher Testani)

Andrew Zimmern, host of Bizarre Foods and outspoken booster of his adopted hometown, added to the list of reasons to love this place: "Prince was from here. You can swim, sail, or canoe on our lakes — on your lunch hour. We have the Minnesota State Fair, the single greatest party on planet Earth. And we've gone from not having a single oyster bar in town to being a national powerhouse as a restaurant city. All in one generation."

The Twin Cities' pioneering cultural institutions have continued to reinvent themselves. The Walker, which was reclad and expanded in 2005 by Herzog & de Meuron, last year completed a lengthy overhaul of its iconic sculpture garden, adding 18 new works by artists such as Katharina Fritsch and Theaster Gates. The 55-year-old Guthrie Theater unveiled a striking new Jean Nouvel–designed home in 2006, with its Endless Bridge cantilevered out toward the Mississippi. St. Paul's Minnesota Museum of American Art is in the midst of a massive expansion. Also last year, the century-old Minneapolis Institute of Art put on the first major exhibition of contemporary Somali artwork. Artists have colonized the industrial buildings of Northeast Minneapolis, converting the brick husks into studios and galleries. This dynamic cultural scene is by design: Minnesota ranks second in the nation after Washington, D.C., for per-capita government spending on the arts. "There's a let's-make-stuff vibe that's amazing," Gainer said. "There are opportunities to collaborate, to do things that don't exist yet, like start an art journal or open a Korean pizza joint."

"Give us one fried-chicken sandwich as a garnish, please," Sameh Wadi said. We were wearing plastic bibs and slurping frozen daiquiri slushies at Grand Catch, the bright and buoyant Asian-style Cajun seafood-boil restaurant that he and his brother Saed just opened with Thien Ly, a Vietnamese chef, on St. Paul's leafy Grand Avenue.

Sameh, a Palestinian-American chef and restaurateur with a general air of mischievous merriment, was ordering lunch for the two of us. The sandwich, he emphasized, was a mere palate cleanser to be shared between the main events: copious platters of pungently spiced crawfish, corn, gangly shrimp, and a Dungeness crab the size of a large chihuahua, whose carapace we'd lift and drink from as if it were a sacred chalice filled with brothy crab-innard delights.

He met Thien Ly when a friend brought him to Cajun Deli, Ly's hole-in-the-wall seafood-boil spot in suburban Brooklyn Park. For Sameh, who'd opened and closed a Middle Eastern fine-dining restaurant and moved on to run an eclectic street food truck and restaurant called World Street Kitchen ("burritos with fried rice and curry chicken, shawarma tacos — everything's delicious and makes no sense"), the border-bending Viet-Cajun boil was a revelation.

"It burned my face, but it's so addicting," he said. Returning obsessively for years, he got to know Ly. Eventually, he and the Wadi brothers talked shop and decided to open one.

The bar at Young Joni, a Korean-influenced restaurant in Northeast Minneapolis. (Christopher Testani)

And here we were, bibbed and broth-splattered, drinking pink slushies in coupe glasses in this bright spot on a flush avenue, and there was a neon sign on the wall that read WHAT'S CRACKIN? and crab dip with fermented crab paste and Middle Eastern spices and an ice-cream machine nicknamed Betty Lou that dispensed raspberry-lychee soft serve to help cool the burn. I kept forgetting what state or country I was in — and hoping I didn't have to leave.

I wondered, were the Twin Cities ready for this 10 years ago? "Absolutely not," Sameh said. "Ten years ago, people weren't ready for my white-tablecloth Middle Eastern restaurant with foie gras on the menu. Now people are just game. Now you can go to a Vietnamese restaurant, and they're doing Minnesota walleye in clay pots. It's a gorgeous thing."

**********

"Last week people were so angry!" the chef Gavin Kaysen said with a laugh. Happily, I'd missed the late-season blizzard. The Great Thaw had come to the Cities and nobody seemed angry about anything.

Kaysen's restaurant, Spoon & Stable, is in Minneapolis's North Loop, a fast-changing riverfront neighborhood of broad avenues, where old stables and warehouses are now populated by start-ups and coffee bars. A Minnesota native, Kaysen left for a decade or so to work in Napa Valley and New York City, where he ran kitchens for Daniel Boulud and won a James Beard Award. When he came home in 2014, he had a sense that the city's restaurant scene was ready for its close-up. There's been a line out the door for his impeccable modern American food with regional ingredients (bison tartare with watermelon radishes; birch-smoked cobia; pea-leaf fusilli with lamb and morels) ever since.

From left: Grand Catch, a St. Paul Viet-Cajun seafood spot; Balinese chicken thigh at Hai Hai, in Minneapolis; a server at Parallel espresso bar. (Christopher Testani)

I met Kaysen and his pastry chef, Diane Yang, a first-generation Hmong-American, at Hmong Village, where we ate chicken wings stuffed with vermicelli noodles and ogled bitter melon vines. I’d arrived at the market with Carolina barbecue sauce on my shirt, slightly sauced myself on Old Fashioneds made with Dr. Pepper syrup and a proprietary bourbon peculiar to the restaurant Revival, in another part of St. Paul. There, I’d received useful instruction from Thomas Boemer in both the proper coloring of North Carolina-style fried chicken (“golden retriever slash labradoodle”) and the subtle differences between Minneapolis and St. Paul. Thomas grew up in the South, but his family is old St. Paul blood. It’s here he and his business partner run a group of Revivals and are opening a gigantic Basque-inspired live-fire restaurant, food market, and event space in the soon-to-be-revitalized Keg & Case warehouse next to the historic Schmidt Brewery in the Bluffs. “You’re not going to see a cat café here,” Boemer said, a subtle dig at flashier, more cosmopolitan Minneapolis, which has, in point of fact, just opened its first cat café. “I was going to go, but my wife shamed me out of it.”

I mention the barbecue sauce at Hmong Village not just to emphasize that it had been a busy period of eating. (As the hometown hero Prince sang under different circumstances, "Touch if you will my stomach/Feel how it trembles inside.") Taken together, the Twin Cities today are less a New Scandinavia and more a varied, singularly American cultural smorgasbord.

Another thing that's changed is the embrace of winter. Eric Dayton and his brother, Andrew, sons of Minnesota governor Mark Dayton and vocal supporters of modern Minnesota, own the men's boutique and lifestyle brand Askov Finlayson, which has the motto "Keep the North Cold." The Daytons are among those working to rebrand the state as the "North" and reposition its famously cold winters as a point of pride.

Eric recalled a trip to Copenhagen at a time when the global spotlight was on all things Nordic. "I thought we had a lot of the same strengths in our city and our state, yet we were getting written off as flyover country," he says. "We had allowed the rest of the country to tell our narrative for us." The effort started with a line of beanies emblazoned with NORTH. Now Eric is among the leaders of the midwinter Great Northern festival, a 10-day food-and-activity-filled celebration that unites three of the Twin Cities' most popular cold-weather events: St. Paul's winter carnival, a cross-country ski festival, and the U.S. Pond Hockey Championships. (Tagline: "Hockey. The Way Nature Intended.")

What are we getting wrong about this place, I — East Coast outsider, air-dropped in to tell this place's story because we'd heard there was good food and endless cultural diversions — asked, a little sheepishly.

"When I went off for college, people I'd meet would tell me they'd seen Fargo," Eric said. "I don't think we get credit for what a vibrant city this is, the strength of the creative community, the dining scene, and world-class museums. These things get overlooked when it gets lumped in with this catchall idea of the region."

From left: A view along the West River Parkway, in Minneapolis; a croque madame at Parallel, an espresso bar in Minneapolis. (Christopher Testani)

For a sense of the changing face and can-do spirit of the North, head over to artisan glassblowing factory Hennepin Made and Parallel, the sleek espresso bar inside. Jackson Schwartz, a friend of Kaysen's, trained in glassblowing in Australia but came back to make his mark in Minnesota.

"I don't want to compete at a level of what Minneapolis has to offer," Schwartz told me. "I want to compete on an international level. If you walked into this café in Amsterdam or Seattle or wherever, you'd think, Okay, this fits here. This is the place to be. That's the level I want to be at."

Another glimpse of the new can be found at the Hewing Hotel in the North Loop, a recent arrival that has the familiar hallmarks of a hiply converted industrial building (the exposed brick walls, the naked light bulbs), along with bear-patterned wallpaper and framed axes. There's a fireplace in the lobby and a rooftop spa pool that converts to a hot tub in the winter. It's a stylized Paul-Bunyan-goes-to-Brooklyn kind of atmosphere that might feel hokey were the Hewing not housed in a former farm-machinery warehouse, in a city still in touch with its outdoorsy, hunting-fishing-axe-wielding side.

I'd come to the Twin Cities to wander their side streets and waterfronts and to feast on the fat of their land. At Grand Café in South Minneapolis, I feasted, tiny fork in hand, on the fat itself. Described on the menu, simply and weirdly, as "Beef fat slowly roasted in bay leaf," the dish is a lip of fat from a rib eye, gently poached with rosemary and thyme and bay leaf, then rolled and cut and served warmish. Jamie Malone (chef, owner, soft-spoken enabler) had upgraded the situation with caviar that crowned nickel-size disks of opaline fat. On paper, it sounds like comical overkill. In actuality, it's just really nice, understated (if caviar-topped fat can be understated), and suave. Which pretty much sums up this generous, comfortable but not grandly proportioned dining room and everything Malone's doing in it.

Next, because I am an adult and can eat whatever I want even if it kills me, I ordered the Paris-Brest pastry filled with chicken-liver mousse, a recent cover star for this publication's sister magazine, Food & Wine. The choux was crisp, burnished with a glaze made of black honey and luster dust (which sounds like something you'd encounter in the loo of a louche 70s Parisian nightclub, but is actually a product bakers use to make their cupcakes sparkle). Was it good? It's an uppity, sweet, salty, fatty, crunchy, creamy, savory doughnut that's luster-dusted Instagram gold. Bien sûr, it was very, very good.

From left: Lobby décor at the Hewing Hotel, in Minneapolis's North Loop; sturgeon custard in an eggshell at Grand Café, in South Minneapolis. (Christopher Testani)

The Grand Café is descended from a bakery that opened on these premises in 1951. Fifteen years ago it morphed into a café with a neighborhood following and minimal culinary aspirations. When Malone took over last year, she was committed to not sprucing the place up any more than she needed to. The walls are dusky pink, the wood tables uncovered, the tin ceiling hasn't been tended to in a while. The effect of the whole is quietly chic, a captivating, relaxing space that doesn't try too hard to be any one of those things.

"I want people to feel transported. I want it to feel whimsical," Malone said. "And — this is going to sound really stupid — I want you to feel genuinely cared about, because there's a lot of love and respect in this room. Oh, and I want it to feel like a Wes Anderson movie."

"We spritz our pepperoni with red wine," said the server at Pig Ate My Pizza. His T-shirt said SURLY BREWING. His bearing said: Not surly at all. He was earnest and enthusiastic about the spritzing and maybe a little distracted by the cloud of flavored smoke rising off the Morning Maple pizza as he lifted up a cloche with a flourish. This is, by a rather wide margin, the second-looniest place run by Travail Collective, a merry band of chefs and DIY showmen whose flagship enterprise, Travail, serves ticketed, "20+ course" tasting-menu dinners twice a night, Wednesdays through Saturdays.

"It's about disconnecting people from their reality and bringing them together in our reality," said chef and cofounder Mike Brown, of a communal dining style that might include eating off meat hooks dangled above your head, or a vegetable dish choreographed to musical accompaniment by a cello player (Brown's neighbor). One memorable engagement involved, as Brown put it, "a liquid-nitrogen bomb exploding and a person in a rabbit suit running around."

“Oh, I remember that,” said Dara Moskowitz Grumdahl, affectionately. Dara’s the restaurant critic for Mpls. St. Paul magazine and host of “Off the Menu” on Minneapolis CBS radio. After two pizzas and a gigantic platter of house-made charcuterie at Pig, neither of us had the energy for twenty-plus more courses, so we were snacking on a Reuben sandwich at Travail’s bar. “I’m talking to a puppeteer and a robotics guy,” Brown went on. “Sometimes an idea like Chuck E. Cheese will just come into our mind and we’ll construct a dish around that.”

I’m not sure animatronic Chuck E. Cheese servers are the future of fine dining, in Minneapolis or anywhere. But I do like talking to Mike. I like his antic schemes and I like the general genuineness with which they seem to be received. The room is full of happy people.

Brown has a theory about why Minnesotans are so earnest and easygoing. Coming back to Minneapolis after a long absence, he recalled, "I stepped off the plane and breathed in this tasteless, smell-less winter air and just thought, Oh, thank god, the great equalizer is here! You kind of have to respect each other for surviving winter here. You have to put up with each other and help them shovel their car out of the snow."

Ahmed, an Uber driver from Mogadishu who picked me up on my way home, agreed. "Winter is hard," he said, "but it keeps the bad people away. That's what they say."

I hadn't heard that said, but it made sense to me. In those last few days of wandering and eating, I hadn't met a single one.


Why Engineering Will Be Vital in a Changing Climate

Smithsonian Magazine

Conversations about climate change usually focus on ways to reduce the human footprint, from cutting carbon emissions to developing cleaner technologies. But in many cases we are already feeling the effects, and we will likely continue to experience climate ripples even as we work to stem the tide.

Finding ways to adapt to climate change is therefore just as crucial as mitigation, says Smithsonian Secretary G. Wayne Clough. A civil engineer and former president of the Georgia Institute of Technology, Clough has been part of teams tasked with designing solutions for protecting human lives and infrastructure from intensifying natural conditions, such as rising sea levels and stronger hurricanes.

In this special presentation at the Smithsonian Castle, Clough outlines the ways Institution scientists are adding to our knowledge about the effects of climate change as seen from land, sea and space, and he gives his personal insights on the engineering opportunities and challenges we face as society works to adapt to unavoidable change.

In an exclusive video interview, Secretary Clough also gave his perspective on the Smithsonian's first official statement on climate change—hear him explain the valuable role that the Institution can play in research and education around this important issue. 

Why Earthquakes Make Napa Wine Taste So Good

Smithsonian Magazine

Early Sunday morning, a magnitude-6.0 earthquake rumbled through Northern California. It was the largest quake to hit the Bay Area since the 1989 Loma Prieta earthquake, a magnitude-6.9 temblor that collapsed a section of the Bay Bridge. With an epicenter just nine miles south of the town of Napa, the quake left dozens injured and damaged historic buildings throughout the Napa Valley.
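For a sense of scale (a standard seismological rule of thumb, not a figure from the article): radiated seismic energy grows roughly thirty-fold per whole magnitude unit, so the 1989 magnitude-6.9 event released on the order of 20 times the energy of Sunday's magnitude-6.0 quake:

$$E \propto 10^{1.5M} \quad\Longrightarrow\quad \frac{E_{6.9}}{E_{6.0}} = 10^{1.5\,(6.9 - 6.0)} = 10^{1.35} \approx 22$$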

All told, the region is thought to have sustained upwards of $1 billion in damages, and one sector has seen some especially tragic losses: Napa's wine industry, which had just begun harvesting its 2014 crop. In an interview with the Associated Press, Tom Montgomery of B.R. Cohn Winery in Glen Ellen, California, estimated that as much as 50 percent of the winery's product was destroyed in the quake. "It's not just good wine we lost," Montgomery told the AP. "It's our best wine."

In an average year, Napa's wine industry generates $50 billion. The nonprofit group Napa Valley Vintners says that it's too early to estimate the amount of damage the earthquake caused, though its website states that the quake "is not expected to have a significant impact on Napa Valley wine inventory in general." And geologically speaking, earthquakes are a major reason Napa has become synonymous with wine.

"This is the kind of earthquake that created the Napa Valley, or at least the final morphology of the valley now," says Ken Verosub, professor of earth and planetary sciences at the University of California, Davis. "There's nothing here that's a big surprise."

The Napa Valley sits at the northern end of the San Francisco Bay, between the Vaca Mountains to the east and the Mayacamas Mountains to the west. The entire area rests on what is known as a transform fault zone: an area where two of Earth's tectonic plates slide past each other. In the case of the Bay Area, the sliding of the Pacific plate past the North American plate drives activity along the famous San Andreas fault zone. This major plate-boundary fault stretches visibly for some 600 miles through California.

But 40 million years ago, another crucial plate helped shape the Napa Valley of today. Back then the oceanic Farallon plate was subducting, or diving under, the North American plate. During subduction, some material is scraped from the plates and deposited on Earth's surface rather than sinking into the planet. As the Farallon plate moved under what is now California, it deposited a mix of material, so that today the western half of Northern California boasts a panoply of mineral riches, including blocks of limestone and sedimentary rock as well as fragments of the ancient sea floor.

At the same time, the Farallon plate was pulling away from the Pacific plate to the west, creating a "spreading center" where hot rock oozes up to fill the gap. Around 30 million years ago, this spreading center began to dive under the North American plate, and the San Andreas fault was born. Heat from the spreading center then triggered volcanic activity along the southern and northern boundaries of the transform fault. Volcanic rocks up to eight million years old have been found in the eastern part of the Napa Valley, says Verosub.

The valley itself formed as a result of a fault step-over—part of the network of complex fractures that branch off the main San Andreas fault line. Within a step-over, a particular fault jumps over an area of land but then continues in the same direction. Think of it like drawing a line on a piece of paper, stopping, moving your pencil down a few inches and continuing the same line. The area within a step-over is put under immense geological tension, which in some cases can cause the land to sink, effectively creating a valley.

Fault activity, as well as erosion via wind and rain, continued to break apart the many types of rocks around the valley, depositing their riches on the valley floor. The end result is the Napa Valley's spectacular diversity of soil: more than 100 variations, representing half of the world's recognized soil orders. In wine growing, soil diversity is extremely advantageous, allowing numerous grape varieties to grow in a relatively small area. In the southern part of the Napa Valley, for instance, the calcium-rich soil favors pinot noir grapes. In the north, more volcanic soils help cabernet grapes thrive.

Diverse soil isn't the only remnant of the valley's tectonic past. "The Napa Valley has a large climatic gradient due to the geomorphology created by the tectonics," says Verosub. Hills and knolls formed by megaslides from the Vaca Mountains millions of years ago influence the climate of the valley floor. All told, the Napa Valley is home to 14 distinct American Viticultural Areas, each of which is completely unique due to its combined soil and climate.

With continued activity along the fault, Napa remains vulnerable to earthquakes like the one on Sunday. Moreover, the high amount of sediment on the valley floor means the region really feels any shaking caused by tectonic movement. "[The sediment] may be great for grapes, but when there’s an earthquake anywhere in proximity, you get amplification," says Susan Hough, a seismologist at the United States Geological Survey in Pasadena, California. Still, any damages sustained during the earthquake may simply be the price Napa vintners pay for growing in such a geologically attractive part of the world.

Why Doesn't Anyone Know How to Talk About Global Warming?

Smithsonian Magazine

When Vox.com launched last month, the site's editor-in-chief, Ezra Klein, had a sobering message for us all: more information doesn't lead to better understanding. Looking at research conducted by a Yale law professor, Klein argued that when we believe in something, we filter information in a way that affirms our already-held beliefs. "More information...doesn’t help skeptics discover the best evidence," he wrote. "Instead, it sends them searching for evidence that seems to prove them right."

It's disheartening news in many ways—for one, as Klein points out, it cuts against the hopeful hypothesis set out in the Constitution and political speeches that any disagreement is merely a misunderstanding, an accidental debate caused by misinformation. Applied to our highly polarized political landscape, the study's results make the prospect of change seem incredibly difficult.

But when applied to science, the results become more frightening. Science, by definition, is inherently connected to knowledge and facts, and we rely on science to expand our understanding of the world around us. If we reject information based on our personal bias, what does that mean for science education? It's a question that becomes especially relevant when considering global warming, where there appears to be an especially large chasm between scientific knowledge and public understanding.

"The science has become more and more certain. Every year we’re more certain of what we’re seeing," explains Katharine Hayhoe, an atmospheric scientist and associate professor of political science at Texas Tech University. 97 percent of scientists agree that climate change is happening, and 95 percent of scientists believe that humans are the dominant cause. Think of it another way: over a dozen scientists, including the president of the National Academy of Sciences, told the AP that the scientific certainty regarding climate change is most similar to the confidence scientists have that cigarettes contribute to lung cancer. And yet as the scientific consensus becomes stronger, public opinion shows little movement. 

"Overall, the American public’s opinion and beliefs about climate change haven’t changed a whole lot," says Edward Maibach, director of George Mason University's Center for Climate Change Communication. "In the late 90s, give or take two-thirds of Americans believed that climate change was real and serious and should be dealt with." Maibach hasn't seen that number change much—polls still show about a 63 percent belief in global warming—but he has seen the issue change, becoming more politically polarized. "Democrats have become more and more convinced that climate change is real and should be dealt with, and Republicans have been going in the opposite direction."

It's polarization that leads to a very tricky situation: facts don't bend to political whims. Scientists agree that climate change is happening—and Democrats and Republicans alike are feeling its effects now, all over the country. The Intergovernmental Panel on Climate Change (IPCC) keeps reiterating that things look bleak, but avoiding a disaster scenario is still possible if changes are made right now. But if more information doesn't lead to greater understanding, how can anyone convince the public to act?

***

In the beginning, there was a question: what had caused the glaciers that once blanketed the Earth to melt? During the Ice Age, which ended around 12,000 years ago, glacial ice covered one-third of the Earth's surface. How was it possible that the Earth's climate could have changed so drastically? In the 1850s, John Tyndall, a Victorian scientist fascinated by evidence of ancient glaciers, became the first person to label carbon dioxide as a greenhouse gas capable of trapping heat in the Earth's atmosphere. By the 1930s, scientists had found an increase in the amount of carbon dioxide in the atmosphere—and an increase in the Earth's global temperature.

In 1957, Hans Suess and Roger Revelle published an article in the scientific journal Tellus that proposed that carbon dioxide in the atmosphere had increased as a result of a post-Industrial Revolution burning of fossil fuels—buried, decaying organic matter that had been storing carbon dioxide for millions of years. But it wasn't clear how much of that newly released carbon dioxide was actually accumulating in the atmosphere, versus being absorbed by plants or the ocean. Charles David Keeling answered the question through careful CO2 measurements that charted exactly how much carbon dioxide was present in the atmosphere—and showed that the amount was unequivocally increasing.

In 1964, a group from the National Academy of Sciences set out to study the idea of changing the weather to suit various agricultural and military needs. What the group members concluded was that it was possible to change climate without meaning to—something they called "inadvertent modifications of weather and climate"—and they specifically cited carbon dioxide as a contributing factor.

Politicians responded to the findings, but the science didn't become political. The scientists and committees of early climate change research were markedly bipartisan, serving on science boards under presidents both Democrat and Republican. Though Rachel Carson's Silent Spring, which warned of the dangers of synthetic pesticides, kicked off environmentalism in 1962, the environmental movement didn't adopt climate change as a political cause until much later. Throughout much of the '70s and '80s, environmentalism focused on problems closer to home: water pollution, air quality and domestic wildlife conservation. And these issues weren't viewed through the fracturing political lens often used today—it was Republican President Richard Nixon who created the Environmental Protection Agency and signed the National Environmental Policy Act, the Endangered Species Act and a crucial extension of the Clean Air Act into law.

But as environmentalists championed other causes, scientists continued to study the greenhouse effect, a term coined by the Swedish scientist Svante Arrhenius in the late 1800s. In 1979, the National Academy of Sciences released the Charney Report, which stated that "a plethora of studies from diverse sources indicates a consensus that climate changes will result from man's combustion of fossil fuels and changes in land use."

The scientific revelations of the 1970s led to the creation of the IPCC, but they also caught the attention of the Marshall Institute, a conservative think tank founded by Robert Jastrow, William Nierenberg and Frederick Seitz. The men were accomplished scientists in their respective fields: Jastrow was the founder of NASA's Goddard Institute for Space Studies, Nierenberg was the former director of the Scripps Institution of Oceanography and Seitz was the former president of the United States National Academy of Sciences. The institute received funding from groups such as the Earhart Foundation and the Lynde and Harry Bradley Foundation, which supported conservative and free-market research (in recent years, the institute has received funding from Koch foundations). Its initial goal was to defend President Reagan's Strategic Defense Initiative from scientific attacks, to convince the American public that scientists weren't united in their dismissal of the SDI, a persuasive tactic which enjoyed moderate success.

In 1989, when the Cold War ended and much of the Marshall Institute's projects were no longer relevant, the Institute began to focus on the issue of climate change, using the same sort of contrarianism to sow doubt in the mainstream media. It's a strategy that was adopted by President George W. Bush's administration and the Republican Party, typified when Republican consultant Frank Luntz wrote in a memo:

"Voters believe that there is no consensus about global warming within the scientific community. Should the public come to believe that the scientific issues are settled, their views about global warming will change accordingly. Therefore, you need to continue to make the lack of scientific certainty a primary issue in the debate."

It's also an identical tactic to one used by the tobacco industry to challenge research linking tobacco to cancer (in fact, Marshall Institute scientist Seitz once worked as a member of the medical research committee for the R. J. Reynolds Tobacco Company).

But if politicians and strategists created the climate change "debate," the mainstream media has done its part in propagating it. In 2004, Maxwell and Jules Boykoff published "Balance as bias: global warming and the US prestige press," which looked at global warming coverage between 1988 and 2002 in four major American newspapers: the New York Times, the Los Angeles Times, the Washington Post and the Wall Street Journal. What Boykoff and Boykoff found was that in 52.65 percent of climate change coverage, "balanced" accounts were the norm—accounts that gave equal attention to the view that humans were creating global warming and the view that global warming was a matter of natural fluctuations in climate. Nearly a decade after the Charney Report had first flagged man's potential to cause global warming, highly reputable news sources were still presenting the issue as a debate of equals.

In a study of current media coverage, the Union of Concerned Scientists analyzed 24 cable news programs to determine the incidence of misleading climate change information. The right-leaning Fox News provided misinformation on climate change in 72 percent of its reporting on the issue; left-leaning MSNBC also provided misinformation in 8 percent of its climate change coverage, mostly from exaggerating claims. But the study found that even the nonpartisan CNN misrepresented climate change 30 percent of the time. Its sin? Featuring climate scientists and climate deniers in such a way that furthers the misconception that the debate is, in fact, still alive and well. According to Maibach, the continuing debate over climate science in the media explains why fewer than one in four Americans know how strong the scientific consensus on climate change really is. (CNN did not respond to requests for a comment, but the network hasn't featured a misleading debate since February, when two prominent CNN anchors condemned the network's use of debate in covering climate change.)

Sol Hart, an assistant professor at the University of Michigan, recently published a study looking at network news coverage of climate change—something that nearly two-thirds of Americans report watching at least once a month (only a little over a third of Americans, by contrast, reported watching cable news at least once a month). Looking at network news segments about climate change from 2005 to mid-2011, Hart noticed what he perceived as a problem in the networks' coverage of the issue, and it wasn't a balance bias. "We coded for that, and we didn’t see much evidence of people being interviewed on network news talking about humans not having an effect on climate change," he explains.

What he did notice was an incomplete narrative. "What we find is that the impacts and actions are typically not discussed together. Only about 23 percent of all articles on network news talked about impacts and actions in the same story. They don’t talk about them together to create a cohesive narrative."

But is it the media's responsibility to create such a narrative? 

In the decades before the digital revolution, that question was easier to answer. Legacy media outlets historically relied on balance and impartiality; it wasn't their place, they figured, to compel their readers to act on a particular issue. But the information revolution, fueled by the web, has changed the media landscape, blurring the line between the journalist's role as factual gatekeeper and the journalist's role as activist.

"With the advent of digital online, there’s a lot more interaction with the audience, there’s a lot more contributions from the audience, there’s citizen journalists, there’s bloggers, there’s people on social media. There are tons and tons of voices," Mark Glaser, executive editor at PBS MediaShift, explains. "It’s hard to just remain this objective voice that doesn’t really care about anything when you’re on Twitter and you’re interacting with your audience and they’re asking you questions, and you end up having an opinion."

***

For a long time, climate change has been framed as an environmental problem, a scientific conundrum that affects Arctic ice, polar bears and penguins; a famously gut-wrenching scene from Al Gore's An Inconvenient Truth mentions polar bears having drowned looking for stable pieces of ice in a warming Arctic Ocean. It's a perfectly logical interpretation, but increasingly, climate scientists and activists are wondering whether or not there's a better way to present the narrative—and they're turning to social scientists, like Hart, to help them figure that out.

"Science has operated for so long on this information deficit model, where we assume that if people just have more information, they’ll make the right decision. Social scientists have news for us: we humans don’t operate that way," Hayhoe explains. "I feel like the biggest advances that have been made in the last ten years in terms of climate change have been in the social sciences."

As Hayhoe spoke about the frustrations of explaining climate change to the public, she mentioned a cartoon that circulated around the internet after the IPCC's most recent report, drawn by Australian cartoonist Jon Kudelka.

For scientists like Katharine Hayhoe, Jon Kudelka's cartoon sums up the frustrations of communicating climate change to the public. (Jon Kudelka)

"I think that my colleagues and I are becoming increasingly frustrated with having to repeat the same information again and again, and again and again and again—and not just year after year, but decade after decade," Hayhoe says.

In other countries around the world, the climate change message appears to be getting through. In a Pew poll of 39 countries, global climate change was a top concern for those in Canada, Asia and Latin America. Looking at data from all included countries, a median of 54 percent of people placed global climate change as their top concern—in contrast, only 40 percent of Americans felt similarly. A 2013 global audit of climate change legislation stated that the United States' greenhouse gas emission reduction targets are "relatively modest when compared with other advanced economies." And "almost nowhere" else in the world, according to Bill McKibben in a recent Twitter chat with MSNBC's Chris Hayes, has there been the kind of political fracturing around climate change that we see in the United States.

To help Americans get the message, social scientists have one idea: talk about the scientific consensus not more, but more clearly. Starting in 2013, Maibach and his colleagues at GMU and the Yale Project on Climate Change Communication conducted a series of studies to test whether, when presented with data on the scientific consensus, participants changed their minds about climate change. What they found was that in controlled experiments, exposure to a clear message conveying the extent of scientific consensus significantly altered participants' estimates of that consensus. Other experimental studies have turned up similar results—a study conducted by Stephan Lewandowsky of the University of Bristol, for example, found that a clear consensus message made participants more likely to accept scientific facts about climate change. Frank Luntz, to the shock of veteran pundit watchers, was right: a clear scientific consensus does seem to change how people understand global warming.

Partially in response to Maibach's findings, the American Association for the Advancement of Science recently released their report "What We Know: The Reality, Risks and Response to Climate Change." The report, Maibach says, is "really the first effort...that attempted to specifically surface and illuminate the scientific consensus in very clear, simple terms." The report's first paragraph, in plain terms, notes that "virtually every national scientific academy and relevant major scientific organization" agrees about the risks of climate change. The New York Times' Justin Gillis described the report's language as "sharper, clearer and more accessible than perhaps anything the scientific community has put out to date."

And yet, the report wasn't universally heralded as the answer to climate change's communication problem—and it wasn't just under fire from conservatives. Brentin Mock, writing for Grist, wasn't sure the report would win climate scientists new support. "The question is not whether Americans know climate change is happening," he argued. "It’s about whether Americans can truly know this so long as the worst of it is only happening to 'certain other vulnerable' groups." Slate's Philip Plait also worried that the report was missing something important. "Facts don’t speak for themselves; they need advocates. And these advocates need to be passionate," he wrote. "You can put the facts up on a blackboard and lecture at folks, but that will be almost totally ineffective. That’s what many scientists have been doing for years and, well, here we are."

To some, the movement needs more than a scientific consensus. It needs a human heart.

***

Matthew Nisbet has spent a lot of time thinking about how to talk about climate change. He's been studying climate change from a social science perspective since his graduate studies at Cornell University in the late 1990s and early 2000s and currently works as an associate professor at American University's School of Communication. And though he acknowledges the importance of a scientific consensus, he's not convinced it's the only way to get people thinking about climate change.

"If the goal is to increase a sense of urgency around climate change, and support an intensity of opinion for climate change being a lead policy issue, how do we make that happen?" he asks. "It’s not clear that affirming consensus would be a good long-term strategy for building concern."

Nisbet wanted to know if the context in which climate change is discussed could affect people's views about climate change: is the environmental narrative the most effective, or could there be another way to talk about climate change that might engage a wider audience? Along with Maibach and other climate change social scientists, Nisbet conducted a study that framed climate change in three ways: in a way that emphasized the traditional environmental context, in a way that emphasized the national security context and in a way that emphasized the public health context. 

They thought that maybe placing the issue of climate change in the context of national security could help win over conservatives—but their results showed something different. When it came to changing the opinions of minorities and conservatives—the demographics most apathetic or hostile to climate change—public health made the biggest impact.

"For minorities, where unemployment might be 20 percent in some communities, they face everyday threats like crime. They face discrimination. Climate change is not going to be a top of mind risk to them," Nisbet explains. "But when you start saying that climate change is going to make things that they already suffer from worse, once you start talking about it that way, and the communicators are not environmentalists or scientists but public health officials and people in their own community, now you've got a story and a messenger that connects to who they are."

The public health angle has been a useful tool for environmentalists before—but it is especially effective when combined with tangible events that unequivocally demonstrate the dangers. When smog blanketed the industrial town of Donora, Pennsylvania, for five days in 1948, killing 20 people and sickening another 6,000, America became acutely aware of the danger air pollution posed to public health. Events like this eventually spurred action on the Clean Air Act, which has played a large part in the reduction of six major air pollutants by 72 percent since its passage.

One voice that has begun focusing on the tangible impacts of climate change by showing its effects on everything from public health to agriculture is Showtime's new nine-part documentary series "Years of Living Dangerously." Eschewing images of Arctic ice and polar bears, the show tackles the human narrative head-on, following celebrity hosts as they explore the real-time effects of climate change, from conflict in Syria to drought in Texas. Over at the Guardian, John Abraham described the television series as "the biggest climate science communication endeavor in history."

But, as Alexis Sobel Fitts pointed out in her piece "Walking the public opinion tightrope," not all responses to the series have been positive. In a New York Times op-ed, representatives of the Breakthrough Institute, a bipartisan think tank committed to "modernizing environmentalism," argue that the show relies too heavily on scare tactics, which might ultimately harm its message. "There is every reason to believe that efforts to raise public concern about climate change by linking it to natural disasters will backfire," the op-ed states. "More than a decade’s worth of research suggests that fear-based appeals about climate change inspire denial, fatalism and polarization." The reception of "Years of Living Dangerously," Fitts argues, reflects complex public opinion—for a subject as polarizing as climate change, you're never going to be able to please everyone.

Glaser agrees that the situation is complex, but feels that the media owes the public honesty, whether or not the truth can be deemed alarmist.

"I think the media probably should be alarmist. Maybe they haven’t been alarmist enough. It’s a tough balancing act, because if you present something to people and it’s a dire situation, and that’s the truth, they might just not want to accept it," he says. "That response, to say, 'This is just exaggerated,' is just another form of denial."

***

Climate change, some say, is like an ink blot test: everyone who looks at the problem sees something different, which means that everyone's answer to the problem will inherently be different, too. Some social scientists, like Nisbet, think that such a diversity of opinions can be a strength, helping create a vast array of solutions to tackle such a complicated issue.

"We need more media forums where a broad portfolio of technologies and strategies are discussed, as well as the science," Nisbet explains. "People need to feel efficacious about climate change—what can they do, in their everyday lives, to help climate change?"

Sol Hart, the Michigan professor, agrees that the current climate change narrative is incomplete. "From a persuasive perspective, you want to combine threat and efficacy information," he explains. "So often, the discussion is that there are very serious impacts on the horizon and action needs to be taken now, but there’s not much detail on action that could be taken."

Adding more context to stories might help round out the current narrative. "There’s such noise and chaos around a lot of big stories, and people just take these top-line items and don’t really dig deeper into what are the underlying issues. I think that’s been a big problem," Glaser explains. Slate has been doing explanatory journalism for years with its Explainer column, and other sites, like Vox and The Upshot (an offshoot of the New York Times), are beginning to follow a similar model, hoping to add context to news stories by breaking them down into their component parts. According to Glaser, that's reason for optimism. "I think news organizations do have a responsibility to frame things better," he says. "They should give more context and frame things so that people can understand what’s going on."

But Hayhoe thinks that we need more than just scientists or the media—we need to engage openly with one another.

"If you look at science communication [in Greek and Roman times] there were no scientific journals, it wasn’t really an elite field of correspondence between the top brains of the age. It was something that you discussed in the Forum, in the Agora, in the markets," she says. "That’s the way science used to be, and then science evolved into this Ivory Tower."

One organization that's trying to bring the conversation down from the Ivory Tower and into the lives of ordinary citizens is MIT's Climate CoLab, part of the university's Center for Collective Intelligence, which seeks to solve the world's most complex problems through crowdsourcing collective intelligence. Without even signing up for an account, visitors interested in any aspect of climate change can browse a number of online proposals, written by people from all over the world, which seek to solve problems from energy supply to transportation. If users want to become more involved, they can create a profile and comment on proposals, or vote for them. Proposals—which can be submitted by anyone—go through various rounds of judging, both by CoLab users and expert judges. Winning proposals are presented at a conference at MIT, in front of experts and potential implementers.

"One of the things that's novel and unique about the Climate CoLab is the degree to which we're not just saying 'Here's what's happening,' or 'Here's how you should change your opinions,'" Thomas Malone, the CoLab's principal investigator, explains. "What we're doing in the Climate CoLab is saying, 'What can we do, as the world?' And you can help figure that out.'"

Climate change is a tragedy of the commons, meaning that it requires collective action that runs counter to individual desires. From a purely self-interested standpoint, it might not be in your best interest to give up red meat and stop flying on airplanes so that, say, all of Bangladesh can remain above sea level or southeast China doesn't completely dry out—that change requires empathy, selflessness and a long-term vision. That's not an easy way of thinking, and it runs counter to many Americans' strong sense of individualism. But by the time that every human on Earth suffers enough from the effects of rising temperatures that they can no longer ignore the problem, it will be too late.

Why Does the Internet Hate Renoir?

Smithsonian Magazine

Ask any art expert to name an Impressionist, and Pierre-Auguste Renoir is sure to come up. His early paintings, such as Luncheon of the Boating Party, are heralded, famous works. But don't tell that to Max Geller, the person behind a popular Instagram account called "Renoir Sucks At Painting."

For months, Geller has led a tongue-in-cheek campaign against Renoir. In April, he petitioned the White House to remove all of the artist's paintings from the National Gallery of Art. Nearly 10,000 people follow his Instagram account, and earlier this month, the movement finally broke out of cyberspace. A small group, led by Geller, held a mock protest outside the Museum of Fine Arts in Boston, reports Mahita Gajanan for the Guardian.

Protesters' signs reading "ReNOir" and "God Hates Renoir" quickly attracted media attention, and apparently convinced more people to join the cause.

In an interview with NPR's Laura Wagner, Geller explains his stance:

I hate Renoir because he is the most overrated artist east, west, north and south of the river Seine. I think in real life trees are beautiful and the human eyeball conveys emotional force. If you took his word for it, trees would be a collection of disgusting, green squiggly lines and eyeballs would be jet black as if they were colored by sharpies. In real life trees are beautiful; Renoir just sucks at painting.

The saccharine sweetness of some of Renoir's paintings draws Geller's particular ire—he calls it "treacle." But critics have long expressed disapproval of this quality of Renoir's work. A century ago, American Impressionist Mary Cassatt criticized his paintings of "enormously fat women with very small heads." As recently as 2007, New York Times art critic Roberta Smith bemoaned his "acres of late nudes" and the "ponderous staginess" of his work.

Whether Geller's campaign is a jokey stunt, or an irreverent call for serious reconsideration, it does prove one thing: even old, familiar art can be controversial.

Why Do We Love Period Dramas So Much?

Smithsonian Magazine

The biggest costume drama in history premiered 77 years ago, and we’re still just as in love with the genre today.

Gone With The Wind premiered on this day in 1939, in Atlanta, Georgia. It was huge, writes Carrie Hagen for Smithsonian.com, both culturally and financially. The governor had declared that day a state holiday, and before the movie started “around 300,000 fans lined the flag-decorated streets to greet the movie’s stars,” she writes. Gone With The Wind remains the highest-grossing film of all time when adjusted for inflation. But what was behind the appeal of the costume drama?

“Modern audiences can see the intrinsic racial problems in the film’s nostalgic treatment of the Confederacy,” Hagen writes. Similarly, we can see race and gender problems in period dramas like Downton Abbey. When things like violence against women or overt racism toward black people appear on screen, most of us know we wouldn’t accept them out in the world today. Yet many among us still love these shows, which draw extremely high viewing numbers. The question is why.

“We Americans love our costume dramas, and we particularly love those that play at cultural and social experiences beyond the ken of our national collective identity,” writes s.e. smith in a Bitch Magazine article about Indian Summers, PBS’s follow-up to Downton Abbey. Period dramas like those two or, say, any production related to the work of Jane Austen aren’t set in a world that people today inhabit. This is also true of Gone With The Wind, which was set in the Confederate South, a place that was long gone when the movie premiered.

Period dramas tend to focus on the aesthetics of the past rather than on its real hardships (although some of those are thrown in to keep the story moving). For the story of Rhett and Scarlett, the success of the 1,000-page novel it was based on helped the movie, but so did “the epic’s record-setting production costs, which brought elaborate wardrobes and new uses of Technicolor and sound to screen,” Hagen writes. “But perhaps another reason for its longevity is its glamorous portrayal of an ideology that lost a war a long time ago.”

“People dress up for Downton Abbey parties as the people upstairs, not the people downstairs,” smith told Sarah Mirk in a separate interview for Bitch Magazine. “You don’t see things that would’ve been common at the time. There wouldn’t have been electricity in the servants’ quarters, servants were probably using outhouses rather than indoor plumbing, servants were eating the worst cuts of meat and the leftovers.” What viewers of Downton Abbey see of the servants’ world is mostly “this kind of bright, idealized version of the comfortable English farm kitchen.” Similarly, Downton doesn’t really show how life was for people of color or people with disabilities, Mirk notes. It shows a beautifully set version of what life was like long ago and far away.

The thing about recreating the past, as the makers of period dramas do, is that it doesn't have to look as complicated as the present. No matter how earnest their intent to replicate the past, in fact, it cannot look as complicated as the present. Of course, to the people who lived in Edwardian England, it was exactly as complicated as 2016 America seems today. We can read the past or see it on the screen, but we never have to truly experience how complex and difficult it was. That can be a comfort for viewers, because really, their lives are complicated enough.

Why Didn't the First Earth Day's Predictions Come True? It's Complicated

Smithsonian Magazine

The first Earth Day was revolutionary. That can be difficult to imagine today as we’re bombarded by calls for sustainability year-round. Yet only 46 years ago, some 20 million Americans protested and demanded that the government curb pollution, protect wildlife and conserve natural resources.

Remarkably, government leaders listened. In the years after the first Earth Day, the Environmental Protection Agency was founded. Congress passed the Clean Air Act, the Clean Water Act and the Endangered Species Act, among other powerful environmental laws. In short, Earth Day changed the trajectory of our country and, probably, the world.

Environmental scientists led the movement, predicting chilling futures—that overpopulation would cause worldwide famine; pollution would blanket cities and kill thousands; a mass extinction was upon us; oil and mineral reserves were about to run out. Nearly all of these predictions foresaw doom by the year 2000—which we’re now far past. While environmental concerns still reign, the extreme conditions predicted 46 years ago have, for the most part, not yet materialized.

It’s easy to poke fun at these “failed predictions”—and many environmental skeptics do. Those critics aren’t entirely wrong; some of the era’s predictions were based on faulty logic. But others failed to come true because the predictions themselves changed the course of history.

Running Out Of Everything

Many of the era’s incorrect predictions centered on resource scarcity—oil, minerals, food—but perhaps the most famous one came ten years after the first Earth Day, when a scientist and an economist made a public bet that lives on in environmental discourse today.

The scientist was Paul Ehrlich, an outspoken biologist whose studies on the population dynamics of butterflies led him to a dramatic conclusion: That the human population was too big and soon would strip the world of resources, leading to mass starvation.

The economist was Julian Simon, who disagreed with Ehrlich. Humans are not butterflies, he argued, and have a powerful tool that prevents resource scarcity: a market economy. When a useful resource becomes rare, it becomes expensive, and that high price incentivizes exploration (to find more of that resource) or innovation (to create an alternative).

The two never met or debated in person. But in 1980, Simon challenged Ehrlich to a bet in the pages of a scientific journal, and Ehrlich accepted. The biologist selected five raw minerals—chromium, copper, nickel, tin, and tungsten—and noted how much of each he could buy for $200. If his prediction was right and resources were growing scarce, in 10 years the minerals should become more expensive; if Simon was correct, they should cost less. The loser would pay the difference.

In October 1990, ten years later, Simon received a check in the mail from Ehrlich for $576.07. Each of the five minerals had declined in price. Simon and his faith in the market were victorious.
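The settlement itself is simple arithmetic. Here is a minimal Python sketch of how the wager pays out, assuming the loser pays the change in the basket's total value; the $200-per-mineral stake and the $576.07 check come from the account above, while the individual 1990 values are hypothetical placeholders, chosen only so the basket sums to $1,000 minus $576.07, or $423.93, since per-mineral prices aren't given here.

    # Sketch of the Simon-Ehrlich wager settlement, not actual market data.
    # The 1990 values below are hypothetical; only the totals matter.
    STAKE = 200.00
    minerals = ["chromium", "copper", "nickel", "tin", "tungsten"]
    basket_1980 = {m: STAKE for m in minerals}          # $1,000.00 total

    # Hypothetical 1990 value of each $200 holding (every mineral fell).
    basket_1990 = {"chromium": 120.00, "copper": 163.93,
                   "nickel": 60.00, "tin": 35.00, "tungsten": 45.00}

    change = sum(basket_1990.values()) - sum(basket_1980.values())
    # If the basket lost value, scarcity failed to bite and Ehrlich pays;
    # if it gained value, Simon pays. Either way, the loser pays the gap.
    loser = "Ehrlich" if change < 0 else "Simon"
    print(f"{loser} owes ${abs(change):.2f}")           # Ehrlich owes $576.07

However the individual numbers fall, the mechanism is the point: the bet settled on the change in the basket's total value, so any mix of price movements that netted out to a decline would have produced the same check.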

“The market is ideally suited to address issues of scarcity,” says Paul Sabin, a Yale environmental historian who wrote the book on the Simon-Ehrlich Wager. “There’s often cycles of abundance and scarcity that are in dynamic relationship with each other where one produces the other.”

Take oil: Repeatedly over the past decades, oil prices have shot up, leading some people to predict peak oil—the end of fossil fuels and the start of an energy crisis. But by market logic, high prices encourage enterprising people to seek new oil sources, develop new extraction technologies, or otherwise invest in bringing oil onto the market. Demand and high prices brought us fracking, for instance, and now gas at the pump is cheaper than ever. Research into the next potential oil technology, extraction of methane hydrates, is already underway.

Similar patterns occur with minerals like copper, one of Ehrlich’s picks from his wager with Simon. At the time of the bet, the price of copper was on the rise, and, as a result, some investors took to copper production, increasing supply, says Sabin. Then in 1977, GE and Bell laid their first fiber-optic phone lines, which carry more information than copper wire. The new technology spread through the 1980s—and by the end of the Simon-Ehrlich wager, demand for copper was down, as was its price.

Each mineral from the bet has its own story, says Sabin, and many involve people. An international tin cartel collapsed, leading to a drop in tin prices. With other metals, strikes and union resistance were sorted out, and prices dropped.

Feeding the Planet

The biggest apocalyptic claims around the first Earth Day related to overpopulation and food shortages. "Population will inevitably and completely outstrip whatever small increases in food supplies we make," Ehrlich said in an often-quoted 1970 Mademoiselle interview. “The death rate will increase until at least 100-200 million people per year will be starving to death during the next ten years.”

Ehrlich was right about the growing population—but not about mass starvation. Famine and starvation continue throughout the world, but not to the extremes he predicted. The reason is the Green Revolution, which began decades before the first Earth Day, in Mexico, and really gained steam just about the time Ehrlich made his predictions.

In the 1940s, Mexico imported half of the grain needed to feed its population. Its government feared food scarcity and famine—and those fears sparked an agricultural revolution.

The Mexican Ministry of Agriculture teamed up with the Rockefeller Foundation to bring in American biologists to work on the problem, one of whom was Norman Borlaug. Over several decades, Borlaug used selective breeding to create strains of wheat with bigger kernels and smaller stems that could feed more people per acre; similar techniques were applied to rice. As a result, by 1963 Mexico was exporting wheat instead of importing it, and by 1980 wheat yields had doubled in Pakistan and India, with poverty rates halving even as populations expanded.

Ultimately, Ehrlich and others’ predictions about feeding our growing population failed to come true; human ingenuity found a way. But even Borlaug acknowledged that increasing yields would not be a permanent solution.

“The green revolution has won a temporary success in man's war against hunger and deprivation; it has given man a breathing space,” Borlaug said in a speech after he received the Nobel Peace Prize in 1970. “But the frightening power of human reproduction must also be curbed; otherwise the success of the green revolution will be ephemeral only.”

The Pollution Problem

Around the first Earth Day, environmental scientists made dire predictions about pollution. “In a decade, urban dwellers will have to wear gas masks to survive air pollution,” reported Life magazine in 1970. “At the present rate of nitrogen buildup, it’s only a matter of time before light will be filtered out of the atmosphere and none of our land will be usable,” said ecologist Kenneth Watt.

These predictions didn’t come to pass, but not because of economic incentives. When the synthetic pesticide DDT caused bird populations to plummet, as Rachel Carson documented in Silent Spring, there were no market incentives to reverse that trend. An increase in lead poisoning or asthma creates a market for medicines and treatment, but not for decreasing the pollutants that cause them.

And so on that first Earth Day, people fighting oil spills, power plant pollution, pesticides and litter protested in the streets. The government responded to public outcry, activism and the collective predictions of the era by creating our most powerful environmental laws—the Clean Air Act, the Clean Water Act, the Endangered Species Act and others.

“The sense of concern, the feeling of crisis, the agitation and political mobilization associated with [the era’s predictions] interestingly had an effect not on energy or mineral resource production but on control of pollution,” says Sabin. “People like Ehrlich shared a vision that the path that we were on wasn’t a good one, that it was headed towards crisis—and that gave energy and support for the legislation.”

And the regulations have worked. After DDT was banned in 1972, populations of bald eagles and other birds rebounded. Regulations on nitrogen dioxide and particulate pollution have improved air quality in cities and, with it, children’s lung development. In the late 1970s, 88 percent of American children had elevated lead levels in their blood; after leaded gasoline was phased out, that number dropped to less than 1 percent.

Pollutants continue to cause problems; the horrific case of lead poisoning in Flint shows that regulations are not perfect solutions. But those predictions and the resulting activism during the first Earth Day drove change.

The Legacy Lives On

Even though the dire predictions didn’t come to be, they live on in our environmental discourse—and then as now, the most extreme voices get the most attention.

“It is important to acknowledge that there’s a relationship between the past predictions and the current ones,” says Sabin. “They helped feed a dynamic of extremes with both sides bashing each other.”

This is evident in the loudest parts of the climate change discussion. Extremists on one side are certain the world is going to end; extremists on the other are certain everything is fine and climate change is a conspiracy.

The truth is more complicated. Climate change won’t destroy the planet, although it will change the environment we’re accustomed to, in ways we can't predict and with possibly dire consequences. And weaponizing “failed predictions” of the past to justify leaving the climate problem to the market is deceptive. If we don't act because a previous prediction "failed," we face an array of human suffering, which will hit the poorest and disadvantaged the hardest.

“We should try to figure out the relationship between the earlier predictions and the current ones,” says Sabin. “The environmental community and advocates for climate action will be in a stronger position if they can figure out how to explain why climate change is different [from past predictions of resource scarcity] and why we need to take action now.”

Why Did Flamingos Flock to Mumbai in Record Numbers This Winter?

Smithsonian Magazine

Since the 1980s, a large flock of migratory flamingos has come to Mumbai with the intent to nom. Between 30,000 and 40,000 of the large pink birds have typically frequented the capital city of the Indian state of Maharashtra. This year, however, the population of flamingos has tripled, reports Payal Mohta at The Guardian, with conservationists estimating that 120,000 of the birds are hanging out along the mudflats of Thane Creek to enjoy a buffet of blue-green algae.

So why have so many extra flamingos joined the party? Researchers suspect one factor may have to do with sewage. Clara Lewis at The Times of India reports that despite the establishment of the Thane Creek Flamingo Sanctuary in recent years, the area has become a hot spot for pollution. A 2016 report on the water quality revealed alarming levels of pollution in Thane Creek brought on by unchecked sewage discharges and illegal dumping.

It’s believed that all of that organic waste is causing a boom in the growth of the blue-green algae in the mudflats where the flamingos go to feast.

“It is a well-studied phenomenon in nature that one species’ waste is food for the other,” Debi Goenka, honorary secretary of the Bombay Natural History Society (BNHS), tells Mohta of the Guardian. “The sewage in the creek promotes biological growth of blue-green algae, which is food for the flamingo.”

Conservationist and naturalist Sunjoy Monga, who has authored a book on Mumbai’s birds, agrees, saying that it’s unlikely there would be so many birds if the human imprint on the body of water wasn’t so apparent. “This phenomenon is called edge nature,” he says. “Here, wilderness merges with human impact and some species are able to thrive in it. It’s a double-edged sword.”

If the spike in flamingos indicates a trend, though, conservationists fear it may be a short-lived one. The mudflats where the birds congregate are under multiple threats: While the sewage and construction debris being flushed down Thane Creek may be the cause of the expansion of the mudflats and adjacent mangroves, without intervention, the sediment buildup threatens to block the creek entirely. “Over time, the deposition of sediment has narrowed the channel,” a 2017 study noted. In that scenario, the whole area could dry up, destroying the mangroves and flamingo habitat.

Development is also a concern. Mohta reports that the Uran wetlands, once home to a flock of flamingos, were recently reclaimed for the construction of an airport, and that the construction of a sea bridge across the Thane Creek mudflats, the Mumbai Trans-Harbour Link, caused the birds to move from their preferred location. Last month, authorities also approved a bullet train route that would bisect the flamingo sanctuary.

The BNHS is still working to give a more definitive answer as to why so many flamingos flocked to Mumbai this year. Since the society launched a 10-year project to study the birds last October, Lewis of the Times of India reports, a 20-person team has been counting the flamingos and testing the water for heavy metals and other pollutants.

Rahul Khot, assistant director of BNHS and principal investigator of the team, says the researchers have already collected some interesting data: Of the two species of flamingos found in Mumbai—the greater flamingo and lesser flamingo—the number of greater flamingos has decreased since October, while the number of lesser flamingos has skyrocketed. In the future, they plan to add radio trackers to birds to gain a better understanding of their migration patterns.

“It’s really good to see large number of birds visiting this metrocity,” Khot says in an interview with NPR, “but that also adds to our responsibility to conserve their habitat so that incoming future coming generation will also enjoy this bird.”

Why Cokie Roberts Admired Dolley Madison

Smithsonian Magazine

When Cokie Roberts started out in journalism in the 1960s, the constant refrain she heard from men in the business was “we don’t hire women to do that.”

But the congressional journalist and political commentator—who died at age 75 on Tuesday "due to complications from breast cancer," according to a family statement—carved her own space in the industry and, in the process, helped transform the role of women in the newsroom.

“It was very difficult,” Roberts later said in an interview with Smithsonian Associates’ Paul Vogelzang. “When you moved up through the ranks you were often the only woman there. When people finally put women on the air, they basically had their one woman and that was it.”

The daughter “of prominent U.S. Representatives Hale Boggs and Lindy Boggs, who represented a New Orleans-centered district for half a century,” as a biography and oral history by the U.S. House explains, her early memories were filled with moments like “riding the old Senate subway, with its wicker seats; accompanying her father on the House Floor on the Opening Day of Congress in the late 1940s; prodding her father to speak out on the floor in support of the Voting Rights Act of 1965; and listening to prominent dinner guests such as Speaker Sam Rayburn of Texas.”

Because of her family’s history, Roberts—born Mary Martha Corinne Morrison Claiborne Boggs in New Orleans, Louisiana, in 1943, but known as “Cokie” since childhood because her brother couldn't pronounce Corinne—never questioned that she would get into politics in some capacity. All of those formative years spent at the Capitol and House of Representatives made an impact. "I became deeply committed to the American system,” she recalled in the oral history project, “And as close up and as personally as I saw it and saw all of the flaws, I understood all of the glories of it.”

But rather than run for office herself, which she worried would cause difficulties for her husband, journalist Steve Roberts, she chose to cover Capitol Hill as a reporter. By the 1980s she’d risen to national prominence as a journalist for NPR and ABC News.

In a statement, NPR president and CEO Jarl Mohn praised her “signature voice and commentary…[which] accompanied public radio listeners, provided context for news and [has] been a familiar presence in their homes." Roberts, who joined the broadcasting company in 1978 to report on the Panama Canal Treaty, was, as Mohn added, seen as "one of NPR's 'founding mothers,'” alongside journalists such as Nina Totenberg, Linda Wertheimer and Susan Stamberg. (The reason there was some space for women at NPR early on, of course, was that the pay was significantly less than what commercial networks of the day were offering, as NPR national political correspondent Mara Liasson pointed out in an interview earlier this year.)

Throughout her career, Roberts was widely respected by her peers in media and by the politicians she covered on both sides of the political aisle. In the wake of Roberts’ death, as Neil Genzlinger writes in her New York Times obituary, Representative Eric Swalwell, a California Democrat, recalled on Twitter “a 2001 talk in which she ‘encouraged all of us, Republicans and Democrats, to always seek consensus where we could.’”

Perhaps because she was long accustomed to being one of the few women in the room, Roberts also paid special attention to women’s history. It was, in fact, because of her depth of knowledge on the first ladies of the United States that Kim Sajet, director of the Smithsonian’s National Portrait Gallery, invited her to speak on the museum’s "Portraits" podcast this summer.

Sajet remembers first meeting Roberts many years ago during her tenure as the president of the Historical Society of Pennsylvania. “She was just incredibly smart and incredibly funny. She really knew her homework and was quite irreverent as well,” Sajet says, adding that Roberts "looked at history at a 90-foot height and can fill in the history with all these interesting details."

Tellingly, when Roberts was asked before the podcast which of the presidents' wives she wanted to focus on, Dolley Madison was among her top picks. The fourth first lady, says Sajet, embodied a model of navigating Washington society after which Roberts, in a way, cast her own career.

“It didn’t matter where you were on politics, Dolley would bring anyone into her drawing room. Everyone could talk it through and work it out," says Sajet. "That was one of the things Cokie was admiring of, I believe, that Dolley brought people of different opinion together in a respectful and open way to talk."

Why Chemicals in the U.S. Are Still “Innocent Until Proven Guilty”

Smithsonian Magazine

Last month, President Barack Obama signed a chemical bill that was meant to solve a problem few people knew they had. That problem was the substandard safety of everyday chemicals—an issue that affects anyone who uses household cleaners, has a couch or wears clothing. In a month filled with dramatic political news, this seemingly small legislative achievement received little media attention. Yet it actually represents a major reform, providing the decades-old Toxic Substances Control Act (TSCA) with a much-needed retrofit.

In the European Union, safety laws guarantee that both industrial and household chemicals are vetted for their potential risks to human health and the environment before they appear on the market. In the United States, however, chemicals are generally “innocent until proven guilty”—a maxim that makes sense for people, but not for potentially toxic chemicals. Scientists at the Environmental Protection Agency have found that the majority of chemicals in use today have not been sufficiently examined for human health toxicity or environmental exposure. How can this be?

Originally passed in 1976, the old TSCA was meant to help the EPA regulate the safe production and use of industrial chemicals. But the act was founded on scientific assumptions and practices that are far outdated today. Perhaps worse, TSCA also grandfathered in a long list of “existing” chemicals—which made it extremely difficult for the EPA to pull them from the market even if they were later shown to be harmful. (It has been easier for the EPA to require companies to develop data on chemicals that are new to the market, but many hurdles still exist.)

As a result, people have been exposed to toxic chemicals left under-regulated by the EPA for decades—with devastating effects. This has been the case since 1989, when a federal court overturned the EPA’s ban on asbestos, one of the best-known carcinogens ever used. Since then, the EPA has never attempted to completely pull an existing chemical from the market. Lead, which is known to harm children’s brain development at extremely low levels and was banned from use in house paint in 1978, is still used in ammunition and some industrial manufacturing.

Newly developed chemicals approved by the EPA through the TSCA review process have also proved to be hazardous. FireMaster 550, a flame retardant, was developed as a supposedly safer replacement after the leading flame retardant for furniture foam was banned in several states and pulled from the market. The EPA reviewed and approved it for use in 1997. Yet by 2012, scientists were uncovering evidence that it is a neurotoxic obesogen (a compound that can lead to weight gain by altering fat metabolism).

Despite the fact that the EPA has recently labeled FireMaster 550 to be of “high” or “very high” concern for reproductive, developmental, neurological and aquatic toxicity, it remains on the market. In fact, today it's still praised by its manufacturer as “an innovative move to greener chemicals.”

Responding to these failures, public health advocates have been pushing for TSCA reform for decades. Activists pursued state-level restrictions, creating an uneven “patchwork quilt” of regulations that made it hard for chemical manufacturers and retailers to stay ahead of the rules around the country. As an advocacy leader from the manufacturing industry told me in an anonymous interview for my book on the topic: “We would like to have a level playing field across all 50 states, and have preemption over anything a state might try to develop.” To push for its preferred version of TSCA reform, the chemical industry has spent more than $125 million on lobbying since 2014.

The new act ensures that the EPA will now prioritize and evaluate chemicals based on risk, not cost-benefit calculations. In other words, the agency has to affirm the expected safety of newly developed chemicals. The act also somewhat reduces chemical companies’ ability to hide important data behind the veil of “confidential business information.” In addition, the act requires that the EPA rely less on animal testing and more on high-throughput testing and screening—guidelines that are not only more humane, but also in line with developments in toxicity research over recent decades.

These are all major strides. “The general consensus is that this bill is ‘better than current law,’” notes Nancy Buermeyer of the Breast Cancer Fund, a nonprofit that aims to prevent environmental causes of cancer, including toxic chemicals. But it still “falls far short” in important ways, she says, as should be expected from any piece of legislation so enthusiastically supported by the industry it is charged with regulating. The act requires risk evaluations of only 20 high-priority chemicals at a time, a fraction of the more than 80,000 chemicals currently on the TSCA inventory. It also preempts states from enacting their own restrictions on potentially dangerous chemicals as soon as EPA begins its review, even though such reviews can take years, and bars future action on EPA-evaluated chemicals with few exceptions. 

Ultimately, the effectiveness of the act will come down to how it is implemented. The EPA has already released a timeline for the next year. Of particular note is the establishment of a “Science Advisory Committee on Chemicals,” which is meant to provide independent expertise and consultation to the EPA. These efforts by EPA scientists, federal regulators and involved stakeholders like the chemical industry and environmental advocates will determine whether the agency can achieve its goal of evaluating chemicals based on the “best available science.”

The new law is a step in the right direction, but it remains to be seen whether it will do enough to hold potentially harmful chemicals accountable. 

Why Charging for Plastic Bags Makes People Give Them Up

Smithsonian Magazine

Across the world, plastic bags are losing popularity. The European Union has called for an 80 percent reduction by 2019, and Italy, Wales, Ireland and France began charging extra for plastic bags or eliminating them altogether several years ago. In the U.S., some jurisdictions have banned the bags outright (California and Hawaii) or set up a bag fee system (Washington, D.C.).

Anti-bag activists feared that bag fees wouldn't do much—people would just shrug and pay the five cents for the bag. According to new research, however, that has not been the case: bag fees do indeed reduce overall bag use.

The authors reached these conclusions after studying Buenos Aires, where some grocery stores in certain parts of the city charge for bags, and others do not. They interviewed people about their bag use and also measured bag use before and after the new rules were instated. 

The researchers found that it's not that people are worried about wasting money on the bags, the Washington Post reports. It's that the fee forces us to recognize the problem. That alone is enough of an incentive to get us to either carry our groceries or bring our own reusable bag. "This small change disrupts habitual behaviors and helps people draw a tighter linkage between the environmental awareness that they already possess," the Post writes, "and actions in the world that actually advance that consciousness and their values."

In other words, we don't need to go so far as banning all bags, at least not in the beginning. To get the movement started, the Post concludes, "You can just give people the slightest push, and let them fix the problem themselves."

Why Are Superachievers So Successful?

Smithsonian Magazine

What does a Pulitzer Prize-winning war photographer have in common with a tennis legend? Or how about a celebrated opera diva and a Los Angeles civil rights lawyer? What does Alec Baldwin have in common with Yogi Berra?

A lot, says journalist Camille Sweeney, who, along with co-author Josh Gosfield, interviewed dozens of highly accomplished men and women for a new book, The Art of Doing: How Superachievers Do What They Do and How They Do It So Well. Whether someone is setting out to create one of the most popular blogs on the Internet, as Mark Frauenfelder did with BoingBoing, or to win a record amount of money on "Jeopardy!," people who accomplish amazing things rely on a particular collection of strategies to get to the top—and many of them are not what you’d expect.

Who is a superachiever?

Somebody at the top of their craft. Ken Jennings, for example, he didn’t just win on "Jeopardy!," he was the winningest contestant ever on "Jeopardy!"—he won 74 times. It’s the person who is going beyond success.

Do you think that the people you interviewed for the book are fundamentally different from the rest of us?

No! It’s interesting. I think when we started out I might have thought that. But after talking to them and really thinking about their lives, I don’t think that they’re different. When they arrived at what they thought they were going to be doing, they just kept at it. They kept up the energy. And when all the doubters and the haters were saying, “This isn’t going to work,” they didn’t listen. When they felt like they could learn something, they took what they could. It gave me hope that if you put your mind to something, you can be a superachiever. It takes a lot of work, and the work doesn’t stop. These people are pretty 24/7 about what they’re doing.

Your book includes profiles of a wide array of people—business gurus, scientists, actors, musicians, writers and athletes. How did you decide whom to include?

We always thought of our cast of characters as being the most fabulous dinner party you could go to. Anywhere you could sit, you would be getting information from people as disparate as high-wire artist Philippe Petit, dog whisperer Cesar Millan or the opera diva Anna Netrebko.

This is an eclectic group, but you discovered they all share several key strategies and personality traits. What are some of the common threads?

Probably the biggest is self-awareness—the ability to be self-questioning. I love to talk about Martina Navratilova. She had picked tennis up as a young girl and was playing extremely well, better than 99.9 percent of people worldwide ever played tennis. Yet, she was very inconsistent. She had this realization when [American tennis great] Chris Evert beat her, just a drubbing, that all along she was playing based on the assumption that talent and instinct alone was enough to get her to the top and keep her there. She realized that she was not in nearly the condition that she would need to be to be able to play consistently, so she started playing four hours every day. She transformed herself into a playing machine. Using this process of self-evaluation, she was able to get so much further than she would have had she not. She’s just one example, but we kept seeing this over and over again.

Superachievers might look like loners—at the top of the mountain, by themselves. But they all found ways to connect themselves to people who would support their dreams and their goals. Everybody had this skill of active listening, when you’re taking in what another person’s saying and processing it, listening for information that you’re going to put into action. That’s something that’s surprising for very successful people—you would imagine that they don’t want to be told (what to do), because they know everything. You wouldn’t think that Tony Hsieh, the CEO of Zappos.com, or Martina Navratilova, has to listen, but that is what they’re doing.

Another thing that these people had in common was patience—not something that you would normally associate with a hard-charging, successful person. We had a really good chat with Hélio Castroneves, the Indy 500 race car driver. When he was a young boy, his father got him into go-karting. He would get in there and he’d feel like he’d have to lead every lap and go as fast as he could and get to the end. His father kept saying, “Use your head.” By that, he meant, “You’ve got the passion and you’ve got the ambition, but temper that by knowing when to make the right move.” So, in one particular race, he literally held back and let another kart go in front of him so he could use all the energy that he had for that very last lap. Boom, he won the race. It was a wake-up call for him that he didn’t have to win every lap. 

Smithsonian.com recently interviewed a psychologist who argued that successful people often benefit from psychopathic tendencies. Did you detect any psychopaths among your subjects?

Well, I’m not a scientist. But I think what is interesting is [how psychopaths] manage emotions. Being really skillful at managing your emotions means you’re able to separate yourself and examine those emotions, feel them when they’re about to occur, and create a path for them to happen but not derail you. These people that I talked with, they’re really skilled at using their emotions. They’re able to use their frustration and their anger to propel them, to fuel action.

One thing that seemed conspicuously absent from your list was natural talent. How important do you think that is to success?

I think it is important, but I think you could have a really talented artist who never picks up a pen and draws. Certainly, the people that we talked to showed talent early on. But I think it’s what you do with that talent that makes all the difference. One of my favorite interviews was with Jessica Watson, the teenager who circumnavigated the globe alone [in a sailboat] in 2010. It was an idea she had when she was 11. She had no sailing background. There was no talent that she was pursuing. But at 11, Jessica got this idea that she could do it. So, her real talent became holding onto that dream. 

Are there any downsides to being a superachiever? Did these people have to make sacrifices to reach their goals?

I think one of the things with superachievers is that they’re very single-minded, very focused. They shape their life around their dreams or their goals, rather than the other way around. But to me, as long as you’re keeping the goal in mind and recognizing all of the sacrifices that goal is going to take, then I wouldn’t say there’s a downside.

Even if we aren’t superachievers, can regular people use these techniques and strategies in our own lives?

Absolutely. There is a process of doing everything. Superachievement may seem like this impenetrable block of success, this almost intimidating concept. But when you break it down into very small things, or patterns to the way somebody does something, you can grab it and absorb it right into your life. There is this exciting opportunity for people to start seeing the world through this different lens, whether you’re looking at the people we chose or people in your life.

You met so many people for this project—who was the most fun to interview?

Philippe Petit, the high-wire artist who walked between the World Trade Center towers. He’s full of anger and bravado. He has ideas about how you have to go straight into chaos in order to create art, risking his life by being up on the high wire. He has a lot of interesting techniques and strategies. One is he goes rock-jumping in riverbeds. If it’s slippery and mossy, he could fall and hit his head, so every time he moves to the next rock, he has a whole process of decision-making that he has to do very, very quickly.

There’s a lot of good advice in this book, but that’s probably one thing we shouldn’t try at home.

Exactly. No!

Why Are Fewer People Majoring in History?

Smithsonian Magazine

The Great Recession reshaped the United States in a number of ways, but a new analysis suggests it was strong enough to even impact the past. Writing for the American Historical Association's blog Perspectives on History, Northeastern University’s Benjamin M. Schmidt crunched the numbers and found that since the financial crisis hit in 2008, the number of history majors at colleges and universities has dropped by more than 30 percent.

According to statistics from the National Center for Education Statistics, there were 34,642 history majors in 2008. Fast-forward to 2017, and the count was 24,266, a decline of roughly 30 percent. Most of that decline occurred after 2012, with a notable single-year drop of more than 1,500 majors between 2016 and 2017.

Schmidt points out that the history major has had low points before. The discipline weathered a significant decline from 1969 to 1985, when the number of majors dropped by 66 percent. However, those numbers were linked to higher education’s boom in the ’60s, which saw the discipline’s rapid expansion and subsequent bust when higher education growth slowed in the ’70s.

The exodus from history this time around is especially pronounced at private, not-for-profit institutions. While all demographic groups are impacted, the largest drops in the field have been seen among Asian-Americans and women, according to Schmidt, who notes that, among other limitations, the Department of Education's methodology accounts only for binary gender in its polling questions.

History isn’t the only discipline in the humanities hemorrhaging undergrads. English, foreign languages, philosophy and anthropology are among those that have seen big drops as well. But the new analysis shows that since the 2008 Recession, history has suffered the steepest decline.

“One thing I learned earning a history degree is that people usually announce a ‘crisis’ so they can trot out solutions they came up with years earlier,” Schmidt wrote in an article sounding the alarm in the Atlantic this summer. “I don’t have any right now. But the drop in majors since 2008 has been so intense that I now think there is, in the only meaningful sense of the word, a crisis.”

So why are students avoiding majoring in our shared past? Schmidt tells Emma Pettit at the Chronicle of Higher Education that post-recession, the trend is for students to pursue majors that appear to have higher job prospects rather than follow their academic interests. “Students and their parents seem to be thinking a lot more that they need to major in something practical, [something that is] likely to get them a job at the back end,” he says. The emphasis on STEM (Science, Technology, Engineering and Mathematics) education, he adds, has also led more students away from majoring in the humanities, in hopes of graduating with a degree that will land them a more lucrative job.

But that anxiety around job prospects from a humanities education isn’t necessarily rooted in reality. While students and those who help them make decisions about their education may believe that humanities degrees don’t lead to good jobs (thanks, Garrison Keillor), the American Community Survey (ACS), which has been conducted by the US Census Bureau annually since 2000, reflects a more nuanced picture of graduates. As Paul B. Sturtevant detailed for AHA’s Perspectives in 2017, the ACS's statistical survey of 3.5 million American households “suggest[s] that the picture for history majors is far brighter than critics of the humanities would have you believe, even those who think the sole purpose of a college degree is to achieve a well-paying job.”

In an interview with Pettit of the Chronicle of Higher Education, Schmidt also points out another, more hopeful, reason for declines in the major: Smaller cross-disciplinary majors like African-American studies and women’s and gender studies are also attracting students, who may have previously opted to major in history. These majors give students a specialized lens into their area of study and offer the promise of more personal attention and opportunities than larger programs. “These more-traditional majors are just becoming less and less central to higher education as time goes on and as newer, cross-disciplinary programs become more accessible at a wider variety of schools,” he says.

So what’s to be done to take the history major back to the future? The first step might be demythologizing what it means to major in history. The AHA Tuning Project, for one, is working to “articulate the disciplinary core of historical study and to define what a student should understand and be able to do at the completion of a history degree program,” and will be holding a session at the 2019 annual conference to give undergraduate advisors more tools to counsel students on the opportunities a history degree presents.

For now, as Colleen Flaherty at Inside Higher Ed reports, at least one university is bucking the trend. For the class of 2019, history is the most popular major at Yale University, after a major slump in the 2000s. Alan Mikhail, incoming chair of the history department at Yale, says that the discipline’s success is no accident. The program has actively recruited students, hired new faculty members in areas of growing interest and rejiggered the major to make it a more linear course of study, more akin to the way students move through STEM fields. “One important thing that came out from our conversations with students when we were considering changes was that the major lacked coherence or a logical path,” he says. “Students are [now] with each other in classes in all four years, work on the same problem sets, and build camaraderie."

Observing the data, Schmidt says the worst drops in the history major may be over. "It is reasonable to hope that the trends of the last decade will eventually bottom out, perhaps even in the next year or two," he writes. Mikhail, for his part, believes that, at least at Yale, the current historical moment may bring more students back into the historical fold. He points out that economic and political models failed to predict the turning points of the last two decades, including 9/11 and its aftermath, the economic crisis and the 2016 election. Instead of relying on models and algorithms, he argues, society is learning that it needs more people with a critical eye, long-term perspective and familiarity with the nuance and messiness of the past to help guide us into the future. In other words, historians.

Why Abraham Lincoln Was Revered in Mexico

Smithsonian Magazine

American historian Michael Hogan makes a bold claim. He says that Abraham Lincoln is in no small part responsible for the United States being blessed for many generations with an essentially friendly nation to the south—this despite a history that includes the United States’ annexation and conquest of Mexican territory from Texas to California in the 1840s, and the nations’ chronic border and immigration tensions. “Lincoln is revered in Mexico,” Hogan says. As evidence, he points to the commemorative statues of Lincoln in four major Mexican cities. The one in Tijuana towers over the city's grand boulevard, Paseo de los Héroes, while Mexico City's Parque Lincoln features a replica of sculptor Augustus Saint-Gaudens' much-admired Standing Lincoln, identical to the one in London's Parliament Square. (The original stands in Lincoln Park in Chicago.) These are commanding monuments, especially for a foreign leader.

In his 2016 study, Abraham Lincoln and Mexico: A History of Courage, Intrigue and Unlikely Friendships, Hogan points to several factors that elevated the United States’ 16th president in the eyes of Mexicans, in particular Lincoln’s courageous stand in Congress against the Mexican War, and his later support in the 1860s for democratic reformist Benito Juárez, who has at times been called the “Abraham Lincoln of Mexico.” Lincoln’s stature as a force for political equality and economic opportunity—and his opposition to slavery, which Mexico had abolished in 1829—made the American leader a sympathetic figure to the progressive followers of Juárez, who was inaugurated as president of Mexico in the same month and year, March 1861, as Lincoln.

“Both were born very poor, pulled themselves up by their bootstraps, became lawyers, and ultimately reached the highest office of their countries,” says Hogan in a telephone interview from Guadalajara, where he has lived for more than a quarter-century. “Both worked for the freedom of oppressed peoples—Lincoln demolishing slavery while Juárez helped raise Mexican workers out of agrarian peonage.” (In a lighter vein, Hogan points out that physically, they were opposites: While the gangly Lincoln stood six-foot-four, Juárez reversed those numbers, at a stocky four-foot-six.)

Early on in Lincoln’s political career, as a freshman Whig congressman from Illinois, he condemned the 1846 U.S. invasion of Mexico, bucking the prevailing patriotic tide and accusing President James K. Polk of promoting a falsehood to justify war. After a skirmish of troops in an area of what is now south Texas, but was then disputed territory, Polk declared that "American blood has been shed on American soil” and that therefore “a state of war” existed with Mexico. “Show me the spot where American blood was shed,” Lincoln famously challenged, introducing the first of eight “Spot resolutions” questioning the constitutionality of the war. Lincoln’s stand proved unpopular with his constituents—he became known as “Spotty Lincoln”—and he did not seek re-election.

He was not alone in his protest, however. Among others, New Englanders such as John Quincy Adams and Henry David Thoreau, who wrote his famed essay “Civil Disobedience” in reaction to the war, also dissented. Ulysses S. Grant, who distinguished himself as an officer serving in Mexico, later wrote in his memoirs that it had been “the most unjust war ever waged against a weaker nation by a stronger.”

In seizing more than half of Mexico’s territory as the spoils of war, the U.S. increased its territory by more than 750,000 square miles, which accelerated tensions over the expansion of slavery that culminated in the carnage of the American Civil War. Hogan believes strongly that the long-term economic impact on Mexico should inform thinking about border politics and immigration today: “We conveniently forget that the causes of northward migration have their origins,” he writes, “in the seizure of Mexico’s main ports to the west (San Diego, San Francisco, Los Angeles), the loss of the rich silver mines of Nevada, the gold and fertile lands of California, and the mighty rivers and lakes which provide clean water to the entire southwest.”

In the course of researching his Lincoln book, Hogan made an important discovery in the archives of the Banco Nacional de México: the journals of Matías Romero, a future Mexican Treasury Secretary, who, as a young diplomat before and during the American Civil War, represented the Juárez government in Washington.

Romero had written a congratulatory letter to Lincoln after the 1860 election; the president-elect cordially thanked him, replying: “While, as yet I can do no official act on behalf of the United States, as one of its citizens I tender the expression of my sincere wishes for the happiness, prosperity and liberty of yourself, your government, and its people.”

Those fine hopes were about to be tested as never before, in both countries.

During its own civil war of the late 1850s, Mexico had accrued significant foreign debt, which the French Emperor Napoleon III ultimately used as a pretext to expand his colonial empire, installing an Austrian archduke, Ferdinand Maximilian, as Emperor Maximilian I of Mexico in 1863. The United States did not recognize the French regime in Mexico, but with the Civil War raging, remained officially neutral in the hope that France would not recognize or aid the Confederacy.

Nevertheless, the resourceful Romero, then in his mid-20s, found ways to secure American aid in spite of official policy, mainly by establishing a personal relationship with President Lincoln and the First Lady, Mary Todd Lincoln. From there, Romero was able to befriend Union generals Grant and Philip Sheridan, connections that would later prove crucial to the Mexican struggle. “What particularly endeared Romero to the American president,” Hogan notes, “was that he escorted Mrs. Lincoln on her frequent shopping trips…with good-natured grace. It was a duty which Lincoln was happy to relinquish.”

With Lincoln’s earlier letter in hand, Romero made the rounds with American bankers in San Francisco, New York and Boston, Hogan says, selling bonds that raised $18 million to fund the Mexican army. “They bought cannon, uniforms, shoes, food, salaries for the men, all kinds of things,” he says. “And Grant later helped them secure even better weapons—Springfield rifles. He would go to the Springfield people and say, ‘Get them some decent rifles. I don’t want them fighting the French with the old-fashioned ones.’”

After the Civil War, the U.S. became even more helpful in the fight for Mexican liberation. In a show of support, Grant dispatched 50,000 men to the Texas border under General Sheridan, instructing him to covertly “lose” 30,000 rifles where they could be miraculously “found” by the Mexicans. Sheridan’s forces included several regiments of seasoned African-American troops, many of whom went on to fight in the Indian Wars, where they were nicknamed the Buffalo Soldiers.

By 1867, the French had withdrawn their occupying army; the Juárez forces captured and executed Maximilian, and the Mexican Republic was restored. Though Lincoln didn’t live to see it, his Mexican counterpart had also triumphed in a war for the survival of his nation. “Lincoln really loved the Mexican people and he saw the future as us being allied in cultural ways, and also in business ways,” Hogan reflects. “He supported the growth of the railroads in Mexico, as did Grant, who was a big investor in the railroads, and he saw us as being much more united than we are.”

Though most of this history has receded in the national memories of both countries, Hogan believes that Lincoln’s principled leadership and friendship—outspoken in the 1840s, tacit in the 1860s—created a pathway for mutually respectful relations well into the future.

Why 150,000 Sculptures in the U.K. Are Being Digitized

Smithsonian Magazine

Statues and figures of humans or animals, busts and heads, abstract works, religious or devotional objects, figurative memorials and tombs, detached and figurative architectural features, assemblage sculptures, preparatory works and maquettes will be digitized in an ambitious campaign to catalogue all of the United Kingdom’s public sculptures—yes, all of them.

In total, Martin Bailey of the Art Newspaper reports, that’s 150,000 entries, including 20,000 works displayed within museums and buildings and 130,000 or so found outdoors.

The initiative marks Art U.K.’s second foray into the world of mass digitization. Between 2003 and 2012, the nonprofit organization, which stems from the Public Catalogue Foundation charity, chronicled, photographed and digitized 212,000 of the country’s public oil paintings. This time around, as the organization turns its eye on statues, the digitization process is expected to be far faster, with a projected finish line of late 2020, according to the Guardian’s Mark Brown.

An initial crop of 1,000 works, including Auguste Rodin’s bronze cast of biblical first woman Eve, Elisabeth Frink’s water-dwelling “Boar” and Bruce Williams’ towering aluminum panel of six couples kissing, was published last week.

Art U.K.’s Katey Goodwin and Lydia Figes define the parameters for sculptural works included in the project in a blog post. “[F]or the sake of making this a manageable and cost-effective project, we had to be selective and choose which types of three-dimensional art to include–and what not to include,” they write. Decorative and “functional” objects, as well as antiquities crafted prior to 1000 A.D., are among the works that won’t make the cut. Pieces brought to Britain from other countries—Bailey highlights a 15th-century Nigerian Benin bronze head—will be included.

Auguste Rodin, "Eve," 1882 (Tracy Jenkins/Art U.K.)

The most prominent sculpture currently listed on the database is likely Rodin’s “Eve,” an 1882 statue that now stands outside of a Nando’s in the English county of Essex. The French sculptor originally designed “Eve” for a “Gates of Hell” commission he spent nearly 40 years crafting. At the time of Rodin’s death, the monumental work remained unfinished. “Eve” eventually ended up in Paris’ Musée Rodin; in 1959, a British art curator convinced the museum to part with the cast, which he then moved to the Essex hamlet of Harlow.

Other entries of interest include abstract sculptor Barbara Hepworth’s hand-carved “Contrapuntal Forms,” Bernard Schottlander’s undulating steel “Calypso,” and a trio of seated Buddha figures dating to the 1800s. The full catalogue of works is available via Art U.K.’s website.

According to a press release, one of the campaign's aims is to promote critical discussion of specific sculptural works. Potential lines of inquiry include why the database features so few sculptures of women and what is being done to redress this balance, how to discuss Britain’s legacy of slavery and colonialism when sculptures commemorate those who profited from them, and what sculpture can reveal about a post-Brexit Britain.

There’s also the larger question of the artistic merits of the medium overall. “Most people, when they think about art, will probably think about paintings rather than sculpture, and that’s slightly odd because we walk past sculptures and public monuments all the time,” Art U.K. director Andrew Ellis says in an interview with Apollo’s Florence Hallett.

The debate over which medium reigns supreme goes way back, and it is perhaps best characterized by the so-called paragone argument, which found Renaissance Old Masters such as Titian, Jan van Eyck and Petrus Christus vouching for painting with as much fervor as sculptors like Donatello and Ghiberti argued for sculpture’s superiority, according to Oxford Art Online.

While Goodwin and Figes argue that sculpture has long been treated as an “afterthought to painting,” the burgeoning Art U.K. database may be able to add some nuance to that conversation, showcasing the diverse forms of expression afforded by the medium—from realistic busts of historical figures to streamlined abstractions to eclectic works you might not even register at first glance as sculpture.

Who's Better at Pokémon, Anarchist Twitch Players or a Betta Fish?

Smithsonian Magazine

Who's better at playing the hit 1996 video game Pokémon Red: an anarchic collective of video game enthusiasts, or a betta fish? Thanks to an enterprising coder and Grayson Hopper—a fish—we may soon find out.

Back in February of this year an anonymous Australian programmer combined the video game streaming site Twitch, a computer version of Pokémon Red and a crowd control scheme to spawn the unwieldy beast that is Twitch Plays Pokémon—a live, massive online version of the video game in which hundreds of thousands of players coordinated (or, you might say, competed) to control a single video game character. Hundreds of thousands of people mashing up, down, left, right—seldom with any degree of cooperation—struggling to become a Pokémon master. That's contestant number one.
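The mechanics behind that chaos are simple to sketch. What follows is a minimal, hypothetical Python illustration of such a crowd-control loop in its free-for-all form, where every valid chat command is forwarded to the game in arrival order; the command set and the press() stand-in are assumptions for illustration, not the stream's actual code.

    # Hypothetical sketch of an "anarchy"-style crowd-control loop.
    VALID_COMMANDS = {"up", "down", "left", "right", "a", "b", "start", "select"}

    def press(button: str) -> None:
        """Stand-in for whatever actually feeds a button press to the emulator."""
        print(f"pressing {button}")

    def handle_chat_message(message: str) -> None:
        """Forward a chat line to the game if it is a recognized command."""
        command = message.strip().lower()
        if command in VALID_COMMANDS:  # anything else is simply ignored
            press(command)

    # A simulated burst of chat: conflicting inputs all get executed as-is.
    for line in ["up", "UP", "left", "Kappa", "right", "a"]:
        handle_chat_message(line)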

Contestant number two is Grayson Hopper, a betta fish that has for the past few days been controlling the game by swimming around its tank. Grayson swims to the top left of the tank? Grayson's character goes left. Swims right? Goes right. Grayson, as you can imagine, is about as useful as a Magikarp. Yet, surprisingly, Grayson has managed to get quite a lot done in just a few short days, says Wired UK.
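Grayson's rig can be sketched the same way: overlay a grid on the camera frame and press whichever button corresponds to the cell the fish currently occupies. The 3x3 layout and button assignments below are illustrative guesses, not the stream's actual configuration.

    # Hypothetical mapping from the fish's tracked position to a button press.
    BUTTON_GRID = [
        ["left", "up",   "right"],   # top row of the tank
        ["left", "a",    "right"],   # middle row
        ["b",    "down", "start"],   # bottom row
    ]

    def button_for_position(x: float, y: float, width: int, height: int) -> str:
        """Map the fish's pixel position in the camera frame to a grid cell."""
        col = min(int(3 * x / width), 2)
        row = min(int(3 * y / height), 2)
        return BUTTON_GRID[row][col]

    # A fish tracked near the top left of a 320x240 frame presses "left".
    print(button_for_position(40, 30, 320, 240))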

In the 125 hours he's been going so far, he has successfully chosen his first starter Pokémon: a Charmander (excellent choice), named it "AAAABBK" and defeated his first opponent, the rival's Squirtle. Not bad for a fish.

For much of this morning Grayson has been stuck in what appears to be the character's starting house, as thousands of people watch on in anticipation.

After 16 days Twitch Plays Pokémon eventually cleared the game's main quest, defeating the Elite Four and the players' main rival. It's been nearly a week so far and Grayson has yet to leave Pallet Town. Anarchy seems to be more useful than a fish, so far.

Who Needs a Boss When You Have Your Co-Workers?

Smithsonian Magazine

Steven Johnson is optimistic about the future. But, in order to ensure progress going forward, he insists that we harness the power of the peer network.

In his new book, Future Perfect, Johnson highlights the success of collaborative efforts such as Wikipedia and Kickstarter and advises us to use similar decentralized networks of people to help solve problems in the coming years. He calls his worldview “peer progressivism.”

What is flawed about the way we, as a society, think about progress?

We are strangely biased, as individuals and media institutions, to focus on big sudden changes, whether good or bad—amazing breakthroughs, such as a new gadget that gets released, or catastrophic failures, like a plane crash. We tend to not have a lot of interest in stories of incremental progress, where every year something gets one percent better or even a fraction of one percent better.

There has been an amazing drop in crime in the United States over the last 20 years. Divorce rates—everybody always talks about how 50 percent of marriages end in divorce. Well, that was true in 1979. It is no longer true. People are much less likely to divorce now. Drug use is down. Teenage pregnancy is down. School dropout rates are down. There is a long list of indices of social health that have improved over the last 20 years. You just don’t hear about it.

One of the key things that progress is made of is this slow-but-steady progress, and it is not necessarily coming from innovations of the marketplace. It is not Apple that is causing smoking to decline at the incredible rate that it has over the last 20 or 30 years. It is a broad network of people—some of them working for government agencies, some of them just by word of mouth, some of them philanthropic organizations—that are kind of spreading the word and getting people to give up this dangerous habit.

We need to be celebrating this type of progress because it is good news, and it is nice to have good news, but also because it helps us to understand how we can do more of it.

In the book, you say that the public’s response to the Miracle on the Hudson encapsulates everything that is wrong with our outlook. How so? 

It is extraordinary how safe flying has become. You are now statistically more likely to be elected president of the United States in your lifetime than you are to die in a plane crash. What an amazing achievement as a society! But what we end up focusing on are the catastrophic failures that are incredibly rare but happen every now and then. 

Even when we have a story like the “Miracle on the Hudson,” where the plane crashes but everyone survives, we point to the superhero of Captain Sully. He was an amazing pilot and did an amazing job in landing that plane, but he was only part of that story. The other key part of that story was the way that the plane performed in that situation.

The engines didn’t fail catastrophically by sending shards of titanium into the fuselage and blowing up the plane; they survived to give enough power to the electronic system. This enabled the Airbus to keep its fly-by-wire system intact, giving Sully all of this really crucial assistance in pulling the plane down at the right rate of descent. Those systems were the combined knowledge of thousands of people, some of them working for the private sector but many of them actually working in government agencies and in NASA, that set up both the technology and the engineering that made it possible for that landing to happen.

As a society, we are like, “Look at the Superman!” or “It’s a miracle!” In fact, it precisely wasn’t a miracle. It was this long, collaborative network of ideas being shared and improved upon that built that system and enabled that plane to survive. If we don’t figure out a way to champion those network successes then we are also missing an important part of the story.

Believing in the peer network is a political orientation, as far as you see it, right? 

Yeah. Here is this emerging political philosophy that doesn’t readily fit the existing categories that we have. The cliché of the left is that it believes in the power of the state and the government to provide platforms and safety nets for society, and the cliché of the right is that it just believes in the marketplace and wants the government to get out of everybody’s way. But if you actually believe in this other thing, the power of the peer network to solve problems, it is hard to figure out which camp you are supposed to belong to. I decided to write this book to attempt to formalize this belief system that I am seeing around me and to give it a name.

What makes a peer network better able to solve our problems than a hierarchy?

Organizations that empower folks further down the chain or try to get rid of the big hierarchical chains and allow decision making to happen on a more local level end up being more adaptive and resilient because there are more minds involved in the problem.

In a peer network, no one is officially in charge. It doesn’t have a command hierarchy. It doesn’t have a boss. So, all the decisions are somehow made collectively. The control of the system is in the hands of everyone who is a part of it. They are modeled, in many cases, on the success of the Internet, the web and Wikipedia, all of which are peer networks in their architecture.

You want to have diverse perspectives in the network. And there has to be some kind of mechanism, when ideas are shared through the network, for the good ideas to be amplified and for the bad ideas to be weeded out.

[The Web site] Kickstarter, for instance, is a great example of a peer network supporting creative arts with “crowdfunding” techniques. One of the key things about Kickstarter is that less than 50 percent of the projects get funded. That is a sign that it is working, because not every project deserves to be funded. There is a selection pressure there of individuals voting for certain things with their financial support. Good ideas rise to the top and get funding, and ideas that aren’t as good don’t survive.

You advocate that we should be building more of these networks. Where? In what areas?

One mechanism is the idea of prize-backed challenges, where a wealthy person or the government creates some kind of prize for solving a problem that for whatever reason the market and the state aren’t solving on their own. There is a long tradition of prizes being a big driver of breakthroughs in science and technology. The Royal Society in the United Kingdom started these prizes, which they called “premiums,” and they drove a lot of breakthroughs in the age of the Enlightenment. What they do is create market-like incentives for a much more distributed, diverse network of people to apply their talents, minds and ingenuity to solve a problem.

There is a great opportunity to use these kinds of mechanisms in healthcare. In my book, I talk a little bit about creating these big billion-dollar prizes for breakthroughs in various forms of prescription drugs. As long as you agree, once you have come up with this drug, to release it effectively open source and allow generics to be produced at much lower cost, we will give you $2 billion for your breakthrough. You end up then taking those ideas and getting them into circulation much more quickly, so that other people can improve them, because there is not a patent on the invention. Those kinds of mechanisms, I think, could be a great force for good in the world.

Is there low-hanging fruit? What is a problem that you think could be solved immediately, if only a peer network were created to address it?

One of the problems we have with the way that elections are funded these days is that a very small number of people are having a disproportionate impact on the system. A tiny percentage of the population is contributing a huge amount of the money to these campaigns. That is a betrayal of democratic values, and also of peer progressive values, in the sense that you want to have a diverse and decentralized group of people who are funding the system.

The wonderful solution to this, though it will be very hard to implement, is this idea of democracy vouchers, which Larry Lessig and a few other folks have come up with. The idea is that registered voters get $50 of the money they would pay in taxes anyway, which they can spend on supporting a candidate or a party. They can match that with $100 of their own money if they want. If you were a candidate and you said, “Hey, I would like to have access to that money,” you would have to renounce all other forms of financial support. There would be so much money in that system that it would be hard to say no to it. That would instantly take this very undemocratic process, where one percent of the population is funding most of these campaigns, and turn it into a much more participatory system.

This interview series focuses on big thinkers. Without knowing whom I will interview next, only that he or she will be a big thinker in their field, what question do you have for my next interview subject?

When you look back on all your big thoughts that you have had over your career, what is the biggest thing that you missed? What is the thing that in all your observations about the world you now realize was a total blind spot that you should have figured out 10 years before it suddenly surprised you? What was the biggest hole in your thinking?

From my last interviewee, Hanna Rosin, author of The End of Men: Can women fit the genius mold? Can you imagine a female Bill Gates, someone who works outside the institution, drops out of work, completely follows her own rhythm? That is the kind of woman that seems next on the landscape. And can that be a woman? 

Yeah. One thing we know about unusually innovative people and creative thinkers is that they are very good at connecting disciplines. They are very good at seeing links from different fields and bringing them together, or borrowing an idea from one field and importing it over. That is often where a great breakthrough comes from. It doesn’t come from an isolated genius trying to have a big thought.

I think that there is a lot of evidence that that kind of associative thinking is something that for whatever reason, whether it is cultural or biological—I suspect it is probably a combination of both—women, on average, are better at than men. They are able to make those connective leaps better than men can. If we create cultural institutions that allow women with those talents to thrive, I think you are going to see a lot of Wilhelma Gates in the future.

Who Are the Real Hollywood Figures Behind 'Hail, Caesar!'?

Smithsonian Magazine

On its surface, the critically lauded Coen brothers movie Hail, Caesar! is a fantastical retro caper comedy (with musical numbers!) featuring a star-packed ensemble cast. On another level, it’s a meta-meditation on Hollywood and the dirty work that goes into the glossy final product. The biggest whitewash is splashed over the protagonist, Capitol Pictures’ fixer Eddie Mannix, based on a real-life MGM executive with the same name, but with an important difference. While Josh Brolin's tightly wound but decent Mannix is played for laughs, the real Eddie Mannix wasn’t funny at all.

According to The Fixers, a scrupulously researched 2005 book by E. J. Fleming, a short but far-from-comprehensive list of Mannix’s misdeeds includes wife beating and philandering. He injured a girlfriend, a young dancer named Mary Nolan, so badly she needed surgery to recover. When Nolan had the audacity to sue him, Mannix leveraged corrupt policemen to threaten her with trumped-up drug charges. Mannix and other studio brass tampered with the evidence at the scene of the 1932 death of Jean Harlow’s husband, producer Paul Bern, to make it look like suicide, because murder would raise too many questions, among them the inconvenient fact that Bern was still married to another woman.

“At face value, Eddie was a nice guy,” Fleming says. For the book, he interviewed scores of Hollywood old-timers, including Jack Larson, who played Jimmy Olsen in the 1950s television series The Adventures of Superman. Larson told Fleming he loved Eddie. “That being said,” Fleming says, “[Mannix] was a d***.”

Among his more infamous fixes: It is believed that Mannix tracked down and bought the film negative of a porno movie made by young dancer Billie Cassin, before she became Joan Crawford.

Hail, Caesar! follows the milder, fictional Mannix on a busy day and night in 1951 as he sorts out all manner of troubles involving a dizzying array of stars and movie genres: he brainstorms solutions to the inconvenient out-of-wedlock pregnancy of an Esther Williams-ish star (Scarlett Johansson). Hail, Caesar!’s Mannix also deals with the kidnapping of Baird Whitlock (George Clooney), the star of an epic (and epically expensive) biblical story who is being held for ransom by a group of money-hungry communist writers called “The Future.”

The characters are all inspired by real stars of the era: George Clooney is the handsome, blotto actor who could be a Charlton Heston/Richard Burton hybrid, but (aside from the alcoholism) mostly he seems to be playing a cartoonish version of himself, a handsome, charismatic star with a natural facility with leftwing politics. Tilda Swinton plays waspish identical twin sisters who are competing gossip columnists torn from the Hedda Hopper/Louella Parsons page, and Channing Tatum plays a talented hoofer who kills it as a dancing sailor, a la Gene Kelly. Capitol Pictures (also the company in the Coens’ 1991 Barton Fink) stands in for MGM.

As he runs from crisis to crisis, Brolin’s Mannix relieves stress by going to confession and smacking a couple of people.

The real Mannix was an Irish Catholic New Jersey tough who made his bones as a bouncer at East Coast amusement parks owned by brothers Nicholas and Joseph Schenck. Mannix followed Nicholas Schenck to Loew’s, a company expanding its entertainment offerings into the brand-new business of motion pictures. When Loew’s formed MGM in a 1924 merger, Schenck sent Mannix west to be his eyes and ears. Mannix arrived in a Hollywood still making silent pictures and began working as a comptroller and assistant to star producer Irving Thalberg.

At the studio, Mannix met Howard Strickling, a young assistant publicist. According to Fleming, within a year of arriving, both Strickling and Mannix were part of MGM’s inner circle; specifically, they were known as “The Fixers.” During Mannix’s career, which stretched into the 1950s, MGM made scores of classic films and shorts, everything from The Thin Man movies with William Powell and Myrna Loy, to Gone With the Wind, The Wizard of Oz and later classic musicals like Show Boat and Singin’ in the Rain. Under the old studio system, actors signed contracts and worked exclusively for one studio. Among MGM’s legendary stable were Greta Garbo, William Haines, Robert Montgomery, Judy Garland, Mickey Rooney and Clark Gable.

The two were micromanaging control freaks. They compiled reports on their stars from studio drivers, waiters and janitors. They read private telegrams coming in and out of the studio and bribed police officers. They manipulated and hid information, going to great lengths to benefit the studio, including helping arrange heterosexual dates and even sham marriages for gay actors. For instance, Fleming cites a studio-fabricated affair between Myrna Loy and closeted actor Ramon Novarro. The author says Loy first learned of her love for Novarro by reading about it in the Los Angeles Times. Star William Haines, who went on to become a lauded interior decorator, was let go when he would not drop his boyfriend Jimmie Shields.

Under Strickling and Mannix, the studio made problems disappear. Clark Gable kept the pair very busy. They were either telling papers he had been hospitalized for stomach problems when he was instead having his teeth replaced by less-charming dentures, or cleaning up car wrecks, including one in which Gable may have killed a pedestrian. Actress Loretta Young became pregnant after an encounter with Gable during the filming of 1935’s Call of the Wild (Young later called the incident rape). Mannix and Strickling helped hide Young from view during her pregnancy and then arranged for her to “adopt” her own child, just as Johansson’s character does in Hail, Caesar!.

“Gable loved Eddie,” says Fleming. “He was like Eddie. He wasn’t very educated, he was a hard working guy, but he was completely amoral.”

Like Lindsay Lohan or Charlie Sheen, the stars of Hollywood’s golden age were just as trouble-prone, but society was less forgiving. “They were going to get in trouble and when they did Eddie Mannix helped get them out of it. They got in trouble and he fixed it.” Fleming says the stars seemed to appreciate that Mannix solved their problems and moved on. “You don’t get the impression from people who knew Eddie that he gave them shit for it.” Instead he made the case that they owed MGM their loyalty.

But Mannix’s dizzying list of suspected crimes goes beyond helping others and includes the mysterious death of his first wife, Bernice, who died in a car crash outside of Las Vegas while trying to divorce him. Fleming says there is no way of knowing if Mannix was responsible, but “she divorced him for the affairs, the affairs were part of the divorce filing. He wouldn’t have been happy about that going public.”

His second wife, Toni, was the source of more controversy. She had had an affair with George Reeves of Superman fame. When Reeves died of a gunshot wound in 1959, many thought Mannix was involved. Though nothing was ever proven, Fleming believes Reeves’s newest girlfriend, society girl Leonore Lemmon, was responsible. (The 2006 movie Hollywoodland takes that theory and runs with it.)

Personal scandal aside, Mannix’s and MGM’s fortunes faded together in the '50s. In United States v. Paramount Pictures Inc., the Supreme Court dealt a blow to the profits of big studios like MGM by breaking up their monopoly ownership of theater chains and the distribution of films to independent theaters. Likewise, actors and directors asserted their independence, asking for a percentage of profits, often in lieu of a salary. Television came on the scene, presenting a competing outlet for Americans’ attention. After years of ill health, Mannix died in 1963.

But in Hail, Caesar!’s 1951, while all of these forces are being felt, the studio and its fixer Eddie Mannix are going full tilt, in a satirized Coen brothers universe where the art of moviemaking is simultaneously dirty and beautiful, but nonetheless meaningful. It all goes to show that the Coens have a great reverence for movies, past and present.

White Room

National Air and Space Museum
Pencil drawing on paper depicting a back-up capsule in a white room.

The spring of 1962 was a busy time for the men and women of the National Aeronautics and Space Administration. On February 20, John H. Glenn became the first American to orbit the earth. For the first time since the launch of Sputnik 1 on October 4, 1957, the U.S. was positioned to match and exceed Soviet achievements in space. NASA was an agency with a mission -- to meet President John F. Kennedy's challenge of sending human beings to the moon and returning them safely to earth by the end of the decade. Within a year, three more Mercury astronauts would fly into orbit. Plans were falling into place for a follow-on series of two-man Gemini missions that would set the stage for the Apollo voyages to the moon.

In early March 1962, artist Bruce Stevenson brought his large portrait of Alan Shepard, the first American to fly in space, to NASA headquarters.(1) James E. Webb, the administrator of NASA, assumed that the artist was interested in painting a similar portrait of all seven of the Mercury astronauts. Instead, Webb voiced his preference for a group portrait that would emphasize "…the team effort and the togetherness that has characterized the first group of astronauts to be trained by this nation." More important, the episode convinced the administrator that "…we should consider in a deliberate way just what NASA should do in the field of fine arts to commemorate the …historic events" of the American space program.(2)

In addition to portraits, Webb wanted to encourage artists to capture the excitement and deeper meaning of space flight. He imagined "a nighttime scene showing the great amount of activity involved in the preparation of and countdown for launching," as well as paintings that portrayed activities in space. "The important thing," he concluded, "is to develop a policy on how we intend to treat this matter now and in the next several years and then to get down to the specifics of how we intend to implement this policy…." The first step, he suggested, was to consult with experts in the field, including the director of the National Gallery of Art, and the members of the Fine Arts Commission, the arbiters of architectural and artistic taste who passed judgment on the appearance of official buildings and monuments in the nation's capital.

Webb's memo of March 16, 1962 was the birth certificate of the NASA art program. Shelby Thompson, the director of the agency's Office of Educational Programs and Services, assigned James Dean, a young artist working as a special assistant in his office, to the project. On June 19, 1962 Thompson met with the Fine Arts Commission, requesting advice as to how "…NASA should develop a basis for use of paintings and sculptures to depict significant historical events and other activities in our program."(3)

David E. Finley, the chairman and former director of the National Gallery of Art, applauded the idea, and suggested that the agency should study the experience of the U.S. Air Force, which had amassed some 800 paintings since establishing an art program in 1954. He also introduced Thompson to Hereward Lester Cooke, curator of paintings at the National Gallery of Art.

An imposing bear of a man standing over six feet tall, Lester Cooke was a graduate of Yale and Oxford, with a Princeton PhD. The son of a physics professor and a veteran of the U.S. Army Air Forces, he was fascinated by science and felt a personal connection to flight. On a professional level, Cooke had directed American participation in international art competitions and produced articles and illustrations for the National Geographic Magazine. He jumped at the chance to advise NASA on its art program.

While initially cautious with regard to the time the project might require of one of his chief curators, John Walker, director of the National Gallery, quickly became one of the most vocal supporters of the NASA art initiative. Certain that "the present space exploration effort by the United States will probably rank among the more important events in the history of mankind," Walker believed that "every possible method of documentation …be used." Artists should be expected "…not only to record the physical appearance of the strange new world which space technology is creating, but to edit, select and probe for the inner meaning and emotional impact of events which may change the destiny of our race." He urged quick action so that "the full flavor of the achievement …not be lost," and hoped that "the past held captive" in any paintings resulting from the effort "will prove to future generations that America produced not only scientists and engineers capable of shaping the destiny of our age, but also artists worthy to keep them company."(4)

Gordon Cooper, the last Mercury astronaut to fly, was scheduled to ride an Atlas rocket into orbit on May 15, 1963. That event would provide the ideal occasion for a test run of the plan Cooke and Dean evolved to launch the art program. In mid-February, Cooke provided Thompson with a list of the artists who should be invited to travel to Cape Canaveral to record their impressions of the event. Andrew Wyeth, whom the curator identified as "the top artist in the U.S. today," headed the list. When the time came, however, Andrew Wyeth did not go to the Cape for the Cooper launch, but his son Jamie would participate in the program during the Gemini and Apollo years.

The list of invited artists also included Peter Hurd, Andrew Wyeth's brother-in-law, who had served as a wartime artist with the Army Air Force; George Weymouth, whom Wyeth regarded as "the best of his pupils"; and John McCoy, another Wyeth associate. Cooke regarded the next man on the list, Robert McCall, who had been running the Air Force art program, as "America's top aero-space illustrator." Paul Calle and Robert Shore had both painted for the Air Force program. Mitchell Jamieson, who had run a unit of the Navy art program during WW II, rounded out the program. Alfred Blaustein was the only artist to turn down the invitation.

The procedures that would remain in place for more than a decade were given a trial run in the spring of 1963. The artists received an $800 commission, which had to cover any expenses incurred while visiting a NASA facility where they could paint whatever interested them. In return, they would present their finished pieces, and all of their sketches, to the space agency. The experiment was a success, and what might have been a one-time effort to dispatch artists to witness and record the Gordon Cooper flight provided the basis for an on-going, if small-scale, program. By the end of 1970, Jim Dean and Lester Cooke had dispatched 38 artists to Mercury, Gemini and Apollo launches and to other NASA facilities.

The art program became everything that Jim Webb had hoped it would be. NASA artists produced stunning works of art that documented the agency's step-by-step progress on the way to the moon. The early fruits of the program were presented in a lavishly illustrated book, Eyewitness to Space (New York: Abrams, 1971). Works from the collection illustrated NASA publications and were the basis for educational film strips aimed at school children. In 1965 and again in 1969 the National Gallery of Art mounted two major exhibitions of work from the NASA collection. The USIA sent a selection of NASA paintings overseas, while the Smithsonian Institution Traveling Exhibition Service created two exhibitions of NASA art that toured the nation.

"Since we …began," Dean noted in a reflection on the tenth anniversary of the program, the art initiative had resulted in a long string of positive "press interviews and reports, congressional inquiries, columns in the Congressional Record, [and] White House reports." The NASA effort, he continued, had directly inspired other government art programs. "The Department of the Interior (at least two programs), the Environmental Protection Agency, the Department of the Army and even the Veterans Administration have, or are starting, art programs." While he could not take all of the credit, Dean insisted that "our success has encouraged other agencies to get involved and they have succeeded, too."(5)

For all of that, he noted, it was still necessary to "defend" the role of art in the space agency. Dean, with the assistance of Lester Cooke, had been a one-man show, handling the complex logistics of the program, receiving and cataloguing works of art, hanging them himself in museums or on office walls, and struggling to find adequate storage space. In January 1974, a NASA supervisor went so far as to comment: "Mr. Dean is far too valuable in other areas to spend his time on the relatively menial …jobs he is often burdened with in connection with the art program."(6) Dean placed a much higher value on the art collection, and immediately recommended that NASA officials either devote additional resources to the program, or get out of the art business and turn the existing collection over to the National Air and Space Museum, "where it can be properly cared for."(7)

In January 1974 a new building for the National Air and Space Museum (NASM) was taking shape right across the street from NASA headquarters. Discussions regarding areas of cooperation were already underway between NASA officials and museum director Michael Collins, who had flown to the moon as a member of the Apollo 11 crew. Before the end of the year, the space agency had transferred its art collection to the NASM. Mike Collins succeeded in luring Jim Dean to the museum, as well.

The museum already maintained a small art collection, including portraits of aerospace heroes, an assortment of 18th and 19th century prints illustrating the early history of the balloon, an eclectic assortment of works portraying aspects of the history of aviation and a few recent prizes, including several Norman Rockwell paintings of NASA activity. With the acquisition of the NASA art, the museum was in possession of one of the world's great collections of art exploring aerospace themes. Jim Dean would continue to build the NASM collection as the museum's first curator of art. Following his retirement in 1980, other curators would follow in his footsteps, continuing to strengthen the role of art at the NASM. Over three decades after its arrival, however, the NASA art accession of 2,091 works still constitutes almost half of the NASM art collection.

(1) Stevenson's portrait is now in the collection of the National Air and Space Museum (1981-627)

(2) James E. Webb to Hiden Cox, March 16, 1962, memorandum in the NASA art historical collection, Aeronautics Division, National Air and Space Museum. Webb's preference for a group portrait of the astronauts was apparently not heeded. In the end, Stevenson painted an individual portrait of John Glenn, which is also in the NASM collection (1963-398).

(3) Shelby Thompson, memorandum for the record, July 6, 1962, NASA art historical collection, NASA, Aeronautics Division.

(4) John Walker draft of a talk, March 5, 1965, copy in NASA art historical collection, NASM Aeronautics Division.

(5) James Dean, memorandum for the record, August 6, 1973, NASA art history collection, NASM Aeronautics Division.

(6) Director of Planning and Media Development to Assistant Administrator for Public Affairs, January 24, 1974, NASA art history collection, NASM Aeronautics Division.

(7) James Dean to the Assistant Administrator for Public Affairs, January 24, 1974, copy in NASA art history collection, Aeronautics Division, NASM.

Tom D. Crouch

Senior Curator, Aeronautics

National Air and Space Museum

Smithsonian Institution

July 26, 2007

White Room

National Air and Space Museum
Felt tip pen drawing on paper. White room.

White Room

National Air and Space Museum
White Room. Page from a bound sketchbook. A barrier of a single rope attached to several stands lines the bottom of the page and behind it are several tall conical structures. Text written on the largest structure in the foreground reads: "adapts CSM to LV SLA houses Lunar Module." The structure to the left is labeled "SLA." A person is standing in the lower left.

In March 1962, James Webb, Administrator of the National Aeronautics and Space Administration, suggested that artists be enlisted to document the historic effort to send the first human beings to the moon. John Walker, director of the National Gallery of Art, was among those who applauded the idea, urging that artists be encouraged "…not only to record the physical appearance of the strange new world which space technology is creating, but to edit, select and probe for the inner meaning and emotional impact of events which may change the destiny of our race."

Working together, James Dean, a young artist employed by the NASA Public Affairs office, and Dr. H. Lester Cooke, curator of paintings at the National Gallery of Art, created a program that dispatched artists to NASA facilities with an invitation to paint whatever interested them. The result was an extraordinary collection of works of art proving, as one observer noted, "that America produced not only scientists and engineers capable of shaping the destiny of our age, but also artists worthy to keep them company." Transferred to the National Air and Space Museum in 1975, the NASA art collection remains one of the most important elements of what has become perhaps the world's finest collection of aerospace themed art.

The spring of 1962 was a busy time for the men and women of the National Aeronautics and Space Administration. On February 20, John H. Glenn became the first American to orbit the earth. For the first time since the launch of Sputnik 1 on October 4, 1957, the U.S. was positioned to match and exceed Soviet achievements in space. NASA was an agency with a mission -- to meet President John F. Kennedy's challenge of sending human beings to the moon and returning them safely to earth by the end of the decade. Within a year, three more Mercury astronauts would fly into orbit. Plans were falling into place for a follow-on series of two-man Gemini missions that would set the stage for the Apollo voyages to the moon.

In early March 1962, artist Bruce Stevenson brought his large portrait of Alan Shepard, the first American to fly in space, to NASA headquarters.(1) James E. Webb, the administrator of NASA, assumed that the artist was interested in painting a similar portrait of all seven of the Mercury astronauts. Instead, Webb voiced his preference for a group portrait that would emphasize "…the team effort and the togetherness that has characterized the first group of astronauts to be trained by this nation." More important, the episode convinced the administrator that "…we should consider in a deliberate way just what NASA should do in the field of fine arts to commemorate the …historic events" of the American space program.(2)

In addition to portraits, Webb wanted to encourage artists to capture the excitement and deeper meaning of space flight. He imagined "a nighttime scene showing the great amount of activity involved in the preparation of and countdown for launching," as well as paintings that portrayed activities in space. "The important thing," he concluded, "is to develop a policy on how we intend to treat this matter now and in the next several years and then to get down to the specifics of how we intend to implement this policy…." The first step, he suggested, was to consult with experts in the field, including the director of the National Gallery of Art, and the members of the Fine Arts Commission, the arbiters of architectural and artistic taste who passed judgment on the appearance of official buildings and monuments in the nation's capital.

Webb's memo of March 16, 1962, was the birth certificate of the NASA art program. Shelby Thompson, the director of the agency's Office of Educational Programs and Services, assigned James Dean, a young artist working as a special assistant in his office, to the project. On June 19, 1962, Thompson met with the Fine Arts Commission, requesting advice as to how "…NASA should develop a basis for use of paintings and sculptures to depict significant historical events and other activities in our program."(3)

David E. Finley, the chairman and former director of the National Gallery of Art, applauded the idea, and suggested that the agency should study the experience of the U.S. Air Force, which had amassed some 800 paintings since establishing an art program in 1954. He also introduced Thompson to Hereward Lester Cooke, curator of paintings at the National Gallery of Art.

An imposing bear of a man standing over six feet tall, Lester Cooke was a graduate of Yale and Oxford, with a Princeton PhD. The son of a physics professor and a veteran of the U.S. Army Air Forces, he was fascinated by science and felt a personal connection to flight. On a professional level, Cooke had directed American participation in international art competitions and produced articles and illustrations for the National Geographic Magazine. He jumped at the chance to advise NASA on its art program.

While initially cautious with regard to the time the project might require of one of his chief curators, John Walker, director of the National Gallery, quickly became one of the most vocal supporters of the NASA art initiative. Certain that "the present space exploration effort by the United States will probably rank among the more important events in the history of mankind," Walker believed that "every possible method of documentation …be used." Artists should be expected "…not only to record the physical appearance of the strange new world which space technology is creating, but to edit, select and probe for the inner meaning and emotional impact of events which may change the destiny of our race." He urged quick action so that "the full flavor of the achievement …not be lost," and hoped that "the past held captive" in any paintings resulting from the effort "will prove to future generations that America produced not only scientists and engineers capable of shaping the destiny of our age, but also artists worthy to keep them company."(4)

Gordon Cooper, the last Mercury astronaut to fly, was scheduled to ride an Atlas rocket into orbit on May 15, 1963. That event would provide the ideal occasion for a test run of the plan Cooke and Dean had developed to launch the art program. In mid-February, Cooke provided Thompson with a list of the artists who should be invited to travel to Cape Canaveral to record their impressions of the event. Andrew Wyeth, whom the curator identified as "the top artist in the U.S. today," headed the list. When the time came, however, Wyeth did not go to the Cape for the Cooper launch, though his son Jamie would participate in the program during the Gemini and Apollo years.

The list of invited artists also included Peter Hurd, Andrew Wyeth's brother-in-law, who had served as a wartime artist with the Army Air Force; George Weymouth, whom Wyeth regarded as "the best of his pupils"; and John McCoy, another Wyeth associate. Cooke regarded the next man on the list, Robert McCall, who had been running the Air Force art program, as "America's top aero-space illustrator." Paul Calle and Robert Shore had both painted for the Air Force program. Mitchell Jamieson, who had run a unit of the Navy art program during World War II, rounded out the list. Alfred Blaustein was the only artist to turn down the invitation.

The procedures that would remain in place for more than a decade were given a trial run in the spring of 1963. The artists received an $800 commission, which had to cover any expenses incurred while visiting a NASA facility where they could paint whatever interested them. In return, they would present their finished pieces, and all of their sketches, to the space agency. The experiment was a success, and what might have been a one-time effort to dispatch artists to witness and record the Gordon Cooper flight provided the basis for an ongoing, if small-scale, program. By the end of 1970, Jim Dean and Lester Cooke had dispatched 38 artists to Mercury, Gemini and Apollo launches and to other NASA facilities.

The art program became everything that Jim Webb had hoped it would be. NASA artists produced stunning works of art that documented the agency's step-by-step progress on the way to the moon. The early fruits of the program were presented in a lavishly illustrated book, Eyewitness to Space (New York: Abrams, 1971). Works from the collection illustrated NASA publications and were the basis for educational film strips aimed at school children. In 1965 and again in 1969, the National Gallery of Art mounted major exhibitions of work from the NASA collection. The USIA sent a selection of NASA paintings overseas, while the Smithsonian Institution Traveling Exhibition Service created two exhibitions of NASA art that toured the nation.

"Since we …began," Dean noted in a reflection on the tenth anniversary of the program, the art initiative had resulted in a long string of positive "press interviews and reports, congressional inquiries, columns in the Congressional Record, [and] White House reports." The NASA effort, he continued, had directly inspired other government art programs. "The Department of the Interior (at least two programs), the Environmental Protection Agency, the Department of the Army and even the Veterans Administration have, or are starting, art programs." While he could not take all of the credit, Dean insisted that "our success has encouraged other agencies to get involved and they have succeeded, too."(5)

For all of that, he noted, it was still necessary to "defend" the role of art in the space agency. Dean, with the assistance of Lester Cooke, had been a one-man show, handling the complex logistics of the program, receiving and cataloguing works of art, hanging them himself in museums or on office walls, and struggling to find adequate storage space. In January 1974, a NASA supervisor went so far as to comment: "Mr. Dean is far too valuable in other areas to spend his time on the relatively menial …jobs he is often burdened with in connection with the art program."(6) Dean placed a much higher value on the art collection, and immediately recommended that NASA officials either devote additional resources to the program or get out of the art business and turn the existing collection over to the National Air and Space Museum, "where it can be properly cared for."(7)

In January 1974 a new building for the National Air and Space Museum (NASM) was taking shape right across the street from NASA headquarters. Discussions regarding areas of cooperation were already underway between NASA officials and museum director Michael Collins, who had flown to the moon as a member of the Apollo 11 crew. Before the end of the year, the space agency had transferred its art collection to the NASM. Mike Collins succeeded in luring Jim Dean to the museum, as well.

The museum already maintained a small art collection, including portraits of aerospace heroes, an assortment of 18th- and 19th-century prints illustrating the early history of the balloon, an eclectic array of works portraying aspects of the history of aviation, and a few recent prizes, including several Norman Rockwell paintings of NASA activity. With the acquisition of the NASA art, the museum possessed one of the world's great collections of art exploring aerospace themes. Jim Dean would continue to build the NASM collection as the museum's first curator of art. Following his retirement in 1980, other curators followed in his footsteps, continuing to strengthen the role of art at the NASM. More than three decades after its arrival, however, the NASA art accession of 2,091 works still constitutes almost half of the NASM art collection.

(1) Stevenson's portrait is now in the collection of the National Air and Space Museum (1981-627).

(2) James E. Webb to Hiden Cox, March 16, 1962, memorandum in the NASA art historical collection, NASM Aeronautics Division. Webb's preference for a group portrait of the astronauts was apparently not heeded. In the end, Stevenson painted an individual portrait of John Glenn, which is also in the NASM collection (1963-398).

(3) Shelby Thompson, memorandum for the record, July 6, 1962, NASA art historical collection, NASM Aeronautics Division.

(4) John Walker draft of a talk, March 5, 1965, copy in NASA art historical collection, NASM Aeronautics Division.

(5) James Dean, memorandum for the record, August 6, 1973, NASA art historical collection, NASM Aeronautics Division.

(6) Director of Planning and Media Development to Assistant Administrator for Public Affairs, January 24, 1974, NASA art historical collection, NASM Aeronautics Division.

(7) James Dean to the Assistant Administrator for Public Affairs, January 24, 1974, copy in NASA art historical collection, NASM Aeronautics Division.

Tom D. Crouch

Senior Curator, Aeronautics

National Air and Space Museum

Smithsonian Institution

July 26, 2007

White Room

National Air and Space Museum
White Room, Cape Kennedy, April 1970.
