John Ridley’s life changed six months ago when the film he wrote, 12 Years a Slave, won three Oscars, including best picture. Ridley also won for his screenplay, adapted from the memoir of Solomon Northup. His new film, Jimi: All Is by My Side, which he wrote and directed, hits theaters September 26 and follows another icon of African-American history—Jimi Hendrix.
The film stars André Benjamin (André 3000 of Outkast) and focuses on Hendrix when he lived in London in 1966 and 1967. “That was an interesting time in Hendrix’s career, sort of cutting his teeth and getting a feel for playing professionally,” says National Museum of African American History and Culture curator Kevin Strait about the time period in which Ridley’s movie takes place. Following that stay in London, Strait says, Hendrix “came on the scene and just blew people away.”
When the African American History Museum opens in 2016, several Hendrix artifacts will be part of an inaugural exhibition, “Musical Crossroads,” alongside objects relating to James Brown, J Dilla, and others. “He expanded the sonic vocabulary of the electric guitar in ways that were essentially unforeseen,” Strait says. “Hendrix altered the conception and the general image of what a rock icon is and could be.” Among the Hendrix-related items at the museum will be a vest he wore and a Marshall speaker he used during performances.
Smithsonian.com spoke with John Ridley about his research process, his favorite Hendrix songs, and how he got around the fact that Hendrix’s estate wouldn’t let him use the legend’s most famous songs.
Where are you these days and what are you working on?
Right now I’m physically in Los Angeles. I actually spend most of my time though in Austin, Texas. We’re filming “American Crime” down there, the television series that I’m writing and producing and I actually directed the pilot. So splitting my time between both those cities.
What was your research process for Jimi: All Is by My Side?
Once I really started to believe that there was a screenplay here, [I used] any available information, whether it was archival information, personal interviews, histories. Any life like Jimi’s, at some point it becomes legendary. And the stories are out there and there are events that people document, but people recall him in a number of different ways. And it’s one of those things that you get to a certain point that there is a little bit of refereeing going on in terms of, "OK, one person said this and one person said that." […] But the fun part is really being able to take those moments and dig as deeply as possible and then render some life to them. It’s one thing to report on those things, but as a storyteller and as a filmmaker, it’s another thing to say to the actors, "OK, this is how we’re gonna do it, this is the emotion that we want to put into it, these are the facts that are best available to us and how do we make it live now? How do we make living history out of legend?"
How important is it to stick to the biographical facts?
Obviously, we’re not documentarians and even with historians, they get into arguments about what happened and who was involved with it. But one of the exciting things about this in particular was that because it was a finite space of time, because it was [portraying] one year in two hours, because it had this internal drive of Jimi going to London […] there wasn’t really a lot of need for taking artistic license with things. I mean, one thing for example I did take artistic license with was the character of Ida. The reason she’s called Ida, she was actually his girlfriend, Devon [Wilson], who he met when he came back to America. But because she was his most significant girlfriend of color, I thought it was very important to add an ethnocentric perspective to that relationship and why it was different than his relationship with these two other ladies.
Why did you focus on this particular moment in Hendrix’s career?
It was a transformative year for Jimi. He left New York literally under the name Jimmy James, the name he was performing with, and then came back to the States as Jimi Hendrix, J-I-M-I. A lot happened in that year. London at that time was […] art, it was culture, it was cinema, it was music, it was all of those things. So to be able to render not just the history of Jimi Hendrix, but the history of pop culture at that time, to have those two elements really interacting, I thought it was a rare opportunity to tell a story that was on the one hand taking individuals with a rock and roll status and showing the human nature of them, and also showing this incubator so to speak, this petri dish. There’s so much cross pollination going on of different styles.
Hendrix’s estate reportedly didn’t grant you permission to use music from his catalog. Was that part of the decision to focus on his pre-fame years?
It actually wasn’t part of the philosophy. I knew going into it what [happened when] Paul Greengrass and the Hughes brothers had tried to make a Hendrix film […] so I had no illusions of what we may or may not be able to get access to based on what other individuals did or were able to get. But at the same time, the story, I believed that it had an emotional quality that lived on in its own. […] But I mean look, it’s like anything else in life, you can look at the things that are in front of you as limitations or you can say, "Look, we can get past them or get around them and do it in a way that’s unique and is very, very special to the story that we’re telling."
How was researching 12 Years a Slave different?
12 Years a Slave, I mean look, what was interesting in that regard was that there’s a document sitting right in front of you [Solomon Northup’s memoir] and there’s a lot of things that one would take as fact because there was a single narrative. […] But within that, though, it was a real exclamation of emotion, of the time, of the language. It was certainly much further away from me. The fact of the matter is I wasn’t alive then, the fact of the matter is I wasn’t around in London in 1967. So both of them, even though it is a film, it is not a documentary. There’s certainly a place where one could say I could just take a creative license with this or that. But I do take it seriously. As I said, I’ve been very fortunate to be in environments where you have to get the facts right and you can’t hide behind creative license. And even though this was a space where I could do that and I certainly own up to it in the spaces where we do do that, there is something exciting about being able to take history as it’s presented. It does work, there’s no need to manipulate it, so why not ride it as the history is laid out?
Is there a connection between All Is by My Side and 12 Years a Slave?
I would say the connection is an emotional velocity with storytelling or the story that is there. 12 Years a Slave, it’s a different kind of a velocity. Someone is looking for his physical freedom to be able to return to his family and have recognized the value that he has as an individual. Obviously with Jimi’s story it’s a little different, but it is about a finite space of time, it is about a passion, it is about a person who’s trying to find his value in the world and express himself […] So yes, they are a bit different, it’s not a direct comparison, but with those two stories I certainly felt the passion that was in them and as a storyteller—one as writer, one as a writer-director—if there’s anything that I hope I accomplished, it’s translating that passion that I was exposed to in the story to an audience.
Any favorite Hendrix songs?
What makes us want to grow a lily in a pot? It’s a question at the center of entomologist Stephen Buchmann’s latest book, The Reason for Flowers: Their History, Culture, Biology and How They Change Our Lives. People have been obsessed with flowers since ancient times, Buchmann notes. A painted casket found in King Tutankhamun’s tomb is decorated with a bouquet including cornflowers and lily petals, and Chinese gardeners have grown lotus, peonies, magnolias and tiger lilies since at least 1,000 B.C.
Today, certain flowers have enormous cultural value: In Grasse, France, the distilled oils of jasmine plants can fetch $12,000 a pound, Buchmann writes in a chapter about perfume. He also devotes a chapter to flowers in literature. But his specialty is the science—Buchmann’s interest in flowers began during his childhood in California, when he would chase bees through wild meadows, and his research focuses on the weird and wonderful relationships flowers have forged with their animal pollinators.
I spoke to Buchmann about why we all love flowers and what mysteries these floral wonders still hold. (The following has been edited for length.)
If we visited your home in Arizona, what types of flowers would we find?
I have cut flowers and potted plants year-round. My favorites are the multicolored Chilean Alstroemeria, because their blooms last so long, along with various modern and heirloom roses and the glorious white-flowered Asian moth orchids. My all-time favorite flowers are the orchids, in part because of their incredible diversity of forms, scents and colors. I am especially intrigued by neotropical orchids like Stanhopea and Gongora. These produce spicy scents and no rewards of edible pollen or nectar. Visiting male orchid bees scrape up the floral scents using special hairs on their front legs. After spending weeks or months harvesting orchid and other scents, the bees store the scents in their inflated hind legs. Eventually, they use these purloined floral scents as their own sexual attractants.
Which flowers are underappreciated?
Skunk cabbage. This lowly flower from the eastern United States uses its own internal heat to melt its way up through the snow, and the same heat production volatizes its carrion-like smell into the air to attract its fly pollinators. It’s an amazing example of floral adaptions in action. Many flowers like the Voodoo lily and the starfish flower from Africa are living biochemical factories that produce the same nitrogenous chemicals found when vertebrate bodies decompose. Carrion flowers often mimic the color, scent and even texture of dead animals, corpses ripe for the egg-laying activities of various filth flies.
Is there a rare or exotic flower you’d most like to see in your lifetime?
The giant mottled and red Rafflesia arnoldii had been on my bucket list for many years until I saw it firsthand a few years ago in the rainforests of Sabah, Malaysia, on the island of Borneo. I’d also relish the opportunity to see the giant corpse flower Amorphophallus titanum in the wilds of Indonesia. A. titanum is a contender for the world’s largest flower, another one of the carrion flowers whose heat and intense death-like stench attracts its fly pollinators.
Image by Mark van Veen/Buiten-beeld/Minden Pictures/Corbis. A painted casket found in King Tutankhamun’s tomb is decorated with a bouquet that includes cornflowers, like the one seen here.
Image by Nobuyuki Yoshikawa/Corbis. Buchmann says some of his favorite flowers are the multihued Alstroemeria, because the blooms last so long.
Image by Christian Ziegler/Minden Pictures/Corbis. Iridescent bees approach the blossoms of the Gongora leucochila orchid in Panama.
Image by Chinch Gryniewicz/Ecoscene/Corbis. The "underappreciated" flowers of American skunk cabbage.
Image by Alcibbum Photography/Corbis. The exotic Rafflesia arnoldii flower blooms in the tropical rainforest of Sumatra.
Image by Fred Hirschmann/Science Faction/Corbis. California poppy and owl's clover decorate the desert near Kitt Peak in Arizona.
Image by USGS Bee Inventory & Monitoring Lab. This Centris decolorata bee was found in Puerto Rico. Centris bees specialize in flowers that produce energy-rich floral oils as a reward for pollinators.
Who writes about flowers most poetically?
Alfred Tennyson, Emily Dickinson, Ezra Pound, Louise Glück, Sylvia Plath, Ted Hughes. A favorite is the work of Walt Whitman, who gave us wonderful imagery of garden lilacs in his poem “When Lilacs Last in the Dooryard Bloom’d,” a poem about the death of Abraham Lincoln. And since I grew up in the ’60s and ’70s, immersed in the southern California rock scene, another favorite is “Dead Flowers,” the song penned by Mick Jagger and Keith Richards for their Sticky Fingers album.
Do you prefer the flowers of Van Gogh or O’Keeffe?
Easy. I’ve always adored the powerful but simplified lines and folds [and] the macroscopic views of flowers by the late artist Georgia O’Keeffe. I’m attracted by the simplicity and power, and perhaps, like so many, drawn to their subliminal sexual imagery.
What destination in the world has the best blooms?
For wildflowers growing outdoors, the Sonoran Desert around my home in Tucson. Every year we have the dependable palo verde trees bursting into brilliant yellow, but every 10 or 20 years the desert puts on spectacular wildflower displays, including Arizona poppies, owl’s clover, lupines and globe mallows, among others.
What is the most fascinating flower discovery in the past decade?
It has been found that flowers have a negative charge that may influence pollinator visits. Every object that flies through the air, whether it is a baseball, a jumbo jet or a humble bumblebee, acquires a strong positive electric charge. A honeybee might be carrying a charge of several hundred volts. When a positively charged bee lands on a negative flower, pollen grains can actually jump an air gap and attach to the stigma [the part of a flower where pollen germinates]. These passive electrostatic charges aid the natural pollen-holding branched hairs on the bodies of most bees. Bees may even be able to “label” flowers they just visited with these charges and not revisit empty flowers in the future.
What’s the most unusual adaptation for attracting a pollinator?
About 8 percent of the world’s flowers have pored anthers; the pores are the only way for pollen to leave the flower. Certain bees, like bumblebees and carpenter bees, literally turn themselves into living tuning forks—their powerful thoracic muscles deliver sonic blasts to the flower, which ignite a maelstrom of pollen grains that come flying out of the anther pores, striking the bees and allowing them to efficiently collect the pollen grains as food.
Another unusual adaptation occurs in some tropical and desert plants. Instead of producing typical pollen and nectar as floral rewards offered to pollinators, these “oil flowers,” like Barbados cherry or range ratany, have blisters on their undersides. Bees in the genus Centris rupture the blisters with special squeegee hairs on their front legs and transport these energy-rich floral oils back to their nests. The oils are mixed with pollen as larval food.
What botanical puzzle would you most like to answer?
I’d like to know how bees are most attracted to flowers and the most important sensory cues used in recognizing flowers from a distance. We know very little about this subject, especially in nature, outside of the artificial-flower testing arrays used by many modern behaviorists. Bees have thousands of tiny ommatidia, which together make up their compound eyes. Their visual acuity is only about one-sixtieth that of our human eyes. A flying bee needs to be almost on top of a bloom, about a foot away, before it can make out the bloom, although “flicker fusion” (the ability to detect rapid changes across their visual field) allows bees to detect the highly saturated spots of floral colors while flying across a meadow. My fantasy would be to see the world as a bee does, to become a flying bee, but only for a few minutes, because of all the entomologists, birds, spiders and lizards lurking nearby.
On August 27, 1831, the Richmond Compiler asked: “Who is this Nat Turner?” At the time, Turner was hiding in Southampton, Virginia, not far from the site where he launched the most important slave revolt in American history. Nat Turner’s Revolt, which had taken place just five days earlier, had left more than 50 whites dead; by the time the trials finished, a similar number of suspected rebels were either killed extralegally or condemned and executed.
Even when Nat Turner was captured, on October 30, 1831, the Compiler’s question remained unanswered. As a result, a white lawyer, Thomas R. Gray, arranged to go to the jail where Turner was held awaiting trial and take down what Turner described as “a history of the motives which induced me to undertake the late insurrection.” Over the last decade, scholars working with other sources and doing close textual analysis of The Confessions of Nat Turner have become increasingly confident that Gray transcribed Turner’s confession with, as Gray claimed, “little or no variation.”
While The Confessions of Nat Turner remains the ur-text for anyone who wants to understand Nat Turner, this 5,000-word account creates as many questions as it answers. As a result, the document has become a springboard for artists who want to imagine the life of the most famous American to rebel against slavery. In 1967, the novelist William Styron published a novel based upon Turner’s Confessions. The novel both won immediate acclaim including a Pulitzer Prize and caused an uproar, as black scholars including John Henrik Clarke took issue with the way that Styron imagined that the rebel leader was inspired in part by his frustrated sexual longings for a white woman.
This week, a new re-imagining of Nat Turner’s story hits the big screen as The Birth of a Nation opens in theaters nationwide. Filmmaker and actor Nate Parker portrays Southampton’s most famous son as a “warm, encouraging preacher,” in the words of the New Yorker’s Vinson Cunningham. Parker’s portrayal highlights the religiosity of the slave rebel leader, whose personal Bible has been put on display for the first time at the Smithsonian’s new National Museum of African American History and Culture. But what do we really know about Turner’s religion?
Fortunately, Turner’s Confessions, recorded by Thomas R. Gray, provides important clues to Turner’s central religious beliefs.
Most slaves could not read. Some of them owned Bibles anyway, which could then serve as tangible reminders of the “Good News” contained within. Turner, on the other hand, learned how to read as a child, and his Bible was the book that he knew intimately. When captured after the revolt, Turner readily placed his revolt in a biblical context, comparing himself at times to the Old Testament prophets and at one point to Jesus Christ. In his Confessions, Turner quoted the Gospel of Luke twice, and scholars have found many other passages in which his language echoed the language of the Bible, including passages from Ezekiel, Joshua, Isaiah, Matthew, Mark, and Revelation. Like many 19th-century American Protestants, Turner drew his inspiration and much of his vocabulary from the Bible.
While Turner valued the Bible, he rejected the corollary that scripture alone was the only reliable source of guidance on matters religious and moral. Turner believed that God continued to communicate with the world. Turner describes two other ways that God communicated with him. First, God communicated directly to him: at one point, “the Lord had shewn me things that had happened before my birth.” At another point, “the Holy Ghost had revealed itself to me.” On May 12, 1828, “the Spirit instantly appeared to me.” When asked by Gray what Turner meant by the Spirit, Turner responded “The Spirit that spoke to the prophets in former days.” Turner saw himself as a modern prophet.
Turner believed that God also communicated to him through the natural world. His neighbors saw stars in the sky, not realizing that according to Turner, they were really “the lights of the Saviour's hands, stretched forth from east to west.” More often Turner looked at prodigies—or unusual natural phenomena—as indirect messages from God. In a field one day, he found “drops of blood on the corn as though it were dew from heaven.” When he saw “leaves in the woods hieroglyphic characters, and numbers, with the forms of men in different attitudes, portrayed in blood,” he was reminded of “figures I had seen in the heavens.”
The most consequential signs appeared in the months prior to the revolt. In February, Southampton, located in southern Virginia, experienced a solar eclipse, which Turner interpreted as a providential signal to start recruiting potential rebels. With the eclipse, “the seal was removed from my lips, and I communicated the great work laid out for me to do, to four in whom I had the greatest confidence,” the first conspirators to join his plot. In August, a sun with a greenish hue appeared across the eastern seaboard. Turner immediately understood this peculiar event as a signal from God that the time to begin the revolt had arrived.
Turner’s views on private revelation were not unlike those of his contemporaries Joseph Smith, the founder of Mormonism, and William Miller, the father of the Adventist movement. Turner’s views were clearly unacceptable to the whites who controlled Southampton’s interracial churches. Throughout the region, Protestant churches run by whites ministered to both whites and blacks. Often these churches’ black members met separately from their white members, but on communion day the entire church, black and white, came together to commemorate Jesus’s last supper. When Turner tried to join one of these churches, the church refused to baptize the religious slave who saw himself as a prophet.
Although it is not surprising that whites rejected Turner’s religious views, they were also suspect in the black community. In part, this was because at one point his vision seemed too close to the proslavery religion that most slaves rejected. While he was in his 20s, Turner ran away from his owner. When he was in the woods, the Holy Spirit appeared to Turner and ordered him to “return to the service of my earthly master—‘For he who knoweth his Master's will, and doeth it not, shall be beaten with many stripes, and thus, have I chastened you.’” When the slaves heard Turner quote the slaveholders’ favorite passage from Luke, the slaves themselves rejected Turner’s claims to prophecy. “The negroes found fault, and murmurred against me, saying that if they had my sense they would not serve any master in the world.”
This was not the only time that the religious Turner found himself at odds with the men who would join his revolt. In the spring of 1831, when Turner and his co-conspirators were deciding the day for the revolt, the rebels selected Independence Day with its obvious political resonances. Turner, who saw the revolt in Biblical terms, never reconciled himself to this date. As July 4th approached, he worried himself “sick” and postponed the revolt. Likewise, on August 21, 1831, Turner met for the first time rebels whom he had not personally recruited. He asked Will—who would become the most enthusiastic of the rebels—why he joined the revolt. Will responded “his life was worth no more than others, and his liberty as dear to him.” Will professed no loyalty to Turner and gave no hint that he believed in Turner’s religion. Perhaps for similar reasons, when blacks referred to Turner at the trials, they called him Captain Nat or General Nat, instead of alluding to his religious position as a preacher or a prophet.
Perhaps Turner’s religious separation from the black community can help make sense of the most surprising thing about Turner’s religion: the only disciple that Turner named in his Confessions was Etheldred T. Brantley, a white man. While there was a tradition of white anti-slavery in the region—only five years before the revolt, Jonathan Lankford was kicked out of Black Creek Baptist church for refusing to give communion to slaveholders—it seems unlikely that Brantley, who was not involved in the revolt, was converted by Turner’s antislavery. Instead it seems more likely that Brantley was drawn by Turner’s millennialism, Turner’s ability to convert Brantley’s heart, and Turner’s success in stopping the outbreak of a disease where blood oozed from Brantley’s pores.
Turner always understood his revolt in religious terms. When Turner was locked in prison, facing a certain date with Southampton’s executioner, Gray asked, “Do you not find yourself mistaken now?” Turner responded, “Was not Christ crucified[?]” For Turner, but not necessarily for everyone who joined his revolt, the Southampton Revolt was part of an unfolding modern biblical drama.
Patrick H. Breen teaches at Providence College. His book, The Land Shall Be Deluged in Blood: A New History of the Nat Turner Revolt, was published by Oxford University Press in 2015.
The striking, nearly monochromatic works of Romaine Brooks are receiving a fourth major showing at the Smithsonian American Art Museum in Washington, D.C., which owns about half the known output of the American expatriate who lived in Paris.
But the new exhibition, “The Art of Romaine Brooks” on view this summer, speaks most frankly about her sexual identity—her work is almost exclusively about women, and her own self-portraits show her in men’s clothing and a top hat.
The exhibition includes the 18 paintings and 32 drawings in the museum's collections—works we’ve seen before—but Joe Lucchesi, the contributing curator, says “the thing that is profoundly different about this show is the framing around the artist’s life itself and the issues of gender and sexuality that are really at the core of the work.”
The last Smithsonian showing of Brooks, in 1986, came at a time when feminist scholarship was just beginning, says Lucchesi, an associate professor of art history and the Women, Gender and Sexuality Studies program coordinator at St. Mary’s College of Maryland.
“There’s a profound cultural change that’s happened between the 1980s and now,” he says. “It’s actually quite interesting to me to think about that show and the one that’s up now as being on opposite sides of a huge culture shift that’s occurred over the last 30 years.”
It results in a higher profile for an artist who should be recognized as a leading cultural figure of the 20th century, according to biographer Cassandra Langer, author of Romaine Brooks, A Life, who recently spoke at a Smithsonian symposium on Brooks. “She stands alongside Virginia Woolf and Gertrude Stein as a major participant in the intellectual and artistic life of her times and beyond,” Langer says.
The American artist was born in Rome in 1874 as Beatrice Romaine Goddard, heiress to a mining fortune, following a troubled childhood in which her father left the family, her mother became emotionally abusive and her brother was mentally ill.
"Brooks had a Gothic childhood replete with a mad cousin in the attic, an abusive and cruel mother, a conservative and cold sister and an insane brother,” Langer says. “As a child she was beaten and humiliated.”
Even living in a mansion, she often had to fend for herself. “It’s a little Tale of Two Cities,” Lucchesi says. “She’s a super rich girl, living like a street urchin. And nobody believes she’s a rich girl.”
She became a poor art student in Italy and France before she inherited the windfall that allowed her independence and a new way of depicting her world.
“She was one of the first modern artists to depict women’s resistance to patriarchal representations of the female in art,” Langer says. “She understood that women in art had been treated as objects rather than subjects. She made it her mission to change all that.”
That put her ahead of her time.
“Sexuality, gender and identity are now at the cutting edges of the current arts scene,” Langer says. Brooks (who got that name from a marriage that lasted less than a year) “started this conversation long before it became fashionable to do so.”
Her early nude, Azalées Blanches from 1910, was an unusual subject for a woman. “I grasped every occasion no matter how small, to assert my independence of views,” Brooks said in her unpublished memoir. Its provocative pose led to comparisons to the figure in Édouard Manet’s Olympia.
Brooks turned to performance artist Ida Rubinstein, whom Langer calls “the Lady Gaga of her day,” as a model for one of her best-known paintings, that of a Red Cross relief worker outside a burning French city in the 1914 La France Croisée.
That Brooks was in love with Rubinstein was not as well known but certainly not hidden.
“Some of the critics at the time danced around some of the sexual identity issues, but they always understood it as a little bit of boundary pushing, and almost always characterized it as something very inventive, very forward thinking,” Lucchesi says.
Reproductions of the image exhibited at the Bernheim Gallery in Paris in 1915 raised money for the Red Cross, and as a result Brooks won a Cross of the Legion of Honor from the French government in 1920.
Brooks was proud enough of the medal to include it, as one of the few spots of color, in her celebrated, typically gray 1923 Self Portrait, in which she devised a proudly androgynous mask for herself as carefully as an artist much later in the century would, Langer says. “Like David Bowie, she became very good at projecting her confected self. But this was just a cover for the very vulnerable and needy child she still remained.”
Because of her sexuality, Brooks “has been marginalized,” according to Langer, “most significantly due to the homophobic misunderstandings of her domesticity.”
But her chosen artistic style was also at odds with the increasingly fashionable cubist abstractions of the era. At the time when Stein’s nearby salon was celebrating the work of Picasso, Brooks’ moodier representational works were more comparable to that of Whistler.
Brooks retreated from painting for decades, concentrating on fascinating, psychological drawings that Lucchesi says are of equal interest (and also on display).
She stayed true to her vision throughout, although by the time she died in Paris in 1970 at the age of 96, she had been largely forgotten. (Her own defiant epitaph was: “Here remains Romaine, who Romaine remains.”)
“It’s very difficult for female artists historically to garner a lot of attention, and then you add the sexual identity issues—I think all of those things kept her out of the mainstream,” Lucchesi says.
For her part, Langer says, “I always considered her queerness paradoxically essential and beside the point. The simple truth is she was a great artist whose work has been misinterpreted and overlooked.”
More and more people are aware of Brooks, thanks in part to a 2000 show at the National Museum of Women in the Arts, a few blocks away from the American Art Museum, also curated by Lucchesi.
But in the last big Smithsonian show in 1986, her sexual identity issues were “pretty coded,” he says. The American expatriate writer “Natalie Barney barely shows up in that catalog even though they were basically together for 50 years,” he says.
It wasn’t the Institution that was conservative, “it’s kind of the way the world was.”
But to take in the work now, he says, “what you’re seeing is an LGBT subculture in the active process of trying to define itself,” Lucchesi says. “And that’s really exciting to me.”
In her paintings, he says, “she’s participating in an effort to shape a visible image of what it means to be a lesbian in that era. And I think that’s very significant."
In 2016, “I think there’s a lot of interest in her work because there’s a bit of a recognition with things that are going on now with, for example, trans identities or more gender-fluid identities, and it’s very interesting to look back at someone 100 years ago who was also navigating things that weren’t so clear and developing a language really for the first time.”
That the show of 18 paintings and 32 drawings opened days after an LGBT-targeted massacre in Orlando makes the exhibition bittersweet. Its portraits in grays and black seem to reflect the somber mood of the community after that tragedy.
“There’s a kind of quietness about her work, there’s a kind of heaviness to it, a seriousness to it that I think suddenly was very apparent in that moment of mourning,” Lucchesi says. “I hate that it became interesting for that reason. But there is real opportunity to have the show participate in some of the conversations that are happening right now.”
“The Art of Romaine Brooks” continues through October 2, 2016, at the Smithsonian American Art Museum in Washington, D.C.
When 21-year-old Susan La Flesche first stepped off the train in Philadelphia in early October 1886, nearly 1,300 miles from her Missouri River homeland, she’d already far surpassed the country’s wildest expectations for a member of the so-called “vanishing race.” Born during the Omaha’s summer buffalo hunt in June 1865 in the northeast corner of the remote Nebraska Territory, La Flesche graduated second in her class from the Hampton Normal and Agricultural Institute in Virginia, now Hampton University. She was fluent in English and her native tongue, could speak French and Otoe, too. She quoted scripture and Shakespeare, spent her free time learning to paint and play the piano. She was driven by her father’s warning to his young daughters: “Do you always want to be simply called those Indians or do you want to go to school and be somebody in the world?”
The wind-whipped plains of her homeland behind her once again, she arrived in Philadelphia exhausted from the journey, months of financial worry, logistical concerns, and of course, by the looming shadow of the mountain now before her: medical school. Within days, she would attend her first classes at the Women’s Medical College of Pennsylvania, a world apart from the powwows, buffalo hunts and tipis of her childhood.
Standing at the vanguard of medical education, the WMCP was the first medical school in the country established for women. If she graduated, La Flesche would become the country’s first Native American doctor. But first, she would need to break into a scientific community heavily skewed by sexist Victorian ideals and a zeitgeist determined to undercut the ambitions of minorities.
“We who are educated have to be pioneers of Indian civilization,” she told the East Coast crowd during her Hampton graduation speech. “The white people have reached a high standard of civilization, but how many years has it taken them? We are only beginning; so do not try to put us down, but help us to climb higher. Give us a chance.”
Three years later, La Flesche became a doctor. She graduated as valedictorian of her class and could suture wounds, deliver babies and treat tuberculosis. But as a woman, she could not vote—and as an Indian, she could not call herself a citizen under American law.
In 1837, following a trip to Washington on the government’s dime, Chief Big Elk returned to the Omaha people with a warning. “There is a coming flood which will soon reach us, and I advise you to prepare for it,” he told them. In the bustling streets of the nation’s capital, he’d seen the future of civilization, a universe at odds with the Omaha’s traditional ways. To survive, Big Elk said, they must adapt. Before his death in 1853, he chose a man with a similar vision to succeed him as chief of the Omaha Tribe—a man of French and Indian descent named Joseph La Flesche, Susan’s father.
“Decade after decade, [Joseph] La Flesche struggled to keep threading an elusive bicultural needle, one that he believed would ensure the success of his children, the survival of his people,” writes Joe Starita, whose biography of La Flesche, A Warrior of the People, was released last year.
Joseph’s bold push for assimilation – “It is either civilization or extermination,” he often said – wasn’t readily adopted by the whole tribe. Soon the Omaha splintered between the “Young Men’s Party,” open to the incorporation of white customs, and the “Chief’s Party,” a group loyal to traditional medicine men who wouldn’t budge. When the Young Men’s Party started building log cabins rather than tipis, laying out roads and farming individual parcels, the conservatives nicknamed the north side of the reservation “The Village of the Make-Believe White Men.” It was here, in a log cabin shared with her three older sisters, that Susan grew up learning to walk a tightrope between her heritage and her future.
“These were choices made to venture into the new world that confronted Omahas,” says John Wunder, professor emeritus of history and journalism at the University of Nebraska-Lincoln. “The La Flesche family was adept at learning and adopting languages, religions, and cultures. They never forgot their Omaha culture; they, we might say, enriched it with greater knowledge of their new neighbors.”
It was here, in the Village of the Make-Believe White Men, that La Flesche first met a Harvard anthropologist named Alice Cunningham Fletcher, a women’s rights advocate who would shepherd her to the East and up the long, often prejudiced ladder of formal education.
And it was here, in the Village of the Make-Believe White Men, that a young Susan La Flesche, just 8 years old, stayed at the bedside of an elderly woman in agonizing pain, waiting for the white agency doctor to arrive. Four times, a messenger was sent. Four times, the doctor said he’d be there soon. Not long before sunrise, the woman died. The doctor never came. The episode would haunt La Flesche for years to come, but it would steel her, too. “It was only an Indian,” she would later recall, “and it [did] not matter.”
None of the challenges of her education could fully prepare La Flesche for what she encountered upon her return to the reservation as physician for the Omaha Agency, which was operated by the Office of Indian Affairs. Soon after she opened the doors to her new office in the government boarding school, the tribe began to file in. Many were sick with tuberculosis or cholera; others were simply looking for a clean place to rest. She became their doctor, but in many ways also their lawyer, accountant, priest and political liaison. So many of the sick insisted on Dr. Susan, as they called her, that her white counterpart suddenly quit, making her the only physician on a reservation covering nearly 1,350 square miles.
She dreamed of one day building a hospital for her tribe. But for now, she made house calls on foot, walking miles through wind and snow, on horseback and later in her buggy, traveling for hours to reach a single patient. But even after risking her own life to reach a distant patient, she would often encounter Omahas who rejected her diagnosis and questioned everything she’d learned in a school so far away.
Over the next quarter-century, La Flesche fought a daily battle with the ills of her people. She led temperance campaigns on the reservation, remembering a childhood when white whiskey peddlers didn’t loiter around the reservation, clothing wasn’t pawned and land wasn’t sold for more drink. Eventually she did marry and have children. But the whiskey followed her home. Despite her tireless efforts to wean her people from alcohol, her own husband succumbed to it, eventually dying from tuberculosis aggravated by his drinking.
But she kept fighting. She opened a private practice in nearby Bancroft, Nebraska, treating whites and Indians alike. She persuaded the Office of Indian Affairs to ban liquor sales in towns formed within the reservation boundaries. She advocated proper hygiene and the use of screen doors to keep out disease-carrying flies, and waged unpopular campaigns against communal drinking cups and the mescal used in new religious ceremonies. And before she died in September 1915, she solicited enough donations to build the hospital of her dreams in the reservation town of Walthill, Nebraska, the first modern hospital in Thurston County.
And yet, unlike so many male chiefs and warriors, Susan La Flesche was virtually unknown beyond the Omaha Reservation until earlier this year, when she became the subject of Starita’s book and a PBS documentary titled “Medicine Woman.”
“Why did they say we were a vanishing race? Why did they say we were the forgotten people? I don’t know,” says Wehnona Stabler, a member of the Omaha and CEO of the Carl T. Curtis Health Education Center in Macy, Nebraska. “Growing up, my father used to say to all of us kids, ‘If you see somebody doing something, you know you can do it, too.’ I saw what Susan was able to do, and it encouraged me when I thought I was tired of all this, or I didn’t want to be in school, or I missed my family.”
The Omaha tribe still faces numerous health care challenges on the reservation. In recent years, charges of tribal corruption and poor patient care by the federal Indian Health Service have dogged the Winnebago Hospital, which today serves both the Omaha and Winnebago tribes. The hospital of La Flesche’s dreams closed in the 1940s – it’s now a small museum – marooning Walthill residents halfway between the 13-bed hospital seven miles north and the Carl T. Curtis clinic nine miles east, to say nothing of those living even farther west on a reservation where transportation is hardly a given. Alcoholism still plagues the tribe, alongside amphetamines, suicide and more.
But more access to health care is on the way, Stabler says, and La Flesche “would be very proud of what we’re doing right now.” Last summer, the Omaha Tribe broke ground on both an $8.3 million expansion of the Carl T. Curtis Health Education Center in Macy, and a new clinic in Walthill.
“Now people are putting her story out, and that’s what I want. Maybe it’s going to spark another young native woman. You see her do it, you know you can do it, too.”
Among America’s pantheon of scientific innovators, few have led lives as notable as that of Harrison G. Dyar, Jr. (1866-1929), an outré entomologist whose personality was as colorful as the caterpillars he studied.
The subject of scientist-turned-biographer Marc Epstein’s recent book, Moths, Myths, and Mosquitoes: The Eccentric Life of Harrison G. Dyar, Jr., is remembered not only for prodigious productivity in his field of research, but also for his oddly exotic avocations.
Dyar instigated fiery feuds with his fellow entomologists. He was concurrently married to two different women. And he dug elaborate, electric-lit tunnels beneath two of his D.C. residences, disposing of the dirt in a vacant lot, or else passing it off as furnace dust or fertilizer.
Long after his death, there were whispers that the tunnels had enabled him to shuttle between his lovers—an urban legend that, while apocryphal, speaks to the mystery in which Dyar seems perennially shrouded.
Epstein, a specialist in Lepidoptera (moths and butterflies) at California’s Department of Food and Agriculture and a research associate with the Smithsonian’s Department of Entomology, aimed to address as many of Dyar’s disparate facets as he could in his new book—“the whole enchilada,” he says.
Harrison Gray Dyar, Jr. (1866-1929) (Smithsonian Institution Archives)
This proved to be quite the challenge. “You could choose just one aspect and easily write a book the size [of mine],” he adds. Epstein’s holistic approach to the Dyar narrative spawned an incredible piece of nonfiction.
Dyar—the offspring of an inventor whose work in telegraphy nearly beat Samuel Morse to the punch and a spiritualist whose sister supposedly co-hosted a séance attended by no less than President Abraham Lincoln—was fated from birth to lead a sui generis life. Throughout his long and meandering career, the bug boffin’s exploits would win him as many enemies as they would admirers.
It cannot be denied that Dyar’s contribution to the field of entomology was staggering. Over the course of his eventful existence, the Gotham-born scientist named some 3,000 insect species and compiled a hefty catalog enumerating 6,000 varieties of lepidopterans. He also pioneered work on sawflies and mosquitoes, the latter a source of serious concern to those overseeing the construction of the Panama Canal, and in 1917 donated 44,000 miscellaneous insect specimens to the Smithsonian Institution. As Epstein aptly puts it: “Everything he did was in the hundreds or thousands.”
Fastidious in the extreme, Dyar captured, bred and reared the creatures he studied in droves; his essays furthered the understanding of the elusive role of larval stages in taxonomic classification.
Dyar’s Law, the principle that a larva’s head-capsule width grows by a roughly constant ratio from one instar to the next (making head size a predictor of the number and nature of stages in an insect’s full life cycle), is in wide use to this day, holding in what the literature has shown to be 80 percent of instances.
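Dyar’s Law lends itself to a quick numerical illustration. The sketch below, in Python, uses hypothetical head-capsule measurements (the article gives no data, so both the function names and the numbers are assumptions for illustration) to check whether successive instar widths grow by a roughly constant ratio and to predict the next instar’s head width:

```python
# A minimal sketch of Dyar's Law: a caterpillar's head-capsule widths across
# successive instars tend to form a geometric progression (constant ratio).

def growth_ratios(head_widths):
    """Return the ratio of each head-capsule width to the previous one."""
    return [b / a for a, b in zip(head_widths, head_widths[1:])]

def predict_next(head_widths):
    """Predict the next instar's head width from the mean growth ratio."""
    ratios = growth_ratios(head_widths)
    mean_ratio = sum(ratios) / len(ratios)
    return head_widths[-1] * mean_ratio

# Hypothetical head-capsule widths (mm) for four observed instars:
widths = [0.30, 0.42, 0.59, 0.83]
print(growth_ratios(widths))  # ratios all near 1.4, i.e. roughly constant
print(predict_next(widths))   # projected width of a fifth instar
```

A roughly constant ratio is consistent with Dyar’s Law; a ratio that drifts sharply would flag a missed or anomalous instar, which is how the principle is used in practice.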
Image by Marc Epstein. A saddleback caterpillar (top) and a spiny oak slug caterpillar (bottom), both limacodids.
Image by Jane Ruffin. An eye-catching rose slug caterpillar. Dyar's interest in limacodids is mirrored in his biographer, whose present-day research builds on Dyar's work.
Image by Smithsonian Institution Archives. Sketches found in one of Dyar's many notebooks illustrating variations in markings among skiff moth caterpillars.
Image by Jane Ruffin. Dyar was endlessly fascinated by limacodid larvae, like the skiff moth caterpillar seen here.
One cause of Dyar’s punctiliousness, Epstein posits, was his deep-seated compulsivity.
Manifest in Dyar’s ceaseless collecting (including a transcontinental “honeymoon” trip with his wife Zella), prolific note-taking (often on the backs of grocery receipts, bills of sale and letters) and arcane cross-referencing (his writings are coded with scores of mysterious symbols), this trait served him well in his scientific pursuits but did little to endear him to his peers and loved ones.
While conducting research at the National Museum, for instance, Dyar complained bitterly about the bureaucratic organization of the Smithsonian Institution, and resented delays in the publication of his scientific findings. In 1913, seeking to obviate these roadblocks, Dyar founded his very own entomology journal, which he titled Insecutor Inscitiae Menstruus—“persecutor of ignorance monthly.”
Dyar also picked nasty personal fights. So vituperative were his criticisms of fellow entomologist J.B. Smith, and so tactless his pooh-poohing of Smith’s late colleague and friend, Rev. George Hulst, that Smith ultimately swore “to have no further relations with the National Museum so long as Dyar remained.”
Explorers scout Dyar's Dupont Circle tunnels following their rediscovery in 1924. (Library of Congress)
If Dyar’s professional life was rocky, his private one was rockier.
In the early years of the 20th century, Zella Dyar, who in 1888 had won Harrison’s affections by sending him Lepidoptera specimens from Southern California, became increasingly aware of her husband’s fondness for another woman—Wellesca Pollock.
The fair and auburn-haired Pollock was a kindergarten teacher whom Harrison had met—and to whom he had taken quite a fancy—during a Chautauqua excursion in the Blue Ridge Mountains in 1900. Dyar had named a member of the family Limacodidae (one of his “pet” Lepidoptera groups) after her that November (Parasa wellesca), and his visits to her place of residence had grown more and more regular in the years following.
The situation took a bizarre turn when Wellesca announced in 1906 her marriage to Wilfred P. Allen, a man whom no one ever saw but who fathered three of her children over the next decade.
Zella, alarmed by the dubious identity of Wellesca’s partner, especially in light of her own husband’s increasingly lengthy periods of absence from home, wrote desperate letters to her. Wellesca responded reassuringly, stating that whatever she felt for Dyar was purely “sisterly” in nature.
A schematic of the tunnel network beneath Dyar's B Street home, located just south of the National Mall. (Photograph from the Washington Post, illustration by Marc Epstein.)
Years after this epistolary exchange (and others that followed), Harrison Dyar moved to secure a quick, low-profile divorce from Zella. Once she became aware of the lurid details of her husband’s relationship with Wellesca, however, the possibility of such a tidy split evaporated.
Wellesca’s hush-hush attempt to obtain a divorce from her own “husband” was stymied as well, albeit for a different reason. “Unconvinced of Allen’s existence,” Epstein recounts, “the judge ruled that Wellesca was unable to divorce him.”
The messy resolution of this debacle, which eventually saw Harrison and Wellesca officially united at severe professional cost to the former, is but one of the many intriguing threads traced in Epstein’s book.
The various stressors in Dyar’s life may well have fueled the creation of the labyrinthine tunnel networks found beneath two of his D.C. properties (one in Dupont Circle, the other just south of the National Mall), in which his own children were sometimes apt to play, and in which a 1924 Washington Post exposé postulated that “Teuton war spies” and “bootleggers” had once fraternized. The digging, which Dyar himself wrote off as little more than a physical workout, was, in Epstein’s view, a form of “Dyarian absolution”—a way for the scientist to battle his inner demons.
Research into the scientific findings of Dyar, as well as the juicy minutiae of his tortuous life, proceeds apace to this day. With no shortage of notebooks, scratch paper, and unpublished short stories (many of them autobiographical) to peruse, archival Dyar investigators have their work cut out for them.
Spearheaded by Epstein, the Smithsonian’s own ongoing efforts at transcription, decryption and database compilation promise boons not only for the entomological community, but for everyday citizens, each of whom stands to learn much from the fascinating story of one of America’s lesser-known scientific stars.
Marc Epstein will speak on the vibrant life of Harrison G. Dyar, Jr. from 6:45-8:15 PM on Tuesday, May 17. The Smithsonian Associates event, for which tickets are now available online, will take place at the Smithsonian’s S. Dillon Ripley Center.
It was 109 years ago, on a fall day in 1906, when Detroit art collector Charles Lang Freer agreed with a visiting dealer on a price for a Japanese screen by a little-known artist named Tawaraya Sōtatsu.
The purchase of the work that became known as Waves at Matsushima, he wrote to a fellow collector, came only “after much dickering of a most exasperating nature” with the Tokyo dealer. He paid $5,000 for a pair of six-fold screens—the other by Hokusai—half of what the dealer had originally asked. But he ended up with a priceless and influential work that is currently the centerpiece of what’s being billed as a once-in-a-lifetime exhibition in Washington, D.C.
“Sōtatsu: Making Waves” is the first major retrospective in the Western hemisphere devoted to the 17th-century artist—the first and only opportunity to see more than 70 pieces of his work, from 29 lenders in the U.S., Japan and Europe, on display together, amid works later artists made in homage to one of the most influential artists of his time.
The exhibition is only showing at the Smithsonian’s Freer Gallery of Art and Arthur M. Sackler Gallery, because of stipulations made when Freer pledged his collection to the country—a pledge that coincidentally also came in 1906—that the work not travel.
“In pledging his collection, Freer sought to encourage greater understanding and appreciation of Asia and its artistic traditions among his fellow Americans,” writes Julian Raby, the director of the Freer and Sackler Galleries, in his foreword to the catalog accompanying “Making Waves,” itself the first English-language survey of Sōtatsu’s art and a richly designed and elegant volume.
Image by Freer Gallery of Art. Coxcombs, Maize, and Morning Glories, Sōtatsu school, early 1600s
Image by Freer Gallery of Art. Mimosa Tree, Poppies and Other Summer Flowers, Sōtatsu school, 1630-1670
Image by Freer Gallery of Art. A pair of six-panel folding screens entitled Trees, Sōtatsu school, mid-1600s
Image by Freer Gallery of Art. A pair of six-panel folding screens entitled Trees, Sōtatsu school, mid-1600s
Image by Freer Gallery of Art. Summer and Autumn Flowers, Sōtatsu school, 1600s
In making that long-ago purchase, Raby says, “[Freer] instinctively sensed that Sōtatsu, little known in Freer’s day, would emerge as a figure of singular importance in the history of Japanese art.”
The D.C. exhibition coincides with the 400th anniversary of the Rinpa school of painting, known for tarashikomi, a technique of dropping ink onto a still-wet surface to create delicate, pooled detail. A related exhibition at the Freer, “Bold and Beautiful: Rinpa Screens,” closing next month when that esteemed gallery undergoes a two-year renovation, traces Sōtatsu’s influence on the work of other artists, including Ogata Kōrin (1658–1716) and his brother Ogata Kenzan (1663–1743).
Little is known of Sōtatsu’s biography. He is thought to have been born in 1570 and to have lived until about 1640—but his designs revolutionized Japanese art and survived to influence works 400 years later, from the likes of Gustav Klimt to Henri Matisse.
The six-fold screen at the center of the exhibit, Waves at Matsushima with its shimmering gold and silver tones, is believed to have been created about 1620. The work didn’t acquire its name until about 100 years ago. The title refers to an area of small pine-covered islands in Japan that came to be well known in recent years for having survived the 2011 tsunami.
“Freer didn’t buy them as ‘The Waves of ..’ anything,” says James Ulak, senior curator of Japanese art at the Freer and Sackler, who co-curated the exhibit. “They were simply described as ‘Roiling Waves and Rocks,’” Ulak says of the screens, “which is probably just as well. It doesn’t indicate a specific place.” The swirls and eddies of the water don’t necessarily indicate treacherous crossings, Ulak says. “Roiling waters, in hand scrolls and religious tracts, are things from which blessings emerge,” he says. “Just because it’s stormy, doesn’t mean it’s bad.”
And amid the swirling waters are rocks of safe shores, sandbars and pines.
“Sōtatsu literally made waves in his brilliant reworking of visual traditions for a vital new society that was emerging in early 17th-century Kyoto,” says Raby, who calls them “screens of utmost importance in the history of Japanese art.” “In scale, elegance, illusion and looming abstraction, they announced a stylistic turn that would influence Japanese art and indeed Western art well into this century,” he says.
“And it is these screens, these waves, that form the pivotal point for this exhibition.”
With its precise and hypnotic lines of water amid branches and the much more abstract smudges of rocks, Ulak says, “the screen itself is an absolute encyclopedia of Sōtatsu’s technique, his use of pigments, his blending of pigments without lines, letting degrees of tonality form images.”
And where there are lines in the crashing waves, Ulak says, “look at these waves and think about holding a brush and doing this. Look at the line. It’s an incredible work of craft.”
And the showpiece is only the beginning of the exhibition, which covers the artist’s days as a craftsman and commoner in a Kyoto fan shop, his collaborations with a great calligrapher of the time, Hon’ami Kōetsu, and his work as a restorer of ancient texts such as the Lotus Sutra. The artist’s relatively speedy ascent from craftsman to favored artist of the sophisticated elite was something new at the time.
“Sōtatsu appears at a time when a whole society is shifting,” Ulak says. By incorporating older images from hand scrolls of 12th- to 14th-centuries on a series of fans, “you see the phenomenon of everyone with some means in Japanese society being able to become fluent with the mantle of a unified past.”
His success with the nobility led him to establish a studio where, working with a team, he produced some stunning artwork and influenced artists for centuries to come. But over the centuries, Sōtatsu’s name faded from memory.
Likely originally commissioned for a temple by a wealthy sea captain, “Waves at Matsushima” became more widely known only after a pair of exhibitions in the early 20th century.
One show, in 1913, revived Sōtatsu’s reputation among artists not only in Japan but also in Europe, where his jewel tones and flat landscapes directly influenced artists from Henri Matisse to Gustav Klimt. The other came in 1947, Raby adds, when, “in the rubble of a just-concluded war, the Tokyo Museum held two remarkable parallel exhibitions, one on Sōtatsu and the other on Matisse.”
“To young Japanese artists who viewed the exhibitions, the coincidence was undeniable,” Raby says. “No one could miss the parallels. For Sōtatsu’s vocabulary seemed so very modern.” It took, he says, “in the space of less than a generation, an entire shift, the forefront of which was Charles Lang Freer.”
“And in recognition of this, in 1930 a monument was erected to Freer in Japan. Where? Not just in Kyoto,” Raby says, “but adjacent to Sōtatsu’s grave.”
“Sōtatsu: Making Waves” continues through Jan. 31, 2016 at the Smithsonian’s Arthur M. Sackler Gallery, Washington D.C.
A global campaign to boycott what activists are calling “dirty gold” gained its 100th official follower three days before Valentine’s Day.
The pledge was launched in 2004 by the environmental group Earthworks, which has asked retail companies not to carry gold that was produced through environmentally and socially destructive mining practices. Eight of the ten largest jewelry retailers in the United States have now made the pledge, including Tiffany & Co., Target and Helzberg Diamonds. The No Dirty Gold campaign is anchored in its “golden rules,” a set of criteria encouraging the metal mining industry to respect human rights and the natural environment.
While the list of retailers aligned in their opposition to dirty gold continues to grow longer, most gold remains quite filthy. The majority of the world’s gold is extracted from open pit mines, where huge volumes of earth are scoured away and processed for trace elements. Earthworks estimates that, to produce enough raw gold to make a single ring, 20 tons of rock and soil are dislodged and discarded. Much of this waste carries with it mercury and cyanide, which are used to extract the gold from the rock. The resulting erosion clogs streams and rivers and can eventually taint marine ecosystems far downstream of the mine site. Exposing the deep earth to air and water also causes chemical reactions that produce sulfuric acid, which can leak into drainage systems. Air quality is also compromised by gold mining, which releases hundreds of tons of airborne elemental mercury every year.
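The scale of that waste follows directly from ore grade. The back-of-the-envelope sketch below, in Python, shows how Earthworks’ 20-tons-per-ring figure can arise; the specific grade and gold content are assumptions chosen for illustration, not numbers from the article:

```python
# Why low-grade ore means enormous waste: the rock that must be processed to
# recover a fixed mass of gold scales inversely with the ore grade.

def rock_per_item(gold_grams, grade_g_per_ton):
    """Tons of ore processed to recover the given gold mass, assuming
    perfect recovery (real recovery is lower, so real waste is higher)."""
    return gold_grams / grade_g_per_ton

# Assumed values: a ring containing ~6 g of gold, mined from low-grade ore
# running ~0.3 g of gold per ton of rock.
print(rock_per_item(6, 0.3))   # 20.0 tons, in line with Earthworks' estimate
print(rock_per_item(6, 0.15))  # halving the grade doubles the rock moved
```

The second call makes Septoff’s later point concrete: as grades fall, the rock (and the energy, tailings and pollution that go with it) per ounce of gold rises proportionally.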
Gold has traditionally been a gift of love, and, not surprisingly, jewelry sales spike around Valentine’s Day. According to a recent survey released by National Jeweler, about 20 percent of Americans who planned to give a Valentine’s Day gift this year said they would be buying jewelry—sales estimated to total about $4 billion. Thus, activists see Valentine’s Day as a prime opportunity to educate consumers and stifle the trade of dirty gold. Payal Sampat, Earthworks’ director of the No Dirty Gold campaign, wants consumers to understand the back story of the gold industry. This, she believes, would spur an improvement in mining practices.
"We believe gold and metal mining can be done much more responsibly," Sampat says. "It's feasible, but consumers need to think about the impacts they have when they buy jewelry."
But the demand for gold is tremendous now. Several months ago, gold’s value hit $1,800 an ounce. It has since dropped to roughly $1,300—though that’s still five times its price in the late 1990s. The money to be made at all levels of the industry, from laborers knee-deep in mud to executive officers reaping thousands of dollars a day, creates a powerful incentive to find gold—even though doing so may now be harder than ever. Alan Septoff, communications manager for the No Dirty Gold campaign, says that easily accessible gold has become increasingly scarce over time. “What we have left in most mines is very low-quality ore, with a greater ratio of rock to gold,” Septoff said.
This, he explains, makes the energy required to mine that gold—and the waste and pollution produced in the process—proportionally greater and greater. In other words, dirty gold is only getting dirtier. What’s more, gold that cannot be traced back to some level of deforestation, air and watershed pollution, and human injury and death is virtually nonexistent, according to Septoff.
“There is no such thing as clean gold, unless it’s recycled or vintage,” he says.
But James Webster, the curator of mineral deposits at the American Museum of Natural History, says the story is not as dark and one-sided as some may spin it. A clean gold mining industry is indeed possible, he says. Moreover, the industry is not as destructive as it may seem. Some states have strict—and effective—regulations on the handling of mine waste and runoff, Webster says.
"Cyanide is not as nasty/scary as it may sound," he wrote in an email. "Its half-life is brief in the presence of sunlight."
Yet the Environmental Protection Agency has reported that 40 percent of watershed headwaters in the western United States have been contaminated by mining operations. Many of these are tiny sites, and there are, overall, roughly 500,000 defunct metal mines across 32 states that the EPA has plans to clean up. Remediation of these sites may cost more than $35 billion.
One of the largest open pit mines is located near Salt Lake City—the Bingham Canyon Mine. The deepest open pit mine in the world, it drops about 4,000 vertical feet from rim to bottom. Bingham Canyon is known as a copper mine, but the site yields gold, too. More than 600 tons of gold have come out of the mine since its opening in 1906, and every year, $1.8 billion worth of metals are produced there.
Another infamous American mine is the Berkeley Pit, in Montana. This mine made the nearby town of Butte rich and prosperous for a time, but the site was eventually exhausted of riches—including copper and gold—and retired. In the decades since, water has seeped into the Berkeley Pit and filled the mine, and today it contains one of the most lethally polluted lakes in the world. The toxic, acidic water killed 342 snow geese that landed here in 1995. The water, many people fear, will eventually taint the region’s groundwater supply.
The Grasberg Mine, in Indonesia, is one of the largest gold mines in the world and is owned by the American company Freeport McMoRan. The Grasberg Mine is also located smack in the middle of Lorentz National Park, creating a scar on the Earth so huge it can be seen from space. The mine dumps about 80 million tons of waste debris into the Ajkwa river system every year, according to Sampat at Earthworks. Another American company, Newmont, owns the Batu Hijau mine, also in Indonesia. This operation dumps its waste into the ocean near the island of Sumbawa.
While the EPA struggles to remediate and restore almost countless mine sites in the United States, and while activists work to stem the tide of demand on the gold industry, efforts are underway to develop more open pit mines. Among the most controversial is the Pebble Mine, proposed for Alaska’s Bristol Bay region. The project, critics say, could destroy or seriously damage unspoiled wilderness, wildlife habitat, indigenous cultures and the region’s sockeye salmon fishery. Of the Pebble Mine, Septoff at Earthworks said, “There could not be a clearer example of a short-term profit gained at a long-term loss.”
The road ahead for the Pebble Mine’s proponents will not likely be a smooth one. A major investor in the project backed out late last year, and the jewelry industry—which uses about half of all gold mined each year—has expressed opposition to the project. Several days ago, Tiffany & Co.’s chairman and CEO Michael Kowalski told JCK Magazine that developing the Pebble Mine site will almost certainly do more damage than it’s worth to the environment, the region’s salmon-based economy and the face of the gold industry itself.
“The possibility of this ending in disaster is so high, it’s hard to see how any mining company could go forward,” Kowalski told JCK.
The EPA released a report in January in which the agency said development of the mine would carry many risks of damage to the ecology and culture of the region.
There is an activist slogan that says, “The more you know, the less gold glows.”
But ethical, responsibly mined gold may actually be possible. It has been estimated that about 165,000 metric tons of gold have been mined in all of human history. Most of this gold is still in circulation—and a growing number of jewelers are making use of this material. Brilliant Earth, Leber Jeweler and Toby Pomeroy are three companies that have abandoned new gold and opted, instead, to only deal in recycled and second-hand material, thereby cutting mining out of the equation.
Beth Gerstein, co-founder of Brilliant Earth, based in San Francisco, says there have long been “inconsistencies” between the traditional perceived value of gold as a romantic symbol and the realities of extracting raw gold from the Earth.
“Jewelry is a symbol of commitment and values and we want this to be true inside and out,” Gerstein said.
Gerstein, along with her business partner, launched Brilliant Earth in 2005, and she says demand for recycled gold has grown since the beginning.
“Consumers want to know that the product they’re buying hasn’t had a negative impact on the world,” Gerstein said. The gesture of recycled precious metals seems a virtuous one, and public interest in supporting the effort seems to reflect goodwill. But Webster, at the American Museum of Natural History, says that recycling gold has so far done little to offset the destruction of mining.
"Unfortunately, the demand for gold, annually, far exceeds the amount recycled," he wrote.
He even feels that applying any symbolic or superficial value to gold, whether recycled or fresh from an open pit mine, is ultimately only furthering the problems linked to much of the mining industry:
"To me, it is interesting that because the majority of gold that is mined and extracted from ores is directed to the jewelry industry (an enterprise that societies might be able to survive with less of), we could run societies on Earth with much less gold mining."
Forty years ago, in the world of wine, there were just three categories: the good stuff (French), the very good stuff (also French), and everything else. Few doubted that anything could be finer than the vines in Burgundy and Bordeaux, and those at the upper end of wine snobbery were unshakable Francophiles.
And then an event on May 24, 1976—a singular, notable event that would come to be known by the mythical name “The Judgment of Paris”—dramatically changed the way wines were seen and sought forever after. A well-known and highly respected British wine seller and educator, Steven Spurrier, whose specialty was fine French vintages, organized a blind tasting that put unlabeled French and California whites and reds in front of nine French experts.
The idea originated with Patricia Gallagher, the American manager of Spurrier's store in the center of Paris. She had tried a few California wines, and on vacation in 1975 she traveled to the state’s wine valleys to meet vintners and sample their wares. Her enthusiasm inspired Spurrier to make a similar trip in April of 1976 and to select certain wines for a tasting the next month. Reached in London, Spurrier told me that the timing was intended to coincide, more or less, with the bicentennial of the American revolution, though that is “not an anniversary we Brits celebrate much.”
What then happened was revolutionary.
Photo courtesy of Bella Spurrier. Steven Spurrier enlisted distinguished French wine experts and had them taste ten white wines and ten reds.
Photo courtesy of Bella Spurrier. From left to right: Patricia Gallagher, Steven Spurrier and French judge Odette Kahn.
Photo courtesy of Bella Spurrier. The Paris tasting shifted attention to California, and gave other vintners there encouragement to create some of the best wines in the world.
Photo courtesy of Chateau Montelena Winery. A telegram announcing the "stunning success" at the Paris tasting.
Image by National Museum of American History. Bottles of the two triumphant vintages, a 1973 Chateau Montelena chardonnay and a 1973 Stag's Leap Wine Cellars cabernet sauvignon, are now held in the Smithsonian collections.
Spurrier enlisted distinguished French wine experts and had them taste ten white wines—six California chardonnays and four French white Burgundies. And he had them taste ten reds—six California cabernets and four French reds from Bordeaux. To everyone’s surprise, especially the judges, the two wines that came out on top were from Napa Valley, a 1973 Chateau Montelena chardonnay and a 1973 cabernet sauvignon from Stag’s Leap Wine Cellars.
Bottles of these two triumphant vintages are now held in the Smithsonian collections at the National Museum of American History, where a sold-out celebration of the anniversary will take place May 16 and 17. (And no, the vintage bottles will not be uncorked to mark the win.)
Steven Spurrier was just as surprised as his French tasters. On the cusp of the event’s 40th anniversary, he told me that the judges tended to be tougher on the reds, and that he thinks the Stag’s Leap cab won because they thought it was from Bordeaux. As it happened, a writer from Time magazine, George Taber, was covering the event and did a story on it, making the California coup international news.
The winemakers responsible for this unexpected victory were initially unaware of their newfound prominence: Miljenko (Mike) Grgich, who made the chardonnay while working for Chateau Montelena winery, owned by Jim Barrett (not long after, Grgich founded Grgich Hills Estates), and Warren Winiarski, founder of Stag’s Leap Wine Cellars. Each told me they didn’t know about the tasting, and didn’t even know their wines had been part of the contest.
Now in his mid-90s, Grgich, who grew up in a wine-making family on the Dalmatian coast of what was then Yugoslavia and brought his family traditions to the Napa Valley, was happy to hear of his wine’s victory, but not entirely surprised.
His Chateau Montelena chardonnay had already beaten three famous white Burgundies the year before in a blind tasting in San Diego. But his pleasure was intense. He told me that when he got a phone call telling him that the New York Times was sending reporters and a photographer to interview him about Paris, “I started dancing around the winery and singing in Croatian that I was born again.”
If Mike Grgich came to wine more-or-less genetically, Warren Winiarski, born in 1928, took a far more circuitous route. Though his father had made dandelion wine at home (legally) during Prohibition, wine had not been part of his life in the beer and hard liquor America of his youth.
When he spent a year in Italy researching a thesis on Machiavelli as a graduate student in political science at the University of Chicago, he first encountered wine as an everyday mealtime beverage. Then, back in Chicago, he had what might be considered a spiritual awakening, when, as he drank a New York State vintage, he says the “wine revealed itself to me.”
Winiarski calls that “an Athena moment.” With his new appreciation of what wine could be, he and his wife decided to move to California, where he served “voluntary indentured servitude” to learn how to turn grapes into gratification. Eventually, they bought a prune orchard and converted it into their first vineyard in what became the Stag’s Leap district of southern Napa Valley. “Prunes didn’t lend themselves to making great beverages,” he told me, as we sat in his splendid hilltop house, with a sweeping view of the original vineyards and of the high rock outcropping that is the actual Stag’s Leap.
Like a writer of short stories, Winiarski talks about a wine having “a beginning, a middle, and an end,” about “how the mind processes what is being tasted,” and of his having “a responsibility to the fruit” when making a vintage.
The effects of the Judgment of Paris were varied and pronounced. Many in France were miffed, not surprisingly, one writer claiming that everyone knew French wines were superior “in principle.” But Winiarski contends that the tasting caused the French to “wake up from taking things for granted.”
Though Napa pioneers such as Robert Mondavi had already developed methods for producing fine wines, the Paris tasting shifted attention to California, and gave other vintners there encouragement to create some of the best wines in the world.
Both Winiarski and Grgich went on to further triumphs. In 1977, the first vintage from Grgich Hills beat 221 other chardonnays from around the world, including France. When I asked Mike Grgich if he’d ever made a better chardonnay than his winner in Paris, he said, “The 1973 was very good, but I always think we can do better.”
Warren Winiarski, waving a hand over the rows of vines spreading out below his windows, bright green with their spring leaves, said, “For us, the Paris tasting was a Copernican revolution. We never looked at our wines the same way again.”
His vineyards have had many proud moments, and have produced many renowned vintages, but Winiarski counts as a high point a certain evening in the San Francisco Harbor, when Ronald and Nancy Reagan celebrated their anniversary with Queen Elizabeth and Prince Philip aboard the royal yacht Britannia and were served a Stag’s Leap 1974 Cabernet.
The 1973 bottles of Chateau Montelena Chardonnay and Stag's Leap Wine Cellars Cabernet Sauvignon are on view in the exhibition "Food: Transforming the American Table 1950-2000" at the National Museum of American History.
Bourbon’s introduction to America begins with the first Thanksgiving and ends with dismemberment in a Virginia swamp some 400 years ago. In 1619, Captain George Thorpe — a well-connected lawyer back in England — moved to a plantation on the outskirts of Jamestown. Intensely interested in the New World that surrounded him, Thorpe sought out new crops that could be a potential cash source for the colonists. He struck gold with corn and, short on the ingredients to brew English beer, began substituting his new preferred grain into the distilling process.
Thorpe’s distilled corn spirit may have tasted nothing like today’s bourbon, and there’s no record that he sold it to other colonists, probably reserving it for his own use. His experiment with distilling corn didn't last long, however — by 1622, an American Indian rebellion led to the massacre of a quarter of America’s colonial population; Thorpe was bludgeoned to death and his body mutilated.
To Reid Mitenbuler, Thorpe’s short-lived experimentation with what would become known as bourbon is the perfect way to begin exploring the history of the spirit in America.
“I wanted to use that story because that to me got to the idea of capitalism and business in America,” says Mitenbuler. “You’ve got this guy, George Thorpe, who is a New World fortune seeker who comes over here and part of his mandate was to look for crops that would be profitable. He was looking at silk, he looked at possibly buying grapes for wine, and corn catches his eye -- this New World grain.”
That New World grain would eventually grow into an $8 billion global industry, but it would always retain a particularly American sensibility, earning a Congressional declaration that dubbed it America’s native spirit in 1964. Bourbon Empire, Mitenbuler’s history of the spirit, dissects that designation: what makes bourbon distinctly American?
From the spirit’s roots in colonial Virginia to today’s craft movement, Mitenbuler teases out the contradictions inherent in the spirit’s history. In an industry dominated by marketing that sells bourbon as a small-producer craft product, readers of Mitenbuler’s tale might be surprised to hear that by the year 2000, some 98 or 99 percent of American whiskey was produced by eight corporations running 13 plants. Knob Creek, with its block-printed label and small-batch look, is produced by Jim Beam. Bulleit Bourbon claims heritage back to 1830, but was actually created in 1999 and is owned by Diageo, the spirits company that also owns Smirnoff vodka.
“Whiskey is an industrial product in a lot of ways, so it makes sense that these big companies could do it well,” says Mitenbuler.
In a sense, Mitenbuler argues, the contradictions in bourbon’s story are what make it such an American spirit — both agrarian and industrial, craft and commodity, new and old. Today, as bourbon experiences a boom not seen since before Prohibition, a new class of consumers is yet again reimagining America’s favorite spirit.
“We’ve reimagined it into this thing it is right now, this icon,” Mitenbuler says. “That’s how history works for a lot of things, we reimagine the past. The myth is created well after the fact.”
We spoke with Mitenbuler about his new book and the unique status bourbon holds as an American icon.
A theme that runs throughout the book is the dual nature of bourbon -- you describe it by turns as agrarian and industrial, frontier and capitalistic, big and small. Is that unique to bourbon?
That’s one of the things that attracted me to the story, because America is like that -- we’re full of contradictions. People a lot of times have nicknamed bourbon “America’s spirit.” What makes it America’s spirit? Capital, business -- that’s what America is known for worldwide.
Americans are sometimes uncomfortable with admitting that hardcore business is part and parcel with some of the rhetoric we wrap up around ideas of freedom and independence. I don’t know if we should always try to divorce the two -- we should just admit to it.
You also talk a bit about the Civil War, and how that reshaped whiskey in the same way that it forever transformed the trajectory of United States history. How did the Civil War change bourbon?
It was after the Civil War that you really see brand names come into existence. Today, you see bourbon brands with all kinds of dates going way into the past, and those usually aren’t true.
After the war, you see the nation really reaching full-scale industrialization, and that’s when you first see today what we recognize as our modern whiskey industry beginning to emerge. The industry starts to consolidate -- a small number of large producers that are specializing in whiskey.
Kentucky is largely thought of as the capital of bourbon production -- 95 percent of today’s bourbon comes from Kentucky. But that didn’t happen until after Prohibition.
It’s like a lot of businesses in America, where it used to be that every town had its bank or its pharmacy and you get to today where you go into every town and every town has its CVS. It’s the exact same thing with whiskey.
There’s a mystique held to the small producer -- we cherish that in America. But large corporations know that there’s that pull to the idea of small holders, so they work very hard to make their brand seem small. You go to the liquor store and see 100 different labels and think they all come from different places, but a lot of the time the spirit in the bottle is the exact same thing as what might be a few rows over, just labeled and marketed differently.
What’s the biggest change between the bourbon we’re drinking now and bourbon from the 1900s, when bourbon was really coming into its own?
I’ve had this conversation with a lot of people: was the actual product in the bottle better or was it worse? I’ve tried some older bourbons. [Bourbon can keep indefinitely as long as it is stored properly, in a sealed bottle without too much extra air.] Some were incredible and some were disgusting. We sometimes attach romance to the past, especially when we’re talking about food. I often joke that you want to drink the frontier's whiskey as much as you want the frontier’s medical care.
What is the most exciting thing that’s happening in bourbon right now?
All the experimentation going on with the craft movement, as well as reintroducing some techniques that have been lost. When craft whiskey reaches its full potential, it’s going to be great. But I’m also quick to point out that we’re not there yet.
Whiskey has a longer learning curve than just about any type of food. It really does take years to make the most exceptional types of whiskey, and that’s something that a lot of these new producers just don’t have yet. They don’t have the capital or the time needed to make what are really the best whiskeys.
When did bourbon become such an icon in the American imagination?
Right now is when it’s really reaching its peak as that kind of icon. One of the things people forget today, because bourbon in recent years has been up-marketed and become a foodie thing, is that bourbon is really a very humble product. It’s just some grain fermented, thrown into a barrel and aged. It doesn’t necessarily have to be that expensive to be good. It’s a very humble thing, and throughout most of its history it had a downmarket, blue collar reputation. That’s been a big part of its heritage.
“Happiness is going eyeball-to-eyeball with those Cub fans. That's really what I appreciated most about playing in Wrigley Field.”
One of the great pleasures in life has to be having a hot dog and Coke at the ballpark. Imagine sitting in the bleachers at Wrigley Field on a sunny afternoon on May 24, 1957, before the start of a game between the Chicago Cubs and Milwaukee Braves. With the game’s scorecard—like the one shown here—on your lap, you bite into your frankfurter as you scan the grass in front of the ivy-covered brick outfield wall.
There, you see two giants of the game standing together in conversation. The first is Hank Aaron, who would become the National League’s MVP that year wearing his gray Milwaukee Braves road uniform. Next to him is a rising NL superstar, Ernie Banks, who would hit 43 home runs that season, wearing the Chicago Cubs home pinstriped flannel jersey also shown here. The jersey features a zipper front, which was first adopted by the Cubs in 1937. During the 1940s, 1950s and 1960s, many teams used zippered jerseys instead of button front jerseys, while a handful of teams wore them into the 1970s and even the 1980s.Banks’s 1957 Chicago Cubs home jersey with an official score-card (back side) for the game played between the Chicago Cubs and Milwaukee Braves, May 24, 1957 (Francesco Sapienza)
No player in the history of baseball has been so closely identified with a single city as Ernie Banks, a favorite of Chicago fans and known for much of his career as “Mr. Cub.” Less well known is that Banks, a migrant from the South, was a pioneer of civil rights in professional baseball, a living bridge between the Negro Leagues and the multiethnic but by no means post-racial game of today.
Born on January 31, 1931, in Dallas, Texas, Ernest Banks attended Booker T. Washington High School and excelled in football and basketball. The school did not offer baseball, so Ernie played on a fast-pitch softball team in the church league, which fit nicely with his mother’s hopes that Ernie would become a minister.
When Banks was a high school sophomore, his skill on the field became evident to anyone who saw him play. He was introduced to the owner of the Detroit Colts, a travel team from Amarillo, Texas, which served as a feeder team for the Negro Leagues. Banks tried out for the Colts and became the team’s shortstop, traveling to games in Texas, New Mexico, Kansas, Nebraska and Oklahoma. When Banks was a high school senior, the Colts played the Kansas City Stars, and Banks impressed Stars manager “Cool Papa” Bell with his demeanor and his skill on the diamond. “His conduct was almost as outstanding as his ability,” said Bell, who promised Banks a job with the Kansas City Monarchs if he completed his senior year of high school. Bell recommended Banks to Buck O’Neil, the manager of the Monarchs.
The Monarchs offered Banks $300 a month, and Eddie and Essie Banks gave their consent for their son to become a professional ballplayer. In signing with the Monarchs, Banks joined one of the most storied teams in the Negro Leagues, a pillar of black baseball.
“‘Cool Papa’ Bell was the first one who impressed me,” Banks said later. “Buck O’Neil helped me in many ways. He installed a positive influence.” Banks joined the Monarchs in the middle of the 1953 season, after a two-year stint in the Army, and he played shortstop and batted .347 for the rest of the season.
Banks later said that “playing for the Kansas City Monarchs was like my school, my learning, my world. It was my whole life.” Banks played so well that he quickly caught the attention of the Cubs, who were making efforts to integrate. The Cubs wasted little time in signing Banks, and he made his major league debut on September 17, 1953.
Banks was the first African-American to play for the Cubs, and Jackie Robinson advised him to keep his head down and to be prepared for insults directed at him because of his ethnicity. Banks, naturally quiet, was happy to focus on baseball, but his reluctance to speak openly in favor of the civil rights movement led some activists to brand him an “Uncle Tom.” It did not help when Banks said, “I look at a man as a human being; I don’t care about his color. Some people feel that because you are black you will never be treated fairly, and that you should voice your opinions, be militant about them. I don’t feel this way. You can't convince a fool against his will . . . If a man doesn't like me because I'm black, that's fine. I'll just go elsewhere, but I'm not going to let him change my life.”
Official scorecard for the game played between the Chicago Cubs and Milwaukee Braves, May 24, 1957 (Eden Man, 4D Studios)
In 1954, his first full year and official rookie season, Banks finished second in the National League’s voting for rookie of the year. He won the Most Valuable Player award in 1958 and 1959, becoming the first NL player to win the award in back-to-back seasons. As shortstop, he led the league in fielding percentage three times. In 1960, he became the first Cubs player to be awarded a Gold Glove, and he led the league in fielding percentage, double plays, put-outs, and assists. His double-play partner was Gene Baker, another Negro League veteran. When Steve Bilko joined the Cubs at first base, Wrigley Field announcer Bert Wilson delighted in labeling the double-play combination, “Bingo to Bango to Bilko.”
A knee injury forced Banks to sit for a few games in 1961, after playing 717 consecutive games. When he returned, he was sent to left field, where he committed just one error in 23 games. In June, he moved to first base, where he played until he retired ten years later. He played more than a thousand games in each position—1,125 at shortstop and 1,259 at first, though he is best remembered for his years as a quick, agile shortstop.
Banks also was one of his era’s most prodigious sluggers. He retired after the 1971 season, finishing his career with 512 home runs and a lifetime batting average of .274. The 277 home runs he hit as a shortstop set an MLB record later broken by Cal Ripken. He also had 1,636 RBI and 2,583 hits, and he holds Cubs records for games played (2,528) and total bases (4,706). He also played a record 2,528 games without reaching the postseason. Despite the team’s failings, Chicagoans love their Cubs, and in 1969, readers of the Chicago Sun-Times named Banks the “Greatest Cub Ever.” Banks returned the favor by repeatedly expressing his pride in having played his entire major-league career with one team and for the same owners, the Wrigley family.
In his later seasons, when Banks suffered batting slumps, Cubs manager Leo Durocher complained that he couldn’t remove Banks from the lineup because he was too popular: “I had to play him,” Durocher said. “Had to play the man or there would have been a revolution in the street.” For his part, Banks thanked Durocher profusely for his coaching. In the later years of his career, as more African-American players joined the league and the Cubs, Banks became somewhat more vocal about injustice, though he always insisted he was there for baseball, not politics. He was also a lifelong Republican who supported Richard Nixon and Ronald Reagan. Heavily Democratic Chicago forgave him.
Banks was elected to the Baseball Hall of Fame in 1977, his first year of eligibility, and his number 14 was the first number retired by the Cubs. Banks was named a Library of Congress Living Legend, a designation that recognizes those "who have made significant contributions to America's diverse cultural, scientific and social heritage." In 2013, another adopted Chicagoan, President Barack Obama, awarded him the Presidential Medal of Freedom, along with 15 other recipients including Bill Clinton and Oprah Winfrey.
Banks died on January 23, 2015. Following a public visitation, a memorial service was held at the Fourth Presbyterian Church in downtown Chicago. After the service, a procession moved from the church to Wrigley Field, Banks’ major league home from 1953 to 1971. The Friendly Confines welcomed home the franchise’s greatest ambassador.
This article was excerpted from Game Worn: Baseball Treasures from the Game’s Greatest Heroes and Moments by Stephen Wong and Dave Grob, Smithsonian Books, 2016
London Mayor Boris Johnson grew up with Winston Churchill. That is, his parents would often quote the British Bulldog around the house. So when Churchill’s estate asked Johnson to write a biography to commemorate the 50th anniversary of Churchill’s death, the mayor agreed. We spoke with Johnson about Churchill’s legacy and his new book, The Churchill Factor.
In The Churchill Factor, you quote a source as saying that Winston Churchill was the greatest Briton of all time. What made Churchill so great?
The sheer scale of his achievement and in particular, in being the only man who could possibly have saved Britain and indeed western civilization in May 1940 from a catastrophe that would have disgraced humanity.
You write that when you were growing up, your father would recite Churchill quotes. Can you recall any in particular?
He would recite some of the famous lines from [Churchill's] great wartime speeches. And I think it was my mother who used to tell us jokes about Churchill. You know, the famous one about when he’s in the lavatory and he’s told, someone comes to him and, the Lord Privy Seal has got a message for him. And he shouts out, ‘Tell the Lord Privy Seal that I’m sealed in the privy, and I can only deal with one shit at a time’…That turns out, to my amazement, to be true. Or at least partly true.
Do you have a favorite Churchill quote?
There are so many. His gift for language was so incredible…The great story about when the chief whip comes to tell him about some minister who’s disgraced himself on a park bench. Some Tory cabinet minister is caught on a park bench at 6 o’clock in the morning in February with a guardsman, which is a total disgrace. And obviously the party machine starts to think he’s got to resign and the news of this is brought to Churchill in his study in Chartwell. And he doesn’t turn around from his desk and the chief whip’s relating this unhappy event, and Churchill says after a long pause, “Do you mean to say that so and so was caught with a guardsman?” “Yes, prime minister.” “On a park bench?” “Yes.” “At 6 o’clock in the morning?” “Yeah, that’s right.” “In this weather?” “Yes, prime minister.” “By God, man, it makes you proud to be British.”
But aren't some of those great quotes fake?
That’s the trouble. I heard that one from his grandson, whether or not that’s a substantial source, I don’t know…When [politician] Bessie Braddock told him he was drunk, he certainly did say, “Well madam, you’re ugly, but I’ll be sober in the morning.” I’m afraid that is true, and very rude. [Experts believe that Churchill got that line from a W.C. Fields movie.]
What makes his quotes so memorable?
He loves reversing word orders…chiastics. So, “It’s not only the beginning of the end, but the end of the beginning.” Or, “I’m ready to meet my maker, whether or not my maker is ready to be meeting me.” Or, “I’ve taken more out of alcohol than alcohol has taken out of me.” “We shape our places and then they shape us.” He’s using the same device.
And, as your write in your book, his larger-than-life style of writing suited the larger-than-life times.
He was always thought to be a bit over the top until events themselves became a bit over the top. And there was only one man who could rise to the level of events, and that was Churchill, and he found the words that suited the time. And he did it by mixing up Latin and English words very brilliantly…He would swoop from the flowery Latinate words to the very short, punchy Anglo-Saxon words.
With so much written on Churchill, how did you manage to write something new?
There have been very few books recently that have tried to bring him to a wider audience, and tried genuinely to explain what made him the man he was. I think there’s been such an abundance of Churchill scholarship and "Churchilliana," that no one has really tried to bring things together and to explain why he was the guy he was and also how he impacted on history and put it in an accessible way. That was what I think the Churchill estate felt they needed as we came up to the 50th anniversary of his death. Because in Britain, certainly, his memory is fading, and although everybody knows who he was, everybody knows what he did in 1940, there are aspects of his life that are now being lost and forgotten. And so the estate and Churchill 2015, this organization, we’re all very keen to try and make him as widely known as possible…I’m proud to say [the book] is selling a huge number of copies, at least in Britain, which is obviously I think more to do with the subject than with the author.
What was your research process like?
Thankfully, I had a lot of cooperation from the Churchill estate, from a brilliant guy called Allen Packwood, who runs the Churchill Archives Centre, and a wonderful, wonderful researcher called Warren Dockter. Dr. Dockter, as he's notably called. Warren and I used to wander around. We've been around the battlefields of the First World War, we've been to the war rooms, we've been to [Churchill's] painting studio. It’s just been incredible fun. I managed to carve out hours here and there in my job as mayor and it’s been fun just to walk around and share ideas with Warren.
Growing up, Johnson would hear his parents quoting Winston Churchill around the house. (EPA/Francisco Guasco/Corbis)
Like Churchill, you’re a British journalist-turned-politician. In what ways have you tried to emulate him?
The truth is, lots of journalists and politicians try pathetically to emulate [him], but all of us fall so very short. He casts a very long shadow over the whole thing…I’m a journalist of sorts and I’m a politician of sorts, but it’s nothing on the same scale as Churchill. He wrote huge numbers of books. He produced more than Shakespeare, more words than Dickens, and indeed more than both of them combined. He won a Nobel prize [for literature]. He was the highest paid journalist of his time. So as a journalist he’s hard to match and as a politician, certainly. It’s extraordinary. He was 64 years in the House of Commons. He held virtually every great office of state. He was indispensable in winning the First World War, the Second World War. I mean, Christ.
He smoked 250,000 cigars. I think I’ve probably in my life smoked fewer than 100, so it’s pathetic. Any comparisons are ludicrous.
You’re the keynote speaker at the next International Churchill Conference. What makes this 50-year milestone since his death so significant?
I think he reminds British people of a certain quality of greatness, and I think people are interested in this idea of what makes somebody great...So Churchill 2015 is an opportunity to reflect on those values and his continuing global importance.
Winston Churchill, who died 50 years ago, is remembered for his witty and profound quotes. (Keystone/Corbis)
For mathematician and "mathemagician" Arthur Benjamin, the best way to illustrate the basic concept of algebra is with a little magic trick: "Think of a number between one and ten. Now, double it. Add ten. Then, divide by two. Subtract the number you started with originally.”
"Is the number you arrived at 5?"
Then, he explains why the trick works: "Let's call the number you started with n—and right away, we've achieved the major goal of algebra, which is the concept of abstraction, using a letter to represent a value that we don't know. First, you doubled the number, so you had 2n. Then, you added 10, so you had 2n + 10. After that, you divided the number by 2. When you divide 2n + 10, you get n + 5. Finally, when you subtract n—no matter what number it is—you're left with 5."
That's just one example of how Benjamin, who is well known for his magic shows in which he performs mental calculations at lightning speeds, demonstrates the fun and fascinating aspects of math in his latest book, The Magic of Math: Solving for x and Figuring Out Why.
"Many mathematicians say that math exists without humans. But Arthur epitomizes the social strain of math that involves sharing ideas with people and converting young people, getting them interested in math by reminding them it's a cool subject," says Paul Zeitz, professor in the Department of Mathematics and Statistics at the University of San Francisco. "Even when he gives math talks without magic, he's amazing at captivating an audience. He's done more for math than most people do in their entire careers."
Benjamin, a mathematics professor at Harvey Mudd College, became a "mathemagician" by accident, beginning in the '70s. Growing up, he was an avid follower of the prolific popular mathematics writer Martin Gardner, reading books such as Mathematical Carnival and completing Gardner's puzzles in Scientific American. He started doing magic shows for kids while he was in high school, which then grew into shows for adults, featuring feats of mental agility that were less challenging than they appeared. Benjamin was good at doing quick math in his head, so he added that to his shows and developed various techniques for doing fast math in his head—the same way many others had.
"What made me different was I had a knack for performing on stage," he says. After finishing graduate school in 1989, he settled in southern California and started performing again, earning a coveted spot at The Magic Castle in Hollywood, a clubhouse for the world's best magicians. Twenty-five years later, he's still performing, at roughly 75 events a year—a full-time job, in addition to his full-time teaching.
"I learned how to be a good teacher through my early experiences as a magician," he says. "My approach to teaching has always been, 'How do I make this material entertaining?' Math is a serious subject, but that doesn't mean it has to be taught in an overly serious way."
In The Magic of Math—his first high-profile release since 2006, when he wrote the popular Secrets of Mental Math—he explains why “9” is the most magical number, offers some mental math shortcuts, and shares his technique for figuring out the day of the week for any date of the current or upcoming year. "I want people to have two reactions to each exercise, or trick," he says. "First, I want them to say, 'Cool!' and second, I want them to ask, 'Why?' "
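The article doesn't spell out Benjamin's own day-of-the-week technique, but a standard method for this kind of calendar calculation is Zeller's congruence; the sketch below is an illustrative stand-in, not the method from his book:

```python
# Zeller's congruence: compute the weekday of any Gregorian date.
def day_of_week(year, month, day):
    if month < 3:        # Zeller treats Jan/Feb as months 13/14
        month += 12      # of the previous year
        year -= 1
    k = year % 100       # year within the century
    j = year // 100      # zero-based century
    h = (day + 13 * (month + 1) // 5 + k + k // 4 + j // 4 + 5 * j) % 7
    return ["Saturday", "Sunday", "Monday", "Tuesday",
            "Wednesday", "Thursday", "Friday"][h]

print(day_of_week(2000, 1, 1))   # -> Saturday
print(day_of_week(1776, 7, 4))   # -> Thursday
```

Mental performers typically memorize per-month and per-year offsets rather than computing the formula term by term, but the arithmetic being compressed is the same.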
Both the fun and the explanations are often missing from math instruction in today's schools, according to Benjamin—despite the fact that so many people are calling for better education in science, technology, engineering and math (STEM). Frequent testing leaves little time for exploring math for the sake of beauty and pleasure, and students are often told they need to learn math skills only because they'll need them in future math classes. "I don't like delayed gratification," Benjamin says, "especially when the answer never comes. Math needs to be made relevant."
Another problem is that many teachers don't love math. "It's hard to fake a passion for math," he says. "A lot of elementary school teachers are math-phobic and I worry they are passing on those phobias to students. You can't expect students to be more excited about math than their teachers are." Yet, there are barriers that keep people who truly love math—such as math and engineering majors—from going back to classrooms as educators. "I wish we could do more to attract the best and brightest math students to teaching, and give great teachers more money and more respect," he adds.
In his 2009 TED Talk, Benjamin suggested that high school students should be taught statistics rather than calculus and he still advocates for this approach. While calculus is currently a must for students seeking acceptance to competitive colleges, "I would rather see the typical high school graduate have a good understanding of probability and statistics," he says. "Unless students are going into engineering, science, math or a related field, they're not going to use calculus in their daily lives, but probability and statistics are around us every day, when we're reading the newspaper or making financial decisions. Data surrounds us, and the more you understand it, the better off you will be."
Keith Devlin, executive director of the Human-Sciences and Technologies Advanced Research Institute (H-STAR) at Stanford University, agrees: "Calculus is a totally inappropriate summit course for high school. Most students at that age do not yet have enough other mathematics beneath their belts, or sufficient mathematical maturity to do it properly."
Overall, Benjamin hopes that someday students will have more options, and less of a prescribed track, when it comes to studying math. "I believe that will allow a love of math to grow, and not be decimated," he says.
He hopes The Magic of Math will be a resource for students, parents, teachers and adults who are curious about math. At the end of the book, in a section called "Aftermath," he recommends other resources such as Khan Academy, The Art of Problem Solving and videos by Numberphile. He says, "There are a ton of popular math books out there right now, perhaps because people are looking outside of the education system for fun math. If mine is the only math book you ever read, then I have failed."
“Free booze, free blues, that’s Freddie,” sings James Zimmerman, a jazz scholar and a senior producer at the National Museum of American History, who served as the Smithsonian Jazz Masterworks Orchestra’s producer and executive producer for 11 years.
Zimmerman’s voice mimics the smooth, dreamy instrumentation of “Freddie Freeloader,” found on Miles Davis’ 1959 masterpiece Kind of Blue. He uses the words that lyricist and singer Jon Hendricks penned for the complex arrangement years later. Words so fitting that one could imagine Davis approaching Hendricks to say, “Mother [expletive], what are you doing writing words to my song?”
Leaving the theater after seeing Don Cheadle’s new film Miles Ahead about the raspy-voiced Davis, Zimmerman is singing to prove his point.
“Miles was the greatest singer on the open mouth trumpet that there’s ever been,” he says, echoing the words of jazz great Gil Evans. It’s what first attracted Zimmerman, himself an accomplished vocalist, to Davis’ music in the ’80s.
Davis was a middle-class son of a dentist, born into a racially divided America, who once was clubbed on the head by a white policeman for standing outside of a venue where he was performing. In addition to numerous Grammy Awards, Davis has a star on the Hollywood Walk of Fame, was inducted into the Rock and Roll Hall of Fame and even had his work honored by Congress. Different versions of Davis exist side by side: He was an unquestionable genius, who had an electrifying stage presence, a great affection for his children, but also, as Francis Davis writes in the Atlantic, the troubled artist was “peacock vain,” addled by drugs, and, by his own account, physically abused his spouses.
Miles Davis by Max Jacoby, 1991 (National Portrait Gallery)
“[B]eing a Gemini I’m already two,” Davis wrote in his 1990 autobiography Miles. “Two people without the coke and two more with the coke. I was four different people; two of them people had consciences and two didn’t."
Rather than attempt to reconcile the varied pieces of the legendary jazz trumpeter and bandleader, Cheadle’s film takes the form of an impressionistic snapshot, aiming to tell a “gangster pic” about the jazz great that Davis himself would have wanted to star in.
The film's title, Miles Ahead (also the name of the second album Davis made with Evans), alludes, Zimmerman speculates, to how Davis was always moving forward with his music, from the origins of “cool jazz,” collaborating with Evans in the late 1940s, moving to “hard bop” in the 1950s, changing the game again with modal improvisation in the late ’50s, then taking rock influences to create a fusion sound, as heard in his 1969 jazz-rock album In a Silent Way.
Davis wore this custom Versace jacket during one of his last performances with Quincy Jones at the 1991 Montreux Jazz Festival in Switzerland. (National Museum of American History)
“He was always with the times,” says Zimmerman. “He was listening and he was willing to be a risk taker, without any doubts, without any thoughts of failing. That was the way he was.”
The film grounds itself in what has been called Davis' “silent period,” from 1975 to 1980, when the musician was riddled by depression and drugs and couldn’t play the trumpet. It’s an interesting choice, seeing as his sound expressed who he was. “He described his music as his voice,” says Zimmerman. “Sometimes, he wouldn’t talk, he’d just say, ‘Hey let the music speak for itself,’ because he was pouring everything into that.”
In a way, that’s what the movie does, though. The decidedly anti-biopic riffs from one imagined scenario to another, articulating long notes and short trills over a timeline of Davis’ life in the late ’50s and early ’60s. The film often relies on music to explore his relationship with his wife Frances Taylor, as well as his work with musicians John Coltrane and Red Garland and Paul Chambers and Art Taylor.
“The music is hot, the music is very athletic, there’s all kinds of musical gymnastics going on when he meets Frances,” Zimmerman says. A prima ballerina, she was involved with the theater and Broadway. Davis was captivated by her beauty, but perhaps was more drawn to her as an artist. He would go to her shows, and it opened him up to new sounds and influences.
“Broadway, you have a pit orchestra, so he was hearing different things, and I think that got inside of him,” says Zimmerman, guiding Davis away from the hot, energetic music of bebop into the passionate, emotive music that he would create in Sketches of Spain and Porgy and Bess.
While Taylor arguably wasn’t his first wife (Irene Birth, with whom he had three children, came first, though theirs was a common-law marriage), nor would she be his last, Zimmerman can see why the film chose to focus on their relationship.
“Frances just sort of got into his heart in a deep way,” says Zimmerman. “That makes me think of [Frank] Sinatra and Ava Gardner and how Ava Gardner dug into his heart and he could never overcome Ava Gardner.”
The silent period comes after Taylor leaves him. Davis was heavily into drugs, was likely dealing with emotional exhaustion from his already 30 years of work as a musical pioneer and was physically worn out. He suffered from sickle-cell anemia and his condition, coupled with the pain from injuries he sustained in a 1972 car crash, had worsened. Still it was a shock to the jazz cats that he stopped playing during that period.
“For someone to be in the limelight for so long to stop recording and leave recording—lots of people talk about that, but they don’t necessarily do it because the music is very much a part of them,” says Zimmerman. “Miles said that and he really didn’t play. The hole was there, but he didn’t play.”
Though the film uses the dynamic between Davis and a fictional Rolling Stone journalist to push Davis to return to the music, it was George Butler, a jazz record executive, who helped persuade Davis to get back into the studio, even sending him a piano. So too did the new music he was hearing.
“The electronic music, the synthesizers, those kinds of things were intriguing to Miles,” says Zimmerman. It took him a while after being out so long to build up his embouchure.
“That’s everything to a trumpet player,” says Zimmerman. “It took him a while to get back, but he was listening and playing and working compositions and determining who he could make a statement with.”
In 1989, Zimmerman saw Davis play at Wolf Trap National Park for the Performing Arts in Vienna, Virginia. He performed with a seven-piece band that included saxophonist Kenny Garrett, guitarist Foley and Ricky Wellman, the former drummer for Chuck Brown, Washington D.C.'s renowned “Father of Go-Go.” All of these musicians appeared on Davis’ latest album, Amandla. Zimmerman remembers the sound as funky, with some Go-Go influences to it.
“It was sort of him, of the times,” says Zimmerman. “The times were always changing and he was going along with that.”
While the film might not have gotten all of the facts right, Zimmerman says it gets at a greater sense of who Davis was.
“The reality is fiction has foundation in truth, in nonfiction,” says Zimmerman. “I think they got his personality dead on.”
Once the United States’ largest private residence and the most expensive to build, today you could almost miss it. The Winchester Mystery House in San Jose, California, sits between the eight lanes of the I-280 freeway, a mobile home park, and the remains of a Space Age movie theater. The world has changed around it, but the mansion remains stubbornly and defiantly what it always was.
Each time I visit the Mystery House, I try to envision what this space must have looked like to the “rifle widow” Sarah Winchester, when she first encountered it in 1886—acre after acre of undulating orchards and fields, broken only by an unassuming eight-room cottage.
Legend holds that before the 1906 earthquake—when her estate was as huge and fantastically bizarre as it would ever be with 200 rooms, 10,000 windows, 47 fireplaces, and 2,000 doors, trap doors, and spy holes—not even Sarah could have confidently located those original eight rooms.
Winchester had inherited a vast fortune from guns. Her father-in-law Oliver Winchester, manufacturer of the famous repeater rifle, died in 1880, and her husband, Will, also in the family gun business, died a year later. After she moved from New Haven, Connecticut, to San Jose, Winchester dedicated a large part of her fortune to ceaseless, enigmatic building. She built her house with shifts of 16 carpenters who were paid three times the going rate and worked 24 hours a day, every day, from 1886 until Sarah’s death in 1922.
An American Penelope, working in wood rather than yarn, Winchester wove and unwove eternally. She built, demolished and rebuilt. Winchester hastily sketched designs on napkins or brown paper for carpenters to build additions, towers, cupolas or rooms that made no sense and had no purpose, sometimes only to be plastered over the next day. In 1975, workers discovered a new room. It had two chairs, an early 1900s speaker that fit into an old phonograph, and a door latched by a 1910 lock. She had apparently forgotten about it and built over it.
In 1911, the San Jose Mercury News called Winchester’s colossus a “great question mark in a sea of apricot and olive orchards.” Over a century later, the San Francisco Chronicle was still baffled: “the Mansion is an ornately complex answer to a very simple question: Why?”
The answer: Her building is a ghost story of the American gun. Or so the legend went. A spiritualist in the mid-1800s, when plenty of sane Americans believed they could communicate with the dead, Winchester became terrified that her misfortunes, especially the death of her husband and one-month-old daughter, were cosmic retribution from all the spirits killed by Winchester rifles. A relative said many decades later Winchester fell “under the thrall” of a medium, who told her that she would be haunted by the ghosts of Winchester rifle victims unless she built, non-stop—perhaps at ghosts’ direction, for their pleasure, or perhaps as a way to elude them. Haunted by conscience over her gun blood fortune and seeking either protection or absolution, Winchester lived in almost complete solitude, in a mansion designed to be haunted.
When I heard her ghost story from a friend in graduate school, I was enthralled. Eventually, Winchester became the muse for my book on the history of the American gun industry and culture.
A postcard showing the Winchester Mystery House circa 1900-05 (Courtesy Flickr/San Jose Public Library California Room)
I keenly anticipated my first visit to the Mystery House. I must have been hoping that the house would yield up its secret to me. At first glance I was deflated, for the unusual reason that from the outside, the house wasn’t entirely weird.
But the drama of this house, like the drama of Winchester’s life, was unfolding on the inside. A staircase, one of 40, goes nowhere and ends at a ceiling. Cabinets and doors open onto walls, rooms are boxes within boxes, small rooms are built within big rooms, balconies and windows are inside rather than out, chimneys stop floors short of the ceiling, floors have skylights. A linen closet as big as an apartment sits next to a cupboard less than an inch deep. One room has a normal-sized door next to a small, child-sized one. Another has a secret door identical to one on a corner closet—it could be opened from within the room, but not from without, and the closet drawer didn’t open at all.
Details are designed to confuse. In one room, Winchester laid the parquetry in an unusual pattern: When the light hit the floor a particular way, the dark boards appeared light, and the light boards, dark. Bull’s-eye windows give an upside-down view of the world. Even these basic truths, of up and down, and light and dark, could be subverted.
The house teems with allusions, symbols and mysterious encryptions. Its ballroom features two meticulously crafted Tiffany art-glass windows. Here, she inscribed her most elegant clues for us. The windows have stained glass panels with lines from Shakespeare. One reads, “These same thoughts people this little world.” It’s from the prison soliloquy in Shakespeare’s Richard II. Deposed from power and alone in his cell, the king has an idea to create a world within his prison cell, populated only by his imaginings and ideas.
Winchester’s mansion conveys a restless, brilliant, sane—if obsessive—mind and the convolutions of an uneasy conscience. Perhaps she only dimly perceived the sources of her unease, whether ghostly or profane. But she wove anguish into her creation, just as any artist pours unarticulated impulses into her work. Over repeated visits, I came to think that if a mind were a house, it would probably look like this.
The house is an architectural exteriorization of an anguished but playful inner life. Ideas, memories, fears and guilt occur to us all day long. They come to consciousness. If they displease or terrify, we brood or fuss over them for a while, then revise them to make them manageable, or we plaster over them and suppress them, or refashion them into another idea. One of the house’s builders recalled, “Sarah simply ordered the error torn out, sealed up, built over or around, or … totally ignored.” The mental and architectural processes of revision, destruction, suppression and creation were ongoing, and similar.
Mrs. Winchester’s Main Bedroom (Copyright Winchester Mystery House)
Perhaps the same mental process happens with a country’s historical narratives about its most contentious and difficult topics—war, conquest, violence, guns. The family name was synonymous by the 1900s with a multi-firing rifle, and the Winchester family had made its fortune sending more than 8 million of them into the world. It wasn’t crazy to think that she might have been haunted by that idea, that she might have perpetually remembered it, and just as perpetually tried to forget.
I’ve come to see the house as a clever riddle. Winchester made charitable donations, certainly, and if she had wanted to, she could have become a philanthropist of greater renown. But the fact remains that she chose to convert a vast portion of her rifle fortune into a monstrous, distorted home; so we can now wander through her rooms imagining how one life affects others.
Instead of building a university or a library, Sarah Winchester built a counter-legend to the thousands of American gunslinger stories. And in this counter-legend, the ghosts of the gun casualties materialize, and we remember them.
Pamela Haag, Ph.D., is the author most recently of The Gunning of America: Business and the Making of an American Gun Culture. She has published two other books and numerous essays on a wide variety of topics.
“Somebody else can have Madison Avenue,” Lyndon Johnson once said. “I’ll take Bird”—that is, his wife, Claudia Alta Taylor “Lady Bird” Johnson. (She got her elegant nickname as a toddler, when a nanny said she was as “purty as a lady bird.”) The president recognized her political acumen. Not everyone did—or does. When Robert Schenkkan’s play All the Way, about the fight for passage of the 1964 Civil Rights Act, appeared on Broadway, some friends and advisers said that Lady Bird Johnson was not given enough credit. The screen version, which appeared last month on HBO to much praise, recasts her as a more important figure in her husband’s administration.
But I don’t think it went nearly far enough. Her influence, like that of many first ladies, is still not fully understood and is often underestimated. She was wise to keep it that way while she was in the White House—as the example of more publicized first ladies perhaps shows. Now, she deserves more credit.
Lady Bird Johnson was a political adviser, moral compass, and informal therapist for her husband, who was, according to Lyndon Johnson’s adviser Joe Califano, essentially a manic-depressive. “She helped him when he was down,” he told me while I was researching my book about first ladies. “She leveled it out for him.” Larry Temple, who served as special counsel to President Johnson, said “there was nobody closer during my time to LBJ than Lady Bird Johnson. Absolutely no one whose advice, whose counsel, whose judgment he sought and took more than Lady Bird Johnson.” When the first lady occasionally left the White House, Temple knew to tread carefully. “If she were gone,” he remembered, the president was “like a caged animal.”
Lady Bird Johnson came into the White House in mourning after President Kennedy’s assassination, unlike most first ladies who are celebrated with inaugural balls. But she wasted no time once she moved in. The Highway Beautification Act of 1965, which cleaned up the nation’s highways and limited billboards, was her signature issue as first lady. But her job as a trusted adviser to her husband gave her influence on many other topics throughout LBJ’s presidency. For example, she helped inform her husband’s decision to push through Congress the historic Civil Rights Act, which overturned Jim Crow segregation laws. She knew that action needed to be taken after witnessing firsthand the humiliation of her family’s cook, Zephyr Wright, when they drove together from the Johnsons’ Texas ranch to Washington. Hotel managers in the South refused to offer Wright a room because she was African-American.
The first lady was furious at such discrimination. But she also knew the South well, as she grew up in a small East Texas town. During the presidential election campaign, she helped her husband to victory when she traveled 1,628 miles across eight southern states on her “Lady Bird Special.” She rallied fellow southerners, some of whom resented her husband for forcing them to change their way of life with his civil rights legislation. She made 47 speeches on the whistle-stop train trip and bravely stood up to hecklers with signs that read, “Black Bird, go home!”
When she wasn’t campaigning, Lady Bird Johnson wielded power quietly. Though she was a trailblazer—the first wife of a U.S. president to have her own press secretary and the first to campaign without her husband—she did not make her influence widely known. She was in the White House from 1963 to 1969, before many tenets of feminism were widely accepted, and she was expected to focus on being a wife and mother. If this meant that she did not get the praise she deserved, she also avoided much of the criticism heaped on other first ladies who came after her.
The most criticized first ladies were Nancy Reagan and Hillary Clinton. Much has been made of Reagan’s covert power: She famously instigated the dismissal of her husband’s chief of staff, Don Regan, and persuaded President Reagan to appoint more moderate Republicans as advisers. Men in the West Wing called her “Evita” (after Argentina’s powerful first lady Eva Perón) and “The Missus” behind her back. She became a lightning rod for her husband’s administration and had to shoulder the burden of criticism.
So did Hillary Clinton, who was equally unapologetic about her influence in her husband’s administration. (Clinton is the only first lady to have run for public office, making her second bid for the presidency this year.) Many voters were aghast when Bill Clinton named his wife to head up his ambitious health care reform plan. She also took up an office in the West Wing—a controversial decision that she later told Laura Bush she regretted making.
Lady Bird Johnson, by contrast, worked out of a small blue sitting room overlooking the rose garden in the White House’s second floor. She used her influence surreptitiously but effectively. Mornings, when the Johnsons breakfasted together in the bedroom, President Johnson would listen intently. “He felt that she had no alternative agenda except his best interest and she would tell him what he needed to hear whether he wanted to hear it or not,” the Johnsons’ daughter, Luci, told me. She laughed and explained that her mother was “that one person who’s going to tell him if there’s spinach in his teeth so he has a chance to get to a mirror and get it out.”
He even asked her to grade his speeches. In a phone call after a news conference on March 7, 1964, Lady Bird Johnson asked her husband, “You want to listen for about one minute to my critique, or would you rather wait until tonight?” “Yes, ma’am,” he replied. “I’m willing now.” Her major takeaway: He needed to speak more slowly and stop looking down at his notes so often. “I’d say it was a good B-plus,” she said. In 1968, right before LBJ shocked the nation in a live, nationally televised address when he said he would not be seeking another term, it was Lady Bird Johnson who walked into the Oval Office with a note. “Remember—,” it read, “Pacing and drama.”
It was also Lady Bird Johnson who, in 1964, insisted on releasing a statement in support of their close friend and top political adviser, Walter Jenkins, who was arrested on what was then called a “homosexual morals” charge in a YMCA men’s room a few blocks from the White House. Lyndon Johnson wavered, suggesting they keep quiet for political reasons. But Lady Bird Johnson would not abandon their friend in his hour of need. “If we don’t express some support to him,” she said, “I think that we will lose the entire love and devotion of all the people who have been with us.”
After the Johnsons retired to their Texas ranch in 1969, LBJ lived only four more years, dying of a heart attack in 1973 at age 64. Lady Bird Johnson outlived her husband by almost thirty-five years, but they were fulfilling ones for her. She continued her work on environmental causes in Texas, founding the National Wildflower Research Center. She planned her husband’s library and could often be found working in her office there. And she became the grande dame of former first ladies, calling her successors to check in on them during difficult times in the White House. Rosalynn Carter told me that during the Iran hostage crisis, “Lady Bird Johnson often reached out with concern.”
No one understood better how tricky a position the office of first lady could be. Her example shows that Americans seem to want their first ladies to be seen and not heard. Johnson knew this instinctively, and she was able to stay above the fray in a way that Reagan and Clinton were not. But that didn’t mean Johnson wasn’t powerful. Though it operated in the shadows, her influence was real and lasting.
On April 27, 1945, days before Adolf Hitler committed suicide in his Berlin bunker, an enterprising writer convinced a young Army sergeant to commandeer a jeep and drive into the heart of the embattled city, without an adequate map or any real plan for what might come next.
Virginia Irwin, a reporter for the St. Louis Post-Dispatch, would be one of the first Americans to witness Russian fighters clashing with the remnants of Nazi forces. Irwin’s nerve-wracking journey netted her the scoop of her bold wartime career, but she has since been largely overlooked among pioneering female combat correspondents. No American correspondent had been inside the city in years – foreign reporters had been kicked out in 1941. Irwin provided an unparalleled firsthand account to readers across the nation.
As they wound their way through lines of haggard Russian troops headed for Berlin, a surreal scene awaited Irwin and her traveling companions, journalist Andrew Tully of the Boston Traveler and the driver, Sergeant Johnny Wilson. They saw exhausted soldiers singing and celebrating as they advanced into the final battle. Despite the chaos – bodies littered the sidewalks amid ongoing fighting – the mood encompassed both merciless vengeance and jubilant relief. “The Russians were happy – with an almost indescribably wild joy,” she recalled. “They were in Berlin. In this German capital lies their true revenge for Leningrad and Stalingrad, for Sevastopol and Moscow.”
The arrival of Russian forces in Berlin signaled the proverbial nail in the coffin for Hitler’s regime as Allied forces progressed irreversibly toward the German capital. The specter of the Russians’ arrival inspired fear in residents who had hunkered down to ride out the final, futile months. When Irwin arrived, the city was still under a barrage of artillery and the site of street-by-street combat. She and her companions had no protection whatsoever for their opportunistic push into Berlin, risking safety in their quest for the first reporting out of Hitler’s Berlin.
That night, navigating into the city without proper maps and no fixed destination, they stumbled across a Russian command post where they were welcomed by a surprised but raucously hospitable group of Russian officers. Irwin’s descriptions were of a dreamlike blend of death and dancing – they were fêted by their hosts as fighting raged blocks away, shaking the ground and filling the air with the smell of “cordite and the dead.” She danced until she was “puffing from the exertion.” Toasts were raised to Stalin, Churchill, Roosevelt and Truman.
She felt a degree of disdain for the German civilians she encountered, but was so taken by her Soviet hosts – who “fight like mad and play with a sort of barbaric abandon” – that in the emotion and gravity of the moment she declared a desire to “join the Russian Army and try to help take Berlin.”

Post-Dispatch reporter Virginia Irwin and Army Sgt. Johnny Wilson in Berlin, April 27-28, 1945, while the Russians were advancing upon the last German defenders in the bomb-wrecked city. She got there four days before Adolf Hitler killed himself. (St Louis Post-Dispatch / Polaris)
Irwin typed this account by candlelight as it happened, but it wasn’t until more than a week later, after V-E Day was declared, that readers across the country would be captivated by this glimpse into the last chapter of the long and bloody fight for Europe. There had been a steady stream of stories about hometown soldiers fighting in Europe, but Irwin’s series showed readers the war from another perspective. For the Russians she encountered, this was not a distant war – it was one in which they had lost loved ones at home. That deeply felt sense of vengeance, and the corresponding fear among Germans remaining in Berlin, was palpable. “You get a real sense of a city on the brink with everything falling apart from the way she writes about it – you get a sense of what she felt,” says Jenny Cousins, who spearheaded an archival project at the American Air Museum in Britain that included Irwin. “It’s a very visceral account, and obviously that’s the first. People haven’t been in Berlin in years other than POWs. There’s no one else who’s got this experience. She was there before Hitler’s death.”
The Associated Press wire service realized the magnitude of her scoop, and soon picked up her story, with newspapers from around the country running the series in full. An editor from The Seattle Times sent the Post-Dispatch a congratulatory note, calling her “journalistic glory undimmed by the shabby treatment accorded by the Army censors.” Even in its belated form, it impressed everyday readers and journalism professionals alike.
Irwin was born in 1908 in Quincy, Illinois, where her father worked as a salesman. The oldest of three children, she was close to her family but as a young adult would experience two tragedies in close succession. Her father, Clare Irwin, succumbed to lung issues resulting from fighting in World War I and her teenage younger brother Grant drowned in the Mississippi River in 1928. Irwin was a standout student, earning acceptance into nearby Lindenwood College before entering the workforce. A brief marriage ended in divorce. When she embarked on her overseas reporting career in her mid-30s, she was older than many women who worked in Europe.
Opportunities for women in journalism were largely limited to select formulae of lifestyle-oriented stories. After joining the Post-Dispatch as a file clerk in 1932 at the age of 24, Irwin was promoted to food editor, for no known reason other than her gender (she never liked to cook and found the promotion insulting). Days after Pearl Harbor thrust America into a global war, a feature on holiday shopping dubbed “Battle of the Bundles” ran under her byline.
But she was itching to get to the action – even though the Post-Dispatch had no interest in sending her. Overall, fewer than 130 American women held credentials, but most were removed from combat zones and none filed for the Post-Dispatch. “It was really frowned upon that they go to the front lines,” says Marilyn Greenwald, professor of journalism at Ohio University. “There were a lot of hurdles just getting there,” to say nothing of the challenges thereafter. Irwin’s wanderlust did not convince her employer – so she found another avenue to get to Europe.
“She had to join the Red Cross to get there,” says her niece Mosey Hoffmeister. “They wouldn’t send a woman over, [but] she was determined.” Irwin had taken a formal leave of absence from the Post-Dispatch for her new job, but soon began filing with her editors anyway. She called watching the wounded arriving from the beaches of Normandy “my first taste of the horrors of war.”
Irwin finally became a credentialed correspondent for the Post-Dispatch and soon linked up with units from the Third Army. She sent back vivid, first-person narratives of her experiences, emphasizing the human element – from the mundane challenges of cold feet in winter and the no-frills food options to the danger constantly threatening the lives of relatable Joes from the St. Louis area.

Virginia Irwin with American airmen in England. Soldiers called her "mom," and one of the conversation-starters she employed was to encourage the boys to "go home for five minutes" and talk about what their families and friends were doing back in the states. (St Louis Post-Dispatch / Polaris)
Irwin shared in that danger – during one tour of an observation post, she had to take cover behind a chimney while “under Jerry fire.” (Germans were often referred to derogatorily as “Jerrys” and “krauts” in newspaper coverage.) Despite the terror she felt at the time, Irwin was quick to point out that she could now claim, “with the best of the men correspondents, that I’ve been to the front lines.” The repeated exposure to such dangers seemed only to embolden her in the months before Berlin.
But her intrepid journey into the German capital did not endear her to her U.S. Army minders. At the time, the War Department oversaw correspondents in the theater. Like other correspondents, Irwin was required to wear a uniform. There was also a more practical matter – lacking the technology to send their writing back across the Atlantic, they relied on Army resources to send back their dispatches. For days, Irwin’s Army censors refused to transmit her writing back to the States. They also pulled her credentials, rendering her unable to continue reporting. After outspoken but fruitless protests she departed for home, furious and exasperated. In a sidebar story that ran May 10, next to her third installment, Irwin called the whole episode “the greatest exhibition of bungling I ever saw in my life.”
Irwin returned home an instant local celebrity, receiving a slew of honors and recounting her experience in Berlin in luncheons and interviews. Letters from readers expressed pride in her accomplishment (and in the case of one admiring local man, more than once). Her editor, Joseph Pulitzer II, was so happy with her work that he gave her a year’s salary – the bonus announcement tacked up on the newsroom’s bulletin board for all to see.
Despite the accolades, the Post-Dispatch newsroom was still staffed entirely by men. Members of the small club of female combat correspondents could not necessarily expect to parlay these proud moments into sustained gains in journalism. “It was a long time before women really were respected the way men were, and in their numbers the way men were covering news,” says Greenwald. Women like Irwin had advanced the ball, but the playing field would be slow to change.
Within a year Irwin made a decision that was perhaps pragmatic considering the prevailing post-war landscape: she moved to New York to write feature stories from the Post-Dispatch’s bureau, a position of relative autonomy which she enjoyed for the next 14 years. There she had the freedom to write features on the arts, politics, and personal profiles. “I think when she came back, if she had stayed in St. Louis, she would probably have not stayed in [journalism], because she would have felt too stifled,” says Hoffmeister. “She was lucky that she got the experience.”
When she moved back to St. Louis from New York in 1960, Irwin would be assigned to write “Martha Carr,” an advice column spanning topics from neighborhood spats to marital problems, which she loathed. She soon retired, but her sense of independence was undimmed in her later years. She settled on a rural Missouri farm near family, a quieter life punctuated by adventurous trips down the Amazon River and in far-flung locales. She didn’t write or publish about her travels after retirement. She considered writing a memoir, From D-Day to Bidet, but other than some notes left in her sister’s possession did not do so.
The excitement and camaraderie she experienced in Europe would leave a lasting mark. Writing from France in December of 1944, Irwin had predicted that in retirement her prevailing “memories will be of war…huddled over an old pot-bellied stove and fanning the breeze with the lads who are doing the fighting.”
The Mars rover Curiosity is power-hungry, narcoleptic and solitary—but that’s just what it takes to explore the solar system like a rock star. Today the rover is a media darling. Like any human celebrity, Curiosity takes frequent selfies, has a music video and a parody Twitter account and has been immortalized as a LEGO figurine. The famous robot even has a troubled past.
Back in 2008, Curiosity—technically called the Mars Science Laboratory, or MSL—was being heavily derided for getting behind schedule and going over budget. The mission was originally pitched to NASA as a $1.6-billion spacecraft, and it was supposed to launch in 2009. But a variety of technical hurdles caused the launch schedule to slip to 2011, and costs ballooned to $2.5 billion. According to Rob Manning, the mission’s chief engineer, young Curiosity’s troubles can be traced back to its most celebrated feature: the sky crane landing system.
The sky crane was like a jetpack that lowered the rover to the Martian surface on tethers. It was only one part in a phase of the mission called entry, descent and landing (EDL). To the engineers at NASA, the EDL phase was also called the seven minutes of terror, because once it started, everything was automated and there was nothing for the team to do but eat peanuts and cross their fingers.
The sky crane was a completely novel way to land spacecraft on Mars, developed to accommodate the one-ton Curiosity rover. Because it was so new, and because landing on Mars is always a challenge, designing and troubleshooting the EDL systems became a huge part of the overall mission design, one that overshadowed the rest of the rover’s needs, Manning says in his new book Mars Rover Curiosity, published by Smithsonian Books.
“I think that MSL's bright and shiny new EDL system … did in fact distract all of us a bit away from the fundamentals of building a brand new and radically different rover,” he says. Together with best-selling author William L. Simon, Manning recounts Curiosity’s highs and lows in the book, offering a peek inside the minds of the NASA and private sector workers who had to struggle to send this now world-famous mission to Mars.
For instance, the focus on the sky crane and other EDL gear meant that the team spent less time considering Curiosity’s power source. The previous two rovers, the twins Spirit and Opportunity, were solar powered. The catch was that the arrays could generate about 110 watts, but each rover needed 1500 watts to be fully operational. According to Manning, the solution was to make the rovers narcoleptic—they would be awake for only a few hours of each Martian day, drawing power from an on-board battery to drive or run experiments. Then they would take a nap and wake up again to do more work. “A day in the life of a rover is a bit more like an old dog than a racecar,” Manning writes.
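The arithmetic behind that napping schedule is easy to sketch. The back-of-envelope estimate below uses only the two wattage figures quoted above; the 24.6-hour length of a Martian sol and the assumption of a lossless battery are simplifications of mine, not figures from Manning's book.

```python
# Rough duty-cycle estimate for a solar-powered Mars rover,
# using the article's figures: arrays generate ~110 W, but
# full operation draws ~1500 W.

SOLAR_OUTPUT_W = 110     # continuous array output (article figure)
OPERATING_DRAW_W = 1500  # draw when fully awake (article figure)
SOL_HOURS = 24.6         # length of a Martian day in Earth hours

# Energy banked into the battery over one sol (ignoring losses):
banked_wh = SOLAR_OUTPUT_W * SOL_HOURS

# Hours of full operation that banked energy can sustain:
awake_hours = banked_wh / OPERATING_DRAW_W
print(f"~{awake_hours:.1f} hours of full operation per sol")
```

The result, a little under two hours of full-power work per sol, matches the "awake for only a few hours of each Martian day" schedule Manning describes.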
While Curiosity was outfitted with a nuclear power source instead of solar panels, it was also a much larger machine carrying 11 complex science instruments and cameras. In addition to power for general operations, those instruments would need to be heated to work properly on frigid Mars. About a year before the 2009 launch date, as details about some of the science instruments came in, the team realized that even with power naps, Curiosity’s battery was too small for the task. Using a bigger battery without finding other places to trim would make the rover too heavy to land.
Troubles piled on from there, including worries about wind blowing away rock samples before they could be analyzed, and signs that detaching the rover from the sky crane’s tethers would short-circuit a vital communications link during landing. Delays in delivering finished hardware for spacecraft assembly meant that NASA had to make the call and announce that it would miss the 2009 launch window.
“Once your rover misses that window … the cost automatically goes up, and that is just for the ‘taxi meter’ of the team having to wait longer to get off the job,” says Manning. The silver lining was that the extra time allowed the team to work out the kinks—fix the circuits, work in a bigger battery—and launch successfully on November 26, 2011.
Image by NASA/JPL-Caltech. A JPL engineer checks out the robotic arm movements on a test version of the Curiosity rover.
Image by NASA/JPL-Caltech. A graphic shows the multiple steps Curiosity had to make to land safely on Mars.
Image by NASA/JPL-Caltech. JPL engineers celebrate moments after confirming that Curiosity landed safely on Mars.
Image by NASA/JPL-Caltech/MSSS. A rocky outcrop reveals rounded stones being weathered out of sedimentary rock, a sign that this part of Mars once featured a flowing stream.
Image by NASA/JPL-Caltech/MSSS. The first sample of powdered rock was delivered to the rover's onboard chemistry lab in February 2013.
Image by NASA/JPL-Caltech/Ames. The Chemistry and Mineralogy (CheMin) experiment on the Curiosity rover took its first x-ray of a soil sample in October 2012. The results showed chemical signatures of minerals that suggest Martian dirt is very similar to volcanic soils in Hawaii.
Image by NASA/JPL-Caltech/MSSS. Curiosity took a picture of its left-front wheel in November 2013, revealing scrapes, dents and even punctures due to rolling across sharp rocks.
Image by NASA/JPL-Caltech/MSSS. Curiosity at last drilled into the base of Mount Sharp in September to collect samples for analysis.
Since its celebrated landing in August 2012, Curiosity has been sending back vast amounts of data, from high-resolution images of Mars and its moons to the first clear signs that drinkable water capable of supporting life once flowed on the planet’s surface. Two years into the mission, the rover has now reached its main target, the base of a Martian mountain nicknamed Mount Sharp. Layers of exposed sediment could tell scientists more about Mars’s seemingly habitable past, and may even hold preserved traces of primitive life.
“We were all absolutely flabbergasted when the very first drill hole revealed a place on Mars that was habitable billions of years ago,” says Manning. “What we’ve got here is a place that not only could have supported life, but it could, if we keep looking, be a place that chemically stored those records. That’s what led us to put high priority on heading for the hill.”
The road trip hasn’t been without its snags, chief among them the unexpected wear on Curiosity’s wheels. When the wheels were being designed, the main worry was that an overly heavy rover would get stuck in the sand—a fate that spelled the end for the rover Spirit in 2010. So the team made Curiosity’s six large wheels to act like flotation devices, says Manning. Each wide, dune-buggy-esque wheel was hollowed from a block of lightweight aluminum.
What the team didn’t know is that the rover would have to drive over wind-sculpted rocks embedded in clay, which acts like a bed of nails. Those razor-sharp rocks started to tear up the wheels, and Manning anticipates a metal shard might one day rip into the rover’s internal cabling, crippling the mission. Until then, “we need to pick our path carefully,” says Manning. “We are also considering software changes that would minimize the damage by ensuring that the wheels speed up a tad as the wheel climbs over a rock. This reduces the wear.”
The glitch shows how each Mars mission can build on the capabilities of the next, a process Manning highlights in the book as he describes lessons learned from spacecraft going back to the Viking landers in the 1970s. He is already putting some of Curiosity’s experiences to good use in designs for the next Mars rover, slated to launch in 2020, and in a system for landing people on Mars with an inflatable disk and a next-generation parachute.
Manning adds that Curiosity and its Martian kin are allowing engineers to develop technologies, such as autonomous driving software, that will probably be crucial to future rovers headed for even more remote locales, such as the icy moons of Jupiter and Saturn. “Going to the outer planets, or to moons like Europa, Ganymede and Enceladus—in all cases you need a vehicle that has the smarts for autonomy,” says Rob Manning, currently the Mars Engineering Manager for NASA’s Jet Propulsion Laboratory. “We’re not joysticking it like a remote-controlled car. We’re telling it where we’d like it to go, and its job is to figure out how to get there.”
But more than the technical revelations, Manning believes the story of Curiosity is important for humanity on a much more basic, almost existential level. “I think the message is that even though MSL was a big budget NASA mission (at least big by today's standards), it is not built by abstract engineers and scientists working in faceless institutions,” Manning says. “Instead it is built by a bunch of people. People just as human, just as fallible and just as intelligent as most people you know. … This is ultimately a human endeavor and we are lucky to be part of it.”
On a Sunday morning on the Nicaraguan island of Rama Cay, Becky McCray visits with her family in her parents’ home over a breakfast of beans, coconut rice, coconut bread, and thick coffee, with the grounds still swimming in the bottom of the cup. The food was prepared over an open fire in a wall-less kitchen building; the aroma of coffee mingles with the wood smoke and the salty sea breeze.
Like other traditional homes built by the Rama, Nicaragua’s smallest indigenous group, McCray’s parents’ wooden home sits on stilts. The planks of the floor and walls are fitted together loosely, so you can see chickens scratching underneath from inside. The roof is made of thatched palm leaves and the windows are square holes, with solid wood shutters to close out violent evening winds.
Ten of McCray’s 11 adult siblings still live on Rama Cay, a 22-hectare island that rises from the water like a set of oversized goggles about a kilometer and a half off Nicaragua’s Caribbean coast. The island is home to roughly half of the Rama’s 2,000 or so community members; McCray and another sister traveled from Bluefields, the closest city, 20 minutes by motorboat up the coast. Some of their children, aged two through 11, race through the house. The family members joke with one another in Rama English (also known as Rama Cay Kriol), the native language for most members of the Rama community. This English creole is incomprehensible to speakers of standard English.
One brother talks about his upcoming fishing trip—he’ll fish from a traditional wooden dory on the open ocean and sell his catch on the mainland. Fishing is his primary source of income, as is common for Rama men. Elsewhere on the island, both men and women are preparing their canoes for a trip inland to plant corn, beans, and breadfruit in their farmland.
Unlike most Rama, Becky McCray has a college degree and speaks fluent Spanish. In between laughing with her siblings and nephews, she discusses her work as a legal defender for indigenous communities in Nicaragua’s Caribbean region. Recently, most of her personal and professional energy has been focused on protecting the Rama’s territory from being bisected by an interoceanic canal.
“Where they are going to put the canal is where our people go to fish. They survive by that,” she says.
The Rama’s territory, along Nicaragua’s Caribbean coast, stretches roughly from the Costa Rican border north to just south of Bluefields. Their territory is shared with the Kriols, descendants of Africans who adopted the Rama way of life centuries ago. The Rama-Kriols hold a communal title not only to the nine settlements where community members live, but also to the 4,843-square-kilometer territory where they fish, hunt, and farm. If current construction plans for the canal go ahead, that territory will be severed in two.
The massive Nicaragua Canal planned by a secretive Chinese billionaire, Wang Jing, and managed by his company, the Hong Kong Nicaragua Development Group (HKND), will stretch from the Pacific coast, across Lake Nicaragua, to the Caribbean coast and is destined to wipe at least one Rama village off the map. It will also make travel between the northern and southern parts of the territory impossible, at least as the Rama travel now, in small motorboats and wooden canoes. The Rama’s fishing grounds will no longer be safe in the path of 400-meter-long megaships approaching the canal. Rama farming techniques involve elaborate field rotation and substantial travel to reach the fields; the canal will both reduce the available farmland and render much of it inaccessible.
Although the Rama community is among the least powerful groups in Nicaragua, an international court case currently underway gives them and other canal opponents a glimmer of hope.

The proposed route of the Nicaragua Canal cuts across the country and bisects Rama-Kriol territory. (Mark Garrison)
Nowhere is concern about the canal more acute than the village of Bangkukuk Taik, about two to three hours south of Rama Cay by motorboat over the open ocean. The isolated village is home to about 140 people, including 15 or so who still speak Rama, an indigenous language in the Chibchan family related to languages spoken as far south as Colombia. Bangkukuk Taik is among the most isolated of the nine villages in the Rama-Kriol territory and is the only place where there are regular classes in Rama for children. The Rama in Bangkukuk Taik have the deepest knowledge of traditional farming, hunting, and medicine, like how to hunt deer at night and how to collect iibu seeds and use the oil as a cough and headache medicine.
Under the current canal route, Bangkukuk Taik will become the canal’s Caribbean-side deep-water port and will be called Punta de Águila. (Bangkukuk Taik means “Eagle Point” in Rama; Punta de Águila has the same meaning in Spanish.) The wooden houses on stilts will—critics assume, based on the proposed port location—be destroyed and replaced by high-rises and port infrastructure. It’s hard to imagine people used to walking barefoot and hunting and fishing for their livelihood fitting into the slick, modern city represented in mock-ups of what the finished Punta de Águila will look like. The current residents of Bangkukuk Taik will be forced to move.
McCray has been trying to prevent that from happening for more than two years. The day before the canal concession law was adopted by the National Assembly, in June 2013, she and four other members of the Rama-Kriol Territorial Government traveled from Bluefields to the capital, Managua. They hoped to testify against the law they feared would destroy the traditional way of life in the Rama territory.
Just as their bus to Managua was preparing to depart, three police officers boarded and demanded McCray and her companions gather their belongings and disembark. McCray insisted on seeing the police officers’ identification. They refused. After a tense 10-minute standoff, the group was allowed to go. The following day, McCray and her companions watched in dismay as the law was adopted. “We didn’t get a chance to say anything,” McCray remembers. “They didn’t respect us, they didn’t give us a chance to defend what we were claiming.”
Nicaraguan human rights lawyer Maria Luisa Acosta is McCray’s primary source of legal support and has represented the Rama in all of their legal challenges related to territory since the late 1990s. Acosta filed a legal challenge to the canal concession law on July 1, 2013, just weeks after it was approved. Like the 31 other legal challenges to the law—based on environmental factors, human rights, and national sovereignty—the Rama’s legal case was dismissed. The Supreme Court said the lawsuits were invalid because the law passed the National Assembly with a wide majority and because the major development project took precedence. (Acosta and other canal opponents think the challenges failed because Nicaragua’s Supreme Court is controlled by the ruling Sandinistas.)
According to both international and Nicaraguan law, indigenous people must give their “free, informed, and prior consent” to any project that will affect the community’s territory or way of life. According to Manuel Coronel Kautz, the president of Nicaragua’s Canal Authority, the National Assembly had documents from the Rama-Kriol government giving permission for the canal to be constructed prior to the vote that granted the concession—though he has not been able to produce those documents. Telemaco Talavera, the spokesperson for the Canal Commission, has similarly stated to the Nicaraguan press that the Canal Commission has all the necessary permission from the Rama-Kriol to carry out studies and other actions on their territory.
The Rama-Kriol government disagrees. In a press release just after Talavera’s announcement, it clarified that it had provided permission solely for environmental and social-impact studies. The first permit was granted in November 2013—several months after the concession was signed into law. The Rama-Kriol government claims that it yielded to pressure from the national government and only granted the permit after environmental consultants contracted by HKND and escorted by the military entered Rama territory, causing alarm within the communities.

Becky McCray is among the canal opponents currently fighting for indigenous rights in Nicaragua. (Emily Liedel)
Citing the government’s failure to obtain free, informed, and prior consent to use Rama-Kriol lands as part of the canal construction before passing the concession law, Acosta filed a complaint with the Inter-American Commission on Human Rights (IACHR) in June 2014. The following December, she asked the IACHR for precautionary measures, which would prevent work from proceeding on the canal until the Rama had been properly consulted. The IACHR is a part of the Organization of American States and hears complaints about human rights abuses from around the Americas.
In March, Acosta, McCray, and five other canal opponents traveled to Washington, DC, for the IACHR hearing. McCray represented the six indigenous groups whose territory is affected by the canal route; the others spoke about canal-related environmental impacts, police repression of protesters, and other human-rights violations. McCray was nervous as she read her remarks in Spanish. She cited three articles in the concession law that explicitly give the Canal Commission the right to expropriate indigenous land, and then she accused the government of violating international norms in the way it conducted community consultations, perhaps most blatantly by paying villagers—many of whom are illiterate—to come to the meetings. (Those villagers, Acosta claims, were then pressured into signing documents that they could not understand.)
Thomas Antkowiak, a law professor at Seattle University and a specialist in the Inter-American human-rights system, believes the Rama’s case against the canal is, under international and even Nicaraguan law, ironclad. But that doesn’t mean the IACHR will halt canal construction, which officially began in December 2014 on the Pacific coast, or order that the concession law be changed or overturned. Like other international organizations, the IACHR depends on its member states. In lower-profile cases, Antkowiak says, member states usually abide by the commission’s decisions. However, when international law conflicts with a high-profile project, it’s more complicated.
In the case of Belo Monte, a major hydroelectric dam in Brazil’s Amazon, indigenous leaders filed a complaint in front of the IACHR in 2010, and in 2011 the commission found in their favor, ordering the Brazilian government to stop all construction on the dam until the indigenous communities had been properly consulted. The Brazilian government announced that it would ignore the ruling and subsequently broke off its relationship with both the commission and the Organization of American States. The IACHR then backtracked, saying in a statement that the indigenous leaders’ complaints were not really about the lack of consultation but about whether or not the dam should be constructed at all. The commission removed its requirement that the government consult with the indigenous groups.
In the Nicaragua Canal case, the IACHR issued a summary of the March proceedings in late June, which included a confirmation that the commission had asked the Nicaraguan government for proof that they adequately consulted with the Rama and studied the environmental impacts. In Acosta’s view, this is a step in the right direction. “It’s the first time someone is demanding that the government provide information,” she says. “None of the [other] international organizations or regulators have done so yet.”
The deadline for Nicaragua to respond to the request is confidential and is released neither to the press nor to the petitioners. As of publication, neither the Nicaraguan representatives nor the IACHR will comment on where the case stands. When it’s issued, the actual reply from the Nicaraguan government—which the IACHR will base its recommendations on—will also be confidential. If the government fails to respond or ignores the recommendations, the commission can recommend that the case proceed to the Inter-American Court of Human Rights, based in San José, Costa Rica. The court’s rulings are legally binding for the 25 states that have accepted its jurisdiction—which includes Nicaragua.
Although the concession agreement with HKND doesn’t make any special references to indigenous territories, Kautz, the president of Nicaragua’s Canal Authority, insists that indigenous peoples will be treated differently than regular landowners. Aside from the Rama, whose territory will likely be the most impacted, at least four other indigenous groups will face disruption if the canal proceeds. Nicaraguan law explicitly bars indigenous land from being bought or sold; that means the land will be rented, not expropriated, says Kautz. Yet, critics say that because this is not expressly stated in the concession law, the land is vulnerable to seizure.
In fact, Acosta and other opponents say that, as written, the canal concession law gives HKND the right to expropriate land anywhere in the country, regardless of whether or not the canal is built. Acosta worries that the Rama will lose their territory—displaced by golf courses and beach resorts—even if the Nicaragua Canal is never built.
The last time the Rama territory was seriously threatened was in the late 1990s, when the Nicaraguan government planned a dry canal (an overland route for cargo) that would have bisected the community’s territory. Legal challenges against the dry canal were unsuccessful, but it was never built for political and economic reasons.
Maybe the Rama will dodge unwanted development a second time. But it will take a sustained fight from the community and international support. The case at the IACHR is probably the Rama’s best chance for meaningful international intervention, but it remains to be seen whether or not this glimmer of hope is enough to protect their territory and keep their culture alive.
This article originally appeared under the headline "The Rama Versus the Canal."
The Harvard College Observatory is home to over 500,000 glass photographic plates emblazoned with some of the most beautiful phenomena of our universe—star clusters, galaxies, novae, and nebulae. These plates are so scientifically and historically valuable that the Harvard Library is working to digitize them today. In her recent book The Glass Universe: How the Ladies of the Harvard Observatory Took the Measure of the Stars (out December 6), Dava Sobel tells the story behind these plates and the group of women who dedicated their lives to studying and interpreting the mysteries hidden in them.
The process of making Harvard College Observatory the center of stellar photometry and discovery began in 1883, when Edward Pickering, the Observatory’s director, wrote to a woman named Mrs. Anna Palmer Draper. Pickering informed Mrs. Draper of his intent to carry out the work of her late husband Henry Draper—that of photographing the stars and determining their spectral classification. As director, Pickering already had the desire, the resources, and the staff needed to begin such a project. Driven by a deep love for her husband and astronomy, Mrs. Draper agreed to support and fund Pickering’s endeavor.
Central to the project was a group of women known as “computers.” These women spent their days poring over photographic plates of the night sky to determine each star’s brightness and spectral type and to calculate its position. Sobel found in her research that Harvard was the only observatory that predominantly employed women for such positions. Some, like Antonia Maury, niece to Henry and Anna Draper, came to the observatory through family ties; others were intelligent women looking for paid, engaging work. Many entered the Observatory young and dedicated the rest of their lives to astronomical work. Pickering thought women just as capable as men in astronomical observation, and he believed their employment would further justify the need for women’s higher education. When the project began in 1883, Pickering employed six women computers; in only a few short years, as the project expanded and funding increased, the number grew to 14.
Sobel knew when she started research for The Glass Universe that it was going to be all about the women. But approaching her subject matter and the book’s structure still proved a challenge. “It seemed daunting because there were so many women,” Sobel said in an interview with Smithsonian.com. Even after deciding to write the book, she says, “I wasn’t sure at the beginning how to manage them—whether it would be possible to treat them as a group or pick one and focus on the one and treat the others in a subsidiary way.” Knowing that it would not be easy, Sobel says, “I finally convinced myself it had to be the group, and the plates themselves would tie everybody together.”
Of these women, Sobel singles out a select few who shone particularly brightly. Antonia Maury, for instance, developed an early version of the spectral classification system that distinguishes between giant and dwarf stars, and became the first woman to author part of the Annals of the Astronomical Observatory of Harvard College, the Observatory’s annual publication of the year’s stellar classifications. Another “computer,” Williamina Fleming, discovered more than 300 variable stars and several novae and, along with Pickering, updated the classification system to account for variations in a star’s temperature.
Williamina Paton Stevens Fleming began working for the Pickerings as a maid. She later went on to establish a system for classifying stars by their spectra. (Public Domain)
Henrietta Swan Leavitt was the first to find a relationship between the variation in magnitude of a star’s brightness and the star’s period of variation, the fundamental relationship for measuring distance through space. Annie Jump Cannon—in addition to classifying thousands of stars’ spectra—created a unified classification system from Maury’s and Fleming’s systems that more clearly defined the relationships among stellar categories, a system which is still in use today. Cecilia Payne was the first woman to receive a Ph.D. in astronomy at Harvard, and was the first to theorize about the abundance of hydrogen in the composition of stars.
All their discoveries, individually and together, came from hundreds of hours studying the hundreds of thousands of stars captured on the delicate glass plates.
Sobel expertly weaves together the scientific endeavor of mapping the universe with the personal lives of those closest to the century-long project. As in her earlier book Galileo’s Daughter, in which Sobel offers a nuanced look at Galileo’s battle with the church based on the letters of Galileo’s illegitimate daughter Maria Celeste, Sobel relies on correspondence and diaries to give readers a glimpse into the rich inner lives of her main characters. “I wanted to be able to say things that would distinguish the women one from another,” she says. “If you just talk about their work, then they are cardboard figures.” By drawing on records of their lived experience, she makes them come alive.
Not only does Sobel show us what daily life was like for these women, but she also reveals how they felt about the work they did—and each other. In her diary, Fleming expressed both her love for Edward Pickering and her dissatisfaction with the low pay she received for her high-quality work. Cannon once wrote about the pride she felt in being the only woman and authority in a room of men, and her excitement at casting her vote for the first time after the passing of the 19th Amendment. We can delight in the way these women celebrated each other, and then be moved to tears by the loving manner in which they mourned each other upon their deaths.
For Sobel, these personal details are integral to the story as a whole. “It’s not a story without them,” she says. “The characters have to make themselves present.”
Stars appear as black dots in this negative plate of the Small Magellanic Cloud, a satellite galaxy of the Milky Way that can be seen from the Southern Hemisphere. (Courtesy Harvard College Observatory)
It wasn’t just the women computers who sustained the project. Pickering also relied heavily on the work of amateur astronomers. During the 19th century, there was a trend among American and British scientists to try to cultivate a specific image for themselves as professionals. Part of that involved establishing science as a masculine pursuit and also delineating themselves from amateurs. But Pickering had great insight into what amateurs and women could accomplish. Sobel explains Pickering’s inclusiveness: “I think because he had been an amateur astronomer himself, he understood the level of dedication that was possible and the level of expertise.”
Amateurs may rank lower on the professional hierarchy of science, but as Sobel says, “These were people who came to the subject out of pure love and never stinted on time devoted to what they were doing, whether it was building a telescope or making observations or interpreting the observations.” The word “amateur,” after all, derives from the French word for “lover.”
Though Fleming, Cannon, and others shouldered the hands-on work of observation, classification and discovery, the dedicated funding and enduring interest of women donors sustained the expanding work of the Observatory. The money that Mrs. Draper gave the observatory was equal to their entire annual budget. “That changed the fortunes of the observatory so dramatically,” says Sobel. “It increased the reputation of the observatory in the eyes of the world.”
In 1889, six years after Mrs. Draper made her generous donation, Catherine Wolfe Bruce gave another $50,000 toward the construction of the 24-inch astrophotographic telescope called “The Bruce,” which was installed in Arequipa, Peru. For Sobel, “Mrs. Bruce represents the appeal that astronomy has for people. You will meet people all the time who just tell you how they love astronomy ...and she was one of those,” she says. Bruce was integral to expanding the project into the Southern Hemisphere, and as Sobel says, her donation of the telescope named in her honor “made the Henry Draper Memorial super powerful.”
The Glass Universe tells a story of science that is not of individual, isolated genius, but rather an endeavor of collaboration and cooperation, setbacks and celebration. This book also tells a different story about women in science, one which has a long history. “I think people are surprised to learn that women were doing this kind of work at that time,” says Sobel. “It wasn’t developed in a recent administration. It’s just always been there.” Many people might know of the Harvard computers, but few understand the complexity of the work they did or even recognize their work as intellectual and scientific.
“This is something that is so ingrained in women: ‘Well, if a woman was doing it, it probably wasn’t that important,’” Sobel says. In her book, she shows us something else entirely: a story of scientific discovery with women at its fiery center.
Have you ever gotten a postcard from a national park? Chances are the picture that comes to mind—maybe the powerful eruption of Old Faithful spouting up in Yellowstone or the rocky depths of the Grand Canyon—is the same shot that people across the world have seen.
There’s a reason for that. The idea of America’s national parks that's ingrained in the collective consciousness has been shaped through more than 150 years of photographing them, Jamie Allen contends in her new book, Picturing America’s Parks.
You might be surprised by just how important a role photography played in constructing what America thinks of as national parks today. Allen, an associate curator at the George Eastman Museum, weeds through the parks' origins, critically exploring the forces behind those now-iconic visages.
While national parks were created to preserve the country’s natural heritage and allow any person to experience their beauty, few were able to see them in person until the mid-20th century, when improved roads and more accessible travel let tourists stand before the scenes they had known only from images. Early stereographs and photography helped justify the original national parks, but they also shaped how the parks were viewed by the public.
By the 1930s, thanks to the invention of the modern car and the construction of paved roads within the parks, people began to make road trips to the parks en masse. Drawn in by the circulating images of the early photography and art that had already captivated their imaginations, people arrived in droves. Advances in photographic technology made the parks seem even more accessible. The National Park Service used the advent of color postcards to highlight park amenities—not to mention the newly paved roads that wound their way through the established photo spots—as a way to encourage more tourism to help pay for conservation efforts.
In the decades that followed, these cemented images of the parks continued to be recycled and reconstructed through new lenses as people explored and examined the parks' legacy. Today, these same images show up transposed through a modern eye, which questions and personalizes these iconic views once again.
Allen spoke with Smithsonian.com about the motives of conservation and consumerism at work in her book and in her exhibition on national park photography, on view at the George Eastman Museum through October 2.
How did you get the idea to create Picturing America’s Parks?
A couple of years ago we were kicking around ideas for exhibitions [at the George Eastman Museum]. I brought up an idea of doing an exhibit on photography in the American West because I'm from there. Lisa Hostetler, our curator in charge, said, “Hey, the national parks anniversary is coming up. Is there something we could do in tandem with that?” So I looked into it, and we went in that direction.
This is a story that spans more than a century. Where did you start with your research?
I realized it was really about this journey of exploring these spaces in the 19th century, which then leads [to] them becoming tourist spots—and tourism really drives the understanding of what these spaces are. [Then] preservation comes into being and photographers like Ansel Adams and Eliot Porter start to look at how we can promote these spaces through photography and make them known so that people want to preserve them. All of that, of course, is coupled with art photography all along the way.
Conservation has such a through line in this story of photographing the parks. Can you talk about the evolution of conservation photography within the parks?
Our national parks system is all based on this idea of preserving this land so it's not bought up by individuals and changed from natural spaces into spaces we can no longer enjoy. By the time cars roll around, we're really changing these spaces. We're putting fences in them and adding roads in them and preserving them, but also changing them to make them easily accessible to people. [It's] kind of a double-edged sword—in a way we are affecting those spaces, good or bad.
I loved how you showed the way people are talking about the parks today, like the National Park Service’s #findyourpark campaign. How has the conversation today become more inclusive through photography?
I think there is a way of speaking about it that helps people take ownership of it in a different way than they did before. The parks have always been a national pride, but as you encourage people to take individual ownership of the spaces, it helps people connect to them in a different way.
As you traced the history of photographing the parks, were there any photo trends that surprised you?
Places like Yosemite, Yellowstone, the Grand Canyon really were established through photography and art. I add art in there because Thomas Moran made a very famous painting of Yellowstone National Park which helped solidify it becoming a national park. It was hung in Congress and people got to understand the color and space and what that area was. As we put images out into the public, we see them proliferate. They get repeated over and over again. Those become the established views that we see. That really shapes the way that we understand these spaces.
There are far fewer images of [newer] spaces [like Pinnacles National Park]. Ansel Adams made images, but they are not as well-known because that park is much newer, so I think as we establish these spaces and set them aside, that's when we see these images come into our collective consciousness.
Did you notice one particular photographic technology that changed the perception of the parks the most?
Photography changed the parks in general, but I do think color really impacted the way that people understood these landscapes. You can see a black and white photograph and understand that the landscape is significant, but if you look at someplace like Yellowstone or the Grand Canyon in color, it really changes your vantage of what that space looks like if you've never been there. You don't understand the peaches and the blues and the greens and the yellows and the pinks that come out of that landscape.
For a long while, I had only looked at pictures of Yellowstone in black and white or albumen, and then I saw one of the hot springs and it blew my mind. I hadn't really thought about what that space would look like in color and what it would be like to stand there. It really transforms how your brain can understand the space. It's not like I've never seen these photographs before, but after sifting through so many photographs, it really made an impact on me to see this thing come alive in a totally different way than I'd expected.
How does what’s happening on Instagram and social media today feed into or change the way the parks are seen?
It's interesting to see people try to place themselves in those scenes, and what they're doing mimics what's always been done. There's a picture of a gentleman standing in the archway at Yosemite in the tunnel, and when you look through the book you see from the moment that tunnel was created that becomes the vantage that people want to take. There’s something ingrained in our consciousness that makes us approach these things in the same way over and over again.
Photographer unknown, Yosemite Valley from tunnel view, ca. 1940 (Image courtesy of George Eastman Museum, museum accession)
Coming out of this project, how has your perception of the National Parks changed?
It’s something I still grapple with. In the beginning, I thought setting aside natural spaces was the way to preserve them, but now that I've learned more about how they were set aside and understand the changes that had to be made to those spaces, there's definitely that question—have we done well by populating these landscapes and then setting them aside? We affect everything in those spaces, [for example] the bears that live there – letting them understand what human food is, and making them want to come be part of our campsites. [Then we have to] drive them away from our campsites because it's not good for them to be near us. We put roads through the parks. We've changed water structures of certain areas by putting holes through mountains in order to create tunnels and roads.
After doing all this work, is there a particular park that you now want to visit the most?
Oh man, all of them. I was only able to represent 23 of the 59 parks in the exhibit, so it's really amazing to think about these spaces that we've set aside. Yellowstone and Yosemite both stick out in my mind. I know those are probably two of the most significant spaces. They're the first two that were really set aside. I really want to walk through the landscape and understand what it looks like and see that photographic vantage come into sight. Now that I've seen the photographic vantage so many times, I want to experience El Capitan from other angles.
Would you take that same iconic shot?
I don't know. I'd probably take that shot but I'd also see if there was anything else that wasn't that shot. In one way it's kind of like collecting baseball cards or something—you have to take the shot that you have to, the one that everyone takes, but then you can explore.
Alan Turing, one of the fathers of the computer age, was an extraordinarily clever man. So clever, in fact, that he understood that the term “machine intelligence” was just about meaningless. Better, he reasoned, to talk about what a machine can actually do: Can it talk? Can it hold down a conversation? At least that is something we can attempt to study. Turing eventually proposed what has come to be known as the “Turing test”: If a judge can’t tell which of two hidden entities is a human and which is a machine, the machine has “passed” the test – which is exactly what is said to have happened this past Saturday in London.
“We are… proud to declare that Alan Turing’s test was passed for the first time,” one of the organizers, Kevin Warwick of the University of Reading, said as the results were announced. The winning chatbot goes by the name of “Eugene Goostman,” a computer program that emulates the personality of a 13-year-old Ukrainian boy. “Eugene” managed to convince 33 percent of the judges that it was human at Saturday’s event, held at the Royal Society’s offices in London on the 60th anniversary of Turing’s death. (Turing, a homosexual, was convicted of gross indecency in 1952 and was ordered to undergo hormonal “treatment” as part of a plea agreement. Two years later he died from cyanide poisoning in an apparent suicide.)
But a word of caution is in order. “Intelligence” has always been a slippery subject, and the Turing test in particular has long been fraught with controversy. Turing described how it would work in a 1950 paper titled “Computing Machinery and Intelligence.” He took the idea from a traditional Victorian parlor game, where you try to figure out if the person hidden behind a curtain is a man or a woman, just by asking questions. (The answers to the questions had to be written down, because the voice would be a giveaway.) Here’s how Turing’s version would work: You’d have a judge, sitting in front of two curtains, with no way of knowing what’s behind them. Behind one curtain is a human; behind the other is a computer. The judge can ask questions of either of the two hidden entities. Based on the responses, the judge tries to figure out if the hidden entity is a human or a machine. (Turing envisioned the conversation as being mediated by teletype machines; today, we can use any kind of electronic, text-based interface, like the kind used in Internet chat rooms, or instant messaging.)
Turing speculated that by the year 2000 “an average interrogator will not have more than 70 per cent chance of making the right identification” – that is, computer programs would stymie the judges 30 percent of the time – after five minutes of questioning. The “five minutes” is important. Turing didn’t talk about a time limit as being an inherent part of the test, and one could argue that for a machine to really pass the test, it ought to be able to handle any amount of questioning. Presumably the five-minute criterion was an arbitrary but necessary limit. The year 2000 came and went, with chatbots making only halting progress. (In a more sober moment, responding to a question from a BBC interviewer in 1952, Turing said it would be 100 years before a machine passed the test.)
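Turing’s 30 percent threshold, applied to a finite panel of judges, is also statistically noisy. A quick binomial sketch (the panel size of 30 is an assumption chosen for illustration; the article does not say how many judges took part) shows how a bot that fools each judge somewhat less than 30 percent of the time can still clear the bar in a single sitting:

```python
from math import comb

def prob_clears_bar(p: float, judges: int, bar: float = 0.30) -> float:
    """P(fraction of fooled judges exceeds `bar`), given each judge is
    independently fooled with probability p (a simplifying assumption)."""
    need = int(judges * bar) + 1  # smallest count strictly above the bar
    return sum(comb(judges, k) * p**k * (1 - p)**(judges - k)
               for k in range(need, judges + 1))

# With 30 judges, a bot that fools each judge only 25% of the time
# still tops 30% of the panel in a noticeable share of sittings:
print(f"{prob_clears_bar(0.25, 30):.2f}")
```

The point is not the exact number but that a single short event with a small panel is a weak measurement of Turing’s criterion.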
Back in 2012, I was a judge in a “Turing test marathon,” the largest-ever set of Turing tests conducted at one time; it was held at Bletchley Park, in England, the site of Turing’s vital code-breaking work during the Second World War. (It was organized by the same team that ran Saturday’s event, and an earlier version of Eugene was the winner that time, too.) The set-up for Saturday’s event was the same as in 2012: The judges typed their questions at a computer, then waited for the replies to appear on their screens; the chatbots, along with the “hidden humans,” were in another room, out of sight.
The first thing I became hyper-conscious of is that when you’re a judge in a Turing test, five minutes goes by pretty fast. And the shorter the conversation, the greater the computer’s advantage; the longer the interrogation, the higher the probability that the computer will give itself away. I like to call this the mannequin effect: Have you ever apologized to a department store mannequin, assuming that you had just bumped into a live human being? If the encounter lasts only a fraction of a second, with you facing the other way, you may imagine that you just brushed up against a human. The longer the encounter, the more obvious the mannequin-ness of the mannequin.
It’s the same with chatbots. An exchange of hellos reveals nothing – but the further you get into it, the more problems arise. Chatbots, I found, seem prone to changing the subject for no reason. Often, they can’t answer simple questions. At the risk of sounding vague, they just don’t sound human. In one of my conversations in 2012, I typed in a simple joke – and the entity I was conversing with instantly changed the subject to hamburgers. (Computer scientist Scott Aaronson recently had a similar experience when he chatted with Eugene via the bot’s website. Aaronson asked Eugene how many legs a camel has; it replied, “Something between 2 and 4. Maybe, three? :-)))” Later, when Aaronson asked how many legs an ant has, Eugene coughed up the exact same reply, triple-smiley and all.)
Note also that Eugene doesn’t emulate a native-English-speaking adult; it pretends to be a young and somewhat flippant Ukrainian teen, conversing in reasonably good (but far from perfect) English. As Vladimir Veselov, one of the program’s developers, told Mashable.com: “We spent a lot of time developing a character with a believable personality.” Although Eugene will engage anyone on any topic, his age “makes it perfectly reasonable that he doesn’t know everything.” Eugene doesn’t come right out and announce his age and nationality; but he’ll reveal it if asked – and the end result may be a certain amount of leniency from the judges, especially regarding English grammar and word use. (I’m assuming most of the judges on Saturday were native English speakers, though I don’t know this for certain.) The tables would likely have been turned if Eugene were ever to encounter a native Ukrainian speaker as a judge.
The struggle to build a talking machine highlights just how complex language is. It’s not just a question of talking – you have to talk about something, and what you say has to make sense – and it has to make sense in the context of what the other person has just said. For us, it’s easy; for computers, not so much. And so chatbots rely on an assortment of tricks: Memorizing megabytes of canned responses, or scouring the Internet for dialogue that might approximate the conversation they’re currently in the midst of. In other words, what a machine lacks in intelligence it may be able to make up for in raw computing power. This is why Google or Siri (the iPhone personal assistant) can seem so smart to us: Siri may not have a “mind,” but it has access to such a vast database of information, it can act as though it does. It was the same kind of brute-force approach that allowed IBM’s “Watson” to win at Jeopardy! in 2011.
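To make the “canned responses” trick concrete, here is a deliberately tiny sketch of a keyword-matching chatbot in the ELIZA tradition. Everything in it (the keywords, the phrasings, the hamburger deflection) is invented for illustration; it is not how Eugene Goostman is actually implemented:

```python
import random

# Hypothetical response bank -- real chatbots memorize vastly more.
CANNED = {
    "hello": ["Hi there! How are you today?", "Hello! Nice to meet you."],
    "name":  ["My name is Eugene. What's yours?"],
    "age":   ["I'm thirteen years old."],
}

# When no keyword matches, change the subject -- the behavior judges notice.
DEFLECTIONS = [
    "By the way, do you like hamburgers?",
    "Hmm. Let's talk about something else. Where are you from?",
]

def reply(message: str) -> str:
    """Return a canned response if a keyword matches, else deflect."""
    lowered = message.lower()
    for keyword, responses in CANNED.items():
        if keyword in lowered:
            return random.choice(responses)
    return random.choice(DEFLECTIONS)

print(reply("Hello!"))
print(reply("How many legs does a camel have?"))  # no match -> deflection
```

An exchange of hellos goes smoothly; the second question exposes the trick immediately, which is exactly the mannequin effect described above.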
All of this raises a crucial question: What is it, exactly, that the Turing test is measuring? Some critics have suggested that it rewards trickery rather than intelligence. NYU psychologist Gary Marcus, writing at NewYorker.com, says Eugene succeeds “by executing a series of ‘ploys’ designed to mask the program’s limitations.” Stevan Harnad, a psychologist and computer scientist at the University of Quebec in Montreal, was even more skeptical, telling The Guardian that it was “complete nonsense” to claim that Eugene had passed the Turing test. (To his credit, Turing was well aware of this issue; he called his idea “the imitation game,” and spoke of intelligence only sparingly.) Even more awkwardly, the computer, unlike the human, is compelled to deceive. “The Turing Test is really a test of being a successful liar,” Pat Hayes, a computer scientist at the Institute for Human and Machine Cognition in Pensacola, Florida, told me following the 2012 Turing test marathon. “If you had something that really could pass Turing’s imitation game, it would be a very successful ‘human mimic.’”
And “human” is the other key point: Isn’t it possible that there are other kinds of intelligence in the world, beyond the kind displayed by our species? A truly intelligent machine would have countless practical applications, but why focus on creating more “people”? After all, we have plenty of people already. As the linguist Noam Chomsky has pointed out, when we strive to build a machine that moves underwater, we don’t require it to “swim” – and a submarine is no less of an achievement for its inability to do the backstroke.
Yes, Eugene is impressive, at least in small bursts. And yet, even the best chatbots stumble on questions that a child half Eugene’s pretend-age could handle breezily. Perhaps not surprisingly, most AI researchers spend little time obsessing over the Turing test. Machine intelligence is, in fact, moving forward, and rather swiftly. Voice-to-text translation software, which was fairly pathetic just a few years ago, is rapidly improving, as are language translation programs. Amazon often has a pretty good idea of what you want to buy even before you do. And Google’s self-driving car would have been mere fantasy a decade ago. But conversation, as we keep re-discovering, is really hard, and it is not likely to be the frontier in which AI shines most brightly. For now, if you’re looking for someone to chat with, I recommend a real human.
Dan Falk is a science journalist based in Toronto.
For the epicurean traveler, discovering new landscapes also means discovering new foods. And no doubt, new tasting experiences are one of the highlights of going places. Yet I’m going to suggest something radical but simple—that perhaps we all consider abstaining, at least sometimes, from dishes containing either meat or dairy, even while we’re abroad in new lands with exotic cuisines to explore. Don’t panic at the suggestion—just listen: An abundance of science analyzing the impacts on the earth of livestock farming has concluded that humanity’s appetite for meat and dairy products is having serious environmental consequences. Livestock species contribute directly and indirectly to deforestation, water pollution, air pollution, greenhouse gases, global warming, desertification, erosion and human obesity, and virtually anywhere you go in the world, the damage done by ruminants, pigs and poultry, and those who grow feed crops for them, is visible on the land. Dry and scrubby Greece, once a nation of woodlands, has gone to the goats. In Brazil, forests are falling before the advance of soybean fields, cultivated largely as beef fodder. In New Zealand, the banks of wild streams are frequently found trampled and muddied by grazers.
Other ecological problems associated with raising livestock are less obvious to the eye—like loss of biodiversity. On parts of the Great Plains, cows, and the fields of grain they eat, have replaced pronghorn antelope and bison. Livestock ranchers worldwide have participated heavily in the extermination of wild predators. In California, overuse of river water for agricultural use, including a million acres of water-intensive alfalfa (the state’s highest-acreage crop, used for feeding animals), has contributed to the long-term decline of wild salmon runs. Sixty percent of the state’s alfalfa fields lie in the San Joaquin Valley, ground zero in the water wars between farmers and salmon fishermen. And the mighty, man-size totoaba, a Mexican fish species that once spawned in huge swarms in the Colorado River delta, has just about vanished partly because the Colorado barely reaches the Sea of Cortez anymore (remember in Into the Wild when vagabond Chris McCandless was unable to find the sea as he paddled a canoe downstream through the Colorado River delta?). Much of the Colorado’s flow is diverted to the Imperial Valley, a regional king of alfalfa hay production. Most California-grown alfalfa is fed to dairy cows—meaning, sadly, that the production of milk and of California’s acclaimed cheeses may be as problematic as raising meat.
The global scope of the livestock issue is huge. A 212-page online report published by the United Nations Food and Agriculture Organization says 26 percent of the earth’s terrestrial surface is used for livestock grazing. One-third of the planet’s arable land is occupied by livestock feed crop cultivation. Seventy percent of Brazil’s deforested land is used as pasture, with feed crop cultivation occupying much of the remainder. And in Botswana, the livestock industry consumes 23 percent of all water used. Globally, 18 percent of greenhouse gas emissions can be attributed to the livestock industry—more than is produced by transportation-related sources. And in the United States, livestock production is responsible for 55 percent of erosion, 37 percent of all applied pesticides and 50 percent of antibiotics consumed, while the animals themselves directly consume 95 percent of our oat production and 80 percent of our corn, according to the Sierra Club.
The United Nations report warns that “(l)ivestock’s contribution to environmental problems is on a massive scale” and that the matter “needs to be addressed with urgency,” and a report from the Worldwatch Institute says that “…the human appetite for animal flesh is a driving force behind virtually every major category of environmental damage now threatening the human future…”
So, what can we do? Easy: Opt out of the livestock industry. Far from depriving themselves of the greatest foods, vegetarians and vegans often discover that some of the very best edible things, prepared dishes and entire national cuisines are based on plants. And for the omnivores out there, the good news is that shifting toward a more sustainable diet is easy: It simply means the minor adjustment of tipping one’s existing diet to one side; that is, omnivores already enjoy fruits, grains and vegetables—so why not just enjoy them more frequently? (I’ve been leaning in this direction increasingly for a decade, and the only non-plant foods I still firmly cling to are certain types of wild seafood.)

Even in meat-centric cultures like Portugal, France, Turkey, Argentina and New Zealand, veggies do grow, and fruits do dangle from the branches. Yes, meat is everywhere. Just ignore it. In spite of warnings from meat-eating friends that “you just can’t make it in (INSERT YOUR COUNTRY HERE) if you don’t eat meat,” the truth is that vegetarians can live well almost anywhere. No culture is devoid of farmers’ markets or fruit-and-veggie shops, and increasingly, restaurant staffs in many places far afield recognize and respect the word “vegetarian.”

And whereas the meat-eating traveler might never look further than the meat kebabs and bland grilled chicken of fast-food street vendors for his or her sustenance, vegetarians, by virtue of requiring plant-derived calories, may be required to look a little further and enter the vast bazaars where local farmers gather with their heaps of vegetables and fruits and nuts and baked goods. Many of us could spend hours on such dazzling epicurean forays. (Try browsing through a meat locker or slaughterhouse without losing your appetite, or your breakfast.)
Still skeptical? Well, the problem is that the math just doesn’t add up. We can’t eat meat at the rate we do in a sustainable world. Listen: One source claims that feeding just one omnivorous human requires more than three acres of land, while all it takes to produce food for a vegan is one-sixth of an acre. And with more than seven billion people sharing the earth’s 7.68 billion acres of arable land, that would be an even split of about an acre apiece—plenty of space for growing all the food we need and enjoying what’s left for camping, backpacking, kayaking and wildlife watching—except that habitual meat-eating omnivores are using three times their own share of space, requiring that precious wild lands be used for raising animals.
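For readers who like to see the numbers worked out, here is a quick back-of-the-envelope sketch of that land-use arithmetic. All figures are the article’s own claims (three acres per omnivore, one-sixth of an acre per vegan, 7.68 billion arable acres, seven billion people), not independently verified:

```python
# Back-of-the-envelope check of the article's land-use figures.
# All inputs are the article's claims, not independently verified data.
people = 7.0e9            # world population (article's round figure)
arable_acres = 7.68e9     # article's figure for the earth's arable land
omnivore_acres = 3.0      # acres claimed to feed one omnivore
vegan_acres = 1.0 / 6.0   # acres claimed to feed one vegan

fair_share = arable_acres / people  # the "even split" per person

print(f"Fair share: {fair_share:.2f} acres per person")
print(f"Omnivore uses {omnivore_acres / fair_share:.1f}x that share")
print(f"Vegan uses {vegan_acres / fair_share:.2f}x that share")
```

The split comes out to roughly 1.1 acres per person, so an omnivore at three acres is consuming close to three times that share, which is the article’s point.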
Next time, we’ll have a look at the global menu of vegetarian options, as well as meet a few famous vegetarians.
There have always been food trends, says Libby O’Connell, author of The American Plate: A Culinary History in 100 Bites. Before hamburgers and sushi, there were centuries of epicurean staples, including eel pie, pear cider and syllabub, foods that have since dipped in popularity and might seem a little, well, unconventional in today’s diet.
O’Connell attributes the rise and fall of different delicacies to, among other reasons, overharvesting of certain foods, the shift from active to sedentary lifestyles and a greater focus on convenience over time.
Many of the earliest foods that became deeply ingrained in American cuisine were carried over by English settlers, who had affinities for items like oysters and turtles. As immigrants from around the world came to the U.S., they adapted dishes and drinks from their home countries, creating new offerings, such as chow mein and salsa, that became integrated into the broader menu of options.
While today’s food fads are fleeting and capricious—think the cronut—in the past, trends emerged that fulfilled key dietary or financial needs. Squirrel supplemented the protein of frontier families who needed meat to bolster their stews, while canned SPAM offered an inexpensive alternative to fresh options during challenging economic times and World War II.
Unfortunately, many prevalent dishes lost steam mostly because they became too popular and the ingredients they required grew scarce. Others disappeared because a more accessible option took their place or they were simply no longer needed. Here are seven lost foods highlighted in O’Connell’s book that were once go-to options, but have since faded from mainstream diets.

Jellied eel, eel pie and mash are popular dishes in England that colonists once also enjoyed. (Flickr user Uglix)
Old Eel Pie
Sushi may be the most common use of eel today, but a few hundred years ago, eel pie was in high demand. Early Americans in the 17th and 18th centuries loved eel so much, says O’Connell, that they harvested the fish everywhere from Cape Cod to local streams. Back then, eels were such a hot commodity that lobsters served as bait for catching them. The dish originated in England, where eel has been well loved for centuries and remains popular, a highlight at “pie and mash” shops.
The decline of interest in savory eel pies was spurred by a corresponding decline in the once-plentiful eel supply. Over time, Americans have also moved away from eating animals served in their natural, recognizable form, notes O’Connell. People are increasingly less interested in seeing what their food actually looks like.
Today, although eel has seen a resurgence in popularity driven by the rise of sushi, the dearth of supply continues to pose an obstacle. The aquatic delicacy has been classified as endangered on the International Union for Conservation of Nature’s Red List of Threatened Species. Because the supply of the seafood has run low in Asia, there has been significant poaching in the United States, further depleting regional resources.
Roast Beaver Tail
Though the beaver is perhaps not presently seen as the most appetizing creature, its tail was once a delicacy among American Indians and European trappers during the 17th century. The food’s ascent to popularity was primarily fueled by its utility. Those out traveling in the wild urgently needed food that was high in calories and fat, and beaver tail was readily accessible and happened to fit the bill. Beaver pelts were also a valuable commodity, given their use as material for warm, luxurious garments.
O’Connell compares the taste of roast beaver tail, cooked over an open fire, to that of pork rinds. The dish was still appearing in cookbooks through the 1940s, but has since disappeared. Heavily hunted for their coveted fur, beavers became significantly rarer, although their population has recently stabilized thanks to conservation efforts.
The role that beaver tail served isn’t quite as necessary anymore. “We don’t even think of beaver tail now,” says O’Connell. “The idea that you need fat calories seems counter to a culture that spends most of its time sitting.” Ironically, while beaver might no longer be on the menu, people continue to consume plenty of fat calories from other, more processed sources. Beavers are also much less convenient to prepare: cooking them requires cleaning their scale-like exterior and dealing with smelly glands. As a result, people have opted to indulge in something more accessible, like Oreos.

Apple-based ciders have seen a resurgence in popularity, while ones using pear are less common in the U.S. (Flickr user Karl Wright)
Pear Cider (Perry)

Before beer took off, the alcoholic beverages of choice were apple and pear cider, the latter of which was also known as perry. This preference partly stemmed from the fact that settlers didn’t have much expertise when it came to beer brewing, and cider proved easier to make. Cider ingredients, namely the fruits required, were also conveniently on hand, given the orchards the settlers had planted upon arrival, although pears did prove more challenging to grow than apples.
Made in late fall, because the cooler climate was conducive to storage and fermentation, perry was a sweet, crisp beverage. It met its demise in the mid-19th century, when German immigrants introduced lagers, which became the more popular alternative. Interestingly, although hard apple-based ciders have made a comeback in recent years, pear cider has fallen somewhat into oblivion.
Sassafras

A fresh, leafy spice, sassafras is mostly used today in Creole cooking as seasoning for dishes like gumbo and roast chicken. However, during the 17th century it was the second most valuable export from Virginia, behind only tobacco, not solely due to its culinary uses, but also because of suspected medicinal properties.
Native Americans had been drying and powdering the spice for different healing remedies, so English settlers treated it as a cure-all—most notably for syphilis. Unfortunately, it wasn’t quite as effective as initially believed, so the bottom fell out of the sassafras market, says O’Connell.
The pungent flavoring lived on through the temperance movement as a key ingredient in drinks like root beer and sarsaparilla. However, safrole, a chemical in sassafras oil, was found to be carcinogenic and was banned by the FDA in the 1960s. The leaves from which modern-day seasonings are derived have a much lower concentration of the offending substance, and the sweet flavoring, sans safrole, can still make a tasty tea or syrup.

Colonial Syllabub

By combining dairy and wine, syllabub was a sweet treat that helped stretch the limited alcohol available during frugal times. (Flickr user Lonnon Foster)
Wine has long held the connotation of being an upper-class, more expensive alcoholic beverage, ever since the days of the founding fathers, when it had to be imported from overseas, a costly venture. For many years, no one stateside had quite figured out how to produce it using American grapes. In order to make precious wine stretch further, colonists adopted an almost milkshake-like drink called the syllabub, which first emerged in the 1500s and maintained a household presence into the 19th century. A syllabub was a frothy beverage made of whipped cream, sugar, and wine or brandy.
Interestingly, it shares many elements with eggnog, given the combination of dairy and alcohol, but O’Connell believes the use of wine may be why the drink has not lasted until today. She notes that wine was popular among elites like George Washington and Thomas Jefferson, but many of the American traditions that have endured are based instead on what are perceived as more egalitarian spirits, much like eggnog and its use of bourbon.
Turtle Soup

These shelled reptiles were a tremendously popular European delicacy with ample supply in the New World. Turtle roasts held along the East River in New York served as trendy society events during the 1800s, O’Connell notes, with turtle often featured as the main protein of a hearty soup.
However, as with many popular creatures, the reptiles became victims of overharvesting, and various species of turtle are now classified as threatened or endangered. Today, turtle soup is still served in New Orleans and a few other places in the southern United States, but it is not nearly as common as it once was.

Oysters Rockefeller

Oysters Rockefeller is made to have a green coloring reminiscent of money. (Flickr user Larry Hoffman)
Developed during the Gilded Age, this oyster dish is set apart by a signature, and secret, green butter sauce that tops the oysters, intended to be reminiscent of the color of money. The recipe was invented at Antoine’s Restaurant in New Orleans in 1899, a time when many chefs aimed to create foods that tasted “rich” and “luxurious” as symbols of the outrageous success and wealth that scions including John D. Rockefeller and Andrew Carnegie had achieved.
Like Baked Alaska, an elaborate ice-cream-filled cake with a meringue coating, these oysters, and their sauce especially, were over-the-top embodiments of wealth. Oysters Rockefeller is still served in some restaurants, although the dish is not as in vogue as it was at its debut, and it’s rumored that the original recipe has never left Antoine’s. Oysters themselves, however, continue to be popular, eaten raw, grilled and fried on their own or as part of a larger dish.