

Heroes of the Republic

National Portrait Gallery

Members of the Press After the Ceremonies

National Portrait Gallery
This photograph was taken following the dedication of the Bull Run battle monument on June 11, 1865. Alexander Gardner is at the far left, and his brother James is lying on the grass to the right (with the hat). Other people in the picture are New York Herald reporters S. M. Carpenter (life dates unknown), seated second from the left in profile, and L. A. Whiteley (1825–1869), standing ninth from the left. The photograph is attributed to William Morris Smith, an employee of Gardner’s.

Pair of fittings/hooks with duck's heads

Freer Gallery of Art and Arthur M. Sackler Gallery

The Presidents of the United States

National Portrait Gallery

Letting The Cat Out of The Bag

National Portrait Gallery

Supplementary catalogue of the Branford Lock Works

Smithsonian Libraries
Trade literature.

Pagination begins with page number 840 and ends with page number 1024.

From cover: "Manufactory: Branford, Conn., U.S.A.".

From cover: "Salesroom--103 Chambers Street, New York.".

From cover: "Agencies: Lloyd, Supplee & Walton, Philadelphia, Pa.; S.G.B. Cook & Co., Baltimore, Md.; B. Callender & Co., Boston, Mass.; Gordon Hardware Co., San Francisco, Cal.".

Also available online.


The Rise of Ocean Optimism

Smithsonian Magazine

Things are far more resilient than I ever imagined. Me, green sea turtles, coral reefs blown to bits by atomic bombs. In a twist of fate that even surprised scientists, Bikini Atoll, site of one of the world’s biggest nuclear explosions, is now a scuba diver’s paradise. Bikini Atoll, located in the Pacific’s Marshall Islands, didn’t just inspire the famous bathing suit; the United States detonated its first deliverable hydrogen bomb there. Between 1946 and 1958, 23 nuclear explosions were carried out, at an incalculable cost to the people and the marine environment. Fifty years later, scientists record a thriving coral reef habitat that includes large tree-like branching coral formations with trunks the diameter of dinner plates. “It’s made a brilliant recovery,” says Zoe Richards, a scientist at the Australian Museum.

I’ve been awash in uplifting news about the ocean lately. Each day, tweets from #OceanOptimism alert me to marine conservation successes happening all over the world: a new marine sanctuary in the Galapagos Islands to protect the world’s highest concentrations of sharks; green sea turtles in Florida and Mexico no longer listed as endangered thanks to successful conservation efforts; a major fishing deal offering protection to Arctic waters.

#OceanOptimism has reached more than 59 million people in the two years since I co-hosted a workshop with Nancy Knowlton of the Smithsonian Institution and Heather Koldewey of the Zoological Society of London that launched the Twitter hashtag on World Oceans Day 2014. 

We had no idea we were about to ignite a Twitter storm of hope. A few years before that workshop, the three of us had met and discovered a mutual passion for increasing access to ocean conservation solutions, and a shared concern about the way the marine environment was so often portrayed as being synonymous with “doom and gloom.”

Heather’s desire to source and share hopeful marine solutions arose from her concern about the tendency for scientists to publish problem analyses rather than conservation successes, a view that was shared by the late Navjot Sodhi and a team of prestigious biologists. “Widespread pessimism prevails in the conservation community,” they wrote in a 2011 issue of Trends in Ecology & Evolution. “What successes have been won are rarely highlighted or fail to attract wide attention.” Heather travels extensively in her role as the Zoological Society of London’s head of marine and freshwater conservation programs. She frequently encounters marine conservation practitioners working in isolation without access to proven approaches.

Nancy’s interest in focusing on hopeful solutions stemmed from witnessing the impact of doom and gloom on the marine science graduate students she taught, and on the field of marine science more broadly. “An entire generation of scientists has now been trained to describe, in ever greater and more dismal detail, the death of the ocean,” she wrote in an article with her husband, the noted marine scientist Jeremy Jackson. In an attempt to balance that view, Nancy hosted what she called “Beyond the Obituaries” sessions at major international science conferences. Scientists were invited to share only conservation success stories. She thought a few people might show up. To her surprise, the sessions were packed.

For me, the impact of doom and gloom on kids, in particular, came as a shock. For years, I had worked with aquariums, museums, and international environmental organizations, creating strategies to engage people with marine issues. As an academic, I understood the national statistics about what people in many different countries knew and what their attitudes were toward climate change, overfishing, and other problems. But how all that “knowing” felt was nowhere to be found in that vast pool of information.  

I realized that omission when I was invited to speak with young people attending a United Nations children’s conference on the environment in 2008 in Stavanger, Norway. The participants, who ranged in age from 10 to 14 years old, came from more than 90 countries and a wide range of socioeconomic backgrounds. “How do you feel when you think about the environment?” I asked. I don’t remember what I expected them to say, but so many of them expressed such a chilling sense of dread that I felt powerless to comfort them. I knew exactly what they meant. I, too, often felt despair about the state of the world. I just never imagined such feelings were shared among children living in vastly varied circumstances.

Global dread, eco-anxiety, environmental grief—despair about the future of the planet has garnered many labels in recent years. In our noble zeal to emphasize the urgency and enormity of environmental issues, we may inadvertently be raising a generation that feels hopeless about the future of the planet. Studies within the past decade from the United Kingdom, Australia, and the United States find a quarter to a half of children surveyed are so troubled about the state of the world, they honestly believe it will come to an end before they get older.

Those of us who work with marine issues are often reluctant to talk about the environment in hopeful terms, for fear it might be taken as saying it’s okay to continue the appalling degradation of the seas. “Don’t worry about PCBs, my friend. The ocean will heal itself!” That sort of thing. We worry that highlighting species recoveries will play into the hands of climate skeptics, or reduce political pressure for much-needed environmental reforms.

But what we fail to take into account is the collateral damage of apocalyptic storytelling.

Hopelessness undermines the very engagement with marine issues we seek to create. According to researchers at Columbia University’s Center for Research on Environmental Decisions, there are limits to the number of concerns we can deal with at one time. They call it the “finite pool of worry.” Overburdening people’s capacity for worry with too much doom and gloom leads to emotional numbing. When we believe our actions are too small to make a difference, we tend to behave in ways that create the conditions in which those expectations are realized. By bombarding people with bad news about the oceans at scales that feel too large to surmount, we cause them to downplay, tune out, or shut down. Hopelessness is a self-fulfilling prophecy.

Whenever I speak about hope and the environment, someone invariably argues that marine issues are so dire, we need to scare people straight. It’s true that fear-based messages can be effective for simple, short-term, very specific behavior-changing interventions, such as convincing people to use seat belts, according to a comprehensive review of research published by the American Psychological Association in 2015. But fearmongering isn’t the answer for broad, complex, emotion-laden, societal-level issues. As research from the Yale Project on Climate Change Communication suggests, our beliefs, emotions, interests, as well as our cultural perceptions of risk and trust all influence how we respond to environmental issues.

Emotions, it turns out, are contagious. We “catch” hope from the hopeful actions of others. We don’t even have to be face-to-face. A 2014 study involving nearly 700,000 participants conducted by social scientists at Cornell University; the University of California, San Francisco; and Facebook found that emotions spread among users of online social networks.

And unlike in mainstream media, where bad news dominates environmental headlines, hope travels faster than doom on social media. Given that one out of every five people on Earth has an active Facebook account and hope is contagious, the capacity for replicable marine solutions to spread between the millions of people using social media is formidable.

Four years ago, I gave a keynote address to the International Aquarium Congress at their biennial meeting in Cape Town, South Africa, where I asked aquarium directors from around the world to help me create a global torrent of hope for the oceans. Heather walked up to me after that talk and offered her support. That simple action led to gatherings with Nancy, Cynthia Vernon (chief operating officer at the Monterey Bay Aquarium), marine campaigner Elisabeth Whitebread, and others, which resulted in the workshop where #OceanOptimism emerged. Heather went on to inspire hundreds of marine scientists to live tweet solutions based on their research findings from the International Marine Conservation Congress in Glasgow, Scotland. Nancy brought #OceanOptimism to the Vatican when she spoke at Pope Francis’s sustainability workshop. The Huffington Post, the World Bank, and thousands of other users now use the hashtag.

The momentum for hope for the oceans just keeps growing. In 2015, BBC and PBS created Big Blue Live, a multiplatform live series celebrating the remarkable conservation success story of Monterey Bay, California. Nearly five million viewers watched the program the first night it was aired on BBC One, making it the top-rated show in the United Kingdom that night. “I laughed, I cried, and, at the end, I wanted to know more and was filled with hope,” blogged movie critic Jana Monji on rogerebert.com.

What I have learned from #OceanOptimism is how resilient ocean ecosystems can be. The recovery of Bikini Atoll reminds me that life is complicated. Things get horribly wrecked. That is true. But the remarkable capacity for renewal is true, too. Bikini is broken and beautiful, green sea turtles now swim where they haven’t for decades, and the ocean can still take my breath away. Far from making us complacent, stories of resilience and recovery fuel hope. Feeling hopeful enhances our capacity to take meaningful action. And that action flourishes in the supportive community of others. 

The Reception of the Diplomatique & his Suite, at the Court of Pekin

National Air and Space Museum
The print depicts the scene as Lord Macartney presents his credentials to the Emperor of China in 1792. His lordship had been instructed to kowtow three times as he approached the Emperor, that is, to touch his forehead to the floor. He refused, explaining that no Englishman would submit to such an indignity. Instead, he would kneel on one knee before the Emperor, as he would before the King of England. Much was made of the lavish gifts that the crown had sent for presentation to the Emperor, shown in the print complete with a model balloon.

The Birth of Flight: NASM Collections

The invention of the balloon struck the men and women of the late 18th century like a thunderbolt. Enormous crowds gathered in Paris to watch one balloon after another rise above the city rooftops, carrying the first human beings into the air in the closing months of 1783. The excitement quickly spread to other European cities, where the first generation of aeronauts demonstrated the wonder of flight. Everywhere the reaction was the same. In an age when men and women could fly, what other wonders might they achieve?

"Among all our circle of friends," one observer noted, "at all our meals, in the antechambers of our lovely women, as in the academic schools, all one hears is talk of experiments, atmospheric air, inflammable gas, flying cars, journeys in the sky." Single sheet prints illustrating the great events and personalities in the early history of ballooning were produced and sold across Europe. The balloon sparked new fashion trends and inspired new fads and products. Hair and clothing styles, jewelry, snuffboxes, wallpaper, chandeliers, bird cages, fans, clocks, chairs, armoires, hats, and other items, were designed with balloon motifs.

Thanks to the generosity of several generations of donors, the National Air and Space Museum maintains one of the world's great collections of objects and images documenting and celebrating the invention and early history of the balloon. Visitors to the NASM's Steven F. Udvar-Hazy Center at Dulles International Airport can see several display cases filled with the riches of this collection. We are pleased to provide visitors to our web site with access to an even broader range of images and objects from this period. We invite you to share at least a small taste of the excitement experienced by those who witnessed the birth of the air age.

Tom D. Crouch

Senior Curator, Aeronautics

National Air and Space Museum

Present at Creation:

The NASM Collection of Objects Related to Early Ballooning

The invention of the balloon struck the men and women of the late 18th century like a thunderbolt. The Montgolfier brothers, Joseph-Michel (August 26, 1740 - June 26, 1810) and Jacques-Étienne (January 6, 1745 - August 2, 1799), launched the air age when they flew a hot air balloon from the town square of Annonay, France, on June 4, 1783. Members of a family that had been manufacturing paper in the Ardèche region of France for generations, the Montgolfiers were inspired by recent discoveries relating to the composition of the atmosphere. Joseph led the way, building and flying his first small hot air balloons late in 1782, before enlisting his brother in the enterprise.

Impatient for the Montgolfiers to demonstrate their balloon in Paris, Barthélemy Faujas de Saint-Fond, a pioneering geologist and member of the Académie Royale, sold tickets to a promised ascension and turned the money over to Jacques Alexandre-César Charles (1746-1823), a chemical experimenter whom he had selected to handle the design, construction and launch of a balloon. Charles flew the first small hydrogen balloon from the Champ de Mars, near the present site of the Eiffel Tower, on August 27, 1783. Not to be outdone, the Montgolfiers sent the first living creatures (a sheep, a duck and a rooster) aloft from Versailles on September 19.

Pilâtre de Rozier, a scientific experimenter, and François Laurent, the marquis d'Arlandes, became the first human beings to make a free flight on November 21. Less than two weeks later, on December 1, 1783, J. A. C. Charles and M. N. Robert made the first free flight aboard a hydrogen balloon from the Jardin des Tuileries.

A wave of excitement swept across Paris as the gaily decorated balloons rose, one after another, over the skyline of the city. Throughout the summer and fall of 1783 the crowds gathering to witness the ascents grew ever larger. As many as 400,000 people -- literally half of the population of Paris -- gathered in the narrow streets around the Château des Tuileries to watch Charles and Robert disappear into the heavens.

The wealthy and fashionable set purchased tickets of admission to the circular enclosure surrounding the launch site. Guards had a difficult time restraining the crush of citizens swarming the nearby streets, and crowding the Place de Louis XV (now the Place de la Concorde) and the garden walkways leading toward the balloon. People climbed walls and clambered out of windows onto roofs in search of good vantage points.

"It is impossible to describe that moment:" wrote one observer of a balloon launch, "the women in tears, the common people raising their hands to the sky in deep silence; the passengers leaning out of the gallery, waving and crying out in joy… the feeling of fright gives way to wonder." One group of spectators greeted a party of returning aeronauts with the question: "Are you men or Gods?" In an age when human beings could fly, what other wonders might the future hold?

The balloons had an enormous social impact. The huge, seething crowds were something new under the sun. The spectators who gathered in such huge numbers were just becoming accustomed to the idea of change. The old certainties of their grandparents' world were giving way to an expectation that the twin enterprises of science and technology would provide the foundation for "progress."

The balloons sparked new fashion trends and inspired new fads and products. Hair and clothing styles, jewelry, snuffboxes, wallpaper, chandeliers, bird cages, fans, clocks, chairs, armoires, hats, and other items were designed with balloon motifs. Party guests sipped Crème de l'Aérostatique liqueur and danced the Contredanse de Gonesse in honor of the Charles globe.

The Americans who were living in Paris to negotiate a successful conclusion to the American Revolution were especially fascinated by the balloons. It seemed only fitting that, at a time when their countrymen were launching a new nation, human beings were throwing off the tyranny of gravity. The oldest and youngest members of the diplomatic community were the most seriously infected with "balloonamania."

"All conversation here at present turns upon the Balloons…and the means of managing them so as to give Men the Advantage of Flying," Benjamin Franklin informed an English friend, Richard Price. Baron Grimm, another Franklin acquaintance, concurred. "Among all our circle of friends," he wrote, "at all our meals, in the antechambers of our lovely women, as in the academic schools, all one hears is talk of experiments, atmospheric air, inflammable gas, flying cars, journeys in the sky."

Franklin noted that small balloons, made of scraped animal membranes, were sold "everyday in every quarter." He was invited to visit a friend's home for "tea and balloons," and attended a fête at which the duc de Chartres distributed "little phaloid balloonlets" to his guests. At another memorable entertainment staged by the duc de Crillon, Franklin witnessed the launch of a hydrogen balloon some five feet in diameter that kept a lantern aloft for over eleven hours.

The senior American diplomat in Paris purchased one of the small balloons as a present for his grandson and secretary, William Temple Franklin. Released in a bed chamber, "it went up to the ceiling and remained rolling around there for some time." Franklin emptied the membrane of hydrogen and forwarded it to Richard Price so that he and Sir Joseph Banks might repeat the experiment. The delightful little toy was thus not only the first balloon to be owned by an American but also the first to reach England. Both Franklins were soon supplying little balloons to friends across Europe.

Sixteen-year-old John Quincy Adams also took note of the small balloons offered for sale by street vendors. "The flying globes are still very much in vogue," he wrote on September 22. "They have advertised a small one of eight inches in diameter at 6 livres apiece without air [hydrogen] and 8 livres with it… Several accidents have happened to persons who have attempted to make inflammable air, which is a dangerous operation, so that the government has prohibited them."

There was a general sense that the colorful globes marked the beginning of a new age in which science and technology would effect startling change. The results and the implications of the revolution in physics and chemistry underway for over a century were largely unknown outside an elite circle of privileged cognoscenti. The balloon was unmistakable proof that a deeper understanding of nature could produce what looked very much like a miracle. What else was one to think of a contrivance that would carry people into the sky?

If human beings could break the age-old chains of gravity, what other restraints might they cast off? The invention of the balloon seemed perfectly calculated to celebrate the birth of a new nation dedicated, on paper at any rate, to the very idea of freedom for the individual. In the decade to come the balloons and the men and women who flew them came to symbolize the new political winds that were blowing through France. While some might question the utility of the "air globes," flight was already reshaping the way in which men and women regarded themselves and their world.

Of course most citizens of Europe and America were unable to travel to see a balloon. They had their first glimpse of the aerial craft through the medium of single sheet prints. In the late 18th century it was difficult and expensive to publish anything more than the roughest of woodcuts in newspapers or magazines. In an effort to share the excitement with those who could not attend an ascent, to let people know what a balloon looked like, and to introduce the brave men and women who were taking to the sky, artists, engravers and publishers flooded the market with scores of single sheet printed images. Ranging from the meticulously accurate to the wildly fanciful, these printed pictures were sold by the thousands in print shops across Europe.

The business of producing and marketing such images was nothing new. In Europe, block prints from woodcuts had been used to produce book illustrations and single sheet devotional or instructional religious images since the mid-15th century. In the 15th, 16th and 17th centuries, the technique was used to produce multi-sheet maps, bird's eye images of cities, and other products. In the early modern era, etching and engraving techniques gave artists from Albrecht Dürer to Rembrandt van Rijn the opportunity to market copies of their paintings.

In the 1730's, William Hogarth inaugurated a new era in the history of English printed pictures when he published his "Harlot's Progress," a series of single sheet images charting the downfall of a young woman newly arrived in London. Other sets, including "Marriage à la Mode," appeared in the decade that followed. Other artists used the medium of the etching or engraving to reproduce portraits and offer examples of their work for sale.

By the late 18th century, Thomas Rowlandson, James Gillray and other English artists made considerable fortunes producing sporting prints and satirical images offering biting commentary on the shortcomings of the political and social leaders of the day. Rowlandson was said to have "etched as much copper as would sheathe the British navy." In order to publish his prints and caricatures while they were still newsworthy, Rowlandson worked rapidly. He would watercolor the first impression, then send it to refugee French artists employed by Rudolph Ackermann, one of his favored publishers, who would color each of the prints before they were hung up in the shop window. In the 1780's a typical print seems to have sold for a shilling, the price being sometimes included on the print itself.

The appearance of the balloon in 1783 provided artists, engravers and publishers in England, France, Germany and Italy with a new subject for their efforts. As the wave of balloon enthusiasm swept across the continent, the production and sale of images depicting the great flights and daring aeronauts flourished. In addition to illustrating the birth of the air age, print makers made use of balloon motifs in comic images satirizing political events or social trends.

In the 19th century, new lithographic techniques and the advent of improved presses and smooth paper led to a revolution in the ability to mass produce images. Balloons remained a common subject of interest to readers, and ready material for satire in the talented hands of artists like Honoré-Victorin Daumier.

Today, the balloon prints produced by 18th and 19th century artists remain a priceless window into the past. They enable us to share some sense of the excitement that gripped those watching their fellow beings rise into the sky for the first time. Engraved portraits tell us something of the appearance, and even the personality, of the first men and women to fly. Satirical prints utilizing balloon motifs help us to understand the impact that flight had on the first generations to experience it.

The National Air and Space Museum owes its collection of balloon prints to the generosity of several leading 20th century collectors. The bulk of the prints in our collection comes from Harry Frank Guggenheim (August 23, 1890 - January 22, 1971). The son of industrialist and philanthropist Daniel Guggenheim and his wife Florence, Harry Guggenheim enjoyed multiple careers as a business leader, diplomat, publisher, philanthropist, and sportsman.

Aviation was the thread that tied his diverse activities together. A graduate of Yale and Pembroke College, Cambridge University, he learned to fly before the U.S. entered WW I and served as a Naval aviator during that conflict and as a Naval officer during WW II. In the mid-1920's, he convinced his father to establish the Guggenheim Fund for the Promotion of Aeronautics, which had an enormous impact on aeronautical engineering and aviation in the U.S.

A collector of everything from fine art to thoroughbred horses, Guggenheim began to acquire aeronautica during the 1920's, gradually focusing his attention on aeronautical prints. His collection had grown to be one of the most complete in the world by the 1940's, when he loaned his prints to the New York museum maintained by the Institute of the Aeronautical Sciences. When the IAS dissolved its museum in the 1950's, Guggenheim donated his own collection to the National Air and Space Museum.

The NASM collection of aeronautical prints also includes items donated by the American Institute of Aeronautics and Astronautics, and by a number of other private collectors, notably Constance Fiske in memory of her husband Gardiner Fiske, who served with the U.S. Army Air Service during WW I and with the USAAF in WW II; Thomas Knowles, a long-time executive with Goodyear Aircraft and Goodyear Aerospace; and Bella Clara Landauer, one of the great American collectors of aeronautica.

There can be little doubt that William Armistead Moale Burden was one of the most significant contributors to the NASM collection of furnishings, ceramics and other objects related to ballooning and the early history of flight. Burden began collecting aeronautical literature and memorabilia during the 1920's, while still a Harvard undergraduate. Following graduation he rode the post-Lindbergh boom to prosperity as a financial analyst specializing in aviation securities. His business success was inextricably bound to his enthusiasm for the past, present and future of flight.

By 1939, Burden was reputed to have built a personal aeronautical library second only to that of the Library of Congress. He loaned that collection to the Institute of the Aeronautical Sciences, an organization that he served as president in 1949. In addition to his library of aeronautica, Burden built a world-class collection of historic objects dating to the late 18th century -- desks, chairs, bureaus, sofas, mirrors, clocks, ceramics and other examples of material culture -- inspired by the first balloons and featuring balloon motifs. After a period on display in the IAS museum, William A.M. Burden's balloon-decorated furnishings and aeronautica went into insured off-site storage in 1959. A member of the Smithsonian Board of Regents, Mr. Burden ultimately donated his treasures to the NASM, as well.

Thanks to the efforts of these and other donors, the NASM can share one of the world's finest collections of works of art and examples of material culture inspired by the birth of flight with our visitors. We are pleased to extend the reach of our collections to those who visit our web site. Welcome, and enjoy.

Tom D. Crouch

Senior Curator, Aeronautics

National Air and Space Museum

Smithsonian Institution

To the Worshipful, the Mayor & Corporation of the City of Bristol

National Air and Space Museum
Pink and white striped balloon comes to rest in water; two men in gondola--one waving flag; four men rowing smaller boat.

Alligators, Crocodiles, and Caiman in their Natural Environment

Smithsonian Institution Archives
Alligators, crocodiles, and caiman in their enclosure at the Reptile House at the National Zoological Park.

The Classy Rise of the Trench Coat

Smithsonian Magazine

The trench coat wasn’t exactly invented for use during the war that gave it its name, a war spent mired in muddy, bloody trenches across Europe. But it was during the First World War that this now iconic garment took the shape that we recognize today, a form that remains startlingly current despite being more than 100 years old.

The trench coat is, in some ways, emblematic of the unique moment in history that World War I occupies, when everything – from rigidly held social structures to military organization to fashion – was in upheaval; it is both a product of this time and a symbol of it. “It’s the result of the scientific innovation, technology, mass production… The story of the trench coat is a very modern story,” says Dr. Jane Tynan, lecturer in design history at Central Saint Martins, University of the Arts London and author of British Army Uniform and the First World War: Men in Khaki.

Even so, the story of the trench coat starts roughly 100 years before the outbreak of World War I in 1914. As early as 1823, rubberized cotton was being used in weatherproof outerwear for both civilian and military use. These “macks”, named for their inventor Charles Macintosh, were great at keeping rain out, but equally – and unfortunately – great at keeping sweat in. They also had a distinctive and unpleasant smell of their own, and a propensity to melt in the sun. Nevertheless, Mackintosh’s outerwear, including rubberized riding jackets, was used by British military officers and soldiers throughout the 19th century.

Inspired by the market the macks created – and the fabric’s initial shortcomings – clothiers continued to develop better, more breathable waterproofed textiles. In 1853, Mayfair gentlemen’s clothier John Emary developed and patented a more appealing (read: less stinky) water-repellent fabric, later renaming his company “Aquascutum” – from the Latin, “aqua” meaning “water” and “scutum” meaning “shield” – to reflect its focus on designing wet weather gear for the gentry. His “Wrappers” were soon necessities for the well-dressed man who wanted to remain well-dressed in inclement weather.

Burberry had invented a breathable waterproof twill called gabardine that made its clothing useful for military uniforms. (Burberry)

Burberry swiftly transformed its sports coat into military wear. (Burberry)

Ads depicted the different functionalities of the Burberry trench coat. (Burberry)

Trench coats were known for their versatility and adaptability. (Aquascutum)

Higher-ranked military officers wore trench coats and were responsible for outfitting themselves. (Art of Manliness)

Fighting in the trenches was wet and slippery; waterproof coats helped to combat some of these elements. (Wikimedia Commons, Australian War Memorial)

"The trench coat was a very, very useful garment." (Wikimedia Commons, The War Pictorial)

Thomas Burberry, a 21-year-old draper from Basingstoke, Hampshire, founded his eponymous menswear business in 1856; in 1879, inspired by the lanolin-coated waterproof smocks worn by Hampshire shepherds, he invented “gabardine”, a breathable yet weatherproofed twill made by coating individual strands of cotton or wool fiber rather than the whole fabric. Burberry’s gabardine outerwear, like Aquascutum’s, proved popular with upper class, sporty types, and with aviators, explorers and adventurers: When Sir Ernest Shackleton went to Antarctica in 1907, he and his crew wore Burberry’s gabardine coats and sheltered in tents made from the same material. 

“[Lightweight waterproof fabric is] a technological development, like the Gore-Tex of that period, making a material that would be fit for purpose,” explains Peter Doyle, military historian and author of The First World War in 100 Objects (the trench coat is number 26). With the fabric, the factories, and the primary players – Burberry, Aquascutum, and, to some degree, Mackintosh – in place, it was only a matter of time before the trench coat took shape. And what drove the design were changes in how the British military outfitted itself and, to a large degree, in how war was now being waged.

**********

Warfare through the 1860s was Napoleonic, typically conducted in large fields where two armies faced off and fired or hacked at one another until one fell. In these scenarios, brightly colored uniforms helped commanders identify their infantry troops even through the smoke of battle. But with the technological advancements in long-range arms in place even by the Crimean War in the 1850s, this kind of warfare had become deeply impractical, not to mention deadly; bright, garish uniforms simply made soldiers easier targets. 

Military tactics needed to adapt to this new reality and so too did uniforms. The color khaki, which came to dominate British military uniforms, was the result of lessons learned in India; the word “khaki” means “dust” in Hindi. The first experiments at dyeing uniforms to blend in with the landscape began in 1840; during the Indian Rebellion of 1857, several British regiments dyed their uniforms drab colors. 

By the 1890s, khaki and camouflage had spread to the rest of the British military; in the Boer War in 1899, the utility of khaki uniforms had proven itself by allowing soldiers dealing with guerilla warfare to blend more easily with their surroundings. The British military was in some ways slow to change – bizarrely, mustaches for officers were compulsory until 1916 – but by World War I, there was an increasing recognition that uniforms needed to disappear into the landscape, allow for fluid, unencumbered movement, be adaptable to the fighting terrain, and be easily produced in mass quantities.

Trench coats offered utility during war and later, style for civilians. (Wikimedia Commons, Imperial War Museums)

The terrain that British military outfitters were designing for even early in the war was, essentially, a disgusting hole in the ground. Trenches were networks of narrow, deep ditches, open to the elements; they smelled, of both the unwashed living bodies crammed in there and the dead ones buried close by. They were muddy and filthy, and often flooded with either rain or, when the latrines overflowed, something worse. They were infested with rats, many grown to enormous size, and lice that fed off the close-quartered soldiers. Life in the trench, where soldiers would typically spend several days at a stretch, consisted of periods of intense boredom without even sleep to assuage it, punctuated by moments of extreme and frantic action that required the ability to move quickly.

It was to deal with these conditions that the trench coat was designed. “This was really the modernizing of military dress. It was becoming utilitarian, functional, camouflaged … it’s a very modern approach to warfare,” says Tynan. 

In past wars, British officers and soldiers alike wore greatcoats, long overcoats of serge, a thick woolen material, that were heavy even when dry; they were warm, but unwieldy. But in the trenches, these were a liability: Too long, they were often caked with mud, making them even heavier, and, even without the soldiers’ standard equipment, were difficult to maneuver in. Soldiers in the trenches needed something that was shorter, lighter, more flexible, warm but ventilated, and still weatherproof. The trench coat, as it soon came to be known, fit the bill perfectly.

But let’s be clear: Regular rank and file soldiers, who were issued their (now khaki) uniforms, did not wear trench coats. They had to make do with the old greatcoats, sometimes cutting the bottoms off to allow greater ease of movement. Soldiers’ clothing was a source of discomfort for them – coarse material, ill-fitting cuts, poorly made, and teeming with lice.

Uniforms for those with higher ranks, however, were a very different story. While their dress was dictated by War Office mandates, officers were tasked with the actual outfitting themselves. Up until 1914, officers in the regular army were even asked to buy the clothes themselves, often at considerable cost, rather than simply being given the money to spend as they saw fit: In 1894, one tailor estimated that a British officer’s dress could cost anywhere from £40 to £200. From the start of the war in 1914, British officers were provided a £50 allowance to outfit themselves, a nod to the fact that dressing like a proper British military officer didn’t come cheaply. 

Having officers outfit themselves also helped reinforce the social hierarchy of the military. Soldiers tended to be drawn from the British working classes, while the officers were almost exclusively plucked from the upper, gentlemanly class, the “Downton Abbey” swanks. Dress was (and still is, of course) an important marker of social distinction, so allowing officers to buy their own active service kit from their preferred tailors and outfitters set them apart, fortifying their social supremacy. It also meant that though there were parameters for what an officer had to wear, they could, as Doyle says, “cut a dash”: “The latitude for creating their own style was enormous.”

Burberry and Aquascutum both take credit for inventing the first trench coats. (Aquascutum)

The officers called on firms like Burberry, Aquascutum and a handful of others who marketed themselves as military outfitters; notably, these also tended to be the firms that made active, sporting wear for the very same aristocratic gentlemen (Aquascutum, for example, enjoyed no less a patron than the Prince of Wales, later King Edward VII; he wore their overcoats and issued them their first royal warrant in 1897). This marriage of sporting wear and military gear was longstanding. Burberry, for example, designed the field uniform for the standing British army in 1902 and noted in promotional materials that it was based on one of their sportswear suits; Aquascutum was selling overcoats and hunting gear to aristocratic gentlemen and outfitting British officers with weatherproofed wool coats as far back as the Crimean War in 1853. Burberry and Aquascutum both created designs informed by their own lines of well-made, nicely tailored clothing for wealthy people who liked to fish, shoot, ride, and golf. This also dovetailed nicely with the image the British military wanted to convey: War was hell, but it was also a sporty, masculine, outdoorsy pursuit, a pleasure and a duty.

**********

Both Burberry and Aquascutum take credit for the trench coat, and it’s unclear who really was the first; both companies had strong ties to the British military establishment and both already had weatherproof outerwear similar to the trench coat. Burberry may have a stronger claim: Khaki-colored Burberry “weatherproofs”, Mackintosh-style raincoats in Burberry gabardine, were part of officers’ kit during the Boer War, and in 1912 Burberry patented the “Tielocken”, a knee-length, weatherproofed coat very like the trench coat, which featured a belt at the waist and broad lapels. But in truth, no one really knows.

“Burberry and Aquascutum were very clever in adapting to military requirements,” says Tynan, especially as “what you’re talking about is a sport coat being adapted for military use.” The adaptation appears to have largely taken place within the first two years of the war: Whichever firm got there first, British officers had certainly adopted the coats by 1916, as this drawing of soldiers loading a cannon while being supervised by a trench coat-wearing officer attests. The first instance of the term “trench coat” in print also came in 1916, in a tailoring trade journal, accompanied by three patterns for making the increasingly popular weatherproof coats. By this time, the coat’s form had coalesced into essentially the same thing sold by luxury “heritage” brands and cheap and cheerful retailers today. So what made a coat a “trench coat”?

Before, during and after World War I, Burberry was one of the signature manufacturers of trench coats. (Burberry)

Firstly, it was a coat worn by officers in trenches. A blindingly obvious statement, sure, but it deserves some unpacking – because each part of the trench coat had a function specific to where and how it was used and who used it. Trench coats were double-breasted and tailored to the waist, in keeping with the style of officers’ uniforms. Below the belted waist, the coat flared into a kind of knee-length skirt; this was short enough that it wouldn’t trail in the mud and wide enough to allow ease of movement, but still covered a significant portion of the body. The belt, reminiscent of the Sam Browne belt, would have come with D-rings to hook on accessories, such as binoculars, map cases, a sword, or a pistol.

At the back, a small cape crossed the shoulders – an innovation taken from existing military-issue waterproof capes – encouraging water to slough off; at the front, a gun or storm flap at the shoulder allowed for ventilation. The pockets were large and deep, useful for maps and other necessities. Straps at the cuffs of the raglan sleeves tightened, offering greater protection from the weather. The collar buttoned at the neck, offering protection from both bad weather and poison gas, which was first used on a large scale in April 1915; gas masks could be tucked into the collar to make them more airtight. Many of the coats also came with a warm, removable liner, some of which could double as emergency bedding if the need arose. At the shoulders, straps bore epaulettes that indicated the rank of the wearer.

In short, as Tynan notes, “The trench coat was a very, very useful garment.” 

But there was a tragic unintended consequence of officers’ distinctive dress, including the trench coat: It made them easier targets for snipers, especially as they led the charge over the top of the trench. By Christmas 1914, officers were dying at a higher rate than soldiers (by the end of the war, 17 percent of the officer class had been killed, as compared to 12 percent of the ranks), and this precipitated a major shift in the make-up of the British Army. The mass recruitment drives early in the war had already relaxed requirements for officers; the new citizen army was headed by civilian gentlemen. But now, necessity demanded that the army relax traditions further and take officers from the soldiering ranks and the middle class. For the rest of the war, more than half of the officers would come from non-traditional sources. These newly created officers were often referred to by the uncomfortable epithet “temporary gentleman”, a term that reinforced both the fact that officers were supposed to be gentlemen and that these new officers were not.

To bridge that gap, the newly made officers hoped that clothes would indeed make the man. “Quite a lot of men who had no money, no standing, no basis for working and living in that social arena were suddenly walking down the street with insignia on their shoulder,” says Doyle. “If they could cut a dash with all these affectations with their uniforms, the very thing that would have gotten them picked off the front line by snipers, that was very aspirational.” Doyle explains that one of the other elements that pushed the trench coat to the fore was the commercial competition built up to outfit this new and growing civilian army. “Up and down London, Oxford Street, Bond Street, there would be military outfitters who would be offering the solution to all the problems of the British military soldier – ‘Right, we can outfit you in a week.’ … Officers would say, ‘I’ve got some money, I don’t know what to do, I’ll buy all that’. There came this incredible competition to supply the best possible kit.” 

Interestingly, adverts from the time show that even as the actual make-up of the officer class was changing, its ideal member was still an active, vaguely aristocratic gentleman. This gentleman officer, comfortable on the battlefield in his tailored outfit, remained the dominant image for much of the war – newspaper illustrations even imagined scenes of officers at leisure at the front, relaxing with pipes and gramophones and tea – although this leisure class lifestyle was as far removed from the bloody reality of the trenches as the grand English country house was from the Western Front.

For the temporary gentleman, this ideal image would have been entrancing. And very much a part of this image was, by the middle of the war at least, the trench coat. It embodied the panache and style of the ideal officer while at the same time being genuinely useful, rendering it a perfectly aspirational garment for the middle class. New officers happily and frequently shelled out the £3 or £4 for a good quality trench coat (for example, this Burberry model), a sizeable sum when you consider that the average rank-and-file soldier made just one shilling a day, and there were 20 shillings to a pound: at that rate, £3 amounted to 60 days’ pay. (Doyle pointed out that given the very real possibility of dying, maybe even while wearing the trench coat, newly made officers didn’t often balk at spending a lot of money on things.) And, of course, if one couldn’t afford a good quality trench coat, there were dozens of retailers willing to outfit a new officer more or less on the cheap, adding to the increasing ubiquity of the trench coat. (This isn’t to say, however, that the cheaper coats carried the same social currency, and in that way, it’s no different than now: As Valerie Steele, director of the Museum at the Fashion Institute of Technology in New York, puts it, “I wouldn’t underestimate people’s ability to read the differences between a Burberry trench and an H&M trench.”)

Image by Hulton-Deutsch Collection/CORBIS. Models wearing fashionable Burberry trench coats, which remain a staple today, 1973.

Image by Mirrorpix/Corbis. Flying nurses of the USAAF Ninth Troop Carrier Command, wearing special hooded trench coats in England during World War II, 1944.

Image by Corbis. Humphrey Bogart in a trench coat and fedora, 1940s.

Image by Sunset Boulevard/Corbis. American actor Humphrey Bogart and Swedish actress Ingrid Bergman on the set of Casablanca, 1942.

Image by Kirn Vintage Stock/Corbis. Four businessmen wearing trench coats as part of their work uniform, 1940.

Image by Alain Dejean/Sygma/Corbis. A model wears a trench coat as part of an outfit designed by Ted Lapidus, 1972.

Image by Paramount Pictures/Sunset Boulevard/Corbis. German actress and singer Marlene Dietrich sporting a trench coat on the set of A Foreign Affair, 1948.

Image by Imaginechina/Corbis. Burberry trench coats are still popular today, now available in many different patterns and styles.

Ubiquity is one measure of success and by that measure alone, the trench coat was a winner. By August 1917, the New York Times was reporting that even in America, the British import was “in demand” among “recently-commissioned officers”, and that a version of the coat was expected to be a part of soldiers’ regular kit at the front.

But it wasn’t only Allied officers who were adopting the coat in droves – even in the midst of the war, civilians of both sexes also bought the coats. On one level, civilians wearing a military coat was an act of patriotism, or perhaps more accurately, a way of showing solidarity with the war effort. As World War I ground on, savvy marketers began plastering the word “trench” on virtually anything, from cook stoves to jewelry. Doyle said that people at the time were desperate to connect with their loved ones at the front, sometimes by sending them well-meaning but often impractical gifts, but also by adopting and using these “trench” items themselves. “If it’s labeled ‘trench’ you get the sense that they’re being bought patriotically. There’s a slight hint of exploitation by the [manufacturers], but then they’re supplying what the market wanted and I think the trench coat fit into all that,” he says. “Certainly people were realizing that to make it worthwhile, you needed to have this magical word on it, ‘trench’.” For women in particular, there was a sense that too-flashy dress was somehow unpatriotic. “How are you going to create a new look? By falling into line with your soldier boys,” says Doyle. 

On another level, however, the war also had a kind of glamour that often eclipsed its stark, stinking reality. As the advertisements for trench coats at the time reinforced, the officer was the face of this glamour: “If you look at adverts, it’s very dashing … it’s very much giving a sense that if you’re wearing one of these, you’re at the height of fashion,” explains Doyle, adding that during the war, the most fashionable person in the U.K. was the trench coat-clad “gad about town” officer. And on a pragmatic level, Tynan pointed out, what made the coat so popular with officers – its practical functionality married to a flattering cut – was also what resonated with civilians.

**********

After the war, battle wounds scabbed over and hardened into scars – but the popularity of the trench coat remained. In part, it was buoyed by former officers’ tendency to keep the coats: “The officers realized they were no longer men of status and had to go back to being clerks or whatever, their temporary gentleman status was revoked… probably the echo into the 1920s was a remembrance of this kind of status by wearing this coat,” theorized Doyle.  

At the same time, the glamour attached to the coat during the war was transmuted into a different kind of romantic image, in which the dashing officer is replaced by the equally alluring world-weary returning officer. “The war-worn look was most attractive, not the fresh-faced recruit with his spanking new uniform, but the guy who comes back. He’s got his hat at a jaunty angle... the idea was that he had been transformed, he looked like the picture of experience,” Tynan says. “I think that would certainly have given [the trench coat] a cachet, an officer returning with that sort of war-worn look and the trench coat is certainly part of that image.”

The trench coat remained part of the public consciousness in the period between the wars, until the Second World War again put trench coats into military action (Aquascutum was the big outfitter of Allied military personnel this time). At the same time, the trench coat got another boost – this time from the golden age of Hollywood. “A key element to its continued success has to do with its appearance as costume in various films,” says Valerie Steele. And specifically, who was wearing them in those films: Hard-bitten detectives, gangsters, men of the world, and femmes fatales. For example, in 1941’s The Maltese Falcon, Humphrey Bogart wore an Aquascutum Kingsway trench as Sam Spade tangling with the duplicitous Brigid O’Shaughnessy; when he said goodbye to Ingrid Bergman on that foggy tarmac in Casablanca in 1942, he wore the trench; and again in 1946 as private eye Philip Marlowe in The Big Sleep.

“It’s not a question of power coming from an authority like the state. They’re private detectives or spies, they rely on themselves and their wits,” said Steele, noting that the trench coat reinforced that image. “[The trench coat] does have a sense of kind of world-weariness, like it’s seen all kinds of things. If you were asked ‘trench coat: naïve or knowing?’ You’d go ‘knowing’ of course.” (Which makes Peter Sellers wearing the trench coat as the bumbling Inspector Clouseau in The Pink Panther series all the funnier.)

Even as it became the preferred outerwear of lone wolves, it continued to be an essential part of the wardrobe of the social elite – a fascinating dynamic that meant the trench coat was as appropriate on the shoulders of Charles, Prince of Wales and heir to the British throne, as on Rick Deckard, the hard-bitten bounty hunter of Ridley Scott’s 1982 future noir Blade Runner. “It’s nostalgic… it’s a fashion classic. It’s like blue jeans, it’s just one of the items that has become part of our vocabulary of clothing because it’s a very functional item that is also stylish,” says Tynan. “It just works.”

It’s also endlessly updatable. “Because it’s so iconic, it means that avant garde designers can play with elements of it,” says Steele. Even Burberry, which consciously recentered its brand around its trench coat history in the middle of the last decade, understands this – the company now offers dozens of variations on the trench, in bright colors and prints, with python skin sleeves, in lace, suede, and satin.

But as the trench coat has become a fashion staple, on every fashion blogger’s must-have list, its World War I origins are almost forgotten. Case in point: Doyle said that in the 1990s, he passed the Burberry flagship windows on London’s major fashion thoroughfare, Regent Street. There, in huge lettering, were the words “Trench Fever”. In the modern context, “trench fever” was about selling luxury trench coats. But in the original context, the context out of which the coats were born, “trench fever” was a disease transmitted by lice in the close, fetid quarters of the trenches. 

“I thought it astounding,” said Doyle. “The millions of people who walked down the street, would they have made that connection with the trenches? I doubt that.”

The Secret Orchids of Palau

Smithsonian Environmental Research Center

by Kristen Minogue Most visitors to Palau don’t come for its forests. The chain of 300-plus Pacific islands is more famous for its coral reefs, giant rays and hundreds of flamboyantly-colored fish species. “It’s known as one of the top dive sites on the planet,” said Benjamin Crain, a postdoc at the Smithsonian Environmental Research […]


Wetlands of the Warmer World

Smithsonian Environmental Research Center

SERC researchers race to find out how higher temps will affect coastal wetlands by Mollie McNeel Wetlands are typically filled with the sounds of crickets chirping, bees buzzing and frogs croaking. But at the Smithsonian Environmental Research Center (SERC), those are all accompanied by the whirring of motor-powered pumps. These pumps are driving air from hexagonal […]


Fitting of figure atop a bird (peacock?)

Freer Gallery of Art and Arthur M. Sackler Gallery

Tools of the EVA Trade

National Air and Space Museum
Some tools used by astronauts during spacewalks look like something you could pick up at your local hardware store. Many more are specially engineered to accomplish tasks in the unique environment of space. From checklists and tethers to cameras and power drills, astronauts need an army of mechanical help to work outside the spacecraft. Tool designers, tool users, and tool curators were on hand in this "What's New in Aerospace" presentation that looked at the past, present, and future of EVA (extra-vehicular activity) tools. This program is made possible through the generous support of Boeing. See more from the "What's New in Aerospace" program: http://airandspace.si.edu/explore-and-learn/whats-new-aerospace/

Is Fungus the Material of the Future?

Smithsonian Magazine

Fungus and slippers are two words that most people don’t want to read in the same sentence. However, scientists in the Netherlands are one step closer to changing people’s perceptions by creating everyday objects like chairs, lampshades and slippers using fungi—specifically oyster mushrooms (Pleurotus ostreatus).

Not only are fungi readily available in nature, but they’re also sustainable and have the potential to replace less environmentally friendly materials, such as plastic. Which raises the question: Is fungus the material of the future?

This is exactly what designer Maurizio Montalti asked himself during his studies at the Design Academy Eindhoven in the Netherlands. For his 2010 thesis, Montalti wanted to find a new approach to human burials, so he began studying the degradation of human remains and what happened when he introduced fungi as a facilitating agent for decomposition. Soon he began applying his approach to man-made materials.

“It became apparent that fungi are the great recyclers of the natural world,” says Montalti. “As a student, I started cultivating an interest in a new way of producing materials that no longer relied on the exploitation of certain resources.”

Realizing fungi’s hidden potential, but not having a background in biology, he contacted Han Wösten, a professor of microbiology at Utrecht University in the Netherlands. In the years since, they have developed a method of growing fungi in a controlled environment that makes it a sustainable alternative to materials like plastic, rubber, wood and leather.

Image by Micropia. Designer Maurizio Montalti started thinking about producing materials from fungi while studying at the Design Academy Eindhoven in the Netherlands.

Image by Micropia. Montalti enlisted the help of Han Wösten, a professor of microbiology at Utrecht University in the Netherlands.

In February, they showcased their findings to the public as part of an ongoing permanent exhibition at Micropia in Amsterdam, the world’s only museum dedicated to microbes. Called “A Fungal Future,” the exhibit includes an array of everyday objects they’ve created, including vases, chairs, lampshades and slippers. Visitors can pick up each piece and discover that it is both firm and light; the hope is that they will walk away with a better understanding of fungi’s potential as a sustainable material.

“Many people still have negative ideas about fungus, and that’s the whole educational part of this project we want to tackle,” Montalti says. “I think as a society we really detached ourselves from the acceptance [of fungus] because of the whole cleaning mania that developed in the 20th century, which brought good gains, but also caused us to live aseptic lives and regard fungus as something dangerous.”

Fungi’s mycelium is the vegetative network of long branching filaments (hyphae) that is invisible to the naked eye. (Micropia)

In actuality, Montalti and Wösten have found fungi to be the exact opposite. They have developed a way to take fungi’s mycelium, that vegetative network of branching filaments, and nurture it in a controlled environment, where it can be formed into specific objects using molds.

According to the museum, mycelium (plural: mycelia) is an important part of the ecosystem, since it breaks down organic material along with toxic substances, such as pesticides, and also filters water. (Interestingly, the largest known single living organism in the world is a “humongous fungus” living in eastern Oregon’s Blue Mountains, stretching approximately four square miles.)

“We can make pure mycelium [in the laboratory] by taking fungus and letting it degrade straw, sawdust, [or other agrarian waste], resulting in mycelium with a measured strength similar to [the synthetic plastic polymer] PVC, while another strain has the strength of polyethylene, which is used to make plastic bags,” Wösten says. “At the same time, it glues the sawdust or straw particles of the substrate together.”

Once enough mycelia have formed, Montalti and Wösten take the mass and put it into a plastic mold, which retains humidity and forces it to take on a specific shape.

“At this point, I’m no longer the designer,” Montalti says. “Rather I’m a choreographer orchestrating and guiding the fungus.”

The pair often uses oyster mushrooms in their work, something one would expect to find in the produce aisle of a supermarket rather than in a laboratory. Not only do oyster mushrooms thrive on dead plant materials, but they’re also nontoxic, unlike many other mushrooms.

Once the fungi have filled out the mold—a process that typically takes several weeks, depending on the size of the mold—the formed object is fired in an oven, which kills the fungi and prevents further growth. “Most people don’t want a living fungus in their homes,” Wösten jokes.

Montalti first learned about mycelium after attending a workshop led by Eben Bayer, co-founder and CEO of Ecovative, a company in Albany, New York, that develops and produces sustainable packaging and building materials using mycelium. (Ecovative’s clients include Dell computers and Gunlocke, an office furniture manufacturer.) Bayer began working with mycelium as part of a school project in college to find a replacement for the toxic adhesive used in building manufacturing. In 2006, he and his business partner, Gavin McIntyre, applied for a patent and eventually began commercializing their product. In the time since, they’ve introduced dozens of designers and artists around the world to mycelium and its potential as a sustainable material, even going so far as marketing GIY (grow-it-yourself) kits that consumers can use at home.

“There are about 30 or 40 different designers and artists around the world who are doing projects with mycelium,” Bayer says. “It’s really exciting and now we’re trying to figure out how to best support them because we think mycelium can really help the world.”

Image by Micropia. Montalti and Wösten have made chairs, lampshades, slippers, even book covers using oyster mushrooms.

Image by Micropia. Due to fungi’s natural qualities, the objects aren’t meant to last forever.

Image by Micropia. "What we’re really working out is improving the mechanical properties of the materials, because that will be the turning point. I personally can’t imagine owning a shoe that only lasts a few months; a shoe should last a few years if not more," says Montalti.

Image by Micropia. Some of Montalti's earliest creations—bowls and vases made in 2012—are still in great shape.

One aspect that Montalti and Wösten are currently grappling with is the longevity of their products. Due to fungi’s natural qualities, the objects aren’t meant to last forever, a reality the pair is working to address. That said, decay is not necessarily quick: Some of Montalti’s earliest creations from 2012, including bowls and vases, remain in his studio and are “still fully solid and unchanged.”

“At this stage, [mycelium] is still an experimental material and by definition it is susceptible to degradation,” Montalti says. “It’s good to consider that all objects and applications realized so far are fully natural and therefore degradable by definition. This doesn’t mean that such items or artworks are subjected to quick decay, unless the conditions for such degradation to occur are created [such as a change in humidity or temperature]. What we’re really working out is improving the mechanical properties of the materials, because that will be the turning point. I personally can’t imagine owning a shoe that only lasts a few months; a shoe should last a few years if not more.”  

Currently the pair is experimenting with different finishes by using various coating systems that are applied to objects near the end of production, while also examining what humidity and temperature levels spur degradation. In addition to household products, they’re focusing on producing architectural materials using their developed method, such as panels, ceilings and flooring. 

“For the future, our aim is that, 20 years from now, you can buy anything you need for a construction project using fungus,” Wösten says. “So [materials made of fungus] would replace things like plastic, stones and bricks. This way if you’re going to remodel again, you can easily reuse these materials by breaking them into smaller pieces, reintroducing fungus, molding it, and then selling it again as a new product.”

Perhaps fungus really is the material of the future.

The Tiny Life of Oysters

Smithsonian Environmental Research Center
Not sure how to start your discussions on oysters? Check out this video, The Tiny Life of Oysters: Oysters may lead a tiny life, but their story has a long history and a big impact. This video mentions important concepts …

The Rockstar Geologist Who Mapped the Minerals of the Cosmos

Smithsonian Magazine

At the age of 57, geologist Ursula Marvin traveled to Antarctica to hunt for meteorites, the first woman to ever do so.

Marvin, who died on February 12 this year at the age of 96, described her time there with an air of wonder. “Working in Antarctica is a marvelous experience. We tented and searched in the gorgeous mountainous regions,” she said in a 2001 interview. Conditions that most people would find grueling, the longtime Smithsonian scientist delighted in: “By dressing for the cold we kept comfortable, and I loved having 24 hours of daylight.”

In a way, Marvin had been preparing for such an adventure all her life. As a woman in a male-dominated field—geology—she had weathered gender barriers throughout college and embraced years of fieldwork in Brazil and Africa. And after extensive study of lunar samples from NASA’s historic Apollo missions, she had acquired the knowledge and tenacity needed for an Antarctic expedition. Poised on the icy tip of the terrestrial, Marvin was ready to uncover the mysteries of the cosmic.

Born Ursula Bailey in August of 1921, she was the youngest of three children raised in the Vermont countryside. Her entire family shared a love of nature, perhaps stemming from the fact that they lived next to the Connecticut River with a view of the White Mountains of New Hampshire just to the east. “Best of all was just after sunset when a breathtaking alpenglow lit up the mountains in shades of peach and purple,” she recalled in the 2001 interview.

Her father, an entomologist with the Department of Agriculture, and her mother, a schoolteacher, valued education. They always expected their children to go to college. When Marvin’s turn to choose a college came around, she “felt adventurous,” and unlike her siblings, applied to colleges hundreds of miles from home. Eventually, however, she chose her father’s alma mater, Tufts College, built on a hill overlooking Boston. Even at a school that close to home, she found adventure for herself, skiing down the steep hill on snowy evenings.

“One thing I felt certain of was that I never would want to be a scientist,” Marvin said, recalling her early college days. She decided to pursue history but was also required to take two full years of science. Biology didn’t make much of an impression on Marvin, but from the very first lecture of professor Robert Nichols’s geology class, she said that she was “spellbound.” She recalled how Nichols, “a speaker of immense force, began talking about continents and oceans and how they have changed and evolved over long periods of time.”

Soon after that first geology class, Marvin decided to change her major from history to geology. Yet although Nichols’s words had so inspired her, she encountered a shock when she told him her decision. “No, you cannot major in geology,” she recalled him saying. “You should be learning how to cook.” Undeterred, Marvin continued to fulfill the requirements for a history degree while taking myriad geology courses with a quiet resolve.

In an interview with Smithsonian.com, Karen Motylewski, who later worked with Marvin at the Harvard-Smithsonian Center for Astrophysics, described Marvin as “strong-willed and determined.” As a woman in a male-dominated field, Marvin “had to fight pretty darn hard for her position in the field—and did,” said Motylewski, “but she did so in a very quiet and polite way.”

Marvin peers into the glacial ice in search of meteorites, which look a lot like Earth rocks but stand out on the ice-covered landscape of Antarctica. (Smithsonian / Ursula Marvin)

Marvin had already resolved to navigate for herself a field that was not welcoming to women. But luck brought her a female mentor for the journey. When Nichols left Tufts in the midst of World War II, geologist Katharine Fowler-Billings took his place and became a role model who helped Marvin imagine herself as a professional geologist. After meeting Fowler-Billings, Marvin recalled thinking, “now I knew that women geologists existed.”

Marvin’s encounter with Fowler-Billings illustrates the importance of the representation of women in science. But after more experience in such a masculinized field, Marvin also understood that the mere presence of women was not enough to retain them in the sciences.

Decades later, after gaining some renown of her own, Marvin would help organize a 1975 “Space for Women” conference, which helped young women prepare for careers in science; she also became the first Women’s Program Coordinator at the Smithsonian Astrophysical Observatory in 1974. In 1976, she co-authored an article titled “Professionalism Among Women and Men in the Geosciences,” in which she helped identify five obstacles, beyond the lack of role models, to women succeeding in science.

After graduating, Marvin applied to Radcliffe for graduate study in geology, and attended with a full scholarship in 1943. Within her first year, she had a research assistantship with Esper S. Larsen studying uranium ores for a Manhattan Project grant. This made her the first woman research assistant in Harvard’s geology department—followed by another first, when geologist Kirtley Mather hired her to teach introductory geology classes as a teaching assistant.

In 1946, Marvin graduated from Radcliffe and went to the University of Chicago with her first husband, who was attending Northwestern Dental School. While there, she found work as a research assistant helping to create artificial feldspars (a group of minerals that contain calcium, sodium, or potassium and make up over half of the earth’s crust). Both her marriage and her time in Chicago were short, however, and she moved back to Cambridge in 1950 to begin her PhD in geology, focusing on mineralogy. There, she met fellow geologist Tom Marvin. On April 1, 1952—the same day that her divorce from her first husband became final—she married Tom.

The first years of her new marriage, too, Marvin called an “adventure.” The pair worked together as prospectors for manganese oxide deposits in Brazil and Angola for the Union Carbide Corporation. They left for South America before she could finish her doctoral oral exams at Harvard, but the opportunity for world travel and hands-on fieldwork was indispensable in preparing her for the more demanding expeditions to come.

By the time Marvin returned to Harvard in 1956, the Space Race was in full swing. In this changed political atmosphere, she found a new and thrilling use for her mineralogical skills—not in mines, but in star stuff.

Marvin displays her Antarctic gear before the 1978 meteorite hunt in Antarctica. Since then, more than 1,000 meteorite specimens have been added to world collections. (Charles Hanson / Smithsonian)

In 1956, Marvin joined a team studying the mineral makeup of the meteorites in the Harvard collection. At the same time, she was offered a position teaching mineralogy at Tufts by an unlikely person: Robert Nichols, the same professor who had told her she should be learning to cook instead of learning geology. She worked in both roles until her position with the meteorite team turned into a permanent civil service job at the Smithsonian Astrophysical Observatory (SAO), which would occupy her until her retirement in 1998.

In 1969, the same year that three Apollo missions successfully landed on the moon, Marvin and her colleague John Wood at SAO began to study lunar samples collected from Apollo 11. Their petrological and mineralogical research group investigated tiny rock fragments from the lunar soil, and “Ursula was the mineralogy arm of it,” Wood tells Smithsonian.com.

In their study, the group found something they did not expect: white anorthosite, which is likely to form during the early stages of magma cooling. “The savants who had worried about what the moon was made of, how it was formed, what it all meant prior to the Apollo missions were wrong,” Wood says. “They had said the moon formed relatively cold and didn’t really have a violent igneous history. And the evidence from these particles we found showed that that was wrong.”

The presence of white anorthosite proved that a young moon was either mostly or completely melted. Marvin, Wood, and two others from the research group published this finding about the mineralogical makeup of the lunar surface in a 1970 article in Science. Of their work on lunar samples, Wood says, “I like to think that the work that our group did, that Ursula was a part of, was the most important contribution that any of us made.”

Six years after these findings, an American-led team began exploring Antarctica for meteorites, which Japanese scientists had found in 1973 embedded in the Antarctic ice in large concentrations. After learning of the expeditions, Marvin immediately wanted to go; she personally sought out the expedition leader, William Cassidy, and asked him to include her on the team. And she did go—twice—for the austral summer in 1978-79 and again in 1981-82, collecting dozens of meteorites to discover more about the mineralogical makeup of these celestial objects.

“I think she found her great joy when the exploration of Antarctica for meteorites began,” Motylewski says. Further bolstering her mineralogical expertise, Motylewski says that “Ursula had an eye for and looked for the unusual, what didn’t fit. So she was, I think, instrumental in helping identify those meteoritic pieces, which came from other planetary sources.”

(It should be noted that, despite her vast accomplishments, Marvin’s scientific contributions were relatively inaccessible to the general public until recently. That changed in 2015, when one of the Smithsonian Institution’s annual Women in Science Wikipedia Edit-a-Thons created a page for her on the editable online encyclopedia. The initiative has resulted in the creation of more than 50 new articles on groundbreaking geologists, anthropologists, botanists and more.)

The Antarctic expeditions from various countries, including the U.S., have returned thousands of meteorites with origins on the Moon and even Mars. Marvin’s work in these efforts is now memorialized in the Marvin Nunatak, named for her in Antarctica, as well as the asteroid Marvin. With a mountain peak in the Antarctic and an asteroid zooming through space bearing her name, Marvin leaves a legacy as a geologist of the boundless and as a ceaseless adventurer.

Marvin had few regrets about her career. When a friend once suggested that she would have been happier if she had stuck with history, she replied with certainty: “I cannot agree to that. I really would not exchange for anything our work in Brazil and Angola, or the thrill of seeing those first samples from the Moon, or of spotting black rocks on the Antarctic.”

An undated photo of Marvin at Harvard University. (Harvard-Smithsonian Center for Astrophysics)

Meal Kit Delivery May Not Actually Be That Bad for the Environment

Smithsonian Magazine

Meal kits, the pre-portioned food delivery services that help even the most inept cooks whip up gourmet grub, are now a $1.5 billion industry. The convenience of this popular foodie phenomenon comes with a caveat: As many critics have pointed out, meal subscription boxes are stuffed with packaging, including cardboard, little plastic bags and refrigeration packs. But according to NPR’s Jonathan Lambert, a study has found that if you look at the big picture, meal kits actually have a smaller carbon footprint than the same meals made from store-bought ingredients.

A team of researchers at the University of Michigan ordered five meals—salmon, a cheeseburger, chicken, pasta and salad—from the company Blue Apron, then made the same recipes using food purchased at a grocery store. The team “measured every bit of food, plastic, bits of cardboard, everything for each type of meal,” Shelie Miller, an environmental scientist at the University of Michigan and lead author of the new study in Resources, Conservation and Recycling, tells Lambert.

The team also used data from previously published studies to conduct a “comparative life-cycle assessment,” an estimate of the greenhouse gas emissions produced in every phase of the meals’ “lifetime”: agricultural production, packaging production, distribution, supply chain losses and waste generation. Their results showed that yes, the subscription kits had more packaging per meal. But overall, the grocery store meals yielded more greenhouse gas emissions than the kits—8.1 kilograms of carbon dioxide per meal versus 6.1 kilograms. Only the cheeseburger kit produced more greenhouse gas emissions than the grocery store equivalent, primarily because a number of ingredients included in the kit weighed more than those purchased in store.
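To make that bookkeeping concrete, here is a minimal sketch, in Python, of how a per-meal, phase-by-phase comparison of this kind adds up. The phase names follow the article’s description of the assessment, but the per-phase numbers are invented placeholders, chosen only so the totals land on the reported 8.1 and 6.1 kilogram figures; they are not the study’s actual breakdown.

```python
# Minimal sketch of a comparative life-cycle tally: a meal's footprint is
# the sum of the emissions estimated for each phase of its "lifetime".
# The per-phase values below are hypothetical placeholders, NOT the
# University of Michigan study's data.

PHASES = [
    "agricultural production",
    "packaging production",
    "distribution",
    "supply chain losses",
    "waste generation",
]

# Hypothetical per-meal emissions, in kg of CO2-equivalent per phase.
grocery = {
    "agricultural production": 4.9,
    "packaging production": 0.4,
    "distribution": 1.0,
    "supply chain losses": 0.9,
    "waste generation": 0.9,
}
meal_kit = {
    "agricultural production": 4.3,
    "packaging production": 0.8,  # more packaging per meal...
    "distribution": 0.5,          # ...but a cheaper last mile
    "supply chain losses": 0.2,
    "waste generation": 0.3,
}

def footprint(per_phase):
    """Total per-meal footprint: the sum over all life-cycle phases."""
    return sum(per_phase[p] for p in PHASES)

for name, scenario in [("grocery store", grocery), ("meal kit", meal_kit)]:
    print(f"{name:>13}: {footprint(scenario):.1f} kg CO2e per meal")
```

The point of the sketch is the structure, not the numbers: once every phase is on the ledger, a scenario with more packaging can still come out ahead overall if it saves more in food waste and transport.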

A key factor reducing the meal kits’ carbon footprint was pre-portioned ingredients, which cut down on the amount of food used and the amount of waste produced. Americans chuck some 133 billion pounds of food each year, and as Jamie Ducharme notes in Time, wasted food means land, water and fertilizer are used unnecessarily and unneeded greenhouse gases are pumped into the atmosphere. As it rots in landfills, food waste also produces the greenhouse gas methane.

“Even though it may seem like that pile of cardboard generated from a Blue Apron or Hello Fresh subscription is incredibly bad for the environment, that extra chicken breast bought from the grocery store that gets freezer-burned and finally gets thrown out is much worse, because of all the energy and materials that had to go into producing that chicken breast in the first place,” Miller says.

Meal kits don’t just cut down on waste by giving home cooks the exact amount of food they need; the services also circumvent grocery stores, which generate large food losses by overstocking items and throwing away blemished products. The kits showed another emissions saving in “last-mile transportation,” the final leg of food’s trip to the consumer. Meal kits are one of many products delivered on mail trucks, and are therefore associated with lower carbon emissions than driving to and from the grocery store.

The new study is somewhat broad; it does not, for instance, factor in consumer behaviors like stopping at the grocery store on the way home from work, as Lambert points out. But the results show the importance of looking beyond the immediate problem when it comes to assessing the sustainability of what we eat and how we eat it.

Excessive packaging that comes with meal kits isn’t great for the environment, but it’s also only one piece of the much larger carbon footprint puzzle.

“When we think about objectives like minimizing environmental impacts or climate change mitigation, it’s important to understand the impacts that are occurring in the food system,” Brent Heard, study co-author and PhD candidate at the University of Michigan’s School for Environment and Sustainability, tells Time’s Ducharme. “A lot of times, they’re largely invisible to the consumer.”

Large fragment of a balustrade architectural fitting

Freer Gallery of Art and Arthur M. Sackler Gallery

The ethics of lethal methods

Smithsonian Libraries

The Giant Squid: Dragon of the Deep

Smithsonian Magazine

There are few monsters left in the world. As our species has explored and settled the planet, the far-flung areas marked “Here Be Dragons” have been charted, and toothy terrors once thought to populate the globe have turned out to be imaginary or merely unfamiliar animals. Yet some elusive creatures have retained their monstrous reputation. Foremost among them is Architeuthis dux—the giant squid.

The creature—likely the inspiration for the legendary kraken—has been said to have terrorized sailors since antiquity, but its existence has been widely accepted for only about 150 years. Before that, giant squid were identified as sea monsters or viewed as a fanciful part of maritime lore, as in the case of a strange encounter shortly before scientists realized just what was swimming through the ocean deep.

At about 5:00 in the afternoon on August 6, 1848, Capt. Peter M’Quhae was guiding the HMS Daedalus through the waters between the Cape of Good Hope and the island of St. Helena off the African coast when the crew spotted what they described as a gigantic sea serpent. The beast was unlike anything the sailors had seen before. News of the encounter hit the British newspaper The Times two months later, telling of the ship’s brush with a nearly 100-foot monster that possessed a maw “full of large jagged teeth … sufficiently capacious to admit of a tall man standing upright between them.”

M’Quhae, who was asked by the Admiralty to confirm or deny this sensational rumor, replied that the stories were true, and his account was printed a few days later in the same newspaper. Dark on top with a light underbelly, the sinuous, 60-foot creature had slipped by within 100 yards of the boat, and M’Quhae proffered a sketch of the animal made shortly after the sighting.

Precisely what the sailors had actually seen, though, was up for debate. It seemed that almost everyone had an opinion. A letter to The Times signed “F.G.S.” proposed that the animal was a dead ringer for an extinct, long-necked marine reptile called a plesiosaur, fossils of which had been discovered in England just a few decades before by fossil hunter Mary Anning. Other writers to the newspapers suggested the animal might be a full-grown gulper eel or even an adult boa constrictor that had taken to the sea.

The notoriously cantankerous anatomist Richard Owen said he knew his answer would “be anything but acceptable to those who prefer the excitement of the imagination to the satisfaction of judgment.” He believed that the sailors had seen nothing more than a very large seal, and expressed his doubt that anything worthy of the title “great sea serpent” actually existed. It was more likely “that men should have been deceived by a cursory view of a partly submerged and rapidly moving animal, which might only be strange to themselves.”

M’Quhae objected to Owen’s condescending reply. “I deny the existence of excitement, or the possibility of optical illusion,” he shot back, affirming that the creature was not a seal or any other readily recognizable animal.

As was the case for other sea monster sightings and descriptions going back to Homer’s characterization of the many-tentacled monster Scylla in The Odyssey, attaching M’Quhae’s description to a real animal was an impossible task. Yet a series of subsequent events would raise the possibility that M’Quhae and others had truly been visited by overly large calamari.

The naturalist credited with giving the giant squid its scientific start was Japetus Steenstrup, a Danish zoologist at the University of Copenhagen. By the mid-19th century, people were familiar with various sorts of small squid, such as species of the small and widespread genus Loligo that are often eaten as seafood, and the basics of squid anatomy were well known. Like octopus, squid have eight arms, but they are also equipped with two long feeding tentacles that can be shot out to grasp prey. The head portion of the squid pokes out of a conical, rubbery structure called the mantle, which encloses the internal organs. Inside this squishy anatomy, the squid has two hard parts: a tough internal “pen” that acts as a site for muscle attachment, and a stiff beak that is set in the middle of the squid’s ring of sucker-tipped arms and used to slice prey. Since naturalists were only just beginning to study life in the deep sea, relatively few of the approximately 300 squid species now known had been discovered.

In 1857, Steenstrup combined 17th century reports of sea monsters, tales of many-tentacled giant creatures washed up on European beaches, and one very large squid beak to establish the reality of the giant squid. He called the animal Architeuthis dux. His only physical evidence was the beak, collected from the remains of a stranded specimen that had recently washed ashore. Steenstrup concluded: “From all evidences the stranded animal must thus belong not only to the large, but to the really gigantic cephalopods, whose existence has on the whole been doubted.”

Image by Associated Press. Scientists from the National Science Museum of Japan recorded a live giant squid that had been hauled up to the surface next to a boat. (original image)

Image by The Granger Collection, New York. Architeuthis dux, better known as the giant squid, is likely the inspiration for the legendary kraken. (original image)

Image by Mary Evans Picture Library / Alamy. A dead giant squid washed ashore in Fortune Bay, Newfoundland in 1871. (original image)

Subsequent run-ins would leave no doubt as to the giant squid’s reality. In November 1861, the French warship Alecton was sailing in the vicinity of the Canary Islands in the eastern Atlantic when the crew came upon a dying giant squid floating at the surface. Eager to acquire the strange animal, but nervous about what it might do if they came too close, the sailors repeatedly fired at the squid until they were sure it was dead. They then tried to haul it aboard, unintentionally separating the tentacled head from the rubbery tail sheath. They wound up with only the back half of the squid, but it was still large enough to know that this animal was far larger than the familiar little Loligo. The ensuing report to the French Academy of Sciences showed that the poulpe could grow to enormous size.

Encounters in North American waters added to the body of evidence. A dead giant squid was discovered off the Grand Banks by sailors aboard the B.D. Haskins in 1871, and another squid washed up in Fortune Bay, Newfoundland.

The naturalist Henry Lee suggested in his 1883 book Sea Monsters Unmasked that many sea monsters—including the one seen by the crew of the Daedalus—were actually giant squid. (Accounts of M’Quhae’s monster are consistent with a giant squid floating at the surface with its eyes and tentacles obscured underneath the water.) The numerous misidentifications were simply attributable to the fact that no one actually knew such creatures existed!

Instead of being tamed through scientific description, though, the giant squid seemed more formidable than ever. It was cast as the villain in Jules Verne’s 1869 novel 20,000 Leagues Under the Sea, and in 1873 news spread of a giant squid that had allegedly attacked fishermen in Conception Bay, Newfoundland. The details are a little murky due to some creative retelling over the years, but the basic story is that two or three fishermen came upon an unidentified mass in the water. When they tried to gaff it, they discovered that the thing was a giant squid—which then tried to sink their boat. Some quick hatchet work sent the monster jetting away in a cloud of dark ink, and the proof of their encounter was a 19-foot-long tentacle. The fishermen gave it to the Rev. Moses Harvey, who was given the body of another giant squid by a different group of Newfoundland fishermen soon afterward. He photographed the latter specimen before sending it on to naturalists in New Haven, Connecticut, for study. The fame and reputation of the “devil fish” was at its acme—so much so that the showman P.T. Barnum wrote to Harvey requesting a pair of giant squid of his own. His order was never filled.

The giant squid was transformed into a real monster, and one whose unknown nature continues to frighten us. Not long after giving sharks a bad rap with Jaws, Peter Benchley made a particularly voracious giant squid the villain of his 1991 novel Beast. The second Pirates of the Caribbean film in 2006 transformed the squid into the gargantuan, ship-crunching kraken.

The enormous cephalopod still seems mysterious. Architeuthis inhabit the dark recesses of the ocean, and scientists are not even sure how many species are in the giant squid genus. Most of what we know comes from the unfortunate squid that have been stranded at the surface or hauled up in fishing nets, or from collections of beaks found in the stomachs of their primary predator, the sperm whale.

Slowly, though, squid experts are piecing together the natural history of Architeuthis. The long-lived apex predators prey mainly on deep-sea fish. Like other ocean hunters, they accumulate high concentrations of toxins in their tissues, especially those squid that live in more polluted areas. Marine biologists say that giant squid therefore can act as an indicator of deep-sea pollution. Giant squid strandings off Newfoundland are tied to sharp rises in temperature in the deep sea, so giant squid may similarly act as indicators of how human-driven climate change is altering ocean environments. There are two giant squid, measuring 36- and 20-feet long, on display in the National Museum of Natural History’s Sant Ocean Hall. As NMNH squid expert Clyde Roper points out, they are “the largest invertebrate ever to have lived on the face of the earth.”

In 2005, marine biologists Tsunemi Kubodera and Kyoichi Mori presented the first underwater photographs of a live giant squid in its natural habitat. For a time it was thought that squid might catch their prey through trickery—by hovering in the water column with tentacles extended until some unwary fish or smaller squid stumbled into their trap. But the images show the large squid aggressively attacking a baited line. The idea that Architeuthis is a laid-back, deep-sea drifter began to give way to an image of a quick and agile predator. The first video footage came in December of the following year, when scientists from the National Science Museum of Japan recorded a live giant squid that had been hauled up to the surface next to the boat. Video footage of giant squid in their natural, deep-sea environment is still being sought, but the photos and video already obtained give tantalizing glimpses of an enigmatic animal that has inspired myths and legends for centuries. The squid are not man-eating ship sinkers, but capable predators in an utterly alien world devoid of sunlight. No new images have surfaced since 2006, which seems typical of this mysterious cephalopod. Just when we catch a brief glimpse, the giant squid retreats back into the dark recesses of its home, keeping its mysteries well guarded.

Further reading:

Ellis, R. 1994. Monsters of the Sea. Connecticut: The Lyons Press.

Ellis, R. 1998. The Search for the Giant Squid. New York: Penguin.

Guerra, Á.; González, Á.; Pascual, S.; Dawe, E. 2011. The giant squid Architeuthis: An emblematic invertebrate that can represent concern for the conservation of marine biodiversity. Biological Conservation, 144(7): 1989-1998.

Kubodera, T., and Mori, K. 2005. First-ever observations of a live giant squid in the wild. Proceedings of the Royal Society B, 272: 2583-2586.

Lee, H. 1883. Sea Monsters Unmasked. London: William Clowes and Sons, Limited.

The Way of Water in the World

SI Center for Learning and Digital Access
Lesson plan looking at global issues of water shortage and contamination. Students conduct Internet research and work in small groups to find solutions to these problems.

Cranking Up the Heat in the “Wetland of the Future”

Smithsonian Environmental Research Center
by Joe Dawson, SERC research aide Last fall, while volunteering in a plant lab at George Washington University in D.C., I heard about an experiment that was starting up at the Smithsonian Environmental Research Center (SERC). The project, a global warming simulation in the wetlands surrounding the Chesapeake Bay, was helmed by SERC research ecologist Roy […]