Your Guide to All Things Anthropocene

Smithsonian Magazine

For over a year, we at Smithsonian Magazine have been telling crucial stories from the front lines of global change. We've showcased the good, the bad and the ugly: solutions, casualties, and key scientific and technological advancements, all in an effort to illustrate the scope and consequences of this critical time in our planet's history. Today we know that many of these changes trace to humans, whose activities have transformed—and continue to transform—the fundamental nature of Earth's climate, natural resources, and biological diversity on an unprecedented scale.

This profound influence has led many scientists to assert that we have entered a new chapter in Earth’s geologic history: the Anthropocene, which translates roughly into “the age of humans.” Popularized by Nobel Laureate and noted atmospheric chemist Paul Crutzen in the early 2000s, the concept has since become a mainstay of the scientific and popular lexicon. But just how profound has humanity's contribution been? For how long has this been happening, and what steps can we take to address it?

In the past century and a half, some of the brightest philosophical and scientific minds have applied themselves to answering these questions. We reached out to key researchers and experts across the Smithsonian Institution for their take on a few of the seminal research papers that have shaped our understanding of this new chapter in Earth's history. Here, we present them as a brief annotated guide. Taken together, they show the arc of how we came to understand the extent and nature of the Anthropocene—and how much we still have left to learn.

Topic areas:

Air | Water | Earth | Biodiversity





Have Humans Really Created a New Geologic Age?

Smithsonian Magazine

If you know how to read it, the face of a cliff can be as compelling as the latest bestselling novel. Each layer of rock is a chapter in Earth’s history, telling stories of birth and death, winners and losers, that help scientists understand the evolution of the planet over the past 4.6 billion years.

While humans arrived only recently on geologic time scales, our species already seems to be driving some major plot developments. Agriculture occupies about one-third of Earth's land. The atmosphere and oceans are filling up with chemical signatures of our industrial activity. Whole ecosystems have been reshaped as species are domesticated, transplanted or wiped out.

These changes have become so noticeable on a global scale that many scientists believe we have started a new chapter in Earth’s story: the Anthropocene. Atmospheric chemist Paul Crutzen popularized the term in the early 2000s, and it has become ingrained in the scientific vernacular. But don’t ask what the Anthropocene technically means unless you’re in the mood for some drama.

“It’s not research, it is diplomacy. It’s not necessary for geologists,” says Lucy Edwards, a researcher with the U.S. Geological Survey. Others think there is a case to be made for at least trying to codify the Anthropocene, because it is forcing the global community to think about the true extent of human influence. "It focuses us on trying to work out how we measure the relative control of humans as opposed to nature," says Tony Brown, a physical geographer at the University of Southampton in the United Kingdom.

"For example, is human activity altering the rate of uplift of mountains? If you had asked that question 20 years ago, geologists would have looked at you as if you were mad," says Brown. "But we know some faults are lubricated by precipitation, so if we are altering global precipitation patterns, there is a slight chance of a link. If that is the case, that is quite a profound potential interaction between humans and their environment."

The International Commission on Stratigraphy—the ruling body that sets formal boundaries on geologic ages—has set up a working group to study the case for making the Anthropocene official. The crux of the debate is where to place the starting boundary line, or base. Geologists continue to tinker with the bases for well-established epochs, eras and ages, and there is usually a relatively wide margin of error. "Even the most precisely defined, the end of the Cretaceous 66 million years ago, is plus or minus 3,000 years. This is minute in geological terms but very big in human terms," says Brown.

In the reference text "The Geologic Time Scale 2012", Crutzen and colleagues lay out three main options for the start of the Anthropocene. It's possible to set the boundary in the early part of the current epoch, called the Holocene, which began about 11,700 years ago. The idea is that the dawn of agriculture in the early Holocene kicked off a steady rise in carbon dioxide that has altered Earth's natural climate cycles. But that potential base is controversial, in part because agriculture spread to various locations at different times, and a formal interval of geologic time should be recognizable globally.

Nobel Prize laureate and Dutch meteorologist Paul Crutzen, who gave prominence to the term "Anthropocene". (GIL COHEN MAGEN/X01316/Reuters/Corbis)

The next option, and the one preferred by Crutzen, is to put the base near the Industrial Revolution, which the book authors argue became a global phenomenon in the early 19th century. "This is … where the combination of industrialization and the acceleration of population growth created a clear step change in the human signal," the scientists write. But like agriculture, industrial activity didn't start everywhere at once—China was smelting iron in the 11th century, for instance—so not everyone may be happy with the choice.

Still others have proposed linking the base to a global spike in a signal that is unquestionably caused by humans: radioactive isotopes from atomic bomb detonation in the 1950s. Distinctive levels of radioactive substances from bomb use and testing were distributed widely and will linger in the rock record for millennia. But they are not a perfect solution either, as radioactive decay means that the signal will eventually be lost.

Another way to approach the problem is to consider when human influence became the dominant force of change on a combination of Earth systems. Natural cycles and cataclysmic events have affected the environment over deep time, and some of those forces are still at work. But in addition to the signal from atomic bombs, the mid-20th century saw an acceleration in a variety of human impacts, with a doubling of population size, a massive increase in vehicle use and a rapid shift from mostly rural to urban living, which triggered an increase in construction and large infrastructure projects such as dams.

"Probably in the late part of the last century, humans became responsible for moving more soil or rock than natural agencies," says Brown. "We’ve increased erosion rates in most parts of the world, but we've also trapped a lot of sediments, because we've dammed most of the world's really big rivers."

“For geologists, there are lots of features on the present-day planet that are human-made or distorted,” says James Ogg, a stratigrapher with Purdue University and the China University of Geosciences. But he believes the best strategy may be to keep the term unofficial. "The Anthropocene is a very useful term, because it helps show the dramatic impacts we’ve had on all aspects of the planet," he says. "But on the geologic time scale, you need a place and time that can be correlated around the world, so that people are speaking the same language. For the Anthropocene, is there actually a time level that we can correlate?"

Brown agrees: "The majority of scientists who engage with the question will say, 'yes, we are in the Anthropocene.' And it's OK if you just say that. My view is, at the moment, we're better off not formalizing it, partly because we will get into a very long and not very productive argument about where the boundary should be."

Edwards adds that another problem with making the Anthropocene official is deciding when it might end, and thus how large of a time interval to assign it. The use of the "cene" suffix signals to geologists that it is an epoch (tens of millions of years). But it's also sometimes referred to as an age (millions of years) within the Holocene, and some people say it should be an even smaller unit, a stage.

Given the term’s complexity, if you really just have to have a formal definition, you better be prepared to wait, Edwards says. "Geologists have learned from the Pluto experience," she says, referring to the 2006 vote by the International Astronomical Union to take away Pluto's official status as a planet. "We're not just going to show up at a union meeting and have a decision with all these glaring errors that makes us a laughingstock. Unfortunately, the decision to take it slowly and work it out bothers some people. But to geologists, what's a million years?" 

The Atomic Age Ushered In the Anthropocene, Scientists Say

Smithsonian Magazine

Humans are living in a new geologic epoch, one that is largely of their own making, scientists say.

In a new study, published in this week’s issue of the journal Science, an international team of geoscientists concluded that the impact of human activity on the Earth is so widespread and persistent that it warrants formal recognition with the creation of a new geologic time unit, which they propose to call the Anthropocene epoch.

“We’re saying that humans are a geological process,” says study coauthor Colin Waters, a geologist with the British Geological Survey in the U.K. “We are the dominant geologic force shaping the planet. It’s not so much river or ice or wind anymore. It’s humans.”

The term “Anthropocene”–from anthropo, for “man”, and cene, for “new”–has been slowly gaining popularity as an environmental buzzword to describe humanity’s planet-scale influence since 2000, when it was popularized by the atmospheric chemist and Nobel laureate Paul Crutzen.

In recent years, however, there has been a growing movement amongst scientists to formally adopt the term as part of the official nomenclature of geology. Those who advocate this action argue that the current epoch dominated by humanity is markedly different from the Holocene epoch of the past 12,000 years, the time during which human societies developed and flourished.

The new study is not the first to propose a formal establishment of an Anthropocene epoch–Simon Lewis and Mark Maslin of University College London made a similar recommendation last year–but it is one of the most comprehensive to date. In it, Waters and his colleagues sought to answer whether human actions have left measurable signals in the geological strata, and whether those signals are markedly different from those of the Holocene. The answer to both questions, the scientists say, is overwhelmingly yes.

The researchers conducted a review of the published scientific literature and found evidence for numerous ways that humans have changed the Earth to produce signals in ice and rock layers that will still be detectable millions of years from now. Among them: a preponderance of unique human products such as concrete, aluminum and plastics; elevated atmospheric levels of the greenhouse gases carbon dioxide and methane; higher levels of nitrogen and phosphorus in the soil from fertilizers and pesticides; and radionuclide fallout from above-ground nuclear weapons testing in the 20th century.

Humans have also indelibly shaped the biological realm by raising a few domesticated animals and cultivated crops to prominence while pushing other species toward extinction.

“I think these changes will be really obvious in the fossil record,” says Scott Wing, the curator of fossil plants at the Smithsonian National Museum of Natural History.

“Imagine the abundance of beef and chicken bones and corn cobs in sediments from now versus sediments deposited 300 years ago,” says Wing, who was not involved in the study.

Humans have also facilitated the mixing of species to a degree unprecedented in the history of the Earth, says Waters, who is also the secretary of the Anthropocene Working Group, an organization within the International Union of Geological Sciences.

“If we find a plant that’s nice to look at, within years we’ve transported it across the globe,” Waters says. “That is creating pollen signatures in sediments that are very confusing. Normally, you have to wait for two continents to collide until you get that kind of transfer of species, but we’re doing it in a very short period of time.”

As far as epochs go, the Anthropocene is a young one: Waters and his team argue that it only began around 1950 C.E., at the start of the nuclear age and the mid–20th century acceleration of population growth, industrialization, and mineral and energy use. In this, the group differs from Lewis and Maslin, who suggested the Anthropocene’s “golden spike”– the line between it and the Holocene–be set at either 1610 or 1964. The year 1610 is when the collision of the New and Old Worlds a century earlier was first felt globally, and the year 1964 is discernable in rock layers by its high proportion of radioactive isotopes–a legacy of nuclear weapons tests.   

“The Holocene was an abrupt event as far as geologists are concerned. And yet, we’re seeing changes that are even more rapid than that,” Waters says.

The Smithsonian’s Wing says he agrees that humans have changed the Earth sufficiently to create a distinct stratigraphic and geochemical signal. “I don’t think there is any doubt about it,” he says. “Not only is the signal distinct and large, it will persist for a geologically long amount of time, so it will be recognizable hundreds of thousands or millions of years into the future, should there be anyone then to look at the record.”

Interestingly, unlike the notion of climate change, for which scientific consensus was established long before public acceptance became widespread, Waters says members of the general public appear to be more willing to accept the idea of an Anthropocene epoch than some scientists.

 “Geologists and stratigraphers”–scientists who study the layers of the Earth–“are used to looking at rocks that are millions of years old, so many of them have a hard time appreciating that such a small interval of time can be a geologic epoch,” Waters says.

Both Waters and Wing say that in addition to being scientifically important, formally recognizing the Anthropocene epoch could have a powerful impact on the public perception of how humanity is changing the planet.

“There’s no doubt that when 7 billion people put their minds to doing something, they can have a big impact. We’re seeing that now,” Waters says. “But it also means that we can reverse some of those impacts if we wish, if we are aware of what we’re doing. We can modify our progress.”

Wing agrees. “I think the Anthropocene is a really important mechanism for getting people of all sorts to think about their legacy,” he says. “We humans are playing a game that affects the whole globe for an unimaginably long time into the future. We should be thinking about our long-term legacy, and the Anthropocene puts a name on it.”

The Smithsonian Institution Announces an Official Climate Change Statement

Smithsonian Magazine

As humans continue to transform the planet at an increasingly rapid rate, the need to inform and encourage change has become ever more urgent. The situation is becoming critical for wild species and for the preservation of human civilization. Recognizing this urgency, the Smithsonian Institution has formulated its first official statement about the causes and impacts of climate change.

With special emphasis on the Smithsonian’s 160-year history and tradition of collection, research and global monitoring, the statement delivers a bold assessment: "Scientific evidence has demonstrated that the global climate is warming as a result of increasing levels of atmospheric greenhouse gases generated by human activities."

"The 500 Smithsonian scientists working around the world see the impact of a warming planet each day in the course of their diverse studies," reads the statement. "A sample of our investigations includes anthropologists learning from the Yupik people of Alaska, who see warming as a threat to their 4,000-year-old culture; marine biologists tracking the impacts of climate change on delicate corals in tropical waters; and coastal ecologists investigating the many ways climate change is affecting the Chesapeake Bay."

“What we realized at the Smithsonian is that many people think that climate change is just an environmental topic,” says John Kress, acting undersecretary of science at the Smithsonian. “It’s much more than that. Climate change will affect everything.”

Many scientists, including Smithsonian researchers, believe we have entered a new interval called the Anthropocene. Coined in the 1980s by Eugene F. Stoermer, a researcher in diatoms, but popularized by atmospheric chemist and Nobel laureate Paul Crutzen in 2000, the term is derived from the Greek words anthrop for man and cene for current or new. Unlike the Holocene, which began at the end of the last glaciation about 12,000 years ago, the Anthropocene has no formal start date. But in adopting the term, the Smithsonian recently organized its initiative “Living in the Anthropocene” to “expand climate change outside of just science and take Smithsonian resources to look at what other scholars and professionals are doing in various areas with regard to climate change,” Kress says.

As part of this initiative, the Smithsonian is bringing together some of the nation’s top critical thinkers to offer their perspectives in a symposium on October 9 called “Living in the Anthropocene: Prospects for Climate, Economics, Health, and Security.” The symposium features Rachel Kyte, group vice president and special envoy for climate change at the World Bank; James J. Hack, the director of the National Center for Computational Science at the Oak Ridge National Laboratory; George Luber, the associate director for climate change in the Division of Environmental Hazards and Health Effects at the National Center for Environmental Health, Centers for Disease Control and Prevention; Admiral Thad Allen, the executive vice president of Booz Allen Hamilton and former commander of the U.S. Coast Guard; and Thomas L. Friedman, a Pulitzer-winning columnist for the New York Times.

For economies to grow and prosper, especially in underdeveloped countries, the need to address climate change is crucial. Last year, the World Bank changed its business model and added a special envoy for climate change to reach its goal of eradicating poverty by 2030. “Climate change is already having an impact on our goals because of extreme weather events. If you’re a country that is vulnerable to weather events, then those events can wipe out decades’ worth of development in just a few minutes or hours. We’ve seen countries and regions lose anywhere from 2 to 200 percent of their GDP,” Kyte says. “In almost every aspect of our economy, climate change is beginning to bite down, and that means we have to help our climate adapt and build a resilience plan for an increasingly uncertain future.”

Admiral Allen, who was designated principal federal official for the U.S. government’s response and recovery operations in the aftermath of Hurricane Katrina and later served as the national incident coordinator for the federal response to the Deepwater Horizon oil spill in the Gulf of Mexico, agrees that there needs to be resilience, although he emphasizes a bottom-up concept. “I always tell people that the first responder in any natural disaster is you and the second first responder is your neighbor. The more you become resilient, the less demand you put on the services in the community and the more you can help each other to create a resilient community.”

The Smithsonian initiative will also examine the health effects that emerge from changing environments and climate, including deaths, disease and trauma.  “We have the direct effects of events like hurricanes, which have both immediate and long-lasting health consequences, but then we also have health effects that come with changing ecology. There are pathogens such as Lyme disease or dengue fever that are sensitive to weather, and their environment can expand or shift,” says Luber, who is also an epidemiologist.  

Understanding such complex systems requires computational models, which can make predictions and reveal current activities on both grand and small scales. “The better the computational foundations and facilities to help the scientists, the more we’re going to start making progress toward more formally evaluating where uncertainties lie in the process of developing models,” Hack says. Even small uncertainties in the data could have trillion-dollar impacts and undermine faith in the modeling community, he adds.

As the struggle to understand and cope with global change continues, a “unity of effort” is needed across all platforms to better understand our challenges and determine solutions. “I think the challenge is to understand the complexity of the world we live in and the interaction of technology, human beings and the natural environment and try and think of new ways to build in resiliency into not only the human side of the planet but also the natural side,” Admiral Allen says.

James J. Hack, Rachel Kyte, George Luber, Admiral Thad Allen and Thomas L. Friedman will speak at the Smithsonian Institution on October 9, 2014 at a one-day symposium entitled, “Living in the Anthropocene: Prospects for Climate, Economics, Health, and Security,” 9:15 a.m. to 6:30 p.m., with a reception to follow in the Baird Auditorium at the National Museum of Natural History. The event is free and open to the public, but space is limited. To get your ticket, RSVP by October 7.

The Ozone Hole Was Super Scary, So What Happened To It?

Smithsonian Magazine

It was the void that changed public perception of the environment forever—a growing spot so scary, it mobilized a generation of scientists and brought the world together to battle a threat to our atmosphere. But 30 years after its discovery, the ozone hole just doesn’t have the horror-story connotations it once did. How did the conversation change—and how bad is the ozone hole today?

To understand, you have to go back about 250 years. Scientists have been trying to study the invisible since the beginning of science, but the first real understanding of Earth's atmosphere came during the 1700s. In 1776, Antoine Lavoisier proved that oxygen was a chemical element, and it took its place as number eight on the periodic table. The scientific revolution that spurred on discoveries like Lavoisier’s also led to experiments with electricity, which produced a stinky revelation: Passing electricity through oxygen produced a strange, slightly pungent smell.

In the 1830s, Christian Friedrich Schönbein coined the term “ozone” for the odor, riffing off the Greek word ozein, which means “to smell.” Eventually, ozone was discovered to be a gas made from three oxygen atoms. Scientists began to speculate that it was a critical component of the atmosphere and even that it was able to absorb the sun’s rays.

A pair of French scientists named Charles Fabry and Henri Buisson used an interferometer to make the most accurate measurements ever of ozone in the atmosphere in 1913. They discovered that ozone collects in a layer in the stratosphere, roughly 12 to 18 miles above the surface, and absorbs ultraviolet light.

Because it blocks some radiation from reaching Earth's surface, ozone provides critical protection from the sun's scorching rays. If there were no ozone in the atmosphere, writes NASA, “the Sun's intense UV rays would sterilize the Earth's surface.” Over the years, scientists learned that the layer is extremely thin, that it varies over the course of days and seasons and that it has different concentrations over different areas.

Even as researchers began to study ozone levels over time, they started to think about whether it was capable of being depleted. By the 1970s, they were asking how emissions from things like supersonic aircraft and the space shuttle, which emitted exhaust directly into the stratosphere, might affect the gases at that altitude.

But it turned out that contrails weren’t the ozone layer’s worst enemy—the real danger was contained in things like bottles of hairspray and cans of shaving cream. In 1974, a landmark paper showed that chlorofluorocarbons (CFCs) used in spray bottles destroy atmospheric ozone. The discovery earned Paul Crutzen, Mario Molina and F. Sherwood Rowland a Nobel Prize, and all eyes turned to the invisible layer surrounding Earth.

But what researchers measuring the atmosphere found shocked even scientists who were convinced that CFCs deplete ozone. Joe Farman, an atmospheric scientist who had been collecting data in Antarctica annually for decades, thought his instruments were broken when they began to show drastic drops in ozone over the continent. They weren’t: The ozone layer had been damaged more than scientists could have imagined before Farman discovered the hole.

As word of the ozone hole leaked through the media, it became nothing short of a worldwide sensation. Scientists scrambled to understand the chemical processes behind the hole as the public expressed fear for scientists’ wellbeing at the South Pole, assuming that while studying the hole they would be exposed to UV rays that could render them blind and horrifically sunburned.

Rumors of blind sheep—the increased radiation was thought to cause cataracts—and increased skin cancer stoked public fears. “It’s like AIDS from the sky,” a terrified environmentalist told Newsweek’s staff. Fueled in part by fears of the ozone hole worsening, 24 nations signed the Montreal Protocol limiting the use of CFCs in 1987.

These days, scientists understand a lot more about the ozone hole. They know that it’s a seasonal phenomenon that forms during Antarctica’s spring, when weather heats up and reactions between CFCs and ozone increase. As weather cools during Antarctic winter, the hole gradually recovers until next year. And the Antarctic ozone hole isn’t alone. A “mini-hole” was spotted over Tibet in 2003, and in 2005 scientists confirmed thinning over the Arctic so drastic it could be considered a hole.

Each year during ozone hole season, scientists from around the world track the depletion of the ozone above Antarctica using balloons, satellites and computer models. They have found that the ozone hole is actually getting smaller: Scientists estimate that if the Montreal Protocol had never been implemented, the hole would have grown by 40 percent by 2013. Instead, the hole is expected to completely heal by 2050.

Since the hole opens and closes and is subject to annual variances, air flow patterns and other atmospheric dynamics, it can be hard to keep in the public consciousness.

Bryan Johnson is a research chemist at the National Oceanic and Atmospheric Administration who helps monitor the ozone hole from year to year. He says public concern about the environment has shifted away from the hole to the ways in which carbon dioxide affects the environment. “There are three phases to atmospheric concerns,” he says. “First there was acid rain. Then it was the ozone hole. Now it’s greenhouse gases like CO2.”

It makes sense that as CFCs phase out of the atmosphere—a process that can take 50 to 100 years—concerns about their environmental impacts do, too. But there’s a downside to the hole’s lower profile: The success story could make the public more complacent about other atmospheric emergencies, like climate change.

It was the fear about ozone depletion that mobilized one of the biggest environmental protection victories in recent memory. But while it’s easy to see why blind sheep are bad, gradual changes like those associated with CO2 emissions are harder to quantify (and fear). Also, the public may assume that since the issue of the ozone hole was “fixed” so quickly, it will be just as easy to address the much more complex, slow-moving problem of climate change.

Still, researchers like Johnson see the world’s mobilization around the ozone hole as a beacon of hope in a sometimes bleak climate for science. “The ozone hole is getting better, and it will get better,” says Johnson. It’s not every day a scientific horror story has a happy ending.

Where in the World Is the Anthropocene?

Smithsonian Magazine

Sixteen years ago, a pair of scientists introduced a new word that would shake up the geologic timeline: the Anthropocene. Also known as the "Age of Humans," the idea was first mentioned in a scientific newsletter by Nobel Prize-winning atmospheric chemist Paul Crutzen and renowned biologist Eugene Stoermer. The duo enumerated the many impacts of human activities on the planet, outlining human-induced carbon and sulfur emissions, the global runoff of nitrogen fertilizers, species extinctions and the destruction of coastal habitats.

Considering these vast changes, they declared the Holocene (our current 11,000-year-old geologic epoch) over. The Earth had entered a new geologic epoch, they said. This week, scientists are meeting to present their evidence of this new chapter of geological time to the International Geological Congress in Cape Town, South Africa.

Since it was introduced, the Anthropocene concept has resonated throughout the sciences and humanities. It's forced people to confront how, in so little time, our species has irreversibly transformed Earth’s climate, landscapes, wildlife and geology.

“Many people are using [the term] because it sums up in a word and an idea the total scale and extent of how the Earth’s system is changing because of humans,” says Jan Zalasiewicz, a University of Leicester geologist who pieces together Earth’s history using fossils.

As he watched the Anthropocene idea proliferate, he wondered whether there was some geological truth to it. Could today’s soils and sediments be distinct from those laid down in the Holocene? Are they distinct enough to name a new geologic epoch?

"The important thing is that the Earth system is changing," says Zalasiewicz. "From the point of geology, it doesn’t matter whether it’s humans causing it, or if it’s a meteorite, aliens from outer space or even my cat masterminding change to the planet." 

In 2008, he gathered a group of geologists, and together they published a list of possible geological signs of human impact in GSA Today, the magazine of the Geological Society of America. The group concluded that the Anthropocene is "geologically reasonable" and warranted further investigation.

But declaring a new geologic epoch is no small task. The official inclusion of the Anthropocene would be a major revision to the Geologic Timescale—the hulking calendar of time that divides Earth’s 4.6-billion-year history into chapters. The boundaries between each of these chapters are marked by shifts in the composition of glacial ice, tree rings, coral growth bands, and seafloor and lake sediments, among other layered geologic formations, found consistently throughout the world. “All of these layers contain signals within themselves, which reflect the life and the times around them, the chemical, biological and physical signals,” says Zalasiewicz. If the rocks have changed, the world must have changed, too.

Perhaps the most well known boundary is that between the Mesozoic and Cenozoic—also known as the Cretaceous-Paleogene or K/Pg boundary and formerly as the K-T boundary. Some 66 million years ago, an asteroid struck the Earth and killed off the non-avian dinosaurs. Since comets and asteroids are rich in the element iridium, which is rare on Earth, a fine layer of iridium marks this event in the geologic record around the world. On every continent, paleontologists find fossils of large dinosaurs and certain plankton species below that stripe of iridium; above it, they find a distinct suite of plankton and no traces of non-avian dinosaur fossils. The iridium layer separates the Mesozoic, the dinosaur-filled era of life, from the Cenozoic, when mammals began taking over.

Though the iridium stripe can be found worldwide, the boundary’s official location is outside El Kef, Tunisia. There, in 2006, geologists hammered a golden spike into a hillside that displayed the telltale signs of the K/Pg boundary to serve as a reference point. Ideally, each boundary between chapters on the Geologic Timescale will have its own “golden spike” placed into an existing rock face or core (from glacial or marine sediment). Strict rules govern the boundaries and golden spikes, overseen by the International Commission on Stratigraphy within the larger International Union of Geological Sciences, lest the Geologic Timescale be swept away by fads in geology or in politics.

In 2008, the IUGS contacted Zalasiewicz with the request that he form a new committee to look into the idea of the Anthropocene. He gathered a diverse set of researchers, including geologists, climatologists, chemists, paleontologists and historians, dubbing the crew the Anthropocene Working Group (AWG). Over the past eight years, they furiously compared notes and gathered data to make their formal recommendation for the start of the Anthropocene. The group tallied up the various proposals to choose the one that best fit, publishing a summary of their work earlier this year in the journal Science.

The signal that received the most attention was the radioactive fallout from nuclear tests, which left a prominent layer of plutonium in sediments and glacial ice. Even though thermonuclear weapons were not tested everywhere in the world, their evidence is global. “Once the fallout could get into the stratosphere, it was then distributed right around the planet very quickly over weeks or months,” says geologist Colin Waters of the British Geological Survey and secretary of the AWG. “Plutonium is barely present naturally; it’s very, very rare. So as soon as you start to see this increase, then you know that you’ve got 1952.” The radioactive signal falls off after 1963, when countries signed the Partial Test Ban Treaty and agreed to test nuclear devices underground.

A number of other signals also cluster around the year 1950 in what the AWG calls “The Great Acceleration,” when human population, resource use, industry and global trade took off. It’s then that many anthropogenic signals that once were local became truly global, and perhaps global enough to signify the Anthropocene. Here are some of those signals:

  • Concrete has been around since the Roman Empire, but “volumetrically most of the concrete ever produced has been since 1945 or 1950,” says Waters. That makes it a recognizable modern material. The downside? Concrete is uncommon in the oceans and absent from glacial ice, so the signal isn't universal, he says.
  • Plastics were first introduced in the 1800s, but today there are more plastics around than ever before. Production expanded from 2 million tons in 1950 to 300 million tons in 2015, and it’s estimated that 40 billion tons of the stuff will exist by 2050. People like plastics because they’re lightweight and degrade slowly. But those same qualities also make plastic a good geologic indicator. Sediment samples containing plastics nearly all come from the last half century, according to Zalasiewicz. This abundance of plastic "was almost unknown before the mid-twentieth century,” he says. On beaches in Hawaii, geologists are now finding rocks they call “plastiglomerate,” which forms when campfires melt plastics into a massive glob containing pebbles and sand. In addition, microplastics, such as tiny microbeads from cosmetics and artificial fibers from clothing, are currently forming a sedimentary layer on the seafloor. The downside of using plastics as a marker is that they are not commonly found in glacial ice, so they are not a universal signal.
  • Nearly all of the reactive nitrogen on Earth has been produced since 1913, when German chemists Fritz Haber and Carl Bosch figured out how to capture nitrogen gas from the air and turn it into fertilizer. Since then, the amount of reactive nitrogen on Earth has more than doubled, with a substantial increase around 1950 as the Green Revolution industrialized farming practices. And though it sounds like it would be a good Anthropocene marker, nitrogen doesn’t leave a strong signal in the sediments. “The processes are not quite as well understood,” says Zalasiewicz. In some remote lakes in northern Canada, far from local human influences, the ratio of nitrogen isotopes (forms of the atom with different masses) shifts around 1950, reflecting the addition of nitrogen fertilizers. But whether this shift is consistent enough across lakes throughout the world to make a good signal isn’t yet certain.
  • Burning fossil fuels releases black “fly ash” particles into the atmosphere; with no natural source, they are clear signs of human activity. Those particles are now found in lake sediments throughout the world, starting as early as 1830 in the UK, and showing a dramatic, global increase beginning around 1950. “But they peaked already around the 1970s [through the] 1990s and are starting to decline,” says Waters. So, like radionuclides, fly ash signals a geologic shift but doesn’t make a good permanent indicator.
  • The increase in carbon emissions from burning fossil fuels is recorded as a shift in carbon isotopes, which appears in any material that traps carbon, including glacial ice, limestone, the shells of marine animals (found in seafloor sediment) and corals. The signal shows up around the Industrial Revolution, with a sharp increase around 1965. It's a good signal, says Zalasiewicz, though not quite as sharp as either the fly ash or the radioactivity.

Some human impacts aren’t yet visible in sediments, but could plausibly leave signals in the far future. For instance, people have extensively transformed Earth itself. We dig mines, landfills and foundations for buildings; we build dams, docks and seawalls, which alter water flow and erosion; we quarry and transport rock around the world to construct towns and cities; we churn and move topsoil for farming. Future paleontologists could find these man-made materials compressed into an unusual rock layer that would be conspicuously Anthropocene.

Then there are the future fossils left behind by today’s plants and animals—and those that will vanish as species go extinct. Any hard-bodied animal that sports a shell or is held up by bones has a chance to leave a fossil upon its death.

If we are in the midst of a mass extinction, which some scientists believe we are, the disappearance of common fossils could be another indicator. But this would be a messy signal, with different changes taking place at different times around the world. “It’s a more complicated signal simply because life is more complicated than the average radionuclide or carbon isotope,” says Zalasiewicz.

Another option is the fossils of species that dominate after extinctions, such as invasives, which might leave a cleaner signal. Zalasiewicz is currently leading a team that is studying the Pacific oyster, which was introduced from the Sea of Japan to coastlines around the world during the past century. It’s both abundant and likely to fossilize, giving it strong potential as an Anthropocene indicator.

“Where [the Pacific oysters] appear they will be a new element of the biology and therefore future paleontology in those strata,” he says. “But again because humans have transplanted different species at different times around the world, it’s a complicated or messy signal.”

These findings all play into the AWG's presentation this week at the IGC. The group originally hoped this presentation would coincide with their official submission on the Anthropocene to the International Commission on Stratigraphy. But after speaking with geologists on the commission, they decided to wait. “It’s clear that the community would be more comfortable and feel rather more grounded with a traditional golden spike type definition,” says Zalasiewicz. Collecting evidence of signals isn’t enough; they need to identify a location to hammer in the Anthropocene golden spike.

The group isn’t yet sure where they’ll place it; they’re eyeing sediment cores from the deep ocean or remote lakes where the layered signals are clear. But finding a good core comes with its own set of challenges because the layer of Anthropocene sediment is very thin. “If you went to the deep oceans, you might be talking about a millimeter or two of sediment,” says Waters. “All you need is a bivalve to crawl across the seabed and it’ll churn up the whole of the Anthropocene in one go.” In many places, trash or fishing trawls have already obliterated any potential Anthropocene layers.

The work of identifying a golden spike location will likely take years. The researchers may need to go out into the field, drill for sediment cores, and do complicated analyses to prove that the signals are consistent and global. Up to this point, AWG members have been doing this work on their own time; now they’ll need to find funding in order to devote themselves to the effort.

Zalasiewicz groans at the thought of it. “Writing grant applications is one of the world’s great soul-destroying jobs,” he says. But to stake a geologic claim to the Anthropocene and bring the world’s overseers of the geologic time scale to a vote, a bit of soul destruction may be worth it.

“The current signals that are forming are quite striking to us already,” he says. Even if humans died out tomorrow, a mark would likely remain in the geologic record of the far future. “A case can be made that it can be separable as a geological time unit. We cannot go back to the Holocene.”