
Why 'Paradise Lost' Is Translated So Much

Smithsonian Magazine

"Paradise Lost," John Milton's 17th-century epic poem about sin and humanity, has been translated more than 300 times into at least 57 languages, academics have found.

“We expected lots of translations of 'Paradise Lost,'" literature scholar Islam Issa tells Alison Flood of the Guardian, "but we didn’t expect so many different languages, and so many which aren’t spoken by millions of people."

Issa is one of the editors of a new book called Milton in Translation. The research effort, led by Issa, Angelica Duran and Jonathan R. Olson, looks at the global influence of the English poet's massive composition in honor of its 350th anniversary. Published in 1667 after a blind Milton dictated it, "Paradise Lost" follows Satan's corruption of Adam and Eve, painting a parable of revolution and its consequences.

Milton himself knew these concepts intimately—he was an active participant in the English Civil War that toppled and executed King Charles I in favor of Oliver Cromwell's Commonwealth.

These explorations of revolt, Issa tells Flood, are part of what makes "Paradise Lost" maintain its relevance to so many people around the world today. The translators who adapt the epic poem to new languages are also taking part in its revolutionary teachings, Issa notes. One of the best examples is Yugoslav dissident Milovan Djilas, who spent years painstakingly translating "Paradise Lost" into Serbo-Croatian on thousands of sheets of toilet paper while he was imprisoned. The government banned the translation, along with the rest of Djilas' writing.

That wasn't the first time a translation was banned—when "Paradise Lost" was first translated into German, it was instantly censored for treating Biblical events in "too romantic" a manner. Just four years ago, a bookstore in Kuwait was apparently shut down for selling a translation of Milton's work, though according to the owner, copies of “Paradise Lost” remained available at Kuwait University's library.

As the world becomes increasingly globalized, expect Milton's seminal work to continue to spread far and wide. The researchers found that more translations of "Paradise Lost" have been published in the last 30 years than in the 300 years before that.

Why Abraham Lincoln Was Revered in Mexico

Smithsonian Magazine

American historian Michael Hogan makes a bold claim. He says that Abraham Lincoln is in no small part responsible for the United States being blessed for many generations with an essentially friendly nation to the south—this despite a history that includes the United States' annexation and conquest of Mexican territory from Texas to California in the 1840s, and the nations’ chronic border and immigration tensions. “Lincoln is revered in Mexico,” Hogan says. As evidence, he points to the commemorative statues of Lincoln in four major Mexican cities. The one in Tijuana towers over the city's grand boulevard, Paseo de los Héroes, while Mexico City's Parque Lincoln features a replica of sculptor Augustus Saint-Gaudens' much-admired Standing Lincoln, identical to the one in London's Parliament Square. (The original stands in Lincoln Park in Chicago.) These are commanding monuments, especially for a foreign leader.

In his 2016 study, Abraham Lincoln and Mexico: A History of Courage, Intrigue and Unlikely Friendships, Hogan points to several factors that elevated the United States’ 16th president in the eyes of Mexicans, in particular Lincoln’s courageous stand in Congress against the Mexican War, and his later support in the 1860s for democratic reformist Benito Juárez, who has at times been called the “Abraham Lincoln of Mexico.” Lincoln’s stature as a force for political equality and economic opportunity—and his opposition to slavery, which Mexico had abolished in 1829—made the American leader a sympathetic figure to the progressive followers of Juárez, who was inaugurated as president of Mexico in the same month and year, March 1861, as Lincoln.

“Both were born very poor, pulled themselves up by their bootstraps, became lawyers, and ultimately reached the highest office of their countries,” says Hogan in a telephone interview from Guadalajara, where he has lived for more than a quarter-century. “Both worked for the freedom of oppressed peoples—Lincoln demolishing slavery while Juárez helped raise Mexican workers out of agrarian peonage.” (In a lighter vein, Hogan points out that physically, they were opposites: While the gangly Lincoln stood six-foot-four, Juárez reversed those numbers, at a stocky four-foot-six.)

Early on in Lincoln’s political career, as a freshman Whig congressman from Illinois, he condemned the 1846 U.S. invasion of Mexico, bucking the prevailing patriotic tide and accusing President James K. Polk of promoting a falsehood to justify war. After a skirmish of troops in an area of what is now south Texas, but was then disputed territory, Polk declared that "American blood has been shed on American soil” and that therefore “a state of war” existed with Mexico. “Show me the spot where American blood was shed,” Lincoln famously challenged, introducing the first of eight “Spot resolutions” questioning the constitutionality of the war. Lincoln’s stand proved unpopular with his constituents—he became known as “Spotty Lincoln”—and he did not seek re-election.

He was not alone in his protest, however. Among others, New Englanders such as John Quincy Adams, who lost a son in the war, and Henry David Thoreau, who wrote his famed essay, “On Civil Disobedience,” in reaction to the war, also dissented. Ulysses S. Grant, who distinguished himself as an officer serving in Mexico, later wrote in his memoirs that it had been “the most unjust war ever waged against a weaker nation by a stronger.”

In seizing more than half of Mexico’s territory as the spoils of war, the U.S. increased its territory by more than 750,000 square miles, which accelerated tensions over the expansion of slavery that culminated in the carnage of the American Civil War. Hogan believes strongly that the long-term economic impact on Mexico should inform thinking about border politics and immigration today: “We conveniently forget that the causes of northward migration have their origins,” he writes, “in the seizure of Mexico’s main ports to the west (San Diego, San Francisco, Los Angeles), the loss of the rich silver mines of Nevada, the gold and fertile lands of California, and the mighty rivers and lakes which provide clean water to the entire southwest.”

In the course of researching his Lincoln book, Hogan made an important discovery in the archives of the Banco Nacional de México: the journals of Matías Romero, a future Mexican Treasury Secretary, who, as a young diplomat before and during the American Civil War, represented the Juárez government in Washington.

Romero had written a congratulatory letter to Lincoln after the 1860 election; the president-elect cordially thanked him, replying: “While, as yet I can do no official act on behalf of the United States, as one of its citizens I tender the expression of my sincere wishes for the happiness, prosperity and liberty of yourself, your government, and its people.”

Those fine hopes were about to be tested as never before, in both countries.

During its own civil war of the late 1850s, Mexico had accrued significant foreign debt, which the French Emperor Napoleon III ultimately used as pretext to expand his colonial empire, installing an Austrian archduke, Ferdinand Maximilian, as Emperor Maximilian I of Mexico in 1863. The United States did not recognize the French regime in Mexico, but with the Civil War raging, remained officially neutral in the hope that France would not recognize or aid the Confederacy.

Nevertheless, the resourceful Romero, then in his mid-20s, found ways to secure American aid in spite of official policy, mainly by establishing a personal relationship with President Lincoln and the First Lady, Mary Todd Lincoln. From there, Romero was able to befriend Union generals Grant and Philip Sheridan, connections that would later prove crucial to the Mexican struggle. “What particularly endeared Romero to the American president,” Hogan notes, “was that he escorted Mrs. Lincoln on her frequent shopping trips…with good-natured grace. It was a duty which Lincoln was happy to relinquish.”

With Lincoln’s earlier letter in hand, Romero made the rounds with American bankers in San Francisco, New York and Boston, Hogan says, selling bonds that raised $18 million to fund the Mexican army. “They bought cannon, uniforms, shoes, food, salaries for the men, all kinds of things,” he says. “And Grant later helped them secure even better weapons—Springfield rifles. He would go to the Springfield people and say, ‘Get them some decent rifles. I don’t want them fighting the French with the old-fashioned ones.’”

After the Civil War, the U.S. became even more helpful in the fight for Mexican liberation. In a show of support, Grant dispatched 50,000 men to the Texas border under General Sheridan, instructing him to covertly “lose” 30,000 rifles where they could be miraculously “found” by the Mexicans. Sheridan’s forces included several regiments of seasoned African-American troops, many of whom went on to fight in the Indian Wars, where they were nicknamed the Buffalo Soldiers.

By 1867, the French had withdrawn their occupying army; the Juárez forces captured and executed Maximilian, and the Mexican Republic was restored. Though Lincoln didn’t live to see it, his Mexican counterpart had also triumphed in a war for the survival of his nation. “Lincoln really loved the Mexican people and he saw the future as us being allied in cultural ways, and also in business ways,” Hogan reflects. “He supported the growth of the railroads in Mexico, as did Grant, who was a big investor in the railroads, and he saw us as being much more united than we are.”

Though most of this history has receded in the national memories of both countries, Hogan believes that Lincoln’s principled leadership and friendship—outspoken in the 1840s, tacit in the 1860s—created a pathway for mutually respectful relations well into the future.

Why America Has a “President” Instead of an “Exalted Highness”

Smithsonian Magazine

In 1789, the year of George Washington’s election, America had spent six years recovering from the Revolutionary War and twice that amount of time trying to nail down what form the new nation’s government would take. The Articles of Confederation, an admirable failure of decentralization, would be replaced by the U.S. Constitution. But even with that binding document and a democratically elected leader, what would the United States of America call its new chief executive?

A king by any other name would be just as tyrannical—or so thought the earliest American politicians (and the Romans, who abhorred the title “rex” and its dangerous association with unchecked power). With only 10 weeks until Washington was to take office, Congress asked what now seems like a straightforward question: what should Washington’s title be? After all, he was the first of his kind, the leader of a newborn nation. And America couldn’t go on to another king after having just revolted against one.

So the debate began. Some delegates to the Constitutional Convention suggested “His Exalted Highness,” with others chiming in with the more democratic “His Elective Highness.” Other suggestions included the formal “Chief Magistrate” and the lengthy “His Highness the President of the United States of America, and Protector of Their Liberties.” The debate went on for multiple weeks, according to historian Kathleen Bartoloni-Tuazon, because the House of Representatives worried that too grand a title might puff Washington up with power, while the Senate feared Washington would be derided by foreign powers if saddled with something as feeble as “president” (the title originally meant, simply, one who presides over a body of people, similar to “foreman”).

“…[T]he debate over whether or not to give the president a regal title represents an early consideration of constitutional intent, just as it also comprises the ‘first dispute between the Senate and the House,’” Bartoloni-Tuazon writes in For Fear of an Elective King. “The fight over titles was hardly frivolous. The controversy explored an important constitutional question: How much like a monarch should the head of a republic resemble, particularly in the United States, whose revolution aimed at weakening the executive?”

The question of titles was a concern to the Founding Fathers even outside political office. Article 1, Section 9 of the Constitution states that “No Title of Nobility shall be granted by the United States.” Alexander Hamilton called the clause a “cornerstone of republican government,” saying that without titles of nobility, “there can never be serious danger that the government will be anything other than that of the people.”

Eventually the Senate agreed to the simplified version of their grandiose title, and Washington became President of the United States. “Happily the matter is now done with, I hope never to be revived,” Washington wrote at the conclusion of the ordeal.

While the debate over titles has mostly ended, the question of how to address former officials is ongoing. Some former presidents and politicians choose to be addressed by their titles even after their careers are over (not Washington, who reverted to his military title of “general” after leaving office, or John Quincy Adams or Harry Truman). Boston University professor of Law Jay Wexler says that while the practice of holding onto one’s title after leaving office isn’t unconstitutional, it does create a permanent class of citizens who keep their titles of distinction forever and is therefore “inconsistent with the spirit of the [constitutional] clause.”

But as etymologist Mark Forsyth reminds us in his TED Talk on the subject, titles and their meaning and uses are always changing. “Politicians try to pick and use words to shape and control reality, but in fact, reality changes words far more than words can ever change reality,” Forsyth says.

Since the creation of the office of president, the title has undergone its own permutations. In 1903 the pronounceable acronym “POTUS” first came into use, and was quickly followed by FLOTUS (First Lady of the U.S.) and SCOTUS (for the Supreme Court). Then came the moniker “Leader of the Free World,” with origins dating to the United States’ entrance to World War II. But even after two centuries and dozens of men taking the office, the original title still remains the most potent one: Mr. President. 

Why Did a Venomous Fish Evolve a Glowing Eye Spike?

Smithsonian Magazine

In 2003, Leo Smith was dissecting a velvetfish. Smith, an evolutionary biologist at the University of Kansas, was trying to figure out the relationships between mail-cheeked fishes, an order that includes velvetfishes, as well as waspfishes, stonefishes and the infamous lionfish. As he worked his way to the velvetfish's upper jaw, though, he realized something strange—he was having trouble removing the lachrymal bone.

“On a normal fish, there's a little bit of connective tissue and you can work a scalpel blade between the upper jaw and this bone,” recalls Smith, whose work centers on the evolution of fish venom and bioluminescence. “I was having just a horrible time trying to separate it. When I finally got it separated, I noticed there was this thing that's all lumpy and bumpy … it was then that it hit me that it had to be some sort of locking mechanism.”

To be fair, most velvet fish already resemble thorny, blobby mutants, so an extra skewer isn’t really that unusual. But given that Smith has spent years studying mail-cheeked fishes (Scorpaeniformes)—an order that gets its common name from the bone plates found on each cheek—you’d think he would have noticed a massive, locking eye spike before. He hadn’t. He and his colleagues would dub this strange new discovery the “lachrymal saber.”

(FYI: Lachrymal comes from the Latin word for “tear.” While fish can’t cry, it’s still the technical name for the bone forming the eye socket.)

Smith and his co-authors at the American Society of Ichthyologists and Herpetologists describe this unlikely eye spike for the first time in the journal Copeia—and even report on one that glows fluorescent green, a little eye lightsaber. The authors can’t yet say exactly what the appendage is for. But they do claim that it has the potential to profoundly rearrange the Scorpaeniformes evolutionary tree, changing what we know about these highly venomous fish.

The finding also raises the question: How the heck did a glowing, locking sword-like appendage go ignored for so long?

A species of stonefish, the Spotted Ghoul (Inimicus sinensis), buried in the gravel. (Leo Smith)

It’s easy to miss a stonefish. True to their name, they closely resemble rocks, with cobble-covered exteriors that mirror underwater rubble or chunks of coral. But step on one, and you’ll never forget it.

There are more venomous fish in the seas than snakes on land—or indeed, than all other venomous vertebrates combined—and the stonefish is one of the most venomous on the planet. Getting pricked by one of these marine monsters can feel, as an unlucky victim once put it, like “hitting your toe with a hammer and then rubbing over it again and again with a nail file.” While it’s uncommon, divers have even died after such an encounter.

Stonefish and their cousins are also marvelous at camouflage. Some grow algae and hydroid gardens on their backs, others can change color at will, and one, the decoy scorpionfish, has a lure on its dorsal fin that resembles a tiny, swimming fish. Found mainly in the tropical waters throughout the Indo-Pacific, these remarkable creatures use their disguises to both ambush prey and avoid becoming lunch themselves.

But the lachrymal saber, a unique aspect of these fish, had somehow gone overlooked. And while it’s not a Star Wars lightsaber or a blade from Lord of the Rings, this saber might be something even more impressive. Picture a complex spine under the fish’s eye that operates like a ratchet and pawl, laterally locking into place like two sharp arms. “They don't actually even move the saber itself,” says Smith. “They move the underlying bone that's connected to it through the locking mechanism and then that rotation is what locks it out.”

In at least one species—Centropogon australis, a breed of waspfish—the saber glows a biofluorescent lime green, while the rest of the fish glows orange-red under certain light.

Adam Summers, a biomechanist and fish specialist at the University of Washington, is currently trying to CT scan all 40,000 species of fish. Summers, who was not involved in the recent study, has already scanned 3,052 species and 6,077 specimens, and he has studied many mail-cheeked fishes for years. And he’s never noticed the saber.

“Erectile defenses in fishes are really common,” says Summers, who was also a scientific consultant on Pixar’s Finding Nemo and Finding Dory. He isn’t referring to fish penises, but to anatomical defenses that pop up when certain species are stressed or threatened. “If you’ve ever caught a fish and tried to pull it off the hook, you know the dorsal spines erect and they can poke the living crap out of you,” he says, “but that we missed one that was under the eye—sort of an eye saber—is pretty insane.”

To determine that these fish really are related beyond sharing the saber, the researchers in the new study turned to DNA sequencing. Looking at 5,280 aligned nucleotides and using 12 outgroups as controls, they built a phylogenetic, or evolutionary, tree. Once you have the tree, Smith explains, methods called ancestral character state reconstruction allow researchers to trace when characters evolve. And that may help biologists unify a group of fishes that was previously thought to be separate families.

“The taxonomy of Scorpaeniformes is historically muddled,” Smith explains. “The scorpionfish and stonefish relationships have been really problematic, and there have been a lot of family-level names attached to this group that are dramatically cleaned up when these groups are treated as the two main lineages rather than the 10 traditional families. It is much cleaner now and the presence of a lachrymal saber can separate the two revised families completely.”

An Ocellated Waspfish (Apistus carinatus) being skeletonized by flesh-eating beetles at the Field Museum. (Leo Smith)

When he was first dissecting the velvetfish, Smith didn’t understand what he was looking at. “I just thought they were kind of spinier or lumpier,” he says. “These fish have a lot of spines and bumps on their head. So I was like, ‘Oh, these [lachrymal] ones are kind of more interesting.’”

Smith spent years examining fish skeletons and live fish to determine how widespread this saber was. Fortunately, as a curator at the Biodiversity Institute at the University of Kansas, he has access to one of the largest libraries of fish specimens in the world.

Many of these exemplar fish were made using a method called “clearing and staining,” in which scientists use a mix of liquid formaldehyde and a stomach enzyme called trypsin to dissolve muscle and other soft tissue. The result is a clear skeleton with red-tinted bones and blue-colored cartilage, like stained glass. This technique makes it easy to study skeletal structures of vertebrates.

“People who study fishes closely often work with dead preserved fish and these kinds of really cool things don't work in an animal that isn’t mobile,” says Summers. Still, “to find this and then to realize that it’s an uniting character for a whole group of fishes is very, very cool.”

Smith isn’t sure why the fish evolved this trait. The obvious assumption is it’s defensive, given the projected spines expand the width of the head, making the fish harder to swallow and more likely to puncture a would-be predator. Similar defensive measures exist: the deep-sea lanternshark, for example, has glowing “lightsabers” on its dorsal spine that are believed to defend against predators. But Smith hasn’t seen the lachrymal saber used defensively, except in photographs of mail-cheeked fishes getting eaten.

“I went into this assuming it was an anti-predator, complex anatomical thing that grew that way and now as every day goes on, I start questioning that more and more,” Smith says. “Part of it is I can never get the stupid things to do it … I mean you would think if it was just anti-predator, if I bumped the tank they would immediately get them out.” The other option, he says, is that it might be for attracting mates, though he points out that both genders appear to have the sabers.

In other words, for now, the eye spike is still a mystery.

In 2006, with Ward Wheeler, Smith found that more than 1,200 species of fish are venomous, compared to previous estimates of 200. He updated that number a decade later to between 2,386 and 2,962. He also worked on a PLOS ONE paper with noted ichthyologists Matt Davis and John Sparks to show that bioluminescence evolved 27 separate times in marine fish lineages. He even revised the taxonomy of butterflyfishes.

With this new finding, Smith may have disrupted the way we think about fish relationships yet again, says Sarah Gibson, an adjunct professor of biology at St. Cloud State University in Minnesota who studies Triassic fish. “I think it's a pretty important, big study,” she says. “Knowing the evolutionary relationships of a group can really impact our understanding of the evolutionary history of fishes in general.” (Gibson worked with Smith when she was doing her dissertation, but was not part of the recent study.)

Understanding the evolution of stonefish is key to their conservation, adds Summers. “You can't conserve something unless you know who it is,” he says. The mystery of the lachrymal saber “is an interesting question that's worth addressing and I’m still blown away that we missed it.”

In the end, this discovery also underscores something Smith once told The New York Times: Despite centuries of research and exploration, “we really don’t know anything about fish.”

Why Do Secretaries of State Make Such Terrible Presidential Candidates?

Smithsonian Magazine

During her four years as the 67th secretary of state, Hillary Rodham Clinton visited 112 countries and logged 956,733 miles, setting a record as the most well-traveled U.S. envoy in history. But as Clinton mulls a second run for the presidency in 2016, there is one other number she may want to consider.

160.

By 2016, that is how many years it will have been since a candidate with secretary of state credentials was last voted into the White House. Before then, six secretaries of state had gone on to be elected president after their diplomatic service.

It might be convenient to trace the jinx to James Buchanan, who served as secretary of state under James K. Polk and later as U.S. envoy to Britain before being elected to the presidency in 1856. Most presidential scholars, after all, rank him the worst chief executive in U.S. history. But while Buchanan did fail to prevent the Civil War, political historians suggest he shouldn’t take the rap for sullying the prospects of his successors at State. If diplomats have fallen out of favor at the polls, they say, blame America’s transformation into a global power, universal suffrage, the rise of the primary system and the changing nature of the cabinet position itself.

Besides Buchanan, the other top diplomats who became president all served in the country’s infancy. The nation’s first secretary of state, Thomas Jefferson, was followed to the White House by James Madison, James Monroe, John Quincy Adams and Martin Van Buren.

At a time when there were few prominent national figures and only white men who owned property could vote, the pool of presidential contenders came mostly from the vice presidency and the most senior cabinet position.

“In the early days of the republic, the secretary of state was the heir apparent to the president,” says H.W. Brands, a University of Texas at Austin professor of American history. “Presidents could easily hand-pick their party's next candidate. The party caucuses formally selected the candidates but presidents guided the process. There were no primaries, and vote-getting ability had little to do with the nominee-selection process.”

Backroom dealing and the prospect that time spent in diplomacy would pay off later with the presidency played a key role in the contentious and inconclusive election of 1824.

Secretary of State John Quincy Adams came out the winner of what came to be known as the “corrupt bargain” that saw the House of Representatives bypass the top electoral college vote-getter, Tennessee’s Andrew Jackson, in favor of the son of the second president. Adams won the day with the help of Kentuckian Henry Clay, who detested the populist Jackson and threw his support to the New Englander. In repayment, Adams made Clay his secretary of state and, as was widely understood, his designated successor.

The voters, however, had other ideas. In 1828, Jackson turned Adams out of the White House after just one term and four years later trounced Clay to be re-elected. Clay tried again in 1844 but lost a third time. He would “only” go down in history as The Great Compromiser and one of the country’s greatest statesmen.

Clay’s equally prominent colleague in the Senate, Daniel Webster of Massachusetts, also waged three losing campaigns for president. Two of them came after two stints, a decade apart, as secretary of state under John Tyler and Millard Fillmore.

Like Clay and Webster, many early secretaries of state were domestic political powerhouses who weren’t necessarily experts in foreign affairs.

“After the Civil War, the position's requirements changed,” says Walter LaFeber, a professor emeritus at Cornell University and a historian of U.S. foreign relations. “Secretaries of state were much less political party leaders than able, in some cases highly able, corporate-trained administrators. Their job was no longer to serve as part of a political balance in the Cabinet, but to administer an increasingly complex foreign policy.”

Some of the most effective secretaries, LaFeber says, were corporate lawyers like Elihu Root, Philander Knox and Robert Lansing: establishment figures neither interested in nor known for their glad-handing skills with the hoi polloi. Others were career diplomats for whom politics held no appeal.

When the presidential primary system began to take hold in the second half of the 20th century, the distance between Foggy Bottom and 1600 Pennsylvania Avenue grew even longer.

“Suddenly, vote-getting ability was a big deal,” Brands says. “Secretaries of state, who often climbed the appointive ladder rather than the elective ladder, were untested and therefore risky. Their dearth as nominees and then presidents had little to do with their diplomatic skills; it had much to do with their absence of political chops.”

Voters wanted candidates who had won campaigns and came equipped with executive experience. In other words, governors like Jimmy Carter, Ronald Reagan and Bill Clinton. After Buchanan, the only president to be elected with substantial diplomatic credentials was George H. W. Bush, a former U.S. ambassador to the United Nations who later served as Gerald Ford’s envoy to China and director of the CIA. Secretaries of State, for that matter, were often selected from outside the legislature; prior to Clinton, the last senator to take on the cabinet role was Edmund Muskie in 1980.

“There is an elitism to running foreign policy,” says historian Douglas Brinkley. “You’re thinking about the world at large, but Americans like populists. You’ve got to play big in Des Moines, not in Paris. It used to be in the early republic that having your time in Paris was a big credential for president. It’s no longer that.”

Indeed, the White House cabinet room may be one of the worst springboards to the presidency overall. Besides the six diplomats, only former secretary of war William Howard Taft and former commerce secretary Herbert Hoover have made the jump to the Oval Office. Taft would also be confirmed as Chief Justice of the Supreme Court after his presidency.

However, losing a presidential campaign (or two or three) is a time-tested route to the secretariat. In the late 19th century, Maine Republican James Blaine would intersperse two separate terms as secretary of state with three failed runs for president. Democratic firebrand William Jennings Bryan lost three presidential elections before Woodrow Wilson appointed him to the post in 1913.

Current Secretary of State John Kerry, whose perceived French connection contributed to his loss to incumbent George W. Bush in 2004, and Hillary Clinton, who lost a historic election to Barack Obama four years later, came to the job like many of their predecessors: as a consolation prize.

Now, as Clinton ponders whether to become the first former secretary of state since Alexander Haig in 1988 to run for president (something another highly touted top diplomat, Colin Powell, took a pass on), is precedent weighted against her?

Not necessarily, says University of Virginia political scientist Larry Sabato. Despite Republican promises to make her handling of the 2012 attack in Benghazi an issue if she runs, being at State “has helped Hillary Clinton enormously,” he says, “because if there is anyone who needed to be put above politics, what with Bill, it was Hillary Clinton.”

Presidential scholar Stephen Hess of the Brookings Institution doesn’t see parallels to other secretaries of state who ran for the White House and lost. As a former first lady who was twice elected to the U.S. Senate and could make history as America’s first woman chief executive, Clinton “by now is in a category by herself.”

Why No One Can Agree on What George Washington Thought About the Relationship Between Church and State

Smithsonian Magazine

To commemorate the end of a bloody Revolutionary War, George Washington issued what might be considered the first executive order, setting aside the last Thursday of November as a day of thanksgiving and prayer. His 1789 Thanksgiving Proclamation was short, a mere 456 words, punctuated by references—“Almighty God,” “the Lord and Ruler of Nations,” “the great and glorious Being,” “the beneficent Author of all the good that was, that is, or that will be”—to a Supreme Being.

Pointing to sources like the proclamation, today’s religious leaders often count Washington as one of their own. The late evangelical writer Tim LaHaye, whose Left Behind series sold over 11 million copies, cast Washington as a “devout believer in Jesus Christ” who had “accepted Him as His Lord and Savior.” David Barton, founder of WallBuilders, an evangelical Christian advocacy organization, and the former vice chairman of Texas’s Republican Party, pictured a reverent Washington kneeling in prayer at Valley Forge on the cover of his book, America’s Godly Heritage. And many politicians look to texts like Washington’s proclamation as proof that America was founded as a Christian nation.

But what did Washington’s talk of this “glorious Being” really mean at the time? Are these references proof that Washington would, in LaHaye’s words, “freely identify with the Bible-believing branch of evangelical Christianity?” Or do they mean something else—something that would have been clear to Washington’s audience in 1789—but which eludes us today?

To find out, research psychologist Eli Gottlieb and I conducted a study in which we asked people with varied levels of historical knowledge and religious commitment to read Washington’s proclamation and tell us what they thought. At one end of the spectrum were members of the clergy; at the other were agnostic and atheist scientists. We also questioned professional historians, religious and nonreligious alike.

Clergy and scientists agreed that Washington was deeply pious, but where they parted ways was about whether his piety should be applauded—or denounced. A Methodist minister found support in Washington for the claim that the United States was founded on a “general Christian faith” and that “religion and spirituality played a significant role” in American life, more so than people are willing to admit today.

For their part, scientists chafed at Washington’s “violation of church and state.” A biologist compared the president to a “country preacher” who arrogantly assumed “that everybody believed the same thing.”

And the historians? They reacted so differently that it seemed as if they had read a different document entirely.

Regardless of their religious leanings, historians focused less on what was in Washington’s address than on what wasn’t. One historian remarked that the proclamation would “depress Pat Robertson,” the evangelical media mogul and chairman of TV’s Christian Broadcasting Network, who would fume at the fact that the proclamation made “no mention of Jesus Christ.” In lieu of recognizable markers of Christian piety—Jesus, Son of God, the cross, the blood of salvation, the Trinity, eternal life, the Resurrection—one finds airy and nondescript abstractions like “great and glorious Being” or “the Lord and Ruler of Nations.”

Historians were not deaf to Washington’s religious references. While the clergy and the scientists saw them as evidence of Washington’s devotion, the historians stressed the president’s precision in crafting a vocabulary that would unite the dizzying array of Protestant denominations in post-revolutionary America without alienating the small but important groups of Catholics, Jews, and freethinkers dotting the American landscape. It was precisely because he understood that Americans did not believe the same thing that Washington was scrupulous in choosing words that would be acceptable to a wide spectrum of religious groups.

In his own time, Washington’s reluctance to show his doctrinal cards dismayed his Christian co-religionists. Members of the first Presbytery of the Eastward (comprised of Presbyterian churches in Massachusetts and New Hampshire) complained to the president that the Constitution failed to mention the cardinal tenets of Christian faith: “We should not have been alone in rejoicing to have seen some explicit acknowledgement of the only true God and Jesus Christ,” they wrote. Washington dodged the criticism by assuring the Presbyterians that the “path of true piety is so plain as to require but little political direction.”

Similarly, a week before his 1789 proclamation, Washington responded to a letter from Reverend Samuel Langdon, the president of Harvard College from 1774-1780. Langdon had implored Washington to “let all men know that you are not ashamed to be a disciple of the Lord Jesus Christ.” Once again, instead of affirming Christian tenets, Washington wrote back offering thanks to the generic “Author of the Universe.”

Even historians who have spent a lifetime studying Washington find his religious beliefs difficult to pin down. (John Adams once remarked that Washington possessed the “gift of silence.”) According to historian John Fea, himself an evangelical Christian, Washington’s Christianity took a back seat to his republicanism; Washington believed that personal interests and commitments of faith should be, as Fea put it, secondary to the “greater good of the nation.”

The last state to ratify the Constitution was Rhode Island, and only after they had done so did Washington agree to visit the state. Arriving in Newport on August 17, 1790, Washington listened to the town’s notables offer greetings, among them a representative from Yeshuat Israel, Newport’s Hebrew congregation. Moses Seixas thanked Washington for “generously affording” the “immunities of Citizenship” to a people “deprived as we heretofore have been of the invaluable rights of free Citizens.”

Moved by these words, Washington responded four days later by making clear to the members of Yeshuat Israel that citizenship in this new country was not a matter of “generosity” or the “indulgence of one class of people” by another. America was not Europe, where tolerance of religious minorities, where it occurred, was an act of noblesse oblige. In the United States, Washington explained, “all possess alike liberty of conscience and the immunities of citizenship.”

Today, George Washington has been conscripted into the culture wars over the religious underpinnings of this country. The stakes are high. As one prominent theologian put it, if Washington can be shown to be an “orthodox Trinity-affirming believer in Jesus Christ” then “Christianity today is not an interloper in the public square” but can be mobilized to counter “the secular assault against the historic values and beliefs of America.” But those who summon the first president to the contemporary battlefield must pay a price: They must scrub Washington of the ambiguity, prudence, nuance, tact, and caution that so defined his character.

In the rare moments when Washington was forthcoming about religion, he expressed fear about using faith as a wedge to separate one American from another. He understood how religious disputes tear at civic union. “Of all the animosities which have existed among mankind,” Washington wrote Sir Edward Newenham in the midst of the bloodletting between Ireland’s Protestants and Catholics, “those which are caused by a difference of sentiments in religion appear to be the most inveterate and distressing.”

Washington dreamed of a nation, as he wrote to Newport’s Hebrew Congregation, that gives “bigotry no sanction … persecution no assistance.” What makes Americans American, he believed, is not the direction they turn to in prayer. Rather, it is the respect they owe fellow citizens who choose to turn in a different direction—or in no direction at all.

Sam Wineburg is a professor of education at Stanford University. His latest book is Why Learn History (When It’s Already on Your Phone).

Why the Colonies’ Most Galvanizing Patriot Never Became a Founding Father

Smithsonian Magazine

As John Adams told it, the American Revolution didn’t start in Philadelphia, or at Lexington and Concord. Instead, the second president traced the nation’s birth to February 24, 1761, when James Otis, Jr., rose in Boston’s Massachusetts Town House to defend American liberty.

That day, as five red-robed judges—and a rapt, 25-year-old Adams—listened, Otis delivered a five-hour oration against the Writs of Assistance, sweeping warrants that allowed British customs officials to search any place, anytime, for evidence of smuggling.

“It appears to me the worst instrument of arbitrary power,” argued Otis, “the most destructive of English liberty…that was ever found in an English law-book.” Until this case, the 36-year-old lawyer had been Massachusetts’ advocate general. But he resigned rather than defend the writs, then agreed to provide pro bono representation to the merchants fighting against them. Inside the courtroom, Otis denounced the British king, parliament, and nation as oppressors of the American colonies—electrifying spectators.

“Otis was a flame of fire,” Adams recalled years later. “American Independence was then and there born.…Then and there was the first…opposition to the arbitrary claims of Great Britain.”

At the time, Otis was the most brilliant orator in Massachusetts, and one of the most influential protesters against Britain’s colonial laws. But you may never have heard his name. He’s the Founding Father who could’ve been.

Born in 1725 in West Barnstable, Massachusetts, Otis enrolled in Harvard at age 14. He developed a reputation as an eloquent defense lawyer early in his career, successfully defending accused pirates in Halifax, Nova Scotia, and young men in Plymouth accused of rioting on Guy Fawkes’ Day. “He had the orator’s fire and passion,” wrote John Clark Ridpath in his 1898 biography of Otis; “also the orator’s eccentricities—his sudden high flights and transitions, his quick appeals and succession of images.”

In the patriotic version of Otis’ life story, conscience called him to defy British authorities after Massachusetts Governor Francis Bernard used the Writs of Assistance to enforce a long-dormant tax on molasses. But to hear his rivals tell it, a family feud inspired his rebellion. Thomas Hutchinson, the Massachusetts Bay Colony’s lieutenant governor, beat out Otis’ father for the job of chief justice in 1760. The younger Otis went to Hutchinson, “swore revenge,” and vowed to “set the province in flames,” the lieutenant governor claimed in his history of Massachusetts. Ridpath, however, dismissed the story. “The art of political lying was known even among our fathers,” he wrote.

Otis’ arguments at the 1761 trial didn’t win over the court, which upheld the Writs of Assistance. But Bostonians, impressed with his oratory, elected him to the Massachusetts House of Representatives soon after. There, he led patriots’ efforts to challenge a succession of British laws and taxes, gaining more fame with every outspoken defense of the colonists’ freedoms.

He developed a reputation as fiery, brilliant and erratic. Friends called him Furio; his archrival, Hutchinson, dubbed him the Grand Incendiary. “Otis is fiery and feverous,” John Adams wrote in his diary in 1765; “his imagination flames, his passions blaze; he is liable to great inequalities of temper; sometimes in despondency, sometimes in a rage.”

His defiance did more than inflame colonists’ passions—it stirred them to actively resist.

He likely didn’t coin the phrase “Taxation without representation is tyranny”; that attribution is an overstatement based on John Adams’ paraphrase of his 1761 speech. Nonetheless, Otis deserves credit for advancing the idea behind the phrase, and as time went on his opposition to taxation only increased.

“The very act of taxing, exercised over those who are not represented, appears to me to be depriving them of one of their most essential rights,” Otis wrote in his 1764 pamphlet, “The Rights of the British Colonies Asserted and Proved.” The pamphlet, which argued that Parliament had no authority to tax the colonies unless they were granted seats in it, was debated in Parliament itself. “It is said the man is mad,” declared Lord Mansfield during one debate. “The book is full of wildness.”

In March 1765, Parliament imposed the Stamp Act, a tax on nearly every document printed in the colonies. Otis played a leading role in the Massachusetts legislature’s opposition to the law. And when the Townshend Acts levied new taxes on the colonies and revived the hated Writs of Assistance two years later, Otis and Samuel Adams co-wrote the Massachusetts House’s protest letter, again arguing that Parliament had no right to tax the colonies. An enraged King George III declared the letter seditious and demanded that the House rescind it. “Let Britain rescind her measures, or the colonies are lost forever,” Otis replied. The House rejected the demand, standing by its letter. The governor, furious, dissolved the legislature.

All that defiance damaged Otis’ marriage. His wife, Ruth, a loyalist, disagreed with his politics. “He mentioned his wife – said she was a good wife, too good for him – but she was a Tory,” John Adams wrote in his diary. “She gave him certain lectures.” Meanwhile, as tensions rose in Boston, Otis worried that the colonies would soon reach a boiling point. “The times are dark and trying,” he told legislators in 1769. “We may soon be called on in turn to act or to suffer.”

His words proved all too true. That summer, he learned that the four British customs commissioners in Boston had complained about him in letters to London. Enraged, he accused them of slander in a local newspaper. They were “superlative blockheads,” he wrote, threatening to “break [the] head” of commissioner John Robinson. The next night, Otis found Robinson at the British Coffee House near Boston’s Long Wharf and demanded “a gentleman’s satisfaction.” Robinson grabbed Otis by the nose, and the two men fought with canes and fists. The many loyalists in the coffee house pushed and pulled Otis and shouted for his death. British officers stood by and watched.

Otis was left bleeding. Months later, he still had a deep scar; “You could lay a finger in it,” John Adams recalled. The trauma unhinged his already fragile psyche. He started drinking heavily, expressing regret for opposing the British, and wandering Boston’s streets.

“He rambles,” Adams wrote in his diary in January 1770, “like a ship without a helm….I fear, I tremble, I mourn for the man, and for his country.” By February, Adams wrote, his friend was “raving mad, raving against father, wife, brother, sister, friend.”

Though Otis was re-elected to the House in 1771, he was too mentally troubled to play much of a role. John and Samuel Adams and other friends continued to support and socialize with him, but they weren’t surprised when his mind turned fiery and wild again. That December, his rival Hutchinson wrote, Otis was carried away, bound hand and foot. He spent much of the rest of his life living with various friends in the countryside, alternating between lucid moments and relapses.

The Revolution took a toll on Otis’ divided family. His son, James Otis III, enlisted in the American navy and died in a British prison at age 18. His daughter, Elizabeth, a loyalist, married a British captain and moved to England; Otis disowned her.

Friends and family took up Otis’ banner after he left politics. His peers took on leadership roles in the Revolution that he might have assumed. His sister, Mercy, went from answering his correspondence to organizing political meetings and publishing anti-British political satires—one of the first women in America to write for the public. His younger brother, Samuel Allyne Otis, was the first secretary of the U.S. Senate, serving from 1789 to 1814.

In early 1783, John Hancock, then Massachusetts’ governor, threw a public dinner to mark his friend’s return to Boston. But the speeches and toasts threw off Otis’ mental balance, and his family took him back home to the countryside. There, Otis burned most of his papers. On May 23, 1783, he stepped out of his friend’s house to watch a thunderstorm—and was killed by a lightning bolt.

Otis was “as extraordinary in death as in life,” John Adams wrote upon hearing the news. “He has left a character that will never die while the memory of the American Revolution remains.”

Why the New U.K. Political Coalition Could Undermine Peace in Ireland

Smithsonian Magazine

When British Prime Minister Theresa May called a Parliamentary election in hopes of securing an absolute majority for the Conservative Party, she didn’t realize what a gamble the move would prove to be. Instead of winning big, her party lost 13 seats—and majority control of Parliament.

So May turned to the Democratic Unionist Party (the DUP)—a little-known conservative party from Northern Ireland—to form a coalition that would give her a working majority in Parliament. But the seemingly simple deal may come with a heap of trouble: It’s angered other political groups, may undermine Brexit negotiations, and could upend almost two decades of peace in the turbulent region of Northern Ireland.

Confused yet? Here’s a guide to the most puzzling questions about the DUP, Northern Ireland and Brexit.

What’s the deal with May’s deal?

On June 26, Theresa May and Arlene Foster, leader of the DUP, agreed to a supply and confidence agreement that will help May’s Conservative Party get the votes it needs to control decision-making in Parliament. The price of this deal? Forking over £1.5 billion (almost $2 billion) to Northern Ireland over the next two years, only £500 million of which had previously been earmarked for the region. The money will go towards infrastructure, health and education. In return, the DUP will support the Tories (the Conservative Party) on platforms like homeland security legislation and Brexit negotiations by providing the necessary votes.

What is Northern Ireland?

Politically, Northern Ireland is a part of the United Kingdom (if you need to brush up on what that means, read this). Geographically, it’s part of the island of Ireland, but not part of Great Britain—and that’s exactly as complicated as it seems.

It all started nearly a millennium ago, when an English king invaded Ireland. Power shifted back and forth a number of times over the centuries, and relations got more fraught after Henry VIII introduced Protestantism to the Catholic country in 1534. All the while, English colonists were coming to the island of Ireland and establishing themselves there, especially in the northeast around the industrial hub of Ulster. This region would eventually become the political entity known as Northern Ireland.

Centuries of fighting culminated in the Government of Ireland Act of 1920, which split the country into six majority-Protestant counties in the north and 26 majority-Catholic counties to the south. Thus Northern Ireland was born, and the rest of Ireland was left to rule itself, eventually as the Republic of Ireland.

Who’s in charge of Northern Ireland?

Northern Ireland is technically part of the United Kingdom, but its domestic affairs are not run directly by the Parliament at Westminster. Rather, two opposing political groups share power in the Northern Ireland Executive, also known as a devolved government. Those power-sharing groups are the leftist Sinn Fein (also known as nationalists, those who want to join the nation of Ireland) and the conservative DUP (or unionists, who want Northern Ireland to remain part of the United Kingdom).

The power-sharing scheme was created during the Good Friday Agreement of 1998, which ended a 30-year period of violence between the two groups that left nearly 4,000 dead and some 50,000 injured. But this January, the power-sharing coalition collapsed, and even after Sinn Fein won a historically large number of seats in the Northern Ireland Assembly in March—just one fewer than the DUP—no agreement was reached between the parties that would allow them to move forward.

Talks to reform the semi-autonomous government are still ongoing. But with the new coalition between the DUP and the Tories, those talks might be even more strained than before. According to The Telegraph, Sinn Fein president Gerry Adams said, “The DUP are showing no urgency or no real inclination to deal with the rights-based issues which are at the crux and heart of these difficulties”—including marriage equality, an Irish language act and the country’s legacy of violence.

What is the DUP?

The Democratic Unionist Party was created by radical Protestant leader Ian Paisley in 1971. The group was on the unionist side of the Troubles—they wanted Northern Ireland to stay part of the “union” with the U.K., in part because many members trace their ancestry back to mainland Britain. The culturally conservative party has vetoed same-sex marriage legislation and opposes legalizing abortion, and its members have denied climate change and supported the teaching of creationism. It’s also connected to the far-right Orange Order, “whose members are forbidden from marrying a Catholic, from participating in Roman Catholic Churches,” says Jonathon Tonge, a professor of political science at the University of Liverpool and author of Northern Ireland.

Though the DUP is socially conservative, the party is left of center when it comes to economic issues. “On economics it’s more populist, it wants the government in Westminster to spend more money in Northern Ireland,” Tonge says. That’s evident in the deal the party negotiated with May, which resulted in much more funding for Northern Ireland social services.

Isn’t that opposition party, Sinn Fein, in support of terrorists?

Early in its history, Sinn Fein supported the Irish Republican Army, which has alternately been termed a group of terrorists or freedom fighters. Either way, the IRA was behind multiple deadly attacks in Northern Ireland and on mainland Britain. But since the Good Friday Agreement, Sinn Fein has denounced violence in the name of Irish nationalism, and has operated as the left-wing opposition to the DUP.

Shouldn’t everyone be happy that DUP is negotiating for more money for Northern Ireland?

Yes and no. “So long as the DUP stays just with the cash rather than the sash—that being the sash of the Orange Order—it needn’t alienate nationalists,” Tonge says. In other words, if the DUP just accepts the money for Northern Ireland, it shouldn’t cause any controversy with Sinn Fein. But the DUP may use their position to later demand Northern Ireland end investigations into the British state for crimes committed during the Troubles, or that they end the Parades Commission that dictates where the Orange Order can march. (In the past, Orangemen marches through predominantly Catholic neighborhoods have resulted in riots and violence, which is why the commission was created.) Both these agenda items run counter to Sinn Fein’s platform.

The other problem is that the Good Friday Agreement of 1998 hinges on the British government being a neutral, third-party peace broker. “The Tory-DUP pact undermines the neutrality as it is an agreement between the governing party and a staunchly unionist party. This could have far reaching ramifications,” including difficulty reforming the devolved government, said Henry Jarrett, a University of Exeter professor of international relations, by email.

The sentiment has been echoed elsewhere. “The peace process, which was very hard earned over very many years … people shouldn’t regard it as a given,” former conservative Prime Minister John Major told the BBC. “It isn’t certain, it is under stress. It is fragile.”

What does all this mean for Brexit?

The DUP-Tory coalition definitely makes Brexit negotiations more complicated. First and foremost, the Republic of Ireland is part of the European Union, and that won’t change regardless of what its neighbor does.

Since the Good Friday Agreement was reached, the border between Northern Ireland and Ireland has been more of a political fact than a physical one. There are no fences, no towers, no tariffs on goods passing between the two regions. But all that could change under Brexit.  

“If Northern Ireland is outside the EU, which it would be, then there’s going to have to be tariffs on goods,” Tonge says. “The DUP doesn’t want special status in the U.K., it thinks that will be a slippery slope toward a unified Ireland. They want to leave the E.U., but they don’t want any of the consequences that come from leaving the E.U.” In other words, the DUP wants to support the rest of the United Kingdom in Brexit, but it doesn’t want to face any of the consequences of doing so, because that would mean barriers between Northern Ireland and the Republic of Ireland, which might reignite the violent fight over unification.

The president of Sinn Fein, Gerry Adams, has gone so far as to say that taking Northern Ireland out of the E.U. will destroy the Good Friday Agreement. But Tonge is slightly more optimistic, because all sides are taking the issue seriously.

“All sides recognize the sensitivity of the border and don’t want to go back to the days when it was like a fortress,” Tonge says. No one wants a conflict that lasted 30 years to start up again, but how Brexit can be negotiated without triggering one is still up for debate.

Wild Turkey

National Museum of American History

Wild at Heart

Smithsonian Magazine

At 9 a.m. the morning fog is beginning to lift from eastern Yosemite Valley. Thirteen sixth graders are milling around, preparing to set off on a daylong excursion. Bundled in fleece jackets against the chilly air, the kids are chattering about their ultimate destination: Yosemite's "Spider Caves." One rumor—that it's pitch-dark in there—is true. But others just may be exaggerated. "My sister has been there before; she said you can fall a really long way," says 11-year-old Charles Healow.

The students have converged here under the auspices of the Yosemite National Institutes, a nonprofit organization dedicated to connecting young people with this magnificent wilderness, using its 761,000 acres as a classroom. On weeklong outdoor-education outings in Yosemite, iPods and laptops are banned. For many of these sixth graders, all of whom attend the Notre Dame des Victoires school in San Francisco, going unplugged is a rude awakening. Ordinarily, admits 11-year-old Kenny Tankeh, "I'd be home right now, watching TV." The institute estimates that school-age children across the nation spend, on average, fewer than eight minutes outside each day. "We're aiming to be the antidote to this nature deficit disorder," says the organization's Adam Burns.

Today the instructors are leading 20 student groups into the park. During an academic year, more than 14,000 kids will trek here. Most attend schools in California; this week's groups include a contingent from New York City. Last month, students came from as far away as Beijing.

The wilderness immersion programs were started in 1971, when Santa Barbara high-school teacher Don Rees brought a class to Yosemite. That same year, the National Park Service, working with Rees, expanded on his idea to help create the Yosemite National Institutes. The fledgling venture benefited from the support of several high-profile board members, including astronaut Bill Anders and actor Robert Redford, who had worked in Yosemite after high school.

Early in the afternoon, the Notre Dame des Victoires group, led by institute staffer Laura Manczewski, clambers up a rocky slope to the Spider Caves. The youngsters lower themselves into impenetrable darkness. "I can't find my foot!" yells 12-year-old Charles Kieser from the inky gloom. "I lost my right foot!" Ten minutes and 100 feet later, the students emerge one at a time through a crevice, smiling and squinting into the light.

Manczewski tells everyone to sit on the grass and write in their journals. A half-hour later, Kieser volunteers to share his musings. "The Spider Caves are like life, because you can't always see ahead of you," he reads, "but if you keep going, you'll find your way." It's the kind of road-less-traveled insight that John Muir himself might well have understood—and appreciated.

William Dean Howells

National Portrait Gallery

William Earl Dodge Memorial [sculpture] / (photographed by Joseph Hawkes)

Archives and Special Collections, Smithsonian American Art Museum
On photo mount label: J.Q.A. Ward. Portrait statue of William Earl Dodge. New York, Herald Square. Photographer: Hawkes. 1886. Classification number: 282/W259/652. Accession: 46493.

Historical photograph of the sculpture at Herald Square (before it was moved to Bryant Park in 1941), showing its original base, designed by Richard Morris Hunt, which was not saved in the move. Also included in the photograph are a view of the New York Herald Building, which was demolished in 1921, and the Minerva and the Bell Ringers sculpture by Antonin Jean Carles on the building's roof. When the building was demolished, the Minerva sculpture was saved and placed in a niche in Herald Square in 1940 as a memorial to the New York Herald's founder, James Gordon Bennett.

Sharp, Lewis I., "John Quincy Adams Ward: Dean of American Sculpture," Newark, DE: University of Delaware Press, 1985, no. 73.

1 photographic print : b&w, 9 3/8 x 7 1/4 in. (trimmed), mounted on 9 3/4 x 13 7/8 in. board.

William H. Vanderbilt [sculpture] / (photographed by Peter A. Juley & Son)

Archives and Special Collections, Smithsonian American Art Museum
Sharp, Lewis I., "John Quincy Adams Ward: Dean of American Sculpture," Newark, DE: University of Delaware Press, 1985.

Black-and-white study print (8x10).

Orig. negative: 8x10, Glass, BW.

William Shakespeare [sculpture] / (photographed by Joseph Hawkes)

Archives and Special Collections, Smithsonian American Art Museum
On photo mount label: J.Q.A. Ward. Portrait statue of William Shakespeare (1564-1616). New York, Central Park. Mall. 1872, bronze. Costume, English. 16th cen. end-17th cen. early. Photographer: J. Hawkes. Classification number: 282/W259/656. Accession: 46495.

Sharp, Lewis I., "John Quincy Adams Ward: Dean of American Sculpture," Newark, DE: University of Delaware Press, 1985, no. 50.

1 photographic print : b&w, 9 1/4 x 7 in. (trimmed), mounted on 9 3/4 x 13 7/8 in. board.

William Wilson Corcoran [sculpture] / (photographed by Peter A. Juley & Son)

Archives and Special Collections, Smithsonian American Art Museum
Sharp, Lewis I., "John Quincy Adams Ward: Dean of American Sculpture," Newark, DE: University of Delaware Press, 1985.

Black-and-white study print (8x10).

Orig. negative: 8x10, Glass, BW.

Woman

National Museum of the American Indian

Woman

National Museum of the American Indian

Woman At Home

National Museum of the American Indian

Woman Kneeling

National Museum of the American Indian

Woman Tablita Dancer

National Museum of the American Indian

Woman's Dress, 1855–65

National Museum of American History
This elaborate summer dress was worn by Mary Louisa Adams Johnson, a granddaughter of President John Quincy Adams and a great-granddaughter of President John Adams. Mary Louisa was born at the White House in Washington, D.C. on December 2, 1828. On June 30, 1853 she married her second cousin, William Clarkson Johnson of New York. She was his second wife. They had two children, Louisa Catherine Adams Johnson (who later married Erskine Clement), born in 1856, and John Quincy Adams Johnson, born February 12, 1859. She died at Far Rockaway, Long Island, New York on July 16, 1859, just a few months after her son's birth. Mrs. Johnson may have had the opportunity to wear this dress only a few times before her death.

This dress, with its careful cutting of the printed fabric to create the overall effect, was clearly made by a skilled dressmaker. Since labels were not incorporated into dresses at this date, we will never know the name of the woman who made it. Silk gauze dresses of this sort were extremely popular for summer wear in the 1850s, as evidenced by the number of them depicted in fashion plates, the hand-colored fashion illustrations that were inserted into women's magazines. Because of the fragility of the open weave of the fabric, most existing examples are in poor condition. Buckram underskirts, which created the fashionable bell skirt silhouette, also abraded the gauze, causing additional damage. This example is in relatively good condition, with only a few tears in the skirt.

This two-piece dress is constructed of white silk gauze in a woven pattern with a printed small paisley design, predominantly red and blue at the top portion of the dress with medallions and flowers in browns and reds at the lower portion. The fabric has been cut and stitched to use the pattern to advantage in the dress.

The bodice is made of a white cotton body covered with the paisley printed fabric. It has a round neck edged with corded piping. The center front opening is closed from waist to neck with twelve hidden metal hooks-and-eyes, with the front of the opening decorated with five silk tassels. The bodice has two boned darts on either side of the center front and a center back piece boned in the center with two smaller pieces on either side. A peplum is formed of separate pieces of gauze in a medallion design sewn to the body of the dress at the waist. The peplum is pointed at the center front, the center back, and the sides and is trimmed at the bottom edge with pink ribbon with chenille button fringe.

The bell sleeves are lined in the upper portion with cotton, and the paisley printed fabric is pleated to fit the sleeve linings at the upper portion. The lower portions of the sleeves are constructed of the medallion print. Some pleats on the upper sleeves and sleeve openings are trimmed with the same silk and chenille ribbon as the peplum. Tassels are attached at the points of the sleeve openings, and the same piping that edges the neckline is inserted in the seams of the armholes.

The skirt consists of separate sections of buckram and gauze pleated and attached together to a narrow buckram waistband with a large metal hook-and-eye closure at the opening. The buckram skirt section has a narrow fold-over hem, and the gauze skirt section has a partial overskirt that forms a flounce. The lower edge of the flounce and the gauze skirt section are trimmed with silk and chenille ribbon. The waist measures 20 inches.

Woman's Hairpin (Binyeo)

NMNH - Anthropology Dept.
"Late 19th century. Pewter. An unornamented everyday hairpin for the chignon was worn by married won1en until the mid-twentieth century. Binyeo are gold, silver, jade, nickel, wood, bamboo, horn or ivory. Pearls and coral ornament long silver or other metal pins. Enameled phoenix or dragon motif heads were strictly for royal court ladies' ceremonies. In later years, lower-class women popularized such pins by wearing them at their weddings (for illustrations and non1enclature, see Kim, 1988b: 258; DHM ('Doseol Hanguk oegyo-sa yeonpyo' or The Folkcrafts of Korea), 1980: 174-175; HUM ('Hanguk ui mi' or Beauty of Korea), 1988: 72; Jang, 1999: 1833). A long pin with plum blossom and bamboo symbolizes chastity. An unornamented willow or bamboo hairpin represents widowhood and mourning (Adams, 1987: 35). Bernadou Field Notes 138 "... pin for fastening the hair. Of pewter, but often made of silver and sometimes gilt. The hair is done up in a plait, but the end is taken up and back and plaited in." Collected in Seoul. Ref: Hough Korean Catalog p. 450; Bernadou Field Notes 138" [from: "An Ethnography of the Hermit Kingdom: The J.B. Bernadou Korean Collection 1884-1885", Chang-su Cho Houchins, 2004, number 38]

Woman's leggings

National Museum of the American Indian

Woman's leggings

National Museum of the American Indian