Here’s how stereotype threat works: there’s an idea out there about some group you identify with—maybe it's being a woman or a person of color or gay or disabled. You know that stereotype, and you’re afraid that you might reinforce it. Which makes you anxious. Which means you perform worse than you might if you were just focusing on whatever you were supposed to be doing.
“Girls are bad at math” is a classic example of this. Girls know that they’re supposed to be worse at math than boys, and they fear that their performance at math will be used to reinforce that stereotype. Their fear often leads to worse performance at math because they’re so focused on not proving the stereotype.
This phenomenon has been demonstrated in research labs over and over and over, but it’s often hard to pin down in the “real world” because there are so many variables. But now, researchers say they’ve got another piece of pretty good proof that what they’re seeing in the lab really does hold up in real life. It comes from the chessboard.
A recent study examined female chess players to see whether it was possible to find evidence of stereotype threat. First, the researchers surveyed 77 women to see whether they knew that many people think men are better at chess than women. (You can’t be threatened by a stereotype you don’t know exists.) Since they’re chess players, they certainly did. There is only one woman among the top 100 chess players in the world.
Next, the researchers watched men and women play chess at twelve different tournaments. They saw 219 girls between the ages of 5 and 15 play. When they compared how the girls should have done, based on their rankings and previous play, with how they actually did, voilà: “Females performed worse than expected when playing against a male opponent, achieving 83% of the expected success based on their own and their opponent’s prerating,” they write.
Not only did the study find that stereotype threat had an impact on games in the real world, it also found an impact on players in the long run: “Those most vulnerable to stereotype threat were less likely to continue playing in future chess tournaments.” This is a phenomenon that fields like science see constantly—women and minorities drop out of science and engineering fields all the time. As far back as 1999, the Atlantic ran a story on how stereotype threat hurts black college students. Throughout the 1990s the dropout rate for African-American college students was 20 to 25 percent higher than for whites. That number has not changed.
While it might not seem like a big deal whether a seven-year-old girl wins or loses in a chess tournament, it’s a pretty clear example of why that same seven-year-old might decide not to pursue her dream career in science or math.
Earlier this year, the organizations LeanIn.org and Getty Images announced a joint effort to change how women are portrayed in media content and advertising (New York Times, February 9, 2014). The project will create special collections of stock photographs that represent women "in more empowering ways."
The practices that have prompted this project are neither easily changed nor new. While I was researching my recent contribution for Women's History Month (a post about Science Service medical editor Jane Stafford), I came across a striking example that involved editorial decisions by two accomplished, smart women sensitive to the trends of their times.
In 1956, Faye Johannes Marley (1900-1992), editor of Independent Woman, the magazine of the National Federation of Business and Professional Women's Clubs, asked Stafford to contribute an article that would focus on "scientific work for the peaceful and constructive use of nuclear energy" by the "small band of pioneers who showed that women could make contributions" to science. After a telephone conversation to discuss the story, Marley wrote Stafford and urged her not to "emphasize the scholarship angle," but instead to play up "the various types of scientific work" that women might pursue after marriage.
Among the many "treats" that await historians in archival records are handwritten and marginal notes. Along with letters and drafts, these scribbles often expose the messy construction process that can precede a finished work. They can also reveal how biases and stereotypes influence content and editorial choices.
Stafford's contemporaneous notes mention several non-scientific aspects, such as the "hazel eyes" and "brown hair" of astronomer Elizabeth Roemer. One note suggests that the article "play up the refugee angle" (a goal fulfilled by choosing Science Talent Search winner Taimi Toffer). Mentioning the husbands and fathers of the subjects (who included astronomer Cecilia Payne-Gaposchkin, chemist Marjorie Ann Gilbert Moldenhauer, ecologist Vera Rada Demerec Dyson-Hudson, and psychologist Gloria Lauer Grace) was another nod to cultural values of the time and a practice not usually followed when discussing male scientists.
Stafford's finished article emphasized, in language emblematic of the 1950s, that these representatives of the nation's "scientific womanpower" were "by no means the blue-stocking type." Young women contemplating careers in science could have it all. The scientists profiled were said to "have feminine charm and athletic ability as well as intellectual prowess." "Playing this feminine role need not keep them from continuing their careers as scientists," she concluded.
For keen-eyed consumers of popular culture, such examples will seem eerily familiar. The mass media and social media continually transmit and reinforce statements about the role and status of women in science. Each March, we make a concerted effort to highlight the remarkable achievements of remarkable women, but the challenge remains unchanged: how to describe and discuss women in real terms while dispelling the notion that only "superwomen" can become "superscientists." Real female scientists have hazel eyes, families, and charm as well as Nobel prizes, hundreds of publications, and ground-breaking discoveries. The challenge in the future will be to break down constraining stereotypes, while not closing the door on diverse choices and life paths.
Lean in, readers. Let the discussion begin.
- Record Unit 7091 - Science Service, Records, circa 1910-1973, Smithsonian Institution Archives - Includes correspondence, drafts, and notes related to Jane Stafford’s article
Conventional wisdom states that, newly hatched, a duckling will immediately clamp onto the first suitable mother figure it lays eyes on. We often see this ability, known as imprinting, go awry—in the form of a line of ducklings waddling behind a dog, person, or cat. This kind of adorable slip-up might be taken as evidence that ducks aren’t exactly the smartest creatures in the animal kingdom—you might even say they’re a bit, well, bird-brained. But is that true?
Actually, a duckling’s ability to imprint confers a remarkable ability for abstract thought, often associated only with primates and other animals considered highly intelligent. Ducks even outperform supposedly “smarter” animal species in certain aspects of abstract reasoning. Just hours after birth, those yellow fuzzballs understand concepts like “same” and “different,” remember them, and apply them to never-before-seen objects with no social cues or training whatsoever, researchers report in a study published this week in the journal Science.
Take that, duck-doubters!
To explore how ducks think, researchers exposed newborn ducklings to a variety of objects, showing them pairs that were either the same or different, in characteristics like shape or color. Later, when shown completely different objects, three-fourths of the ducks got up and followed the pair that had the same relation they'd originally seen—whether it was one of color or shape, sameness or difference—parading after them the same way they'd line up and follow Mrs. Mallard.
For example, newborn mallards who were first exposed to two spheres (same) later chose to follow a pair of triangles (same) rather than a cube and a cuboid (different). “We hatch them, we give them about 12 hours to dry off, and once they are able to walk they are able to do this and learn it with great accuracy,” says Antone Martinho, a cognitive scientist at the University of Oxford and co-author of the new study.
This kind of relational matching behavior has been observed in certain primates, like monkeys and apes (and of course humans), and a few other birds, like parrots and crows. But again, these animals are all generally considered to be far more intelligent than ducks.
Plus, those species exhibited relational matching behavior only after going through training that rewarded correct associations and punished incorrect ones. In ducks, by contrast, this ability appears to be virtually innate. “To our knowledge, this is the first demonstration of a non-human organism learning to discriminate between abstract relational concepts without any reinforcement training,” said co-author Alex Kacelnik, of Oxford University's zoology department, in a statement.
How are ducks able to perform such seemingly advanced cognitive tasks so soon after entering the world? It seems there’s more to imprinting than meets the eye.
“Imprinting allows ducks to identify who their mother is on the first day of their life,” Martinho says. “In this experiment we are essentially hijacking that normal, but remarkable, behavior. We already knew that ducks would be very good at learning quickly because that's what they are built to do. But the fact that, within that behavior, they can learn something abstract was certainly startling. And they do it quite a bit faster than we see in other species.”
“That's more of a testament I think to their innate ability for imprinting, coupled with their ability to recognize abstract concepts, rather than simply being faster at abstract concepts than other species,” he adds. “This is two abilities combining to produce a stunning result.”
Edward Wasserman, an experimental psychologist at the University of Iowa who wrote a commentary on the study in Science, says the study adds to our understanding of abstract thought in animals. First, it demonstrated abstract thinking in a bird considered not particularly intelligent. But it also showed that abstract thinking can occur in animals just hours old, suggesting that prior learning isn't needed for this kind of ability. Finally, it showed that learning could take place with no instruction or system of reward and punishment.
“Those three things produce a powerful mix that makes this an unprecedented and important project,” says Wasserman, who has shown that pigeons can recognize and categorize objects much as human toddlers do and helped a Russian team explore how crows can match objects without training.
He adds that being able to distinguish likeness and difference is a more advanced process than just knowing what the mother looks like standing still. When ducks dive, fly or move behind a bush, their shape and appearance changes to the viewer, which would cause youngsters relying on a fixed image to lose them.
“If animals are just taking a sensory snapshot, something akin to a photo where it's a case of, 'I see my mother, I remember exactly what my mother looks like at this moment in time and I'm going to use this image ingrained in my brain to follow her'—that's not going to work," he says.
Given how crucial this ability is for ducks—as well as crows and parrots, which are only distantly related on the avian family tree—it's likely that abstract thinking is actually more common across animals than previously thought. “The suggestion from this evidence is that relational learning is something far more widespread in the animal kingdom than we might have suspected,” Wasserman says. Examples are mounting: One study has even suggested that honeybees can discriminate between the paintings of Monet and Picasso.
If that's true, another fascinating question to explore is the origin of abstract thought. As Wasserman puts it: “Did the wheel get reinvented many times, or might relational learning be exceptionally old and we're just now with our very young science discovering it?”
If 19th-century papers are to be believed, the problem had grown to plague-like proportions. Women were warned about this pestilence in ladies’ journals. Intrepid writers like Jack London exposed themselves to danger to get a closer look. Local and state governments warned against actions that might exacerbate the epidemic. No, the new social woe wasn’t bedbugs or tuberculosis or any other infectious disease: it was a supposed army of professional beggars spilling into cities across England and America.
“They have little care or anxiety, except the fun of dodging the policemen,” wrote K.K. Bentwick in The North American Review in 1894. “They shamelessly impose upon those who really pity and befriend them.” Bentwick described the weekly meetings these supplicants held in London and identified a biweekly paper published in Paris called Journal des Mendicants (beggars). In London’s travels around the United States as a tramp, the author best known for The Call of the Wild came to know his share of professional beggars, whom he called the profesh. “[They] are the aristocracy of their underworld,” London wrote in The Road, but they were also the most fearsome because of the lengths they were willing to go to hold onto their status. “The professional mendicants may be estimated at no less than 60,000, who are for the most part thieves, or their accomplices,” claimed the British Lady’s Newspaper in 1847, likely an exaggeration of the actual number.
Where did these professional beggars come from, who made up their ranks, and how did they organize themselves? Each writer had their own answer, or no answers at all. But perhaps the real question should’ve been: were professional beggars real?
“As the homeless population emerges in the late 1870s, and in some cities in fairly large numbers, you see the emergence of literature trying to explain who these men are and what they’re doing there. They were also trying to create this hierarchy of deserving-ness,” says Stephen Pimpare, author of A People’s History of Poverty in America. “With most of this kind of writing, it’s almost all anecdotal.” In other words, the professional beggars of the 18th and 19th centuries were the welfare queens of their era. While Bentwick and London might not have been completely fabricating their accounts, they also didn’t consider societal factors like economic upheaval, war, epidemics and natural disasters, all of which correlate with increases in the number of beggars and homeless, says Pimpare.
Categorizing the deserving and undeserving poor goes back nearly a millennium in the Western world. Government officials in England began regulating begging and poverty relief as early as the 13th century, when population growth and depressed wages meant an increasing number of able-bodied people couldn’t make ends meet. After the first wave of the Black Death in 1349 reduced the labor force, the situation only got worse. While poverty had once been seen as a societal problem that required regular almsgiving, it was now transformed into a moral failing.
“What employers wanted was a return to earlier standards, to a labor market in which masters held the upper hand, workers were disciplined by the threat of insecurity, and wages were seen as ‘reasonable,’” writes historian Elaine Clark. “By launching a war of words that portrayed laborers as transgressors and employers as victims, the government defined the problem of the ‘begging poor’ as a problem of justice; able-bodied beggars were in the wrong and should be punished.”
Regulations on almsgiving and begging continued into the Elizabethan era of the late 1500s and beyond. A 1597 act laid down strict guidelines for beggars and vagabonds and required towns to provide a prison for the undeserving poor to be housed in. Turning poverty and begging into criminal offenses also meant employers could maintain low wages and control the labor market. “Everyone but an idiot knows that the lower classes must be kept poor or they will never be industrious,” wrote English traveler Arthur Young in 1771.
Even as England criminalized begging, some village magistrates adopted the practice of establishing living wages, a system named “Speenhamland,” writes Boyd Hilton in A Mad, Bad, and Dangerous People? England 1783-1846. And while opponents of the system argued it rewarded sloth and served to increase poverty, “most available evidence suggests that, rather than causing poverty, it was adopted in parishes where poverty was greatest.”
Begging and vagrancy could be punished by whipping, imprisonment and hard labor, though women and children—who made up 90 percent of beggars in London in 1796—were often exempted from punishment. All the same, the public fear of and fascination with male beggars continued to grow. In 1817, engraver John Thomas Smith wrote Vagabondiana, which detailed the lives of 30 Londoners living on the streets and how they survived.
“The vast majority of beggars are women with children, but the people who get into the literature are men who find a safe space on the street and own it,” says Tim Hitchcock, author of the 2005 Down and Out in Eighteenth-Century London. “Are they professional? Possibly. Are they poor? Yes. Are they in need? Yes,” says Hitchcock. “But you don’t continue begging if you can’t make a living on it.” He points to popular memoirs, including the Autobiography of a Super-Tramp and Mary Saxby’s Memoirs of a Female Vagrant, to show that some people did consider themselves to be successful professional beggars.
To Hitchcock, the title “professional beggar” wasn’t so much a myth as it was part of a long continuum of changing traditions for how poor members of society interacted with wealthier ones. He cites the tradition of British servants using Christmas boxes in the 18th and 19th centuries, wherein they carried the boxes around and begged for money, often earning more than their wages for the rest of the year combined. Or Guy Fawkes Night, when children would beg for change outside pubs to pay for the ceremonial bonfires. Even Halloween is its own sort of begging, Hitchcock says.
Fearing beggars and discouraging welfare wasn’t unique to England in the 18th and 19th centuries. “[American chambers of commerce] were concerned that if governments started to intervene and provide more public assistance, it would strengthen workers’ bargaining rights in the labor market,” Pimpare says. “If you had nothing other than the awful, dangerous job in the factory, you’re gonna take it. But suddenly if soup kitchens are available, maybe if your job is really terrible or dangerous you’ll be able to turn it down.”
One of the main differences between begging in the U.S. and England, Pimpare notes, is the legacy of slavery. Following the Civil War, a number of southern states passed very specific laws that targeted newly freed slaves. These men could then be arrested for “crimes” like appearing in public without a visible means of support, violations that resulted in conscription into chain gangs or being leased out to private companies. The visible through line from those early laws to today’s mass incarceration debate are modern municipal laws that disproportionately target African-Americans, like those in Ferguson, Missouri as reported by the Washington Post.
The Civil War also resulted in many veterans suddenly finding themselves without employment, leaving them to wander the streets. Then, shortly after the war ended, came the economic depression of 1873. “There was something like a million vagrancy arrests in 1877, which was double, give or take, the number the year before,” Pimpare says. There were also immigrants from countries like Italy pouring into the United States, prompting more xenophobic fears about the motivations of these outsiders and whether they were contributing to the begging epidemic.
“The professional beggar became a conversation about how society should work more generally,” says Hitchcock. “When there’s no substantial safety net, begging becomes a more reasonable thing to do.”
But Pimpare thinks classifying beggars as professionals can be dangerous because it suggests society should turn to harsher punishments for poverty. “By blaming people for that failure it doesn’t obligate us collectively through government to step up and ensure there are opportunities available. People will often say poverty is such a hard problem, it’s so intractable, so difficult to deal with. It’s actually not all that difficult to deal with. Pretty much every rich democracy on the planet has a lower poverty rate than we do.”
The solution, he says, is to stop using myths that dole out blame to the impoverished, and look to other countries with greater welfare systems whose poverty and incarceration rates are lower than our own.
Hattie McDaniel is remembered as the first black actor to ever win an Oscar.
But McDaniel, born June 10, 1895 in Wichita, Kansas, was far more than that. In total, McDaniel played a maid at least 74 times over her career, perhaps most notably in her Oscar-winning performance as Mammy, Scarlett O’Hara’s slave and best counselor in Gone With the Wind. Her character's name was one commonly used for enslaved black women who took on domestic roles.
McDaniel was lauded for her performance as Mammy—a performance that continued off-screen as well. She was credited as “Hattie ‘Mammy’ McDaniel” in the film and did a tour of Gone With the Wind showings in costume. She even auditioned for the part in costume.
But she was also criticized by the NAACP for portraying stereotypes on screen. In 1947, McDaniel published an article in The Hollywood Reporter in which she personally addressed her critics.
“I have never apologized for the roles I play,” she wrote:
Several times I have persuaded the directors to omit dialect from modern pictures. They readily agreed to the suggestion. I have been told that I have kept alive the stereotype of the Negro servant in the minds of theatre-goers. I believe my critics think the public more naïve than it actually is. As I pointed out to Fredi Washington, “Arthur Treacher is indelibly stamped as a Hollywood butler, but I am sure no one would go to his home and expect him to meet them at the door with a napkin across his arm.”
Although the n-word is frequently used in the Margaret Mitchell novel of the same name, it is never spoken in Gone With the Wind, reported Leonard J. Leff for The Atlantic in 1999. Part of the reason for this is that McDaniel refused to say it, Leff writes, and joined other actors in pushing back.
McDaniel wrote that the film industry had become a better place for black workers in the course of her career, and that black actors had gained recognition for their work. “I’d rather play a maid than be one,” she frequently said, according to Seth Abramovitch for Hollywood Reporter.
Of winning the Oscar, she wrote:
My own people were especially happy. They felt that in honoring me, Hollywood had honored the entire race. That was the way I wanted it. This was too big a moment for my personal back-slapping. I wanted this occasion to prove an inspiration to Negro youth for many years to come.
Still, her win was racially fraught. The Oscars dinner was held at the Cocoanut Grove, a segregated venue, and McDaniel was not able to sit with her fellow cast members who were at the awards. She had to sit at “a small table set against a far wall, where she took a seat with her escort, F.P. Yober and her white agent, William Meiklejohn,” Abramovitch writes. “With the hotel’s strict no-blacks policy, Selznick had to call in a special favor just to have McDaniel allowed in the building.”
It was consistent with the treatment that McDaniel and her black costars endured throughout the promotion of Gone With the Wind. But from one perspective–and certainly to McDaniel herself–just being in the room meant something. She “saw herself in the old-fashioned sense as a ‘race woman’–someone advancing the race,” biographer Jill Watts told Abramovitch. McDaniel certainly put the hours in.
In 1851, a concert soprano named Elizabeth Taylor Greenfield embarked on a national tour that upended America’s music scene.
In antebellum America, operatic and concert songs were very popular forms of entertainment. European concert sopranos, such as Jenny Lind and Catherine Hayes, drew huge crowds and rave reviews during their U.S. tours. Lind was so popular that baby cribs still bear her name, and you can now visit an unincorporated community called Jenny Lind, California.
Greenfield, however, was different. She was a former slave. And she was performing songs that a burgeoning field of American music criticism, led by John Sullivan Dwight, considered reserved for white artists. African-American artists, most 19th-century critics argued, lacked the refined cultivation of white, Eurocentric genius, and could create only simple music that lacked artistic depth. It was a prejudice that stretched as far back as Thomas Jefferson in his “Notes on the State of Virginia” and was later reinforced by minstrel shows.
But when Greenfield appeared on the scene, she shattered preexisting beliefs about artistry and race.
‘The Black Swan’
Elizabeth Taylor Greenfield was born into slavery in Natchez, Mississippi, around 1820. As a girl, she was taken to Philadelphia and raised by an abolitionist.
Largely self-taught as a singer, she began her concert career in New York with the support of the Buffalo Musical Association. In Buffalo, she was saddled with the nickname “the Black Swan,” a crude attempt to play off the popularity of Jenny Lind – known as “the Swedish Nightingale” – who was wrapping up one of the most popular concert tours in American history.
In 1851, Colonel Joseph H. Wood became Greenfield’s promoter. Wood, however, was an overt racist and inhumane promoter known for creating wonderment museums in Cincinnati and Chicago that featured exhibits like the “Lilliputian King,” a boy who stood 16 inches tall. With Greenfield, he sought to replicate the success that another promoter, P.T. Barnum, had with Jenny Lind.
In a letter to Frederick Douglass, Martin R. Delany, a physician, newspaper editor and Civil War hero, wrote that Wood was a fervent supporter of the Fugitive Slave Act of 1850 and would not admit black patrons into his museums or at Greenfield’s concerts.
For Greenfield’s African-American supporters, it was a point of huge contention throughout her career.
Critics reconcile their ears with their racism
In antebellum America, the minstrel show was one of the most popular forms of musical entertainment. White actors in blackface exploited common stereotypes of African-Americans, grossly exaggerating their dialect, fashion, dancing and singing.
For example, the popular song “Zip Coon” portrayed African-Americans as clumsily striving for the refinement of white culture. The cover of the sheet music for “Zip Coon” shows an African-American attempting to mimic refined fashions of the day and failing. The song goes on to mock its subject, Zip Coon, as a “learned scholar,” while putting him in situations where his apparent lack of intelligence shows.
Greenfield’s performances, however, forced her critics to rethink this stereotype. The Cleveland Plain Dealer described the confusion that Greenfield caused for her audiences:
“It was amusing to behold the utter surprise and intense pleasure which were depicted on the faces of her listeners; they seemed to express – ‘Why, we see the face of a black woman, but hear the voice of an angel, what does it mean?’”
Critics agreed that Greenfield was a major talent. But they found it difficult to reconcile their ears with their racism. One solution was to describe her as a talented, but unpolished, singer.
For example, the New-York Daily Tribune reported that “it is hardly necessary to say that we did not expect to find an artist on the occasion. She has a fine voice but does not know how to use it.” (We see a similar phenomenon today in sports coverage, in which black athletes are often praised for their raw physical athleticism, while white athletes are praised for their game intelligence.)
By performing repertoire thought too complex for black artists – and by doing it well – Greenfield forced her white critics and audiences to reexamine their assumptions about the abilities of African-American singers.
A star is born
On Thursday, March 31, 1853, Greenfield made her New York City premiere at Metropolitan Hall.
Originally built for Jenny Lind, it was one of the largest performance halls in the world. The day before the concert, the New-York Daily Tribune carried an ad that read, “Particular Notice – No colored persons can be admitted, as there has been no part of the house appropriated for them.” The ban resulted in a citywide uproar that prompted New York City’s first police commissioner, George W. Matsell, to send a large police unit to Metropolitan Hall.
Greenfield was met with laughter when she took to the stage. Several critics blamed the uncouth crowd in attendance; others wrote it off as lighthearted amusement. One report described the awkwardness of the show’s opening moments:
“She was timidly led forward to the front of the stage by a little white representative of the genus homo, who seemed afraid to touch her even with the tips of his white kids [gloves], and kept the ‘Swan’ at a respectful distance, as if she were a sort of biped hippopotamus.”
Despite the inauspicious beginning, critics agreed that her range and power were astonishing. After her American tour came a successful European tour, during which she was accompanied by her friend Harriet Beecher Stowe.
A singer’s legacy
Greenfield paved the way for a host of black female concert singers, from Sissieretta Jones to Audra McDonald. In 1921, the musician and music publisher Harry Pace named the first successful black-owned record company, Black Swan Records, in her honor.
But these achievements are byproducts of a much larger legacy.
In Stowe’s novel “Uncle Tom’s Cabin,” one of the slave children, Topsy, is taken in by a northern abolitionist, Miss Ophelia. Despite her best attempts, Ophelia can’t reform Topsy, who continues to act out and steal. When asked why she continues to behave as she does – despite the intervention of implied white goodness – Topsy replies that she can’t be good so long as her skin is black, because her white caregivers are incapable of seeing goodness in a black body. Her only solution is to have her skin turned inside out so she can be white.
Stowe’s argument was not that we should begin skinning children. Rather, Topsy is a critique of the act of “othering” African-Americans by a dominant culture that refuses to acknowledge their full humanity.
After Greenfield’s New York concert, the New-York Daily Tribune recognized the monumental nature of Greenfield’s heroics. The paper urged her to leave America for Europe – and to stay there – the implication being that Greenfield’s home country wasn’t ready to accept the legitimacy of black artistry.
But Greenfield’s tour did more than prove to white audiences that black performers could sing as well as their European peers. Her tour challenged Americans to begin to recognize the full artistry – and, ultimately, the full humanity – of their fellow citizens.
“I’m a talker. I have a hard time shutting up,” admits artist Kay WalkingStick as she leads a reporter through a retrospective of her works at the National Museum of the American Indian. But standing in front of a wall of charcoal and graphite sketches on paper, the 80-year-old Easton, Pennsylvania-based painter and Cherokee Nation member talks about doing the exact opposite—preserving the mystery in her art.
“What the heck is going on? Why on earth would she put a cross in the middle of all that mess?” she says people must ask about her art.
“I like the idea of people coming to it and not fully understanding it—maybe taking that home and thinking about what on earth was happening there,” she says.
Her five-decade career is honored in this first major retrospective, “Kay WalkingStick: An American Artist,” on view through Sept. 18, 2016, and includes more than 65 rarely exhibited works. Upon first seeing the installation, WalkingStick was overwhelmed. “I feel disconnected from the work somewhat, because I’ve always seen it in the studio or in a small gallery,” she says. “Much of it I haven’t seen for years.”
As retrospectives are wont to do, the exhibition demonstrates significant changes in WalkingStick’s repertoire. The show opens with the 2011 New Mexico Desert, a large painting from the Museum’s permanent collections that includes traditional patterns superimposed upon a desert landscape, and the exhibition traces her career from her minimalist works of the 1970s, many of which depict sensual bodies—mostly nude self-portraits—to her more recent monumental landscape work.
The blue skies and clouds in her 1971 Who Stole My Sky, a series of stacked canvases inside a wood frame that resembles a box-within-a-box construction, are evocative of René Magritte’s 1928 The False Mirror. Writing in the show’s catalog, Kate Morris, associate art history professor at Santa Clara University, notes that WalkingStick’s sky paintings were a response to the burgeoning environmental movement of the early 1970s—“the closest she ever came to making overt political proclamations in her early work,” Morris writes.
Heavily layered canvases from the 1980s—thickly applied acrylic paint and saponified wax embedded with slashes and crosses, what WalkingStick describes as “all that mess”—are followed in subsequent galleries by her diptych works, which juxtapose abstraction and representational forms. Next is a series of mappings of the body across landscapes; and finally, works that combine traditional Native patterns and landscapes.
Growing up, art was the “family business” for WalkingStick. Two of her uncles were professional artists; her brother, Charles WalkingStick, 93, who lives in Oklahoma, was a commercial artist; and a sister is a ceramicist.
“Indians all think they’re artists. All Indians are artists. It’s part of the DNA,” WalkingStick says. “I grew up thinking this was a viable thing to do. I’ve always drawn.”
WalkingStick likes to tell people that she learned to draw going to the Presbyterian church. Her mother would hand her pencil and paper during the long sermons. WalkingStick remembers sitting near a rose window.
Her 1983-1985 Cardinal Points from the collection of Phoenix’s Heard Museum is in the exhibition and blends the four-directional cross, the compass directions, and the coloration of the male cardinal (the bird) and of Catholic cardinals. “There’s this double meaning to the title,” WalkingStick says.
She used her hands to spread the acrylic paint and saponified wax on the canvas, and glued a second layer of canvas upon the first. (She gouged the cross out with a woodcutter’s tool after the paint dried, “so that you get a nice sharp line. If you did it while it was wet, you’d get a smooshy line.”) The work, she estimates, has about 30 coats of paint. The wax—composed the way soap is made—“takes away the plastic look of the paint itself,” she says. “It gives it a more natural look. It also happens to make the studio smell divine. It’s made with beeswax; it smells like honey.”
All of those layers make the canvases—whose size she selected based on her arm span so that she could lift them—quite heavy. WalkingStick typically lays the canvas flat on a table while she works, but she still had to move them when they were done.
“I’m a big strong girl,” the octogenarian says. “I think back, how the heck did I do that? I can still carry them, but I can’t sling them around like I used to.”
The exhibition of WalkingStick’s works is part of a broader goal of the museum’s to expand the public’s understanding of what contemporary Native art looks like, according to co-curators Kathleen Ash-Milby and David Penney.
“Many of our visitors have a difficult time reconciling the fact that people of Native ancestry have very complicated, full, rich, often cosmopolitan lives in the later 20th, early 21st century. They’re really expecting American Indian people to be one way. It’s less an identity and more a cultural stereotype,” Penney says.
There are Native artists who create traditional works, and that’s a great thing, but other Native artists work in new media, performance and a variety of other areas. “And they’re still Native,” says Ash-Milby. “Some of our best artists do have Native content in their work, but it’s more sophisticated.”
Penney notes that WalkingStick’s recent landscapes draw upon American landscape traditions, such as those of 19th-century Hudson River School artist Albert Bierstadt.
“The message of those big Bierstadts was really: here is a wilderness continent ready for conquest. In a sense these pictures are an attempt to reclaim that landscape,” Penney says of WalkingStick’s work. “Geology is witness to cultural memory. And then these designs are a way of reasserting the fact that these are Native places that can’t be separated from Native experience, history, and the history of this country.”
Asked what she hopes viewers will take away from the show, WalkingStick echoes similar goals. “I would like people to understand on a very profound level that Native people are part and parcel of our functioning world, our whole world, our nation. That we are here. That we are productive. And that we are speaking to others,” she says. “We are part of the mainstream culture.”
"Kay WalkingStick: An American Artist" is on view through Sept. 18, 2016 at the National Museum of the American Indian in Washington, D.C. The American Federation of the Arts will tour the exhibition to the Dayton Art Institute in Dayton, Ohio (Feb. 9, 2017–May 7, 2017), Montclair Art Museum in Montclair, N.J. (Feb. 3, 2018–June 17, 2018) and two additional venues in 2017.