Posted on May 7, 2008

Buried Prejudice: The Bigot in Your Brain

Siri Carpenter, Scientific American, May 2008

Subliminal Stereotyping

* All of us hold unconscious clichéd beliefs about social groups: black and white, female and male, elderly and young, gay and straight, fat and thin.

* Such implicit bias is far more prevalent than the more overt, or explicit, prejudice that we associate with, for instance, the Ku Klux Klan or the Nazis.

* Certain social scenarios can automatically activate implicit stereotypes and attitudes, which then can affect our perceptions, judgments and behavior, including the choice of whom to befriend, whom to hire and, in the case of doctors, what treatment to deliver.

* Recent research suggests we can reshape our implicit attitudes and beliefs—or at least curb their effects on our behavior.

“There is nothing more painful to me at this stage in my life,” Jesse Jackson once told an audience, “than to walk down the street and hear footsteps and start thinking about robbery—then look around and see somebody white and feel relieved.”

{snip}

Using a variety of sophisticated methods, psychologists have established that people unwittingly hold an astounding assortment of stereotypical beliefs and attitudes about social groups: black and white, female and male, elderly and young, gay and straight, fat and thin. Although these implicit biases inhabit us all, we vary in the particulars, depending on our own group membership, our conscious desire to avoid bias and the contours of our everyday environments. For instance, about two thirds of whites have an implicit preference for whites over blacks, whereas blacks show no average preference for one race over the other.

{snip}

The persistence of explicit bias in contemporary culture has led some critics to maintain that implicit bias is of secondary concern. But hundreds of studies of implicit bias show that its effects can be equally insidious. Most social psychologists believe that certain scenarios can automatically activate implicit stereotypes and attitudes, which then can affect our perceptions, judgments and behavior. “The data on that are incontrovertible,” concludes psychologist Russell H. Fazio of Ohio State University.

Now researchers are probing deeper. They want to know: Where exactly do such biases come from? How much do they influence our outward behavior? And if stereotypes and prejudiced attitudes are burned into our psyches, can learning more about them help each of us override them?

Sticking Together

Implicit biases grow out of normal and necessary features of human cognition, such as our tendency to categorize, to form cliques and to absorb social messages and cues. To make sense of the world around us, we put things into groups and remember relations between objects and actions or adjectives: for instance, people automatically note that cars move fast, cookies taste sweet and mosquitoes bite. Without such deductions, we would have a lot more trouble navigating our environment and surviving in it.

Such associations often reside outside conscious understanding; thus, to measure them, psychologists rely on indirect tests that do not depend on people’s ability or willingness to reflect on their feelings and thoughts. Several commonly used methods gauge the speed at which people associate words or pictures representing social groups—young and old, female and male, black and white, fat and thin, Democrat and Republican, and so on—with positive or negative words or with particular stereotypic traits.
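To make the logic of these reaction-time measures concrete, the sketch below shows one simplified way such data might be scored. It is a hypothetical illustration, assuming an IAT-style task with "compatible" and "incompatible" pairing blocks; the function name and the latencies are invented, and this is not presented as the actual scoring procedure used by the researchers cited here. The underlying idea is simply that slower responses when a group is paired with a given evaluative category suggest a weaker implicit association between the two.

```python
# Minimal, illustrative sketch of scoring reaction-time data from an
# IAT-style sorting task. Hypothetical example only, not the published
# scoring algorithm of any study described in this article.

from statistics import mean, stdev

def iat_d_score(compatible_ms, incompatible_ms):
    """Return a rough D-style effect size: how much slower responses are
    in the 'incompatible' pairing block than in the 'compatible' one,
    scaled by the pooled variability of all response latencies (ms)."""
    diff = mean(incompatible_ms) - mean(compatible_ms)
    pooled_sd = stdev(compatible_ms + incompatible_ms)
    return diff / pooled_sd

# Hypothetical latencies (milliseconds) for one participant.
compatible = [612, 655, 701, 640, 688, 659]      # e.g., in-group + "good"
incompatible = [748, 802, 765, 820, 779, 791]    # e.g., in-group + "bad"

# A larger score indicates a stronger implicit association.
print(round(iat_d_score(compatible, incompatible), 2))
```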

{snip}

Self-interest often shores up implicit biases. To bolster our own status, we are predisposed to ascribe superior characteristics to the groups to which we belong, or in-groups, and to exaggerate differences between our own group and outsiders [see “The New Psychology of Leadership,” by Stephen D. Reicher, S. Alexander Haslam and Michael J. Platow; Scientific American Mind, August/September 2007].

Even our basic visual perceptions are skewed toward our in-groups. Many studies have shown that people more readily remember faces of their own race than of other races. In recent years, scientists have begun to probe the neural basis for this phenomenon, known as the same-race memory advantage. In a 2001 study neurosurgeon Alexandra J. Golby, now at Harvard Medical School, and her colleagues used functional magnetic resonance imaging to track people’s brain activity while they viewed a series of white and black faces. The researchers found that individuals exhibited greater activity in a brain area involved in face recognition known as the fusiform face area [see “A Face in the Crowd,” by Nina Bublitz] when they viewed faces of their own racial group than when they gazed at faces of a different race. The more strongly a person showed the same-race memory advantage, the greater this brain difference was.

This identification with a group occurs astoundingly quickly. In a 2002 study University of Washington psychologist Anthony G. Greenwald and his colleagues asked 156 people to read the names of four members of two hypothetical teams, Purple and Gold, then spend 45 seconds memorizing the names of the players on just one team. Next, the participants performed two tasks in which they quickly sorted the names of team members. In one task, they grouped members of one team under the concept “win” and those of the other team under “lose,” and in the other they linked each team with either “self” or “other.” The researchers found that the mere 45 seconds participants had spent thinking about a fictional team were enough to make them identify with that team (linking it with “self”) and implicitly view its members as “winners.”

Some implicit biases appear to be rooted in strong emotions. In a 2004 study Ohio State psychologist Wil A. Cunningham and his colleagues measured white people’s brain activity as they viewed a series of white and black faces. The team found that black faces flashed for only 30 milliseconds (too quickly for participants to notice them) triggered greater activity than white faces did in the amygdala, a brain area associated with vigilance and sometimes fear. The effect was most pronounced among people who demonstrated strong implicit racial bias. Provocatively, the same study revealed that when faces were shown for half a second, long enough for participants to consciously process them, black faces instead elicited heightened activity in prefrontal brain areas associated with detecting internal conflicts and controlling responses, hinting that individuals were consciously trying to suppress their implicit associations.

Why might black faces, in particular, provoke vigilance? Northwestern University psychologist Jennifer A. Richeson speculates that American cultural stereotypes linking young black men with crime, violence and danger are so robust that our brains may automatically give preferential attention to blacks as a category, just as they do for threatening animals such as snakes. In a recent unpublished study Richeson and her colleagues found that white college students’ visual attention was drawn more quickly to photographs of black versus white men, even though the images were flashed so quickly that participants did not consciously notice them. This heightened vigilance did not appear, however, when the men in the pictures were looking away from the camera. (Averted eye gaze, a signal of submission in humans and other animals, extinguishes explicit perceptions of threat.)

Whatever the neural underpinnings of implicit bias, cultural factors—such as shopworn ethnic jokes, careless catchphrases and playground taunts dispensed by peers, parents or the media—often reinforce such prejudice. Subtle sociocultural signals may carry particularly insidious power. In a recent unpublished study psychologist Luigi Castelli of the University of Padova in Italy and his colleagues examined racial attitudes and behavior in 72 white Italian families. They found that young children’s racial preferences were unaffected by their parents’ explicit racial attitudes (perhaps because those attitudes were muted). Children whose mothers had more negative implicit attitudes toward blacks, however, tended to choose a white over a black playmate and ascribed more negative traits to a fictional black child than to a white child. Children whose mothers showed less implicit racial bias on an implicit bias test were less likely to exhibit such racial preferences.

Many of our implicit associations about social groups form before we are old enough to consider them rationally. In an unpublished experiment Mahzarin R. Banaji, a psychologist at Harvard University, and Yarrow Dunham, now a psychologist at the University of California, Merced, found that white preschoolers tended to categorize racially ambiguous angry faces as black rather than white; they did not do so for happy faces. And a 2006 study by Banaji and Harvard graduate student Andrew S. Baron shows that full-fledged implicit racial bias emerges by age six—and never retreats. “These filters through which people see the world are present very early,” Baron concludes.

Dangerous Games

{snip}

A growing body of work indicates that implicit attitudes do, in fact, contaminate our behavior. Reflexive actions and snap judgments may be especially vulnerable to implicit associations. A number of studies have shown, for instance, that both blacks and whites tend to mistake a harmless object such as a cell phone or hand tool for a gun if a black face accompanies the object. This “weapon bias” is especially strong when people have to judge the situation very quickly.

In a 2002 study of racial attitudes and nonverbal behavior, psychologist John F. Dovidio, now at Yale University, and his colleagues measured explicit and implicit racial attitudes among 40 white college students. The researchers then asked the white participants to chat with one black and one white person while the researchers videotaped the interaction. Dovidio and his colleagues found that in these interracial interactions, the white participants’ explicit attitudes best predicted the kinds of behavior they could easily control, such as the friendliness of their spoken words. Participants’ nonverbal signals, however, such as the amount of eye contact they made, depended on their implicit attitudes.

As a result, Dovidio says, whites and blacks came away from the conversation with very different impressions of how it had gone. Whites typically thought the interactions had gone well, but blacks, attuned to whites’ nonverbal behavior, thought otherwise. Blacks also assumed that the whites were conscious of their nonverbal behavior and blamed white prejudice. {snip}

Implicit biases can infect more deliberate decisions, too. In a 2007 study Rutgers University psychologists Laurie A. Rudman and Richard D. Ashmore found that white people who exhibited greater implicit bias toward black people also reported a stronger tendency to engage in a variety of discriminatory acts in their everyday lives. These included avoiding or excluding blacks socially, uttering racial slurs and jokes, and insulting, threatening or physically harming black people.

{snip}

Implicit bias may sway hiring decisions. In a recent unpublished field experiment economist Dan-Olof Rooth of the University of Kalmar in Sweden sent corporate employers identical job applications on behalf of fictional male candidates—under either Arab-Muslim or Swedish names. Next he tracked down the 193 human resources professionals who had evaluated the applications and measured their implicit biases concerning Arab-Muslim men. Rooth discovered that the greater the employer’s bias, the less likely he or she was to call an applicant with a name such as Mohammed or Reza for an interview. Employers’ explicit attitudes toward Muslims did not correspond to their decision to interview (or fail to consider) someone with a Muslim name, possibly because many recruiters were reluctant to reveal those attitudes.

Unconscious racial bias may also infect critical medical decisions. In a 2007 study Banaji and her Harvard colleagues presented 287 internal medicine and emergency care physicians with a photograph and brief clinical vignette describing a middle-aged patient—in some cases black and in others white—who came to the hospital complaining of chest pain. Most physicians did not acknowledge racial bias, but on average they showed (on an implicit bias test) a moderate to large implicit antiblack bias. And the greater a physician’s racial bias, the less likely he or she was to give a black patient clot-busting thrombolytic drugs.

Beating Back Prejudice

{snip}

Seeing targeted groups in more favorable social contexts can help thwart biased attitudes. In laboratory studies, people’s implicit racial and ethnic biases weaken after they see a black face against the backdrop of a church rather than a dilapidated street corner, consider familiar examples of admired blacks such as actor Denzel Washington and athlete Michael Jordan, or read about Arab-Muslims’ positive contributions to society. In real college classrooms, students taking a course on prejudice reduction who had a black professor showed greater reductions in both implicit and explicit prejudice at the end of the semester than did those who had a white professor. {snip}

{snip}

In addition, people who report a strong personal motivation to be nonprejudiced tend to harbor less implicit bias. And some studies indicate that people who are good at using logic and willpower to control their more primitive urges, such as trained meditators, exhibit less implicit bias. Brain research suggests that the people who are best at inhibiting implicit stereotypes are those who are especially skilled at detecting mismatches between their intentions and their actions.

But wresting control over automatic processes is tiring and can backfire. If people leave interracial interactions feeling mentally and emotionally drained, they may simply avoid contact with people of a different race or foreign culture. {snip}

{snip}

Taking Control

Despite such data, some psychologists still question the concept of implicit bias. In a 2004 article in the journal Psychological Inquiry, psychologists Hal R. Arkes of Ohio State and Philip E. Tetlock of the University of California, Berkeley, suggest that implicit associations between, for example, black people and negative words may not necessarily reflect implicit hostility toward blacks. They could as easily reflect other negative feelings, such as shame about black people’s historical treatment at the hands of whites. They also argue that any unfavorable associations about black people we do hold may simply echo shared knowledge of stereotypes in the culture. In that sense, Arkes and Tetlock maintain, implicit measures do not signify anything meaningful about people’s internal state, nor do they deserve to be labeled “prejudiced”—a term they feel should be reserved for attitudes a person deliberately endorses.

{snip}