
We are not born with prejudice toward specific groups of people. Prejudice is learned and reinforced over time so deeply and in so many unintended ways that it has proved very difficult to reduce (Paluck et al., 2021). The resulting discrimination has harmed many groups and classes of people for centuries.

A new study in Personality and Social Psychology Review (Axt and To, April 22, 2024) proposed that by reviewing and analyzing strategies for reducing classic cognitive biases in judgment and decision making, we might develop better or longer-lasting ways to reduce discrimination. The authors contended that research on decision-making biases and research on prejudice and discrimination need to be better integrated.

The authors began by summarizing the debiasing literature, which represents a useful contribution in itself for those who want to better understand how to reduce cognitive biases. The authors divided debiasing strategies into three groups: 1) Trying to change the person (who may be prejudiced); 2) Providing situation-specific tactics; and 3) Changing the context in which interpersonal judgments are made.

Strategies in the first group aim to increase people’s general motivation or ability to be unbiased, including various forms of education or training. Social science education in general can improve logical and unbiased thinking (Lehman and Nisbett, 1990; Schaller et al., 1996; Stalder, 2012).

The authors also cited virtual games that simulate real-life decision-making with an avatar. It is also helpful to increase feelings of accountability, in which we expect to have to justify our judgments to someone else (Lerner and Tetlock, 1999).

Strategies in the second group are specific to a particular situation, in that the instructions to participants apply directly and only to a single judgment. A primary tactic here is called “consider the opposite”: thinking about or imagining information that somehow runs opposite to our initial inclinations.

For example, in answering general knowledge questions, participants typically show an overconfidence bias, in that they overestimate their answers’ accuracy. But if they’re asked to consider potential reasons why their answers may be wrong, then their estimates become more realistic (Koriat et al., 1980). According to the authors, the “consider the opposite” approach has also reduced the anchoring bias, confirmation bias, and hindsight bias.

When those in power decide others’ fates, they sometimes weigh numerous predictive variables, such as when judges decide which defendants to release on bail or faculty decide whom to admit to a graduate program. Research has shown that statistical formulas, or “linear models,” which specify which variables to use and how to weight them, lead to better outcomes than letting judges and faculty rely on their own intuition. Using preset formulas resulted in fewer defendants missing court appearances and higher first-year performance among graduate students, respectively.

Decisions can also be influenced by changing how they are presented or structured. In particular, framing an issue in terms of frequencies (raw numbers) can have more impact than percentages: “10 of every 1,000 women” is more persuasive in a preventive healthcare message than “1 percent,” even though the two are equivalent. This example is similar to the classic framing effect in which consumers view a 70-percent-lean meat product as a better purchase than one that has 30 percent fat.

Among other examples, pre-commitment can lead to better later decisions, as when students who had committed to evenly spaced deadlines were less likely to procrastinate. In deciding between products A and B, consumers were more likely to choose A if there were a third option that was comparable but inferior to A (called a “judgment decoy”). Drivers are more likely to agree to organ donation if it is the default with an opt-out option (called a “judgment default”).

The authors acknowledged that their list of strategies was not exhaustive, but some of the unmentioned approaches may be especially helpful in reducing prejudice and discrimination. They include situational attribution training, empathy training, exposure control (limiting exposure to information that may bias judgment), mindfulness training, reducing cognitive load, self-affirmations, and slowing down (Neal and Brodsky, 2016; Pronin and Hazel, 2023; Stalder, 2018; Stewart et al., 2010). There are also personality traits associated with less bias and racism, including low need for structure and high attributional complexity (Schaller et al., 1995; Stalder, 2009, 2014a; Tam et al., 2008).

There are also unmentioned caveats or complexities. Some of the change-the-context approaches seem manipulative by relying on one bias (such as the framing effect) in pursuit of reducing another bias. The authors acknowledged that judgment defaults work by relying on a “bias for inaction.” Perhaps the ends justify the means, but it seems ironic to refer to such approaches as “debiasing.”

Using frequencies over percentages can debias but can also mislead. For example, there are about twice as many poor White people in the United States as compared to poor Black people, which could (mistakenly) look like a White poverty problem. But the poverty rate is under 10 percent for White people compared to almost 20 percent for Black people (Creamer, 2020).
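A quick calculation makes the count-versus-rate distinction concrete. The population and rate figures below are rough, rounded assumptions for illustration only, not exact census values:

```python
# Rough, rounded figures for illustration only -- not exact census values.
populations = {"White": 197_000_000, "Black": 44_000_000}
poverty_rates = {"White": 0.09, "Black": 0.19}  # under 10% vs. almost 20%

for group in populations:
    count = populations[group] * poverty_rates[group]
    print(f"{group}: rate {poverty_rates[group]:.0%}, "
          f"roughly {count / 1e6:.0f} million people in poverty")

# The raw counts (about 18 million vs. about 8 million) point one way;
# the rates (9% vs. 19%) point the other.
```

With these assumed figures, the counts run about two to one in one direction while the rates run about two to one in the other, which is exactly the ambiguity described above.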

The authors identified several debiasing approaches that have had some success in reducing prejudice or discrimination: accountability, pre-commitment, judgment decoys, and judgment defaults. For example, feeling more accountable can reduce prejudiced judgments in mock trials and hiring tasks and can reduce ingroup favoritism in general. The authors also identified approaches that have potential, including certain forms of training and “consider the opposite.”

The authors ruled out increasing awareness of or education about biases, citing low empirical support. But there is also a common personal resistance to such education. Even if we’re willing to listen, we tend to think the biases pertain to other people and not really ourselves (Pronin and Hazel, 2023; Stalder, 2018). A potential first step for readers interested in becoming less biased, then, is to be open to the real possibility of our own biases (Stalder, 2014b).

Jordan Axt and Jeffrey To, “How Can Debiasing Research Aid Efforts to Reduce Discrimination?” Personality and Social Psychology Review (April 22, 2024), advance online publication, https://journals.sagepub.com/doi/10.1177/10888683241244829.

John Creamer, “Inequalities Persist Despite Decline in Poverty for All Major Race and Hispanic Origin Groups,” United States Census Bureau, September 15, 2020, https://www.census.gov/library/stories/2020/09/poverty-rates-for-blacks-and-hispanics-reached-historic-lows-in-2019.html.

Asher Koriat et al., “Reasons for Confidence,” Journal of Experimental Psychology: Human Learning and Memory 6 (1980): 107–18.

Darrin R. Lehman and Richard E. Nisbett, “A Longitudinal Study of the Effects of Undergraduate Training on Reasoning,” Developmental Psychology 26 (1990): 952–60.

Jennifer S. Lerner and Philip E. Tetlock, “Accounting for the Effects of Accountability,” Psychological Bulletin 125 (1999): 255–75.

Tess M. S. Neal and Stanley L. Brodsky, “Forensic Psychologists’ Perceptions of Bias and Potential Correction Strategies in Forensic Mental Health Evaluations,” Psychology, Public Policy, and Law 22 (2016): 58–76.

Elizabeth Levy Paluck et al., “Prejudice Reduction: Progress and Challenges,” Annual Review of Psychology 72 (2021): 533–60.

Emily Pronin and Lori Hazel, “Humans’ Bias Blind Spot and Its Societal Significance,” Current Directions in Psychological Science 32 (2023): 402–9.

Mark Schaller et al., “The Prejudiced Personality Revisited: Personal Need for Structure and Formation of Erroneous Group Stereotypes,” Journal of Personality and Social Psychology 68 (1995): 544–55.

Mark Schaller et al., “Training in Statistical Reasoning Inhibits the Formation of Erroneous Group Stereotypes,” Personality and Social Psychology Bulletin 22 (1996): 829–44.

Daniel R. Stalder, “Are Attributionally Complex Individuals More Prone to Attributional Bias?” (presentation, Annual Convention of the Midwestern Psychological Association, Chicago, IL, May 1–3, 2014a).

Daniel R. Stalder, “Competing Roles for the Subfactors of Need for Closure in Committing the Fundamental Attribution Error,” Personality and Individual Differences 47 (2009): 701–5.

Daniel R. Stalder, “I’m a PARB and So Are You,” PARBs Anonymous (blog), February 12, 2014b, https://parbsanonymous.wordpress.com/2014/02/12/im-a-parb-and-so-are-you-2/.

Daniel R. Stalder, The Power of Context: How to Manage Our Bias and Improve Our Understanding of Others (Amherst, NY: Prometheus Books, 2018).

Daniel R. Stalder, “A Role for Social Psychology Instruction in Reducing Bias and Conflict,” Psychology Learning and Teaching 11 (2012): 245–55.

Tracie L. Stewart et al., “Consider the Situation: Reducing Automatic Stereotyping Through Situational Attribution Training,” Journal of Experimental Social Psychology 46 (2010): 221–25.

Kim-Pong Tam et al., “Attributionally More Complex People Show Less Punitiveness and Racism,” Journal of Research in Personality 42 (2008): 1074–81.

Daniel R. Stalder, Ph.D., is a social psychologist at the University of Wisconsin-Whitewater and author of The Power of Context.


