Tribal bias from the wild to the laboratory

It is not just politics that is beset by tribalism. The social sciences are also vulnerable to in-group bias.

War is older than the human species. It is found in every region of the world, among all the branches of humankind. It is found throughout human history, deeply and densely woven into its causal tapestry. It is found in all eras, and in earlier people no less than later… War is reflected in the most fundamental features of human social life.

J. Tooby and L. Cosmides, 2010

Human evolution has been powerfully shaped by our history of war and intergroup conflict. For groups to survive, reproduce, and pass their genes on to later generations of humans, they had to coordinate and cooperate within their own groups in order to defeat other groups. Because group loyalty and commitment were so important for group survival, humans would have evolved to reward loyal and cooperative members of their ingroup – those who fervidly support the group’s cause and contribute to the group’s success. Humans also would have evolved to punish and ostracize disloyal members of the group – those who oppose or harm the goals of the group. In these environments, where loyalty was rewarded with status and resources (and perhaps more crucially, where disloyalty was highly costly), individuals would have evolved traits that enhance coalitional commitment and tendencies to signal those traits to other group members. Modern humans evolved from these highly group-loyal and cooperative individuals. In other words, human evolution selected for traits that enhance and signal ingroup loyalty.

Group loyalty and cooperation sound like positive human tendencies – and they certainly can be positive. But these group commitments can also lead to more problematic psychological propensities, particularly when the goal is intergroup cooperation and coordination. One of these propensities is ingroup favouritism. Indeed, even children arbitrarily assigned to relatively meaningless groups (e.g., red t-shirts or blue t-shirts) display tendencies to be more favourable toward ingroup than outgroup members. Ingroup favouritism comes in many varieties. People not only like the members of their ingroup (whether their religious ingroup, political ingroup, ethnic ingroup, etc.) better than the members of other groups, but they also tend to treat the ideas and behaviours of ingroup members more favourably than those exact same ideas and behaviours from outgroup members. For example, the immoral actions of a political ingroup member are evaluated as more morally permissible than those exact same immoral actions when performed by a political outgroup member.

But another, and perhaps more problematic type of tribal psychological tendency is ingroup bias or tribal bias. Tribal biases regard not merely how we feel about other people or how we evaluate individual actions, but how we evaluate empirical information. Our apparent pursuit of truth and understanding of human nature and the world can be warped by our desires to conform to the ingroup and to gain status within the ingroup.

There are two primary levels of these tribal biases. The first is called selective exposure. Selective exposure regards how we approach and avoid information in the world. Specifically, people have tendencies to seek out information that supports their group’s goals and to avoid information that opposes their group’s goals. An immigration restrictionist might seek out information that plays up the downfalls and costs of relatively open borders and avoid information regarding the benefits of more liberal immigration policies, whereas a proponent of immigration might avoid information regarding the costs and challenges of immigration and seek out information on the benefits.

Perhaps the most obvious way in which people engage in selective exposure is in their media consumption. They are inclined to read the newspapers, watch the news programmes, and visit the news websites that support the beliefs of their ingroup and are similarly inclined to avoid any media that opposes their group’s beliefs. But people also engage in selective exposure in their social worlds. We choose friends, on Facebook, on Twitter, and in the real world, who are part of our various ingroups and who tend to exchange information with us that conforms to our group’s beliefs. We selectively put ourselves in media environments and social environments that are likely to bolster our pre-existing beliefs about the world – the beliefs that support our ingroup – and selectively avoid media and social environments that put us at risk of confronting discordant information. Thus, we are exposing ourselves and others to a biased set of potentially relevant information.

The second level of tribal bias occurs after people are exposed to new information (whether they purposefully sought out that information or failed to avoid it). The two critical biases here are called motivated scepticism and motivated credulity. Motivated scepticism refers to the tendency for people to be highly critical and unaccepting of information that opposes their own group’s interests, whereas motivated credulity refers to the tendency for people to be highly credulous toward and uncritical of very similar information that supports their own group’s interests.

A classic example of these tendencies comes from a 1979 paper by the social psychologists Lord, Ross, and Lepper. These researchers had death penalty proponents and opponents evaluate the scientific methods of an ostensibly real study that tested whether the death penalty does or does not appear to deter crime. For example, participants read a paragraph describing a scientific study that compared murder rates in 14 states in the United States before and after the adoption of capital punishment. In one experimental condition (which half of the participants saw), the study results demonstrated that murder rates were lower after adoption of the death penalty and thus supported the deterrent efficacy of the death penalty; in the other experimental condition (which the other half of the participants saw), the study results demonstrated that murder rates were higher after adoption of the death penalty and thus opposed the deterrent efficacy of the death penalty.

Participants were then asked to evaluate how well-conducted the study was. Note that the study was conducted in exactly the same way in the two experimental conditions; only the results differed. Lord, Ross, and Lepper found that proponents of the death penalty evaluated the study as better conducted when the results indicated that the death penalty does deter homicide than when the results opposed its deterrent efficacy. The exact opposite pattern was observed for opponents of the death penalty: they evaluated the study as better conducted when the results indicated that the death penalty does not deter homicide than when the results supported its deterrent efficacy. Thus, when people are exposed to new information, they tend to discount that information as low quality when it challenges the beliefs of their political ingroup.

These types of biases appear to be particularly problematic in the political sphere. Politics is one of the most salient modern tribal conflicts. We generally no longer kill our tribal opponents, but we argue and debate in an effort to advance the success of our own political ingroup and to squash our political opponents. There are at least three main reasons why the political sphere elicits substantial tribal biases.

Political arguments are highly consequential. Many political disagreements centre on who should receive status and resources within society (for example, who should receive welfare benefits and who should pay for them), and so group success is very important for individual success. There is a strong motivation to win political disagreements.

Political disagreements are often morally significant. Moral disagreements signal an unwillingness to conform to the same rules – rules that are often set in place to advance the interests of the ingroup (or to oppose disadvantages). For example, gun control is a major source of political conflict in the United States. When Democrats, who are less likely to own guns than Republicans, morally oppose certain gun rights, and Republicans reject this moral opposition, Republicans signal an unwillingness to follow the same rules as Democrats (specifically, to give up certain rights to gun ownership). Moreover, this results in at least one sort of power imbalance between the two groups: in the US, Republicans are more armed than Democrats. This, understandably, creates conflict between the two groups.

Lastly, ambiguity exacerbates bias, and political issues are often if not always ambiguous. Even experts disagree on many political issues. For example, experts disagree on how large pay discrepancies are between men and women and they disagree on which factors contribute to such discrepancies. On top of factual ambiguities, political issues often relate to opinions about what ought to be the case based on that fuzzy understanding of the facts.

One might think that ambiguity would compel humility and open-mindedness, but when ambiguity occurs in the context of political conflict, it appears to make people more biased and more dogmatic. Why? Because there is more room for argument. People do not argue about obvious truths. It would be pretty challenging to argue over whether animals must be killed to obtain meat, but it is easier to argue about the costs and benefits of meat consumption. Those costs and benefits are difficult to quantify; even animal experts do not know how animals experience physical and emotional pain; even health experts do not know whether there are long-term health consequences of avoiding animal products. Given these unknowns, it is even easier to argue about whether the difficult-to-quantify costs outweigh the difficult-to-quantify benefits. Nobody knows the answer for certain, so compelling arguments can make the difference between whether one’s preferred policy is supported or opposed.

Thus far, I have suggested that tribal conflict and biases are a fundamental feature of politics because humans share an evolutionary history of intergroup conflict – and politics is the most salient modern form of intergroup conflict. However, the social sciences have long emphasized the shortcomings of more right-leaning or politically conservative ideologies, arguing that the cognitive tendencies of political conservatives (e.g. threat avoidance, cognitive rigidity) probably predict more bias in conservatives relative to liberals. In recent years, however, the social sciences have been criticized for their left-leaning political homogeneity. Nearly all social scientists identify as political liberals. It is possible then, that the overemphasis on the flaws and biases of conservatives in the social sciences is merely a reflection of the ingroup biases of a left-leaning field. In other words, the very scientists who have been exploring political biases may have mischaracterized the cognitive tendencies of political conservatives due to their own political biases.

This possibility inspired me and my colleagues to test whether indeed conservatives are more biased than liberals as many scholars have contended, or whether liberals and conservatives are more similar in their ingroup bias tendencies. We conducted two studies in the United States. First, we simply asked Republicans and Democrats whether Republicans or Democrats are more biased. The results displayed in the table below demonstrated that Republicans reported that Democrats are more biased and Democrats reported that Republicans are more biased. This is a nice foreshadowing of the results to come.

We then conducted a meta-analysis of political bias research. A meta-analysis essentially combines the results of all studies that measure a particular effect. In this case, we combined the results of studies that were very similar to the classic Lord, Ross, and Lepper study described above. We combined the results of 51 separate studies that presented political partisans with virtually identical information with conclusions that either supported or opposed their political ingroup and then asked participants to evaluate the validity of that information. Topics spanned dozens of political issues, including capital punishment, gun control, abortion, welfare, healthcare, climate change, same-sex marriage, affirmative action, immigration, education, taxes, and marijuana. When all results were combined together and averaged, we found near perfect symmetry between liberals and conservatives. That is, across 51 studies, both liberals and conservatives evaluated information that supports their political ingroup as more valid than that exact same information when it opposes their political ingroup, and to virtually equal degrees.
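The pooling step at the heart of a meta-analysis can be sketched in a few lines of Python. The function below uses simple fixed-effect, inverse-variance weighting; the function name and the effect sizes are hypothetical illustrations, not data from the 51 studies, and published meta-analyses typically use more elaborate (e.g. random-effects) models.

```python
import math

def fixed_effect_meta(effects, variances):
    """Combine per-study effect sizes via inverse-variance weighting.

    Each study is weighted by 1/variance, so precise studies count more.
    Returns the pooled effect and its standard error.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical per-study bias effects (e.g. standardised mean differences)
# for one partisan group -- illustrative numbers only.
effects = [0.45, 0.30, 0.52, 0.38]
variances = [0.01, 0.02, 0.015, 0.012]
pooled, se = fixed_effect_meta(effects, variances)
print(f"pooled effect = {pooled:.3f}, SE = {se:.3f}")
```

Running the same pooling separately for liberal and conservative participants, and comparing the two pooled estimates, is the basic logic behind the symmetry claim above.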

In recent years, inspired by the realisation that political homogeneity in the social sciences may have biased conclusions about conservatives, many scholars have been probing other domains of ingroup favouritism among liberals and conservatives. The emerging trend has been that liberals and conservatives are far more similar in their ingroup tendencies than previously thought. Whereas social scientists long thought support for authoritarianism was mainly a feature of right-wing ideologies, more recent work has found that left-wing authoritarianism both exists and predicts left-wing prejudices against right-wing targets. Similarly, scholars long thought that conservatives were particularly intolerant toward outgroups, but more recently, researchers have expanded the set of outgroups considered to include those that liberals oppose, and found that liberals are similarly intolerant and prejudiced. Many scholars had also suggested that conservatives are particularly avoidant and sceptical of scientific conclusions that oppose their political positions, but more recent work suggests that liberals and conservatives are similarly prone to avoidance and denial of dissonant scientific findings.

Given the shared human evolutionary histories of individuals within different political groups and parties, it seems plausible that all political groups and parties would be similarly tribal and similarly prone to concomitant tribal biases. If the members of a particular political group were not tribal, it seems unlikely that they would be able to compete with and survive alongside other political groups with more loyal and tribal group members.

It may seem puzzling, then, that scholars have long placed far greater emphasis on the tribal cognitive tendencies among political conservatives.

However, the politically liberal homogeneity of the social sciences suggests one possible explanation: perhaps the very people attempting to measure political biases have biases of their own. Specifically, because social scientists who study bias are politically liberal, they may have been unable to perceive the biases of their own ingroup. Indeed, research has shown that people struggle to perceive their own biases. For these reasons, we should be somewhat sceptical of social scientific claims that the opposing political group is relatively flawed or has particularly unflattering cognitive tendencies. And perhaps we should be somewhat sceptical of social scientific claims in general.

Tribalism is a natural and quite possibly an ineradicable element of human social groups. It is likely that all groups, and all people within those groups, are susceptible to tribalism and tribal biases. It is usually quite easy to convince people of this point. But people seem easily convinced because they consider this claim as it relates to other groups and other people, and not to their own group or to their own self. People seem to think to themselves, ‘Ah, yes, this explains why that other group has such ludicrous beliefs!’ But tribalism and tribal biases do not only affect our outgroups. They likely affect all groups: mine, yours, and everyone else’s. We are all humans, and thus we are all susceptible to these kinds of biases.

Tribalism and tribal biases are not all bad. Group loyalty and commitment can be beautiful things, and biases often serve as useful heuristics. However, they can be toxic when the goal is to make compromises between groups, which is so often the goal of modern groups and governments. Moreover, tribalism and tribal biases can steer us away from a true and accurate understanding of human nature. For those of us who care deeply about truth and understanding, we should be vigilant in acknowledging and combating these tendencies in our own groups and in ourselves.

Tribal bias from the wild to the laboratory by Cory J. Clark was first published in Past and Present: Perspectives from the Engelsberg seminar, 2020, Axess Publishing

Cory J. Clark

Cory J. Clark is Assistant Professor of Social Psychology, Durham University and Director of Academic Engagement at Heterodox Academy. Her research interests are in moral judgment, punishment, free will, belief, political bias and motivated cognition.
