Believing and belonging — why people believe in scientifically discredited theories

Helen De Cruz
Aug 4, 2018 · 9 min read
flat Earth

The puzzle of denialism

Surrounded by devastating Californian wildfires, residents of the town of Redding remained unconvinced that human agency was at least in part to blame for their plight. The town votes predominantly Republican, and the local media are skeptical about scientific explanations. As one resident said, “I think it’s bull. It’s just fire season. It’s hot”.

Climate change is a topic on which scientists have reached a very high degree of consensus. A recent survey estimates that 97% of published climate scientists believe climate change is to a large extent caused by humans.

The Rim Fire in the Stanislaus National Forest in California began on Aug. 17, 2013 and is under investigation. The fire has consumed approximately 149,780 acres and is 15% contained. U.S. Forest Service photo.

Yet views on climate change and global warming are increasingly politically polarized. Since Gallup began tracking American views on climate change in the 2000s, the issue has never been as politically polarized as it was in 2018. For example, 89% of Democrats believe global warming is caused by human activities, compared to 35% of Republicans. (Not all crackpot theories are polarized in this way; for the purposes of this article I'll focus on those that are, in some way, associated with groups, political or otherwise.)

How can large swathes of the population accept ideas that are not endorsed by the scientific community? This is the puzzle of denialism. Denialism is the systematic denial of facts that enjoy a high degree of consensus within the scientific community. It is usually accompanied by conspiracy theories that explain away the scientific consensus (e.g., medical scientists must be bought off by "big pharma"). Denialists also tend to rely on fake experts, such as the discredited doctor Andrew Wakefield, who falsified the results of his since-retracted 1998 study allegedly showing that vaccines cause autism.

Neil Levy's solution to the denialism puzzle

The philosopher Neil Levy has recently proposed a theory (open access article here) to explain why people believe in crackpot scientific theories. He dismisses the idea that climate change deniers, anti-vaxxers, and creationists are less critical, less well-informed, or less rational than those who accept the scientific consensus. Instead, he thinks that denialists are epistemically unluckier — the chains through which information is transmitted to them by testimony are not as good as those of laypeople who hold correct scientific beliefs.

It is useful to take a step back and ask: why do we hold specific scientific beliefs at all? Why do I, for example, believe that climate change is due to human activity? I am not a climate scientist, and I have not done any experiments or systematic climatological observations myself. I have seen fewer cold winters over the course of my life, but local experience can only tell us so much. The reason I believe climate change is human-induced is that I trust the testimony of scientists. As a novice, I trust what scientists (the experts) say about this, and this is how I come to know facts about climate change. The problem is: how can a novice evaluate the testimony of an expert? This is termed the novice-expert problem.

Learning from parents. Source: Pixabay

Since we cannot directly test whether what experts say is true, how can we ever make up our minds when they (seem to) disagree? We use two kinds of cues to appraise testimony: competence and benevolence. Competence means we prefer to learn from people who make fewer mistakes; benevolence means we prefer to learn from people who are well-disposed towards us. Already very young children use these two kinds of cues. It makes sense to do so: if we trust incompetent people, we will not gain reliable information; if we trust people who don't have our interests at heart, they may deceive us.

Levy thinks that these two mechanisms go awry in denialism, because testifiers to politicized scientific theories can come to be seen as less benevolent. As he puts it:

“But because the topic has come to be politicized, this disposition to defer ensures that they do not defer to (or their chains of deference will not bottom out in) groups of scientists who espouse a view contrary to theirs.”

If a scientist argues that evolutionary theory is correct, this will make her appear less trustworthy in the eyes of Evangelical Christians, for example. In Levy's view, politically liberal Americans are epistemically luckier — they use exactly the same cognitive mechanisms to monitor testimony as conservatives do, but it so happens that conservative politicians have taken maverick theories on board and have increasingly moved away from the scientific consensus, a process that started as early as the 1970s.

While it is plausible that people of different political leanings use the same cues for appraising testimony, the key move in Levy’s proposal remains somewhat mysterious: the mere fact of holding a particular belief is seen as signaling a lack of benevolence.

How could this happen? Levy argues that we are disposed to see people who are politically or religiously similar to us as more benevolent towards us. Maybe the merchants of doubt politicized “climate science such that it would come to serve as a marker of political affiliation and thereby a cue for benevolence or its lack.”

I think that to explain how political affiliation has become increasingly tied to belief (or disbelief) in scientific theories, we need to invoke another reason people defer to testimony: social belonging.

An additional factor: believing to belong

When we accept the testimony of others, we are driven by a variety of goals. Only some of these goals are epistemic: for example, we want to believe true things and avoid believing false things (as William James already observed). But we also hold beliefs so that we may belong. We tend to modulate our beliefs to bring them in line with group membership, not only for local communities or political affiliations, but even for very transient groups and beliefs.

This is most clearly demonstrated in social conformity experiments dating back to Solomon Asch (1951). In these experiments, a participant has to perform a fairly straightforward perceptual task.

A line of a given length is shown on a card on the left, together with three lines of differing lengths on a card on the right, and the participant is asked which of the lines on the right card has the same length as the one on the left. On some trials, the fellow participants, who are in reality colluding with the experimenter, pick the wrong line.

For example, the other "participants" might unanimously pick a line that is obviously the wrong match. Which line would you pick in this situation? Asch found that people sometimes defer to the majority opinion, even when it conflicts with their own senses. This has been replicated in a variety of cultures, and even with young children: four-year-olds make their judgments conform to those of peers of the same age.

Traditionally, Asch's experiment has been interpreted to show that people are thoughtless conformists who will just bend their perceptual judgment to majority opinion. One problem with this conformist interpretation is that even in Asch's original experiments, some participants deferred more to the majority than others, and many participants deferred on some trials but trusted their own perceptual judgments over the majority opinion on others. Moreover, Haun and Tomasello (2011) found that four-year-old children who deferred to the majority opinion still maintained their original perceptual judgment, and only modulated their public expression of it.

This suggests an additional factor is at work when we evaluate testimony. Hodges and Geyer (2006) argue that when we appraise testimony, we are not only guided by epistemic considerations, such as believing true things and avoiding false things. We are also guided by moral and social considerations.

As evidence for this alternative interpretation, they note that when participants in Asch-style conformity experiments are debriefed, they often voice discomfort. For instance, some participants claim that they were sure the other participants were wrong, yet went along with the majority anyway. In Hodges and Geyer's view (2006, 2–3), Asch-style conformity experiments intentionally put two kinds of values in tension: on the one hand, belonging to a group and holding beliefs that acknowledge one's sense of interdependence within it; on the other, epistemic values such as valuing and being committed to truth. Hodges and Geyer claim that "participants work pragmatically to negotiate these conflicts in ways that acknowledge their interdependence with others and their joint obligations to values such as truth."

Something similar might be going on in the case of denialism. Levy is probably right that merchants of doubt deliberately politicize or aim to politicize minority views in science, so as to harness support for these views in part of the electorate. Still, such partisan tactics only work if beliefs can become a proxy for group membership. Why would this happen?

Why do beliefs become a proxy for group membership?

From an evolutionary perspective, there are many reasons to trust the beliefs that are predominant in your group. For example, not going near the swamps might be a good idea if malaria-carrying mosquitoes lurk there. But even beliefs far removed from everyday survival concerns might be advantageous to hold: believing what one's group holds helps people to coordinate action.

Social identity becomes more salient as people accentuate differences between their ingroup and the outgroup, and emphasize similarities within the group. This is why members of the same group tend to dress similarly, eat similarly, and also hold similar beliefs. One way to enhance within-group similarity is to hold beliefs that are extreme or counterintuitive, and thus unlikely to be held by outgroup members. Holding such beliefs is an easy marker of ingroup affiliation and distinguishes the ingroup from the outgroup. Nicholson (2016) has recently argued that these dynamics can explain the popularity of some highly counterintuitive religious beliefs, such as the Christian doctrine of the Trinity and the Buddhist concept of no-self.

Thus perhaps a similar dynamic is at work in denialism — holding anti-vaxxer, creationist, or climate change denial beliefs might be a marker of group membership. To the extent that these beliefs become politicized, a “good Republican” might be expected to believe that climate change is bunk, or a “good Evangelical” might be expected to believe that evolutionary theory is false.

Ken Ham creationist billboard.

How to improve scientific knowledge

It should be stressed that belonging is just one factor at work in people's decisions to defer to specific forms of scientific testimony. If it were the only factor, then making people aware that partisan beliefs they hold are wrong would merely lead them to dig in their heels, the so-called backfire effect.

Recent experiments have shown that the backfire effect is unreliable, and likely only manifests among college students (who are weird anyway). When people are confronted with a correction to a factually incorrect partisan belief, they align their beliefs more with the correct opinion. In a study with over 10,000 people, Thomas Wood and colleagues confronted people with corrections to incorrect partisan beliefs: on the right, for example, that immigrants commit more crimes; on the left, that hedge fund managers pay less tax than workers. They then gave people corrections to these false beliefs (e.g., that undocumented immigrants have lower rates of criminal offences than US citizens, and that hedge fund managers pay more taxes). They did not find that people dug in their heels. Instead, people adjusted their beliefs to be more in line with the facts.

Similarly, Sander van der Linden and colleagues found that showing the public that scientists agree that climate change is human-induced increases beliefs that climate change is happening, is worrisome, and is a threat.

However, while experiments can induce such changes, there has been no large-scale change in public opinion, and beliefs about climate change are more polarized than they have ever been. If I am correct, this can be explained by the fact that these beliefs are motivated not just by epistemic concerns, but also by concerns about belonging to a social group and fitting into it by holding its key beliefs.

Science communicators should keep this in mind when framing their message, either by trying to minimize references to partisan issues, or perhaps by emphasizing the consensus among scientists that their findings enjoy.
