Belief in conspiracy theories is often attributed to some form of motivated reasoning: People want to believe a conspiracy because it reinforces their worldview, for example, or doing so meets some deep psychological need, like wanting to feel unique. However, it might also be driven by overconfidence in their own cognitive abilities, according to a paper published in the Personality and Social Psychology Bulletin. The authors were surprised to discover that not only are conspiracy theorists overconfident, they also don't realize their beliefs are on the fringe, massively overestimating by as much as a factor of four how much other people agree with them.
"I was expecting the overconfidence finding," co-author Gordon Pennycook, a psychologist at Cornell University, told Ars. "If you've talked to someone who believes conspiracies, it's self-evident. I did not expect them to be so ready to state that people agree with them. I thought that they would overestimate, but I didn't think that there'd be such a strong sense that they are in the majority. It might be one of the biggest false consensus effects that's been observed."
In 2015, Pennycook made headlines when he co-authored a paper demonstrating how certain people interpret "pseudo-profound bullshit" as deep observations. Pennycook et al. were interested in identifying individual differences between those who are susceptible to pseudo-profound BS and those who are not and thus looked at conspiracy beliefs, their degree of analytical thinking, religious beliefs, and so forth.
They presented several randomly generated statements, containing "profound" buzzwords, that were grammatically correct but made no sense logically, along with a 2014 tweet by Deepak Chopra that met the same criteria. They found that the less skeptical participants were less logical and analytical in their thinking and hence much more likely to rate these nonsensical statements as deeply profound. That study was a bit controversial, in part for what was perceived to be its condescending tone, along with questions about its methodology. But it did snag Pennycook et al. a 2016 Ig Nobel Prize.
Last year we reported on another Pennycook study, presenting results from experiments in which an AI chatbot engaged in conversations with people who believed at least one conspiracy theory. That study showed that the AI interaction significantly reduced the strength of those beliefs, even two months later. The secret to its success: the chatbot, with its access to vast amounts of information across an enormous range of topics, could precisely tailor its counterarguments to each individual. "The work overturns a lot of how we thought about conspiracies, that they're the result of various psychological motives and needs," Pennycook said at the time.
Miscalibrated from reality
Pennycook has been working on this new overconfidence study since 2018, perplexed by observations indicating that people who believe in conspiracies also seem to have a lot of faith in their cognitive abilities—seemingly at odds with prior research finding that conspiracists tend to rely more on intuition. To investigate, he and his co-authors conducted eight separate studies involving over 4,000 US adults.
The assigned tasks were designed in such a way that participants' actual performance and how they perceived their performance were unrelated. For example, in one experiment, they were asked to guess the subject of an image that was largely obscured. The subjects were then asked direct questions about their belief (or lack thereof) concerning several key conspiracy claims: that the Apollo Moon landings were faked, for example, or that Princess Diana's death wasn't an accident. Four of the studies focused on testing how subjects perceived others' beliefs.
The results showed a marked association between subjects' tendency to be overconfident and belief in conspiracy theories. And while, on average, only 12 percent of participants believed a given conspiracy claim, the believers thought they were in the majority 93 percent of the time. This suggests that overconfidence is a primary driver of belief in conspiracies.
That's not to say the data show believers in conspiracy theories are massively overconfident; the studies weren't designed to quantify the degree of overconfidence, per Pennycook. Rather, "They're overconfident, and they massively overestimate how much people agree with them," he said.
Ars spoke with Pennycook to learn more.
Ars Technica: Why did you decide to investigate overconfidence as a contributing factor to believing conspiracies?
Gordon Pennycook: There's a popular sense that people believe conspiracies because they're dumb and don't understand anything, they don't care about the truth, and they're motivated by believing things that make them feel good. Then there's the academic side, where that idea molds into a set of theories about how needs and motivations drive belief in conspiracies. It's not someone falling down the rabbit hole and getting exposed to misinformation or conspiratorial narratives. They're strolling down: "I like it over here. This appeals to me and makes me feel good."
Believing things that no one else agrees with makes you feel unique. Then there are various things that I think are a little more legitimate: People join communities, and there's this sense of belongingness. How that drives core beliefs is different. Someone may stop believing but hang around in the community because they don't want to lose their friends. Even with religion, people will go to church when they don't really believe. So we distinguish beliefs from practice.
What we observed is that they do tend to strongly believe these conspiracies despite the fact that there's counterevidence, or that a lot of people disagree. What would lead that to happen? It could be their needs and motivations, but it could also be that there's something about the way that they think where it just doesn't occur to them that they could be wrong about it. And that's where overconfidence comes in.
Ars Technica: What makes this particular trait such a powerful driving force?
Gordon Pennycook: Overconfidence is one of the most important core underlying components, because if you're overconfident, it stops you from really questioning whether the thing that you're seeing is right or wrong, and whether you might be wrong about it. You have an almost moral purity of complete confidence that the thing you believe is true. You cannot even imagine what it's like from somebody else's perspective. You couldn't imagine a world in which the things that you think are true could be false. Having overconfidence is that buffer that stops you from learning from other people. You end up not just going down the rabbit hole, you're doing laps down there.
Overconfidence doesn't have to be learned, parts of it could be genetic. It also doesn't have to be maladaptive. It's maladaptive when it comes to beliefs. But you want people to think that they will be successful when starting new businesses. A lot of them will fail, but you need some people in the population to take risks that they wouldn't take if they were thinking about it in a more rational way. So it can be optimal at a population level, but maybe not at an individual level.
Ars Technica: Is this overconfidence related to the well-known Dunning-Kruger effect?
Gordon Pennycook: It's because of Dunning-Kruger that we had to develop a new methodology to measure overconfidence, because the people who are the worst at a task are the worst at knowing that they're the worst at the task. But that's because the same things that you use to do the task are the things you use to assess how good you are at the task. So if you were to give someone a math test and they're bad at math, they'll appear overconfident. But if you give them a test of assessing humor and they're good at that, they won't appear overconfident. That's about the task, not the person.
So we have tasks where people essentially have to guess, and it's transparent. There's no reason to think that you're good at the task. In fact, people who think they're better at the task are not better at it, they just think they are. They just have this underlying kind of sense that they can do things, they know things, and that's the kind of thing that we're trying to capture. It's not specific to a domain. There are lots of reasons why you could be overconfident in a particular domain. But this is something that's an actual trait that you carry into situations. So when you're scrolling online and come up with these ideas about how the world works that don't make any sense, it must be everybody else that's wrong, not you.
Ars Technica: Overestimating how many people agree with them seems to be at odds with conspiracy theorists' desire to be unique.
Gordon Pennycook: In general, people who believe conspiracies often hold contradictory beliefs. We're working with a population where coherence is not to be expected. They say that they're in the majority, but it's never a strong majority. They just don't think that they're in a minority when it comes to the belief. Take the case of the Sandy Hook conspiracy, where adherents believe it was a false flag operation. In one sample, 8 percent of people thought that this was true. That 8 percent thought 61 percent of people agreed with them.
So they're way off. They're really, really miscalibrated. But they don't say 90 percent. It's 60 percent—enough to be special, but not enough to be on the fringe, where they actually are. I could have asked them to rank how smart they are relative to others, or how unique they thought their beliefs were, and they would've answered high on that. But those are kind of mushy self-concepts. When you ask a specific question that has an objectively correct answer in terms of the percent of people in the sample that agree with you, it's not close.
Ars Technica: How does one even begin to combat this? Could last year's AI study point the way?
Gordon Pennycook: The AI debunking effect works better for people who are less overconfident. In those experiments, very detailed, specific debunks had a much bigger effect than people expected. After eight minutes of conversation, a quarter of the people who believed the thing didn't believe it anymore, but 75 percent still did. That's a lot. And some of them, not only did they still believe it, they still believed it to the same degree. So no one's cracked that. Getting any movement at all in the aggregate was a big win.
Here's the problem. You can't have a conversation with somebody who doesn't want to have the conversation. In those studies, we're paying people, but they still get out what they put into the conversation. If you don't really respond or engage, then our AI is not going to give you good responses because it doesn't know what you're thinking. And if the person is not willing to think. ... This is why overconfidence is such an overarching issue. The only alternative is some sort of propagandistic approach where you sit them down, hold their eyes open, and try to de-convert them. But you can't really convert someone who doesn't want to be converted. So I'm not sure that there is an answer. I think that's just the way that humans are.
Personality and Social Psychology Bulletin, 2025. DOI: 10.1177/01461672251338358 (About DOIs).
Jennifer is a senior writer at Ars Technica with a particular focus on where science meets culture, covering everything from physics and related interdisciplinary topics to her favorite films and TV series. Jennifer lives in Baltimore with her spouse, physicist Sean M. Carroll, and their two cats, Ariel and Caliban.