Ideology and Social Networks: Why Smart People Believe Stupid Ideas

Praveen Benedict
6 min read · Nov 4, 2023

We all like to consider ourselves smart. We all like to believe that we are right. Myself included. But when I look back at certain ideas I believed in the past, I can't help but laugh at how stupid they were. And sometimes, when I look around me, I see so many smart and brilliant people holding very stupid, factually incorrect ideas.

I know a friend who believes that climate change isn't real and that it is a hoax by Chinese companies that sell solar cells. Another friend of mine believes that mRNA vaccines can alter your DNA, which makes no sense. But that isn't to say that those two people are dumb; they are smart, intellectual people, studying or working at prestigious places.

Now, when I first started asking this question, it came from the lens of politics and democracy. I believe that nothing is more important to a democracy than a well-informed electorate. But a lot of well-informed people I know believe in certain policy ideas despite overwhelming data suggesting that they are bad policies.

Looking back, I can point to economic policy positions I once held that make no sense. So why isn't being well-informed enough? Can good data or more information actually persuade people to accept that they are wrong and switch policy positions?

But before we try to answer that question, let's take a step back. Most of us reading this have very little clue about rocket science or cosmology. Yet we believe that the Earth is round, that multiple galaxies exist, and that Neil Armstrong landed on the moon. We trust that antibiotics work, even if we know nothing about microbiology or immunology.

So it isn't really the presence or absence of information that makes us believe or disbelieve certain ideas or facts.

This is where I came across some interesting research by Dan Kahan.

Unfortunately, I couldn't find a similar study in India, so some background is in order. In India, a majority of us, irrespective of political ideology, believe that global warming is real and that humans have contributed to it. In the US, that isn't the case: a good chunk of the population believes that global warming is natural and that human activity has nothing to do with it.

Now, let's get to Kahan's study. He constructed a test to measure scientific literacy alongside people's ideology, using climate change as the example. The general expectation among climate change alarmists is that if someone reads data about climate change, their concern about its impact should only increase. But that isn't the case, as per Kahan's study. It found that climate change skeptics turned even more skeptical after reading data that supports global warming.

In fact, their arguments against global warming become even more sophisticated after they read more data supporting climate change.

Another experiment by Kahan paints an even clearer picture of this problem.

In this experiment, he and his team gave people information about scientists: their biographies and their findings. For example, they came up with a fictional scientist named Oliver Roberts, described as a professor of nuclear engineering at the University of California, Berkeley. Then they wrote two versions of an article: one claiming that nuclear waste can be disposed of safely, the other claiming that it cannot.

When the researcher's name was attached to the article claiming that nuclear waste can be safely disposed of, people who already held that view were MORE LIKELY to consider the researcher an expert, while people who did not hold that view were LESS LIKELY to do so and dismissed him as not an expert. The same goes for climate change: when the researcher's results underscored the dangers of climate change, people who already worried about climate change were MORE LIKELY to agree that the researcher is an expert than people with opposing views.

The takeaway is that, in matters of high political or ideological stakes, people define an expert as “a scholarly individual who aligns with my viewpoint”.

To understand this better, I went on to read books that claim climate change isn't real. They are well-written and full of beautiful tables and graphs. But most of the material is irrelevant, and only someone who has spent time studying climate change will be able to spot that. The average Tom, Dick, and Harry will find such arguments perfectly fair.

But why is this happening?

Just think about this: most of our close social circles, be it friends or family, are made up of people who hold the same opinions. Now, Kahan says that the personal cost of being wrong about a scientific fact is practically zero, but the cost of holding an opinion that goes against the majority of our close friends or family members IS HUGE.

The effect is so strong that when faced with an opposing opinion, your brain finds ways to justify your existing position rather than yield to the evidence.

Kahan introduces a concept he terms “identity-protective cognition”. This is a psychological process in which individuals, often without realizing it, dismiss any evidence that contradicts the commonly accepted beliefs within their group. It’s a protective mechanism that helps them maintain their standing and identity within their community or social circle.

The beliefs we hold define "who we are", and that identity significantly shapes our relationships within our social circles.

This explains why many religious people, when presented with evidence against their beliefs, tend to offer justifications that sound scientific but make no sense. Their bond with their religious community is so strong that they cannot accept scientific evidence that would put them at odds with their social circle. And the effect is powerful enough that people genuinely perceive the evidence as "not enough".

Let's get back to the climate change problem in the US. Most conservative politicians believe that climate change isn't real, while most progressive/liberal politicians believe that it is. Where did this divide start?

I went looking for conservatives who initially believed that climate change isn't real and later changed their minds, and I came across Bob Inglis. Bob Inglis belongs to the conservative party in the US, the Republican Party. When he first said that climate change is nonsense, he knew nothing about it. All he knew was that the opposing, more progressive party, the Democratic Party, was campaigning on it; that one of its leaders, Al Gore, was passionate about solving climate change; and that Al Gore wanted to spend government funds to tackle it. Just to oppose the other party, Bob Inglis and his peers opposed any effort to tackle climate change.

The circle of conservatives now strongly opposes climate action. If you are a conservative, and you look at the data and decide that climate change is real and that you were wrong, changing your opinion means disagreeing with your circle: the people you work with and the people you have dinner with. This is exactly the problem Bob Inglis faced when his son disagreed with him on climate change. He had to choose between the facts his son presented and the opinions of his party peers.

For many people, being at odds with their social circle for the sake of a political opinion is just not worth it. That alone is enough to make them genuinely believe that the evidence being presented to them is not true.

This is the case with most of our beliefs. Not everything we believe starts with us forming opinions from mounds of research literature and fieldwork. It starts with what the people around us believe. It is why we vote for the party that our family or close friends vote for. It is why we are more likely to strongly believe in the religion our family believes in.

And as we read more opposing opinions, rather than genuinely consider them, we find ever more intelligent ways to justify our positions, just so we can stay within the bubble of our social circle. We are fine with fooling ourselves.

Hence, we don't search for facts; we search for "arguments" to justify our existing positions. It is almost like working as a press secretary for a politician: no matter what, your job is to defend the politician's views to the press.
