
‘For me, it's just about telling complicated, grey area human stories’

Jon Ronson and Sander van der Linden in conversation.

19 March 2024

Journalist and author Jon Ronson, whose most recent BBC podcast series 'Things Fell Apart' considers 'strange tales from the culture wars', meets social psychologist Sander van der Linden, author of Foolproof: Why We Fall for Misinformation and How to Build Immunity.

What follows is an edited transcript. Jon's contributions are in bold. You can also watch the full, unedited conversation via YouTube.

I think it was a nice idea that we should have a conversation, Sander: thanks to Jon Sutton, The Psychologist Editor, for suggesting it.

So your field of interest is pulling people out of the 'rabbit hole', de-radicalising people?

We do some of that, but I've shifted a lot of my focus to prevention, given how difficult it is to pull people out of rabbit holes. I do interview a lot of people who are former extremists and conspiracy theorists. We research why people believe these things, what are the predictors? How does the mind work when people believe in what are seemingly outrageous theories? And can we design real interventions to try to prevent that from happening? What effect might we have on people who are already playing around with these ideas?

Sometimes, we need more aggressive de-radicalisation interventions for people who are deep down the rabbit hole, although it's very difficult to do that. I've got some questions for you about your experiences with that. Maybe we could start with your motivation behind Things Fell Apart? I saw a clip of you with Louis Theroux, with a bit about the pop star Robbie Williams. Was that the catalyst for the show?

It wasn't the catalyst, but during lockdown Rob did contact me. Rob always has a foot in both camps – he's willing to go down rabbit holes, but he's always got one foot in the world of scepticism, which I think is a very healthy way of living your life. So he said he'd got interested in QAnon, and the thing that really made me laugh was he said, 'I don't know whether to believe or not. I've read some stuff that says that I am part of the paedophile cabal, and I know that's not true about me, but how do I know it's not true about Bill Gates?' We had a number of calls, but nothing came of it… 

But then at the same time, the BBC approached me and asked if I was interested in doing something about the culture wars. And really what motivated me was seeing friends plummeting down rabbit holes in a way that felt really deleterious to their well-being. It seemed to be happening more than ever. More and more people were talking about losing people to whatever culture war it was: people just changing. I wanted to figure out the mechanics of all of this – what does happen to somebody, between that tweet and that tweet two weeks later, when they seemed to be unrecognisable, or an extreme caricature of themselves?

Yes, I've had family members deeply embroiled in conspiracy theories, who got further radicalised during the pandemic. Before we get into the psychology of that, how do you get people to agree to come on the show? In the scientific community, that's a real challenge – conspiracy theorists don't want to participate in scientific research. Even when we say 'we just want to hear your thoughts, we want to get inside your mind, we're very empathetic to your situation, it's in the public interest to have this knowledge out there' – non-confrontational, you know – they just don't trust scientists, they don't trust scientific institutions. With your podcast, you do try to stand up for the facts, and give people a sense of what the science actually says. So they know they're not going to come out of this as the hero of the story. So what was your experience? Did you get any harassment after the show? I get this all the time… I get a lot of hate mail around doing this type of research.

I don't! Touch wood it never starts, because I think it would be very upsetting if I got negative feedback. I think there are a few answers to that question. Firstly, on the very rare occasions that we were turned down, it was Brits. I think Americans still feel a sense of the BBC being less partisan than a lot of the American media. The BBC is far from perfect, none of the old legacy institutions are, but the BBC works far harder at being nonpartisan… the American media prides itself on being partisan!

Another reason is I'm just very, very curious, enthusiastically curious, and I think that rubs off on people. I'm not an ideological person – I tend to take ideology out of the equation when I'm doing storytelling. I'm meeting people just where they are and I think people appreciate that. But yes, I'm not going to give somebody like Judy Mikovits an easy ride, because there is a danger to the things that she says.

What you're saying resonates with how we build psychological interventions. When we make games for the public, for example, we keep them non-political, try to make the environment non-judgmental… so it's not this elite institution telling people what to think or do. People are more open and willing to engage with that type of environment. But in terms of interviewing extremists, it's more challenging. I was listening to your episodes thinking, 'Is Jon going to challenge them? And how hard is he going to challenge them?'

My answer is, yes, I will challenge them. But not that hard!

That's interesting, because from the psychological research, we know that the people who are conspiracy theorists tend to double down when you challenge them too directly and too hard.

They get more extreme and nothing comes of it, they become defensive, and more taciturn. That's one of the reasons why I don't do a sort of Jeremy Paxman style. I just don't think it's my purpose, because then it becomes confrontational, or oppositional. When you do that, you lose the curiosity. It becomes about you, about you being the representative of righteous society. I'm more interested in getting the interviewees to a place where they will talk in an open and honest way, so they might give you some new insights on how society works. Once in a while, you have to challenge them if what they're saying is particularly dangerous or controversial… but it's certainly not the part of the process that I enjoy… it's my least favourite part of interviewing people.

I assume that in your research, you've never found it to work, if you just tell someone they're wrong?

Yes, we don't advocate for that with people who are on the more extreme side of things. But it is sometimes strategically important for the bystanders and for the audience. I think you balanced it quite well, because at some points you do fact check them. And I think it's really important for the audience to be aware of what the facts are. People might get confused if it's an equal back and forth. In the episode with Mikki Willis, he's talking about 300,000 people dying from AIDS drugs, and you then fact check that with a health economist. You put that to Mikki and he's just like 'oh, okay, well', and then the segment skips to the next question. I feel like whenever you challenged them, there was not much of a response – just a recognition that you disagree.

I think we get the balance pretty right. There's a lovely moment in that particular episode where I say, 'I don't believe that, you know, vaccinations didn't kill millions'. There's a pretty long silence, the two of us are pretty frosty with each other for a moment, and I think that's enough. That moment in the show, that's enough. I don't need to do any more than that. I'm trying to get different things from them. Is it good for your story to just see them angry at you?

It reminds me of Louis Theroux, visiting the neo-Nazis. They start suspecting he's Jewish. I'm Jewish, and I was watching this, and I was like, 'this is getting really awkward'. I see the camera team making their way out the door. It takes gumption to just stand there with this deadpan act, 'I'm not going to tell you if I'm Jewish or not'. But they were shutting down, no longer wanting to talk to him. It's an interesting dynamic. When you interview people, if there's this awkwardness, do you feel sometimes that you have to pivot? Do you try to avoid that level of confrontation?

Well, when I was younger, I used to go for awkwardness. My early documentaries, the things that comprise my book Them, and even before that, awkwardness was a big colour in my palette. I used to try and make things awkward. I really admired Louis for doing that. But to be honest, I just kind of changed. For years, I was a slightly odd presence in documentaries, making things awkward. People enjoyed it, and it was fun to watch. But that only takes you so far. It's not really journalism, it's more performance. Also, I just don't like conflict and I don't want to make people feel bad about themselves. And of course the BBC does have rules in place, you can't just allow certain things to go unchallenged. Luckily, it doesn't get to that very often… it's more about getting people to open up because of your curiosity as opposed to putting them in a corner to see how they react to that.

In one episode you tackle the 'Plandemic' viral video… I knew about that from research we did on what we call the seven traits of conspiratorial thinking. These are recurring patterns that are part of pretty much every conspiracy. There's always some nefarious intention, a victim in the story, something must be wrong. There are often incoherent things that don't really fit together. We use an acronym, CONSPIRE, to help people sort of spot these elements [Contradictory, Overriding suspicion, Nefarious intent, Something must be wrong, Persecuted victim, Immune to evidence, RE-interpreting randomness]. With Covid, the idea that by wearing a face mask you activate a virus that's already inside of you, but then also you're gonna get ill from taking the vaccine, it's just incoherent.

What I didn't know, though, was Mikki Willis' history before making that 'Plandemic' trilogy of films. You showed the evolution from his early days, to the pandemic, and I thought that was the peak, but no, it actually gets way more interesting. That story reminded me of what we find in the literature: a monological set of beliefs, where if you believe in one conspiracy, it starts to serve as evidence for the existence of another conspiracy. And so the probability that you endorse multiple conspiracies goes up, quite radically. It's a worldview where there's always a higher order conspiracy, and it's very hard to break through that. People start out with one conspiracy, but then they see patterns, and become enthralled. With Willis, there's that viral video clip where he's urging his kids to be open about their sexuality, and that then becomes part of a big plot by trans people. It fits with the literature on how that actually works. In one of my books I talk about Buckey Wolfe, a man in Seattle who thought his brother was a shapeshifting lizard, and killed him with a sword. He was not well, mentally. But interestingly, he was also a member of the Proud Boys, and QAnon. One conspiracy theory becomes the story for another, and psychologically that's how people get sort of trapped in this worldview.

What part do you think paranoia plays in that? A couple of months ago, I got this message, and it seemed passive aggressive. I thought 'Why is this person being so weird towards me?' I went back and looked at the message again, a couple of months later, and the message was completely delightful. Every negative feeling I had about this message was entirely in my own mind. That made me think about how easy it is to slip into paranoid thinking.

Yes, that's an interesting segue. As academics, we often joke about this… let's say when your paper gets rejected, even though the reviews are positive, you start thinking maybe there's some plot where the editors are conspiring against publishing your paper. There's actually a lot of this low-level conspiracy theorising that goes on for most people, in most settings. Maybe at your job, you didn't get a promotion, you start thinking people are against you.

People have argued about whether paranoia and conspiracy theorising are the same thing. They are conceptually different, but paranoia is a major predictor of belief in conspiracy theories. For the subclinical population, just regular paranoia is a major predictor.

We've also looked at the relation between ideology and belief in conspiracy theories – there's been a lot of debate about the right and the left and the differences. One of the things we found was that the link between ideology and conspiracies is actually explained by paranoia and mistrust. People who are really strong on distrust of official authorities, of the media, mainstream institutions, and people high on paranoia – those are the two most important predictors of belief in conspiracy theories.

And narcissism? I've always thought narcissism was a big part of it for a number of reasons. Firstly, when people with narcissistic disorders get wounded, the wound doesn't heal at all well. They tend to lash out. So if somebody who is narcissistically inclined is shamed, or excluded from their community, I've always thought that's one reason why they might turn to conspiracies. And I think the other connection is just that narcissists don't care as much about whether something's true or not. They're wrapped up in themselves, and truth only matters in relation to their place in the world. Do you think those things are true?

Absolutely. A lot of this is about how motivated people are by accuracy. In a typical experiment, where we ask participants some questions about misinformation, they'll give us not so great answers. But then if we start paying people for the right answer, all of a sudden, they know the right answer. So with the right incentives, you can actually make people a lot more accurate. And I think narcissism is a major distractor in terms of being accurate – it's all about self-obsession. It's looking inward, and there's this marginal relationship to what's true. It's more about 'how am I advancing my own interest?'

You mentioned a kind of vulnerable narcissism, but there's also a more grandiose type. Maybe Trump is an example, where grandiosity is the main motivator. But yes, there's tons of research on narcissism and belief in conspiracy theories, and that's absolutely another major predictor.

Then there are contextual things: how much time people spend on social media, where they get their news from, how cognitively flexible people are. We did some research on this, and found this construct called 'actively open-minded thinking'. Can you hold multiple hypotheses in your mind? Are you open to uncertainty? Is it okay to not have all the answers? Are you willing to take somebody else's perspective? People who are high on that level of flexibility, are very much protected from believing in conspiracy theories and misinformation. People who are low on that – who tend to be very rigid cognitively, they want things to be certain, a single explanation, clear causal effects – those people are more ideologically extreme, and also more likely to endorse extremist ideas. That's another major component of that type of thinking. That's not so easily addressed. You can't just teach people to be open minded… it's much more difficult than we thought.

This is such an important question. I've got friends, or former friends, who fell down rabbit holes of their particular culture war, and they seem so deep that I think this is it for the rest of their lives now. However, I've met a small number of people over the years who have completely changed. One was this guy called Josh Owens, who I met in 2016, at the Republican Convention in Cleveland, and he was Alex Jones' cameraman. I noticed he was spending quite a lot of time hanging out with me and not with Alex and Alex's friends. Eventually I could tell there was something on his mind. I met him after the convention, and he told me he'd been working with Alex for four years, and gone from being such a huge Alex Jones fan to becoming completely disillusioned. He wanted out. I did a little thing with him for This American Life, and I hooked him up with an editor, and he ended up writing a big piece for The New York Times Magazine, 'I worked for Alex Jones. I regret it'.

And then I was introduced quite recently by this podcast called Some Dare Call It Conspiracy to these guys, Brent Lee and Neil Sanders, who were also big Alex Jones fans and found their way out. What they have in common with Josh Owens is that nobody forced them out of the rabbit hole. They found their way out by themselves. It was self-motivated. So what is it about these guys?

There's not a clear common thread, but there are a few ways people get out. I've interviewed people like Caleb Cain, who fell into the YouTube rabbit holes. He watched tens of thousands of hours of right wing, extremely misogynistic videos. And then there's this YouTube influencer – Natalie Wynn, ContraPoints – and she does what some people refer to as 'algorithmic hacking'. She talks about the same issues as the radicalisers, the same key words, so that her content shows up in the feed of the extremists. I learned about this because Google actually have a programme called Redirect, where they redirect people after they watch extremist videos to something that is basically a 'contrapoint'. That's kind of how it was for Caleb, he started watching alternative viewpoints and slowly, slowly got out.

I think for Brent it was different. He mentioned that the whole QAnon thing just became so implausible, so bizarre… there seems to be a limit. There's a kind of informal concept in psychology, 'crank magnetism', which is that strange ideas attract each other. So people who believe in spirituality, are also into alternative medicine, are also into pseudoscience, magical thinking… it's this whole bubble that surrounds people. But some things don't fit. There's a contrasting effect, where if the ideas are too extreme, too strong, all of a sudden people think, 'What's going on here?' That idea actually informs an intervention called paradoxical thinking, where we take an issue to such an extreme level that people say, 'no, that's too much'.

Certainly what Josh said to me was that he had to go to upstate New York, where there was this Muslim community, and Alex wanted him to do a report about them – that they were all secret terrorists, and it was a training camp. Josh said that on his way back, he was on a plane, and he was sitting next to a Muslim family. And he was just feeling just incredible guilt. Like, 'This isn't why I wanted to work for Infowars, it was never about the Muslims'. I think that was a big moment for Josh.

That's interesting. And Brent does talk about how people feel bad once they realise they're endorsing something that's actually harmful.

But it does take a lot of bravery to come out. Brent lost all of his friends, he had no social network. You have to be strong for that. For most people, it's just not an option. If we look at how people get radicalised, the playbook tends to be to find vulnerable individuals, in a personal life crisis, gain their trust, isolate them from their friends and family, and then activate them to do something little for the cause. And then you go on to become extreme. So these people are often already vulnerable and isolated – it takes a lot of logistical and social support to get them out. De-radicalisation is not going to happen overnight – you have to keep talking to people, you have to be supportive.

Unfortunately, often that's not the case. When Josh wrote his piece in The New York Times Magazine, I looked at the comments. The great majority were, 'I don't forgive you, you should never have worked for Alex Jones to begin with.' I thought those comments were completely inappropriate and elitist. This guy's just done something incredibly brave, and all you New York Times readers are just going to attack him for making a mistake in the past?

When I was writing So You've Been Publicly Shamed, I was very aware that if some kid made a mistake, tweeted something unwise, the left would pile in on them. I don't think this happens anymore, but it was happening huge style 10 years ago. Who was waiting with open arms? The right. 'We're not going to judge you, come to us.'

I think that does still happen, and it's not a good response. I did some work on climate disinformation and when you have prominent Republicans like Arnold Schwarzenegger speaking out about climate, this should be celebrated. If they are accused or ridiculed, then we're not going to have any bipartisan support. People just realise they're not getting rewarded for going against their own group in some way.

Maybe this is the difference between a sort of progressive versus more classical libertarian perspective… progressives want to make people behave in a certain way…

I think so. And often they want it all and they want it now. They want people to be fixed, they want them to endorse the facts now, and it doesn't happen that way. We've got to be patient… de-radicalisation happens in slow steps.

So we started thinking about a different type of solution – pre-bunking or inoculation, actually giving people a weakened dose of the 'virus'. We deconstruct the misinformation in advance, so that when it actually happens to people, they're more immune. When we started doing this 10 years ago, people were very confused about this idea of giving people a weakened version of a conspiracy, and then deconstructing it in advance, and then testing people with the full dose a bit later on. It seems kind of uncomfortable. But simulating the types of ideological or cultural war attacks that people might be facing in the future, in weakened form, and allowing people to build up resistance through counter argumentation and counter evidence, finding their own ways of resisting it, turns out to be pretty strong protection. We do stuff on social media, little videos with YouTube … going back to what we were talking about earlier, we try to keep it light and not tell people what to believe or what to do. We show people a clip from Star Wars… are you a fan?

I've never watched a Star Wars movie. OK, I watched the very first one in 1977. My parents took me to the Odeon in Cardiff, and everybody in the cinema was clearly thinking this was the greatest moment in their lives… it washed over me. Ironically, the two actors who have played versions of me in movies have both been Star Wars people – Ewan McGregor and Domhnall Gleeson.

Ha! So we give people a clip from Revenge of the Sith… it's Anakin Skywalker – spoiler alert, he goes on to become Darth Vader – and he's talking to Obi-Wan Kenobi. He says, 'either you're with me or you're my enemy', and Obi-Wan says, 'Only a Sith deals in absolutes'. Then our narrator says 'don't join the Dark Side, watch out for false dilemmas'. We explain the false dilemma technique that's often used in propagandistic rhetoric. And we test people with real examples from social media: 'if you're pro-Israel, you're anti-Gaza', or 'if you're pro-Gaza, you're anti-Israel', or if during school shootings, you don't support a ban on automatic rifles, you're against the Second Amendment. False dilemmas take away all nuance and pretend there are only two options. Of course, there are many more. So we're using the exact same structure as the actual examples, but in a non-threatening context, and you can scale that across millions of people.

One of the criticisms we get is that we're not changing people's minds on specific issues, which is I think what progressives often like to see. Certainly, you can try to persuade people to believe in certain facts and evidence, but I think the first thing we need to do is empower people to spot these radicalisation techniques, and then they can make up their own mind. When you do talk to the conspiracy people, they don't engage in fact checks. They don't trust fact checkers. They don't trust scientists. But they don't like manipulation. And so when we say 'learning about manipulation techniques using non-political examples', they're intrigued. They're at least willing to come on board with step one and learn about these techniques. I often use the example of Alex Jones – this guy has told people for decades that the Sandy Hook shooting was a hoax with crisis actors, and then he admitted, in court, that was all made up. Sandy Hook was 100 per cent real. We don't want to be manipulated and duped by people who don't have our best interests in mind. And that seems to resonate with people and make them more willing to engage.

In some ways, your Things Fell Apart series is like a vaccination too… a weakened dose, with a little counter pressure, talking through the narrative in a polite and non-political way. That allows people to hear a more nuanced way of thinking about it, and maybe even immunise themselves from falling for these things in the future. And maybe you're like, well, that's not how I would see the show. But we use a lot of things from popular media as our interventions – perhaps we could use your podcast, see if people become more resistant to conspiracy theories after listening to the show.

You should do some A/B testing, have me and Louis Theroux and see who's better at de-radicalising people!

My main motivation is just telling human stories. I was at the Steven Pinker event the other day, and somebody asked him, 'everybody in this room probably feels the same way. How do you reach people outside of this room?' If I was on the stage, I'd have been saying human stories. Human stories, that twist and turn, just by their very nature, are more nuanced, easier to connect to. So I think of Things Fell Apart as being human stories. Hopefully, that will have the same positive impact as this 'weakened version'. But for me, it's just about telling complicated, grey area human stories. I think the reason I tend to be immune to those sort of culture war pile ons, is because I stick to human stories. It's hard for people to get angry if you're telling a nuanced story like that.

Absolutely. But it's hard for people to stick to the nuance when under pressure. Remaining nuanced is a hard skill to develop.

Well, I think it's easier now. One of the episodes was about the concept creep of trauma. That's a pretty right-wing argument, and I would have been scared to say that a couple of years ago, scared of being piled in on. But I wasn't, and nothing bad happened. I got one or two slightly grumpy messages, but nothing more than that. So I actually think things are calming down in general, when it comes to talking about these issues.

- Listen to Things Fell Apart now.

Sander van der Linden is a winner of the British Psychological Society's Book Award for Foolproof: Why We Fall For Misinformation, and How to Build Immunity. He told us about that book here.


Nominations for the 2024 round of the British Psychological Society's annual Book Award are now open.

This award recognises high standards of published works in psychology, with nominations accepted in any of the following categories: 

  • Academic monograph
  • Practitioner text
  • Text book
  • Popular science

Each category will receive one award. The award includes a commemorative certificate and the sum of £500.

Find more information about the award and how to submit a nomination.

The closing date for nominations is Sunday 23 June 2024.