Inoculating Your Mind | Stats + Short Stories Episode 209 / by Stats Stories

Sander van der Linden is Professor of Social Psychology in Society in the Department of Psychology at the University of Cambridge and Director of the Cambridge Social Decision-Making Lab. His research interests center around the psychology of human judgment, communication, and decision-making. In particular, he is interested in the influence and persuasion process and how people gain resistance to persuasion (by misinformation) through psychological inoculation. He is also interested in the psychology of fake news, media effects, and belief systems (e.g., conspiracy theories), as well as the emergence of social norms and networks, attitudes and polarization, reasoning about evidence, and the public understanding of risk and uncertainty.


Episode Description

The information age has been rife with more misinformation than any other time in human history. With the dissemination and spread of fake news at an all-time high, can people be trained to spot and pre-bunk misinformation? That's what we're here to learn about on today's episode of Stats and Short Stories with guest Sander van der Linden.

Check Out Bad News Here

+Full Transcript

Bailer
The information age has been rife with more misinformation than any other time in human history. With the dissemination and spread of fake news at an all-time high, can people be trained to spot and pre-bunk misinformation? That's what we're here to learn about on today's episode of Stats and Short Stories with guest Sander van der Linden.

Sander van der Linden is Professor of Social Psychology in Society in the Department of Psychology at the University of Cambridge and Director of the Cambridge Social Decision-Making Lab. His research interests center around the psychology of human judgment, communication, and decision-making. In particular, he is interested in the influence and persuasion process and how people gain resistance to persuasion (by misinformation) through psychological inoculation, which is where I'd like to start. You've used the expression "the value of open-mindedness," and, you know, Sander, I want to live there. I want to help people join me there. I want to do some development of that community and welcome others to it. It seems like a lot of your work has been targeted at trying to do that. So you did an experiment. I know that some of your early interests were related to some aspects of belief in climate change, and you did an experiment to see if you could impact shifts in perception about whether or not people believe in climate change. Could you just describe a little bit of what was done, and then what was observed as a result?

Linden
Yeah, absolutely. So we were interested in trying to understand how public perception of the science on climate change gets manipulated, not only by the media landscape, but also by some of the disinformation actors who've been purposely spreading misinformation about climate change. And one of the questions that we had was, well, if it has a negative impact on people's perception, can we preemptively immunize people against it by making them aware of the techniques of manipulation beforehand? That was really the idea, following the vaccination analogy: just as you expose people to a weakened dose to trigger the immune system to help fight off infection, maybe we can do the same with the mind by giving people a weakened dose, a simulation of the types of attack that you might come across, and then preemptively refute it. Can we make people's attitudes relatively more resistant to those attacks? And so we did an experiment with about 2,000 Americans, and we randomized people into different groups. One group just received the facts as they are: you know, most scientists say that humans are causing climate change, very simple. The other was a misinformation condition where they were exposed to a petition, and the petition is misleading in the sense that it's not verified. It's a website anyone can go to, where it says thousands of scientists signed a petition saying global warming isn't happening. They've removed it now, but Charles Darwin used to be on there, you know, the Spice Girls; anyone can sign this petition. But people don't understand what we call the fake expert technique. And so you say, oh, this petition is signed by all of these fancy scientists, and global warming isn't happening.
In another condition, we paired the two together, which was to simulate the early sort of false-balance media landscape, where typically a climate scientist is debating a contrarian, and people get this false perception that there's a lot of debate on the issue. There was also a control group, who just did a word puzzle. And then there was what we call the inoculation condition, where we preemptively said, look, there are people out there trying to manipulate you for political purposes; you should be aware of that. And you might hear about some petition saying that climate change isn't real, signed by all these thousands of scientists, but this is a manipulation strategy, because actually, here's how it works; here's how many signatories actually validated this petition. This is what we call the fake expert technique, used to try to influence your opinion. And then later in the experiment, we let people go to the website and find out for themselves. What we found, comparing across conditions, was that, yeah, if you were just exposed to the petition, you started to have quite a negative perception of the science of climate change. If you were just exposed to the facts, then you were more aligned with the facts. But the unfortunate truth is that we don't live in a vacuum where people just get the facts, right? And so in the condition where the two were contrasted, people were confused; the power of the facts was completely wiped out by the presence of this contrarian petition, completely wiped out. And then we found that in the immunizing condition, the pre-bunking condition, it wasn't perfect, but we were able to preserve a large amount of the effect that we would have seen in the facts-only condition.
So people were not fully endorsing the scientific consensus, but much more than they would have if we hadn't pre-bunked the misinformation. It was about one-third to two-thirds of the effect of the factual condition, which was quite a large effect in itself. So yeah, maybe a 30% boost in people's resistance to the misinformation. That's not full immunity, but now people were able to disentangle it and make up their own mind. And I think that's the whole point of it, coming back to the value-of-openness analogy. The idea is not so much that we tell people what's true and what's false, but that we empower people to unveil the techniques of manipulation so that they can make up their own mind.

Pennington
That's really interesting, because I know in the field of health communication there's been a lot of work to try to educate people about misinformation around health, or to educate people about, say, HIV/AIDS awareness, which is something I know a fair amount about. But there's always a struggle in that the educational messages are not always well received or influential; they don't often do the job, right? So people create these interventions where they want to help people get vaccinated, or do something to improve their health, and for some reason the intervention doesn't work. And it's sort of what you have talked about here, and what also shows up in that COVID study that you co-authored. I wonder what the difference is. Is it just saying there is some false information out there, and making that clear, that's helping the intervention? Because I do think that's an interesting perspective, an interesting layer you're adding to this attempt to educate people.

Linden
Absolutely. I think there are a few elements to it. One is that the forewarning in itself can be effective sometimes; it typically has a smaller effect, but it is useful because people are not always alerted to the fact that they're going to be misled, and nobody wants to be duped. So I think if you tell people in advance, particularly the people who consider themselves truth seekers, they don't want to be manipulated by anyone, right? So when you tell people that they might be vulnerable to manipulation, all of a sudden they're paying attention. I think that's a useful way to start the message. And then the power of the pre-bunk really is that, yeah, you get some resistance sometimes when you try to tell people what the facts are and what the misinformation is, but we find that people are less resistant when you talk about the techniques that are used to mislead people; people seem to be more receptive to that. So if I tell you, not that you're wrong because you don't believe in climate change, but that there are people who use fake experts to try to convince you that there's a lot of disagreement, then people are more amenable to reconsidering their stance on the issue. We find that's a useful approach to engage with more hesitant audiences: showing them what the techniques are that are used to deceive people, rather than trying to force an opinion, which is often the approach in health communication, saying this is the right answer, these are the facts. So it's kind of a middle ground. I would say it doesn't elicit the type of behavior change that you might want from a health campaign, usually, but it is a way, I think, to more effectively communicate with hesitant audiences. And to the climate experiment, you were mentioning Bad News earlier.
I mean, one of the things that we came up with was, well, maybe it's kind of boring; maybe people don't want messages about climate change. So we decided we were going to create a game and gamify this whole approach, and we're not going to take ourselves too seriously. We're going to help people in a simulated environment, a simulated social media environment, come to terms with all the problems in social media and some of these techniques. And we found that was a nice way to engage people, for people to learn something and not feel like they're learning something.

Bailer
So I did the game. You know, I thought it was a how-to guide to become a conspiracy enforcer. Really, you know, Sander, I got my badges on impersonation and emotion and polarization and conspiracy and discrediting and trolling. I really felt like I had earned the next rank in terms of being kind of a really malicious actor. So I'm really glad that you mentioned this Bad News software, because I was going to ask if it was in part inspired by some of that experimental work you had done, with the idea of the vaccination as a way of trying to address the fact that fake news would negate simple facts in presentations. Was this part of the inoculation strategy?

Linden
Yeah, absolutely, that was part of the motivation behind it. One of my graduate students at the time was influential in bringing the game idea to the lab. But the whole point was, maybe also thinking about the messenger, that people don't want to receive messages from authoritative figures; maybe we should create a game that's online, that anyone can share and use. And we explicitly created a sort of jolt of "oh, I'm stepping into the shoes of a misinformation producer," right, to activate people's immune system, so to speak, to get the antibodies flowing through experience. And they're all weakened doses, in the sense that the game uses humor and stuff that's obviously too ridiculous to be true, but it gets the message across. So it is very much inspired by that inoculation design and the weakened-dose metaphor. And so when people come out of the simulation, as you go through the levels, they should be relatively more immune. Sometimes I use a magic show metaphor: when you go to an illusionist, you might be duped the first time around, because you don't know how it works. The standard fact-checking approach is, here's a scientific, technical blueprint of the trick. Or, instead, we could spend some time backstage, and you could step into the shoes of the illusionist, and then the next time you see it, you're never going to be duped again, because at the end of the day people really experience it, and once you have the experience, you're not going to be fooled by it again. Now, some people have asked us, aren't you worried that you might create a fake news troll?
I mean, the thing is, how many people go on to be a professional illusionist after learning how a trick works? So we're not so concerned about that, because we're not revealing the incentives for spreading fake news. We don't show people how to make money or how to gain political favors. We've tried to keep it, you know, pretty clean; other than that, it was meant to be a fun experience.

Bailer
Oh, it was great, and we'll certainly want to encourage people to explore that as they go forward. I think that if your lab had a theme song, it would probably be The Who's "Won't Get Fooled Again." Well, I'm afraid that's all the time we have for this episode of Stats and Short Stories. Thanks again, Sander, for joining us.

Linden
My pleasure.

Bailer
Stats and Stories is a partnership between Miami University’s Departments of Statistics, and Media, Journalism and Film, and the American Statistical Association. You can follow us on Twitter, Apple Podcasts, or other places you can find podcasts. If you’d like to share your thoughts on the program, send your email to statsandstories@miamioh.edu or check us out at statsandstories.net, and be sure to listen for future editions of Stats and Stories, where we discuss the statistics behind the stories and the stories behind the statistics.