Sander van der Linden is Professor of Social Psychology in Society in the Department of Psychology at the University of Cambridge and Director of the Cambridge Social Decision-Making Lab. His research interests center around the psychology of human judgment, communication, and decision-making. In particular, he is interested in the influence and persuasion process and how people gain resistance to persuasion (by misinformation) through psychological inoculation. He is also interested in the psychology of fake news, media effects, and belief systems (e.g., conspiracy theories), as well as the emergence of social norms and networks, attitudes and polarization, reasoning about evidence, and the public understanding of risk and uncertainty. In all of this work, he looks at how these factors shape human judgment and decision-making.
Episode Description
The spread of misinformation is of increasing interest to researchers around the world. It’s been tied to the 2016 and 2020 US elections, Brexit, the COVID pandemic, and the Russian invasion of Ukraine. Some have called the glut of misinformation a pandemic in its own right. A researcher at Cambridge University suggests that, as with other pandemics, the solution just might be a vaccine. That’s the focus of this episode of Stats and Stories with guest Sander van der Linden.
Full Transcript
Rosemary Pennington Just a reminder that Stats and Stories is running its data visualization contest to celebrate its 300th episode. You can grab data about the show to analyze and submit your entry at statsandstories.net/contest. Your entry has to be there by June 30.
The spread of misinformation is of increasing interest to researchers around the world. It's been tied to the 2016 and 2020 US elections, Brexit, the COVID pandemic, and the Russian invasion of Ukraine. Some have called the glut of misinformation a pandemic in its own right. A researcher at Cambridge University suggests that, as with other pandemics, the solution just might be a vaccine. That's the focus of this episode of Stats and Stories, where we explore the statistics behind the stories and the stories behind the statistics. I'm Rosemary Pennington. Stats and Stories is a production of Miami University's Departments of Statistics and Media, Journalism and Film, as well as the American Statistical Association. Joining me as regular panelist is John Bailer, emeritus professor of statistics from Miami University. Our guest today is Sander van der Linden. Van der Linden is Professor of Social Psychology in Society in the Department of Psychology at the University of Cambridge, and director of the Cambridge Social Decision-Making Lab. His research interests center around the psychology of human judgment, communication, and decision-making. In particular, he's interested in the influence and persuasion process, and how people gain resistance to persuasion by misinformation through psychological inoculation. That's the focus of his new book, “Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity.” Sander, thank you so much for joining us today.
Sander van der Linden
So excited to be on the show.
John Bailer Oh, that's not fake news, is it? You know, as I was going through your book, I've really been enjoying it. I love how you end it, because as someone who aspired to being a data self-defense teacher, the fact that you were aspiring to be a Defense Against the Dark Arts teacher really spoke to me, Sander. I think your book does a great job of trying to paint that picture. So before we dive into some of the specific details, could you give us a little bit of a framing or a structure for the way that you've organized your thinking in writing “Foolproof”?
Sander van der Linden Yeah, absolutely. So, you know, when I was trying to conceptualize this book, I was thinking about the way that misinformation spreads and how it affects people. And I ended up dividing the book into three parts. The first part is really about why the brain is susceptible to misinformation, and why we are susceptible. And every part of the book follows the viral analogy; the viral analogy runs throughout the book. In the first part, you know, just as viruses hijack cells with the aim of taking over some of their machinery to try to reproduce themselves, the same really is true for misinformation: it can distort people's memories, it can control some of what we think, and it also has the goal of replicating itself. Then in the second part I look at how it spreads online, in social networks, but also some of the history behind the spread of disinformation. And again, here there are these interesting analogies where we can take models from epidemiology that are used to study how viruses spread and actually, you know, use them in a fairly unaltered way to study how misinformation spreads in social networks. And then there's the third part of the book, where I come to the conclusion that if a lot of misinformation behaves and spreads like a virus, maybe we can inoculate or vaccinate people against it. And that's where I outline all of the research that we've been doing on that front. And that's kind of how the book came about.
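For readers curious what "using epidemiology models in a fairly unaltered way" can look like, here is a minimal sketch of a standard SIR (Susceptible-Infected-Recovered) compartmental model reinterpreted for misinformation, where "infected" means actively sharing a false claim. The parameter values and interpretation are illustrative assumptions, not the book's fitted models.

```python
# A standard SIR model, reinterpreted so that "infected" means actively
# sharing a false claim. Parameters are illustrative assumptions only.

def simulate_sir(beta=0.3, gamma=0.1, s0=0.99, i0=0.01, days=100, dt=1.0):
    """Euler-integrate the SIR equations; returns a list of (S, I, R) states."""
    s, i, r = s0, i0, 0.0
    history = [(s, i, r)]
    for _ in range(int(days / dt)):
        new_sharers = beta * s * i * dt   # contact with sharers "infects" susceptibles
        recoveries = gamma * i * dt       # sharers lose interest or are corrected
        s -= new_sharers
        i += new_sharers - recoveries
        r += recoveries
        history.append((s, i, r))
    return history

trajectory = simulate_sir()
peak_day, (s, i, r) = max(enumerate(trajectory), key=lambda t: t[1][1])
print(f"Peak sharing on day {peak_day}: {i:.1%} of the population")
print(f"Basic reproduction number R0 = beta/gamma = {0.3 / 0.1:.1f}")
```

As in epidemiology, the key quantity is R0 = beta/gamma: when each sharer "infects" more than one new person before losing interest, the claim spreads; below one, it dies out.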
Rosemary Pennington Could you walk us through how you conceptualize misinformation, and how that compares to “disinformation” and “fake news,” two other terms that get used a lot?
Sander van der Linden
Yeah, that's a great question. And I want to be nuanced about this, you know, given how politicized the term misinformation itself has become. So I define misinformation as anything that's either false or misleading in some way. And that could just be due to simple errors, right? So it's not necessarily intentional. Whereas disinformation is misinformation coupled with some psychological intention to deceive or harm other people. And then I should caveat by saying, okay, but who decides what is misinformation? You know, I'm more focused on this idea of helping people calibrate their judgments of how reliable or trustworthy a piece of content is based on the presence or absence of misleading techniques, rather than telling people what's true or false necessarily. But certainly we have some ground truths that, you know, we can all relate to, that can help us discern whether or not a piece of information is false, whether it's expert consensus, or science, or fact checkers, or any other established means of reaching some conclusions about the nature of truth.
Rosemary Pennington
One of the things that I found really interesting in the book, I can't remember which section it's in, is where you discuss, I think in relation to climate change, how the existence of a skeptic made people feel like the debate was more fair, regardless of the truth, right? The fact that there was a skeptic voice made it seem like maybe whatever they were hearing was trustworthy. Could you talk about that, the importance of the skeptic in relation to that?
Sander van der Linden
Yeah. So, you know, in journalism, for some time there has been this norm, which sometimes is a good norm, of balance. So you always have to have another viewpoint in a story. And that can be a good heuristic for journalists, because, you know, it's not good to always just promote the views of one particular expert. But sometimes that heuristic can really go wrong, when we're comparing what one scientist is saying about the scientific consensus on, let's say, the safety of vaccines or climate change, to that of a contrarian, because it gives people the perception that there's an even-keeled debate happening, right, that there is balance, while in fact the weight of evidence overwhelmingly points in one direction. And so, you know, we have some rules for trustworthy communication that we work on. And one of those rules is to give people balance, but not false balance. So in situations where, say with medical procedures, there are harms and benefits, you have to outline those for people, right. But when we talk about things like scientific consensus, what happens in people's minds is that they hear two voices debating. What should be happening is that we hear one voice say something, and then ideally the other voice should sound like thousands of voices in our head saying the same thing, right, because there are thousands of independent scientists who have concluded something. But that's not what's happening. People only hear two voices, and they perceive that there is a balanced debate. And that misrepresentation of the weight of science and evidence, I think, is hugely problematic in the media. And it's not just fringe outlets. I mean, the BBC had a whole report on this, and they apologized for it. And this is also what I think distinguishes legitimate outlets that have editorial norms and can own up to a mistake and say, look, we have good journalistic practices and we made an error, and now we're going to move on, from outlets that are fringe, have no norms, and keep producing misinformation. So the BBC also participated in propping up contrarians for a long time and confusing people about climate. They've admitted to this, and they've stopped doing that. And I think that also differentiates trustworthy from non-trustworthy outlets.
John Bailer
Yeah, having the different perspectives but not giving equal weight to them, that nuance is really the critical part of that story as you describe it. You know, what I really liked about the book was the frequency with which you've infused stories of history throughout. I mean, while certainly the sense of the maliciousness of propaganda comes up as we think about World War Two and the Nazi efforts in that regard, you push this back even to stories of Roman times. So could you take us back to one of the early greatest hits of fake news?
Sander van der Linden You know, when I was writing the book, I always felt this sense that people were going to wonder, how is this different from the past? Haven't politicians always been lying to us? And I thought maybe I should try to give people a historical comparison throughout the book, to try to benchmark what's happening. The furthest I could trace some of these disinformation campaigns was to the Roman Empire. It goes back further, but the evidence gets thin, and, you know, I've gained a new appreciation for historians in trying to ascertain the factual validity of references from the Roman era, and the debate among historians about who wrote what and who said what. So for a book on misinformation, I tried to be as accurate as possible. These are often called the info wars of Rome. Julius Caesar was assassinated, and that was a real conspiracy, by the way: the senators conspired to take him down because he was gaining too much influence. And then Mark Antony and Augustus, Caesar's adopted heir, were at war over who was going to, you know, pursue his legacy and take the reins. And they both instituted what would now be described as really sophisticated disinformation campaigns. I refer to one of their tactics as an archaic tweet: Augustus had coins minted with his slogans on them, to try to convince the public that he was the true heir. And they used some of the same techniques that are used now. For example, Augustus would say that Antony had run off to Egypt with Cleopatra, that he was this outsider, bewitched by this woman and under her influence, and that Romans couldn't trust him. And, you know, there was this sort of growing suspicion of foreigners at the time, and so they played into that with their disinformation campaigns. So a lot of the techniques that we observe today were also used by the Romans in fighting for public opinion. And that, I think, was the interesting part to me about that case study. And even, I mean, this is contested, but there's the will they may have fabricated: as we know from the famous Shakespeare adaptation, Antony and Cleopatra, Antony died by suicide, but he left a will. And Augustus tried to spin this will as ultimate evidence that Antony wasn't a true Roman, because, you know, he wanted to leave some of his legacy to Cleopatra. And that ultimately convinced the senators that Augustus should become the true ruler of what would become the Roman Empire.
Rosemary Pennington
John and I were talking about this before, and we both really appreciate this historicization. But I wonder, what is it about now that is so different from, you know, World War Two or ancient Rome? What makes misinformation so potent now?
Sander van der Linden
So I think there are a few factors. It's about the speed with which information can travel, and also the medium through which it reaches us. So I do some calculations based on the best historical estimates of how long it would take the Roman courier system to spread a message from one end of the empire to the other. And, you know, at best that would have taken a week by horse, if the message bearers didn't take breaks. Now, and I calculate this in the book, if you send a message to your WhatsApp group, which can have a maximum of 20 people, and they forward it to another group with a maximum of 20 people, even accounting for the limits that WhatsApp has placed on forwarding, you can reach millions of people within a split second. We looked at some research that investigated the diffusion of misinformation online. We know that, not always, but in many cases, misinformation can spread faster, further, and deeper than true information. It reaches more people on average than factual information. And I mention in the book the old saying, which, again, is also debated as to who said it, that a lie can make its way halfway around the world before the truth even has a chance to put its pants on, or in the UK, its shoes on, or whatever the variation of the saying is. And it's interesting that studies do show that a lie can make its way around social media to millions of people around the world before the truth even has a chance to put its shoes on. So that, I think, is different. But then there are also the ways in which misinformation is delivered now online. We have micro-targeting, which allows for the efficient distribution of fake news. We can scrape people's digital online footprints, and then target messages at them in a way that tries to maximize persuasion potential and efficiency. And those tools weren't available back then. The medium itself is also different. Think about AI-generated images, deep fakes, right? Technology is making it much more tricky for people to discern what's true and what's false. And so we're trying to deal with all of these technological advancements, I think, that make both the spread and the identification of fake news much more difficult. There's also much more of it, just the volume of it. And we can talk about how exposure does not equal infection, but, you know, there's just a tremendous amount of content available. With the rise of the Internet, the barrier to entry has been reduced: anyone can now be a producer of news and information. And so people are bombarded with it. And I think that's really what's different. What concerns people about post-truth, I think, is that at the same time we have unprecedented access to the facts, just one Google search away, yet somehow we're in a situation where people are not necessarily seeking out the facts. And that's kind of the paradox, I think, of having to deal with these new technologies: we're not always using them in the public interest, but rather to target people, potentially undermining elections and public health. And so that's, you know, that's of interest to me.
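To make that forwarding arithmetic concrete, here is a back-of-the-envelope sketch under the idealized assumptions in the conversation above: every message reaches a group of 20, each recipient forwards it to one fresh group of 20, and no groups overlap. Real forwarding limits and audience overlap would slow this down.

```python
# Idealized forwarding cascade: each message reaches a group of 20 and every
# member forwards it to one fresh, non-overlapping group of 20. These are
# simplifying assumptions; real limits and overlap slow the cascade down.

GROUP_SIZE = 20

reach = GROUP_SIZE   # hop 1: your own group
hop = 1
while reach < 1_000_000:
    hop += 1
    reach *= GROUP_SIZE
    print(f"hop {hop}: {reach:,} people reached")
# By hop 5 the message has already passed three million people.
```

Compare that with a week by horse for a single message across the Roman Empire, and the change in speed is not incremental; it is a change of kind.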
Rosemary Pennington You're listening to Stats and Stories. And today we're talking to University of Cambridge’s Sander van der Linden, about his new book, “Foolproof.”
John Bailer
You know, as you were talking about that, I found myself thinking about other conversations we've had in the past about things like news deserts, where some of the local coverage of news in communities is disappearing. So there's now this concentration of sources, where you have to go to a common source to get the news. There's been a loss of some of those types of data sources and news sources, at the same time that there's this proliferation of other places you might find information. But ultimately, it seems that it comes down to a question of discernment. What is it that can help people, as they are looking at sources, to achieve some of the things that you described? I mean, among your antigen tips for the spread of misinformation, you mentioned this idea of micro-targeting, and awareness of that is one of the ways you can say, well, gee, this could be happening; other ways include thinking about whether or not you're living in an echo chamber, or receiving some filter bubble based on what you've done. Can you help us think about skills or tools for discerning the source of information?
Sander van der Linden
Yeah, you mean, in terms of the source of the outlet?
John Bailer So if I'm looking for information about something, say this new book “Foolproof” I've heard about, and I go out there and do a search for it, I'm going to get a whole bunch of returns. What kind of guidance might you give me to think about the best sources to investigate, to dive into, to learn more about something?
Sander van der Linden
Yeah, well, one of the things that I mention in the book that I think is important is for people to try to look at the presence or absence of common manipulation techniques. And so when we're looking for things online, I'm not talking about the stuff that's obviously false, like Flat Earth or things like that. But there are often subtle manipulation techniques that are used to influence people, and we're not always aware of those. People tend to think, oh, you know, I'm not going to be duped by Flat Earth, but that's not what we're talking about. It's the use of emotions to try to influence people. So this could be fearmongering or creating outrage; polarizing headlines, framing things just a little bit so that they pit two groups against each other and get people riled up about a topic; creating a cloud of doubt around mainstream narratives to instill conspiratorial forms of thinking in people; or trolling people, sometimes headlines kind of troll people, and that's clickbait; or what we call discrediting, which covers rhetorical techniques used to discredit the person rather than the argument. And there's a whole range of these techniques that we focus on, others including false dilemmas, where you're presenting people with two options and pretending there are only two options when in fact there are more, taking the nuance out of the situation, or scapegoating groups. And what we try to do is help people identify those techniques, so when they come across specific content they can make up their own mind. And this is important, I think, because sometimes credible outlets can unwittingly amplify misinformation, and the source won't necessarily help you there. And that's why I think, you know, operating at the level of what manipulation is present, both as a function of the content and maybe also in terms of political bias, matters. I mean, we should also remember that every outlet has some political bias, and that's totally fine, but we should try to keep it in mind. So what's the level of manipulation, what's the level of bias, and then people can be empowered to make their own judgment. To give you a specific example, and you can tell me what you think about this headline, the Chicago Tribune reposted this story during the pandemic: “A healthy doctor died two weeks after getting the COVID vaccine; the CDC is investigating why.” Now, the Chicago Tribune is a reputable outlet. If you look at independent fact-checkers, the source is good. They're not outright lying; a healthy doctor did die two weeks after getting the COVID vaccine. But in terms of statistics, right, what's happening here is that correlation is framed in a way to imply causation. And there was no evidence at the time that these two events had anything to do with each other. And so that could be construed as misleading, as trying to influence people. It was one of the most shared stories on Facebook and became very prominent in anti-vax groups. And so that's where I think, rather than just saying that's false, people might want to keep in mind that there's a framing technique being used to influence their opinion on vaccinations, and then make up their own mind about how they feel about a headline like that. So that's kind of where we're going with this particular approach.
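As a quick statistical aside on why that headline misleads: with millions of people being vaccinated, some deaths will follow vaccination within two weeks by coincidence alone. Here is a minimal sketch with illustrative round numbers; these are assumptions for the sake of the example, not figures from the episode.

```python
# With millions of people vaccinated, some will die of unrelated causes shortly
# afterwards purely by chance. Round illustrative numbers, not episode figures.

vaccinated = 10_000_000          # assumed number of people recently vaccinated
annual_death_rate = 0.008        # assumed all-cause deaths per person per year
window = 14 / 365                # two-week window as a fraction of a year

expected_by_chance = vaccinated * annual_death_rate * window
print(f"Deaths expected within two weeks by coincidence alone: {expected_by_chance:,.0f}")
```

Under these assumptions, roughly three thousand such deaths would be expected by chance, which is why a single case carries no causal information on its own.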
Rosemary Pennington You know, when I talk to people about misinformation and disinformation, the response that often comes back at me is, like, if people would just use their brains, they would figure out what's true. But in the book, you talk about how our brains are not necessarily always going to seek out the kernel of truth. So could you talk about how our brains function in this, and what might drive us to actually be taken in by some of this misinformation we see?
Sander van der Linden Yeah, I think there are certain things that are pretty universal about the brain. I mean, it's a big world, but a lot of people are susceptible to this. Most people rely on what we call a truth bias, which is the fact that most of the time we just assume that stuff is true. And from an evolutionary standpoint, that kind of makes sense, because people aren't lying to us constantly, all of the time, or at least that didn't use to be the case, right. And because you're bombarded by so much information, you have to be selective; you can't stop and think about every piece of information analytically. So it makes sense to assume that most of the things people are telling you are probably true. But when you shift to environments where the base rate of misinformation, or misleading information, is much higher, like social media, applying such a heuristic can lead you astray. And that's, I think, where things get tricky. There are other related phenomena, like illusory truth, which is the simple finding that the more often you repeat a claim, the more likely people are to think it's true. And this is very pernicious, in the sense that the brain uses fluency as a signal for truth. Fluency is about the ease with which something is processed: the faster you can process something, the more fluent it is, and the more likely the brain is to think that there's some truth value to it. So stuff you've heard before is fluent, because you can process it faster, whereas new stories that you haven't heard before are complex, and that slows you down. So it doesn't operate in the same way, and manipulators can take advantage of that. So let me do a quick test with you guys: Moses took two animals of each kind on the ark. Is that correct or not?
Rosemary Pennington I'm not going to answer because I read this.
John Bailer
I read this book, too. Okay. You're not gonna get me? No.
Sander van der Linden Okay, there you go. So, most people say, oh yeah, in the biblical story there were two animals of each kind, right. But no, it was Noah, actually, not Moses. And so what happens here is that the prior knowledge people have is not activated in the moment. People are not accessing it when trying to discern things. And that's why knowledge doesn't always protect you from being duped by misinformation. And that, I think, is a very important finding, because we assume that just general education is going to help, and I'm sure that it's good and helpful in some respects, but it's no guarantee that people are not going to be duped by specific misinformation. And the last thing I'll say is that, aside from these general cognitive mechanisms, and as far as psychology goes, universal is a big word, but I think our estimates say that up to 75% of people, even kids, are susceptible to things like illusory truth. There are also motivational factors, right? We're a social species; people want things to be true because they belong to a certain group or a political party, or they have spiritual, religious, or other motives that lead them to selectively attend to and spread information about a particular topic. And that's obviously a big part of why we see the spread of misinformation. In fact, we know from our studies, where we tried to predict the virality of online content on Twitter and Facebook for millions of posts, that dunking on the other side is what gets rewarded on social media. So, you know, it's also about the social incentives there.
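One toy way to see the base-rate point behind the truth bias: defaulting to belief is accurate exactly as often as your information environment is truthful. The environment names and base rates below are illustrative assumptions, not measured values.

```python
# If you default to believing everything (the "truth bias"), your accuracy is
# simply the share of true claims around you. Base rates are assumptions.

environments = {
    "everyday conversation": 0.95,
    "mixed media diet": 0.80,
    "low-quality social feed": 0.50,
}

for name, p_true in environments.items():
    # "Assume true" is correct exactly when the claim really is true.
    print(f"{name}: believe-by-default accuracy = {p_true:.0%}")
```

A heuristic that is right 95% of the time around honest speakers becomes a coin flip in a feed where half the claims are misleading, which is the shift Sander describes.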
John Bailer You know, when I thought about this model, this image of virality, I started thinking about inoculation and vaccines. I mean, clearly, I hope we're post-COVID, but as we think about this, when the vaccine was originally rolled out, it went first to the most susceptible populations. Eventually it went across all ages, including the very young, to protect them. And not only that, there was a period of time where it was thought to be effective, but then boosters were needed. So I was wondering if you could reflect a little on when we should start introducing these types of ideas, or when it would be effective to start this inoculation. And what would a booster look like for us?
Sander van der Linden
Yeah, so all of the analogies, I think, ring true for the psychological vaccine as well. So, you know, generally it would be ideal to roll them out to the people who are most susceptible, right? So who are these people? People who tend to be more politically extreme, people who spend more time getting their news from social media, people who are more generally paranoid about societal affairs, people who are low in actively open-minded thinking, which relates to how flexible you are with regard to evidence and being open to changing your mind; its absence is kind of a form of cognitive rigidity. So people who are more extreme are not very flexible in their thinking. But how can you get the vaccine to those types of individuals? Obviously, as scientists, we're not going out there implementing this ourselves, but we have found that partners like the World Health Organization or the United Nations, you know, during crisis situations, can help target our interventions at the people who might benefit from them the most. Another solution is to team up with some of the organizations that run a lot of the flow of information in society. So we decided to team up with Google, which owns YouTube, as you might know. And one of the ways in which they thought this could be helpful was, you know, they said, if you really want to scale this approach, what if we put some animated videos that we created in the ad spaces on YouTube, those annoying ads on YouTube, and that's where the prebunk would go. And I love this idea, because suddenly, instead of the sort of ads that are usually used to boost profits for companies, we could hijack that space for science and test people in the ad space after they've been exposed to these videos. We did some testing with them, and it took years for them to actually do this. But, of course, they wanted everything to be neutral, and I think that's important, too. So our false dilemma prebunk goes something like this. Are you guys Star Wars fans? Not really? And so, yeah, there we go. The false dilemma, the weakened dose, the prebunk here, is not about some real hot-topic issue; it's a clip from Revenge of the Sith. So you have Obi-Wan Kenobi talking to Anakin Skywalker, who says, you know, if you're not with me, you're my enemy. And then Obi-Wan replies, only a Sith deals in absolutes. And then the narrator says, who wants to be a Sith, right? So don't use these techniques. And that's kind of the vaccine, so to speak, in that context. And that was great for the YouTube audience. The effects were, you know, smaller than in the lab, because people are distracted on YouTube, but we wanted it to be a real test. And they scaled this across millions of people. Ultimately, what I think they could do is make these non-skippable in the ad space across billions of people, so that we don't have this problem of having to find and target people; they could just expose everyone and make it non-optional. I mean, they haven't done this, but Google is certainly trying to get them excited about the research; we work with the research arm of Google, right? And so that's kind of where we're headed. And then we do know that it does wane over time.
In our test period, there was an average delay of 18 hours between exposure and test. But we know from our own research that if you do nothing, the vaccine does wane over time, whether it's weeks or months. And so you need to boost people. So you ask what a booster looks like. It turns out it doesn't have to be the whole treatment again. You could show people some slides, you could show them a shorter version of the video; that's kind of what we did in some of our boosters, just a brief segment of the video. You could have them play a game; it could be anything. The crucial thing that we uncovered is that it's about motivation and memory: people lose motivation to want to defend themselves, and they also forget some of the lessons. So what you need to boost are these two processes, in some shape or form.
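Here is a minimal sketch of that waning-and-boosting dynamic, assuming a simple exponential decay of the inoculation effect with full restoration at each booster. The two-week half-life and the booster schedule are illustrative assumptions, not estimates from the research.

```python
# Inoculation effect decaying exponentially, with boosters restoring it.
# The two-week half-life and the booster days are illustrative assumptions.

HALF_LIFE_DAYS = 14
BOOSTER_DAYS = {30, 60}           # brief refreshers at one and two months
DAILY_DECAY = 0.5 ** (1 / HALF_LIFE_DAYS)

effect = 1.0                      # effect right after the initial treatment
for day in range(1, 91):
    effect *= DAILY_DECAY         # motivation and memory fade a little each day
    if day in BOOSTER_DAYS:
        effect = 1.0              # a short booster restores motivation and memory
    if day % 15 == 0:
        print(f"day {day}: remaining effect = {effect:.2f}")
```

The design point mirrors the biological analogy: without refreshers the effect decays toward nothing, while even brief, cheap boosters keep it near full strength.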
Rosemary Pennington
Sander, given all of this work that you've done, how optimistic are you for the future around this issue of misinformation?
Sander van der Linden
Yeah, it's interesting. A lot of the journalists I talk to seem pretty pessimistic and, you know, pretty cynical, and they're asking me how optimistic I am. My students always say that I'm an eternal optimist. I think that's because, you know, I spend a significant portion of my time working on solutions, which makes me optimistic. I certainly don't think that prebunking or inoculation is the silver bullet that's going to fix everything, but I do see some hope in the way that it's being adopted and rolled out, that we're starting to have a good first line of defense. And then obviously fact checking and debunking are important too, and we need more coordination, in the sense that we want this to be part of educational curricula, have it be adopted in schools, and generally roll out critical thinking skills as they pertain to digital literacy specifically. But I'm optimistic that we can get there. You know, I think we're in a very toxic climate right now, with people being worried about being told what to believe. I think what we need to focus on is maybe not telling people what they need to believe, but empowering them to discern manipulation techniques so that they can make up their own mind about evidence. Giving people those skills, I think, is the most important thing we can do. But in order for that to work, we need to roll it out widely, institutionalize it, and get bipartisan support for those initiatives. So that's what I try to spend my time doing, and that's what makes me optimistic. There are certainly things that make me less optimistic. Like, you know, Twitter was one of the organizations that was an early adopter of the prebunking initiative during elections, for example, but Musk fired the whole prebunking team, and so we're taking a few steps back there. So you win some, you lose some, but overall I stay optimistic. I do see challenges with AI-generated content and deep fakes; it's going to get more difficult. But what we do is always try to think ahead to the next challenge and inoculate people against that. And that's kind of how I stay optimistic, I think.
Rosemary Pennington
Sander, that is all the time we have this week. Thank you for being here.
Stats and Stories is a partnership between Miami University’s Departments of Statistics, and Media, Journalism and Film, and the American Statistical Association. You can follow us on Twitter, Apple Podcasts, or other places you can find podcasts. If you’d like to share your thoughts on the program, send your email to statsandstories@miamioh.edu or check us out at statsandstories.net, and be sure to listen for future editions of Stats and Stories, where we discuss the statistics behind the stories and the stories behind the statistics.