Alexandra Freeman is the Executive Director of the Winton Centre for Risk and Evidence Communication, a role she took up in 2016. She previously spent 16 years working for the BBC, primarily as a producer and director for BBC Science. Alexandra is passionate about bringing science to the widest possible audience. Along with working in television she has also helped develop content for computer games, social media and websites, as well as formal learning resources.
Claudia Schneider is a postdoctoral research associate with the Winton Centre for Risk and Evidence Communication and the Cambridge Social Decision-Making Laboratory. At the Winton Centre she studies the communication of uncertainty about evidence to various stakeholders, particularly the unquantified ‘quality of the underlying work’. Claudia received her PhD in Psychology from Columbia University and also held a graduate research scholar position at Princeton University. Her research focus lies at the intersection of decision science and applied social psychology. Her work uses a combination of methods ranging from quantitative laboratory surveys to field studies in diverse cultural and social settings.
Episode Description
Communicating risk is difficult at any time, but during a pandemic, communicating risk well can be what keeps a disease from spreading, as one public health official has put it, like wildfire. During the COVID-19 pandemic, experts, journalists, and elected officials have all been working to find the most effective way to communicate risk to the public. Helping people understand their risks of infection, or of infecting others, can be the thing that gets them to follow mask mandates or other public health advisories. Effectively communicating risk during COVID-19 is the focus of this episode of Stats and Stories with guests Alexandra Freeman and Claudia Schneider.
+Full Transcript
Rosemary Pennington: Communicating risk is difficult at any time, but during a pandemic, communicating risk well can be what keeps a disease from spreading, as one public health official has put it, like wildfire. During the COVID-19 pandemic, experts, journalists, and elected officials have all been working to find the most effective way to communicate risk to the public. Helping people understand their risks of infection, or of infecting others, can be the thing that gets them to follow mask mandates or other public health advisories. Effectively communicating risk during COVID-19 is the focus of this episode of Stats and Stories, where we explore the statistics behind the stories and the stories behind the statistics. I'm Rosemary Pennington. Stats and Stories is a production of Miami University's Departments of Statistics and Media, Journalism and Film, as well as the American Statistical Association. Joining me are regular panelists John Bailer, Chair of Miami's Statistics Department, and Richard Campbell of Media, Journalism and Film. We have two guests joining us today from the Winton Centre for Risk and Evidence Communication at the University of Cambridge. Alexandra Freeman is the Executive Director of the Centre, a role she took up in 2016, after working for the BBC for 16 years, primarily as a producer and director for BBC Science. Her work has won numerous awards, including a BAFTA and the AAAS Kavli Gold Award for science journalism. Claudia Schneider is a postdoctoral research associate with the Centre and the Cambridge Social Decision-Making Laboratory. At the Winton Centre she studies the communication of uncertainty about evidence to various stakeholders, particularly the unquantified 'quality of the underlying work.' Alex and Claudia, thank you so much for joining us today.
Alexandra Freeman: Yeah, thank you lovely to be here.
John Bailer: Great to see you both.
Pennington: Just to begin our conversation, could you describe how you think about risk in the work that you do?
Freeman: It's a very subjective feeling. A lot of risk managers think about it as a probability, or a number, or at least, you know, the likelihood of something happening and the impact of that thing happening. But to the rest of us, risk is something that you feel: how vulnerable you are to that particular event, whether you're going to have to take time off work, whether you've got caring responsibilities, what your financial resources are. All of these will affect how risky you feel something is. So really, what we're trying to do when we communicate risk is to help somebody form that subjective feeling. And if you're trying to communicate how you feel about something to somebody else, it's really hard. I mean, it's like trying to communicate how you perceive a color to somebody else. So this is just a really tricky thing to do, and it's got so many different facets.
Bailer: Just even defining what it is, when you say it's a subjective feeling, and the response to that. I like the analogy that you have with color; I thought that worked very well as you'd written that up as well. I'm curious about what you hope to achieve with risk communication, that's the first part of the question. And the second part of the question is, how do you study risk communication?
Freeman: Okay, let me tackle the first bit and I'll let Claudia tackle the second. It was interesting listening to your introduction, because you talk about how important risk communication is to people, you know, following public health mandates. And really, what we in the Winton Centre are trying to do is quite different from what a lot of public health communicators are trying to do, or people making messages where they really want to change someone's behavior, or at least change their beliefs. What we're trying to do in the Winton Centre is, very overtly, to inform people but not to persuade them. So we work on how best to communicate numbers in a way that helps people understand them and then form their own opinions about them. We would never set out to try and encourage people to wear face masks. We just want to help people understand what the situation is, so that they can make a decision in their own mind, in their own circumstances. And that might also help them make decisions that are actually more sensible for their own circumstances, or more generalizable as well. Because what I've noticed, for instance, in the UK is that there are a lot of government guidelines, a lot of government regulations, and I hear people discussing, you know, 'oh well, we're allowed six people in the room, so the seventh person has to be outside the door, but as soon as that sixth person comes out then the seventh person can come in.' And you're like, you've just not communicated the principle here. So people are making decisions that aren't necessarily the best in the circumstances, because they're not understanding the whole underlying problem and the whole underlying potential range of solutions. So now over to Claudia to talk about how we study it.
Claudia Schneider: So, we do qualitative and quantitative research, to try to get at it from different angles and also because they both give different kinds of insights. It obviously depends on the exact domain and question that we're asking. As Alex said, risk is defined differently in different domains, and you would look at it in different ways depending on your exact research question. So for instance, early on, when the pandemic hit, we ran a range of quantitative surveys across different countries that looked at people's risk perception and the predictors, or factors, that are associated with it. And how we conceptualized risk perception was that we came at it from the cognitive but also the affective angle, because it has these different components. So we had some measures in there where we asked people, you know, how worried are you about this, to get at the affective component, but then also questions like how likely do you think it is that you will be affected or get sick in the next six months, and how likely do you think it is that family and friends will get sick from the virus. Because these different components, the severity, the perceived likelihood and the affect, all play a role in shaping people's risk perception.
Richard Campbell: So can you talk a little bit more about how you did that? I mean, it's really easy to say 'oh yeah, we surveyed a bunch of countries,' but there's a lot of devil in the details. So can you describe a little bit about the process by which you did that study?
Schneider: So all of these were online samples, you know, people completing our surveys on their computers or tablets, and we recruited them in at least three different ways, depending a little bit on the country. First of all, I should have said that we ran this survey in 12 different countries. And for the different countries we had different survey platforms. We used Prolific Academic, which is a very well-known one in the psychology field, for the UK and for the US. We then gradually started using Respondi, which is another surveying company, more for the other countries, and in Australia we used yet another platform. So we used these different platforms to get those national samples. But then we also compared between the platforms. So for instance, for the UK we have samples that used Respondi and samples that used Prolific, and we often tried to go half and half, in case there were any kind of effects with regard to these sampling platforms. Now, we don't have true probability samples for our different countries, but we had samples that were nationally representative on age and gender. We used interlocking quotas to get at least to a certain level of representativeness, so that we don't just have any random online sample, but really tried to improve our data quality and get to the maximum data quality that we could.
Bailer: Just a quick follow-up before I turn this over to Rosemary. With doing online surveys, and this is an interesting question to me, are there groups that are missed? I'm trying to picture, in the process of doing a study like this, is it possible that maybe some of the most vulnerable populations are actually not being represented in studies where you're relying on this kind of online survey methodology?
Schneider: I would say absolutely, because online already means someone has to have access to a computer, and also be willing to participate, right? Because this is research, so there's a consent form, and it takes people being interested enough to sign up on these platforms, and then interested, once they get the offer, to actually take our studies. So, absolutely. And I think that's a genuine caveat with a lot of social science research. But given that we use the quotas for age and gender, we're already covering a lot there, because we don't just have, you know, young males 30 to 35, but everyone from 18 all the way to above 80, men and women, in these different countries. So we're trying to get there. But absolutely, the most vulnerable we're probably not getting.
Freeman: I think we see that in the demographic details that we're not quota-ing on: some of our samples are pretty highly educated, much more highly educated than the general population in that country. We occasionally quota-ed on an ethnic balance as well, but where we don't, we know that they're not ethnically representative of the populations. And some of the important factors that we're looking at are to do with trust, and trust in institutions. Those have a real impact, we see, on things like risk perception and how much people trust information from different sources. And I very much suspect that our populations are not at all balanced on trust in institutions, because that's exactly the kind of thing that I would expect not to be balanced in the samples we're getting through these kinds of recruitment procedures.
Pennington: That's actually what I wanted to ask about, because you framed the work the Winton Centre is doing as communicating, but not an attempt to persuade, right? Like, here's the information, hopefully you can find it useful to make choices in your life. But I wonder if that kind of communication has gotten more difficult. It's been noted around the world that there is this sort of skepticism about science, whether it's related to vaccinations, or climate change, or public health interventions in various places. And I wonder if the role of your Centre in figuring out how to communicate these things has just gotten more difficult, or the work of your Centre has gotten more difficult, or how you're thinking about navigating that sort of situation.
Freeman: Wow, that's a really interesting thing to bring up, because I think there is a feeling that trust in science and trust in scientists has decreased. But we ask these questions in our surveys all the time now, and we've not seen a decrease in trust in science, or scientists, or medical experts across the whole of 2020 in the UK; we've been sampling every six weeks or so in the UK. We have seen changes in trust in government and its ability to deal with COVID, but that general level of trust in scientific expertise has stayed pretty high. And I think that's true in quite a lot of countries. We have seen a decline in willingness to get vaccinated, which is interesting. Trust in expertise is one of the factors that comes out as very important in people's decision to say they'd be willing to get vaccinated, and of course we're only asking people theoretically in our surveys. That, I think, is a really interesting, concerning trend. But it may be to do with people's feeling about the quality of the evidence underlying vaccines. Claudia can talk about her work on that, looking at people's reactions to quality of evidence; perception of quality of evidence seems to be very important in their willingness to then believe it and take action on it. And certainly, anecdotally, I hear a lot of people going, 'oh well, you know, the vaccine's not been around long, we've not got that much evidence about it, I think I'll hang on and wait and see how it goes.'
Bailer: Just a quick question as a follow-up. It seems like there were some surprises. I mean, I wasn't surprised to see that the level of support for science was relatively stable, although the question is, is that a low level? I mean, is it that it wasn't great to start with and it just hasn't changed?
Freeman: I mean, obviously all these things could be sitting along the bottom, yeah, it's awful and nobody's changing it. No, it was better than that. And politicians and government didn't seem to fare as well.

Bailer: In terms of your studies, I was just curious, what was the most surprising result when you were doing this analysis, if there was one that kind of popped out? And also, what kinds of differences were really remarkable as you look between countries, or among countries?
Schneider: I'll start with the second one, differences that are remarkable or not. What we found in our first big comparison, which at that time, I think, was of 10 different countries, was that the predictors of risk perception were actually quite similar in the different countries. So, the big ones that came up quite consistently were personal experience with the virus at the time, so whether you know somebody who was affected, or were affected yourself, and so on. And then there were worldviews: whether they had more of an individualistic worldview or more of a prosocial worldview. Those who believe that it's important to do things for the good of society and others, thinking about the vulnerable people, they showed a higher risk perception. Maybe because, quite early on, it was clear that this virus was affecting different parts of populations in different ways. And then the third big one was the extent to which people talk about the virus with family and friends, interactions on the social level. Which is interesting, because we might think that things like what we know about the virus, the more objective knowledge, should have been the biggest driver of risk perception. But we actually found that it was the social-cultural aspects that had the bigger impact on people's perceptions, and that was across the different countries.
Pennington: You're listening to Stats and Stories, and today we're talking with Alexandra Freeman and Claudia Schneider of the Winton Centre for Risk and Evidence Communication at the University of Cambridge. Alex, you come from a journalism background, so I wonder what advice you would give to journalists who are trying to communicate this information about risk to their audiences. What do you think should be guiding them as they try to frame these stories in a way that is useful and serves the public good?
Freeman: Oh, gosh. Gosh, that's such a difficult question. The funny thing is that when I worked in the media I wouldn't have called myself a journalist; I was, you know, making documentaries. I always had a story in my head. All my work was about trying to lead people through a story to understand the conclusion that I, or somebody else, would come to based on that information. And all the sorts of things you use in a documentary, the music, the filming, it's all designed to help people on this journey and control their emotions and how they feel about things. What I have really come to realize, working in the Winton Centre, is that there is an entirely different way of approaching the kind of information that people might need, especially in a time like this: a much more neutral presentation, not trying to help people see that something is big or small, or worrying or not worrying, but just giving them the information in a way that allows them to make up their own minds about it. And maybe I'm just completely blind to the fact that journalists do this all the time, and as a documentary maker I had a different kind of opinion on it. But I think there is an entirely different set of guidelines to follow when you're communicating in that really neutral way, and all you're trying to do is produce things that can be well understood and can be seen as being trustworthy. And these are things like communicating the uncertainties; not worrying so much about keeping everything really short and really clear and making it really easy, but instead making it really trustworthy and really balanced.
Campbell: I really like this idea of trustworthy communication being the touchstone and the goal here, but it often seems like the communication of the uncertainty is really what's lacking, that there's almost a false sense of precision, that we don't trust people to be able to be consumers of uncertainty. How do you respond to that?
Freeman: You know, we have been doing work on this, because it's such a common thing that you hear: that just putting in the uncertainty will undermine people's trust, that it'll give them too much to think about, that people don't want uncertainty because it makes it harder for them to make decisions. And it may be true that none of us want uncertainty; it'd be lovely if the world had no uncertainty in it, but that's just not the world we live in, unfortunately. And we shouldn't be denying uncertainty that's there. Ethically, and from a trustworthiness perspective, we shouldn't be denying it. But also, further down the line, if you're very certain about something and then turn out to be very wrong about it, that's going to undermine people's trust in you as a communicator, or as a scientist, much more than if you'd been open all the way along. And I think that's a lesson that some people are learning for the first time in this pandemic. We were so uncertain earlier in the year: we didn't know how the disease worked, we didn't know how fast it was going to spread, we didn't know how many cases there were, we didn't even know how many deaths there'd been last week. Our uncertainty spanned every possible dimension. And if a foolhardy journalist stepped out there and said things that were very certain, they were sure as hell going to be wrong the next week. So I think people learned that lesson. And so in the work that we've been doing on this, we've been presenting people with precise numbers, or precise-looking numbers, and numbers with a numerical range around them, so, you know, saying it could be between x and y. And then we look at people's trust, asking them to rate their trust in those numbers, their trust in the producers of those numbers, the amount of uncertainty they feel about it, all of those kinds of things.
We've been finding that people are completely fine with accepting uncertainty. They perceive it, so they'll say this is a bit more uncertain, but it doesn't undermine their trust in the producers of the information, and it only slightly undermines their trust in the number. And the more precise you can be, you know, if you're giving an actual numerical range, then that's not a problem at all. If you're more vague about your uncertainty, using verbal uncertainty, saying 'oh, there may be some uncertainty,' well, of course that's a bit unsettling. But on the whole, people accept uncertainty. And we found that in all our surveys across 10 countries.
Pennington: I want to ask this question about context. I was reading this blog post that you wrote about the strange world of risk perception and communicating risks, and you make this point about numbers only ever making sense in a context with which people are familiar. And I wonder how much that familiarity with the context also plays into our willingness to accept uncertainty, as well as the figures themselves.
Freeman: So we did look at uncertainty communication in a numerical form in a lot of different contexts: things like migration figures, unemployment figures, climate change numbers, COVID numbers. But all of those are relatively unfamiliar. The language of numbers is not one that we all use day to day, except possibly in terms of money, or possibly time. Those are the sorts of things where we're used to having a number, and we know what, you know, $100 is worth, however much we feel it is in this context. So we have a sense of the context around money and time, but those are probably the only cases where the language is really familiar. And it's interesting, because we were talking to some researchers who study uncertainty communication in financial situations, and I suspect that they have quite different results from ours, because it's a really familiar context to people. But the fact that we didn't find people were worried about the uncertainty, even in our kind of unfamiliar numerical world, I think probably means people aren't worried about the uncertainty. It could just mean that people completely ignore that bit of the numbers in the middle of a sentence, although they did score that there was uncertainty around it; when asked how uncertain this number was, they were able to recognize that it was uncertain.
Bailer: I was going to say that, you know, some of us don't want to see uncertainty and variability disappear, because then our jobs are gone. This is job security for us. So let's not go too crazy here, Alex.
Freeman: I don't think you need to worry.
Bailer: There's a lot of job security here then. Okay, good, I'm glad to hear it. So I want to follow up on acting in this uncertainty. There's a component of, yes, you're communicating information and trying to educate, to have people understand where things stand, but then, ultimately, will this translate to action? So let me turn this over to Claudia. You've done some studies looking at uncertainty and how people are responding. Could you talk a little bit more about that with us?
Schneider: Can I turn that question around a little bit and follow up on something that Alex said before, when we talked about the uncertainty and how it doesn't undermine trust when we give the numbers with a range? That is true. However, and this is something that's dear to my heart, because that's what I study, that's my focus, there is a component of the quality of the underlying evidence. You can give people a number, and it can be a precise number, or it can be a number with a range or confidence intervals, so we have that uncertainty in the direct uncertainty sense. However, then there's the underlying set of information: what is that number based on? Is it based on some RCTs, or observational studies? Do experts agree on this or not? Do we have a lot of data that underlies this figure, or is there a lack of data? And we found that people really care and are really interested in knowing what the quality of the underlying evidence is, because that is an indicator of how much they can trust this information, and surely then, in the case of behavioral advice, like wearing eye protection or wearing face masks, whether they should follow that advice or not. And why is it important? In our studies we found that people react to these different quality-of-evidence levels. In our work we looked at high quality, low quality, unknown quality, or just not giving people any information at all. And what we see is that when people are explicitly presented with information telling them that the underlying quality is high, the levels at which they read that information, in terms of trustworthiness, or how effective they feel the intervention is, or how much they would make decisions based on it, are actually on the same level as in groups that weren't presented with any quality-of-evidence information at all.
Now, what does that mean in a practical sense? If we communicate information that might not be based on the highest quality of underlying evidence, and we just don't talk about the quality indicator, then it could be that people are naturally assuming that the quality is fairly high, or good enough, when it might actually not be. And this matters, because we see that people react very strongly to low quality of evidence: it definitely changes their perceptions and their decision making. Therefore we think it's really important to have not just a number and the direct sort of uncertainty, but also to think about how to communicate this quality of evidence.
Bailer: Yeah. Everything is so easy in this game, isn't it? It's so easy to talk about quality, risk, and how you respond. As we're nearing the close of our episode, I'm curious if there's one takeaway message you might give us, for people who think about communicating information. If there's one thing you would say, 'make sure you worry about this,' when you're talking about risk and evidence, what would it be?
Freeman: Okay. So the one thing I would say, and it's not any of the things that I've written about before, is this: if you only care about how many people read your article or quickly act on your information, it's really easy to get feedback on how well you have done, and so that's an easy thing to aim for. But if, when you think in your heart of hearts, what you're trying to do is to inform people, then I would say, think about how you can tell whether you're doing a good job or not. Once you've reflected on that, I think it brings to mind quite a lot of issues. So I would say, think about what you're trying to achieve, and then think about whether you are measuring the right things to know whether you're achieving it.

Schneider: Sort of similar, but I also think it's about investigating what the kind of information you put out does to people's reactions and perceptions, how they take that information. So it's not just, 'oh, I present this in this kind of way, with this kind of graph,' but really thinking about what effects it could have, such as, for instance, with the quality-of-evidence insights. Because if we don't think about that, we might communicate in a way that creates biases, that is actually not trustworthy, or that persuades someone in some kind of way, and then we might not even be aware of those pitfalls. So really thinking through, what am I communicating, but, as Alex said, also how do the receivers perceive that information?
Bailer: What a fun conversation. That was great.
Pennington: Stats and Stories is a partnership between Miami University’s Departments of Statistics, and Media, Journalism and Film, and the American Statistical Association. You can follow us on Twitter, Apple podcasts, or other places you can find podcasts. If you’d like to share your thoughts on the program send your email to statsandstories@miamioh.edu or check us out at statsandstories.net, and be sure to listen for future editions of Stats and Stories, where we discuss the statistics behind the stories and the stories behind the statistics.