Hey, I'm Working On A Story And Can You Help Me With The Statistics? | Stats + Stories Episode 52


Rebecca Goldin (@rebegol) is a professor of mathematics at George Mason University and the Director of STATS at Sense About Science USA. She has received several grants from the National Science Foundation to support her research in mathematics and in statistics education, as well as the Ruth I. Michler Memorial Prize from the Association for Women in Mathematics. Her work with STATS has appeared in many media sources, including NBC, CBS, NPR, CNN, and the Washington Post. She has worked with individual journalists from the Wall Street Journal, New York Times, NPR, 538, and many other media outlets, and has run workshops for journalists and students alike. In 2014, STATS became part of Sense About Science USA, and a major collaboration with the American Statistical Association was established. She continues to direct the effort to work with journalists to improve statistical reporting.

Full Transcript

Rosemary Pennington : Journalists are trained mostly as generalists. Those who attend journalism schools or programs learn how to tell stories about a wide array of subjects and are expected to pick up subject expertise while on the job. Some expertise is easier to come by than others; figuring out the ins and outs of p-values and standard deviation, for example, can be tricky. The site stats.org is designed to help journalists navigate the turbulent waters of statistical reasoning and is the focus of this episode of Stats & Stories. Stats & Stories is a partnership between Miami University's Departments of Statistics, and Media, Journalism and Film, as well as the American Statistical Association. I'm Rosemary Pennington. Joining me in the studio is regular panelist John Bailer, Chair of Miami's Statistics Department. Richard Campbell, Chair of Media, Journalism and Film, is away today. Today's guest is Rebecca Goldin. Goldin is a Professor of Mathematical Sciences at George Mason University and the Director of stats.org. Her work with the site has been featured in a number of media outlets, and Goldin has received several grants from the National Science Foundation for her academic research as well as for her work with stats.org. Thanks for being here, Rebecca.

Rebecca Goldin : Thank you for inviting me.

Pennington : Just to get things started, can you talk a little bit about how you got started with stats.org?

Goldin : Stats has had quite an evolution over the years. We started, I don't know, probably more than ten and less than fifteen years ago, so we started a long time ago. And we were something of a "gotcha" organization, looking at how reports were being crafted and then reported in the media and often noticing how things were done wrong. But it didn't take us long to realize we weren't having the kind of impact that we really wanted if all we did was criticize rather than getting involved in the creation of media information. And so, we shifted our focus over the years. We now work much more on the level of education and communication, trying to get journalists to get their questions answered before they write their stories, to dig into the statistical part and the quantitative part of their work, and to be a real resource for them in multiple ways. So, that's how I got involved. Originally, I think I wanted something more than doing just mathematical research, so when I had the opportunity, I really jumped at it, and it's been a blast.

John Bailer : Very cool stuff! So, what's been the hardest statistical concept to convey? What's the most frequent question that you get?

Goldin : Okay, so those are two different things! If you asked me the hardest question I've ever fielded, I think all of your listeners would just turn off the show right away. Because it can be really hard! I think sometimes there are journalists who really want to dig in deep to the statistical methods of a paper, to try to understand if they're appropriate, if they're the right method. Other times they want to say, "Hey, can you explain to me what a p-value is?" and that conversation is usually about 45 minutes. Just to say what it is. So, there are topics in statistics that have a veneer of being "accessible," where you'll just say something really quickly and we'll all understand. And as a matter of reality, it's often very challenging to get those ideas across. I hope you don't put me on the spot and make me explain right now. But that's probably the hardest concept that's come up with frequency: "what's a p-value?"

Bailer : Well, certainly the fact that the ASA has had two conferences related to it… suggests that it's not a trivial topic to understand and to communicate.

Goldin : That's right. And interestingly, it's one of these topics that there's reason to want to communicate, because it's often used as some sort of benchmark for scientific validity, much to the dismay of statisticians. So, I think that there is reason behind that, when you're looking at studies and want to understand the statistical language. One of the things I emphasize with journalists is the difference between English language and statistical language. They can often differ in significant ways.

Pennington : You mentioned that stats.org started as this "gotcha" space a little bit and then shifted. I was wondering, did that shift impact how receptive journalists were to the work that you were doing? I would imagine, having worked as a journalist, that having someone put me on the spot for reporting that I maybe didn't communicate very well would probably make me defensive. So, I just wonder, since you changed the focus of the organization, have journalists come on board a bit more?

Goldin : Absolutely. I think that there's a huge difference between finding what somebody did wrong and being available to talk to them beforehand. It's absolutely human nature, right? Nobody likes to be told they did something wrong. If you were told that you did something wrong, you would never then go back to that person to get support the next time around. So, we were totally ineffective in talking to journalists, though maybe we were effective at communicating the ideas in a different way. But since we've turned to talking to journalists beforehand, where it's just general education and a general offering of support, what we've found is that there's enormous enthusiasm. Sometimes journalists have a specific story in mind and want to ask a question, and sometimes they want to attend a workshop and get some more general principles under their belts. In fact, I think that almost every journalist I've ever talked to has said they wished they had more comfort with numbers and things quantitative. And I don't think that's restricted to journalists. I feel I wish I understood more about statistics. I think many statisticians feel that they could grow as statisticians. And I think people who have no communication background, nor quantitative background, wish they had more quantitative background. So, it's kind of this common thing: you wish that you understood the numbers, and statisticians feel that as well, even as they spend their lives trying to understand it. But journalists are super, super enthusiastic about it.

Bailer : That's neat. You mentioned earlier trying to differentiate between English language and statistical language. Can you give some examples you use when you're talking to journalists about that?

Goldin : Yeah, there's great fun with these. The biggest one, actually, I think the most common, is what you guys hit on earlier when we talked about p-values, and there's this concept called "statistical significance" that's related to p-values. And the idea is that if the data you collected in doing some study are pretty unlikely to occur under the assumption that there's nothing going on, essentially… if the data seem pretty unlikely, pretty unexpected in some way, you have a low p-value. And then you say that you have reached "statistical significance." And this is this benchmark. People will publish results saying, "this result was statistically significant," and the reality is that there is very little connection between statistical significance and what we mean in English by the word "significant." Usually by "significant" we mean that something is important, right? Something is impactful, it's going to change the world somehow. Or at least in some small way it's meaningful. It usually means, in our minds, that it's what we call "clinically significant": there's some outcome that's going to change due to that work. But statistical significance doesn't mean that at all. When people talk about significance, it's natural to think "okay, this is important, I'd better pay attention," but that's not actually what's meant by it. And I think that leads to a lot of confusion.
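[Editor's note: a minimal sketch in Python of the idea Goldin describes, that a low p-value means "these data would be surprising if nothing were going on." The coin-flip study and its numbers are hypothetical, chosen purely for illustration, and the simulation is one standard way to estimate a p-value, not the method of any study discussed in the episode.]

```python
import random

# Hypothetical study: 100 coin flips came up heads 62 times.
# Null hypothesis ("nothing is going on"): the coin is fair.
flips, observed_heads = 100, 62

# Simulate many repeats of the study under the null and count how
# often chance alone produces a result at least as extreme as ours.
trials = 100_000
extreme = 0
for _ in range(trials):
    heads = sum(random.random() < 0.5 for _ in range(flips))
    # Two-sided: at least as far from the expected 50 as 62 is.
    if abs(heads - flips / 2) >= abs(observed_heads - flips / 2):
        extreme += 1

p_value = extreme / trials
print(f"estimated p-value: {p_value:.3f}")  # roughly 0.02

# A low p-value is what gets labeled "statistically significant."
# It says the data would be surprising under the null hypothesis;
# it does not say the effect is large, important, or "significant"
# in the everyday English sense.
```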

Pennington : You're listening to Stats & Stories, where we discuss the statistics behind the stories and the stories behind the statistics. The topic today is the work of stats.org. I'm Rosemary Pennington. Joining me today is Miami University's Department of Statistics Chair, John Bailer, and our special guest is stats.org Director Rebecca Goldin. So, you mentioned this issue of statistical significance, and your work to try to demystify that for journalists and help them understand the difference between statistical significance and significance. I'm wondering, for you, what is the most frustrating mistake you see journalists making over and over again in reporting, and what do you think they can do to correct it?

Goldin : So, I wouldn't say that talking about statistical significance is such a big problem in journalism, because most people run scared of the term. I think they're much more likely to talk about the results of a study as suggesting that there's something causal going on, when there's really just a correlation. And that's the biggest mistake that I see all the time: someone has established a relationship between two things, and someone else interprets it to mean that one of these things is causing the other one. There's kind of a standard, funny example of this, where you say, "hey, your height is often correlated with your reading skills when you're young," and that is because people are growing at the same time that they're learning to read. But that doesn't mean that getting taller actually facilitates reading, or that reading helps you grow taller. That's a perfect example of something that's a correlation but not something that's causal. But when we get into the world of scientific studies, which are pretty complicated, where the design is confusing and people are trying to establish something that's going on behind the scenes, it's very easy to fall into the trap of believing you've proved a causal relationship, meaning that one thing you've measured causes something else that you've measured.
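[Editor's note: a rough illustration of the height-and-reading example in Python, assuming a made-up model in which age drives both variables and neither affects the other. All numbers are invented for the sketch.]

```python
import random

random.seed(1)

# Made-up model: for children aged 5 to 10, age drives both height
# and reading score; there is no causal arrow between the two.
ages = [random.uniform(5, 10) for _ in range(500)]
heights = [100 + 6 * age + random.gauss(0, 4) for age in ages]  # cm
reading = [10 * age + random.gauss(0, 8) for age in ages]       # test score

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# The correlation comes out strongly positive even though height and
# reading never influence each other: age is the common cause.
print(f"corr(height, reading) = {correlation(heights, reading):.2f}")
```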

Bailer : Do you think that journalists are more suspicious of the single-study result now? That they're not going to let just a single study make them cry from the rooftops that there's this phenomenon occurring. Or do you think it's taking more than just that to register for folks?

Goldin : That's a great question. I think that maybe there you want to make the distinction between what individual journalists think and what their editors think. Rosemary, you might also want to speak to how that process goes. But a lot of times journalists might have that point of view, but they're on a deadline, with a tight timeframe and their editor really wants something flashy to put on the screen and get people to click, so I think they're often constrained and not provided with the resources, either timewise or just logistically, to get hold of other studies to try to put something into context. I think that some journalists will get quotes from experts to put any new study into context of what people have already found. I think that journalists, yes, are getting that very much, but I'm not sure that that's really impacting how we read the news, what we actually see when it comes out.

Pennington : Yeah, I would say that's a really nice description of how the news process works. Because often you are going in, and the editor has a particular kind of expectation. And usually the question is "what's new here?", and if you can't talk about what's new really quickly, then the story gets spiked a lot of the time. So, I think it's a pressure… how do you help journalists navigate that? I mean, it sounds like you understand that process well, right? The idea that you need to get something out that's flashy, that's new. Is there something that you're doing at stats.org to help journalists better contextualize their reporting on, maybe, single-shot studies?

Goldin : Yeah, that's hard… I think the thing that we do that perhaps helps them with that is that we respond to them very quickly. So, if someone reaches out to us and says, "I want to know what you think about this study," a lot of times I'll respond and say, "but you should be aware that there's this literature out there…" and I'll bring up a couple of things that I've found in a quick search. But unless someone is really going to dive into it, I think you're basically limited to providing them the expertise for the particular study that they're looking at. I do talk to them a lot about reading the contextual information that an author running a study is generally obliged to put in. So, if you look at an introduction or a conclusion, you're going to get a lot of information that puts that scientific study in context. The introduction will usually say what some of the other pieces of research were that led them to ask the question that they asked, and then the conclusion will often talk about caveats and reasons why you might be concerned about how that particular study was designed. Aside from that, I wish we had more resources to give people the opportunity to put all of that into context. There are organizations trying to pull together scientific expertise, rather than statistical expertise, and really bring together scientists in specific fields so that a journalist could call up and quickly get a response. But STATS is really focusing on the statistical methods.

Bailer : We've asked in one direction what you're providing to journalists; what have you learned from journalists? What are some insights that you might not otherwise have, in terms of how you do your own business?

Goldin : That's great, I have learned so much, I can't even… I'm not sure I can answer this question quickly…

Bailer : Well, we have time.

Goldin : One of the things that I've really learned from journalists is that some of the smarts involved in providing a quantitative interpretation of something that you've read are not actually numerical smarts, so to speak. The kind of skepticism that I really want to encourage in journalists doesn't seem to depend on how much experience those journalists have had with math. So, I think people within the world of math, and educators, and people who really teach statistics and quantitative reasoning in a variety of ways, often think that the problem with journalists, and the problem with miscommunication of statistical ideas and of scientific research that uses quantitative processes or descriptions, is that they just don't have enough math, and if only they had more math… and if only they had more statistics, they'd suddenly be launched into this world that is much richer and more accurate and has many more shades of grey as well. But my experience is that, actually, the kind of skepticism that they really need is something that they can often develop with very little numerical strength, so even people who really struggle with the math can get much further than they ever thought by learning a little bit. And the level that they can get to, intellectually, is extremely high. Because there's a lot of work within journalism that has to do with your logical reasoning, with your ideas of how a study is designed, how you structure your argument. I think that has been a real eye-opener for me, just how much we can do with a clear logical process, rather than having some kind of strict numerical smarts, understanding exactly what a standard deviation is, or knowing the formula for a p-value, or anything of that nature. So, I've actually been super impressed at the level of journalism that goes on in the United States, and in Europe in some cases. I have really been impressed by how much people can uncover without the numerical stuff. So that's one of the things I've learned from journalists. I can point out so many others that are really important. I think I've learned to write better by talking to a lot of journalists. And I've learned to respect much more some of the aspects of the process by which people in the media interview and think about topics. How they come up with their topics, how they find the experts, the level of expertise that they expect when they talk to people. All of that has been very impressive.

Pennington : You're listening to Stats & Stories, and our discussion today focuses on the work of stats.org. Rebecca, I'm going to shift gears just a little bit. I saw a video featuring you that I've not been able to watch yet, and the title was "why math is a great way, or the best way, to make sense of the world." So, I'm going to put you on the spot and ask you to explain why math is a great way to make sense of the world.

Goldin : Well, okay, so, I think that we are living in a place where we often can't have a conversation. There are huge political disparities among people, and they just seem starker in this period of time. I think as we go through our lives, we could either live in a way where we're very happy with our own ideas and don't have to worry so much about anyone else's ideas, or we can try to argue really strongly about our ideas. And other people will say, "well, I just disagree with you," and then people can insult each other online, as they do. But I feel like where the conversation can really occur is when you start basing what you're talking about in something factual. So, for me, math and statistics really have a sense in which we can actually talk about what we mean, we can agree on something factual, and that agreement can be part of how we have a conversation. So, maybe you and I don't agree about how immigration law should be structured, but maybe we could agree on what the rates of poverty are among immigrants in the United States as opposed to the rates in the general population. Maybe we could agree about rates of crime in immigrant populations versus outside of immigrant populations, so we can at least have a conversation that isn't charged with incorrect information that doesn't really get us anywhere. We can talk about ways in which economic models depend on certain kinds of information about immigrant communities, or about immigrants who are working and how much they're contributing to the economy. So, if we can agree on something factual, it's often numerical, because it's aggregated information rather than "I know someone who did this." It's not "I know someone," but rather, "here's a story that describes everyone together." By putting all of the information together in some aggregate way, we can move away from something that's hugely biased toward something that, I think, is a more real conversation. So, for me, that's a very powerful force in how I think, and I'm hoping other people can access that.

Pennington : Richard, when we have guests, often asks them about the current environment and the circulation of fake news, and usually asks if it's harder to deal with factual information and have discussions about facts. You're talking about this idea of rooting conversations in facts, while also encouraging skepticism and making people approach data with a more skeptical eye. Do you find, given the explosion of fake news and misinformation in circulation, that it's harder to get people to buy into those two things? Sort of hanging on to the facticity of things, and then approaching things skeptically?

Goldin : Well, that's a good question, to try to make some kind of comparison between now and before now… I guess my take on a lot of these things regarding "fake news" is that the people who are most vulnerable to it are actually consuming very little news. Say you have a kind of storyline about vaccination, where you really believe that vaccination causes autism, and you go on websites that say this and you read that kind of stuff, but you're really not talking to anyone who thinks about this differently. I think that there are many people who are very much in their bubble, and unfortunately, I'm not accessing those people. And I think that we do have a problem with trying to get across those built-in barriers that come from how we consume news, where we can look at the stuff we already believe in and not try to stretch our brains toward something that is outside of our comfort zone. So, there are people who really aren't engaged in the conversation, and from that point of view, I think it's really difficult. But I also think a lot of people are consuming news that goes beyond just their bubble. And they're looking to understand why something is fake and why something isn't fake. From this point of view, the opening is really there. I think that no matter what, there are always going to be people who are just not open to a conversation. And on some level, I think, "Well, okay, maybe that's a lost cause." So, I'm not really worrying about communicating with people who aren't interested. And from that point of view, maybe you should also answer what your experience is. Do you feel like people can only consume news in some kind of narrow way, or are they able to change their minds ever, based on information?

Pennington : I think research has shown that media and communication have a really hard time changing people's behavior. They can change attitudes and opinions, but that's sort of with an onslaught of information. One story that challenges someone's worldview isn't going to make much difference, but maybe a cascade of reporting can.

Goldin : Right. That's a great way to view it. So, there sure are these huge cultural shifts that we've seen that I think media is very much a part of. Think about, for example, how people view smoking now versus how people viewed smoking thirty years ago. Now there's a lot more acknowledgement, in anything that you read, that it's actually dangerous to smoke. And that didn't use to be the case. It used to be that people tried to be really neutral about things and not say as much, or not imply as much, and that change probably does help create serious cultural shifts.

Bailer : So, one of the things that this conversation makes me think about is what you mentioned earlier: the importance of this formal structure for thinking about the world. And one aspect of that formal structure is accommodating the idea that you're willing to believe that you're wrong. If you're going to do anything statistical, there is a component that says there are competing states of nature, and I might be wrong about what I believe. And evidence might lead me to reject what I believe. And that's a formal framing.

Goldin : Yeah, absolutely. And what's interesting in statistics is not just that evidence might be misleading, but that sometimes you just can't get the evidence that you really want. And that's another part of it, right? So, I guess, as a working statistician, John, you're always down in the weeds of what that means for what you can really do with your data and what you can't do. A lot of times we want answers to questions that are simply not provided by our data. So, we take our data and we say, "Hey, well, we can't answer the question that we really want, but we can take this data and answer something slightly different." And then we answer those slightly different questions, and that's the best that we can do in the framework that is statistics.

Bailer : I agree. I have a follow-up question, and that is: what's the workflow of stats.org? Could you tell us what happens… you know, give us an example of what you've entertained. What does it trigger when you're contacted by a journalist? What gets processed, and then what do you return?

Goldin : So, that varies wildly. I have a number of journalists who will just reach out to me directly. And then we have a website, stats.org, where you can go in and submit a question as a journalist. Those questions automatically trigger an email to me and my colleague, so if I'm really busy, he'll know that, and he'll send it off to yet another statistician who can respond to it. We have a board of statisticians who volunteer to work with journalists directly, if needed.

When I receive an email from someone who is working on something, I will first read it as carefully as I can to try to understand what's needed. A lot of times someone will say something like, "Hey, I'm working on this story and this is the topic, but I'm wondering if you can help me?", and all I know is the topic and that they want help. So, in that case, my first response, which I get back to the journalist as soon as I can, is to say, "Can you explain more of what you're trying to actually do? What are you trying to write about? What are the questions that you have?", and, if they have a copy of the study, whether they can share it with me; otherwise I can try to find it through a library. Or perhaps they're just asking a more general question that's not even based on a study. So, there's first this information exchange that lets someone know, okay, a human being is reading your email, because of course they've submitted to some kind of innocuous-looking website. They're not saying, "Dear Rebecca…", they're just writing in. So we kind of establish this thing, and if I can answer the question, which is usually the case, then I'll take the time to answer it.

That might mean I take a couple of hours to read a research paper that they have some question about, and then I'll call them and we'll walk through some aspect that they're interested in. Or it could be that they want me to check a calculation. I've had journalists write in to say, "I want to say that this is such-and-such percentage, did I do the calculation right?", so let's just compute the percentage. Or I've had things that are way, way more complicated, like trying to understand what the authors are saying in a really complicated study. And a lot of times they have their own data. That's actually really new and really fun, and I don't know, Rosemary, if you've also found that people are using data much more. But they really need to analyze it, and they don't know how to analyze it, or they don't know what they could analyze, or what they should ask as follow-ups about their data. So, sometimes I'll go through somebody's Excel chart with them. We'll be on the phone a couple of hours and we'll actually sort through some of their data. And sometimes I'll whip something up in Excel with their data, send it back to them, and walk them through what I was doing with it. Common questions include things like, "How do I take the average in this situation?" So, they're interested in an average, but they're not really sure how to take it, because they might have a bunch of numbers where each number is an average itself, and they're trying to figure out how to do what we call a weighted average. Or other times they have some data and they really just don't know what would be useful descriptions of that data. So, there's a variety of things, from what I would call "elementary school math" all the way through what is probably taught in a graduate statistics course. And those questions can be vastly different from day to day, or week to week.
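[Editor's note: a minimal sketch in Python of the weighted-average situation Goldin mentions, where each reported number is itself an average over a different group size. The school data here are made up for illustration, not from the episode.]

```python
# Hypothetical data: each school reports an average test score,
# but the schools have very different numbers of students.
schools = [
    {"students": 40,  "avg_score": 82.0},
    {"students": 400, "avg_score": 71.0},
    {"students": 160, "avg_score": 75.0},
]

# Naive approach: average the averages, ignoring group sizes.
plain_mean = sum(s["avg_score"] for s in schools) / len(schools)

# Weighted average: weight each school's average by its enrollment,
# which recovers the average over all students combined.
total = sum(s["students"] for s in schools)
weighted_mean = sum(s["students"] * s["avg_score"] for s in schools) / total

print(f"plain mean of averages: {plain_mean:.1f}")    # 76.0
print(f"weighted average:       {weighted_mean:.1f}")  # 72.8
```

The gap between the two numbers is exactly what a journalist averaging pre-averaged figures can miss: the small high-scoring school pulls the naive mean up, while the weighted average reflects the many more students at the lower-scoring school.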

Pennington : Well, that's all the time we have for this episode of Stats & Stories. Rebecca Goldin, Director of stats.org, thank you so much for being here today.

Goldin : Thank you.

Pennington : Stats + Stories is a partnership between Miami University's Departments of Statistics, and Media, Journalism and Film, and the American Statistical Association. You can follow us on Twitter or iTunes. If you'd like to share your thoughts on the program, send your email to Statsandstories@miamioh.edu. Stats & Stories: where we discuss the statistics behind the stories, and the stories behind the statistics.