Reading, Writing and Risk Literacy? | Stats + Stories Episode 64 / by Stats Stories

(photo credit: David Ausserhofer)

Gerd Gigerenzer is Director of the Harding Center for Risk Literacy at the Max Planck Institute for Human Development in Berlin and partner of Simply Rational - The Institute for Decisions. Gigerenzer has trained U.S. federal judges, German physicians, and top managers in decision making and understanding risks and uncertainties.


Rosemary Pennington: From the moment we wake up to the moment we go to bed, we face risk. Risk from financial markets, from public health crises, from merely being human. Learning how to navigate lives filled with risk, and its sibling uncertainty, has fueled philosophers, writers and scientists alike. It's also the focus of this episode of Stats and Stories, where we explore the statistics behind the stories and the stories behind the statistics. I'm Rosemary Pennington. Stats and Stories is a production of Miami University's departments of Statistics and Media Journalism and Film as well as the American Statistical Association. Joining me in the studio are our regular panelists John Bailer, Chair of Miami's Statistics department, and Richard Campbell, Chair of Media Journalism and Film. Our guest today is Gerd Gigerenzer, an expert on uncertainty and risk. Gigerenzer is the director of the Harding Center for Risk Literacy at the Max Planck Institute for Human Development in Berlin and partner of Simply Rational - The Institute for Decisions. He's also the author of a number of books, including Calculated Risks, Gut Feelings: The Intelligence of the Unconscious, and Risk Savvy: How to Make Good Decisions. Thank you so much for joining us today, Gerd.

Gerd Gigerenzer: Yeah, and I'm honored to be on your program.

Pennington: How do you define risk in the work you do?

Gigerenzer: So, the term risk has many meanings. One is to distinguish situations of risk from those of uncertainty. A situation of risk would be playing the lottery or roulette, where you know the entire sample space and nothing unexpected can ever happen. And uncertainty covers those situations which lie outside the realm of probability theory: new things may happen, and we don't know all the alternatives or all the consequences. An example would be whom to marry - there can be surprises. Or where you are going to invest your money. And I work on both sides of these topics. For risk, it's very important to understand probability theory and statistics; for uncertainty, it's very important to understand smart heuristics, or rules of thumb.

John Bailer: So, you've been an advocate for risk literacy. You know we often hear discussions of general literacy or quantitative literacy. I think you're one of the few people I've heard describe risk literacy. Could you talk a little bit about why you believe that's important and what that is?

Gigerenzer: I think risk literacy is today as important as the ability to read and write. Literacy in the narrow sense was the task of the last century: we have managed to teach almost everyone how to read and write, more or less. But we have not managed to teach risk literacy - that is, to deal with and understand numbers, and also to understand and control the emotions that relate to risk, for instance our anxieties and fears.

Pennington: Why do you think it is, that that has been…under explored is not the right word, but just why are we not paying enough attention to this issue of risk literacy?

Gigerenzer: Rosemary, this is a good question. Who would dream about that!

(Collective laughter)

Gigerenzer: That this would be the case. So, the facts are: we teach our children the mathematics of certainty - algebra, geometry, trigonometry - beautiful theories that are of not much relevance for the rest of our lives, at least for most people, compared to the mathematics of uncertainty, that is, statistical thinking, which would be of much importance to everyone in society. It ranges from understanding what a thirty percent chance of rain means, to what a DNA test means, to what a positive mammogram means. And in my research I have shown again and again that not only do most patients not understand health statistics, but most doctors do not understand health statistics in their own field.

Richard Campbell: You talked a little bit about the tension between the rational, reasoned side and the emotional side - listening to your gut. Can you give us some idea of how you balance those two? You say the best results sometimes come from considering less information and listening to your gut, but clearly there has to be some kind of balance. Can you talk a little bit about that, maybe give us an example?

Gigerenzer: OK. Let's first say what gut feelings are. I use the term synonymously with intuition, and intuition is based on years and years of experience. It has two major characteristics: you quickly feel what you should do or not do, but you cannot explain why. So, a good expert in sports or in science has intuitions. I could not do my research without intuitions - and then I also need methods like statistics to find out whether the intuitions are correct. That should be the normal way to look at intuition. But we live in a society where many people are suspicious of intuition and even blame every kind of disaster on it. Intuition is not a sixth sense, it's not something arbitrary, and it is not something only women have - men have intuitions too. And if you work with experienced sports players: if they didn't listen to their intuition, they wouldn't get anywhere.

Bailer: Well it sounds like, as you describe, an intuition, it's just the synthesis of data. That's what your experience effectively reflects, right?

Gigerenzer: It has one specific feature: it's different from conscious thinking in that you cannot explain it. We live in a society which is moving away from achievement and performance toward more and more justification - after the fact. So, you may run your own company into trouble, but as long as you can explain the decision, usually it is someone else who has to go. Whereas if you have good intuitions and your company profits from them, that's considered suspicious - certainly in large companies, as opposed to family-run companies, which really have a vision of performance. I've worked with large, publicly traded companies, and in direct interviews, when I asked the leaders, up to the executives, how often an important professional decision that you or your group makes is, in the end, a gut decision, the average answer is: in fifty percent of the cases. And I emphasize "in the end," because first you look at the data, and at more data - but the data doesn't always tell you what you should do. And if you then draw on your own experience and feel you should not do this, that's a gut decision.

Campbell: How do you respond to, or reconcile, the sort of anti-data, anti-science distrust that some people have? You talked about blaming intuition, but certainly in the States there is a large movement of distrust in things like global warming and other scientific findings. How do you address the distrust that is often directed today against data and science?

Gerd Gigerenzer: First, we have less of that in Germany than you have in the US, and the best remedy would be better education: education for everyone, education in statistical thinking - about evidence - and also in psychology. That means asking: why do people not believe in global warming? Most likely because their friends don't believe in it, and if you start reading and trusting the data, then you lose your friends. These mechanisms are in place. We need to mentally vaccinate young people early, because this will be their future. It's a part of enlightenment which we still don't have: learning to stand up for scientific values.

Pennington: You're listening to Stats and Stories, where we discuss the statistics behind the stories and the stories behind the statistics. Today we're focusing on risk and uncertainty with expert Gerd Gigerenzer of the Max Planck Institute. You are someone who has made communicating risk and uncertainty well part of your mission, and we were just talking about this issue of vaccinating society so that it understands science. What role do you think journalists play in helping people embrace statistical thinking? As someone who has been a journalist and who went through grad school, I think stats and statistical thinking can be really difficult to wrap your brain around when you're first thrown into it, though once you're in it, it can come pretty easily. And yet, particularly in the US, there seems at moments to be this distrust of it, as Richard suggested. How much of a role do you think journalists have to play in either propagating that or changing it?

Gerd Gigerenzer: Many journalists have no education in risk literacy. In my experience - I train journalists myself, every year - they need to learn the basics. And in my opinion, what you need as a journalist is not so much: you don't need a degree in statistics, but you do need to understand a few basic concepts. Let me start with one: the difference between an absolute risk and a relative risk. In the U.K., every other year there is a so-called contraceptive pill scare, and the most famous one went this way. The U.K. Committee on Safety of Medicines declared that a study had shown that women who take a third-generation pill double their risk of thrombosis - a hundred percent increase. So, what do you do? If you read "a hundred percent," that's as certain as it can be, isn't it? Many British women thought so, panicked, and stopped taking the pill, which led to unwanted pregnancies and abortions. So how much is a hundred percent? The study had shown that out of every seven thousand British women who took the previous-generation pill, one had a thrombosis, which increased to two among the women who took the third-generation pill. So, the increase from one to two out of every seven thousand is an absolute risk increase of one in seven thousand. You can also communicate the same result as a hundred percent increase, from one to two - that's the relative risk increase. Relative risks frighten people, and in this case this single piece of news led to an estimated thirteen thousand additional abortions in England and Wales during the following year. That's the simple distinction between a relative and an absolute risk, and you could teach it to any teenager - and in this case, those who had the most additional abortions were the teenagers. They don't learn this simple distinction in school. They may learn all kinds of things that are not so relevant to our lives, and here is a good example of how lack of statistical education causes not only panic and anxiety but also abortions.
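The arithmetic behind the pill scare fits in a few lines. The sketch below uses the counts quoted in the interview (one versus two cases per seven thousand women); the variable names are my own:

```python
# Third-generation pill scare: thrombosis cases per 7,000 women.
baseline_cases = 1   # previous-generation pill
new_cases = 2        # third-generation pill
population = 7_000

# Relative risk increase: (2 - 1) / 1 = 1.0, reported as "100 percent".
relative_increase = (new_cases - baseline_cases) / baseline_cases

# Absolute risk increase: 1 extra case per 7,000 women.
absolute_increase = (new_cases - baseline_cases) / population

print(f"Relative increase: {relative_increase:.0%}")        # 100%
print(f"Absolute increase: 1 in {population} "
      f"({absolute_increase:.5f})")
```

The same data, two very different-sounding numbers: the relative figure ("100 percent") frightens, while the absolute figure ("one more case in seven thousand") informs.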

Bailer: So, one of the things you mentioned as a topic you would introduce in school is the idea of absolute risk versus relative risk - essentially, the importance of understanding the baseline against which change is occurring.

Gigerenzer: Right.

Bailer: Can you talk about what are some of the other things that you think would be important lessons that would be taught early?

Gigerenzer: OK. So, relative and absolute risk is an easy example. Another thing is the ambiguity of single-event probabilities. Let me give you an example. If you see on your smartphone that tomorrow there is a thirty percent chance of rain, what does that mean? We have done a large study in many countries and asked pedestrians what it means. I live in Berlin; most Berliners think a thirty percent chance of rain means that it will rain tomorrow for thirty percent of the time, that is, seven to eight hours. Others believe it will rain in thirty percent of the region - that is, most likely not where they live. Most New Yorkers we asked think the Berliners have no idea: it means something else entirely, namely that it will rain on thirty percent of the days on which this announcement is made. That is, most likely not at all tomorrow. So, are people stupid? No.

(Collective laughter)

Gigerenzer: I'm saying this because many of my colleagues in behavioral economics or psychology think so.

Pennington: Right.

Gigerenzer: So, it is again a very simple rule that everyone could learn. So always ask, percentage of what? And the reference class is not defined in these statements. So, it could be time, it could be region, it could be days, it could be something else. One woman in Athens told us, I know what thirty percent chance of rain means - three meteorologists think it rains and seven don’t.

(Collective laughter)

Gigerenzer: This problem has a very simple solution: teach everyone to always ask about the reference class. There's nothing wrong or mis-wired in our brains. Now, the US has a program called nudging, which assumes that we are all risk illiterate, that there is no hope for us because risk illiteracy is stable like a visual illusion, that it causes massive damage to wealth and health, and that therefore the government needs to step in and nudge us, like sheep, to where we should really be. This is not my vision of a society. We should teach children and adults to become risk literate rather than nudge them, and it's not a big thing - it's easy. With single-event probabilities, you just give a few examples and people start seeing the point. In medicine it's very often the case that you get single-event probabilities. Here's a case. A friend of mine is a psychiatrist in Virginia, and he used to prescribe his patients the antidepressant Prozac and always told them that it can have side effects in the sexual domain. So he told them: if you take this pill, I need to tell you that there is a certain chance of becoming impotent, or losing your libido - some problem with your sexual life. Let's assume it's a twenty percent chance. The patients were not happy to hear that, but they also didn't ask questions, because they don't know how to ask questions about statistics. When he learned about our research on risk communication, he noticed he had been giving them a single-event probability: you have a twenty percent chance of a problem. The patients can think anything about it. So, he changed his communication to frequencies. A frequency is always based on a reference class. He said: out of every ten of my patients, two have a problem of a sexual nature. That made his patients less unhappy, because if you are a person who always lives on the sunny side of life, you think: it's the other two, it's not me. He then asked patients how they had understood it before. Many answered: oh, I thought it meant that in two out of every ten sexual encounters something goes wrong.
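The psychiatrist's reframing is purely a change of format: attach an explicit reference class to the probability. A toy sketch of that translation (the function name, the default sample size of ten, and the rounding choice are all my own):

```python
def as_frequency(probability: float, reference_class: str, n: int = 10) -> str:
    """Restate a single-event probability as a natural frequency
    tied to an explicit reference class."""
    k = round(probability * n)  # expected count among n members of the class
    return f"{k} out of every {n} {reference_class}"

# "You have a 20 percent chance of a sexual side effect" becomes:
print(as_frequency(0.20, "of my patients"))
# -> 2 out of every 10 of my patients
```

The point is not the arithmetic but the forced choice: to produce the sentence at all, the speaker must name the reference class ("of my patients", not "of your sexual encounters"), which is exactly the ambiguity the single-event statement hides.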

Pennington: You're listening to Stats and Stories and today we're talking with Gerd Gigerenzer, director of the Harding Center for Risk literacy at the Max Planck Institute for Human Development.

Campbell: One of the things that comes through in listening to you, and in watching some of your videos, is that the notion of telling stories to make people understand data is really important to you. Can you talk about your method - just sort of explain what you're doing with us here today in terms of using storytelling to help people understand statistics and data better?

Gigerenzer: That is an example. So, I work with doctors, with medical departments, to help them educate medical students in statistical thinking - and I emphasize statistical thinking, not statistics; the point is learning to think. Medical students learn biostatistics, which goes in one ear and immediately out the other.

(Collective laughter)

Gigerenzer: The reason is that these biostatisticians do not connect the statistics with the content. And one way to connect them is to tell a story, like the one about the contraceptive pill or the thirty percent chance of rain. These are stories that stick in your mind, and they anchor the concept; from there you can generalize and explain the underlying principle - for instance, that this is a single-event probability, which by definition has no reference class. People think in reference classes, so they make one up, and everyone understands something different. Like the psychiatrist, who was thinking about his patients - that's a reference class - but the patients don't think about other patients, they think about themselves. So, this is why stories are important, but you shouldn't stop at the stories; you need to name the principles so that people can generalize them.

Bailer: I think it's a powerful idea to use stories to anchor concepts. That's really effective. You know when…I've read some of your work and you've talked about these ideas of the basics, you also talk a lot about the problems and the challenges that patients have when they're receiving information about the results of screening studies and that it's something that's fundamentally misunderstood by patient and physician alike. So, you know if you do get a positive result from a mammogram or some other screening test, can you talk a little about how that's often misunderstood and what are better ways of describing it?

Gigerenzer: OK. Cancer screening is a really interesting area, because it's a situation where, by definition, you have no ailment, no disease - you're healthy - so you could actually take time to think, as opposed to, say, after a car accident. A false positive is an important concept here. Every test has false positives, and therefore no positive test result is certain, because it could be a false positive, a false alarm. Mammography is not a very good test; it produces lots of false positives. And therefore, if a woman tests positive in a screening mammogram, it is more likely that she does not have breast cancer than that she has it. The usual way this is calculated is with Bayes' rule, named after Thomas Bayes, who allegedly invented the rule - but we don't really know.

(Collective laughter)

Gigerenzer: It's uncertain. Anyhow. Bayes' rule works with conditional probabilities, and conditional probabilities are hard for most people to understand, so we have developed a technique so that doctors and patients can understand a test result. So, shall I do a kind of experiment with the three of you?

(Collective Sure!)

Gigerenzer: OK. Use mammography screening. I will first give you the information as it's taught today in conditional probabilities and I hope I will confuse you.

(Collective laughter)

Pennington: I'm sure you will.

Gigerenzer: Then I will give you the same information in natural frequencies - that's the technique that we developed. I hope you will then see through it. Are you ready?

(Collective ‘Yes”)

Gigerenzer: Good Ok. Assume you conduct a mammography screening. What you know about this situation is that there is a one percent chance that the woman will have breast cancer. If a woman has breast cancer there's a ninety percent chance that she will test positive. If she does not have breast cancer, there is a nine percent chance that she will test positive. In the words of medicine, the prevalence is one percent, the sensitivity is ninety percent and false positive rate is nine percent. Ok? That's what you know. Now here's a woman who just tested positive. It's screening, you know nothing else about her and she wants to know from you, Doctor, do I have breast cancer or how likely is it? Ninety nine percent? Ninety? Fifty? Please tell me! What do you tell this woman? So, if my experiment works, your minds are now confused, you have a fog in your mind, you don't really know what to do with ninety percent, nine percent and so on. Is it true?

Pennington: It's true for me I'm not sure about John. (Laughs)

Campbell: It's true for the journalist. Let's talk to the statistician.

Bailer: The statistician teaches Bayes theorem and uses this kind of example so I'm going to recuse myself.

Pennington: Abstained. You've confused Richard and I though.

Gigerenzer: When I do this with doctors - in this case gynecologists - they should know the answer.

Pennington: Yeah absolutely.

Gigerenzer: They are confused. Most of them think it's not certain, but something like ninety or eighty percent sure. OK. Now here's the solution, and it works the same way as before: there we replaced relative risks with absolute risks; now we replace conditional probabilities with natural frequencies. Natural frequencies means the following: you start with a concrete number of people, not with a single person. Say a hundred women go through screening. Now we translate these hard-to-understand conditional probabilities into absolute numbers - that's the only trick. So, there are a hundred women. We expect that one of them has cancer; that was the prevalence. And this one will likely test positive, because of the ninety percent sensitivity. Among the ninety-nine who do not have cancer, we expect another nine to test positive. So, we have nine plus one who test positive. How many of these ten women actually have cancer? Now most people see the answer: one out of ten.
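Gigerenzer's translation can be checked with the numbers from the experiment above. This sketch rounds to whole women, as he does in the spoken version, and then compares with exact Bayes' rule (variable names are my own):

```python
# Screening numbers from the example: prevalence 1%,
# sensitivity 90%, false-positive rate 9%.
prevalence, sensitivity, fpr = 0.01, 0.90, 0.09

# Natural-frequency version: think of 100 women, rounded to whole people.
n = 100
with_cancer = round(prevalence * n)          # 1 woman has cancer
true_pos = round(sensitivity * with_cancer)  # she tests positive
false_pos = round(fpr * (n - with_cancer))   # 9 of the 99 healthy women do too
print(f"{true_pos} in {true_pos + false_pos} positives have cancer")  # 1 in 10

# Exact Bayes' rule gives nearly the same answer:
posterior = (sensitivity * prevalence) / (
    sensitivity * prevalence + fpr * (1 - prevalence))
print(f"P(cancer | positive) = {posterior:.3f}")  # about 0.092
```

The natural-frequency count and the conditional-probability calculation agree; the frequency version simply makes the "ten positives, one with cancer" structure visible instead of hiding it inside the formula.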

(Collective ‘Yeah’)

Gigerenzer: See? And those are natural frequencies. To give you an idea, I did these experiments with real gynecologists in continuing medical education, most in their forties and fifties, who need to earn points to renew their license. At the beginning of a ninety-minute training session, I gave them the problem in conditional probabilities, as it is taught, with only four possible answers, spread as far apart as possible: a ninety percent chance that she actually has cancer, eighty-one percent, ten percent, or one percent. The answers were scattered across all the alternatives, mostly in the eighties and nineties. So, it shows you that medical education is not working. And these are doctors who should know this - not urologists, but gynecologists - and they don't know the answer. Then I taught them natural frequencies, and at the end of the ninety minutes I gave them the old problem again, in conditional probabilities, but they had now learned to translate it into natural frequencies. Then everything changed, and eighty-seven percent of the one hundred sixty gynecologists now understood. There were a few hopeless cases.

(Collective laughter)

Gigerenzer: And the same goes for teaching many, many more things - so it can be done, but it is not being done. We work with the Charité, the major hospital in Berlin, to change the education. And doctors who have studied in the US are equally ignorant about statistics.

Bailer: So, in your book Risk Savvy, one thing you wrote was: if reason conflicts with a strong emotion, don't try to argue; enlist a conflicting and stronger emotion. That was a very powerful statement when I read it. Does this suggest that arguments based on data and evidence will never cause change? There was a little bit of despair I felt when I read that.

Gigerenzer: It does not actually. That was a particular story. The context was 9/11, and the analysis about what Americans did after 9/11 so we know that many of them stopped flying.

Pennington: Yeah.

Gigerenzer: And the question was, did they stay home or did they jump into their cars? I looked at the transportation statistics and found that driving increased for about twelve months, by up to five percent, and it increased most on long-distance roads. So, a certain proportion of people took their cars instead. That caused an estimated 1,600 Americans to lose their lives on the road in their attempt to avoid the risk of flying. That was the context. The lesson, of course - I call this the dread risk: people fear situations where many people die at one point in time. It is very difficult to elicit the same anxiety when as many or more people die spread out over the year.

Pennington: Yeah.

Gigerenzer: Smoking, or driving. So, it's not about dying, it's about dying together, socially - that's the fear, and it's an unconscious fear. When I published Risk Savvy, a US doctor wrote to me: Dear Dr. Gigerenzer, I gave your book to my wife, because she has this anxiety about flying, and I assured her that not many people die - but I had no success with that. In his wife's case, obviously, rational arguments didn't help, and I think that's true for many couples. So I suggested to him: if reason doesn't help, then invoking a competing, stronger emotion often may - for instance, concern for their children. You could point out that driving long distances because of anxiety about terrorists puts your own children at risk. Do you want that? Let the parental emotion outweigh the anxiety. That's the context.

Pennington: Well Gerd thank you so much for being here today. That's all the time we have for this episode. It has been a really interesting conversation.

Gigerenzer: Yeah, I enjoyed it very much.

Bailer: Thank you.

Campbell: Thank you very much.

Pennington: Stats and Stories is a partnership between Miami University's departments of Statistics and Media Journalism and Film and the American Statistical Association. You can follow us on Twitter or iTunes. If you'd like to share your thoughts on the program send your e-mail to statsandstories@miamioh.edu And be sure to listen for future editions of Stats and Stories where we discuss the statistics behind the stories and the stories behind the statistics.