Significance

Suffragette Statistics | Stats + Stories Episode 318 by Stats Stories

The work of suffragettes in both the United Kingdom and the United States has been immortalized in textbooks, as well as in movies and TV. The women activists who helped women gain the right to vote are often portrayed as heroes and radicals. What's gotten less attention is the connection between the statistical world and the suffragette movement. That's the focus of a recent issue of Significance Magazine, as well as this episode of Stats and Stories with the magazine’s guest editor, Dr. Altea Lorenzo-Arribas.


The Algorithm of Love | Stats + Stories Episode 314 by Stats Stories

According to the Pew Research Center, three in ten US adults say they've used a dating app, with Tinder, Match and Bumble being the apps most likely to have been tried. Pew's research has also found that one in ten partnered adults in the US met their significant other on a dating app or site. Dating app success is the focus of this episode of Stats and Stories with guest Dr. Liesel Sharabi.


The Dark Statistical Story of the World Cup | Stats + Stories Episode 295 by Stats Stories

Women’s World Cup action in Australia and New Zealand has wrapped up, and Spain has been crowned champion. After players and fans headed home, residents were left to clean up after them. Hosts of such tournaments are also left to tackle the human rights implications of hosting an event that massive. The human rights impacts of something like the World Cup are incredibly hard to measure, and that is the focus of this episode of Stats+Stories with guest Dr. Megan Price.


Listening Before Communicating Risk | Stats + Stories Episode 237 by Stats Stories

What do farmers in Kenya, fishers in the Philippines and teenagers in Boston have in common? They all need to balance risks when making decisions, ranging from seed choice after considering predicted rainfall, to life vests and the chance of shark attacks, to social distancing and emotional impacts. Understanding risk is the focus of today’s episode of Stats+Stories with guest Tracey Brown.


The Data Economy | Stats + Stories Episode 213 by Stats Stories

Do you remember the first time you saw a prompt in social media asking about a product you were searching for on some other online platform? How about the first time you received coupons sent from your local grocery that incentivized buying your favorite consumable items? Today’s episode of Stats+Stories focuses on the origin, expansion, and future of the data economy with guest Timandra Harkness and guest host Brian Tarran.


The Right to Be Left Alone | Stats + Stories Episode 205 by Stats Stories

With the ubiquity of technology in our lives have come concerns over privacy, security, and surveillance. These are particularly potent in relation to what's come to be called Big Data. Navigating this complicated terrain is a constant conversation in some sectors of the tech industry, as well as academia. And it's the focus of this episode of Stats and Stories with Christoph Kurz.


Building Back Better | Stats + Stories Episode 195 by Stats Stories

Over the course of the last year, statistics have framed our lives in very obvious ways. From COVID cases to unemployment rates, stats have helped us understand what’s happening in the wider world. As we contemplate how to “build back better” in the aftermath of the pandemic, official statistics could help guide our way, at least, that’s what the authors of a recent Significance Magazine article think. That’s the focus of this episode of Stats and Stories with guest Paul Allin.


Everything Makes Sense with Statistics, Right? | Stats + Stories Episode 176 by Stats Stories

Tim Harford is an economist, journalist and broadcaster. He is the author of "Messy" and the million-selling "The Undercover Economist". His newest book, "The Data Detective", was released in the U.S. and Canada earlier this month. Harford is a senior columnist at the Financial Times and the presenter of Radio 4's "More or Less", the iTunes-topping series "Fifty Things That Made the Modern Economy", and the new podcast "Cautionary Tales". Tim has spoken at TED, PopTech and the Sydney Opera House. He is an associate member of Nuffield College, Oxford, and an honorary fellow of the Royal Statistical Society. Tim was made an OBE for services to improving economic understanding in the 2019 New Year Honours.


The Recent (Regrettable) Rise of Race Science | Stats + Stories Episode 173 by Stats Stories


Angela Saini is a science journalist, author and broadcaster. She presents radio and television programmes for the BBC, and her writing has appeared across the world, including in New Scientist, Prospect, The Sunday Times, Wired, and National Geographic. In 2020 Angela was named one of the world's top 50 thinkers by Prospect magazine, and in 2018 she was voted one of the most respected journalists in the UK. Her latest book, Superior: The Return of Race Science, was published in May 2019 and was a finalist for the LA Times Book Prize and the Foyles Book of the Year.


Episode Description

Race science – the belief that there are inherent biological differences between human races – has been “repeatedly debunked,” in the words of the Guardian, and yet, like a pseudo-scientific hydra, it rears its head every so often. The return of scientific racism is the focus of this episode of Stats and Stories, where we explore the statistics behind the stories and the stories behind the statistics, with guest Angela Saini.

Full Transcript

Rosemary Pennington: Race science, the belief that scientific study will uncover inherent biological differences between human races, has been repeatedly debunked, in the words of the Guardian, and yet, like a pseudo-scientific hydra, it raises its head every so often. What's also known as scientific racism has framed studies of human intelligence and attractiveness, and most recently emerged in conversations around genetics. The resurgence of scientific racism is the focus of this episode of Stats and Stories, where we explore the statistics behind the stories and the stories behind the statistics. I'm Rosemary Pennington. Stats and Stories is a production of Miami University's Departments of Statistics and Media, Journalism and Film, as well as the American Statistical Association. Joining me are our regular panelists John Bailer, chair of Miami's statistics department, and Richard Campbell, professor emeritus of media, journalism and film. Our guest today is Angela Saini. Saini is a science journalist, author and broadcaster. She produces radio and TV programs for the BBC, and her writing has appeared in such publications as New Scientist, Prospect, The Sunday Times, Wired, and National Geographic. In 2020 Saini was named one of the world's top 50 thinkers by Prospect magazine, and in 2018 she was voted one of the most respected journalists in the UK. Her book Superior: The Return of Race Science was published in May 2019 and was a finalist for the LA Times Book Prize and the Foyles Book of the Year. Angela, thank you so much for joining us today.

Angela Saini: It's my pleasure. Thanks for having me.

Pennington: I was wondering if we could start our conversation with you describing kind of what historic race science was or is and how that compares to sort of its modern iteration.

Saini: Well, I think a lot of people imagine the racial categories that we use now around skin color to have been around forever. But of course they haven't been; they were inventions, and the time that they were invented was around the time of the Enlightenment, when scientists and naturalists in Europe were looking at the natural world and thinking about how to classify it. And, as well as classifying animals and plants, they also thought about classifying us. They thought, you know, this cultural diversity that we see all around the world, all these differences that we see, maybe they rise to the level of different breeds or different species of human being. And that's where the idea of race, as we use it now, came from. That's not to say that people didn't think about human difference before; of course they must have. But these racial categories, black, white, yellow, red, you know, these very broad racial categories at least, that's around the time that they were invented. We know now, though it's always been true, that there are no natural dividing lines between humans; we are one human species. And we are very homogeneous as a species, more homogeneous than any other primate; chimpanzees have more genetic diversity than humans do. And so given that there are no natural dividing lines between us, any attempt at categorization is by its nature likely to be arbitrary; it can't be anything else. It has to depend on whoever is doing the categorizing and, you know, what's important to them. And the fact that they landed on skin color is as arbitrary as anything else, because at the time there were lots of different ways of categorizing people: there were some people who thought there were a few races and people who thought there were thousands of races. Traditionally the word race hadn't been used very much, but the way it had been used prior to that was to refer to a family or tribe.
So if you're using it by that definition, which in some ways is a more coherent definition, because at least within a family you have some genetic similarity, more than you do at a continental level, then there can be millions of races, you know, logically by that standard.

But it was skin color that kind of became popular, and that European and American scientists ran with for hundreds of years. And it was given meaning really because it became one of the ingrained assumptions that formed the science of human difference. So there were lots of assumptions at the time, including for instance that women were not the intellectual equals of men, which is why women in Europe were excluded from many universities, and certainly all the scientific academies of Europe, from the Enlightenment onwards, because women were just seen to be an intellectually separate category. So these assumptions, as arbitrary and as political and unscientific as they were, came to form the basis, like I said, of the science of human difference. And that continued for hundreds of years, in fact well into the 20th century; there are still many people who think in these terms now. And that's all that science is, you know, there really isn't anything else.

Richard Campbell: Could you talk a little bit about the notion of essentialism, because I think some of our listeners probably don't know what that is, and also how some of these studies got past the early editorial stage even more recently? Because, as you pointed out, the starting assumption is that populations are essentially different, and that doesn't seem to get interrogated at the beginning of some of this work.

Saini: Well, essentialism really cuts to the heart of this, because it says that there are biological qualities that certain groups or certain populations have that other populations don't have. Historically, people have tried to make inferences based on their assumptions around these essentialisms. So for example, that the Western world is as economically prosperous as it is, as it was for a couple of hundred years at the time these ideas were being developed, because of some essential quality that white Europeans have that other people in the world don't have, which is a very ahistorical way of thinking too, because as we know, if you look through the course of human history, Europe's dominance, for as long as it lasted, is just one part of human history; other cultures and other civilizations have risen and fallen, and you know Western European civilization will go the same way; that's how history works. But what essentialism does is try to explain society, and what we can see out there in the world, through nature, and say that this isn't historical, this isn't political, this isn't social, this has nothing to do with how we live or how we choose to treat each other; this is because of some qualities that we have within ourselves. And it's an argument that remains powerful to this day. There are many, particularly on the right, and by this I mean the far right, the alt-right, who want to be able to make these claims, because if they can, then we don't need to do anything about inequality as we see it in the world, whether that's gender inequality or racial inequality, or even class inequality. There are still attempts to reintroduce class into this equation as it existed in the early 20th century; a lot of the British eugenics and race science movement, for example, was actually about class.
And there were attempts to state that, for example, poor people were genetically, biologically inferior to richer people, and that's why you had generational poverty. And there are some people even trying to revive that now in the 21st century, believe it or not.

John Bailer: You know, when I was reading your book, one thing that really struck me was the issue of the cultural and political context in which research is done, and how that shapes and frames the way we've looked at problems. This seems like it echoes throughout history as part of this investigation. Can you talk a little bit about that?

Saini: Yeah, absolutely. I mean, I studied engineering, and I was certainly trained within a system that taught me that what we do when we do science or engineering or mathematics or whatever is objective, that we sit apart from society, that we are above politics. And the problem with that is that we forget that much of the science, including those very early assumptions that I mentioned earlier, was very much rooted in the politics of the time; it was informed by the politics of the time. And because those assumptions weren't interrogated enough, because of that politics, that's why mistakes continued for so long. And this is how mistakes happen; even fallacious orthodoxies can build within the sciences for a very long period of time, because nobody questions these basic assumptions, because they assume that everybody who's doing this is perfectly objective, so there can't be any problems here. And that is something I think we need to challenge. I trust the scientific method; I really do think it is one of the best ways we have of understanding the universe and understanding ourselves as humans, but it's limited by the fact that we are human and that we have these biases and prejudices. We are informed by the world around us. And that shapes the questions that we ask, the limits to what we can imagine. You know, for example, it's only relatively recently that scientists have started challenging the idea that there is a gender binary, you know, to think outside those boxes.
And that's because it literally wasn't within the purview of their imagination that there could be anything else out there. And society in that sense has challenged it, because everyday people, in their discomfort with these gender categories and how they feel about these things, have challenged them politically, and that has entered into the sciences, and then scientists start asking these new questions. So we have to accept that. And if we can accept it and understand it and engage with the fact that science sits within society, that it's embedded within cultures, then I think we can get closer to objectivity, because then we can understand exactly what it is that we're looking at.

Bailer: Just as a quick follow-up: I remember years ago when I was reading Stephen Jay Gould, some of the work that he had written, that was when I had the epiphany about that kind of cultural context in which science is done. And I've found myself thinking, oh my, how is the world in which I live now, and the culture in which I live now, shaping the way that I ask questions, or how I look at problems, or how I think about interpreting results and analyses? That's an important and challenging consideration as we do our work.

Campbell: I was going to just follow up on that. How has this sort of politicization of science played out during the COVID crisis? I mean, this has really been a remarkable thing. Is this a new phenomenon, or is it a new stage? Just the mask-wearing thing in the States here, you know, divided politically, is sort of incredible to me, and the stories that are emerging of people dying in the Midwest who refuse to admit they have COVID, insisting they have something else. Is this a phenomenon that you've seen before in studying the history of this?

Saini: It's always been there; there have always been people like this. I think there are lots of different things happening this year. One of them is, as you say, conspiracy theories, these kinds of pseudo-scientific conspiracy theories that can be quite elaborate, and especially popular because they spread so easily on social media; you know, this phenomenon of misinformation and disinformation that gets spread so quickly through things like WhatsApp and Facebook and Twitter.

And it's because we consume things so fast, and we don't always have time to challenge them, that it's very easy for these incorrect memes to proliferate. And it is something I'm working on: I set up a group last year, which now sits under the Royal Institution here in London, one of the oldest scientific societies in the world. We are a group of journalists, policymakers, social media experts, counterterrorism experts, academics, a very broad range of people all interested in this problem of pseudoscience in whichever way it manifests. And what you quickly start to realize when you look at this is that these people, whichever conspiracy theory they adhere to, whether it's an anti-vax one or flat earth or a climate change denial conspiracy theory or whatever it is, don't have anything in common demographically. They come from all kinds of walks of life, all ages, everything. But what they do tend to have in common is a mistrust of authority. And that is the common thread you see. And actually that's understandable, because, you know, very often our authority figures are not always trustworthy, and especially these days, when we have all these populist leaders around the world who are willing to lie, sometimes outright, to their citizens, it's very easy to build a mistrust of authority and to buy into certain conspiracy theories. And that is why very well educated, very skeptical people are sometimes the most vulnerable to this, because what they're really doing is questioning what they're seeing to such an extent that they question everything, you know, even the fundamental basics. And that is the point at which we need to engage with these topics. This isn't always about ignorance. I looked at anti-vaxxers particularly for a documentary last year.
These are often very well educated, middle-class people who are very well clued up on the facts, but what they're choosing to do is dismiss a certain set of facts and choose another set of facts that fits with their fears or their worldview. And the conspiracy theorists, the ones who spread this kind of misinformation and disinformation, and who do that for lots of different reasons, including sometimes state actors (so there are Russian bots, you know, spreading this kind of stuff around), what they try and do is play on those fears. So for example the legitimate fear of a parent that their child might be hurt if they're given this medicine or this vaccine: they draw you into that rabbit hole of false facts, sometimes seeded with accurate stuff, you know, for example real but marginal examples of vaccine injury, and then use that to build a case and seed that doubt in your mind. I think the psychology of this is very complex, and especially with the internet, and the dynamics around the internet, it becomes even more difficult. But it's a phenomenon that has always existed; there have always been doubts around these things, and often what's happened, for example with vaccine doubts, is that a big pandemic like this will happen, everyone feels that they need to take the vaccine, and then the doubt subsides a little bit. Shocking though it is, and unfortunate that it happens that way, that people have to die in order to be confronted with the devastating reality of the importance of these things, that's often how these things happen.

Pennington: You're listening to Stats and Stories, and today we're talking with science journalist Angela Saini. Angela, you write about the work of Karen Fields and the idea of racecraft. I'm trying to find it to pull up on my phone, because I love this line where, in thinking about race in relation to witchcraft and race being a construct, you write that race is as biologically real as witches on broomsticks. I love that line, but I also think back to Richard's earlier question about editors letting these things through. You also write about a blog post a man wrote about the supposed lack of intelligence and attractiveness of black women, and then talk about the scientific papers that get through. And I wonder, it's sort of the inverse of what Richard just asked: because the people pushing some of these views are credentialed, does that lend a sort of truth and rigor to this idea of there being, what is it, human biodiversity? Yes, that's the term. Whether the credentials behind some of these people sort of reify the idea that race is somehow real.

Saini: Yeah, and it's a real problem. I think it's a difficult one to tackle, because the nature of academia is that it is a broad church, and in some ways it needs to be a broad church in order to maintain academic freedom.

And I value that I do think that's important, but at the same time.

What we get as a result of that is that we do get people, and we've always had these people, they've existed right from the beginning, who hold very marginal political views and who then turn to science to justify those political views. So many of the people, for example, that I write about in Superior, or who I interviewed for Superior about what have been termed by some people scientifically racist or pseudo-scientific positions or papers they've written, most of them are not geneticists. In fact, none of them are geneticists; most of them tend to be psychologists or political scientists, you know, people outside the disciplines where the real biological work on human difference is done.

And very often, when you scratch beneath the surface, and this is something I've tried very hard to do in my work, not just to interview people who are critical of race science but to understand those who adhere to these racist theories, or what have been termed racist theories: why do they do it, why are they so attached to these ideas? When you dig underneath, very often what comes out is a kind of political underbelly. So, you know, they're anti-immigration, or they're anti racial mixing, or they feel that there should be some form of segregation between people, that equal opportunities are a bad idea, that affirmative action is a waste of time. That's often what lies beneath all of this, and what they're really doing is using the science as a tool to justify these political beliefs, and sometimes they go through quite unbelievable intellectual contortions to be able to do that. Because the evidence really doesn't support the idea, number one, that race is real, or that there are these deep psychological differences between us, but they won't let it go, and what they cling to increasingly is the possibility that one day evidence will come along to prove them right. And, you know, you could say that about pretty much any area of science, because we don't know everything, and we're never going to know everything, especially because human nature is not just some simple biological substrate. Who we are is heavily influenced by our environment and our culture; our biology is affected by our environment and culture, how we develop, our brains, everything. Because all these things are so intertwined, you cannot extricate them; there is no separate nature and nurture, they're all intertwined with each other. We are always changing. And so you can never get a full grip on who we are as human beings; you can never say definitively what human nature is. And that's really the territory they occupy now, that uncertainty.
And I guess they will occupy it forever, as long as they hold these political beliefs; that's the space they'll live in. The thing we have to challenge is not just the pseudoscience they're peddling, but really to understand why they so desperately want it to be true.

Bailer: You talk a lot about where this work appears, at least some of the more recent research, and it reinforced for me the importance of identifying funding sources as well as the outlets for this work. Just because something appears, or just because it's been supported, doesn't necessarily mean there isn't an agenda that goes with it. Can you describe a little bit about digging into that, and how we can inoculate ourselves against these kinds of impacts?

Saini: Well, within scientific publishing there is a wide range of quality. So there are some journals that are right at the bottom end, like the Mankind Quarterly. This was a pseudo-scientific journal that was set up after the Second World War by race scientists, including one Nazi race scientist who carried out experiments on the body parts of Holocaust victims, some of them children. So he and others, scattered all over the world, I should say, and not confined to any one region, set up this publication, which you can still read today. It's still being published; in fact, I interviewed the person who was then the editor of the Mankind Quarterly when I was writing Superior. So in that sense, on the margins of scientific publishing, there are people trying to keep these ideas alive in those circles. Very often they're writing for each other: they cite each other, they write for each other, and they're not generally cited in mainstream academic journals. But some of them also do have a presence in mainstream academic journals. One thing I learned in 2018, during my research, was that two of the editors of the Mankind Quarterly were sitting on the editorial board of the journal Intelligence, a major journal in the field of intelligence research, which itself is a very fraught field: it has its roots in eugenics as well, a very dark history, and a history it hasn't completely let go of, unfortunately, even to this day. So there are still figures within that field who are considered racist by others within academia and who have been denied platforms or denied access to conferences because of their views. But anyway, these two people were on the board of this journal, and Elsevier, which is a major publishing group, has very strict rules around who can sit on editorial boards.
And when I asked them why they allowed these two people, who had very weak academic credentials, to be sitting there (one of them, in fact, had only an honorary position with an Irish university, which has now been rescinded, so he has no academic affiliation anymore), I asked them, why do you have these people on your journal board when you have certain standards that you're meant to uphold? They entirely washed their hands of it and said it's not for us, it's for the editor-in-chief. The editor-in-chief told me it was a matter of academic freedom, that this was about having a plurality of views within the journal, which is worrying, because the journal itself has published a number of articles over recent years by people who've had links to the alt-right and white supremacists, who, you know, have strong connections with the Mankind Quarterly and have edited or written for it. And he just refused to do anything. But by the end of 2018, when I went back to check the editorial board while I was updating my references, I noticed that those two people had been quietly removed. So I feel that, maybe because I wrote an article at the time, there was some pressure within the editorial board to clean up their act a little bit. But the point I'm trying to make is that these are not isolated instances; there are problems within other journals too. If anyone goes to the brilliant website Retraction Watch, you can see how common this actually is. This kind of pseudoscience, as recently as this year, has had to be retracted. In fact, one paper published earlier this year was retracted from a journal after criticisms of how politically motivated it seemed to be.

And then the authors themselves admitted that their data was shoddy and that they should retract.

So you really have to ask yourself, you know, are we upholding the standards that we need in academic publishing? And this isn't just a matter for academia anymore; this is a matter for all of us, because the public has access to these papers now because of the internet. And if we can't trust what we're reading, if these kinds of retractions are going to continue, and if we're going to get dodgy people sitting on the boards of journals and writing papers, then it's going to erode trust in science even further.

And it's going to damage the reputation of science and make it much harder, I think, for good scientists to do good work. But there are people, I mean, I know because I work with journal editors and journal groups, there are people trying to tighten those standards, not just around quality but also around ethics, looking at the repercussions of their work.

Campbell: Angela, how much of that is a problem? I think this was from your piece in Nature, where you say scientists rarely interrogate the histories even of their own disciplines.

How much of what you just talked about is because of that, because the scientists themselves aren't even aware of the long trajectory of history? And I think you write elsewhere, John probably won't like this, about how humanities professors and the humanities have provided a stronger critique here than science itself. I think this is changing; I think there's more attention being paid to history. But talk a little bit about this failure of science to interrogate its own history.

Saini: Well, the humanities also have their own problems. It's within the social sciences that you often see the best critiques, I think, of the sciences. And one of the problems that we have is that scientists very rarely engage with that body of knowledge. So, for example, when it comes to medicine and race, health and race, there is actually a huge, wonderful body of literature within the social sciences looking at the effects of racism and discrimination on health, on the body, mentally, on all of these things. And yet, in the COVID-19 pandemic this year, I saw a number of high-profile physicians and medical researchers looking to genetics to explain the racial disparities that we were seeing. Immediately, you know, by March or April, as soon as it was clear that black and Asian people in certain countries were dying at higher rates than others, they jumped straight to genetics. If they were aware of that body of literature that shows the structural effect of racism and discrimination on how we live and how people are treated, and not just that, but also class and all these different factors, I mean, a lot of this is tied to socioeconomic status, and a lot of that work is done within the social sciences, then I don't think we would be jumping to those kinds of essentialist conclusions or assumptions immediately. And so we do need, I think, more dialogue, and more humility sometimes among scientists: hard science doesn't contain all the data you need; there is data out there in the world that is equally, and sometimes even more, important when we're talking about certain things.

And that failure to understand not just that body of social and cultural knowledge but also history, I think, is why a lot of mainstream scientists fall into these traps, why they make these mistakes. And I know this from my own experience, because, as I said, I studied engineering. I was very poorly exposed to the social sciences when I was at university. But as an adult, after I left, I was working at the BBC, and in my spare time I started doing a degree at King's College London, here in London, in their Department of War Studies. This was an interdisciplinary science and security course, taught mainly by social scientists but also by a few people with experience in the sciences and engineering. And for the first time, I learned about the construction of knowledge, feminist critiques of knowledge, Foucault, all these things that I'd never been taught before and suddenly got an introduction to, and also the history of science and technology: how ideas develop, the cultures they develop in, and how that shapes how we think about them. And it completely changed the way I think about ideas. I really very firmly believe, and in fact I've been advocating this all year in every university talk I've given, that we should integrate the history of science and humanities teaching into science teaching more. I really very strongly believe that every time you learn a scientific concept, in whichever discipline it is, you should know the background to it.

Pennington: Well, that's all the time we have for this episode of Stats and Stories. Angela, thank you so much for being here today.

Saini: Thank you for having me.

Pennington: Stats and Stories is a partnership between Miami University’s Departments of Statistics, and Media, Journalism and Film, and the American Statistical Association. You can follow us on Twitter, or listen on Apple Podcasts or other places you can find podcasts. If you’d like to share your thoughts on the program, send an email to statsandstories@miamioh.edu or check us out at statsandstories.net, and be sure to listen for future editions of Stats and Stories, where we discuss the statistics behind the stories and the stories behind the statistics.


Big Data and Big Laughs | Stats + Stories Episode 157 by Stats Stories


Harkness writes and presents BBC Radio 4 documentaries including the series FutureProofing and How To Disagree, and Are You A Numbers Person? for BBC World Service. She formed the UK’s first comedy science double-act with neuroscientist Dr. Helen Pilcher, and has performed scientific and mathematical comedy from Adelaide (Australia) to Pittsburgh PA with partners including Stand Up Mathematician Matt Parker and Socrates the rat. 

Her latest solo show, Take A Risk, hit the 2019 Edinburgh Festival Fringe with randomized audience participation and an electric shock machine. A fellow of the Royal Statistical Society, she’s a founder member of their Special Interest Group on Data Ethics. Timandra’s book Big Data: does size matter? was published by Bloomsbury Sigma in 2016.


Episode Description

Statistics is generally a field not known for its humor, at least to the broad public. That's a shame, because humor is a way to make complicated subjects, like statistics or big data, accessible to general audiences. The intersection of humor and stats is the focus of this episode of Stats and Stories with guest Timandra Harkness, coming to you from the annual meeting of the Royal Statistical Society with guest host Brian Tarran.

Full Transcript

Rosemary Pennington: Statistics is generally a field not known for its humor, at least to the broad public, although I will say John Bailer has been an exception in my life.

John Bailer: That’s because you laugh at me.

Pennington: It's a shame, though, because humor is a way to make complicated subjects like statistics or big data accessible to general audiences. The intersection of humor and stats is the focus of this episode of Stats and Stories, coming to you from the annual meeting of the Royal Statistical Society. I'm Rosemary Pennington. Stats and Stories is a production of Miami University's Departments of Statistics and Media, Journalism and Film as well as the American Statistical Association. Joining me as panelists are John Bailer, chair of Miami's Statistics Department, and Brian Tarran, editor of Significance Magazine. Our guest is writer, comedian, and presenter Timandra Harkness. Harkness writes and presents BBC Radio 4 documentaries including the series FutureProofing and How To Disagree, and Are You A Numbers Person? for BBC World Service. I, frankly, am not. She formed the UK's first comedy science double-act with neuroscientist Dr. Helen Pilcher and has performed scientific and mathematical comedy from Australia to Pennsylvania with partners including Stand Up Mathematician Matt Parker and Socrates the Rat. Her latest solo show, Take A Risk, hit the 2019 Edinburgh Festival Fringe with randomized audience participation and an electric shock machine. A fellow of the Royal Statistical Society, she is a founding member of their Special Interest Group on Data Ethics. Timandra's book Big Data: Does Size Matter? was published by Bloomsbury Sigma in 2016. Timandra, thank you so much for being here today.

Harkness: It's a pleasure.

Pennington: I am just going to ask what I think is the obvious question: how does a comedian come to take on technology and math and science as a focus of her work?

Harkness: That’s a relief because I thought you were going to ask about the electric shock therapy.

[Laughter]

Pennington: I do want to know about that though.

John Bailer: That's my question, Timandra; I'm going to ask that next.

Harkness: I may be the only fellow of the Royal Statistical Society who likes firing electric shock machines. Well, interestingly, there are a lot of people now who use comedy as a way of getting across their particular subject, whether it's science or math or something else, and I came in the other way, the other direction. I was already a professional stand-up comedian, and so was Helen Pilcher, although she had a day job at the time, and we met at a meeting at the Royal Society on stem cells, because I was trying to write something about it. We bumped into each other in the coffee room, and I was really surprised, because I'd only ever seen her in rooms above pubs making jokes about beer bellies, and there she was looking smart with a badge on. So I sidled over and went, what are you doing here? And she said, I'm a stem cell scientist, that's my day job, what are you doing here? And so we went, oh, we should do some comedy about science. Because we were both getting really bored with the things that comedy was always about. It was always about the differences between men and women, about drugs, about sex, about alcohol, and we just wanted to do some comedy about something more interesting. Although, ironically, when I look back at the things I've done comedy about, I have now actually done comedy about the differences between men and women and sex and drugs, but from a scientific and mathematical point of view. So for me, and then I went on to do a degree in math and statistics, it was comedy that reignited my curiosity about science and mathematics and statistics. It's more the other way around for me. It's less why do you use comedy to talk about mathematics, and more how did you end up in mathematics having started out in comedy?

Bailer: You know, I think there's an element of you have to change hearts before you change heads, and the comedy is opening people up to the message. It's engaging, generating excitement and interest. And if you can get the interest, then the messaging can also be connected to it.

Harkness: Yes, all of that is true, and I think a lot of people do use it for that, but, absolutely genuinely, for me it was the other way around. I like doing comedy because I like making people think. That's absolutely true; I always have. I've always been more interested in the kind of comedy where people laugh and then go, oh, that's interesting, why did I laugh at that? Because it opens people's minds up a bit. It catches them unaware, and also it is enjoyable, which is always a plus. And then it was my curiosity about science and mathematics that I came to in that direction, and I thought, well, if I find it interesting, why wouldn't anybody else find it interesting? And it does make a change from talking about the same old, same old thing. Because this was back in 2000, 2001. So now there are a lot of good people doing good comedy about science, statistics, mathematics, but at the time we genuinely were the first two people in the UK; I think there were a couple of guys in Australia doing it.

[Laughter]

Harkness: The electric shock machine, I first got it when I did a show about sex differences, costarring Socrates the Rat. His job was to be male and a rat, and risk-taking is one of the differences that psychologists find, on average, between men and women. And I wanted something with which I could demonstrate this very graphically to the audience, preferably with audience participation. So I asked a psychologist friend, is there a civilian version of the equipment that you use that I could buy, to do, you know, harmless pain on an audience member? And he said, this is great timing: I'm about to relocate to Singapore; I have an electric shock machine; I don't want to take it with me; it's yours. And so he gave me this laboratory machine with all the safety instructions, it's got a seven-page risk assessment and everything, and I would invite people in the audience in the show about sex differences to get up and basically gamble. Take a 50-50 bet, and if they lost the bet I'd get to give them an electric shock, and if they won the bet they'd get to give me an electric shock, and I gave them some money. And I have to say, whoever was flipping the coins on that, which was another audience member, let's just say I looked back at the end of the tour and I was well down on money and electric shocks, so I don't think there was fair action going on there. And then when I went to do a show about risk, this was my obvious prop, and again, basically, I used it for gambling: to let people in the audience think about their own decision-making around risk. And your previous guest, Tim Harford, has probably looked at this, where it's never a purely mathematical calculation; there are always psychological elements.
It's never just about going, on average I will win if I do this, because you might say, well, I'm prepared to take quite a large risk of a very small electric shock, but I'm not prepared to take even a very small risk of a very large electric shock, because there's a kind of maximum amount of pain that I'm prepared to risk. So I would always get people randomly selected from the audience and offer them a chance to do this gamble about whether to get an electric shock or not, as a way of saying that whenever we make these decisions, it's not just about whether you can do arithmetic in your head. It's always in the context of much wider decisions that we make.
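Harkness's point here, that a cap on the worst acceptable outcome can veto a bet even when its average payoff looks fine, can be sketched in a few lines. The function, its parameters, and the numbers below are invented for illustration; they are not from the show.

```python
# A sketch of the decision rule Harkness describes: expected value
# alone doesn't decide; a cap on the worst acceptable outcome can
# veto a bet whatever the odds. All numbers are made up.
def accept_bet(p_shock, shock_pain, reward, max_pain):
    """Take the bet only if the worst case is tolerable AND the
    expected payoff (reward minus expected pain) is positive."""
    if shock_pain > max_pain:
        return False  # worst case exceeds what we're willing to risk
    return reward - p_shock * shock_pain > 0

# Large chance of a small shock: worst case tolerable, so weigh the odds
print(accept_bet(p_shock=0.5, shock_pain=2, reward=5, max_pain=10))   # True
# Tiny chance of a huge shock: vetoed outright, regardless of the odds
print(accept_bet(p_shock=0.01, shock_pain=50, reward=5, max_pain=10)) # False
```

The second call fails even though its expected pain (0.5 units) is lower than the first's (1 unit), which is exactly the asymmetry she describes.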

Pennington: I love that you phrased it harmless pain. I would not do that because any pain to me feels harmful.

Harkness: Well, there is actual risk involved. I had to get people to sign a consent form, because for people with a pacemaker, for example, and certain other medical conditions, it's very dangerous. Also, it really ups the ante on stage when some audience-member volunteer is having to read a consent form. It ups the fear level, which makes the whole thing more dramatic. And it also gives them a point where they can elegantly back out: you know, if they're having second thoughts, they can say, oh well, no, I've got a medical condition, so I can't do this.

Bailer: You know, in reading through your Big Data book, I really liked the historical tour of thinking about data and society and statistics, and also about computing and how that emerged. And then you have this organizing statement of DATA, where you touch on these different components. Would you summarize, for folks who haven't read it, how you've organized your thinking about big data?

Harkness: Oh, my backronym.

Bailer: Your backronym?

Harkness: Backronym. Yes, I thought everybody knew this word, backronym, which is where you want an acronym, a word that spells out your ideas, but you reverse-engineer it to get the word that you wanted. So I felt I would do this so that I could get DATA, D-A-T-A. Now, obviously big data is partly big; there being a lot of it is part of it. But I thought it's not just that there is more of it than there used to be, it's also these other things, and I managed to get the big D-A-T-A. So D is for diverse, or dimensions if you want to get a bit technical: the idea that you can have different types of data, and when you combine them you get a multidimensional picture, whether it's of an individual or something that you're studying. I got this example from a brain scientist called Professor Paul Matthews, who said, I have brain scans, but if you have lots of brain scans, that's just large data. Big data is when you combine the brain scans, the patient records, the postcodes where the patients live, the weather records of those postcodes, and then you put them all together and you ask a different question from the one the data was collected for. In this case, he wanted to know how many hours of sunshine the patients had had, and did that correlate with the progression of their illness? So that's D: diverse. A is automatic, because so many things we do now just automatically generate data, so it's almost collected by default. T is for time, because things are pretty much collected in real time, which lends itself really well to making a time series, and you can project that into the future and see how things are going to change. And then the other A is for AI, artificial intelligence, because the processes used to analyze the data are very much what you might call artificial intelligence.
I don't want to overstate what it can do, but there's an element of it not being supervised step by step. Instead of saying to the computer, follow every step in this program, you say, I want you to separate these into sick and well, or healthy and not; I'm going to give you one dataset that's presorted, and I'm going to let you work out the rules you need to follow to sort the rest of the data. So that's diverse, automatic, time and AI.
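The sorting Harkness describes, handing the computer presorted examples and letting it derive its own rule for the rest, can be sketched as a toy classifier. The nearest-centroid rule and all the data here are my invention for illustration; they are not from her book or the episode.

```python
# Toy illustration of deriving a sorting rule from presorted examples
# rather than following hand-written steps. Data and method invented.
def centroid(points):
    """Average position of a list of (x, y) measurements."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def classify(point, labeled):
    """labeled: dict mapping a label to its list of presorted examples."""
    centroids = {lab: centroid(pts) for lab, pts in labeled.items()}
    # Assign the point to the label whose examples it sits closest to
    return min(centroids,
               key=lambda lab: (point[0] - centroids[lab][0]) ** 2
                             + (point[1] - centroids[lab][1]) ** 2)

# The presorted dataset: two imaginary measurements per patient
presorted = {
    "well": [(1.0, 1.2), (0.8, 1.0), (1.1, 0.9)],
    "sick": [(3.0, 3.1), (2.8, 3.3), (3.2, 2.9)],
}
print(classify((2.9, 3.0), presorted))  # sick
print(classify((1.0, 1.1), presorted))  # well
```

The program never contains an explicit "sick if x > 2" rule; the boundary comes entirely from the presorted examples, which is the shift Harkness is pointing at.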

Tarran: On the subject of algorithms and AI, I think I saw, was it a tweet, where you said you'd become an overnight expert in algorithms?

Harkness: Well, I think it's quite astonishing, really. This is a classic case where you would expect them to take the grades that the teachers had given, often based on previous exams that kid has taken, at least as the starting point. But they didn't do that. They said: what's really important to us is that the overall pattern of the grades closely resembles the previous three years. So what we're going to do is, for each school, we will take the results for the previous three years, get an average of those, and say, okay, those are the grades that your school is going to get this year; this pattern. You know, so many As, so many Bs, and so on. And then, okay, how are we going to decide which kid gets which one? Well, we'll get the teachers to rank them in order from best to worst, and we've already got this set box of grades that we've decided your school is getting, and we'll give them out in that order from top to bottom. And that was what they did. The only role that the actual kids' previous exam results played in the algorithm was that, as a whole class, if they had done spectacularly better or worse than previous years, the results would be adjusted upwards or downwards; or, if they were in a very small subject group, say eight or ten kids in a class, the algorithm would fall back largely on the teachers' assessments, so in those cases they would be taken into account. But I just thought it was, a, an astonishing decision, and, b, also horribly typical of the way a lot of algorithms that make decisions about us work: minimally based on anything we do or are or have done, and very largely based on what the population of people who are deemed to be like us have done in the past.

Harkness: Well, yes and no. I mean, I think that we are a bit more aware of these things, but yes, it is a bit astonishing to see that the whole juggernaut, if you like, rumbles on in the same way. In fact, that's what I'm interested in looking at now: not just to say we're surprised, but to ask why. Human beings are the ones who built this stuff; human beings decide what data to collect; this is all human beings doing this. The question really is: what is it about us? What is it about human beings here and now, at this point in history, that makes us so very keen to hand over decisions to algorithms? No matter how many times we see how flawed, how biased, how incomplete they can be, we still have this urge to hand over human judgment and decision-making to an algorithm.

Pennington: You're listening to Stats and Stories, recorded at the annual meeting of the Royal Statistical Society. Our guest is writer, comedian and presenter Timandra Harkness. You've just talked about how we need to step back a little bit from our trust in algorithms. I guess the question I wanted to pose is why you felt compelled to write about big data in the first place. There are a lot of people writing and publishing about big data. What was it that made you feel like you had to write that book?

Harkness: It actually started a few years before, with me getting into statistics, and I doubt that Brian remembers this, but the first thing I ever wrote for Significance magazine was an article called "Seduced by Stats?", which was probably around the time that my partner and I were doing a show called The Maths of Death at the Fringe. And it was because I was confused. You know, I really like math; that's why I went back and studied it again. I've always liked it. But I've always realized that this is a kind of minority sport, really; most people don't like mathematics and would be very happy to never have to look at it again. And yet those same people were getting really excited about statistics. They were getting really excited about infographic displays in newspapers, what your previous guest Tim Harford was talking about, and I thought, well, this is odd, because I like statistics. I'm quite excited about what you can do with them. But I know for a fact that all of you people really hate mathematics, so why are you getting so excited about some graphs? It's as if you think it's some magical oracle of objective truth: in a difficult time, where nobody really knows what's going on, you can at least look at the numbers, and the numbers will appear in shining light and tell you what to do. And then, as things evolved, I started to see people talk about big data in the same way, and I was thinking, well, again, the mathematical side of me goes, this is really exciting. Can you really do all this stuff just by collecting loads of data and applying mathematical processes to it? Because if you can do all the things you're claiming, that could really transform things. But on the other hand, are these the same people who got really excited about infographics in newspapers, now really excited about big data because it's big and shiny and they don't understand it, so maybe it's really clever?
And in fact, I talked to an American scholar called Christine Rosen, who was looking at this, and I said to her, have you got a definition of big data? This was when I was making a radio programme about it. And she said, yes: it's an oracle. People look at it; they think it's going to give them all the answers. So it was that, really. Part of it was my mathematical interest, me going, look, isn't this clever, you get all this data and you do this to it and it tells you this thing that you never knew before, and I do still find that really exciting. But then the other bit of me, me as a citizen if you like, was going, why are we so convinced that all these quite difficult, messy, complicated human problems can be solved if you just collect enough data and put it in a big enough computer?

Pennington: I'm going to pull in a question from the audience, and this is a reminder: if you have questions for Timandra, we will try to get to them throughout the rest of the show, and certainly at the end. Someone just posted the question: whose decisions do you think are more biased, algorithms' or people's? And it felt like a nice question to scoop in there.

Harkness: That's a brilliant question. That is a big question. I mean, I think that's partly what underlies turning to an algorithm: you think, well, I know that I'm biased, I'm full of all these shortcuts and loyalties and emotions, so maybe an algorithm could step back from that and be more objective. Well, I think there are two things at play. One of them is that algorithms are made and designed by people. They are as flawed and imperfect as the people who build them. The advantage they have, if you like, is that by building an algorithm you have to build assumptions into it, and that does help you be more aware of what the assumptions are that you're building in. But you can't have a fair algorithm in an unfair world. For example, to go back to the A-level results algorithm: the truth is that in a normal year, where the kids took exams, a lot of them would find that their exam results were lower than the teachers had predicted. So this does tell us something about the unfairness of the school system, probably. But in a normal year, the kids get to sit the exams themselves, so at least they get to affect their own outcome; this year, they didn't. So you can't actually have an algorithm that is going to dish out a completely fair result, because the world is not fair. What you can do is say, okay, we need to be explicit about what kind of fairness we're trying to achieve. Are we trying to achieve everybody going in on an equal basis, in which case we know that what comes out will be unfair, because it's not a fair world? Or are we going to say we want things to come out looking fair by some other measure, in which case maybe we have to adjust for people and not treat them equally on the way in? I mean, Ofqual were quite defensive. They said, well, we have tested our algorithm by all these measures: are poor or disadvantaged children going to be disadvantaged? No. Are boys or girls going to be disadvantaged? No. Are all these different ethnic groups going to be disadvantaged? No.
All these subpopulations are going to come out in roughly the same state as they would if they took the exams. So, on a population level, they said, we've been totally fair; look, every subpopulation has been treated fairly, as if that means every individual has been treated fairly. So I think it can be good that building an algorithm makes you decide: what does fairness look like, what kind of fairness do you want? And also, by the way, it reveals to us what other unfairnesses there are in the world. But the thing is, there's also this slightly underlying assumption that people are basically all biased and prejudiced and awful, and I think you have to remember that the difference between algorithms and people is that a person can reflect on themselves and go, oh, I just caught myself assuming that all boys were like this. But actually, you know, even if the data says that on average more boys are like this, I shouldn't assume that of every boy that I meet, and therefore I'm going to change my attitude in future, deliberately try not to think that, and deliberately set myself up so that I don't slide into this habit. Whereas an algorithm has no moral sense. It will never go, this is going to be wrong. An algorithm is going to do exactly what you programmed it to do.

Bailer: So, one thing about algorithms that I wonder about. I love this idea that you phrased, this issue of turning over human judgments to algorithms. But I also wonder if it's how people sell algorithms and their results; that perhaps they sell them as if they have a level of precision that they really don't have. That they oversell the precision and undersell the uncertainty and variability that are baked into this.

Harkness: Yeah, I think that's a point very well made. There is just a very basic thing: especially if you're a corporate entity designing an algorithm, you go, hey, our magic algorithm will help you do this, and I go, you've just given me two decimal places there, so you're basically making this up; I'm not going to take you seriously at all. And it's a problem, sometimes, when you want to question how they got those outcomes. Especially if they're private companies, they go, we can't tell you, it's proprietary, it's our commercial secret. But the other thing, that uncertainty question, I think is a much bigger question. I think we look to things that have numbers attached for certainty. I think that's one of the great deep appeals at the moment of statistics and data and numbers: the world is very uncertain, it feels very unpredictable, it feels risky, even though actually it's safer than any other period of history, still, even in spite of the pandemic. But it's hard to make sense of, because it's a world that's changing socially and politically as well as everything else. I think people feel very insecure. They feel fearful about the future, and they hope that numbers and data will give them something very definite. So you may know that the future is going to be awful, but at least you'll know it's going to be awful with mathematical precision. Whereas of course, as all statisticians know, approximately 95% of your job, give or take two or three percent either side, is actually just quantifying uncertainty: saying, well, we think it's probably within this range, but the more you narrow that down, the less certain you can be about it. You could easily look at the whole of Britain and go, well, we're certain London is in there somewhere; but the smaller the area you pick out, the less certain you can be that it's London.
So I think being more upfront about uncertainty would really help in a lot of cases. We all need to learn to accept risk, not just in the sense of going out on your bicycle and getting in a terrible accident like poor Tim did, but risk in the sense that you don't know what the future's going to be, and sometimes you don't even know things about the present. You know, we don't really know how many people have coronavirus. We can make estimates by various methods; we can have various figures and go, okay, these differ, but they give us a ballpark figure. But we don't know, and we probably never will know. What we have to do is become better at making decisions while accepting that we don't know things for certain, and all we can really do is get an idea of roughly what something is and how uncertain that is.
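The trade-off Harkness sketches, a narrower range versus less certainty, is exactly the behaviour of a confidence interval: for the same sample, asking for more confidence forces a wider interval. A minimal sketch with made-up data, using a normal approximation:

```python
# For one fixed sample, higher confidence levels give wider intervals:
# you can narrow the range only by accepting less certainty about it.
import statistics

sample = [4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.1, 4.9, 5.0]
mean = statistics.mean(sample)
se = statistics.stdev(sample) / len(sample) ** 0.5  # standard error

# z multipliers for common two-sided confidence levels
for level, z in [(0.80, 1.282), (0.95, 1.960), (0.99, 2.576)]:
    half_width = z * se
    print(f"{level:.0%} CI: {mean - half_width:.3f} to "
          f"{mean + half_width:.3f} (width {2 * half_width:.3f})")
```

Running this shows the 99% interval is roughly twice as wide as the 80% one for the same ten observations; the data never changed, only how certain we insisted on being.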

Pennington: So, we have Tim Harford still with us and I believe he has a question for you.

Tim Harford: So, Timandra, you've raised a couple of times the puzzle that we put so much trust in algorithms, and I wanted to ask you about that a little bit more. The A-level predictions thing is a really stark example. This is a situation where, if you put it bluntly, the government said: we are going to cancel your exams, it's not safe, and then a computer will give you the grade that you would have got if you had sat the exams. Which of course, when you put it like that, can't possibly be true. How is it that they managed to fool themselves into thinking that it might be true? How did the rest of us nod and accept it, like, oh yeah, I suppose that'll do? And is there anything that we can do to have a more realistic view of what algorithms can and can't achieve? Because they've got their place, of course.

Harkness: Well, exactly. I slightly hope that the fact we had teenagers out on the streets with signs saying things we probably can't say on the podcast, very rude things about what the algorithm does, will actually sink in, and people will go, yeah, a lot of this is just hyperbole. How could an algorithm ever possibly know that? I do think it's less a sign of how powerful data is and more a sign of how much we lack in the human fields of politics, economics, philosophy, even. We do have a government in the UK at the moment that is quite technocratic. Dominic Cummings, one of the chief advisors, is really, really keen on data, prediction, algorithms, and getting more people into government who understand data, which, at one level, would be great. It would be great if more of them understood stats and data. But there is a slight air of, well, you get enough clever people and enough data and that will give us all the answers, and I rather want to say: you are the government, you should have ideas. You should have policies; you should have principles. You should have a vision of where you want to take this country, and that's what's going to get us through. Data and algorithms, however good they are, can only be a means to help us get there. They can possibly give us a better idea of where we are and a better idea of the outcomes if we do different things, but they don't get to tell us what we should do. And I do think it's that lack of direction, lack of vision and lack of self-confidence that leads us to put far too much confidence in algorithms.

Pennington: That feels tied into a question we got from an audience member, who asks if we're spending enough time scrutinizing the questions we're trying to have big data answer for us.

Harkness: No, I don't think we are, and that's a really good question exactly for that reason. I think if you formulate the question right, then finding the answer is often the easier part, and I think if you ask a lot of statisticians, they'll say their job is to go in early and help people formulate the right question in the right way. You know, even though I'm more a writer than a statistician, I always say that if I can ask the right questions, I consider it a job well done, rather than giving the answers for somebody else.

Pennington: So, we have two more comments that came through. One is just from someone who said they attended your show last year, very entertaining and instructive, but they did not volunteer for the shocks. And, kind of related, someone is asking if you would mind telling your favorite statistical joke?

Harkness: Well, they might have heard it before, because it is my favorite and I do tell it all the time. But: why should you never tell a statistician he is average? Because it's mean.

Pennington: That sounds like one that John Bailer might have actually said.

Bailer: I have to tell you, Timandra, my family thought it was an impossibility that there could be someone who could have humor and statistics as part of their life. But I have a worse one. It may be a bit UK-specific; I don't know if this will make sense to an American audience, but what is a statistician's favorite sandwich filling? Correlation chicken.

Harkness: See, I don't know if you have coronation chicken. American listeners are going, huh?

Pennington: I am familiar with it. I do, before we go, I saw- So I was stalking you this morning as I did preparations, and I saw that you tweeted out that you have a new piece in Significance, and I figured Brian would like me to ask about that. It's about John Graunt, I don't know if I'm pronouncing it right. You seem a bit obsessed with this superhero of stats. Why?

Harkness: Yeah, well, because he was born 400 years ago this year, as I discovered writing this piece for Significance. He makes a tiny appearance in my book, because I try to get across the ideas of the stats in the book by telling the story of the person who first thought of them, because then they make more sense. And he lived through the English Civil War, fought in the Civil War on Parliament's side, lived through waves of plague because he was in London. He was a founder member of the Royal Society, in spite of being just a humble haberdasher. But he wrote this one book, about the bills of mortality, which were the death records of what people had died of, and in this book he just invented all these concepts which he needed to try and get information out of the data. It was basically raw data for about 50 or 60 years of mortality, and he went through it and said, well, you know, you can see these patterns; if you do this, you can see that pattern. So he came up, for example, with the idea of excess deaths. He looked and said, well, this year we say is a plague year because this many plague deaths are listed, but hang on, if we look at deaths from other diseases in the years before and after, they were about seven or eight thousand, and in this year it's 18,000. So where did these 10,000 other deaths come from? There must have been more plague deaths than were written down as plague.
And so many ideas for which, you know, he didn't have the language, but he basically invented a lot of statistical ideas, and yet there's not a statue, not even a little plaque, to say where he lived.
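[Editor's note: Graunt's excess-deaths reasoning translates directly into a modern calculation. A minimal Python sketch, using the round figures from the anecdote (a baseline of roughly seven to eight thousand other-cause deaths, jumping to 18,000 in the plague year) rather than Graunt's actual tables:

```python
# Excess-deaths reasoning in Graunt's style: compare a "plague year"
# against a baseline of other-cause mortality from surrounding years.
# All figures are illustrative round numbers from the anecdote,
# not values taken from Graunt's Bills of Mortality.

baseline_years = [7000, 8000, 7500, 7800]  # other-cause deaths in nearby years
plague_year_other_deaths = 18000           # other-cause deaths in the plague year

baseline = sum(baseline_years) / len(baseline_years)
excess = plague_year_other_deaths - baseline

print(f"Baseline: {baseline:.0f}, excess deaths: {excess:.0f}")
# The roughly 10,000 excess suggests deaths recorded under other causes
# were in fact unrecorded plague deaths.
```

The same comparison, of observed deaths against a baseline built from surrounding years, is still how excess mortality was estimated during the COVID-19 pandemic.]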

Bailer: There should be.

Harkness: There should be. I'm going to start campaigning; there are a few fans, actually. I might just start a fan club, because there's no statue of John Graunt. And then he lost everything in the Great Fire of London, and then he was persecuted because he had converted to Roman Catholicism, which at the time was very unpopular, and he basically died in poverty aged only 53. I mean, his life is a roller coaster, and he invented all these statistical ideas. There should be a Hollywood movie about it. If there are any Hollywood producers listening, write to me.

Bailer: That’s our biggest listener audience segment, Timandra, that’s clearly who we’re appealing to in this series.

Harkness: Absolutely. George Clooney could totally play him.

Pennington: I'm going to launch, just as we're wrapping up, the question John normally asks. What advice would you give to statisticians who want, maybe not to shock people in an audience, but to communicate with a broad public? What advice would you give them as they're thinking about how to present their research or connect with those audiences outside the statistical community?

Harkness: Basically, you've got to start where those people are, and I think this is true whether you're trying to do comedy or radio or write books or whatever you're trying to do. Just start where those people are. Listen to them more than you talk to them. Think about, well, what are they concerned with? Have a look at their newspapers to see what the stories are and what the adverts are for; those are the things those people are interested in. Start from there and go to where they are. Look for things that will arouse their emotions, and that takes us right back to where Tim Harford started us off. It's the feelings that will grab them and make them care. If you can't make them feel something about what you want to talk about, then why would they give you any attention at all?

Pennington: Oh, that's great. Thank you so much for being here today. That's all the time we have for this episode Timandra.

Harkness: It’s been an absolute pleasure.

Pennington: We’d also like to thank the Royal Statistical Society for allowing us to record two programs as part of their annual meeting. Stats and Stories is a partnership between Miami University’s Departments of Statistics and Media, Journalism and Film, and the American Statistical Association. You can follow us on Twitter, Apple Podcasts, or other places where you can find podcasts. If you’d like to share your thoughts on the program send your emails to statsandstories@miamioh.edu or check us out at statsandstories.net and be sure to listen for future editions of Stats and Stories, where we explore the statistics behind the stories and the stories behind the statistics.
