Dr. Alicia Carriquiry is a Distinguished Professor of Liberal Arts and Sciences and a Professor of Statistics at Iowa State University. She serves as Director and lead investigator for the Center for Statistics and Applications in Forensic Evidence. The NIST Center of Excellence’s mission is to increase the scientific rigor of forensic science through improved statistical applications. Dr. Carriquiry provides scientific oversight and research expertise to the center. She participates in the Organization of Scientific Area Committees subcommittee on Materials and Trace Evidence and serves as a technical advisor for the Association of Firearms and Tool Mark Examiners. Dr. Carriquiry was recently named to the National Academy of Medicine and elected as a fellow of the American Association for the Advancement of Science.
Rosemary Pennington: For many Americans, forensic science is framed for them by TV shows like Dexter, or CSI, or Bones, shows which focus on the work of genius forensic scientists as they sift through evidence in order to give investigators tools to find a criminal. But along with the physical fingerprints and bloodstains, there’s also a lot of statistical data that could be sifted through. The intersection of forensics and statistics is the focus of this episode of Stats and Stories, where we explore the statistics behind the stories, and the stories behind the statistics. I’m Rosemary Pennington. Stats and Stories is a production of Miami University’s Departments of Statistics and Media, Journalism, and Film, as well as the American Statistical Association. Joining me in the studio are our regular panelists, John Bailer, Chair of Miami’s Statistics Department, and Richard Campbell, Former and Founding Chair of Media, Journalism, and Film.
Our guest today is Alicia Carriquiry. She’s a Professor of Statistics at Iowa State University, and the Director of the Center for Statistics and Applications in Forensic Evidence, or CSAFE. Alicia, thank you so much for being here today.
Alicia Carriquiry: Thank you for having me.
Pennington: Could you tell us a little bit about CSAFE’s mission?
Carriquiry: Sure. So, CSAFE has a three-fold mission. Our research mission is to develop the probabilistic foundations of pattern evidence and digital evidence – and we’re going to talk about what that means afterwards. We are also expected to train forensic scientists and legal professionals, trying to transmit the importance of statistics, quantitation, and so on. And the third part is outreach: we reach out to the forensics community and try to work with them, helping them fill the gaps in research on the evaluation of evidence.
John Bailer: How have you been received? One of the questions, when I was listening to one of the talks that you had given, you said as a goal that you wanted to shatter the notion that, “I know a match when I see one”. So, I would think that what you’re doing, may run completely in the face of people that are specialists that have been practicing for years.
Carriquiry: Let us just say that we were not received very fondly to begin with, and one can understand that, right?
Bailer: Well, that was a quick answer. This will be a short show. [LAUGHTER]
Carriquiry: So, here you have professionals that have been practicing their craft for many, many years, and then there’s us, coming out of nowhere, saying this is not the way you should do things; this is the way you should do things. And so, there was resistance, but on the other hand, I think the poor forensic practitioners find themselves between a rock and a hard place, because on the one hand there’s us saying, you need to change the way you’re doing things, and then there’s also pressure from jurors and lawyers and the public, who are starting to ask questions. Okay, so you say this matches; what’s the probability of that match? And so, I think – well, no, not, I think, I know – the conversation has started changing, and we have tried to put emphasis on the message that we are not telling people they’re doing everything wrong; we’re just trying to develop tools that they can use to improve the way they work.
Richard Campbell: Alicia, what was the – what was sort of the tipping point when these police departments started sending you bullets and – so what happened to instigate that?
Carriquiry: Well, my colleague, Heike Hofmann, was really the intellectual driver of this work on bullets, and I have been hitting the meetings for firearms examiners. At some point, we presented some work that said, we have this classifier; with all these classical sets of bullets that are out there, we make no mistakes in the classification of pairs of bullets that were fired from the same gun or from different guns – so send us some bullets. And so, we sort of issued a challenge, and we started getting sets of bullets from all over. They wouldn’t tell us ground truth; they would send us these question bullets and ask us to classify them. And we’ve been doing really well in terms of getting back to these departments and saying, this is what we think is going on with this bullet: this came from this gun, this came from that gun. And so far, there have been no mistakes.
Bailer: So, have these predictive models that you have for identifying bullets, been used in courts?
Carriquiry: No, no. So, this is a very long process, right? First of all – well, like everything else, right? – you have to have a lot of testing, a lot of validation, and we’re working a lot with machine-learning types of algorithms, and those are very, very, very dependent on the data. You have to train those algorithms.
And so, we are still taking baby steps. The technology we’re using to take measurements – we’re using three-dimensional, high-resolution microscopy to look at the surface topography of these bullets, with very high precision, fractions of a micron. These instruments are not available in your normal crime lab. And so, we have a plan now to see whether we can work with labs, put some of these instruments in the labs, and really start pilot-testing these technologies in real cases.
Campbell: So, my knowledge of forensic science comes from watching many hours of Scandinavian detective shows [LAUGHTER] on Netflix and Amazon. But in all my years of watching, I’ve never seen it questioned when the forensics guy says, “these bullets came from the same gun”; nobody says, “Are you sure? Are you certain?” And you’ve introduced uncertainty to this whole process that, I think, in some place you’ve called a subjective science.
Carriquiry: It’s an art.
Campbell: Yes, that’s right.
Carriquiry: It’s not even a science. Nobody measures anything. So, the way this works is, the firearms examiner gets – let’s suppose that there are some bullets recovered from a crime scene, and then they have a suspect, and they have the gun that belongs to the suspect. They get a couple – or three or four – test shots from that gun, and then, essentially, they do this visual comparison. So, they put these samples under what’s called a comparison microscope, which is just an optical microscope with two lenses. They fix one of those samples, and then they rotate the other one, until they find striations that cut across the two samples. And there are no thresholds, so it’s not like somebody needs to find 17 matching striations before they declare a match. There’s no such thing, and the conclusions are categorical: this is a match, this is not a match, or it’s inconclusive. There is no room for plus or minus, if you know what I’m talking about. And so, you ask a firearms examiner, how many mistakes have you made in your career, and they will look you in the eye and say, zero, and you look at them and you go, listen, you’ve [LAUGHTER] made mistakes, and they say, I have never made a mistake; I’ve never been challenged. The fact that you’ve never been challenged doesn’t mean you’ve never been wrong. And so the conversation started from such a faraway place from science, that moving into science is a slow process.
Bailer: So, I have to ask a follow-up to that. Lots of forensic evidence gets portrayed on these shows that Richard mentions – what type of evidence, and what quick conclusions drawn from it, drive you completely crazy?
Carriquiry: Oh gosh! Well, you know, the fingerprints – they put them in the computer and two seconds later, there’s a match. [LAUGHTER] That is so crazy. This business of, they look into a couple of test tubes and they say, “oh yeah, just by looking at this reaction, I can tell you, this is the substance.” It’s such fantasy. [LAUGHTER] Nothing happens like that. And the one thing that really drives you crazy, or at least me, in these shows, is that they never, ever, ever consider the probability of a random match. So, in these shows, if it matches, it means the guy did it. But in real life, two things can match and still have a different source. The simplest of all cases is, you can have some biological sample from the crime scene, or some blood from the crime scene, and it turns out to be Type O, and then your suspect is also Type O. So, in CSI, they would say, this is the killer. In real life, 45 percent of the population have Type O blood. So, there’s a very high chance that somebody other than the suspect left that blood at the crime scene. This probability of a coincidental match, or of a random match, is never even mentioned. And until recently, I’ve got to tell you, it was never mentioned in court either.
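[The random-match reasoning here can be sketched with a quick back-of-the-envelope calculation. The 45 percent Type O figure comes from the conversation; the pool size of 100 is an arbitrary illustration:]

```python
# Probability that a randomly chosen person shares the Type O blood
# found at the scene (figure quoted in the conversation).
p_type_o = 0.45

# In a pool of even 100 plausible alternative sources, the chance that
# at least one person other than the suspect also has Type O is
# essentially 1 -- blood type alone has almost no probative value.
n = 100
p_at_least_one_other = 1 - (1 - p_type_o) ** n
print(f"P(at least one coincidental match among {n} people): {p_at_least_one_other:.10f}")
```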
Bailer: You described two components when thinking about evidence and the value of such evidence, and one was the idea of probative value – making sure that this evidence actually does provide you with some insight into whether or not a person was guilty, or whether it’s associated with a particular individual. Can you talk about the types of evidence that you obtain from forensic investigation?
Carriquiry: Oh gosh, there’s all types of evidence. There’s, of course, biological evidence of all types. There’s what we call pattern evidence, and pattern evidence would be things such as shoe prints, or fingerprints, or the striations on a bullet, or handwriting, or blood spatter – so, when somebody gets killed, there’s all this blood spatter – and bite marks. Then there’s trace evidence, for example, the chemical composition of duct tape, or of glass, or gunshot residue. I mean, there’s any number of different types of evidence, and some of it is more, what we call, probative than others. So, we know, for example, that DNA is highly probative, because the probability of observing two people with identical DNA is infinitesimally small. But we know that blood type, like I mentioned, is not probative, because unless you have an extraordinarily unique blood type, there are a lot of people walking around out there with the same blood type that any of us have. And the issue is that for most other evidence, we don’t know what the probative value is. So, in the case of fingerprints, everybody more or less assumes – but it hasn’t been proven – that everybody has a unique fingerprint. But even if that’s true, the fact is that at a crime scene you don’t find pristine fingerprints. You find smudged, partial prints, sometimes one on top of another one. And so the question is, if I have this very, very noisy image of a fingerprint, can I still say, with certainty, that this particular individual left the fingerprint that I am observing with so much noise? And the truth is, you can’t. Any number of individuals have been wrongfully convicted on account of faulty fingerprint evidence – not because fingerprints are or aren’t unique, but because you simply cannot observe perfect evidence at a crime scene.
Pennington: You’re listening to Stats and Stories, and today we’re talking forensics and statistics with Alicia Carriquiry. Alicia, so this is a fairly new area, how did you become a forensic statistician, if that is a title that you can use?
Carriquiry: By happenstance [LAUGHTER]. So, many years ago – boy, 20 years ago by now – two former colleagues of mine, Hal Stern, who is now at the University of California, Irvine, and Mike Daniels, who is now at the University of Florida, and I were approached by the FBI, to see whether we were interested in looking at some data that they had. And the only reason we were approached by the FBI is that, at Iowa State, there is a federally funded lab, called the Ames Lab, and they had money from the FBI. The question at the time was about bullet lead. So, bullets are made out of lead, and that lead is really an alloy that has all these trace elements, like silver, and bismuth, and all kinds of other things. And at the time, the FBI was using this as evidence in court. So, they would find a bullet at the crime scene, they would find a suspect, the suspect had unspent bullets, let’s say in a box, and then they would analyze the lead and compare the lead from the unspent bullets to the lead in the crime scene bullets, and if they found those chemical compositions to be what they call indistinguishable, they would conclude that the bullets that were used in the crime were originally part of the box that the suspect had at home. And so, they wanted us to come up with some sort of probability of a coincidental match – and, if I’m correct in saying this, the hope was that we would confirm the practice – but it turns out that we found there was an enormously high probability that bullets with identical composition would be in different boxes. Which makes perfectly good sense, because from one batch of lead in a bullet manufacturing plant, they produce 300,000 bullets – from one, what do they call it, one batch of lead alloy. So, since bullets come in boxes of 50, you would expect to find many boxes with the same composition. And so, we told them, this really doesn’t have probative value at all.
And that’s how – and so, they didn’t like that. [LAUGHTER]
Carriquiry: That was the end of our work. [LAUGHTER] But it turns out that our report leaked, and this led to a lot of challenges in court, and this led to the National Academies establishing a panel to look at bullet lead, and essentially, the panel agreed with what we had said originally. So, the practice was discontinued. And so, that was a big impact, actually.
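[The counting argument behind that conclusion can be sketched in a few lines. The 300,000-bullets-per-batch and 50-bullets-per-box figures come from the conversation; the rest is illustrative:]

```python
# One batch of lead alloy at a bullet manufacturing plant yields
# roughly 300,000 bullets (figure quoted in the conversation).
bullets_per_batch = 300_000
bullets_per_box = 50

# Every box filled from the same batch contains chemically
# indistinguishable lead, so a compositional "match" only narrows
# things down to the thousands of boxes sharing that batch.
boxes_per_batch = bullets_per_batch // bullets_per_box
print(f"Boxes sharing one batch's composition: {boxes_per_batch}")
```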
Bailer: That’s cool.
Carriquiry: And so, then we did – I did nothing else. Hal continued doing some forensics in between, until, in 2014, there was a call for proposals to establish a center, a research center, and Stephen Fienberg, who of course passed away in 2016, gave me a call and said, “Hey, what do you think, we put in for this?” And I said, “Yeah, I think you’re crazy, but well-” [LAUGHTER] “-I’m game!” And, famous last words, we got the funding and established the center in 2015.
Campbell: I’m imagining, you mentioned being approached by the FBI, that – and maybe this has already happened, that you’re going to be approached by a lot of smart defense lawyers who are going to be looking for reasonable doubt, which you could introduce fairly easily it seems, with this evidence being so new. Has that happened or do you anticipate that?
Carriquiry: Oh, this happens now. I get three or four calls a week.
Campbell: Three or four calls a week.
Carriquiry: Yeah, it happens a lot. And the problem is, I used to do a lot of pro bono work, typically for defense, although, on one or two occasions, I worked with the offense – the prosecution, but I cannot do that anymore right now, because I am trying to work with the forensics community. And so, the last thing I need to do is go to court and attack them, so.
Campbell: Very interesting.
Bailer: Yes, you need to be neutral at this point.
Carriquiry: I have to. I have to.
Bailer: So, you mentioned the 2009 NRC report, the National Research Council report, and there’s a special issue of Significance magazine that’s going to come out and reflect on that. I’m curious, what has changed in the last ten years?
Carriquiry: Well, unfortunately, not a whole lot. What has changed? I think what’s slowly starting to change is, like I mentioned, the conversation. And there is a good chunk of the forensics community that is interested in moving forward and doing things in a more scientific way, if you will. And I think they’re starting to get excited about the fact that, yes, we are not out there trying to show that everything they’ve done is wrong; we’re out there trying to develop better ways to do their assessments. And so, that has changed, and the legal community has become a lot more aware of these issues. So, there are initiatives now to get lawyers and judges a little bit more versed in these types of questions, science and statistics and so on, which is kind of funny, because every lawyer you talk to says, “I became a lawyer because I hate math.” [LAUGHTER] Well, I hate to tell you, but – so those things have changed. But in terms of practice, there hasn’t been a whole lot of change, unfortunately.
Campbell: How have journalists covered your work? And that’s one question, and a second part, what can journalists do better? I mean, they have to report on these cases all the time. They have to report on the science that you’re involved with. You’ve got two journalists at this table, and we’re kind of interested in helping our students do a better job of telling stories that are complicated, and this is certainly a complicated story that has a lot of uncertainty to it. Can you talk a little bit about that?
Carriquiry: Yes, so, there are a few journalists out there that have been beating this drum for a long time. There’s this gentleman at the Washington Post – Radley Balko, I think his name is – and he has been telling forensic stories for a long time, but I don’t think it receives the attention it should receive. There are big stories out there. The Innocence Project, which I’m sure everybody has heard about, has identified almost – no, let’s see – well, over 350 individuals who have spent 20, 30 years in jail for crimes they didn’t commit, and who were convicted on the basis of junk science, you know, junk evidence, and these stories need to be told. There’s a real and horrific human cost when you have somebody incarcerated when they’re 20, and they come out when they’re 57, for things they didn’t do, and that’s one part of the story. The other part of the story, I think, that we don’t tell very well, in particular to the kids that are doing science and technology, is that applications such as these, like forensics, are tremendously important from a social point of view, right? So, we can change the way justice is administered, and I think that would be so tempting for young people, to get into a field like this, because you can really start changing the world.
Bailer: So, if someone – if a student wants to get into this world, they want to become someone who does the analysis of forensic evidence, what’s kind of the path that you recommend for them to get involved? What kind of skills should they have?
Carriquiry: Right, so, if somebody wants to work in a crime lab, then of course the skill would have to be science, but with some statistics. You cannot avoid the statistics, or at least some data science, something or other. You have to be able to think about what the data are telling you. So that’s one recommendation. If you want to do research in this area – to be an academic that’s working in this area – then things such as studying chemistry, studying statistics, studying computer science, the sciences in general, are going to be the ones that take you down this path, I think. And like I said, there’s a wide variety of different types of evidence, and so there’s room for all types of science.
Pennington: Well, that’s all the time we have for this episode of Stats and Stories, Alicia. Thank you so much for being here today.
Carriquiry: Thank you very much.
Bailer: Yeah, thanks, Alicia.
Pennington: Stats and Stories is a partnership between Miami University’s departments of Statistics and Media, Journalism, and Film, and the American Statistical Association. You can follow us on Twitter, or find us on Apple Podcasts or other places where you listen to podcasts. If you’d like to share your thoughts on the program, send your email to StatsAndStories@MiamiOH.edu or check us out at our website, StatsAndStories.net, and be sure to listen for future editions of Stats and Stories, where we discuss the statistics behind the stories and the stories behind the statistics.