Dr. Jeffrey Morris is the George S. Pepper Professor of Public Health and Preventive Medicine and Director of the Biostatistics Division in the Department of Biostatistics, Epidemiology and Informatics at the Perelman School of Medicine, University of Pennsylvania. He has been actively involved in scientific communication efforts on social media and with various media outlets. He is also a distinguished research fellow at the Annenberg Public Policy Center.
Episode Description
Three hundred and thirty-two days, that was the international statistic of the year in 2020, as identified by the Royal Statistical Society. That was the length of time between scientists publishing the genetic sequence of COVID-19 on the 11th of January, and an effective vaccine being administered on the 8th of December. This vaccine was an integral part of the world's pandemic response. Vaccines aren't new. In a World Health Organization report describing the history of vaccines, Dr. Edward Jenner is credited with the world's first successful vaccine for smallpox in 1796. In the last 100 years, vaccines were developed for yellow fever, pertussis, polio, hepatitis B, measles, mumps, rubella, and more. Well, how do we know vaccines are safe and effective? Why do some people argue against using vaccines? That's the topic of this episode with guest Dr. Jeffrey Morris.
Full Transcript
John Bailer
Three hundred and thirty-two days. That was the “International Statistic of the Year” in 2020, as identified by the Royal Statistical Society. That was the length of time between scientists publishing the genetic sequence of COVID-19 on the 11th of January and an effective vaccine being administered on the 8th of December. This vaccine was an integral part of the world's pandemic response.
Vaccines aren't new. In a World Health Organization report describing the history of vaccines, Dr. Edward Jenner is credited with the world's first successful vaccine for smallpox in 1796. In the last 100 years, vaccines were developed for yellow fever, pertussis, polio, hepatitis B, measles, mumps, rubella, and more.
Well, how do we know vaccines are safe and effective? Why do some people argue against using vaccines? On our show today, Dr. Jeffrey Morris will help us understand these questions and other issues related to vaccines.
I'm John Bailer. Stats and Stories is a production of the American Statistical Association, as well as Miami University's Departments of Statistics and Media, Journalism and Film.
I'm joined in the studio by my colleague Rosemary Pennington, chair of the Department of Media, Journalism and Film.
Our guest today is Dr. Jeffrey Morris. Morris is the George S. Pepper Professor of Public Health and Preventive Medicine and director of the Biostatistics Division, Department of Biostatistics, Epidemiology, and Informatics at the Perelman School of Medicine at the University of Pennsylvania. He also has been actively involved in scientific communication efforts on social media and with various media outlets. He is a distinguished research fellow at the Annenberg Public Policy Center.
Jeff, it's a delight to welcome you to the podcast.
Jeffrey Morris
Thank you, John, Rosemary. It's great to be on.
John Bailer
So how did you first get involved in studying vaccines?
Jeffrey Morris
It was interesting because in my career, I'd never really done much with that. I did most things in cancer research before this, but it was really the pandemic that got me interested in that.
You know, in the early pandemic, when we're all kind of stuck at home, locked down, there's this new virus disrupting our lives. We have no idea what's going on, so I just took time to try and keep up with the emerging data and literature to figure out what was going on.
And that's what kind of got me involved more in scientific communication. But also then, by the end of the year, the big talk was about these vaccines coming that were hopefully going to end the pandemic and get us back to normal.
And so that's what kind of got me interested in vaccines—really the pandemic and the COVID vaccines first.
John Bailer
So, you know, this 332-day thing—that seemed really fast, just to go from start to something that's being applied. Is that typical? How long does it normally take to develop a vaccine?
Jeffrey Morris
Yeah, no. I mean, it's a miracle they were able to do it that fast, because it's very atypical. You know, I think the typical timeline for a vaccine in a non-emergency situation might be 10 to 15 years. In a fast-track situation, maybe five to seven years. So yeah, less than a year is remarkably fast.
But given the urgency of the pandemic, there were a number of things that were done to enable those steps to be completed much more quickly than they usually would.
Rosemary Pennington
Yeah, and I would love to hear more of that, because there are a lot of people that I know who have gotten COVID vaccines but then also are still sort of hesitant around that because of this fear that it happened so fast.
Why was it able to happen so fast? And I guess, given that you've been looking at the communication, how do you feel about how it was communicated as it was being pushed through? Was the reason it was able to be done so fast communicated as well as it might have been? I guess that's my question.
Jeffrey Morris
Yeah. Well, one thing I can say is I shared concerns, and I was very skeptical initially as well, because I was very nervous that it was done so quickly.
And one thing—since the mRNA vaccines sort of dominated, and the viral vector to a lesser degree—we forget that there were really dozens of different vaccines that they tried all at once. But these were the ones that got to the finish line first and got across. So, there was just a tremendous effort.
Of course, funding helped. There was funding that is never provided on that scale under normal circumstances. But just the urgency of the pandemic—with it being a novel virus, we knew people were dying, we knew it was overwhelming hospitals to some degree—so they recognized the importance of it. They tried to do everything they could to accelerate the development.
One thing that maybe is a misconception is that people think there were a lot of steps that were skipped and that major shortcuts were taken. And I think that's not really as true as people tend to think. Really, there were some steps that maybe were not done as much and some steps that were done more quickly, but pretty much the usual steps were followed.
There were some ways they sped it up, though. One of the early steps is identifying the target and developing the actual assay. With the mRNA vaccines, they had a big advantage because once the sequence was identified in January of 2020, the mRNA technology is very quick to develop something with a new sequence.
There had also been a lot of work in the years preceding the pandemic further developing the mRNA platform. A lot of it was in cancer. When I was at MD Anderson Cancer Center, I worked on some projects with some of those vaccines in cancer. And I think some of the technical hurdles to getting it to work had just been overcome, so it was kind of ready to roll. That really sped things up, because they were able to very quickly have something ready for work.
Then the other thing they did involved the testing stages—preclinical, phase one, phase two, phase three. Usually these are done in a serial fashion with breaks in between. But here they overlapped them and ran some of them together. They combined phases one and two. They did the preclinical work at the same time as phases one and two. Then they very quickly designed the phase three and moved into that.
Once the phase three started—phase three is typically tens of thousands of participants. It's a randomized trial with many centers. It takes a huge amount of time to organize this and get it off the ground. Usually, the accrual is slow. But here they got it off the ground very quickly. There was so much interest that it accrued remarkably fast.
For their primary efficacy endpoints, the results were so positive that they were able to reach those endpoints within a few months.
So, I think all of these factors led to the ability to develop it rapidly.
And then the last thing that also sped it up is there’s usually a long delay in figuring out the manufacturing. But what happened is they were working on the manufacturing concurrently with the phase three trial to work out the kinks so they could hopefully roll it out more quickly once it was approved.
So, I think that was another area where the overlap allowed them to go more quickly.
Rosemary Pennington
And then, you know, the rest of my question was just sort of the messaging around that, right? And so, I think, you know, hearing that and having followed that as it was happening—obviously, this makes sense. Obviously, no one is going to release a vaccine that's unsafe at this scale.
But it did feel to me, as a consumer of media, that the reasons we could move so quickly were not communicated as well as they might have been. And I guess I wonder what your read on that is.
Jeffrey Morris
Yeah, I can see that. Yeah, I think it probably could have been better.
For my own sake, as I was following it as it was going, I heard them saying this, so I kind of nodded my head and was like, okay, that's how they're speeding this up. But I was still concerned—okay, are they going to miss something? Is something sloppy?
They probably could have done a better job. But I actually think the bigger problem with communication, and maybe some of the things that hurt credibility and trust among some of the public, was more not acknowledging the limitations of what they knew at that point.
There were some things that were very well established from that initial trial, but there was still a lot of uncertainty in terms of more severe outcomes—how well it was going to prevent severe or fatal COVID, how long the protection would last, whether it would wane, how well it would work with future genetic variants.
And then also, even though the safety data was promising, one thing I emphasize in a white paper I recently wrote is that the phase three trials are never sufficient to really establish and detect all of the potential risks. Because with something like vaccines, even very rare risks are important to know about. A study of 20,000 per arm is not going to detect a one-in-100,000 risk.
So, I think there was uncertainty about whether there may be some risks from these vaccines, especially with it being a new technology that we didn't know about yet.
I think a lot of those nuances and limitations were not communicated, perhaps in the name of trying to encourage more people to get the vaccine—which may help public health and might be a good thing. But if it was done in a way that was not transparent about the uncertainties, limitations, and risks, I think that's where there was potential for it to backfire. And I think it kind of did.
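Morris's point about trial size and rare risks can be made concrete with a quick back-of-the-envelope calculation (an illustration with hypothetical numbers, not figures from the episode): with a true adverse-event rate of one in 100,000, an arm of 20,000 participants will most likely see zero such events.

```python
# Back-of-the-envelope check of why a phase three trial cannot detect
# very rare risks: with n participants per arm and a true adverse-event
# probability p, the chance the trial observes NO events is (1 - p)^n.
def prob_zero_events(n: int, p: float) -> float:
    return (1 - p) ** n

n = 20_000        # roughly the size of one arm of a COVID vaccine trial
p = 1 / 100_000   # hypothetical rare-risk incidence

print(f"expected events in the arm: {n * p:.1f}")                   # 0.2
print(f"chance of observing zero:   {prob_zero_events(n, p):.0%}")  # 82%
```

So roughly four times out of five, a risk of that size would leave no trace at all in the trial, which is why post-authorization safety surveillance matters.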
Rosemary Pennington
That does feel like a kind of perpetual problem when it comes to science communication of any kind—figuring out how to communicate the issue of uncertainty well and explain what that means in a scientific sense versus what it might mean in a non-scientific sense, right?
And it feels like with COVID that issue—how to communicate uncertainty well in a way that doesn't produce fear and skepticism but is honest—just sort of came to a head in this very urgent crisis moment.
And it felt like it was a really potent example of that struggle and the need to figure out how to do that well.
Jeffrey Morris
I think that's well said. I think you hit the nail on the head.
Because I think this is always true. Science is always nuanced. There's always uncertainty, and a lot of times that gets lost in the public messaging. People who aren't trained to think as scientists—I think it's hard for them to manage the uncertainty. People want to think a little more black and white: the vaccine works, the vaccine is safe, the vaccine stops transmission, the vaccine will end the pandemic. People want to think that way.
But the nuances are always more complex. And we scientists like talking about the nuances, but sometimes we bore people, or sometimes we confuse people. And that's why the messaging is difficult.
That's part of why I've gotten involved a little bit in scientific communication and working with media, because I understand the need to simplify the message. But I think we need to do a better job of retaining more of the nuance, more of the uncertainty, and maybe trusting the public to understand a little bit more complicated things than we sometimes assume they can.
That's part of the teacher in me. I sort of take it as a challenge to say, okay, this is complicated, but I'm going to get this person to understand it. I think they can.
Because I think we can relate complex principles in ways that resonate with people and help them understand. So that's where I think we can educate the public a little more about some of the scientific—even statistical and epidemiological—nuances, so that they can recognize when things are misinterpreted or understand the nuances a little better.
So that's something I've tried to do. I don't know—I probably am trying to push it too far in my efforts. But overall, I think that's something we need to figure out how to do better.
We have a very educated, informed society, and people can handle being talked to as adults. We don't need to talk to them as children.
John Bailer
You know, just as you were talking about that, I found myself thinking about different types of literacies—whether it was scientific literacy, data literacy, statistical literacy. You pick your favorite one.
And the idea—you mentioned this question about how long a vaccine's effectiveness lasts—I can well imagine the confusion of saying, “Oh, I was vaccinated for this as a kid. Now I don't have to do it again,” or “I was vaccinated recently, and I don't have to do it for another seven years.”
And the idea that you're starting out with something for this recent disease and you go, “Well, you know, I have to do this again in six months, right?” There are these kinds of challenges to try to process the fact that with certain diseases and certain kinds of interventions for them, they don’t all just work the same.
So that’s a hard story to tell.
Jeffrey Morris
I think it is, because I think people think vaccine—yeah, it's something you get as a child, you have lifelong immunity, and you're not going to get the disease.
And that's a little bit simplistic. I think for certain types of viruses, even if you get infected—like this is true of SARS-CoV-2—if you get infected today, you have some immune protection once you recover. So that's great. But you might get COVID-19 again in less than a year.
So that protection doesn't last the rest of your life. It might only last a year. So why would we expect vaccines to somehow be permanent if the infection itself does not provide permanent immunity against reinfection?
And when we have viruses that mutate quickly and that spread in ways that are very efficient, it’s just really hard to completely stop them and completely prevent them.
We want to mitigate them so they don't kill people, so they don't put people in the hospital, and hopefully so we can slow the spread. But we may never stop the spread. We may never make the virus ever go away.
But maybe that’s okay. We learn to live with all kinds of viruses that are around us. The question is, can we mitigate the serious risks from them?
John Bailer
You're listening to Stats and Stories, and we're talking with Jeff Morris about vaccine effectiveness and countering misleading information related to the use of vaccines.
One question is, why has there been such pushback against vaccines, particularly recently?
Jeffrey Morris
Yeah, it's a good question. I mean, I think that over time there have always been groups of people who have been vaccine skeptics—wondering whether there are too many vaccines or whether vaccines are causing damage that's not being acknowledged.
And I think that's been around for decades. But I think the pandemic really helped that to gain much more support and maybe become more mainstream in some ways and really take off.
So, I think some of the issues we were talking about related to people being upset about the government response during the pandemic—first the strict mitigation measures, and then the vaccines. Maybe people felt like they weren't being transparent about explaining everything that was known and unknown about the vaccine.
And then when there's something like mandates and vaccine passports, where people feel forced to take it when they're not comfortable, that raises the emotion. And I think that contributed to a lot of concern about the COVID vaccines.
I think a lot of people during the pandemic would say, “Look, I'm not anti-vaccine. I support all the other vaccines, but I'm just anti-COVID vaccine.”
But what's kind of interesting is that I think a subset of people who became disillusioned with COVID vaccines have progressed in the last couple of years to being skeptical and concerned maybe about all vaccines—or at least a larger class of vaccines.
And I think a lot of that has naturally come out of some of the dynamics of the pandemic.
But then I think also with RFK Jr., who has really spent the last couple of decades as kind of a crusader in some ways regarding potential vaccine harms through the Children's Health Defense organization—when he became the Secretary of Health and Human Services, I think that's again brought these ideas more into the mainstream of the current administration and for a large part of the public.
And then some of the changes we've seen going on are continuing to raise these questions.
So, I think that in some ways these ideas have become connected to certain political groups and political viewpoints, so that those people will now look at these ideas in a different way and perhaps lend more credence to them.
So, I think all of those dynamics together—but I think the pandemic definitely played a role in helping these ideas grow and take root in more people.
Rosemary Pennington
You have an article in the American Journal of Epidemiology examining some of the hot takes and fallacies that were circulating in relation to COVID-19 vaccines and vaccine effectiveness.
As you're talking about this and sort of the political figures that are pushing a kind of worldview around vaccines, I wonder what your thoughts are. Because at the same time, social media has become this very potent space where myths and narratives and ideas get spread really widely.
Often, a misguided take on something can go viral even as people try to correct it. And I wonder how much of a problem social media is when it comes to trying to communicate about vaccine effectiveness and sort of mitigate some of these more vaccine-skeptic views.
Jeffrey Morris
Yeah, no, I think it's a big thing. And I think social media is a big part of it, but I think it's even broader than social media.
If we think back historically—I'm a Gen Xer—and as I grew up, we had like three channels on TV. You had the news there, and you had your local paper. You didn't even have USA Today when I was a kid. So, everybody was getting the same news. We'd have different views, but we were all seeing the same things.
Whether it was complete or the full story, people could question it, but we were seeing the same thing.
Over time, the internet and social media made all this information available. Now AI gets to know who you are and, for marketing purposes, gives you what you want to hear. So, it really just exacerbates the polarization and creates these enormous echo chambers where people get fed information that the algorithms think they want to hear.
What that tends to do is give people a distorted sense of reality, because everything they see is reinforcing their view. The things that contradict their views they don't tend to see as much.
So, I think it leads to greater polarization more broadly, even outside of this issue of vaccines. But when it comes to these questions about vaccines, I think it's a tremendous problem.
A lot of people have legitimate questions—like me, especially when they were first approved. And I definitely have some genuine critiques. I see that there are genuine risks.
So, I think in that environment, where there are other people who want to take it much further, it creates a ripe environment for people to make claims that are maybe very dramatic, maybe untrue, but they'll still gain a tremendous following and get forwarded on social media and spread.
A lot of people will tend to believe it because it's reinforcing their doubts.
So yeah, I think that's a huge problem. And honestly, that's kind of what got me involved in social media during the pandemic. At first, I just did some social media posts, but then I created a blog, covid-datascience.com.
That was really my goal—to try to use my expertise as a biomedical scientist and statistician to evaluate the emerging evidence using the data and the emerging studies, and to critically assess them using scientific principles.
I wanted to try to put out reliable information about what the data shows, and as part of that, to counter claims that I think are not supported by the data.
So that was part of my goal—to get into that space, raise questions, and help people look more carefully at the things they're believing and seeing. Does the data really support this claim that you're believing and forwarding?
John Bailer
You know, one of the things that struck me with some of the writing that you've done is how often you'll see that studies are essentially misinterpreted.
Basically, people are saying, “Look, the data shows that the more people that are vaccinated, the more deaths you observe in that community.” So that's there.
But that's kind of a naive assessment, because it's not really controlling for factors or other things that someone who knows something about good study design and analysis would be sensitive to.
Could you talk through at least an example or two where that occurred and how you kind of deconstruct this false argument?
Jeffrey Morris
Yeah, sure. And the big picture of this—and this is what's really tricky with the public—is that sometimes what seems very simple and intuitive and clear is wrong. The truth is more nuanced, but because of the distrust, when someone tries to explain the nuance, it can sound like they're double-talking and trying to cover up what is plainly true.
I think that dynamic makes it tricky. But a lot of very well-meaning, very intelligent people misinterpreted a lot of what we saw. Again, we're in sort of an open data age and an open internet age, so all this data is available, which is great. But that also means that people will do naive summaries, make claims, and then it gets forwarded.
I know there were a couple examples that led to blog posts that I did that kind of went viral themselves. One of them was in the summer of 2021, when the vaccines had been out about six months. There were reports that about 60% of the hospitalized COVID patients were vaccinated. So, people were saying, well, therefore the vaccines can't be working—60%, that's more than half, so the vaccinated are worse off.
When I saw this, Israel had shared this data on the web, and indeed it was true—60% of those currently hospitalized with infections were vaccinated. But when I looked at the data more carefully, we saw that 80% of the adults in Israel were fully vaccinated at that time point.
So, when you actually compute the severe infection rate in the vaccinated and unvaccinated groups, the vaccinated had a 67% lower rate of hospitalized COVID infections than the unvaccinated. The 60% number made it look like the vaccinated were worse off, but actually, as an entire group, they were two-thirds better.
That's something called the base rate fallacy. If you compare two groups and one group is much bigger, then if you just look at the number of events, the bigger group can look like it's doing worse even if its rate of that event is much lower.
But there was something even more subtle there. From what we were hearing, 67% lower risk would actually be very disappointing. We expected something more like 90% at that time based on what we were being told and what some other data showed.
What's interesting is that those results were for all vaccinated people across all age groups. Israel had also split the data into those under 50 and those over 50, and the older people were simply more vaccinated. Over 90% of the older people were vaccinated, while more than 80% of the unvaccinated were in the younger group.
So, if you split the groups into younger and older and compute the severe COVID rate, then in the younger group the vaccinated were about 92% less likely to have severe COVID, and in the older group about 85% less likely.
So, what initially looked like the vaccinated group was doing worse actually showed a strongly protective effect—perhaps even around a 90% reduced risk.
That's an example of age confounding, and even something really subtle called Simpson's paradox.
When I did a post on this—it was my first post that went viral—it was downloaded almost a million times in the day after I posted it. In Italy, Spain, and Germany, major newspapers printed articles about my blog post, because people were wondering how this could be true.
I’ve had a lot of people tell me they use this as an example of Simpson's paradox, because people were really perplexed about the 60%. But when you break it down, even in a very basic way, you can see how people would misinterpret the data.
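The arithmetic Morris walks through can be reproduced in a short script. The counts below are hypothetical round numbers chosen to mimic the Israeli data he describes (illustrative only; the exact published figures differ slightly), and the `efficacy` helper is just the rate comparison he computes by hand:

```python
# Hypothetical round counts mimicking the Israeli data described above
# (illustrative only, not the exact published figures).
# Format: (population, severe COVID cases)
data = {
    "younger": {"vax": (3_500_000, 11),  "unvax": (1_100_000, 43)},
    "older":   {"vax": (2_100_000, 290), "unvax": (190_000, 171)},
}

def efficacy(vax, unvax):
    """1 minus the ratio of severe-case rates (vaccinated / unvaccinated)."""
    (v_pop, v_cases), (u_pop, u_cases) = vax, unvax
    return 1 - (v_cases / v_pop) / (u_cases / u_pop)

# Base-rate fallacy: most severe cases are vaccinated simply because
# the vaccinated group is several times larger.
vax_cases = sum(g["vax"][1] for g in data.values())
all_cases = vax_cases + sum(g["unvax"][1] for g in data.values())
print(f"vaccinated share of severe cases: {vax_cases / all_cases:.0%}")

# Pooled across all ages, protection looks modest...
pooled_vax = (sum(g["vax"][0] for g in data.values()), vax_cases)
pooled_unvax = (sum(g["unvax"][0] for g in data.values()), all_cases - vax_cases)
print(f"pooled efficacy: {efficacy(pooled_vax, pooled_unvax):.0%}")

# ...but within each age stratum it is much stronger: Simpson's paradox,
# driven by older people being both higher-risk and more vaccinated.
for age, g in data.items():
    print(f"{age} efficacy: {efficacy(g['vax'], g['unvax']):.0%}")
```

With these numbers, the vaccinated make up 58% of severe cases and pooled efficacy is only about 68%, yet within each age group the reduction is 85 to 92 percent, which is the pattern Morris describes.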
John Bailer
So this idea of breaking out of echo chambers and using the scientific method and testing what you believe—these are all really important messages if you're going to advise the public to be more critical consumers of claims.
Do you have a piece of advice or two you might share?
Jeffrey Morris
That's a good question. I think it depends on the context, because there are a lot of potholes.
One thing that I really emphasize in my own communications is that observational data makes it very hard to draw causal conclusions. When you see observational data, you have to be very careful.
What looks like a simple summary—and something that seems very intuitive—can be completely wrong, and you might draw the total opposite conclusion. It can be difficult and subtle.
So, my biggest warning is to be cautious about that.
In some of my blog posts—and now I've shifted more toward posting on Twitter/X—I try to highlight patterns I see over and over again so that people who follow me can recognize them on their own.
I think there's been some success with that, because I see people raising these concerns themselves and saying things like, “Oh no, this is Simpson's paradox,” or “Oh no, this is the base rate fallacy.” So, I think it helps a little bit.
John Bailer
That’s all the time we have for this episode of Stats + Stories. Jeff, thank you so much for joining us today.
Jeffrey Morris
Thank you very much for having me.
John Bailer
Stats + Stories is a partnership between the American Statistical Association and Miami University’s Departments of Statistics and Media, Journalism, and Film.
You can follow us on Spotify, Apple Podcasts, or wherever you listen to podcasts.
If you’d like to share your thoughts, email us at statsandstories@amstat.org or visit statsandstories.net.
Be sure to listen for future editions of Stats + Stories, where we discuss the statistics behind the stories and the stories behind the statistics.