Epidemiological BRIDGE Building | Stats + Stories Episode 240 / by Stats Stories

Dr. Sandra Alba is an epidemiologist at KIT Royal Tropical Institute in Amsterdam. For the past 15 years, she’s been applying statistical and epidemiological methods to evaluate public health programs in low- and middle-income countries. Her research focuses on data quality and good epidemiological practice, more specifically the interplay between research integrity and research fairness in multi-disciplinary international research collaborations.

Dr. Susan Rumisha is a Senior Research Officer at Telethon Kids Institute and a biostatistician working in the field of public health and infectious disease epidemiology. Rumisha works on the Malaria Atlas Project and has over 15 years of experience in designing and conducting malaria and health system research. Her interests include applying advanced and modern statistical approaches to data from surveys, research, and routine health surveillance systems to generate evidence to guide decision-making processes in public health practice, policy formulation, and health systems performance at national, regional, and global levels.

Episode Description

Health research is complicated, no matter the scale or the scope. Global health research, however, brings with it particular issues. For the last decade, researchers in epidemiology have been pulled between issues related to research integrity and research fairness. Bridging the two is the focus of this episode of Stats and Stories with guests Sandra Alba and Susan Rumisha.

+Full Transcript

Rosemary Pennington
Health research is complicated no matter the scale or the scope. Global health research, however, brings with it particular issues. For the last decade, researchers in epidemiology have been pulled between issues related to research integrity and research fairness. Bridging the two is the focus of this episode of Stats and Stories, where we explore the statistics behind the stories and the stories behind the statistics. I am Rosemary Pennington. Stats and Stories is a production of Miami University's Departments of Statistics and Media, Journalism and Film, as well as the American Statistical Association. Joining me is regular panelist John Bailer, Chair of Miami's Statistics Department. We have two guests joining us today. The first is Dr. Sandra Alba, an epidemiologist at KIT Royal Tropical Institute in Amsterdam. For the past 15 years, she has been applying statistical and epidemiological methods to evaluate public health programs in low- and middle-income countries. Her research focuses on data quality and good epidemiological practice, more specifically the interplay between research integrity and research fairness in multidisciplinary international research collaborations. Joining her is Dr. Susan Rumisha, Senior Research Officer at Telethon Kids Institute and a biostatistician working in the field of public health and infectious disease epidemiology. Rumisha works on the Malaria Atlas Project and has over 15 years of experience in designing and conducting malaria and health system research. Her interests include applying advanced and modern statistical approaches to data from surveys, research, and routine health surveillance systems to generate evidence to guide decision-making processes in public health practice, policy formulation, and health systems performance at national, regional, and global levels. The two are joining us today to talk about the BRIDGE guidelines, designed to help bridge research integrity and research fairness in global health epidemiology.
Sandra and Susan, thank you both so much for joining us today. I'm going to get started with, actually, I think the question John was going to ask, but simply so we know what the BRIDGE is about and why it matters: could you describe what research fairness and research integrity are in your field, and why this intervention is necessary?

Susan Rumisha
When we talk about research integrity, as the word itself stands, integrity is about how you do your work: how do you work in a manner that someone can trust what you have done? And even you yourself, you feel confidence in the work that you have done, in a way that you are ready to share it with other people and to spread it widely. So it covers the entire range, from when you're designing your work, like the methodology you use, up to the time that you have the results. So things like honesty: are you honest? Can you honestly talk about your work? Is it of a good quality? Did you actually think carefully when you did the design, and when you implemented it? And all the time you have to think: can someone actually repeat this work? The replicability of the work, and this is a big thing in epidemiological research. And can everything that was done be talked about in a transparent manner? As you're communicating it openly to the audiences, to the end users, can you speak of the things you have done in an open manner? But it also includes the piece of professionalism, because as you're working in research you are involving people, and we need you to respect those who are involved in the work. So that connects the two sides, between doing the work and those who are working with it. But the last piece about integrity is that you have to feel that you are accountable for the work, and that accountability kind of closes the integrity side. I'll leave it to Sandra if she has anything to add, and then she can talk about fairness maybe.

Sandra Alba
Yeah, no, I think you summarized it very well, and those are concerns regarding integrity that we have a day-to-day relationship with. But indeed, as global health epidemiologists, another important concept for us to take into account is research fairness, which has gained increasing importance over the past years. It's very much aligned with calls to decolonize global health, because there's a recognition that the wealthiest and best-equipped institutions have a significant competitive advantage over other institutions, and that can create power imbalances, which can then in turn really affect the integrity of the research. So the two are definitely linked. But research fairness is really about, yeah, bringing research home in a certain way. It's really ensuring that there is local ownership of the research, that it's part of a local research agenda, and that the benefits are then found, again, locally, in the context where the research is being done. So in a way one could maybe summarize it like this: research integrity is about really aiming to have scientifically rigorous research, and fairness, from our perspective, is really about being socially impactful. You want to have the impact in the right place, and the right place is where the research is being done. It should benefit those populations there.

John Bailer
You know, this is great, this is fascinating work, and it's fun to be able to talk to you about this. It might help just to step back and give us an example of the kind of problem that's investigated in global health, to help set the stage and give context beyond the general framework that you've started with. Let's drill down a little bit. I know you gave a really nice hypothetical example as part of the Significance piece that you wrote, but I'm wondering if you could give us a different example that might help us gain some insight into this. Share a story with us. We like stories here.

Susan Rumisha
I like the summary that Rosemary put together, and I think it has somewhat of a big story of myself in it. I'm working with the Malaria Atlas Project now, so it's a global level where I am now. But I started 18 years ago, working at the National Institute for Medical Research in Tanzania. That was the early 2000s; I was very naive, coming in as a small, very energetic person in data analysis, and you love data, and you just love statistics. When you're working at the country level, you are actually very close to what we call the end users. You're really close, because they are there, you see the problems and you think things through. Sometimes in meetings you can even swap roles: you become a researcher, or you become a community member, and you roleplay so that you can design your work well. Then I grew; I became a regional researcher, and now you have multiple countries coming into play. So you're growing, and then I went international, well, to the global level. I'm going to connect that to understanding this integrity and fairness. As you're growing, you are learning. You understand the global context, but something is disconnecting you from the end users, because as you move to the bigger picture, you start to see things at a global level, and you start to slowly ignore things which are actually very important, and later we'll talk about them in our criteria. When you come to the international space, you are stretched even further. And sometimes you don't choose to disconnect; there is a setup in which you are disconnecting.
So with this integrity and fairness and everything, I see myself, for example, as a victim of that, because you now have to find a way that will still, as Sandra was saying, maximize the impact at the very local level where the users of that research are. You want to be honest, you want to be of good quality, you have to be accountable. And so in the work that I do, without mentioning specific examples, I see the need for some of the things that we're going to discuss with the BRIDGE, the need of having that way of standardizing things. As I was working, I was sharing with my colleagues that I was challenged: for example, in clinical research they have what they call good clinical practice, and before you take your feet into any clinical work, they will train you on how to do good clinical practice. But we don't have something like that in good epidemiological practice. So these are the challenges that have been turning in our heads: can we challenge the epidemiological side of research to stretch ourselves? Maybe we might not be able to reach the level of the clinical world, of clinical research, but we can get there. So without giving specific examples, we have something that we are already looking at, we have seen it, and we want to get to that standardization, in a way that things can be done everywhere in the right way. We can trust each other, whatever your privilege or your advantages and disadvantages, and you can trust that the product has come from good epidemiological research.

John Bailer
So let me just do a quick follow-up. Thank you. What I'm hearing you say is that it's kind of pushing back against this loss of connection to the people most directly impacted by some of these particular health concerns; that as you go to greater levels of aggregation, you start losing some of that connection. And it seems like this is, in part, pushing back and reacting to that. Is that fair as a summary?

Susan Rumisha
Yes, it is fair. And that's why we think that once we have a guideline, you will know that, okay, if I feel disconnected, I have something that I can hold on to. When you have a guide, it helps you to connect. But if you don't have it, you can do it either way, and as long as you think it is right, you do it, because there's nothing standardizing how you should do the work. And that's what brings our BRIDGE into play.

John Bailer
This was really interesting to think about, and you have me thinking about terms that I'd never heard before. One was the idea of parachute studies, and the other was ethics dumping. And I'm wondering, and maybe Sandra, I don't know if you want to take this: what are the concerns that are encapsulated in those phrases?

Sandra Alba
Yeah, parachute research is what happens when you have people who don't know the context at all, who are suddenly parachuted, or actually parachute themselves, into a context to conduct research. And I think Susan can probably think of more parachute researchers than she wishes she could, because of course she must have seen it happening in Tanzania. It's an issue in global health. Going back to what global health is: if you look at the definitions, it's about studying the causes of morbidity and mortality across geographical boundaries, with a focus on health equity and also health promotion. Those are a bit the keywords in global health. So in theory, it's about the diseases that cross boundaries, and actually this is where Susan's work on malaria is really an example of global health research, because malaria is exactly one of these diseases, right? It crosses all boundaries. And I think her career path that she mentioned really makes sense: first working at the national level, then at the regional level, because this is a regional problem in Sub-Saharan Africa, but then also beyond; in that sense, it's really a global health problem. But what's also happening in global health is that there are researchers from many different contexts who work in many different contexts. And one problem that we have is when researchers who don't know the context in which they study start doing research somewhere, and that's when we really see this disconnect that Susan is talking about happening.
And the direction in which it normally happens is researchers from a high-income country, somewhere in the US or Europe, who are pursuing, you know, maybe postgraduate studies, PhDs, whatever, who then for one reason or another end up doing research in a low- or middle-income country, like somewhere in Sub-Saharan Africa, in Tanzania, whatever. And, yeah, they may not know the context, they may not really know what the data users need, and they will start doing the research that they think is right. But then they may actually continue pursuing that research, and that's really further opening this gap that we see between what the data users need and what the research is actually producing. So that's really the problem with parachute research. And I think it's good imagery: you can really imagine this researcher who's parachuted, you know, into a context and has no idea. So that's a problem, and one which, thankfully, is receiving more attention. In some ways, the BRIDGE guidelines want to try to address that, and that's why there's a lot of emphasis on really recreating that connection with the local decision makers that Susan was talking about, which, as she was saying, was really standard practice for her when she was working in Tanzania at the local level. And yeah, that's really what the guidelines aim to do, and what, of course, with Susan and other colleagues, we've really tried to infuse within the guidelines, because we see it as a really important aspect of doing global health research.

Rosemary Pennington
You're listening to Stats and Stories, and today we're talking about global epidemiology with Sandra Alba and Susan Rumisha. I wondered if you could maybe now walk us through what those BRIDGE guidelines are. I think you've mapped out really well what these issues are, and I love the visual of the parachute researcher. In journalism, we also have this phrase of someone who sort of parachutes in to report a story and parachutes out, so I was like, oh yes, I know exactly what that means. But maybe now we can talk about what these particular guidelines you have are suggesting researchers should follow, and why they are the ones we should be paying attention to.

Sandra Alba
So what are these guidelines? They are good epidemiological practice guidelines, and they consist of six standards, which cover the six major steps in an epidemiological study. It starts from study preparation, then protocol development, then data collection, data management, data analysis, and then reporting and communication. For each of these steps there's a standard, which explains a little bit what you should aim for if you want to go through that step with both integrity and fairness, because again, the reminder here is that we tried to encapsulate both the concerns of doing a study with integrity and with fairness. And then within each standard there are a number of criteria; we're trying to take you step by step through each stage of conducting a study. They really act as a bit of a checklist reminder. It's presented as a checklist, but the idea is really a bit of a reminder of what you should keep in mind, because we're also aware that we want it to be a useful guide. We don't want it to be something that, you know, you just go through as a checklist, check, check, check. It's really to encourage you to think of the different things that are important at every stage of the study, and hopefully to encourage a dialogue within the study team about what needs to be done at every step.

Rosemary Pennington
I was looking at this blog post that the two of you wrote, where you were also breaking down not only how you're trying to bridge integrity and fairness, but also the different players, or things, that are at play. So you've got bridging academia and practice, bridging disciplines, and bridging the power gap that you write about in this blog post. I thought it was so useful to also think about this stuff.

Susan Rumisha
Yeah, so just to add on to what Sandra was saying: the blog was another one of our strategies to really push these BRIDGE guidelines to the audience, and it was published in a series on equitable partnership. It fits very well; we love that series, and we thought, oh, this is the one, this is where to go. So, as Sandra was saying about the emphasis that we are putting into this, one of the places where we see that we need a bridge is between the researchers themselves and the practitioners; we call it academia and practice. The researchers, they are these people at different levels: they do work, they have the skills, they have the technology, they have the resources, and sometimes they think that a problem they see is the problem. But sometimes you will find, if you want to have the most impact, to maximize that impact, that fairness that we talked about, you need to involve the end users. That is the practice part of it. There we mean the local communities, that could be one example, or it could be your policymakers. Because sometimes we leave them out until the end; then you come and you start bombarding the policymakers and pushing them to make a decision, but you forgot completely about them as you were doing your research work. These are the people who later will do the work; if you did bad work, they have to struggle to make it go into practice. So we bridge that, because that will actually ensure, and Sandra already spoke about this, that you will answer the right questions that are of interest to the public, and that you are actually working in the right context for the problem you are working on. And then we expect it to have the most impact. Another one is the disciplines, because there are two things. Sometimes you have quantitative research that you can express in numbers. That's beautiful. But what's behind these numbers?
And you have the qualitative, which talks about stories. Some people are just talking about things, but what's the magnitude of this, and what are people's perspectives about things? So there we used the example of, say, quantitative people like us statisticians, and the sociologists and anthropologists; we want to bridge those, and we are emphasizing so much, in the design of the criteria included in the BRIDGE, that they should come together, because they are complementing each other to understand the full story of what is actually happening. Then the last is the power gap. We don't always talk about that, but we know there are imbalances. A lot of people might just be less advantaged because they don't have resources, they don't have the technology, they don't have the expertise; but it's not that they don't have anything, it's the measure that people are using to look at them and scale them. So whatever is there, at whatever level, this can be bridged so that we benefit from each other, because you can have the money, but a local scientist could have the local understanding, and your money might not be applied in the right way if you have not managed to bridge the two. So that power gap, it has to be bridged. And the thing that we try to emphasize is that we promote dialogue. In that blog, we talked about dialogue; we had a phrase from Kiswahili, that you have to build bridges, because this is the local way that communities used to bridge to each other. Dialogue, dialogue, dialogue: talk to people, involve people, so that you can select the right stakeholders, the right people to work with, the right means of doing the work. And we think these guidelines will give you a way to do that, step by step. So I think that's the summary of the blog, where we tried to summarize our BRIDGE guidelines.

John Bailer
You know, one of the things I thought was kind of neat when I was looking through the guidelines and the checklist that you have was that, for example, in standard three, on data collection, the very first item is "use valid and reliable research instruments." And I thought, you know, that's what everyone would expect from any study; that's sort of a standard thing. And then the very next point is "ensure that research instruments are locally adapted and culturally appropriate." I thought, ah, okay, that's the push-pause. You're flowing through this and saying, oh yes, of course, everyone does that; wait, but they don't necessarily do the second part. And I thought that was a really interesting aspect: taking what a researcher might expect reading through this, but then saying, don't forget about these various levels, like you've mentioned, don't forget that you have this connection to the end users, don't forget that this is a process in partnership. That was nicely done, how that was built in. And, you know, there are other things that you clearly worked hard on to find ways to infuse. I'm curious about how this has been received, or how people are using some of these guidelines in work that they're initiating. Maybe it's how you're using it, but in general, what's the uptake of these guidelines?

Susan Rumisha
Let me give an example about validity and instruments that are locally adapted. While I was still at the national level, doing research, we had collaborators from the other side of the world, Western collaborators, and we were working together, designing a study and starting to build instruments. Some of the instruments were just being adapted from different studies; sometimes you don't need to design a new tool, you can just adapt one from another survey. So we received these questions, and we were looking at this list of questions that we were supposed to go and ask in the community, and you look at some of the questions like, wow, this question, even me, if I now put myself in the community's place, I won't be able to respond to this question. Because in the context of this country, you can't ask this question. It could be the way the question is phrased, or the information it is trying to capture; you can't ask a woman that question. But wherever that question was adapted from, in that culture, it was okay. So now we go through this discussion: with this question we have to do one of two things. Either we rephrase it, go around it, maybe find a way to capture the information, get the sociologist to ask this question in a different way, or we drop it. And sometimes you manage to drop questions, sometimes you don't; the pressure is too much, and you retain some of the questions. Then you go to the field, and you know what happens? They come back all not answered, or "I don't know," "I don't know." It's not that they don't know; the interviewers couldn't ask that question. And sometimes even the interviewer says, how can I ask this elderly woman this question? Culturally it's not acceptable. So these are the things we try to capture: okay, valid and reliable, that's good, but the instruments also have to be appropriate.
So these are just examples. I've been through that.

Sandra Alba
And the two are linked, right? Because then you have bad quality data. In the example Susan has given, you then don't have data. But in other cases, you don't know: maybe the researchers found a way around it. The field workers, you know, they're all very entrepreneurial, resourceful people, and they might be finding ways around it, asking the question in a way that they feel is appropriate. But then everybody has a different way of dealing with this, and in the end, an even worse scenario than the one Susan mentioned is that you have answers, but they're not actually valid, because they're not really answers to that question; each field worker, each interviewer, changed it in a way that they felt was necessary, and you have absolutely no knowledge, view, or handle on this. So it was important that we were really, really aware of these issues, and I think that's why they're really infused everywhere, because in the end, these guidelines were developed by people who do the research and who know about this. And the two are connected, because in the end, if you don't have a locally adapted tool, you also don't have a valid tool. And that's maybe one of the things that we struggled with a little bit in the guidelines: we tried to make them very linear, step by step, but in the end it's not so linear. These two things, we separated them as an integrity issue and a fairness issue, but in the end it's all kind of together. In that sense, for us it's important that the guidelines are used more as an aid, something to carry along with you, to remind you, to keep looking at regularly to refresh your mind. Because, yeah, it's a bit circular.
A study isn't so linear, step by step. It's not: first I check if it's technically sound, then I check if it's reliable, then I check if it's culturally appropriate. No, it's all together. And then in the end, it's the piloting that will also tell you if the tool is really working.

Rosemary Pennington
Well, that's all the time we have for this episode of Stats and Stories. Sandra and Susan, thank you so much for joining us today. Thank you. Thank you. Thanks again. Stats and Stories is a partnership between Miami University's Departments of Statistics, and Media, Journalism and Film, and the American Statistical Association. You can follow us on Twitter, Apple Podcasts, or other places you can find podcasts. If you'd like to share your thoughts on the program, send your email to statsandstories@miamioh.edu or check us out at statsandstories.net, and be sure to listen for future editions of Stats and Stories, where we discuss the statistics behind the stories and the stories behind the statistics.