This blog by Innovations for Poverty Action summarises discussions from a researcher gathering on measuring women’s empowerment in impact evaluations. Read the summary blog on the IPA website.
Painting a full picture of women’s empowerment in impact evaluations using surveys alone can be challenging. Qualitative methods can help researchers better understand a program’s impact on women’s lived experience and identify reasons why a program worked or didn’t work, but economists rarely have incentives to incorporate robust qualitative methods into their research.
In this post, we feature an interview with Dr. Sarah Baird, an economist and Associate Professor of Global Health and Economics at George Washington University. Sarah uses field experimental methods to understand what works to improve the lives of young people, particularly adolescent girls in Sub-Saharan Africa. She shares her thoughts on how qualitative methods can complement quantitative methods, drawing on her experience as part of an interdisciplinary research team conducting a multi-country longitudinal study on gender and adolescence. She also discusses the barriers to conducting more mixed methods research in the social sciences.
Nellie: Today we’re talking about mixed methods and, in particular, integrating qualitative research into impact evaluations. Could you start by describing some of the limitations of impact evaluations that qualitative methods can address?
Sarah: The first limitation that always comes up is our ability to understand mechanisms. We find that a cash transfer program increases educational outcomes, but what barrier exactly did it address? Did outcomes improve because girls can now afford a uniform, or because households can now hire labor and afford to send the girls to school? While we do try to have some of those questions in quantitative surveys, they’re still pretty blunt. I think qualitative methods help to uncover some of those mechanisms.
In addition to the mechanisms, qualitative methods help us understand the nuance of why a program works in a specific context. As you know, impact evaluations get hammered for their limited external validity. Qualitative methods can help us think about why a program may or may not work at scale or in a different context.
Nellie: Right, and I think unintended consequences probably fit in with context. Is that something that’s come up for you as well?
Sarah: Well, we can capture some unintended consequences on the key outcomes we’re measuring quantitatively. So if somehow our program increased sexual behavior or increased drop-out, we’ll see that. But when these findings pop up quantitatively, qualitative methods can help to better understand why that was.
Nellie: And that can be crucial to how and whether evidence is used to inform policy.
Sarah: Exactly. If the NGO or the government scaling up the program doesn’t understand the rationale for why you found what you did, it’s going to be harder for them to take those lessons and make sure they’re implemented properly.
Nellie: And given these potential benefits, why do you think so many economic evaluations don’t include a robust qualitative component? And how do you distinguish robust and non-robust qualitative methods?
Sarah: One of the main reasons is that many economists have a pretty limited understanding of all the different qualitative methods that exist and how to actually conduct qualitative work in a robust manner—in terms of the interview guides you design, how you sample. We think it’s really important to do these things rigorously in quantitative work. We’re not going to just say, “Oh, let’s sample 50 people and do a survey.” We know that’s not the right thing to do. And yet in qualitative methods, that’s kind of what we’re doing. I do think we learn some important things, but if you get serious qualitative researchers to look at what we’re doing, they’ll just shake their heads, the same way we would if we saw someone doing quantitative methods with limited training.

Then, in a world of limited resources, say you ask me whether I want to spend more money doing a robust household listing versus adding a qualitative component. Even I, who fully appreciate mixed methods, will put the resources toward making sure the quantitative work is as good as possible, given that kind of trade-off. And it’s going to be hard to get economists out of that mindset. In the same way, qualitative researchers may say that we need to put the resources into the qualitative work, and then we can add the quantitative component to get the big picture. It comes from the many years of training we have in our disciplines. But learning from researchers on the cutting edge of qualitative methods, beyond focus groups and in-depth interviews, has been eye-opening to me. That’s where you might get more economists excited about the value of qualitative research, and where you might see us budgeting for robust qualitative work and bringing on a qualitative researcher.
Nellie: So there’s a lot that goes into incorporating robust qualitative methods—the resources, the time, having a co-investigator who actually knows what they’re doing. Tell me about a study you worked on that you think could have benefited from better integrated qualitative components. What was missing? Was it the resources? The co-investigator?
Sarah: My best example is a cash transfer study in Malawi that I worked on right out of grad school. We used qualitative in-depth interviews to elucidate what people understood about the program and how they understood their treatment arm in comparison to somebody else’s. Those interviews became footnotes in the paper to address referees’ critiques. Economists typically view qualitative research as a way to fill in those little gaps in the quantitative analysis.
We also applied for funding to analyze the qualitative data more rigorously, but the funder came back with funding just for the data collection. I don’t think they fully appreciated that qualitative analysis is a totally different skill set than what our team of economists could offer. Now we have hundreds of in-depth interviews sitting on my computer that no one has analyzed yet because of a lack of funding. That’s a shame.
If I could go back, I would bring in an expert on qualitative methods to design an integrated study using their state-of-the-art methods along with ours. Even if you have the funding, the challenge is in the publication at the end. There’s still a pretty limited audience for that type of work, and if you’re just out of your doctorate and aiming for tenure, you want to publish in the best journal in your discipline. It’s hard to spend significant resources on qualitative work that’s going to be left out of the paper, because it just won’t be able to get into the outlets that you’d like.
Nellie: So, what could journals do to encourage more inter-disciplinary research?
Sarah: Right now the most common way around this issue is to publish two versions of the paper—one on the qualitative findings and another on the quantitative. That’s a good first step, and from a policy perspective, if a colleague and I then present to the government together, we can also convey the nuance of the results.
But most of the academics reading one paper are not reading the other. From that perspective, we might learn a lot more by integrating the findings. I’d like to see a journal that development economists, as well as demographers, sociologists, and others doing qualitative work, all respect to develop a reputation for publishing excellent mixed methods work. There could still be papers on the quantitative and qualitative findings but also a third paper in this journal that looks at the two together. For this to happen, senior researchers who have less pressure on them for tenure would need to be excited (a) to do work that goes to this journal and (b) to read this journal.
Nellie: Let’s then backtrack a moment to what you were saying earlier about the funding for your study in Malawi. It sounds like there was a misunderstanding about what the economists who were working on the study could do in terms of qualitative analysis. What could donors do better to encourage inter-disciplinary collaboration?
Sarah: One idea is for donors to create incentives similar to those for North-South researcher teams. Often grant applications are reviewed by one expert in research methods and one expert in the thematic area, and the methods expert is likely to be more knowledgeable about either quantitative or qualitative methods, not both. In order to fund more mixed methods work, donors would need a more balanced panel of reviewers with expertise in both quantitative and qualitative methods.
Nellie: One argument for putting more focus on mixed methods work is to improve our research on women’s empowerment. Could you speak about why qualitative methods are especially useful in this context?
Sarah: Particularly when we’re working with adolescent girls and thinking about their transition to adulthood, the interventions are very multifaceted. Life skills training in safe spaces, for example, is trying to teach a whole host of skills that affect many different dimensions—economic, psychological, health—of girls’ lives. Quantitatively it’s difficult to define the primary outcomes and the mechanism by which those outcomes are impacted. Qualitative research can unearth the details in these multifaceted programs, especially when you’re thinking about adolescent girls. Is it the effect of spending time with peers or the curriculum developed for the safe space group that led to lower pregnancy rates? And as we said, the mechanism can lead to very different scale-up strategies and cost.
Nellie: And untangling these mechanisms is an important component of the DfID-funded Gender and Adolescence Global Evidence (GAGE) mixed methods study that you’re working on with Nicola Jones, who I’m going to speak with next week. Could you tell me about the study and your role?
Sarah: GAGE is a nine-year mixed methods longitudinal research and evaluation program examining what works to transform the lives of poor adolescents in general and poor adolescent girls specifically. Now we’re finalizing the program and study designs, as well as the survey instruments, and we hope to launch data collection in early September.
This is a good example of a multi-dimensional, multi-capability program. GAGE’s conceptual framework is set up to measure outcomes across six capability areas: economic empowerment; bodily autonomy, integrity and freedom from violence; education and learning; voice and agency; psychosocial well-being; and sexual and reproductive health and rights, health and nutrition. Our programs are designed to influence all of those domains. This set-up makes it clear that we’re going to need both quantitative and qualitative work to get to the bottom of what we’re ultimately impacting and why.
Nellie: In the work you’ve done already, how have the qualitative researchers contributed to the quantitative aspects of the research?
Sarah: It’s been very useful to get the other team’s perspective on what can be better captured qualitatively. For example, in many surveys we ask open-ended questions, such as “What do you like about school?”, and then write down a descriptive answer. These questions are meant to unpack the mechanisms behind our findings. If you find that an intervention kept girls in school, you really want to understand why. Was it changes in how safe girls feel at school? Looking at mechanisms qualitatively instead is a good division of labor. We may not pull all these questions out of the quantitative questionnaire, because with a much larger sample we can talk more generally about the population. But our surveys can stick to what we think are the key mechanisms and leave the secondary and tertiary possibilities to the qualitative work.
Being able to remove these questions from the survey and explore them in qualitative work is important, because when you want to capture outcomes across six dimensions, you very quickly get to the point where you need a three- to four-hour questionnaire to capture everything you wanted to. No one, particularly young adolescents, has that kind of attention span.
Nellie: That’s a really helpful take-away – thank you. To wrap up, if you were speaking to other economists, what do you want them to know about incorporating qualitative methods into their work? And in particular, while we’ve talked a lot about the benefits, could you identify some of the drawbacks or challenges that you think they should be prepared for?
Sarah: One challenge is that good qualitative work costs money, and I don’t think we as economists appreciate that. So if you really want to do qualitative work well and engage with a qualitative researcher, that has implications for the budget for quantitative work.
In addition, because we’re coming from different areas of expertise, we also need to be patient with each other in explaining why we ask the questions that we do and why we think something is important to measure. And I think it’s the same on their end. They have to spend more time explaining the value of what they’re doing qualitatively, why they’re using this method, and how they get at generalizability through this kind of sample size—things that we have a hard time grasping. And of course, the ultimate negotiation over publications and journals also gets complicated.
Nellie: Right. And that brings us back to this barrier to mixed methods evaluations that we don’t really have a good solution to right now. There’s value in trying our best within the existing systems. And this conversation needs to keep happening, particularly among donors and journals and within academic institutions, in order to encourage more mixed methods work in this field.
Sarah: For now though, public health is a good space to move this mixed methods agenda forward. Our global health department is cross-disciplinary by design. Journals in economics, public health, and sociology are all equally valued. As an individual within the economics world, I have my hierarchy of journals, and my ego wants to speak to that world, but as a member of this department, they’re going to be equally happy if I publish in the Quarterly Journal of Economics or the Lancet. This is probably true for public policy departments as well.
As an organisation committed to making data and information meaningful and useful for policymakers, IPA also has the potential to move this agenda forward by fostering a conversation about the need for and encouraging more mixed methods work.