When I started The African Researcher, I wanted to make space for the kinds of conversations I wish I’d heard earlier in my career—stories about what it actually looks like to build evidence in service of important societal issues, work with communities and policymakers, and stay grounded when the work gets complicated. That’s exactly why I was excited to speak with Dr. Karen Austrian, a Kenya-based researcher who leads Population Council’s Girl Innovation, Research, and Learning (GIRL) Centre.
Throughout our interview, Dr. Austrian spoke about research in terms of its practicality—an endeavour undertaken to answer real questions and serve real people. She described the GIRL Centre as a “one-stop shop for data and evidence” related to adolescent girls, with a focus on generating evidence, making it usable, and nurturing the next generation of researchers—especially young women scientists.
What stayed with me most from our discussion was how Dr. Austrian framed what research represents—a disciplined way to seek truth and avoid cherry-picking, to learn honestly (even from disappointing results), and to stay in a continuous loop with the beneficiaries who will actually use the findings.
Dr. Austrian’s Path into Research: Starting with Programs, then Following the Questions
One of the reasons I enjoy interviews like this is that they often disrupt the neat narratives we tell about career paths. Dr. Austrian put it plainly: “I had no desire to be a researcher.” Her entry point into the sector was programmatic and community-based. She first came to Kenya about 25 years ago while studying at university, and later returned to work with adolescent girls in a Nairobi slum.
In that work, Dr. Austrian noticed that adolescent girls were being missed by many mainstream approaches targeting youth. She described how, back then, “‘youth centres’ were all the rage,” but when you entered them, you saw a bunch of men in their twenties hanging out, rather than a space where a 13-year-old girl would feel safe. The design of these youth centres shaped who showed up—and who didn’t.
Dr. Austrian described being frustrated by decision-making that happened without evidence. She said it “bothered” her that money was being spent “without a lot of thought around, what do we know works?” She pointed out how the simplest form of research—documenting who was attending the youth centres, counted by age and gender—consistently showed young men as the typical beneficiary. That kind of descriptive evidence might sound basic, but it is powerful: it changed what could be argued for, and it justified program redesign.
That situation pushed Dr. Austrian to pursue graduate training in public health and reproductive/adolescent health, and, eventually, a PhD “to really progress as a researcher.” She worked with Population Council, first as a consultant and then permanently, conducting research that helped her see how data could make visible what communities were already experiencing and determine which programs would have the most positive impact.
Throughout her research trajectory, Dr. Austrian made sure to grapple with the complexities of programming for adolescent girls. She raised a striking question: if adolescent girls’ outcomes are shaped simultaneously by numerous sectors—health, education, safety, economics, and culture—then why would we expect single-sector solutions to suffice?
The GIRL Centre: Building Evidence and Talent
Dr. Austrian explained that the GIRL Centre’s work sits at the intersection of research and action. On the research side, the Centre aims to better understand adolescents’ lived experiences—especially how those experiences differ by gender. She made sure to point out that adolescent girls cannot be considered a single “category.” As she put it, the Centre’s goal is to avoid treating girls as “a homogenous group,” and instead use evidence to show how needs differ for girls who live different realities; for example, those who are in school vs. out of school, married vs. not married, migrating vs. not migrating, and working vs. not working.
Dr. Austrian also emphasised something that I think many of us in health research wrestle with: research that’s never applied can start to feel like research for research’s sake. She was refreshingly direct about that. In her words, “it is important… not just to do the research, but to make sure that the research can be used by others,” adding, “otherwise we shouldn’t be doing it… it’s a waste of money, frankly.” In our conversation, “use” didn’t just mean publishing in scientific journals, but rather, creating products and strategies to disseminate evidence in accessible language, and convening different actors—such as researchers, implementers, policymakers, and funders—to interpret the implications together.
Dr. Austrian then described a third priority for the GIRL Centre: nurturing talent. She spoke about feeling a responsibility to mentor and “create space” for emerging researchers, particularly “young female scientists.” That commitment felt connected to her broader view that research shouldn’t be treated as an isolated craft—it’s something that’s intertwined with people and systems, and requires a pipeline of capable, confident researchers.
Research as the Systematisation of Knowledge
Dr. Austrian considers it imperative to use research as a tool to answer questions that program designers, communities, policymakers, and donors are actively asking. “I don’t see research uptake as a linear process,” she said. “I think it’s a cyclical process,” where evidence informs decisions, and the questions and gaps that decision-makers face feed back into what researchers study next. For that cycle to work, she emphasised, “everyone has to have a relationship with one another.”
Research, in Dr. Austrian’s words, is “the systematisation of knowledge.” It levels the playing field and helps ensure that conclusions are based on evidence rather than depending on the loudest voice, the most compelling anecdote, or the most photogenic success story.
Sometimes, your research findings reveal exactly what you hoped they would: evidence of program success and lives changed for the better by the intervention you’re studying. That’s the best-case scenario. The reverse can also be true, however, and it puts researchers in an uncomfortable position: findings that reveal what doesn’t work. Dr. Austrian argued that it is “really important to document what doesn’t work just as much as it is to document what works,” even though the sector makes that hard. After all, as she put it—who wants to tell a donor that the project they’re funding isn’t working?
Dr. Austrian shared an experience from her own work: a large, longitudinal randomised controlled trial (RCT) that evaluated an Adolescent Girls Empowerment Program implemented in Zambia. The program was implemented well, and researchers observed improvements in some near-term outcomes such as adolescents’ self-efficacy, knowledge, and savings behaviours. However, the program did not lead to the expected longer-term impacts on outcomes such as education, teenage pregnancy, violence, or timing of marriage. Dr. Austrian described the challenges of what came next: the donor was prepared to fund a major program scale-up, which the team had already begun designing—yet everyone involved decided together that they “had to walk away,” because the evidence didn’t support the level of impact they were aiming for.
Yet the research was not in vain. Dr. Austrian reflected that the Zambia trial became a turning point precisely because it generated clear, credible evidence about what wasn’t enough to produce longer-term outcomes. In the Zambia program’s case, what wasn’t working was the design: the intervention was multi-sectoral, but its components largely sat at the individual level. Girls were offered access to empowerment groups, savings accounts, and health vouchers. That might seem like it should have been effective, but it focused mainly on what girls could do and access themselves. The program didn’t account for important factors outside the girls’ control, such as household economic constraints and the community norms that shape decisions about schooling, marriage, and early childbearing.
Dr. Austrian described how those lessons directly informed program redesign in the next iteration of work in Kenya: the approach stayed multi-sectoral, but became explicitly multi-level. Alongside components targeted at individual girls (like group-based safe spaces), the new model added a household-level economic component (cash transfers tied to girls’ school participation), and a community-level component that engaged adults and decision-makers to shift norms around girls’ education and gender equality.
A Call to Be Bolder
Towards the end of our conversation, Dr. Austrian offered advice to emerging researchers—advice that felt especially relevant to those navigating academic environments that can be rigid. Her message was that “we need to be a bit bold in these times.” She encouraged younger researchers to bring bold new ideas about both the issues we choose to study and the methods we use. Academia can sometimes feel old-school and hierarchical, and younger researchers might feel pressured to prove themselves using conventional topics and approaches.
Dr. Austrian’s point wasn’t that traditional research questions and methods are never valuable; rather, it’s that the next generation has permission—and responsibility—to stretch and break boundaries. To stay rigorous, but also to be innovative, to ask fresher questions, and to practice research as something that’s engaged with the world. In her words, “how do we become researchers who do work in relationship with the world, rather than in isolation from it?”
As I left our conversation, I kept thinking about that cycle she described: evidence flowing into decisions, and real-world questions flowing back into research. If we can keep that loop alive—honestly, humbly, and with a willingness to learn even when results disappoint—then the research becomes more than a series of publications. It becomes a public service.