[Editor's note: This blog was originally written for and published on the DRUSSA (Development Research Uptake in Sub-Saharan Africa) website as a four-part series. It was based on a short presentation to the RIMI4AC conference in London, UK. It has been shortened and reproduced with permission.]
As highlighted at a recent DFID-run event for research uptake practitioners entitled 'Beyond communications', research uptake is a concept that has been evolving rapidly over the last several decades. As such, a variety of challenges face research uptake practitioners and those wanting to strengthen capacity to get research into use. These operate at three levels: systemic barriers, institutional barriers and individual barriers.
The first challenge is probably that the concept of 'research uptake' is little understood. In brief, it focuses mainly on the demand side of research, working to stimulate an enabling environment in which end users of research can commission and find appropriate information to support their own policy processes. This means working closely with key stakeholders, and probably also some sort of capacity strengthening to help them understand and demand high-quality research.
At a more systemic level, the barriers to research uptake are numerous—and have been well articulated by others. In particular, the RAPID programme at ODI notes six lessons for getting research into use. They highlight different time horizons and different notions of evidence between the research and policy spheres. A researcher needs as long as the research takes, and findings are often wrapped in a variety of qualifications and caveats, but policymakers often need clear findings at key points during the policymaking process. As a former policymaker turned researcher from Brazil so wonderfully put it: “In Brazil we don’t talk about pilots. I can’t go to an official and say ‘give me two years and I’ll give you the answer’. Why? Because we have elections. If a policymaker waited two years to take action he’d be shot.”
We also know that the research-policy-practice interface is a complex and dynamic one. Not only do those trying to get research into policy need to know where in a particular policy cycle the research topic sits, they must also understand who is working to influence that process, what their drivers are and how they’re doing it. And unfortunately, one of the corollary barriers is that change doesn’t happen the same way twice, which means a lot of experimentation, expertise and critical thinking are required to link research, policy and practice. This latter point also helps explain why research uptake focuses so heavily on strengthening the demand side—if a policymaker is requesting research findings, half the battle is already won.
This brings in another systemic barrier—research funders distort demand. While research funders aren’t necessarily the end users of research, they are the ones that set the priorities through what they are willing to fund. Often some sort of demonstration that there is a gap in existing knowledge and some demand for the research results is part of the grading criteria when selecting proposals, but the distortionary effect of the donor cannot be overlooked.
There are also perverse incentives on both the supply and demand side that act as a significant barrier to research uptake. On the research side, promotion is most often dependent on publishing as many papers as possible in peer-reviewed journals with high impact factors, not on communicating that research. There are also time pressures—a student knocking on the office door is more likely to capture a researcher’s attention than a policymaker in a capital city far away. And funding is a chronic problem—with researchers perpetually chasing the next grant, who has time to do the “extra” work of communicating research findings? On the policy side, incentives tend to focus on maintaining political legitimacy. That might include carrying through on promises made during electoral processes, adhering to a particular ideological standpoint (and let’s be clear, evidence-informed policy is a clear ideological position too), or just not looking idiotic in the public eye.
Finally, at a systemic level, research uptake requires a diverse skill set to deliver. In addition to strong research skills, Simon Maxwell likes to argue that policy entrepreneurs require four key skills: storytelling, policy engineering, networking and political fixing. But these skills are underpinned by a huge area of oft-overlooked technical skills, including:
- Editing and language skills
- Digital engagement skills
- Graphic design and desktop publishing
- Media planning and engagement
- Event planning and management
- Database management
- Data analysis
- Information literacy
- Knowledge management
- Budgeting and programme management
- Marketing and public relations
- IT skills
Focusing now mainly on the supply side of research at an organisational level, given all the systemic challenges, one of the biggest barriers to research uptake is figuring out where to start and how to institutionalise appropriate systems and processes that support research uptake activities. Does a research institute need a central communications/marketing/dissemination/media relations/knowledge management team? If so, where should it sit? In a grant management office? By itself? As part of the IT department? In the library?
On top of that, where are the capacities for research uptake best placed? Certain skills probably need to remain with individual researchers, but some are probably better supported by an outside team.
Funding research uptake activities is also usually a challenge. It depends on the funding models employed by an individual institute, but many (maybe even most, particularly in Sub-Saharan Africa) institutes lack core funding, and must raise money through projects or other sources. Striking a balance between supporting projects and strengthening institutional engagement is key, and is also hugely difficult. Does a central team get funded out of overheads? Do they try to support themselves through their own projects and research?
And from an institutional perspective, brain drain is always a worry. If organisations invest in building research uptake skills, there’s no guarantee staff will stick around, or that the organisation will continue to benefit from these skills. Researchers may end up in a relevant ministry, for example (though this could turn out to be a good thing for the institution). More centralised teams with specific transferable skills often find themselves poached by the private sector and, in developing countries especially, by international agencies and non-governmental organisations.
Last but not least, at an individual level, the barriers to research uptake are multiple. One of the most frequent objections to research uptake that I hear, and one that I’m hugely sympathetic to, is that researchers must, first and foremost, be good researchers, and that if policymakers or practitioners want to use their work, that is their prerogative. This is a notion we must counter, strongly and with a moral imperative. It is not just the responsibility of policymakers and practitioners to seek out research—it is also the researcher’s responsibility to make it accessible, especially when it can (and does!) save lives.
Another individual barrier is the ego—and I mean this in two ways. At one level, effective research uptake activities require strong brands from strong researchers. This means that researchers must at some level be sure of themselves and of their findings and be confident to take them out into the big wide world. On the flip side, researchers must be willing to accept help and advice and work with others. Just because a researcher, who has more often than not been focusing on a study for a significant period of time, understands the findings doesn’t mean everyone will—EVERYONE needs a good editor, always. Also, given the diverse skills required for research uptake, it’s highly unlikely that any one person knows best.
That may sound like a lot of barriers and I’d hate to leave people thinking it cannot be done. A number of examples of incredibly impactful research uptake activities exist—and they can and have improved and saved the lives of many. It just means there’s some work to do.