
Ten ways to optimise evidence-based policy

13 November 2019

Research


It is time to move away from the ‘government are from Venus, researchers are from Mars’ narrative when it comes to evidence and public policy. Instead we should focus on the challenges both parties experience from different but complementary angles.

At a glance

In a paper for the Journal of Comparative Effectiveness Research, Peter Bragge (Monash University) outlines the following ten ways to optimise evidence-based policy:

  1. Start with the basics
  2. Recognise the big picture
  3. Unpack problems
  4. Get the best evidence you can in the time you have
  5. Complement desktop evidence with information about what’s happening on the ground
  6. Be transparent about the state of the evidence
  7. Choose short-term wins, but invest in long-term when needed
  8. It’s not just about what works
  9. Stay connected
  10. Governments are not just research end-users; they can be research partners

The paper is based on experience in delivering applied behaviour change research to government.

1. Start with the basics

Working across sectors requires shared language. This should start well before the final report. The seemingly simple question, ‘What do you define as research?’, can unpack a range of hidden assumptions. Is ‘data insights’ within a government organisation ‘research’? What is the relative value of ‘grey’ and peer-reviewed literature? Asking and addressing ‘the stupid questions’ optimises the value of collaborative engagement by creating a shared starting point.

2. Recognise the big picture

The line from research to practice is anything but linear, with research evidence being just one of many inputs into policy. Building an understanding of the policy-making process highlights the context in which research-based information is used. Expectations also need to be managed about what research can and cannot deliver.

3. Unpack problems

In a world that demands instant solutions, there is often little time or motivation to question the nature of the problems being solved. Taking problems ‘as read’ carries the risk of wasted investment. Exploring the problem, and the barriers that need to be overcome to effect change, is necessary. Despite this obvious value, such exploration is frequently not undertaken.

4. Get the best evidence you can in the time you have

Review-level evidence is the preferred unit of knowledge to inform policy and practice because it brings together knowledge from multiple individual studies, which reduces bias. Systematic reviews of primary studies are the definitive review type, and are freely available through centralised platforms of systematic reviews.

These platforms carry significant advantages over native internet searching by assembling smaller volumes of more credible and easily navigable evidence. Rapid reviews, which synthesise review-level evidence (i.e., a ‘review of reviews’), enable governments to gain insights into the state of knowledge within weeks.

5. Complement desktop evidence with information about what’s happening on the ground

Research reports take time to publish and are usually not specific to the end-user context. These disadvantages can be mitigated by reviewing practice in addition to evidence. This enables information from research to be triangulated against on-the-ground realities, constraints and enablers from the perspective of groups vested in the issue. There is a broad range of methods for practice review, including one-on-one interviews and focus groups.

6. Be transparent about the state of the evidence

A common misconception about evidence-based practice is that practitioners take no action unless they can draw upon definitive evidence. In reality, knowledge about the state of the evidence, even when it is incomplete, can usefully inform policy development and delivery.

7. Choose short-term wins, but invest in long-term when needed

The rise of the ‘nudge’ movement has created widespread interest from governments in simple, relatively cheap behaviour change strategies. However, nudge strategies generally have modest effect sizes at a population level, and not everything can be ‘nudged’. When a behaviour or the system within which it occurs is complex, long-term commitment is necessary.

8. It’s not just about what works

Another misconception regarding the research-policy nexus is that it is limited to ‘what works.’ Governments need information about the effectiveness of strategies to address pressing public policy issues. But they also need to know about:

  • the nature and impact of the problem
  • the relative cost–effectiveness of different policy approaches
  • the acceptability and feasibility of proposed interventions
  • barriers to implementation
  • how solutions will be organised, governed and funded.

9. Stay connected

It is important to understand and respond to emerging trends. The information age has created a deluge of globally visible conversations and a range of technologies that can track these. The price for such immediacy is decreased confidence in the rigour of information, much of which bypasses traditional peer-review and other checks and balances. In the coming years, new approaches will need to capture and appropriately weight academic, government, traditional, social media and other information streams.

10. Governments are not just research end-users; they can be research partners

Co-production of research is a classic case of the whole being greater than the sum of its parts. Governments have in-depth knowledge of:

  • the big systems that are designed to make life easier for citizens
  • challenges faced in delivering services through these systems
  • system performance data.

Researchers are expert problem solvers who bring research knowledge and skills to systems, challenges and data. It is time to move away from the ‘government are from Venus, researchers are from Mars’ narrative and instead focus on the challenges both are looking at from different, often complementary, angles.

Want to read more?

Ten ways to optimise evidence-based policy – Peter Bragge (2019), Journal of Comparative Effectiveness Research, Published online: 7 November 2019

This brief is part of a Research Series written by Maria Katsonis. This research brief originally appeared in The Mandarin as part of The Mandarin and ANZSOG’s 2019 Research Series called The Drop.

