Impact and Insight: What we know

Time (GMT+01): 14:00–16:00


Phil Davies, Deputy Director, Government Social Research Unit, Cabinet Office

Fred Carden, Director of the Evaluation Unit, International Development Research Centre, Canada

Matthew Quinn, Head of the Strategic Policy Unit, The National Assembly for Wales


Sandra Nutley, Research Unit for Research Utilisation (RURU)

The first meeting in the series aimed to explore what we already know about the research/policy/practice interface, to identify the important gaps in our knowledge, and to identify themes and approaches for subsequent workshops. The speakers were Phil Davies, Deputy Director, Government Social Research Unit, Cabinet Office; Fred Carden, Director of the Evaluation Unit, International Development Research Centre, Canada; and Matthew Quinn, Head of the Strategic Policy Unit, The National Assembly for Wales. Each speaker spoke for about 20 minutes, followed by discussion. At the end of the workshop, Sandra Nutley facilitated a discussion about possible topics for future meetings.

Phil Davies talked about lessons the Cabinet Office has learned about the research-policy interface, and initiatives to improve evidence-based policymaking including the Coordination of Research and Analysis Group. Key points included the following:

  • More evidence-based public policy is important to improve effectiveness and efficiency, to ensure services are client-based, to improve accountability and democracy and to improve trust in government.
  • But policymakers are influenced by a wide range of factors other than evidence including their own experience and expertise, their own judgement, the resources available, values, habits and traditions, lobbyists and pressure groups and pragmatism.
  • There are many different kinds of evidence including descriptive and analytical evidence from surveys, economic evidence from cost-effectiveness studies and practical evidence from pilot projects. Policymakers need to be able to balance all of these when they make decisions.
  • It is not easy. The UK government is trying to move from a situation where most policy was opinion-based to one where more is evidence-based, and there are many problems:
    • Sometimes research is of poor quality or contradictory
    • Researchers and policymakers value different kinds of evidence differently: policymakers favour contextual and practical evidence, researchers favour scientific and empirical evidence
    • There are many actors: special advisers, experts, think tanks, lobbyists etc
    • Much researcher-generated evidence is too long, too detailed, not practical, and often delivered at the wrong time
  • The government is trying to overcome these through a variety of mechanisms:
    • Strategic programmes in each department
    • Establishing better incentives
    • Closer ex-ante collaboration between researchers and research users
    • Better identification of the researchable questions
    • Better systems for knowledge translation
    • Persistence and opportunism
Comments and questions from the floor included:
  • Can beliefs be considered evidence? No, but they are an important part of the context.
  • 'Lines of Argument' might be a better term than 'Theories of Change'.
  • It is also very important to consider the demand for evidence.
  • There are often few incentives for researchers to develop policy relevant research.
  • Researchers need to make sure that they make their evidence more accessible.

Fred Carden talked about the lessons they learned from their recent Strategic Evaluation of the Influence of Research on Public Policy, and implications for how IDRC does its work. Key points included the following:

  • IDRC's research aims to expand policy capacities, broaden policy regimes and affect policy outcomes, but most IDRC staff had a very weak understanding of how their work actually could influence policy. The policy study aimed to learn more about this and to generate recommendations about how IDRC could do it better.
  • The study included background work, the development of a framework for analysis and then 22 case studies of IDRC-funded projects. They were all cases where someone in IDRC thought there had been some kind of policy influence.
  • Analysis included a series of workshops and seminars with IDRC staff and partners around the world and further work on two main issues: firstly what IDRC does and how it does it, secondly where and when it does it - ie the context for the work.
  • Context turned out to be a particularly rich area. The degree of stability, the ability of policymakers to use evidence, and the opportunity for new policy development were all key success factors.
  • It was possible to identify 5 general types of context:
    • Where there is clear demand from policymakers
    • Where there is some interest, but weak policymaker capacity or leadership
    • Where there is interest but resource gaps
    • Where policymakers are neutral
    • Where policymakers are not interested at all (but researchers are)
  • It was also possible to see how good research done in the right way had helped to shift issues up this ladder.
  • This understanding has helped IDRC to be more explicit about the objectives of the research it funds: some aims for policy impact; some does not (though it may contribute in the longer term).
  • It has also helped IDRC to understand its role more clearly: to build both researcher and policy maker capacity; and to facilitate communication.

Comments and Questions from the floor included:

  • While none of the IDRC cases showed a reduction in demand for research, this often happens in practice.
  • Mapping policy processes is very complex.
  • There remains a debate in IDRC about the role the research they support should have in shaping policy processes.
  • The decision maker's policy context (the pressures, constraints, influences, etc.) is probably more important than the research itself.

Matthew Quinn talked about his experience of evidence-based policy making in Wales, and the validity of models developed in the medical field in other policy domains. Key points included:

  • One way of looking at the process of policy making is as networking activities in a social space. I'd like to look at the role that research can play in that.
  • The Welsh Assembly has had to start all of this from scratch. Before devolution, few policy issues were considered specifically for Wales, the data to support them was not disaggregated, and it was almost impossible to say anything specific about the situation in Wales.
  • It is very difficult to get anything done in bureaucracies, and each is different, with its own unique culture. Most are very heavily influenced by financial control systems, there are few effective mechanisms for sharing knowledge, and there are very few incentives to ask "why".
  • Many departments are under political pressure to be seen to be "doing something about everything" but know they only have the resources to do a few things well. It is important to understand this internal prioritisation. The attraction of new ideas often outweighs the evidence of success from old ones, and pressure to deliver on specific targets undermines longer term strategies.
  • The current Strategy for Wales (Wales: A Better Country) includes research and policy testing, and the Assembly has just created an Office for Social Research, is appointing a Chief Social Researcher, and is developing a core statistics base.
  • Every Welsh Assembly Department now has a research and evaluation department and under the Freedom of Information Act, we are publishing all the evidence and reasoning behind all ministerial decisions.
  • Another tool we use is the "Policy Gateway", through which every policy has to pass, which asks whether the connections with other relevant policies have been thought through.
  • Other key requisites for more evidence-based policy include informed participation and recognition of the importance of values and context.
  • The key problem though is the general lack of knowledge management and the lack of a long-term knowledge-base within government about existing knowledge and the impact of policies.

Comments and Questions from the floor included:

  • The research-policy nexus may not be as important as the interface between research and practice in the corporate sector. Researchers and firms often have timeframes different from that of the policy process and cannot wait for it.
  • What does 'backfilling with evidence' mean? In Wales, a choice had been made to follow a value driven approach to policymaking and it was now necessary to develop the evidence to defend it and to develop the desired interventions: using evidence to get the method right.
  • While building up knowledge management capacities is crucial, it takes time and may require more resources than are available.
  • More systematic reviews could improve the supply of evidence and increase the demand for evidence.
  • One of the greatest challenges is how to get people who have limited time to use evidence.

Sandra Nutley's session at the end generated a long list of possible topics for future meetings:

  1. Engagement – when
  2. Neutral space – what, how
  3. Policy implementation and research
  4. Research / Policy / Practice transfer (people)
  5. Outputs, outcomes, impact (assessment)
  6. Policy Evolution (understanding reality of this)
  7. Evidence use outside the executive (e.g. scrutiny)
  8. Scrutiny – how institutions learn how they accumulate knowledge
  9. Understanding ‘active’ spaces – networks
  10. The policy-making context and its constraints
  11. Dimensions of policy context and what works in different contexts
  12. Getting the audience and the venue right
  13. The ethics of research utilisation for the researcher – beyond findings
  14. Role of funders of research
