- Open access
- Published: 28 May 2019
Transforming evidence for policy and practice: creating space for new conversations
- Kathryn Oliver ORCID: orcid.org/0000-0002-4326-5258 1 &
- Annette Boaz 2
Palgrave Communications volume 5, Article number: 60 (2019)
A Correction to this article was published on 29 August 2019
For decades, the question of how evidence influences policy and practice has captured our attention, cutting across disciplines and policy/practice domains. All academics, funders, and publics have a stake in this conversation. There are pockets of great expertise about evidence production and use, which all too often remain siloed. Practical and empirical lessons are not shared across disciplinary boundaries and theoretical and conceptual leaps remain contained. This means that we are not making the most of vast and increasing investment in knowledge production. Because existing lessons about how to do and use research well are not shared, funders and researchers are poorly equipped to realise the potential utility of research, and waste resources on, for example, ineffective strategies to create research impact. It also means that the scarce resources available to study evidence production and use are misspent on overly-narrow or already-answered questions. Patchy and intermittent funding has failed to build broadly relevant empirical or theoretical knowledge about how to make better use of evidence, or to build the communities required to act on this knowledge. To transform how we as a community think about what evidence is, how to generate it, and how to use it well, we must better capture lessons being learned in our different research and practice communities. We must find ways to share this knowledge, to embed it in the design of our research systems and practices, and work jointly to establish genuine knowledge gaps about evidence production and use. This comment sets out one vision of how that might be accomplished, and what might result.
Are we investing wisely in research for society?
For decades, conversations between research funders, users, and producers have focused on different aspects of what evidence is, the roles it plays in policy and practice, and the different ways in which these roles can be enhanced and supported. Most researchers feel unequivocally that ‘more research’ is always better—and funders and governments seem to agree (Sarewitz, 2018). Governments are increasingly using investments explicitly to help create the evidence base for better decision-making. For example, funding has been explicitly focused on the United Nations’ Sustainable Development Goals (UKRI-UNDP, 2018). The UK government has made several targeted investments, including the £1.5 billion Global Challenges Research Fund to address substantive social problems (Gov.UK, 2016; UKRI, 2017), and, in health, the thirteen (up from nine) Collaborations for Leadership in Applied Health Research and Care, which received £232 million between 2008 and 2019 (NIHR, 2009). This investment looks set to continue with a further £150 million allocated to the Applied Research Collaborations (NIHR, 2018). In the US, the Trump administration recently signed into law $176.8 billion for research and development, of which $543 million is specifically for translational health research (Science, 2018). These funds are made available to researchers with an effective proviso that the research is targeted towards questions of direct interest to policymakers and practitioners.
There has also been an increase in the infrastructure governments provide, such as scientific advisory posts and professionals (Doubleday and Wilsdon, 2012; Gluckman, 2014), and a range of secondments and fellowship opportunities designed to ‘solve’ the problem of limited academic-policy engagement (Cairney and Oliver, 2018). The UK Government recently asked departments to produce research priority areas (Areas of Research Interest (ARIs)) to guide future academic-policy collaboration (Nurse, 2015). Yet there has been almost no evaluation of these activities. There is limited evidence about how to build infrastructure that supports impactful evidence use, and about the impact of this investment (Kislov et al., 2018). We simply do not know whether the growth of funding, infrastructure, or initiatives has actually improved research quality, or led to improvements for populations, practice or policy.
Thus, despite our ever-growing knowledge about our world, physical and social, it is not easy to find answers to the challenges facing us and our governments. Spending ever-increasing amounts on producing research evidence is not likely to help if we do not understand how to make the most of these investments. Discussions about wastage within the research system often focus on valid concerns about reproducibility and quality (Bishop, 2019), but until we also understand the broader political and societal pressures shaping what evidence is produced and how, we will not be able to reduce this waste (Sarewitz, 2018). In short, our research systems are not guided by current theory about what types of knowledge are most valuable to help address societal problems, how to produce useful evidence, or how to use this knowledge in policy and practice settings.
Who knows about how to improve evidence production and use?
Fortunately, even if under-used, there is a significant body of academic and practical knowledge about how evidence is produced and used. Several disciplines take the question of evidence production and use as a core concern, and this inherently transdisciplinary space has become populated by research evidence from different academic and professional traditions, jurisdictions and contexts.
Much of the funded research into knowledge production and use has been conducted in health and health care, and other applied disciplines. Although there are perennial inquiries about the ‘best’ research methods which should inform policy and practice (Haynes et al., 2016 ), this field has offered some very practical insights, from identifying factors which influence evidence use (Innvaer et al., 2002 ; Oliver et al., 2014 ; Orton et al., 2011 ), to identifying types of evidence used in different contexts (Dobrow et al., 2004 ; Oliver and de Vocht, 2015 ; Whitehead et al., 2004 ). Researchers have explored strategies to increase evidence use (Dobbins et al., 2009 ; Haynes et al., 2012 ; Lavis et al., 2003 ), and developed structures to support knowledge production and use—in the UK, see, for example, the What Works Centres, Policy Research Units, Health Research Networks and so forth (Ferlie, 2019 ; Gough et al., 2018 ). Similar examples can be found in the US (Tseng et al., 2018 ; Nutley and Tseng, 2014 ) and the Netherlands (Wehrens et al., 2010 ). Alongside these practical tools, critical research has helped us to understand the importance of diverse evidence bases (e.g., Brett et al., 2014 ; Goodyear-Smith et al., 2015 ), of including patients and stakeholders in decision-making (Boaz et al., 2016 ; Liabo and Stewart, 2012 ), and to contextualise the drive for increased impact outcomes (Boaz et al., 2019 ; Locock and Boaz, 2004 ; Nutley et al., 2000 ).
The social sciences have provided research methods to investigate the various interfaces between different disciplines and their potential audiences. Acknowledging insights from philosophy, critical theory and many other fields (see, e.g., Douglas, 2009), we highlight two particular perspectives. Firstly, policy studies has helped us to understand the processes of decision-making and the (political) role of evidence within it (Dye, 1975; Lindblom, 1990; Weiss, 1979). A subfield of ‘the politics of evidence-based policymaking’ has grown up, using an explicitly political-science lens to examine questions of evidence production and use (Cairney, 2016b; Hawkins and Ettelt, 2018; Parkhurst, 2017). Political scientists have commented on the ways in which scientific knowledge has been leveraged in political debate, with particular focus on social justice and the uses of evidence to support racist and sexist oppression (Chrisler, 2015; Emejulu, 2018; Lopez and Gadsden, 2018; Malbon et al., 2018; Scott, 2011).
Secondly, the field of Science and Technology Studies (STS) treats the practice and purpose of science itself as an object of study. Drawing on philosophies of science and sociologies of knowledge and practice, early theorists described science as an esoteric activity creating knowledge through waves of experimentation (Kuhn, 1970; Popper, 1963). This was heavily critiqued by social constructivists, who argued that all knowledge is inherently bound to cultural context and practices (Berger and Luckmann, 1966; Collins and Evans, 2002; Funtowicz and Ravetz, 1993). Although some took this to mean that science is just another way of interpreting reality, of equal status with other belief systems, most see these insights as demonstrating the importance of understanding the social context within which scientific practices and objects are conducted and described (Latour and Woolgar, 2013; Shapin, 1995). Similarly, Wynne showed how social and cultural factors determine what we consider ‘good’ evidence or expertise (Wynne, 1992). More recently, scholars have focused on how science and expertise are politicised through funding and assessment environments (Hartley et al., 2017; Jasanoff, 2005; Jasanoff and Polsby, 1991; Prainsack, 2018), through the cultures and practices of research (Fransman, 2018; Hartley, 2016), and through the modes of communication with audiences, and on the role of scientific advice around emerging technologies and challenges (Lee et al., 2005; Owen et al., 2012; Pearce et al., 2018; Smallman, 2018; Stilgoe et al., 2013).
Are we acting on these lessons?
However, funders and researchers rarely draw on the learning from these different fields; nor is learning shared between disciplines and professions (Oliver and Boaz, 2018 ). Thus, we have sociologists of knowledge producing helpful theory about the complex and messy nature of decision-making and the political nature of knowledge (e.g., Lancaster, 2014 ); but this is not drawn on by designers of research partnerships or evaluators of research impact (Chapman et al., 2015 ; Reed and Evely, 2016 ; Ward, 2017 ). This leaves individual researchers with the imperative to do high quality research and to demonstrate impact, but with little useful advice about how as individuals or institutions they might achieve or measure impact (Oliver and Cairney, 2019 ), leading to enormous frustration, duplicated and wasted effort. Even more damagingly, researchers produce poor policy recommendations, or naively engage in political debates with no thought about the possible costs and consequences for themselves, the wider sector, or publics.
We recognise that engaging meaningfully with literatures from multiple disciplines is too challenging a labour for many. The personal and institutional investment required to engage with the practical and scholarly knowledge about evidence production and use is—on top of other duties—beyond most of us. Generating consensus about the main lessons is itself challenging, although initial attempts have been made (Oliver and Pearce, 2017 ). Across the diverse literature on evidence use, terms are defined and mobilised differently. Working out what the terms are implying and what is at stake in the alternative mobilisation of these terms is a huge task. Many researchers are only briefly able to enter this broader debate, through tacked-on projects attached to larger grants. There is no obvious career pathway for those who want to remain at this higher level. There are simply too many threads pulling researchers and practitioners back into their ‘home’ disciplines and domains, which prevents people undertaking the labour of learning the key lessons from multiple fields.
Yet the history of research in this area, scattered and patchy though it is, shows us how necessary this labour is if useful, meaningful research is to be done and used (DuMont, 2019). Too much time and energy has been spent investigating questions which have long since been answered—such as whether RCTs should be used to investigate policy issues, whether we need a pluralistic approach to research design, and whether to invest in relationships as well as data production. But governments and universities have also failed to create environments where knowledge producers are welcome and useful in decision-making settings, and where their own staff feel able to freely discuss and experiment with ideas; and universities consistently fail to reward or support those who want to create social change or work at the interfaces between knowledge production and use.
This failure to draw together key lessons also means that the scarce resources allocated to the study of evidence production and use have been misspent. In the UK there has been no sustained interdisciplinary funding for empirical research into evidence production and use, and in the US such funding has existed only over the last 15 years (DuMont, 2019). This has led to a dearth of shared empirical and theoretical evidence, but also to a lack of community, which has had a detrimental effect on the scholarship in this space. All too often, research funding goes towards already-answered questions (such as whether bibliometrics are a good way to capture impact). We must ensure that new research on evidence production and use addresses genuine gaps. That can only be done by making existing knowledge more widely available and working together to generate collaborative research agendas.
An unfortunate side-effect of this lack of community is that many who enter it do so with the sense that it is a new, ‘emerging’ field, which will generate silver-bullet solutions for researchers and funders. Because it is new to them, researchers feel it must be new to all—not realising that their own journey has been undertaken by many others before them. For instance, there are many initiatives which claim to be ‘newly addressing’ the problem of ‘evidence use’, ‘research on research’, the ‘science of science’, ‘meta-science’, or some other variant. Whether they explore the allocation and impact of research funding and evaluation, the infrastructure of policy research units or the practice of collaborative research, they all make vital contributions. But to claim as many do that it is an ‘emerging field’ illustrates how easy it is, even with the best of intentions, to ignore existing expertise on the production and use of evidence. We must better articulate the difference between these pieces of the puzzle, and the difference those differences make. Too many are claiming that their piece provides the whole picture. In turn, funders feel they have done their part by funding this small piece of research, but remain ignorant of the existing knowledge, and indeed of the real gaps.
Research on evidence production and use is often therefore not as useful as it should be. Failing to draw on existing literature, the solutions proposed by most commentators on the evidence-policy/practice ‘gap’ often do not take into account the realities of complex and messy decision-making, or the contested and political nature of knowledge construction—leading to a situation where an author synthesising lessons from across the field can end up sharing a set of normative statements that might imply that there has been no conceptual leap in 20 years (see e.g., French, 2018 ; Gamoran, 2018 ).
Evidence and policy/practice studies: our tasks
There are therefore two key tasks for those primarily engaged in researching and teaching evidence production and use for policy and practice: (1) to identify and share key lessons more effectively, and (2) to build a community enabling transdisciplinary evidence to be produced and used, which addresses real gaps in the evidence base and helps decision-makers transform society for the better. We close with some suggestions about possible steps we can take towards these goals.
Firstly, we must better communicate our key lessons. We would like to help people articulate the hard-won, often discipline-specific lessons from their own work for others—and to work with partners to embed them into the design, practice and evaluation of research. For instance, critical perspectives on power can describe the lines of authority and the institutional governance surrounding decision-making (Bachrach and Baratz, 1962; Crenson, 1971; Debnam, 1975); the interpersonal dynamics which determine everything from the credibility of evidence to the placement of topics on policy agendas (Oliver and Faul, 2018; Tchilingirian, 2018; White, 2008); and the practice of research itself, and the ways in which assumed and enacted power leads to the favouring of certain methodologies and narratives (Hall and Tandon, 2017; Pearce and Raman, 2014). How might this translate into infrastructure and funding to support equitable research partnerships (Fransman et al., 2018)? What other shared theory and practical insights might help us transform how we do and use research?
Secondly, we must generate research agendas collaboratively. In our view, the only way to avoid squandering resources on ineffective research on research is to work together to share emerging ideas, and to produce genuinely transdisciplinary questions. We made a start on this task at recent meetings. A 2018 Nuffield Foundation-funded symposium brought together leading scholars, practitioners, policymakers and funders to share learning about evidence use and to identify key gaps. We followed this meeting with a broader discussion at the William T. Grant Use of Research Evidence meeting in March 2019, which has also contributed to our thinking.
We initiated the conversation with a Delphi exercise to identify key research questions prior to the meeting. We refined the list, and during the meeting we asked participants to prioritise these questions. This was a surprisingly challenging process, which revealed that even to reach a common understanding about the meaning of a research question, let alone its importance, discussants had to wade through decades’ worth of assumptions, biases, preferences, language nuances and habits.
Based on this analysis, we identify three main areas of work which are required to transform how we think about how to create and use evidence (Table 1):
Transforming knowledge production
Transforming translation and mobilisation
Transforming decision-making
The topics below were selected to indicate the broad range of empirical and normative questions which need broader discussion, and are by no means definitive. Of course, much research on some topics has already been done, but we have included them because, even where research already exists, it is not widely enough known to routinely inform research users, funders or practitioners about how to better produce or use evidence. We observe that much of the very limited funding to investigate evidence production and use has gone either to developing metrics (responsible or otherwise; Row 2, column 4) or to tools to increase uptake (Row 2, column 4), to the relative neglect of everything else. There are significant gaps which can only be addressed jointly across disciplines and sectors, and we welcome debates, additions, and critiques about how to do this better.
A shared research agenda
As we note above, these topics are drawn from proposed questions and discussions by an interdisciplinary group of scholars, practitioners, funders and other stakeholders. It became clear during this process that many were unaware of relevant research which had already been undertaken under these headings. These topics reflect our own networks and knowledge of the field, so cannot be regarded as definitive. We need and welcome partnership with others working in this space to broaden the conversation as much as possible. We highlight a selection of these topics to illustrate a number of points.
First, that no one discipline or researcher could possibly have the skills or knowledge to answer all of these questions. Interdisciplinary teams can be difficult to assemble, but are clearly required. We need leadership in this space to help spot opportunities to foster interdisciplinary research and learning.
Second, that all of these topics could be framed and addressed in multiple ways, and many have been. Many are discussed, but there is little consensus; or there is consensus within disciplines but not between them. Some topics have been funded and others have not. We feel there is an urgent need to identify where research investment is required, where conversations need to be supported, and where and how to draw out the value of existing knowledge. Again, we need leadership to help us generate collaborative research agendas.
Third, that while we all have our own interests, the overall picture is far more diverse, and that there is a need for all working in this area to clearly define what their contributions are in relation to the existing evidence and communities. A shared space to convene and learn from one another would help us all understand the huge and exciting space within which we are working.
Finally, this is an illustrative set of topics, and not an exhaustive one. We would not claim to be setting the definitive research agenda in this paper. Rather, we are setting out the need to learn from one another and to work together in the future. Below, we describe some examples of the type of initial discussions which might help us to move forward, using our three themes of knowledge production, knowledge mobilisation, and decision-making. We have cited relevant studies which set out research questions or provide insights. By doing so, we hope to demonstrate the breadth of disciplines and approaches which are being used to explore these questions; and the potential value of bringing these insights together.
Firstly, we must understand who is involved in shaping and producing the evidence base. Much has been written about the need to produce more robust, meaningful research which minimises research waste through improving quality and reporting (Chalmers et al., 2014 ; Glasziou and Chalmers, 2018 ; Ioannidis, 2005 ), and the infrastructure, funding and training which surround knowledge production and evaluation have attracted critical perspectives (Bayley and Phipps, 2017 ; Gonzalez Hernando and Williams, 2018 ; Katherine Smith and Stewart, 2017 ). Current discourses around ‘improving’ research focus on making evidence more rigorous, certain, and relevant; but how are these terms interpreted locally in different policy and practice contexts? How are different forms of knowledge and evidence assessed, and how do these criteria shape the activities of researchers?
Enabling researchers to reflect on their own role in the ‘knowledge economy’—that is, the production and services attached to knowledge-intensive activities (usually but not exclusively referring to technological innovation (Powell and Snellman, 2004 ))—requires engagement with this history.
This might mean asking questions about who is able to participate in the practice and evaluation of research. Who is able to ask and answer questions? What questions are asked and why? Who gets to influence research agendas? We know that there are barriers to participation in research for minority groups, and for many research users (Chrisler, 2015 ; Duncan and Oliver, 2017 ; Scott et al., 2009 ). At a global level, how are research priorities set by, for example, international funders and philanthropists? How can we ensure that local and indigenous interests and priorities are not ignored by predominantly Western research practices? How are knowledge ‘gaps’ or areas of ‘non-knowledge’ constructed, and what are the power relationships underpinning that process (Nielsen and Sørensen, 2017 )? There are important questions about what it means to do ethical research in the global society, with honesty about normative stances and values (Callard and Fitzgerald 2015 ), which apply to the practices we engage in as much as the substantive topics we focus on (Prainsack et al., 2010 ; Shefner et al., 2014 ).
It might also mean asking about how we do research. Many argue that research (particularly that funded through responsive-mode arrangements) progresses in an incremental way, with questions often driven by ease rather than public need (Parkhurst, 2017). Is this the most efficient way to generate new knowledge? How does this compare with, for example, random research funding (Shepherd et al., 2018)? Stakeholder engagement is said to be required for impact, yet we know it is costly and time-consuming (Oliver et al., 2019, 2019a). How can universities and funders support researchers and users to work together long-term, with career progression and performance management untethered from simplistic (or perhaps any) metrics of impact? Is coproduced research truly more holistic, useful, and relevant? Or does inviting in different interests to deliberate on research findings, or even processes, distort agendas and politicise research (Parkhurst and Abeysinghe, 2016)? What are the costs and benefits of these different systems and practices? We know little about whether (and if so, how well) each of these modes of evidence production leads to novel, useful, meaningful knowledge, nor about how these modes influence the practice or outputs of research.
Transforming evidence translation and mobilisation
Significant resources are put into increasing ‘use’ of evidence, through interventions (Boaz et al., 2011) or research partnerships (Farrell et al., 2019; Tseng et al., 2018). Yet ‘use’ is not a straightforward concept. Using research well implies the existence of a diverse and robust evidence base; a range of pathways for evidence to reach decision-makers; users and producers of knowledge who have the capacity and willingness to engage in relationship-building and deliberation about policy and practice issues; and research systems which support individuals and teams to develop and share expertise.
More attention should be paid to how evidence is discussed, made sense of, negotiated and communicated—and to the consequences of different approaches. This includes examining the roles of people involved in the funding of research, through to the ways in which decision-makers access and discuss evidence of different kinds. How can funders and universities create infrastructure and incentives to support researchers to do impactful research, and to inhabit boundary spaces between knowledge production and use? We know that potential users of research may sit within or outside government, with different levels and types of agency, making different types of decisions in different contexts (Cairney, 2018; Sanderson, 2000). Yet beyond ‘tailoring your messages’, existing advice to academics does not help them navigate this complex system (Cairney and Oliver, 2018). To take this lesson seriously, we might want to think about the emergence of boundary-spanning organisations and individuals which help to interface between research producers (primarily universities, but also civil society) and users (Bednarek et al., 2016; Cvitanovic et al., 2016; Stevenson, 2019). What types of interfacing are effective, and how—and how do interactions between evidence producers and users shape both evidence and policy? How might policies on data sharing and open science influence innovation and knowledge mobilisation practices?
Should individual academics engage in advocacy for policy issues (Cairney, 2016a; Smith et al., 2015), using emotive stories or messaging to best communicate (Jones and Crow, 2017; Yanovitzky and Weber, 2018), or rather be ‘honest brokers’ representing without favour a body of work (Pielke, 2007)? Or should this type of dissemination work be undertaken by boundary organisations or individuals who develop specific skills and networks? There is little empirical evidence about how best to make these choices (Oliver and Cairney, 2019), or about their consequences for the impact or credibility of evidence (Smith and Stewart, 2017); nor is there good quality evidence about the most effective strategies and interventions to increase engagement or research uptake by decision-makers, or between researchers and their audiences (Boaz et al., 2011). It seems likely that some researchers may get involved while others stay in the hinterlands (Locock and Boaz, 2004), depending on skills and preference. However, it is not clear how existing studies can help individuals navigate these complex and normative choices.
Communities (of practice, within policy, amongst diverse networks) develop their own languages and rationalities. This will affect how evidence is perceived and discussed (Smallman, 2018). Russell and Greenhalgh have shown how competing rationalities affect the reasoning and argumentation deployed in decision-making contexts (Greenhalgh and Russell, 2006; Russell and Greenhalgh, 2014); how can we interpret local meanings and sense-making in order to better communicate about evidence? Much has been written about the different formats and tailored outputs which can be used to ‘increase uptake’ by decision-makers (Lavis et al., 2003; Makkar et al., 2016; Traynor et al., 2014)—although not with conclusive findings—yet we know so little about how these messages are received. Researchers may be communicating particular messages, but how can we be sure that decision-makers are comprehending and interpreting those messages in the same way? Theories of communication (e.g., Levinson, 2000; Neale, 1992) must be applied to this problem.
Similarly, drawing on psychological theories of behaviour change, commentators have argued for greater use of emotion, narrative and story-telling by researchers in an attempt to influence decision-making (Cairney, 2016b ; Davidson, 2017 ; Jones and Crow, 2017 ). Are these effective at persuading people and if so how do they work? What are the ethical questions surrounding such activities and how does this affect researcher identity? Should researchers be aiming to communicate simple messages about which there is broad consensus?
Discussions of consensus often ask whether agreement is a laudable aim for researchers, or how far consensus is achievable (De Kerckhove et al., 2015; Lidskog and Sundqvist, 2004; Rescher, 1993). We are also interested in the tension between scientific and political consensus, and how differences in interpretations of knowledge can be leveraged to influence political consensus (Beem, 2012; Montana, 2017; Pearce et al., 2017). What tools can be used to generate credibility? Is evidence persuasive of itself; can it survive the translation process; and is it reasonable to expect individual researchers to broadcast simple messages about which there is broad consensus, if that is in tension with their own ethical practices and knowledge (even if it is the most effective way to influence policy)? Is consensus required for the credibility of science and scientists, or can an emphasis on similarity in fact reduce the value of research and the esteem of the sector? Is it the task of scientists to surface conflicts and disagreements, and how far does this duty extend into the political sphere (Smith and Stewart, 2017)?
Transforming decision-making, and the role of evidence within it
Finally, we need to understand how research and researchers can support decision-making, given what we know about the decision-making context or culture and how this influences evidence use (Lin, 2008). This means better understanding the roles of professional and local cultures of evidence use, governance arrangements, and public dialogue, so that we can start to investigate empirically-informed strategies to increase impact (Locock and Boaz, 2004; Oliver et al., 2014). This would include empirical examination of individual strategies to influence decision-making, as well as more institutional infrastructures and roles; case studies of different types of policymaking and the evidence diets consumed in these contexts; and how different people embody different imperatives of the evidence/policy nexus. We need to bring together examples of the policy and practice lifecycles, and examine the roles of different types of evidence throughout those processes (Boaz et al., 2011, 2016).
We want to know what shapes the credibility afforded to different experts and forms of expertise, and how to cultivate credibility to enable better decision-making (Grundmann, 2017; Jacobson and Goering, 2006; Mullen, 2016; Williams, 2018). What does credibility enable (greater attention or influence; greater participation by researchers in policy processes; a more diverse debate)? What is the purpose of increasing credibility? What is the ultimate aim of attempting to become credible actors in policy spaces? How far should universities and researchers go—should we always be aiming for more influence? Or should we recognise and explore the diversity of roles research and researchers can play in decision-making spaces?
Ultimately, methods must be found to evaluate the impact of evidence on policy and practice change, and on populations—including unintended or unwanted consequences (Lorenc and Oliver, 2013 ; Oliver et al., 2019 , 2019a ). Some have argued that the primary role for researchers is to demonstrate the consequences of decisions and to enable debate. This requires the development and application of methods to evaluate changes, understand mechanisms, and develop theory and substantive and normative debates, as well as engage in the translation and mobilisation of evidence. It also requires increased transparency to enable researchers to understand evidence use (Nesta, 2012 ), while also allowing others like Sense about Science to check the validity of evidence claims on behalf of the public (Sense about Science, 2016 ).
Next steps and concrete outputs
These illustrative examples demonstrate the vast range of discussions which are happening, and need to happen, to help us transform how we produce and use evidence. We are not the first to identify the problems of research wastage (Glasziou and Chalmers, 2018) or to emphasise the need to maximise the value of research for society (Duncan and Oliver, 2017). Nor are we the first to note that all parts of the research system play a role in achieving this, from funding (Geuna and Martin, 2003), to research practices (Bishop, 2019; Fransman, 2018), to translational activities (Boaz et al., 2019; Nutley and Tseng, 2014), professional science advice (Doubleday and Wilsdon, 2012) and public and professional engagement (Holliman and Warren, 2017). There have been sustained attempts to build communities and networks to improve parts of this system (see Footnote 1). However, most of these initiatives are rooted in particular disciplines or professional activities. We see a need for a network which bridges these initiatives, helping each to articulate its key lessons for the others, and progressing our conversations about how to do better research about evidence production and use.
Researchers, funders, decision-makers and publics will approach and inhabit this space from different, sometimes very different directions. We do not claim to be writing the definitive account. But we would like to open the door to more critical accounts of evidence production and use which are specifically aimed at multi-disciplinary and sectoral audiences. Our aim is to welcome and support debate, to introduce parts of our diverse community to each other, and to enable our individual perspectives and knowledge to be more widely valued.
We anticipate disagreement and discussion, and support a multitude of ways of approaching the issues we identify above. Some may feel that our energies should be directed to democratising knowledge for all and ensuring that this is mobilised to maximise equality and fairness (Stewart et al., 2018 ). Others may feel that our task is to observe, problematise and critique these processes, rather than engage in them directly (Fuller, 1997 ). Our view is that both normative and critical approaches are vital; as are empirical and theoretical contributions to our understanding of high-level research systems, down to micro-interactions in evidence production and use. Our contention is that we must keep this space vibrant and busy, producing new knowledge together, and learning from each other. This requires investment in research on evidence production and use, in virtual and literal spaces to hold conversations, as well as in capacity and capability. There are significant and important gaps in what we know about evidence production and use, but identifying the particular and specific research agendas for each of these gaps must be a collaborative process.
We also see a need to support those who are new to this space. Many come to the problem of evidence use without any training in the history of research in this space. We see a need to provide an accessible route into these debates, and welcome opportunities to collaborate on textbooks or learning resources to support new students, non-academics and those new to the field.
The Nuffield Foundation meeting which led to this paper demonstrated how valuable these opportunities are for enabling learning and relationship-building through face-to-face interactions. We will continue to create opportunities for greater transdisciplinary and academic-partner conversations, to share learning across spheres of activity and to build capacity, and to use these new perspectives to generate fresh avenues of enquiry, through the new Transforming Evidence collaboration (see Footnote 2).
Finally, we argue for increased investment to maximise the learning we already have, and to support more effective knowledge production and use. Too much money and expertise has been wasted, and too many opportunities to build on existing expertise have been squandered. We must find better ways to make this learning accessible, and to identify true knowledge gaps. Indeed, we believe that collaboration across disciplinary and sectoral boundaries is the only way in which this space will both progress and demonstrate its true value. We must stop wasting the limited resources available for understanding how to transform evidence production and use for the benefit of society. Putting what we already know into practice would be an excellent place to start.
Change history
29 August 2019
An amendment to this paper has been published and can be accessed via a link at the top of the paper.
Footnote 1: See, for example, https://www.alliance4usefulevidence.org/, https://www.ingsa.org/, https://4sonline.org/, https://www.metascience2019.org/, http://www.alltrials.net/
Footnote 2: See the Transforming Evidence site, https://transformure.wordpress.com/
Bachrach P, Baratz MS (1962) Two faces of power. Am Polit Sci Rev. https://doi.org/10.2307/1952796
Bayley JE, Phipps D (2017) Building the concept of research impact literacy. Evid Policy. https://doi.org/10.1332/174426417x15034894876108
Bednarek AT, Shouse B, Hudson CG et al. (2016) Science-policy intermediaries from a practitioner’s perspective: The Lenfest Ocean Program experience. Sci Pub Policy. https://doi.org/10.1093/scipol/scv008
Beem B (2012) Learning the wrong lessons? Science and fisheries management in the Chesapeake Bay blue crab fishery. Public Underst Sci 21(4):401–417. https://doi.org/10.1177/0963662510374177
Berger PL, Luckmann T (1966) The social construction of reality: A treatise in the sociology of knowledge. Doubleday, Garden City, NY
Bishop D (2019) Rein in the four horsemen of irreproducibility. Nature 568(7753):435–435. https://doi.org/10.1038/d41586-019-01307-2
Boaz A, Baeza J, Fraser A (2011) Effective implementation of research into practice: An overview of systematic reviews of the health literature. BMC Res Notes. https://doi.org/10.1186/1756-0500-4-212
Boaz A, Robert G, Locock L et al. (2016) What patients do and their impact on implementation. J Health Organiz Manag. https://doi.org/10.1108/JHOM-02-2015-0027
Boaz A, Davies H, Fraser A et al. (2019) What works now? Evidence-based policy and practice revisited. Policy press. Available at: https://policy.bristoluniversitypress.co.uk/what-works-now . (Accessed 17 July 2018)
Brett J, Staniszewska S, Mockford C et al. (2014) A systematic review of the impact of patient and public involvement on service users, researchers and communities. Patient 7(4):387–395. https://doi.org/10.1007/s40271-014-0065-0
Cairney P (ed) (2016a) Health and advocacy: What are the barriers to the use of evidence in policy? In: The politics of evidence-based policy making. Palgrave Macmillan, London, UK, pp 51–84. https://doi.org/10.1057/978-1-137-51781-4_3
Cairney P (2016b) The politics of evidence-based policy making. Springer, London, pp 1–137. https://doi.org/10.1057/978-1-137-51781-4
Cairney P (2018) Three habits of successful policy entrepreneurs. Policy Polit 46(2):199–215. https://doi.org/10.1332/030557318X15230056771696
Cairney P, Oliver K (2018) How should academics engage in policymaking to achieve impact? Polit Stud Rev. https://doi.org/10.1177/1478929918807714
Callard F, Fitzgerald D (2015) Rethinking interdisciplinarity across the social sciences and neurosciences. Palgrave Macmillan. https://doi.org/10.1057/9781137407962
Chalmers I, Bracken MB, Djulbegovic B et al. (2014) How to increase value and reduce waste when research priorities are set. Lancet 383(9912):156–165. https://doi.org/10.1016/S0140-6736(13)62229-1
Chapman JM, Algera D, Dick M et al. (2015) Being relevant: Practical guidance for early career researchers interested in solving conservation problems. Glob Ecol Conserv 4:334–348. https://doi.org/10.1016/j.gecco.2015.07.013
Chrisler AJ (2015) Humanizing research: Decolonizing qualitative inquiry with youth and communities. J Fam Theory Rev 7(3):333–339. https://doi.org/10.1111/jftr.12090
Collins HM, Evans R (2002) The third wave of science studies. Soc Stud Sci 32(2):235–296. https://doi.org/10.1177/0306312702032002003
Crenson MA (1971) The un-politics of air pollution; a study of non-decisionmaking in the cities. Johns Hopkins Press, Baltimore and London
Cvitanovic C, McDonald J, Hobday AJ (2016) From science to action: Principles for undertaking environmental research that enables knowledge exchange and evidence-based decision-making. J Environ Manage. https://doi.org/10.1016/j.jenvman.2016.09.038
Davidson B (2017) Storytelling and evidence-based policy: Lessons from the grey literature. Palgrave Commun. https://doi.org/10.1057/palcomms.2017.93
De Kerckhove DT, Rennie MD, Cormier R (2015) Censoring government scientists and the role of consensus in science advice: A structured process for scientific advice in governments and peer-review in academia should shape science communication strategies. EMBO Rep 16(3):263–266. https://doi.org/10.15252/embr.201439680
Debnam G (1975) Nondecisions and power: The two faces of Bachrach and Baratz. Am Polit Sci Rev 69(3):889–899. https://doi.org/10.2307/1958397
Dobbins M, Robeson P, Ciliska D et al. (2009) A description of a knowledge broker role implemented as part of a randomized controlled trial evaluating three knowledge translation strategies. Implement Sci 4(1). https://doi.org/10.1186/1748-5908-4-23
Dobrow MJ, Goel V, Upshur REG (2004) Evidence-based health policy: context and utilisation. Soc Sci Med. https://doi.org/10.1016/S0277-9536(03)00166-7
Doubleday R, Wilsdon J (2012) Science policy: Beyond the great and good. Nature. https://doi.org/10.1038/485301a .
Douglas H (2009) Science, Policy, and the Value-Free Ideal. University of Pittsburgh Press, Pittsburgh
DuMont K (2019) Reframing evidence-based policy to align with the evidence|William T. Grant foundation. Available at: http://wtgrantfoundation.org/digest/reframing-evidence-based-policy-to-align-with-the-evidence . (Accessed 28 Jan 2019)
Duncan S, Oliver S (2017) Editorial. Res All 1(1):1–5. https://doi.org/10.18546/RFA.01.1.01
Dye TR (1975) Understanding public policy. Prentice-Hall. Available at: http://agris.fao.org/agris-search/search.do?recordID=US201300519645 . (Accessed 18 Jan 2019)
Emejulu A (2018) On the problems and possibilities of feminist solidarity: The Women’s March one year on. IPPR Progress Rev 24(4):267–273. https://doi.org/10.1111/newe.12064
Farrell CC, Harrison C, Coburn CE (2019) What the hell is this, and who the hell are you? role and identity negotiation in research-practice partnerships. AERA Open 5(2):233285841984959. https://doi.org/10.1177/2332858419849595
Ferlie E (2019) The politics of management knowledge in times of austerity. Available at: https://books.google.co.uk/books?hl=en&lr=&id=Ok5yDwAAQBAJ&oi=fnd&pg=PP1&dq=info:XZBJCDoqIowJ:scholar.google.com&ots=Vg1eZHL9e_&sig=fS2Bf8w7VtyDKfZ3InQWq-npbuk&redir_esc=y#v=onepage&q&f=false . (Accessed 14 Feb 2019)
Fransman J (2018) Charting a course to an emerging field of ‘research engagement studies’: A conceptual meta-synthesis. Res All 2(2):1–49. http://www.ingentaconnect.com/contentone/ioep/rfa/2018/00000002/00000002/art00002#
Fransman J, Hall B, Hayman R et al. (2018) Promoting fair and equitable research partnerships to respond to global challenges. Rethinking research collaborative. Available at: http://oro.open.ac.uk/57134/ . (Accessed 25 Apr 2019)
French RD (2018) Lessons from the evidence on evidence-based policy. Can Public Adm 61(3):425–442. https://doi.org/10.1111/capa.12295
Fuller S (1997) Constructing the high church-low church distinction in STS textbooks. Bull Sci, Technol Soc 17(4):181–183. https://doi.org/10.1177/027046769701700408
Funtowicz SO, Ravetz JR (1993) Science for the post-normal age. Futures. https://doi.org/10.1016/0016-3287(93)90022-L
Gamoran A (2018) Evidence-based policy in the real world: A cautionary view. Ann Am Acad Political Soc Sci 678(1):180–191. https://doi.org/10.1177/0002716218770138
Geuna A, Martin BR (2003) University research evaluation and funding: An international comparison. Minerva. https://doi.org/10.1023/B:MINE.0000005155.70870.bd
Glasziou P, Chalmers I (2018) Research waste is still a scandal—an essay by Paul Glasziou and Iain Chalmers. BMJ 363:k4645. https://doi.org/10.1136/BMJ.K4645
Gluckman P (2014) Policy: The art of science advice to government. Nature 507(7491):163–165. https://doi.org/10.1038/507163a
Gonzalez Hernando M, Williams K (2018) Examining the link between funding and intellectual interventions across universities and think tanks: a theoretical framework. Int J Polit, Cult Soc 31(2):193–206. https://doi.org/10.1007/s10767-018-9281-2
Goodyear-Smith F, Jackson C, Greenhalgh T (2015) Co-design and implementation research: challenges and solutions for ethics committees. BMC Med Eth. https://doi.org/10.1186/s12910-015-0072-2
Gough D, Maidment C, Sharples J (2018) UK What Works Centres: Aims, methods and contexts. London. Available at: https://eppi.ioe.ac.uk/cms/Default.aspx?tabid=3731 . (Accessed 27 Feb 2019)
Gov.UK (2016) Science and research funding allocation: 2016 to 2020-GOV.UK. Available at: https://www.gov.uk/government/publications/science-and-research-funding-allocation-2016-to-2020 . (Accessed 14 Feb 2019)
Greenhalgh T, Russell J (2006) Reframing evidence synthesis as rhetorical action in the policy making drama. Healthcare Policy|Politiques de Santé. https://doi.org/10.12927/hcpol.2006.17873
Grundmann R (2017) The problem of expertise in knowledge societies. Minerva 55(1):25–48. https://doi.org/10.1007/s11024-016-9308-7
Hall BL, Tandon R (2017) Decolonization of knowledge, epistemicide, participatory research and higher education. Res All 1(1):6–19. https://doi.org/10.18546/RFA.01.1.02
Hartley S (2016) Policy masquerading as science: an examination of non-state actor involvement in European risk assessment policy for genetically modified animals. J Eur Public Policy. https://doi.org/10.1080/13501763.2015.1049196 .
Hartley S, Pearce W, Taylor A (2017) Against the tide of depoliticisation: the politics of research governance. Policy Polit 45(3):361–377. https://doi.org/10.1332/030557316X14681503832036
Hawkins B, Ettelt S (2018) The strategic uses of evidence in UK e-cigarettes policy debates. Evid Policy. https://doi.org/10.1332/174426418X15212872451438
Haynes A, Brennan S, Redman S et al. (2016) Figuring out fidelity: A worked example of the methods used to identify, critique and revise the essential elements of a contextualised intervention in health policy agencies. Implement Sci 11(1). https://doi.org/10.1186/s13012-016-0378-6
Haynes AS, Derrick GE, Redman S et al. (2012) Identifying trustworthy experts: How do policymakers find and assess public health researchers worth consulting or collaborating with? PLoS ONE 7(3):e32665. https://doi.org/10.1371/journal.pone.0032665
Holliman R, Warren CJ (2017) Supporting future scholars of engaged research. Res All. https://doi.org/10.18546/rfa.01.1.14
Innvaer S, Vist G, Trommald M et al. (2002) Health policy-makers’ perceptions of their use of evidence: a systematic review. J Health Serv Res Policy 7(4):239–244. https://doi.org/10.1258/135581902320432778
Ioannidis JPA (2005) Why most published research findings are false. PLoS Med 2(8):e124. https://doi.org/10.1371/journal.pmed.0020124
Jacobson N, Goering P (2006) Credibility and credibility work in knowledge transfer. Evid Policy. https://doi.org/10.1332/174426406777068894
Jasanoff S (2005) Judgment under siege: The three-body problem of expert legitimacy. In: Maasen S, Weingart P (eds) Democratization of expertise? Springer-Verlag, Berlin/Heidelberg, pp 209–224. https://doi.org/10.1007/1-4020-3754-6_12
Jasanoff S, Polsby NW (1991) The fifth branch: Science advisers as policymakers. Contemp Sociol 20(5):727. https://doi.org/10.2307/2072218
Jones M, Crow D (2017) How can we use the ‘science of stories’ to produce persuasive scientific stories. Palgrave Commun 3(1):53. https://doi.org/10.1057/s41599-017-0047-7
Kislov R, Wilson PM, Knowles S et al. (2018) Learning from the emergence of NIHR Collaborations for Leadership in Applied Health Research and Care (CLAHRCs): a systematic review of evaluations. Implement Sci 13(1):111. https://doi.org/10.1186/s13012-018-0805-y
Kuhn TS (1970) The structure of scientific revolutions, 2nd edn. University of Chicago Press, Chicago
Lancaster K (2014) Social construction and the evidence-based drug policy endeavour. Int J Drug Policy 25(5):948–951. https://doi.org/10.1016/j.drugpo.2014.01.002
Latour B, Woolgar S (2013) Laboratory life: The construction of scientific facts. Princeton University Press, Princeton
Lavis JN, Robertson D, Woodside JM et al. (2003) How can research organizations more effectively transfer research knowledge to decision makers? Milbank Q 81(2):221–248. https://doi.org/10.1111/1468-0009.t01-1-00052
Lee CJ, Scheufele DA, Lewenstein BV (2005) Public attitudes toward emerging technologies: Examining the interactive effects of cognitions and affect on public attitudes toward nanotechnology. Sci Commun 27(2):240–267. https://doi.org/10.1177/1075547005281474
Levinson SC (2000) Presumptive meanings: The theory of generalized conversational implicature. MIT Press, Cambridge, MA
Liabo K, Stewart R (2012) Involvement in research without compromising research quality. J Health Serv Res Policy. https://doi.org/10.1258/jhsrp.2012.011086
Lidskog R, Sundqvist G (2004) From consensus to credibility: New challenges for policy-relevant science. Innovation 17(3):205–226. https://doi.org/10.1080/1351161042000241144
Lin V (2008) Evidence-Based public health policy. In: Quah, Stella R (eds) International encyclopedia of public health. Elsevier, Oxford. https://doi.org/10.1016/B978-012373960-5.00234-3
Lindblom CE (1990) Inquiry and change: The troubled attempt to understand and shape society. Yale University Press, New Haven. http://www.jstor.org/stable/j.ctt1dszwww
Locock L, Boaz A (2004) Research, policy and practice–worlds apart? Soc Pol Soc. https://doi.org/10.1017/S1474746404002003
Lopez N, Gadsden VL (2018) Health inequities, social determinants, and intersectionality. NAM Perspect. 6(12). https://doi.org/10.31478/201612a
Lorenc T, Oliver K (2013) Adverse effects of public health interventions: a conceptual framework. J Epidemiol Community Health 68(3):288–290. https://doi.org/10.1136/jech-2013-203118
Makkar SR, Howe M, Williamson A et al. (2016) Impact of tailored blogs and content on usage of Web CIPHER–an online platform to help policymakers better engage with evidence from research. Health Res Policy Syst 14(1):85. https://doi.org/10.1186/s12961-016-0157-5
Malbon E, Carson L, Yates S (2018) What can policymakers learn from feminist strategies to combine contextualised evidence with advocacy? Palgrave Commun. https://doi.org/10.1057/s41599-018-0160-2
Montana J (2017) Accommodating consensus and diversity in environmental knowledge production: Achieving closure through typologies in IPBES. Environ Sci Policy 68:20–27. https://doi.org/10.1016/J.ENVSCI.2016.11.011
Mullen EJ (2016) Reconsidering the ‘idea’ of evidence in evidence-based policy and practice. Eur J Soc Work 19(3–4):310–335
Neale S (1992) Paul Grice and the philosophy of language. Linguist Philos. https://doi.org/10.1007/BF00630629
Nesta (2012) The red book for evidence. Available at: https://www.nesta.org.uk/blog/red-book-evidence/ . (Accessed 14 Feb 2019)
Nielsen KH, Sørensen MP (2017) How to take non-knowledge seriously, or “the unexpected virtue of ignorance”. Public Underst Sci 26(3):385–392. https://doi.org/10.1177/0963662515600967
NIHR (2009) NIHR collaborations for leadership in applied health research and care (CLAHRCs): implementation plan 5.8. Available at: https://www.nihr.ac.uk/about-us/how-we-are-managed/our-structure/infrastructure/collaborations-for-leadership-in-applied-health-research-and-care.htm . (Accessed 14 Feb 2019)
NIHR (2018) NIHR announces £150m investment in applied health research. Available at: https://www.nihr.ac.uk/news/nihr-announces-150m-investment-in-applied-health-research/8800 . (Accessed 25 Apr 2019)
Nurse P (2015) Ensuring a successful UK research endeavour: A review of the UK Research councils. BIS/15/625, Department for Business, Innovation and Skills, London
Nutley SM, Smith PC, Davies HTO (eds) (2000) What works?: Evidence-based policy and practice in public services. Policy Press, Bristol
Oliver K, Boaz A (2018) What makes research useful? We still don’t know. Available at: https://www.researchresearch.com/news/article/?articleId=1377811 . (Accessed 18 Jan 2019)
Oliver K, Cairney P (2019) The do's and don’ts of influencing policy: a systematic review of advice to academics. Palgrave Commun 5(1):21
Oliver K, Faul MV (2018) Networks and network analysis in evidence, policy and practice. Evid Policy 14(3):369–379. https://doi.org/10.1332/174426418X15314037224597
Oliver K, Pearce W (2017) Three lessons from evidence-based medicine and policy: increase transparency, balance inputs and understand power. Palgrave Commun 3(1):43. https://doi.org/10.1057/s41599-017-0045-9
Oliver K, Innvar S, Lorenc T et al. (2014) A systematic review of barriers to and facilitators of the use of evidence by policymakers. BMC Health Serv Res 14(1):2. https://doi.org/10.1186/1472-6963-14-2
Oliver K, Lorenc T, Innvær S (2014) New directions in evidence-based policy research: A critical analysis of the literature. Health Res Policy Syst 12(1):34. https://doi.org/10.1186/1478-4505-12-34
Oliver K, Tinker J, Lorenc T et al. (2019a) Evaluating unintended consequences: new insights into solving practical, ethical, and political challenges of evaluation. Evaluation (in press)
Oliver K, Kothari A, Mays N (2019b) The dark side of coproduction: do the costs outweigh the benefits for health research? Health Res Policy Syst 17(1):33
Oliver A, de Vocht F (2015) Defining ‘evidence’ in public health: a survey of policymakers’ uses and preferences. Eur J Public Health: ckv082. https://doi.org/10.1093/eurpub/ckv082
Orton L, Lloyd-Williams F, Taylor-Robinson D et al. (2011) The use of research evidence in public health decision making processes: Systematic review. PLoS ONE. https://doi.org/10.1371/journal.pone.0021704
Owen R, Macnaghten P, Stilgoe J (2012) Responsible research and innovation: From science in society to science for society, with society. Sci Public Policy. https://doi.org/10.1093/scipol/scs093
Parkhurst J (2017) The politics of evidence: From evidence-based policy to the good governance of evidence. Routledge Studies in Governance and Public Policy. Routledge, London. https://doi.org/10.4324/9781315675008
Parkhurst JO, Abeysinghe S (2016) What constitutes “Good” evidence for public health and social policy-making? From hierarchies to appropriateness. Soc Epistemol 30(5–6):665–679. https://doi.org/10.1080/02691728.2016.1172365
Pearce W, Raman S (2014) The new randomised controlled trials (RCT) movement in public policy: challenges of epistemic governance. Policy Sci 47(4):387–402. https://doi.org/10.1007/s11077-014-9208-3
Pearce W, Grundmann R, Hulme M et al. (2017) Beyond counting climate consensus. Environ Commun 11(6):723–730. https://doi.org/10.1080/17524032.2017.1333965
Pearce W, Mahony M, Raman S (2018) Science advice for global challenges: Learning from trade-offs in the IPCC. Environ Sci Policy 80:125–131. https://doi.org/10.1016/j.envsci.2017.11.017
Pielke RA (2007) The honest broker: Making sense of science in policy and politics. Cambridge University Press, Cambridge. https://doi.org/10.1017/CBO9780511818110
Popper K (1963) Science as Falsification. Conjectures and refutations, readings in the philosophy of science. Routledge, London. https://doi.org/10.2307/3517358
Powell WW, Snellman K (2004) The knowledge economy. Annu Rev Sociol 30(1):199–220. https://doi.org/10.1146/annurev.soc.29.010202.100037
Prainsack B (2018) The “We” in the “Me”: Solidarity and health care in the era of personalized medicine. Sci Technol Hum Values 43(1):21–44. https://doi.org/10.1177/0162243917736139
Prainsack B, Svendsen MN, Koch L et al. (2010) How do we collaborate? Social science researchers’ experience of multidisciplinarity in biomedical settings. BioSocieties 5(2):278–286. https://doi.org/10.1057/biosoc.2010.7
Reed M, Evely A (2016) How can your research have more impact? Five key principles and practical tips for effective knowledge exchange. LSE Impact blog: 1–5. Available at: http://blogs.lse.ac.uk/impactofsocialsciences/2015/07/07/how-can-your-research-have-more-impact-5-key-principles-tips/ . (Accessed 10 July 2018)
Rescher N (1993) Pluralism: against the demand for consensus. Clarendon Press, Oxford University Press, Oxford
Russell J, Greenhalgh T (2014) Being ‘rational’ and being ‘human’: How National Health Service rationing decisions are constructed as rational by resource allocation panels. Health (United Kingdom). https://doi.org/10.1177/1363459313507586
Sanderson I (2000) Evaluation in complex policy systems. Evaluation 6(4):433–454. https://doi.org/10.1177/13563890022209415
Sarewitz D (2018) Of cold mice and isotopes or should we do less science? In: Science and politics: Exploring relations between academic research, higher education, and science policy summer school in higher education research and science studies, Bonn, 2018. Available at: https://sfis.asu.edu/sites/default/files/should_we_do_less_science-revised_distrib.pdf
Science (2018) Congress approves largest U.S. research spending increase in a decade. https://doi.org/10.1126/science.aat6620
Scott J, Lubienski C, Debray-Pelot E (2009) The politics of advocacy in education. Educ Policy. https://doi.org/10.1177/0895904808328530
Scott JT (2011) Market-driven education reform and the racial politics of advocacy. Peabody J Educ. https://doi.org/10.1080/0161956X.2011.616445
Sense about Science (2016) Missing evidence. Available at: https://senseaboutscience.org/activities/missing-evidence/ . (Accessed 14 Feb 2019)
Shapin S (1995) Here and everywhere: sociology of scientific knowledge. Ann Rev Sociol. https://doi.org/10.1146/annurev.soc.21.1.289
Shefner J, Dahms HF, Jones RE (eds) (2014) Social justice and the university. Palgrave Macmillan UK, London. https://doi.org/10.1057/9781137289384
Shepherd J, Frampton GK, Pickett K et al. (2018) Peer review of health research funding proposals: A systematic map and systematic review of innovations for effectiveness and efficiency. PLoS ONE 13(5):e0196914. https://doi.org/10.1371/journal.pone.0196914
Smallman M (2018) Science to the rescue or contingent progress? Comparing 10 years of public, expert and policy discourses on new and emerging science and technology in the United Kingdom. Public Underst Sci. https://doi.org/10.1177/0963662517706452
Smith K, Stewart E (2017) We need to talk about impact: Why social policy academics need to engage with the UK’s research impact agenda. J Soc Policy 109–127. https://doi.org/10.1017/S0047279416000283
Smith K, Stewart E, Donnelly P et al. (2015) Influencing policy with research-public health advocacy and health inequalities. Health Inequalities. https://doi.org/10.1093/acprof:oso/9780
Smith KE, Stewart EA (2017) Academic advocacy in public health: Disciplinary ‘duty’ or political ‘propaganda’? Soc Sci Med 189:35–43. https://doi.org/10.1016/j.socscimed.2017.07.014
Stevenson O (2019) Making space for new models of academic-policy engagement. Available at: http://www.upen.ac.uk/blogs/?action=story&id=41 . (Accessed 12 Apr 2019)
Stewart R, Langer L, Erasmus Y (2018) An integrated model for increasing the use of evidence by decision-makers for improved development. Dev Southern Africa. 1–16. https://doi.org/10.1080/0376835X.2018.1543579
Stilgoe J, Owen R, Macnaghten P (2013) Developing a framework for responsible innovation. Res Policy. https://doi.org/10.1016/j.respol.2013.05.008
Tchilingirian JS (2018) Producing knowledge, producing credibility: British think-tank researchers and the construction of policy reports. Int J Polit Cult Soc 31(2):161–178. https://doi.org/10.1007/s10767-018-9280-3
Traynor R, DeCorby K, Dobbins M (2014) Knowledge brokering in public health: A tale of two studies. Public Health 128(6):533–544. https://doi.org/10.1016/j.puhe.2014.01.015
Tseng V, Nutley SM (2014) Building the infrastructure to improve the use and usefulness of research in education. In: Finnigan KS, Daly AJ (eds) Using research evidence in education: From the schoolhouse door to Capitol Hill. Policy implications of research in education, vol. 2. Springer, pp 163–175. https://doi.org/10.1007/978-3-319-04690-7_11
Tseng V, Easton JQ, Supplee LH (2018) Research-practice partnerships: Building two-way streets of engagement. Soc Policy Report. https://doi.org/10.1002/j.2379-3988.2017.tb00089.x
UKRI-UNDP (2018) UKRI-UNDP joint report: ‘How science, research and innovation can best contribute to meeting the sustainable development goals for developing countries’ full application guidance -applications by invitation only. Available at: https://www.ukri.org/research/global-challenges-research-fund/ukri-undp-joint-report-how-science-research-and-innovation-can-best-contribute-to-meeting-the-sustainable-development-goals-for-developing-countries/ . (Accessed 14 Feb 2019)
UKRI (2017) UK strategy for the global challenges research fund (GCRF). Available at: https://www.ukri.org/files/legacy/research/gcrf-strategy-june-2017/
Ward V (2017) Why, whose, what and how? A framework for knowledge mobilisers. Evid Policy. https://doi.org/10.1332/174426416X14634763278725
Wehrens R, Bekker M, Bal R (2010) The construction of evidence-based local health policy through partnerships: Research infrastructure, process, and context in the Rotterdam ‘Healthy in the City’ programme. J Public Health Policy. https://doi.org/10.1057/jphp.2010.33
Weiss CH (1979) The many meanings of research utilization. Public Adm Rev 39(5):426. https://doi.org/10.2307/3109916
White HC (2008) Identity and control: How social formations emerge. Princeton University Press, Princeton. https://doi.org/10.1007/s13398-014-0173-7.2
Whitehead M, Petticrew M, Graham H et al. (2004) Evidence for public health policy on inequalities: 2: Assembling the evidence jigsaw. J Epidemiol Community Health 58:817–821. https://doi.org/10.1136/jech.2003.015297
Williams K (2018) Three strategies for attaining legitimacy in policy knowledge: Coherence in identity, process and outcome. Public Admin. https://doi.org/10.1111/padm.12385
Wynne B (1992) Misunderstood misunderstanding: Social identities and public uptake of science. Public Underst Sci 1(3):281–304. https://doi.org/10.1088/0963-6625/1/3/004
Yanovitzky I, Weber M (2018) Analysing use of evidence in public policymaking processes: a theory-grounded content analysis methodology. Evid Policy. https://doi.org/10.1332/174426418x15378680726175
Acknowledgements
We thank the Nuffield Foundation, the Wellcome Trust and the William T Grant Foundation for financial support for a meeting on Transforming the use of Research Evidence, held in London in 2018. We are grateful to both the participants at this meeting and those attending the William T Grant Foundation Use of Research Evidence meeting in Washington 2019. In particular, we very much appreciate the contribution of Kim DuMont, Paul Cairney and Warren Pearce who commented on drafts of this paper before submission. Our thanks to you all.
Author information
Authors and Affiliations
London School of Hygiene and Tropical Medicine, London, UK
Kathryn Oliver
Kingston University, London, UK
Annette Boaz
Corresponding author
Correspondence to Kathryn Oliver.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .
About this article
Cite this article
Oliver, K., Boaz, A. Transforming evidence for policy and practice: creating space for new conversations. Palgrave Commun 5, 60 (2019). https://doi.org/10.1057/s41599-019-0266-1
Received: 19 February 2019
Accepted: 14 May 2019
Published: 28 May 2019
DOI: https://doi.org/10.1057/s41599-019-0266-1