Evidence, Policy and Regulation
Evidence Synthesis by Building a Case
It is surely better to use evidence to inform predictions about which policies will work if we implement them and which will not. Most policies involve complex interventions, and almost all are implemented in complex settings. In these cases a vast array of evidence of different kinds from different sources will be relevant. But how does this variety of evidence add up? How are policy makers to figure out what, all told, the evidence points to?
There are a variety of evidence synthesis schemes available that are reasonably good but of restricted applicability. Many can treat evidence even from different disciplinary approaches – so long as the evidence is homogeneous, that is, so long as the separate pieces of evidence speak directly to the same immediate hypothesis. In general the pieces of evidence needed for policy evaluation are not homogeneous. They are not discrete masses all speaking immediately to one overall conclusion about how effective a policy is, but rather multiple parts of a structure of different claims that together may argue for this overall conclusion. In this case synthesis is more complicated.
The project will build on the idea that, rather than thinking in terms of ‘weighing up’ evidence, the metaphor of ‘building a case’ is more useful. The argument for the overall conclusion involves a network of relationships among claims about different facets of the process by which the policy achieves (or fails to achieve) its effects. Fans of courtroom drama are familiar with instances of this metaphor, in which a case appears weak until evidence about a few critical facts is provided that makes the case fit together and explains how the crime occurred. The aim is to lay the groundwork for a protocol for the synthesis of non-homogeneous evidence in ‘building a case’ to arrive at an overall evaluation. Such a protocol would provide a basis for causal inference at different levels and for different parts of the intervention, and would aggregate different kinds of evidence from different kinds of sources and different perspectives (e.g. statistical, ethnographic).
Organised by Professors Nancy Cartwright and Julian Reiss (Philosophy Department and CHESS), the project comprises a Thursday morning reading group, held in Michaelmas term at Durham’s Centre for Humanities Engaging Science and Society, to survey and discuss the available literature, and three workshops for policy makers, methodologists and anyone interested in working out how to answer these questions. Participation is open to anyone with an interest in the issues.
Workshop 1: 10 November 2015, 2.00 - 6.00pm, Senate Suite, University College (Durham Castle)
Evidence amalgamation: weighing the evidence v building a case
Organised by Dr Jacob Stegenga (Victoria University and IAS Fellow) and Professor Nancy Cartwright (Durham University)
This workshop will review conventional methods for amalgamating evidence and explore techniques for synthesis by building a case, investigating the relative advantages and disadvantages of the two kinds of methods.
Workshop 2: 3 May 2016, 1.00 - 5.00pm, Old Library, Grey College
Organised by Professor Julian Reiss (Durham University)
NB This workshop was originally scheduled to take place on 11 February 2016. Please note also the revised timings.
The Principle of Total Evidence is the recommendation to use all the available evidence when estimating the probability of a hypothesis (Carnap 1947). While seemingly uncontroversial, the principle is more problematic than it first appears. On the one hand, it is, at least implicitly, denied by proponents of evidence-based approaches in the sciences, who recommend using only evidence that has passed a certain quality threshold. On the other, what the principle really entails depends greatly on how ‘available’ is interpreted. Does it mean ‘known by the agent who makes the probability estimate’, ‘known by the community of researchers’, or ‘knowable after a suitable search period’? Might it entail a demand to make new observations and experiments?
Contact firstname.lastname@example.org for more information about this event.
Workshop 3: 10 May 2016, 2.00 - 6.00pm, Senate Suite, University College (Durham Castle)
‘What will work in my school?’ What does the evidence say?
Organised by Professor Nancy Cartwright (Durham University)
This workshop will focus on education as one of the many concrete policy areas where understanding how to amalgamate evidence matters to practice. Participants will propose and probe practical advice for educators who want to use evidence but have to make real decisions in real time with limited resources.
Please register interest by contacting the administrator of Durham’s Centre for Humanities Engaging Science & Society (CHESS) at email@example.com
Contact: Professor Nancy Cartwright (firstname.lastname@example.org)
Talking therapies: evidence and evaluation
For mental health clinicians and experts the demand that practice be evidence-based represents both an opportunity and a challenge. The focus of this project is on evidence for therapies for depression – concerned not just with reducing symptoms of depression but with understanding how recovery happens in vivo: how do people use therapy to build resilient character and improve their capacity for meaningful relationships and their ability to live the lives they choose? These are the kinds of outcomes that service users themselves tell us they value, but that to date evidence-based pharmaceutical treatments have neglected and talking therapies are only just now focussing on. What kind of evidence could be collected in order to make these claims – that a therapy can deliver the kind of recovery that is relevant and valued by users – credible, or to show that they hold? And which therapies are most effective, how and why, and under what circumstances?
Evaluation methodologies over recent decades have privileged randomised controlled trials (RCTs) as the dominant paradigm – the ‘gold standard’ – for establishing whether or not treatments work. Not only has this discriminated against more complex kinds of treatment, such as psychological therapies, which are less straightforward to test using RCTs, it has also had the unfortunate side effect of missing many of the real outcomes that matter to service users, in contrast to more easily targeted symptoms.
This project will hold two intensive workshops with practising therapists, methodologists and others from around the UK with interests in these questions, to investigate, first, how to construct empirical studies to collect evidence about the interpersonal processes that lead to change during psychotherapy and, second, how to make valid and reliable inferences about causal mechanisms for use in improved clinical decision making. The aim is to begin to forge some solid research plans for the kind of research described, with the ultimate aim of developing new methodological tools more appropriate than those currently in favour for assessing the effectiveness of talking therapies for the outcomes that matter to service users.
Workshop 1: 27 April 2016, Seminar Room, Institute of Advanced Study
What evidence works best for which NICE therapy?
Workshop 2: 03 June 2016, Seminar Room, Institute of Advanced Study
What should an evidence-based pathway for improving work and wellbeing look like to the people who are on it?
The project is organised jointly by Professor Nancy Cartwright, an expert on methods for causal inference, and Jeremy Clarke, an adviser to the UK government on psychological therapies and a practising clinician. Anyone with an interest in these problems is invited to attend. Those interested in participating should contact the administrator for Durham’s Centre for Humanities Engaging Science & Society (CHESS) at email@example.com.
The Importance of Context: evidence, values and assumptions in public health policy
In Stockton-on-Tees, the difference in male life expectancy at birth between the most and least deprived areas is 16 years – the highest in England. This reflects a larger pattern in which health inequalities are increasing in parallel with the rise of economic inequality, and despite stated policy commitments to reducing them. Review of current policies suggests that the application and interpretation of research evidence in public health policy relies on questionable assumptions about a high degree of choice and control over lifestyle, income, and quotidian living/working environment. Rich and poor live in different ‘epidemiological worlds’, and some have far more control over those worlds than others. Against this background, what forms of evidence, disciplinary perspectives and research methodologies are most relevant for the design of policies to reduce health inequalities?
The core activity in this programme will be a one-day interactive workshop in December 2015 on Evidence, Values and Assumptions in Public Health Policy. Key contributors will be Professor Ian Greener on the perils of ‘rational choice’ as an assumption in public health policy; Professor Nancy Cartwright on putting all the evidence to use in health policy; and Professors Ted Schrecker and Linda McKie on choices about how much evidence of health impact is enough to act on socioeconomic inequalities that are key determinants of health inequalities (the standard of proof question). Based on responses to a widely circulated call for expressions of interest, up to 30 participants drawn from the university and the North-East community will be identified, and background materials distributed in advance of the workshop. Post-workshop, key messages will be synthesised; a proposed agenda for future activities (including funded activities) circulated; and a short briefing paper for circulation across the university and beyond developed, as well as a journal article. This activity will operate in parallel with an ESRC-funded series of seminars and conferences on revitalising the health equity agenda.
For further information please contact Professor Linda McKie (firstname.lastname@example.org)