The importance of definitive evaluation
The Medical Research Council (MRC) provides a framework for conducting and reporting the evaluation of complex interventions, and here we summarise its recommendations. Complex interventions (those comprising multiple components that interact to produce change) are widely used in the health service, but their complexity is not always recognised when they are evaluated. Evaluations tend to focus on aggregate effectiveness, a simple cause-and-effect approach in which a model of care is judged solely by its impact.
This approach tells us whether the complex intervention worked, but we also need to know why it worked.
Traditionally, evaluation of health service interventions has focused on measuring outcomes such as quality of care, productivity and, more recently, patient satisfaction. But research points to two issues to be aware of when performing outcome evaluation:
• Effect sizes: these may tell us that an intervention helped more people than it harmed in the time and place it was delivered, but often tell policymakers and practitioners little about how the findings might apply in new settings or to other populations (Cartwright and Hardie, 2012); a worked example follows this list.
• Aggregate effectiveness: a focus on this means that we risk developing, evaluating and recommending interventions for implementation that have small population-level benefits at the expense of widening existing inequalities (Whitehead, 2007).
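To make the first point concrete, the minimal sketch below (with entirely hypothetical scores and group sizes) computes a standardised effect size, Cohen's d, for a single evaluation site. The resulting number summarises average benefit in that time and place, but carries no information about the mechanisms or contexts that determine whether the benefit would transfer.

```python
import numpy as np

# Hypothetical outcome scores from one evaluation site
# (higher = better). These numbers are illustrative only.
control = np.array([52, 48, 55, 50, 47, 53, 49, 51])
intervention = np.array([58, 54, 60, 55, 52, 59, 56, 57])

# Cohen's d: difference in means divided by the pooled
# standard deviation of the two groups.
n1, n2 = len(intervention), len(control)
pooled_sd = np.sqrt(((n1 - 1) * intervention.var(ddof=1) +
                     (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2))
d = (intervention.mean() - control.mean()) / pooled_sd
print(f"Cohen's d = {d:.2f}")

# The effect size describes average benefit in this time and
# place; it says nothing about the mechanisms or contextual
# factors that determine whether the same benefit would appear
# in a new setting or population.
```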
Although outcome evaluation is vital to demonstrate whether the intervention worked, it does not tell us why it worked. To answer the latter, we need to know:
• how the intervention was implemented
• its causal mechanisms
• how effects differ from one context to another.
Briefly, we need to combine evaluation of outcomes with evaluation of process. This produces a definitive evaluation.
What does it mean to perform a service evaluation?
Without knowledge of process, it may be difficult to address variations between services when they occur, because it will be unclear which factors make the intervention succeed in one setting but fail in another. This understanding is necessary to make a wider impact by informing policy and practice. To ensure work is scalable, it is important to identify the ‘critical ingredients’ that others can implement in their own local contexts.
A process evaluation is a study that aims to understand the functioning of an intervention by examining its implementation and mechanisms of impact alongside contextual factors. Process evaluation is complementary to, but not a substitute for, high quality outcomes evaluation.
Figure 1: Adapted from Moore (2014)
Understanding implementation, the mechanisms through which interventions produce change, and the role of context in shaping implementation and effectiveness requires both quantitative and qualitative methods:
• Quantitative methods measure key process variables and allow hypothesised mechanisms of impact and contextual moderating factors to be tested (see the sketch after this list).
• Qualitative methods capture emerging changes in implementation, experiences of the intervention, and unanticipated or complex causal pathways. They can also be used to generate new theories.
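As one illustration of the quantitative point above, a hypothesised contextual moderating factor can be tested as an interaction term in a regression model. The sketch below is a minimal, hypothetical example: the data are synthetic, the variable names (‘treated’, ‘context’, ‘outcome’) are assumed, and statsmodels is used purely for convenience; it is not part of the MRC guidance itself.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400

# Synthetic evaluation data (all names and effect sizes assumed):
# 'treated' - 1 if the site received the intervention
# 'context' - 1 for sites with, say, an established referral pathway
# 'outcome' - continuous outcome measure
treated = rng.integers(0, 2, n)
context = rng.integers(0, 2, n)
# Build in a moderated effect: the intervention works mainly
# where the contextual factor is present.
outcome = (1.0 * treated + 0.5 * context
           + 2.0 * treated * context + rng.normal(0, 1, n))

df = pd.DataFrame({"treated": treated,
                   "context": context,
                   "outcome": outcome})

# 'treated * context' expands to both main effects plus their
# interaction; a significant interaction coefficient is evidence
# that the contextual factor moderates the intervention's effect.
model = smf.ols("outcome ~ treated * context", data=df).fit()
print(model.summary())
```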
As a clinician I am constantly struck by how well the health service can operate and also by how frustrating it can be. The temptation is to change things (be they referral pathways, means of communication or services offered) and hope that the frustrations diminish and the service becomes more efficient and effective. Quality improvement (QI) science has taught us the importance of measurement when making improvements to a service: what to measure, how frequently, how to be sure that an observed difference is real, and how to show sustained improvement. However, when faced with a complex intervention, such as changing the model of care provided across primary, community and secondary care, we need to think beyond just ‘does it work’.
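To make the measurement point concrete: one standard QI tool for judging whether an observed difference is real is a statistical process control chart. The minimal sketch below (hypothetical weekly waiting-time data, purely illustrative) computes the centre line and control limits for an individuals (XmR) chart from baseline data and judges post-change weeks against them.

```python
import numpy as np

# Hypothetical weekly mean waiting times (days). Illustrative only.
baseline = np.array([34, 36, 33, 35, 37, 34, 36, 35])   # before the change
post     = np.array([30, 28, 29, 27, 28, 26, 27])       # after the change

# Individuals (XmR) chart: establish the centre line and control
# limits from baseline data, then judge post-change points against
# them. The constant 2.66 converts the average moving range into a
# 3-sigma estimate for individual values.
moving_range = np.abs(np.diff(baseline))
centre = baseline.mean()
halfwidth = 2.66 * moving_range.mean()
ucl, lcl = centre + halfwidth, centre - halfwidth

print(f"centre = {centre:.1f}, limits = [{lcl:.1f}, {ucl:.1f}]")
for week, value in enumerate(post, start=len(baseline) + 1):
    flag = "special cause" if not (lcl <= value <= ucl) else "common cause"
    print(f"week {week:2d}: {value:4.1f} days -> {flag}")
```

Post-change points falling consistently outside the baseline limits suggest special-cause variation: a real, potentially sustained change rather than week-to-week noise.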
As we develop new ways of working, it is incumbent upon all clinicians not just to share our ‘good practice’ by showing what works, but also to determine, within the complex intervention of service redesign, what it was within that intervention that made the difference. Not just ‘does it work?’ but ‘why does it work?’ We need to identify which aspects of the new model are crucial to its success, which are helpful but not essential, and which may have contributed nothing or even been a hindrance. In this way, when we come to implement a service redesign held up as a success elsewhere, we will have a much clearer understanding of why it worked and whether the same benefits might be seen in our own area. The learning becomes not ‘what did you do’, or even ‘what were the outcomes’, but rather ‘why did it work’ and ‘what are the key factors to making it successful in other contexts’.
Key terms
Complex intervention: an intervention with several interacting components
Definitive evaluation: combines evaluation of outcomes with that of process
Process evaluation: a study which aims to understand the functioning of an intervention by examining implementation, mechanisms of impact, and contextual factors
Improvement science: finding out how to improve and make changes in the most effective way
Realist evaluation: answers questions about what works, for whom and under what circumstances
Further reading:
Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M. Developing and evaluating complex interventions: new guidance. Medical Research Council, 2008. Link
Moore G, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, Moore L, O’Cathain A, Tinati T, Wight D, Baird J. Process evaluation of complex interventions: Medical Research Council guidance. MRC Population Health Science Research Network, London, 2014. Link
Fletcher A, Jamal F, Evans RE, et al. (2016) Realist complex intervention science: Applying realist principles across all phases of the Medical Research Council framework for developing and evaluating complex interventions. Evaluation 22. doi:10.1177/1356389016652743. Link
References:
1. Health Foundation (2013) Quality improvement made simple: What everyone should know about health care quality improvement. Health Foundation.
2. Cartwright N and Hardie J (2012) Evidence-Based Policy: A Practical Guide to Doing It Better. New York: Oxford University Press.
3. Whitehead M (2007) A typology of actions to tackle social inequalities in health. Journal of Epidemiology and Community Health 61(6): 473–478.