8.1 Chapter 8 sets out the approach to monitoring and evaluation, including the different types of evaluation and their uses before, during and after implementation.
8.2 Monitoring and evaluation should be part of the development and planning of an intervention from the start. They are important to ensure successful implementation and the responsible, transparent management of public resources. Guidance on conducting evaluation is contained in the Magenta Book.
8.3 Evaluation is a systematic assessment of an intervention's design, implementation and outcomes. It involves:
understanding how a government intervention is being or has been implemented, what effects it had, for whom and why
comparing what happens with what was expected under Business As Usual (the appropriate counterfactual)
identifying what can be improved, estimating overall impacts and cost-effectiveness
8.4 When used properly, evaluation can inform thinking before, during and after implementation as set out in Box 15.
8.5 It is important to incorporate monitoring and evaluation into the development and appraisal stage of a policy, programme or project. Pilots can be used to test what works. Policies can also be designed with inbuilt variation to test the effectiveness of different approaches in real time.
Box 15. Uses of Evaluation
Before - brings together the existing evidence base, identifies uncertainties (and so where a future evaluation might focus), and helps reduce risk associated with an intervention prior to full implementation.
How is the intervention expected to work?
Is it likely to work?
How is it expected to be delivered?
What can we learn from previous monitoring and evaluation work?
Can the intervention be piloted and tested before full roll-out?
During - allows emerging evidence to inform ongoing adjustments to the intervention and its implementation. It can also inform subsequent operational delivery.
Is the intervention working as intended?
Is it being delivered as intended?
Why is this?
How can it be improved?
What are the early impacts?
After - involves an assessment of the outcome of the intervention and provides a summative assessment of the learning gained throughout its design and delivery.
Did the intervention work?
Were there unexpected outputs and outcomes or were they as expected?
What was the size of the impact?
What was the cost to deliver the benefits and did the intervention achieve Benefit Cost Ratios estimated at appraisal?
What can we learn to inform future interventions?
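The final question above asks whether the realised Benefit Cost Ratio matched the one estimated at appraisal. The comparison itself is simple arithmetic: BCR is the present value of benefits divided by the present value of costs. As an illustrative sketch (all figures below are hypothetical, not drawn from any real appraisal):

```python
# Illustrative comparison of appraisal-stage and realised Benefit Cost
# Ratios. BCR = present value of benefits / present value of costs.
# All figures are hypothetical, for illustration only.

def bcr(pv_benefits: float, pv_costs: float) -> float:
    """Benefit Cost Ratio: present-value benefits per unit of cost."""
    return pv_benefits / pv_costs

appraisal_bcr = bcr(pv_benefits=24.0, pv_costs=10.0)  # estimated at appraisal
realised_bcr = bcr(pv_benefits=18.0, pv_costs=12.0)   # observed after delivery

print(f"Appraisal BCR: {appraisal_bcr:.2f}")  # 2.40
print(f"Realised BCR:  {realised_bcr:.2f}")   # 1.50
if realised_bcr < appraisal_bcr:
    print("Intervention fell short of the BCR estimated at appraisal")
```

A summative evaluation would of course go beyond this single ratio, but the gap between appraised and realised BCRs is one clear, reportable measure of whether the expected value for money was delivered.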
8.6 Evaluation is often broken down as follows:
Process Evaluation - involves assessing whether an intervention is being implemented as intended within its cost envelope, whether the design is working, what is working more or less well and why. It supports understanding of internal processes used to deliver outputs, alongside what was actually delivered and when.
Impact Evaluation - involves an objective test of what changes have occurred, the extent of those changes, an assessment of whether they can be attributed to the intervention and a comparison of benefits to costs. It supports understanding of the intended and unintended effects of outputs, as well as how well SMART objectives were achieved.
8.7 Regulations may require post-implementation reviews (PIRs) which are closely related to policy evaluations. The aim is to review regulations at timely intervals to assess whether they are still necessary, whether they are having the intended effects and what the costs to business are. PIRs will generally focus on measures with significant impacts on business and should be conducted proportionately, supported by appropriate monitoring and evaluation. Better Regulation guidance provides more information on conducting PIRs.
8.8 The planning of monitoring and evaluation for spending proposals should follow the HM Treasury Business Case guidance for both programmes and projects. This allows a wide range of analytical and logical thinking tools to be used when initially considering the objectives and potential solutions. Planning and provision of resources for monitoring and evaluation should be proportionate when judged against the costs, benefits and risks of a proposal both to society and the public sector.
8.9 Monitoring and evaluation typically use a mixture of qualitative and quantitative methodologies to gather evidence and understand different aspects of an intervention's operation. Surveys, interviews and focus groups may be needed to understand the views of a wide range of stakeholders. Evaluation questions should reflect the immediate needs to manage and assess the success of an intervention. Evaluation is important because:
it facilitates transparency, accountability and development of the evidence base
it can be used to improve current interventions
it expands learning of 'what works and why' to inform the design and planning of future interventions
8.10 Building monitoring and evaluation, and the resources they require, into the design of a proposal has the following benefits:
it ensures timely, accurate and comprehensive data can be collected. Data collection should be done alongside the monitoring of costs; either within the intervention itself, or as part of the organisation's wider cost monitoring
it ensures monitoring and evaluation can take place
it allows for relatively minor adjustments to be made to the implementation design which can greatly improve the delivery of benefits
it means high-quality evaluation evidence can be obtained and reduces the likelihood of having to retrospectively ask for information on what was delivered and when
it means it can be possible to implement in a way that creates a natural comparison group, which can greatly improve the quality of the evidence around 'what works'
it means evaluation can be used during implementation to address significant issues or threats to delivery of planned outcomes to maximise delivery of benefits
8.11 Monitoring and evaluation objectives should be aligned with the proposal's intended outputs, outcomes and the internal processes, although they may also be wider. Policies and programmes that involve a series of related sub-programmes must also be subject to monitoring and evaluation in programme terms during and after implementation.
8.12 SMART objectives should be objectively observable and measurable. Their design should take into account monitoring and evaluation processes. Their suitability for use in monitoring and evaluation is a necessary condition for inclusion as SMART objectives (Chapter 4). Without verifiable and measurable objectives, success cannot be measured; proposals will lack focus and be less likely to achieve Value for Money.
8.13 Data on Business As Usual, along with ongoing data collection, is vital to manage delivery and monitor the impact during and after implementation. Monitoring and evaluation should examine what happens compared to:
the impacts expected at the outset, in the business case or impact assessment if available
the situation at the start of implementation
what would have happened if Business As Usual had continued without the proposed intervention
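The third comparison above, outcomes against a continuing Business As Usual counterfactual, is often estimated using a comparison group. One common technique (among several discussed in the Magenta Book) is difference-in-differences; the group names and figures in this sketch are hypothetical:

```python
# Hypothetical difference-in-differences sketch: the estimated impact is
# the change in the treated group's outcome minus the change the
# comparison (Business As Usual) group experienced over the same period.
# All figures are illustrative, not real data.

def diff_in_diff(treated_before: float, treated_after: float,
                 comparison_before: float, comparison_after: float) -> float:
    """Estimated impact net of the Business As Usual trend."""
    treated_change = treated_after - treated_before
    bau_change = comparison_after - comparison_before
    return treated_change - bau_change

# e.g. an outcome rate (%) in pilot areas versus comparison areas
impact = diff_in_diff(treated_before=62.0, treated_after=68.0,
                      comparison_before=61.0, comparison_after=63.0)
print(f"Estimated impact: {impact:.1f} percentage points")  # 4.0
```

Subtracting the comparison group's change strips out what would have happened anyway under Business As Usual, which is why paragraph 8.10 notes that implementing in a way that creates a natural comparison group can greatly improve the quality of the evidence.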
8.14 In terms of the Five Case Model, a core set of questions to consider are set out in Box 16. A more detailed set of evaluation questions can be found in the Magenta Book.
Box 16. Core Evaluation Questions
To what extent were the SMART objectives achieved and by when, in particular:
8.15 Monitoring and evaluation evidence and reports should be actively owned by the Senior Responsible Officer and the team responsible for an intervention's delivery. Data and findings should be reported regularly, and reports should be timed to correspond to decision points where they can be of maximum use. Major findings should also be reported to the organisation's Accounting Officer and to the relevant external approving organisation.
8.16 Evaluation reports, and the research that informs them, should be placed in the public domain in line with government transparency standards and Government Social Research: Publication Protocol, subject to appropriate exemptions.