Meta-evaluation and meta-analysis

6.22 The term "meta-evaluation" was originally used to describe the "evaluation of evaluations" (Scriven, 1991) but has also been used to refer to "the synthesis of evaluations". It is similar to systematic review in that it tends to use explicit protocols and criteria for assessing the quality of evaluation studies. It tends to differ from systematic review in two ways:

• the meta-evaluation will generally attempt to synthesise the results of the individual evaluations, either formally or informally, to provide, for example, an estimate of the average effect size across a range of similar studies, or of the total combined effect of a number of related studies; and

• the studies to be evaluated will not necessarily be identified through a systematic review of the entire relevant literature. Instead, they might be selected because they are of particular interest to the evaluation audience. This might be because they share a similar theme, were funded under the same programme, or were implemented in the same geographical area.

6.23 A meta-evaluation is therefore relevant where there are, for example:

• multiple policy interventions all working towards the same outcome, for example, interventions aimed at reducing childhood obesity;

• large-scale programmes which have several strands with overlapping objectives, for example the legacy of the London 2012 Olympic Games, which covers the economic, social and sporting impacts of the Games, as well as environmental and disability outcomes (http://www.culture.gov.uk/); and

• evaluations undertaken in different geographical areas using different approaches to achieve the same objective.

6.24 Meta-evaluation can use a range of more or less formal techniques for synthesising results and drawing conclusions. For instance, Meta-Evaluation of the Local Government Modernisation Agenda: Progress Report on Accountability in Local Government10 used a range of techniques, including:

• a count of existing evidence reports with findings in favour of a particular result;

• a questionnaire-based survey of local government officers; and

• focus group discussions with local residents.

6.25 As discussed elsewhere in the Magenta Book, the reliability of results obtained from qualitative and other approaches that do not attempt to control for potential confounding factors is limited.

6.26 Meta-analysis is a more formal approach to meta-evaluation. It has been defined as "the statistical analysis of a large collection of analysis results from individual studies for the purpose of integrating the findings" (Glass, 1976)11. It is a type of systematic review that aggregates the findings of comparable studies and "combines the individual study treatment effects into a 'pooled' treatment effect for all studies combined" (Morton, 1999)12. This can be based on a pooling of the individual observations from the original study datasets, but more commonly the average effect sizes estimated in each study are pooled. The variation in these effect sizes is then explained using statistical analysis, often multivariate regression using characteristics of the individual studies ("meta-data") as explanatory variables.13
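The sketch below illustrates the simplest form of such pooling, an inverse-variance weighted ("fixed-effect") combination of study-level effect sizes. The three studies, their effect sizes and standard errors are purely hypothetical, and the Magenta Book does not prescribe any particular software or implementation.

```python
import math

# (effect size, standard error) reported by each hypothetical study
studies = [
    (0.30, 0.10),
    (0.12, 0.08),
    (0.25, 0.15),
]

# Weight each study by the inverse of its variance, so that more precise
# studies contribute more to the pooled estimate.
weights = [1.0 / se ** 2 for _, se in studies]
pooled_effect = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

# Approximate 95 per cent confidence interval for the pooled treatment effect.
lower = pooled_effect - 1.96 * pooled_se
upper = pooled_effect + 1.96 * pooled_se

print(f"Pooled effect: {pooled_effect:.3f} (95% CI {lower:.3f} to {upper:.3f})")
```

In practice a random-effects model, which allows the true effect to vary between studies, is often preferred, and the subsequent "meta-regression" step described above is typically a precision-weighted regression of the study effect sizes on study characteristics.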

6.27 Meta-analysis is perhaps best known for combining the results of randomised controlled trials, but it is also commonly applied to non-randomised data from primary studies that use case-control, cross-sectional and cohort designs. Meta-analysis has its own limitations, including limits to the comparability of outcomes considered in different studies, and variability in the reporting of relevant meta-data. As with other meta-evaluations, the reliability of the results is a function of the quality of the "source" studies.




_______________________________________________________________________

10 Meta-evaluation of the Local Government Modernisation Agenda: Progress Report on Accountability in Local Government, Office of the Deputy Prime Minister, September 2006, http://www.communities.gov.uk/

11 Primary, Secondary, and Meta-Analysis of Research, Glass, Educational Researcher, Vol. 5, No. 10, November 1976

12 Systematic Reviews and Meta-Analysis, Workshop materials on Evidence-Based Health Care, Morton, 1999, University of California San Diego, La Jolla, California - Extended Studies and Public Programs

13 The Essential Guide to Effect Sizes: An Introduction to Statistical Power, Meta-Analysis and the Interpretation of Research Results, Ellis, 2010, Cambridge: Cambridge University Press; Practical Meta-Analysis, Lipsey and Wilson, 2001, Thousand Oaks: Sage