Reporting and disseminating findings

10.33 However carefully an evaluation is planned and conducted, if its findings are not understood and properly used, the research will not meet its objectives. There are some key points to take into account when reporting and publishing research and evaluation findings.

10.34 Reporting an evaluation means more than writing a final report. It is important to ensure that feedback is provided to all the evaluation stakeholders, and that findings feed into new policy development and appraisal.

10.35 Notwithstanding the range of activities that should be considered in disseminating findings, the evaluation report is a key output, and its effectiveness will depend on the brevity and clarity with which key conclusions and messages are conveyed. The aim of the reporting process throughout a project is to ensure that the evaluation commissioners, partners and stakeholders are consulted about research methods, progress and results on an agreed basis. Regular interaction between the evaluators and the commissioning partners maintains the focus of the evaluation and surfaces any problems with data collection or team dynamics as soon as they arise.

10.36 An early opportunity to reflect on the findings helps stakeholders to prepare for the conclusions and recommendations, and makes difficult messages easier to respond to before the final report becomes public. Subject to the commissioning partners' views, allowance should be made for comparison of the evaluation results with other relevant evidence, wider dissemination of the results, and consideration of their implications for policy design and delivery.

10.37 Vaughan and Buss8 provide useful guidance on how to report social research findings to busy policy makers. They point out that many policy makers are able to read and understand complicated analysis, but most do not have the time. Consequently, many will want a flavour of the complexities of the analysis without getting lost in the detail. Other policy makers may not have the technical background and will want a simpler presentation. There is therefore a delicate balance to strike: keeping the respect and interest of more technical readers without losing the less technical.

10.38 Of course, what the evaluation commissioners and other key stakeholders want to see and how they want to see it must determine the form and content of the report. Nevertheless, there are some simple tips suggested by Vaughan and Buss that are likely to be helpful whatever the form of the report; they are set out in Box 10.E.

Box 10.E: Reporting tips

Analyse and advise on the evaluated policy intervention - not on policy strategy and priorities

Keep it simple but not simplistic

Communicate reasoning as well as bottom lines

Use numbers sparingly in summary reports

Elucidate, don't advocate

Identify winners and losers as well as the average effect

Don't overlook unintended consequences

Source: Vaughan and Buss (1998)

10.39 As discussed above, a useful first step is to report how the new evaluation findings compare with previous knowledge, particularly where there are clear consistencies or inconsistencies. New hypotheses may be required to explain the latter. It is also useful to highlight research questions arising from the evaluation, to inform the planning of future research programmes and evaluations.

10.40 It is also important to document the research methodology thoroughly, commonly as part of a separate technical report rather than in the main report. (It is essential that this information remains available even after all those working on a project have moved on.) The documentation should include research tools, such as questionnaires and topic guides used for qualitative or quantitative studies, as well as associated documentation, such as introductory letters and explanatory leaflets. Steps taken to process and analyse the data should be fully recorded (an illustrative sketch follows the list below), including:

data cleaning or imputation of missing values;

weighting for non-response;

how a final statistical model was selected; and

how standard errors were calculated.
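As an illustration only, the sketch below shows how these steps might be captured in a single reproducible script. It is written in Python using pandas and statsmodels; the survey variables, stratum shares, candidate models and weighting rule are all hypothetical assumptions, not prescribed methods.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Hypothetical survey extract with some missing income values.
df = pd.DataFrame({
    "age": rng.integers(18, 80, n),
    "income": np.where(rng.random(n) < 0.1, np.nan, rng.normal(30_000, 8_000, n)),
    "outcome": rng.normal(0, 1, n),
    "stratum": rng.integers(0, 4, n),  # sampling stratum identifier
})

# 1. Data cleaning / imputation: record both the rule and how often it was applied.
n_imputed = int(df["income"].isna().sum())
df["income"] = df["income"].fillna(df["income"].median())

# 2. Weighting for non-response: scale each stratum to its assumed population share.
population_share = {0: 0.30, 1: 0.25, 2: 0.25, 3: 0.20}
sample_share = df["stratum"].value_counts(normalize=True)
df["weight"] = df["stratum"].map(lambda s: population_share[s] / sample_share[s])

# 3. Model selection: compare candidate specifications by AIC and log the choice.
candidates = {"age_only": ["age"], "age_income": ["age", "income"]}
fits = {
    name: sm.WLS(df["outcome"], sm.add_constant(df[cols]), weights=df["weight"]).fit()
    for name, cols in candidates.items()
}
chosen = min(fits, key=lambda name: fits[name].aic)

# 4. Standard errors: refit the chosen model with heteroskedasticity-robust (HC1) errors.
final = sm.WLS(df["outcome"], sm.add_constant(df[candidates[chosen]]),
               weights=df["weight"]).fit(cov_type="HC1")

# An audit trail like this lets the technical report reproduce every step.
print(f"Imputed {n_imputed} income values with the sample median.")
print(f"Selected model: {chosen} (AIC = {fits[chosen].aic:.1f}).")
print(final.summary())
```

Keeping such a script alongside the technical report means the imputation rule, the weights, the model choice and the standard-error method can all be verified and re-run after the original team has moved on.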

10.41 Where possible, the source data should be archived to allow subsequent secondary analysis. Anonymised data can be deposited with the Economic and Social Data Service (http://www.esds.ac.uk/), although this is more common for quantitative data. It may also be necessary to retain identifying details separately, so that survey respondents can be re-contacted for further research or so that the data can be linked with other data sets. Where this is the case, respondents will need to provide informed consent (discussed in more detail in Chapter 7).
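As a minimal sketch of one way to keep identifying details separate from the archived data, the example below (again Python with pandas) splits a hypothetical survey file into an anonymised copy for deposit and a restricted identifier file, linked by a salted-hash key. The file names, columns and keying scheme are illustrative assumptions only.

```python
import hashlib
import pandas as pd

# Hypothetical raw survey file containing direct identifiers.
responses = pd.DataFrame({
    "name":    ["A. Smith", "B. Jones"],
    "email":   ["a@example.org", "b@example.org"],
    "answer1": [3, 5],
    "answer2": [1, 4],
})

# Derive a stable pseudonymous key (a salted hash of the email address).
SALT = "project-specific-secret"  # hold securely, apart from both files
responses["link_key"] = responses["email"].map(
    lambda e: hashlib.sha256((SALT + e).encode()).hexdigest()[:16]
)

# Anonymised copy for deposit: direct identifiers dropped, link key retained.
responses.drop(columns=["name", "email"]).to_csv("deposit_anonymised.csv", index=False)

# Identifier file kept under restricted access, enabling re-contact or linkage
# only where respondents have given informed consent.
responses[["link_key", "name", "email"]].to_csv("identifiers_restricted.csv", index=False)
```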

10.42 In summary, it is vital to think about the dissemination of the results at the time of planning the evaluation, including how they will be used, shared and built upon.

_________________________________________________________________________

8 Vaughan and Buss (1998) Communicating Social Science Research to Policy Makers, Applied Social Research Methods Series No. 48, Sage Publications.