● Lessons learnt are captured, analysed and shared in a routine and systematic manner across government.
● Assurance has the authority to report directly and independently to the top of government (including Ministers).
● The impact of assurance is assessed as part of the reporting process.
● Project knowledge is captured in a systematic way from both point in time and continuous assurance.
● Information in reports is tailored to the audience - communicate only what that stakeholder needs to see (not one size fits all).
Figure 5: Report assurance

| Required element (ideal state) | Purpose (ideal state) | Statement of current position (gap analysis) | Evidence for judgement (gap analysis) | Suggested priority actions | Measures |
| --- | --- | --- | --- | --- | --- |
| Analyse and report assurance mechanism information. | To identify: ● themes which highlight common issues and successes in types or stages of projects; ● areas of weakness in the current mechanisms; ● reviewer performance; ● the impact of the mechanism; and to inform decisions on performing, planning and integrating the assurance mechanisms. | OGC reports the performance of assurance based on the use of individual mechanisms rather than their impact. The performance of assurance is evaluated on a mechanism-by-mechanism basis using a stakeholder survey. There is no assessment of whether the assurance mechanisms are operating as an effective integrated system, and no standard method to capture and analyse assurance outputs across the portfolio in a systematic and timely manner. The gathering of lessons from assurance (including reviewer performance) is not systematic and does not automatically generate action. | There are case-by-case examples of impact, but there is no standard approach to assess and report the impact of assurance on project outcomes. Since the introduction of Gateway in 2001, assurance mechanisms have evolved in response to perceived performance gaps. There are no objectives for the individual mechanisms to use the output or knowledge gained from one mechanism to inform another. Forty-five per cent of stakeholders think that the co-ordination between mechanisms is poor. Information from assurance is not systematically captured and analysed across the portfolio of projects, and there is no structured approach to help departments learn from the experiences of others. OGC has allocated little resource to this function. | Implement a method for assessing and reporting the impact of assurance on the portfolio against time, cost and quality. Use the impact calculation to form a view of where assurance is providing the best return on investment. Difficulty: Hard. Implement an approach to continuously assess whether the mechanisms are operating as an effective integrated system (including frequency and definition of the necessary triggers to escalate issues for remedial action). Difficulty: Hard. | Method in place to assess and report the impact of assurance (indicator - number of reports produced). Method for calculating return on investment in place. Approach in place to capture, analyse and report assurance system integration issues (indicator - number of reports produced). |
| Feedback to project stakeholders. | To communicate findings from assurance applied on specific projects to project stakeholders. | | | | |
| Produce assurance report. | To present analysis to project stakeholders in an easily accessible form, communicating only what they need (not one size fits all). | | | | |
| Analyse assurance lessons. | To systematically and routinely capture and analyse assurance output and insights (i.e. from point in time and continuous assurance) from individual projects, identify cross-cutting trends, draw out lessons learned and identify examples of success/good practice or concern/poor practice. | Knowledge from assurance rests within individuals. Lessons from point in time assurance outputs are captured and analysed in an unstructured manner. The gathering of lessons from assurance is performed retrospectively by OGC. | OGC relies on informal contact between its staff and Civil Service reviewers, plus discussions with individual departments, to transfer lessons across projects and organisations. There is no formal system to record and share knowledge. Information from assurance is not systematically captured and analysed across the portfolio of projects. Themes for lessons are selected rather than arising from analysis of the reviews. OGC has allocated little resource to this function. Across government, Centres of Excellence and OGC do not consistently identify themes or issues which are pertinent to particular types of projects or stages in the project lifecycle - something which stakeholders would value. | Implement an approach for capturing and analysing assurance lessons from across government that supports informal mechanisms. Difficulty: Easy. | Approach in place to capture and analyse lessons (indicator - percentage of assurance reports analysed). |
| Populate repository of lessons. | To provide a single source of knowledge for projects and reviewers to easily access lessons produced by all assurance mechanisms. | Information on lessons from assurance is held in multiple places. | There is no easily accessible single source of assurance lessons which project staff can access. | Create a searchable repository of lessons that is accessible to project staff and assurance reviewers across government. Difficulty: Easy. | Product in place to share lessons on a 'pull' basis across government (indicator - number of unique requests for information). |
| Communicate and publish lessons. | To publish lessons, themes and trends to project stakeholders in an easily accessible form, communicating only what they need (not one size fits all). | Lessons are issued in individual bulletins, which each recipient then has to manage as a repository of knowledge. | OGC's approach to spreading the lessons from Gateway reviews is informal and ad hoc, consisting of eight bulletins a year based on themes selected by staff rather than resulting from analysis of the reviews. | | |
| Report portfolio status outside of government. | To provide public transparency on the current status of high risk projects (for example, performance against cost, schedule and key delivery criteria). | There is no public reporting on the status of the project portfolio. | There is no public reporting of the status of high risk projects such as that undertaken by the United States Government for IT projects. Transparent and open government is a lever available to help deliver better outcomes in projects and strengthen accountability. Departments and OGC currently deal with repeated requests for project status information. | Implement an approach to publicly report project status. Difficulty: Easy. | Approach in place to report public status of the portfolio (indicator - number of reports produced; number of requests for project information). |
| Analyse and report portfolio financial performance and deliverability issues. | To undertake analysis of the portfolio and provide visibility of forecast deliverability issues (for example, the cumulative impact (economic or otherwise) on demand for a particular type of resource, or significant skills or capability gaps) and financial performance issues. | The status of projects is reported as individual lines within the Major Projects Portfolio (MPP). The MPP is reported in full to the Prime Minister and HM Treasury Ministers, and in part to Accounting Officers. There is no approach to assess and report portfolio deliverability issues or baseline and forecast expenditure. Project expenditure is managed in year. | Accounting Officers only have visibility of the rating of projects within their own departments. The information in the MPP is not presented or used effectively enough to monitor and inform the management of major projects as a portfolio commitment. Government does not use assurance information to help inform decisions on managing the portfolio. The MPP report presents project data but does not interpret and convert this to information that can be used to inform decisions across the portfolio. The current format and use of the MPP can be improved to highlight resource and delivery risks across the portfolio and trigger remedial action. The total financial commitment that the major projects portfolio represents is unknown. There is no evidence that financial information is used to forecast the future exposure of the portfolio against the original approved baseline costs. | Implement an approach to routinely track baseline and forecast expenditure for all high risk projects (including frequency of reporting, control limits for triggering escalation and distribution list); reports should include the cumulative variance in actual and forecast expenditure over time. Difficulty: Hard. Implement an approach for assessing and reporting portfolio deliverability issues (for example, the cumulative impact (economic or otherwise) on demand for a particular type of resource, or significant skills or capability gaps), including frequency of reporting, control limits for triggering escalation and customers. Difficulty: Hard. | Method in place to track baseline and forecast expenditure across the portfolio (indicator - number of reports produced). Approach in place to capture, analyse and report deliverability issues from a portfolio perspective (indicator - number of reports produced). |
Source: National Audit Office