Sub-questions
- Has the programme been defined clearly?
- Does the programme definition take into account likely business and external changes?
- Have stakeholders endorsed the arrangements for delivering the programme and its ongoing operation?
- Is there appropriate staff training and support in place to deliver the programme and effect business change?
- Has the programme identified enablers to achieve its objectives (for example people, policies, funding, processes, partners, technology)? Are they in place?
- Is there an appropriate disaster recovery plan?
Essential evidence
- Programme brief, programme definition or programme initiation document.
- Plan for implementing the programme.
- Operational risk management plan.
Putting the programme into practice - examples from our studies

Our 2017 report The new generation electronic monitoring programme (see Q4) found that the Ministry of Justice's (MoJ) planned timescale for the programme was unachievable. The Ministry initially allowed 15 months after signing the contract for the tags, planned for August 2012, to develop, test, manufacture and deploy the new tags. Contracts, however, were not signed until July 2014, following the discovery of overbilling by G4S and Serco and two failed procurements for the tags. Five years after initiation, the programme had not delivered the intended benefits. We found that the MoJ had adopted a new, high-risk and unfamiliar approach to the procurement and had failed to manage the implications. However, following internal and external reviews of the programme in 2015 and 2016, the MoJ took action to address many of the issues, including abandoning the original plan to develop new tags.

Our 2018 report Rolling out Universal Credit concluded that there was no practical alternative to continuing with this new benefit system. However, we also found that the Department for Work & Pensions (DWP) was unlikely ever to be able to measure whether it had achieved its goal of increasing employment. Given this, combined with extended timescales and the cost of running Universal Credit compared with the benefits it replaces, we concluded that the project was unlikely ever to be value for money. We found several reasons why rollout had been considerably slower than planned. There were early problems, including issues with governance, contractors and developing a full working system. DWP used an agile approach, which allowed lessons to be applied and changes to be incorporated, including slowing the rollout. DWP did not measure the additional costs for local organisations that help to administer Universal Credit and support claimants, and had only partially compensated these organisations. We recommended that progress towards achieving the intended benefits be tracked better, including taking account of the impact on third parties; that the programme should not expand before operations could cope with higher claimant volumes; and that DWP should work more closely with delivery partners, including to make it easier for them to support claimants.

Other relevant reports
- Transforming Rehabilitation: Progress review (paragraph 11)
- Early progress in transforming courts and tribunals (paragraph 14)
- Rolling out smart meters (paragraphs 12 and 21)
- E20: renewing the EastEnders set (paragraph 14)