Contractual and non-contractual performance data

4  In producing Figure 2:

•  We obtained publicly available statistical data on the number of taxpayers in each type of tax stream and on those taxpayers' contributions to overall tax revenues. We used each stream's average contribution, for 2006-07 to 2013-14, to weight the number of taxpayers and so track the trend in the volume of work.

•  We used unaudited statistical data from HMRC's annual report to examine the overall trend in operational spend, excluding annually managed expenditure from this total. We restated this spend in real terms using the GDP deflator, taken from the Office for National Statistics and HM Treasury websites. Within this, we used data obtained from HMRC to show the percentage of overall spend on Aspire and other ICT. Both calculations are illustrated in the sketch following this list.
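To make the two calculations above concrete, the following Python sketch shows one way they could be carried out. All figures, tax streams and weights in it are hypothetical placeholders rather than data from the report: a revenue-weighted sum of taxpayer numbers serves as the volume-of-work index, and the GDP deflator restates nominal spend in a common price base.

```python
# Hypothetical taxpayer counts (millions) by tax stream and year.
taxpayers = {
    "income_tax": {"2006-07": 30.5, "2013-14": 29.9},
    "vat":        {"2006-07": 1.9,  "2013-14": 2.1},
}

# Hypothetical average shares of overall tax revenues over 2006-07 to 2013-14,
# used as fixed weights so the index tracks volume rather than revenue changes.
avg_revenue_share = {"income_tax": 0.45, "vat": 0.25}

def volume_index(year):
    """Revenue-weighted sum of taxpayer numbers: a proxy for volume of work."""
    return sum(avg_revenue_share[s] * taxpayers[s][year] for s in taxpayers)

# Hypothetical GDP deflator index values (base year 2013-14 = 100).
gdp_deflator = {"2006-07": 85.2, "2013-14": 100.0}

def real_spend(nominal_spend, year, base_year="2013-14"):
    """Restate nominal spend in base-year prices using the GDP deflator."""
    return nominal_spend * gdp_deflator[base_year] / gdp_deflator[year]

print(volume_index("2006-07"), volume_index("2013-14"))
print(real_spend(3200, "2006-07"))  # £m in 2013-14 prices, hypothetical
```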

5  In producing Figures 3, 4, 5 and 7, and other performance data referenced within the report:

•  We obtained a range of management performance data, including a list of all contractual targets and details of how they have evolved, and the different versions of performance reporting that HMRC has used to manage, and report on, the performance of its ICT services. We summarised and analysed these data to draw out the key messages.

•  We also visited HMRC's performance hubs in its Telford offices to see how staff use these data to manage its ICT services.

6  In undertaking the analysis in paragraphs 2.11 and 2.14 to 2.15:

•  We obtained an analysis of the amounts that HMRC spent on projects.

•  There are four main steps in HMRC's planning and approval cycle: value, viability, define and design. The value phase is internal to HMRC; the supplier becomes engaged from the viability phase onwards. We sampled 57 items and requested documentation from HMRC to establish approval dates and financial amounts.

•  HMRC could provide approval dates for the viability stage for 45 items, the end of the define stage for 32 items and the end of the design stage for 47 items. Between phases there may also be standstill time for prioritisation and other governance activities. To reduce the effect of this, HMRC gave us dates for the start of the design phase in 27 instances.

•  Overall we calculated that, on average, it takes 16 months from the end of the viability phase to the end of the design phase (across 39 items); nine months from the end of the define phase to the end of the design phase (across 28 items); and seven months from the start to the end of the design phase (across 27 items).

•  The average of seven months quoted in paragraph 2.11 is calculated using the 27 items for which we have both start and end dates for the design phase, plus a further 15 items for which we have an end of define stage date but no start of design phase date. The additional nine months quoted in paragraph 2.11 was calculated using 38 items for which we have a viability date, an end of define or start of design stage date, and an end of design stage date. The 95 per cent confidence interval on the seven months is plus or minus two months, and on the nine months plus or minus three months (the first sketch after this list illustrates the calculation). The nine-month figure is an underestimate as it excludes time spent in the value and viability phases; HMRC told us that each of these phases takes just a few weeks.

•  We attempted to examine actual spend against budget for the 47 items with a design stage budget. As some projects have multiple design stages, this amounted to 33 projects. We could not complete this analysis because of changes in how HMRC records the data in its accounting systems. However, we could identify that in at least 22 cases (67 per cent) additional spend beyond the original budget was agreed by HMRC and Capgemini after the design stage concluded. The 95 per cent confidence interval on this 67 per cent is plus or minus 15 percentage points (see the second sketch after this list).

•  In addition, we obtained from HMRC an analysis of all projects delivered in 2013-14 that counted towards its key performance indicators. From this we determined which projects had design proposals and which were changed after the design proposal.
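The first sketch below shows, in Python and with hypothetical durations, how an average phase duration and its 95 per cent confidence interval can be computed under a normal approximation (the mean plus or minus 1.96 standard errors). Our actual figures are those quoted in the bullets above, and our exact method may differ in detail.

```python
import math
import statistics

def mean_with_ci(durations, z=1.96):
    """Mean duration (months) with a normal-approximation 95% CI half-width."""
    n = len(durations)
    mean = statistics.mean(durations)
    standard_error = statistics.stdev(durations) / math.sqrt(n)
    return mean, z * standard_error

# Hypothetical design-phase durations in months for a sample of items.
design_months = [5, 9, 6, 8, 4, 7, 10, 6, 7, 8, 5, 9]
mean, half_width = mean_with_ci(design_months)
print(f"average {mean:.0f} months, 95% CI +/- {half_width:.1f} months")
```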
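The second sketch shows the equivalent interval for a sample proportion, using the standard Wald approximation. Applied to 22 of 33 cases it gives roughly plus or minus 16 percentage points; the plus or minus 15 quoted above may reflect a refinement such as a finite-population correction, so this is an illustration only.

```python
import math

def proportion_ci(successes, n, z=1.96):
    """Sample proportion with a Wald 95% confidence interval half-width."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, half_width

# 22 of 33 projects had additional spend agreed after the design stage.
p, half_width = proportion_ci(22, 33)
print(f"{p:.0%} +/- {half_width:.0%}")  # about 67% +/- 16% under this method
```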