REPORT BY THE COMPTROLLER AND AUDITOR GENERAL
HC 684-II SESSION 2012-13
10 JANUARY 2013

Ministry of Defence
The Major Projects Report 2012
Appendices and Project Summary Sheets

Our vision is to help the nation spend wisely. We apply the unique perspective of public audit to help Parliament and government drive lasting improvement in public services.

The National Audit Office scrutinises public spending for Parliament and is independent of government. The Comptroller and Auditor General (C&AG), Amyas Morse, is an Officer of the House of Commons and leads the NAO, which employs some 860 staff. The C&AG certifies the accounts of all government departments and many other public sector bodies. He has statutory authority to examine and report to Parliament on whether departments and the bodies they fund have used their resources efficiently, effectively and with economy. Our studies evaluate the value for money of public spending, nationally and locally. Our recommendations and reports on good practice help government improve public services, and our work led to audited savings of more than £1 billion in 2011.

Report by the Comptroller and Auditor General

Ordered by the House of Commons to be printed on 8 January 2013

This report has been prepared under Section 6 of the National Audit Act 1983 for presentation to the House of Commons in accordance with Section 9 of the Act.

Amyas Morse
Comptroller and Auditor General
National Audit Office
17 December 2012

This volume has been published alongside a first volume, comprising Ministry of Defence: The Major Projects Report 2012, HC 684-I.

HC 684-II
London: The Stationery Office
£59.00

© National Audit Office 2013

The text of this document may be reproduced free of charge in any format or medium providing that it is reproduced accurately and not in a misleading context.
The material must be acknowledged as National Audit Office copyright and the document title specified. Where third-party material has been identified, permission from the respective copyright holder must be sought.

Links to external websites were valid at the time of publication of this report. The National Audit Office is not responsible for the future validity of the links.

Printed in the UK for the Stationery Office Limited on behalf of the Controller of Her Majesty's Stationery Office

Contents

Appendix Seven
Support contracts 4

Appendix Eight
Cost performance on assessment phase projects 6

Appendix Nine
Technology readiness levels 7

Appendix Ten
Sentinel project records 9

Appendix Eleven
Definitions and classifications of cost, time and performance causal factors 11

Appendix Twelve
Project summary sheets 13

The National Audit Office study team consisted of: Nigel Vinson, Hannah Kingsley-Smith, Martin Wheatley, Ben Bourn, Mari Wallace, Andrea Atkinson, Graham Balkwill, Andrew Clark, Jim Cotton, John Marsh, Israel Ochwo, Tim Reid, Omer Riaz and Jenny Yu, under the direction of Tim Banfield.

This report can be found on the National Audit Office website at www.nao.org.uk/Major-Projects-2012

For further information about the National Audit Office please contact:
National Audit Office Press Office
157–197 Buckingham Palace Road
Victoria
London SW1W 9SP
Tel: 020 7798 7400
Enquiries: www.nao.org.uk/contactus
Website: www.nao.org.uk
Twitter: @NAOorguk

Appendix Seven
Support contracts

Where projects have approved support contracts, we report on the forecast spend against these in the Project Summary Sheets. The nature of a support contract depends on the type of project and the approach to support that the project team has taken.
For projects where there is already an in-service platform, such as Merlin, Chinook and Warrior, projects report on the support to the in-service fleet, which is often contracted for in five-year pricing periods. Other projects, such as Astute and Typhoon, have approvals for the whole-life support to the platforms.

Figure 1
Cost variation in support contracts

[Bar chart comparing the approved cost (£m) with the current forecast cost (£m) of the support contracts for Airseeker, the Astute Class Submarines, Chinook, Falcon, Lynx Wildcat, Merlin, Type 45, Typhoon and Warrior]

NOTES
1 Astute support is the total of the Initial Support solution, plus the Astute Class Training Service for Boats 1–4.
2 Chinook support covers the support to the in-service aircraft (current 5-year pricing period) and the support approval for the 14 new Chinook.
3 Falcon support is the total for Increments A and C and the Urgent Operational Requirement.
4 Lynx Wildcat support is the cost of the Wildcat Integrated Support and Training Contract.
5 Merlin support is a contract to cover the entire in-service fleet. We report on the current 5-year pricing period.
6 Type 45 support is the total of the Initial Spares contract and the 7-year full support contract.
7 The Typhoon support approval covers the entire life of the aircraft.
8 Warrior support is the total of the Battle Group Thermal Imaging Support contract and the Diesel Engines and Transmissions contract.

Source: National Audit Office analysis of departmental data

Appendix Eight
Cost performance on assessment phase projects

Figure 2 shows the approved and forecast cost of each assessment phase, where preliminary work is carried out before the main investment decision.
Figure 2
Cost variation on assessment phase projects

[Bar chart comparing the approved cost (£m) with the forecast cost (£m) of the assessment phase for Cipher, Core Production Capability, Marshall, Military Afloat Reach and Sustainability, NEADS, Spearfish, Successor, Type 26 and the United Kingdom Cooperative Engagement Capability]

Source: National Audit Office analysis of departmental data

Appendix Nine
Technology readiness levels

This year, in the Project Summary Sheets (Volume II), we are reporting on technology readiness levels for Assessment Phase projects (projects that are in the planning phase, prior to the main investment decision being taken).

What are technology readiness levels?

Technology readiness levels, or TRLs, are a technology management tool that provides an indication of the technical maturity of a project by identifying the risk associated with technology and system integration. A TRL, measured on a scale from 1 to 9 (with 1 being the least mature), can be given to each technology element of a project. TRLs are designed to be used to assess the risks of not delivering a project on time due to immature technology. This could be a powerful tool if used routinely as part of project management, especially in the context of the high levels of time slippage we are reporting this year due to technical problems on projects.

Measuring TRLs

When? TRLs are designed to be used at all stages of the acquisition cycle. The departmental guidance advocates that Project Teams use them at key decision points on projects, for example:

• At the start of the Assessment Phase: to assess whether it is likely that the required technology will be mature by the in-service date. The guidance specifies a TRL of 3 for key technologies at this point (defined as analytical and experimental critical function and/or characteristic proof of concept).
• At the point of the main investment decision: the guidance advises a TRL of 7 (defined as technology prototype demonstration in an operational environment). Exposing the technology to the operational environment should reveal any limitations, and therefore the risk of not achieving mature technology by the in-service date.

How? TRLs have generic definitions but should be defined in the context of each project to make them measurable, so that it is clear when a TRL has been achieved for a particular piece of technology. The Department does not have a mechanism for independent verification of TRLs; they are generally assessed by the Project Team.

How the Department uses TRLs: the Spearfish Upgrade project

The Spearfish Upgrade (SFU) project team applies 'tailored' technology readiness levels to progressively manage the technical maturity, and the associated risks, of the weapon system design, in accordance with departmental guidance. The SFU project team regularly monitors the technology risk and manages the development of the system solution, using a Weapon Technology Readiness Progression matrix to track achievement against the plan. Progress is formally assessed at quarterly project reviews.

The torpedo system and subsystem elements are broken down in accordance with the Product Breakdown Structure (e.g. sonar, warhead, propulsion). This systematic TRL hierarchy underpins the design approach, enabling hardware and software development and integration risks to be effectively managed to deliver a system solution. Specific TRL definitions, which were defined during the Concept Phase, are applied as SMART criteria for the assessment of TRL achievements during the Assessment Phase. Technical assurance includes independent evidence-based assessment, by experts such as the Defence Science and Technology Laboratory, of all industry claims on the achievement of technical maturity.
This approach to TRL progression management provides a foundation for the Assessment Phase acceptance process. For example, the Insensitive Munitions warhead system achieved TRL 7 in January 2012, following land-based and in-water scale firings of the warhead system to demonstrate the technology in an operational environment.