School of Business

(Australian Defence Force Academy Campus)

University of New South Wales

Australian Defence Industry Productivity

Benjamin Edward Wright

This thesis is submitted for the Degree of

Master of Philosophy (MPhil)

November 2011

Originality Statement

I hereby declare that this submission is my own work and to the best of my knowledge it contains no materials previously published or written by another person, or substantial proportions of material which have been accepted for the award of any other degree or diploma at UNSW or any other educational institution, except where due acknowledgement is made in the thesis. Any contribution made to the research by others, with whom I have worked at UNSW or elsewhere, is explicitly acknowledged in the thesis. I also declare that the intellectual content of this thesis is the product of my own work, except to the extent that assistance from others in the project's design and conception or in style, presentation and linguistic expression is acknowledged.

Signed ……………………………………………......

Date ……………………………………………......


Copyright Statement

‘I hereby grant the University of New South Wales or its agents the right to archive and to make available my thesis or dissertation in whole or part in the University libraries in all forms of media, now or hereafter known, subject to the provisions of the Copyright Act 1968. I retain all proprietary rights, such as patent rights. I also retain the right to use in future works (such as articles or books) all or part of this thesis or dissertation.

I also authorise University Microfilms to use the 350 word abstract of my thesis in Dissertation Abstract International (this is applicable to doctoral theses only). I have either used no substantial portions of copyright material in my thesis or I have obtained permission to use copyright material; where permission has not been granted I have applied/will apply for a partial restriction of the digital copy of my thesis or dissertation.’

Signed ……………………………………………......

Date ……………………………………………......

Authenticity Statement

‘I certify that the library deposit digital copy is a direct equivalent of the final officially approved version of my thesis. No emendation of content has occurred and if there are any minor variations in formatting, they are a result of the conversion to digital format.’

Signed ……………………………………………......

Date ……………………………………………......


Abstract

Since 2001 the Australian Government has applied increasing pressure on the Department of Defence to find efficiency savings in its ongoing business operations. During the early 2000s, the savings required by the Australian Government in the operational costs of the Department of Defence were fixed amounts, realised primarily through the reduction of administrative overheads and the early retirement of specific military platforms1 such as two of the guided missile frigates (Defence 2008).

In 2005 the Australian Government introduced the concept of an efficiency dividend to specific parts of the Department of Defence budget, requiring an ongoing percentage of budgetary funds to be returned to the Australian Government in the form of savings. Between 2005 and 2011, as the size and scope of the efficiency dividend increased, the Department of Defence was required to develop initiatives, both internally and externally, to meet its savings commitments to the Government. One of the initiatives under investigation is ensuring that the productivity benefits generated by the Australian defence industry are distributed equitably to the Department of Defence in its role as sole consumer of defence industry goods and services.

Using an index number approach, this study empirically measures the changes in the productivity of the Australian defence industry during the period 2001 to 2009. In addition, the study compares changes in defence industry productivity with changes in defence industry profits, defence industry employee wages, the quality of service being provided by the defence industry, and the level of competition in the defence industry marketplace, to determine how any productivity benefits are being distributed.

This study finds that during the period of interest, the defence industry experienced an increase in both multifactor and labour productivity. The study finds evidence to suggest that changes in defence industry productivity have an effect on both the profitability and performance measures calculated for the defence industry but have no obvious direct relationship with changes in defence industry employee wages. The study also finds evidence to suggest that even small changes in the level of competition within the defence industry market can potentially influence changes in defence industry productivity.

1 A platform refers to a military asset such as a ship, plane or vehicle.

Table of contents

Originality Statement ...... 2
Copyright Statement ...... 3
Authenticity Statement ...... 3
Abstract ...... 4
Table of contents ...... 5
List of charts ...... 9
List of figures ...... 12
List of tables ...... 13
Chapter 1 : Introduction ...... 15
Background ...... 15
Objectives of the research ...... 16
Significance of the research ...... 16
Thesis organisation ...... 16
Australian defence industry classification ...... 17
Defence industry market characteristics ...... 19
Chapter 2 : Literature review – productivity measurement ...... 23
Productivity concepts ...... 23
Productivity benefits ...... 24
Productivity and efficiency ...... 25
Measures of productivity ...... 26
Measurement variables ...... 27
Measures of output ...... 27
Measures of input ...... 29
Methods of productivity measurement ...... 31
Index number approach ...... 31
Stochastic frontier approach ...... 32
Data envelopment analysis approach ...... 34
Comparison of methods ...... 37
Chapter summary ...... 38
Chapter 3 : Literature review – productivity distribution ...... 39
Productivity benefits distribution ...... 39
Defence industry firms ...... 40

Profitability of defence industry firms ...... 40
Measuring profitability ...... 42
Defence industry employees ...... 44
Wages and productivity ...... 44
The Department of Defence ...... 45
Firm performance ...... 45
Competition within the Australian defence industry ...... 47
Chapter summary ...... 49
Chapter 4 : Methodology ...... 50
Productivity measurement ...... 50
Productivity index formula ...... 51
Output ...... 51
Input ...... 52
Data deflation ...... 55
Productivity distribution ...... 60
Defence industry firm profits ...... 60
Defence industry employee wages ...... 63
Defence industry performance ...... 63
Defence industry competition ...... 64
Chapter summary ...... 66
Chapter 5 : Data and measurement issues ...... 67
Productivity measurement ‐ data ...... 67
Scope ...... 67
Measurement of variables ...... 67
Output variable ...... 67
Input variables ...... 68
Data deflators ...... 70
Measurement issues ...... 70
Comparative ABS industry data ...... 71
Productivity benefit distribution ‐ data ...... 72
Defence industry profit measure ...... 72
Defence industry employee wage measure ...... 72
Defence industry performance measure ...... 72

Defence industry competition ...... 72
Chapter summary ...... 73
Chapter 6 : Productivity measurement empirical results and analysis ...... 74
Defence industry results ...... 74
Chapter summary ...... 89
Chapter 7 : Productivity benefits distribution empirical results and analysis ...... 90
Empirical results and analysis – profitability ...... 90
Profit vs. productivity comparison ...... 93
Empirical results and analysis ‐ employee wages ...... 98
Wage vs. productivity comparison ...... 99
Empirical results and analysis ‐ defence industry performance ...... 104
Performance vs. productivity comparison ...... 105
Empirical results and analysis ‐ defence industry competition ...... 111
Competition vs. productivity comparison ...... 119
Chapter summary ...... 120
Chapter 8 – Conclusions and recommendations ...... 121
Summary of research findings ...... 121
Productivity measurement ...... 121
Productivity benefits distribution ...... 121
Contributions of the research to knowledge ...... 124
Research limitations ...... 124
Suggestions for future research ...... 126
Policy implications ...... 126
Concluding remarks ...... 127
Annex A – Tables and charts ...... 128
Tables ...... 128
Charts ...... 139
Annex B – Individual defence industry firm MFP and LP empirical results ...... 140
Australian Defence Industries ...... 140
Australian Submarine Corporation ...... 142
Australian Aerospace ...... 144
BAE Australia ...... 146
Boeing Australia ...... 148

CAE Australia ...... 150
General Dynamic Land Systems Australia ...... 152
Raytheon Australia ...... 154
Tenix Defence Holdings ...... 156
Thales Australia ...... 158
Summary of results ...... 160
Annex summary ...... 161
Annex C – Defence industry experimental productivity measurement (value added vs. performance output measure) – empirical results and analysis ...... 162
Method ...... 162
Results – Individual firm ...... 162
Results – defence industry ...... 168
Summary ...... 169
Annex D – Defence industry firm profit results – accounting (EBDIT) and economic profit margins ...... 170
Australian Defence Industry ...... 170
Australian Submarine Corporation ...... 171
Australian Aerospace ...... 172
BAE Australia ...... 173
Boeing Australia ...... 174
CAE Australia ...... 175
General Dynamic Land Systems Australia ...... 176
Raytheon Australia ...... 177
Tenix Defence Holdings ...... 178
Thales Australia ...... 179
Annex summary ...... 180
Annex E – Australian defence industry firms – changes in revenue and employees ...... 181
Annex F – Firm Annual Reports and Financial Statements ...... 187
Bibliography ...... 190


List of charts

Chart 4.1 ‐ ABS 6427.0 Producer Price Indexes ‐ Aircraft Manufacturing and Repair Services and Shipbuilding and Repair Services ...... 56
Chart 4.2 ‐ ABS 6427.0 Producer Price Indexes ‐ Materials Used In and the Articles Produced By Manufacturing Division 22 Fabricated Metal Product Manufacturing ...... 57
Chart 6.1 ‐ Defence industry MFP index ...... 75
Chart 6.2 ‐ Defence industry MFP (Thales and BAE observations removed) ...... 76
Chart 6.3 ‐ MFP Comparison – Defence Industry vs. Manufacturing and Mining Industries ...... 78
Chart 6.4 ‐ MFP Sensitivity Testing – Defence Industry ...... 81
Chart 6.5 ‐ MFP Comparison V2 – Defence Industry (X2) vs. Manufacturing and Mining Industries ...... 83
Chart 6.6 ‐ LP Comparison – Defence (X2), Manufacturing and Mining Industries ...... 87
Chart 7.1 ‐ Defence Industry Annual Accounting (EBDIT) and Economic Margin ...... 91
Chart 7.2 ‐ Australian Defence Manufacturing Industries Profit Indexes (Based on Current Profits) ...... 96
Chart 7.3 ‐ Industry Comparison – Employee Compensation Growth – 1999 to 2009 ...... 99
Chart 7.4 ‐ Defence Industry Comparison – Productivity, Profitability, Wages ...... 102
Chart 7.5 ‐ Defence Industry Scorecard Performance Index ...... 104
Chart 7.6 ‐ Defence Industry Comparison – Performance (overall, schedule, cost) vs. Productivity ...... 106
Chart 7.7 ‐ Defence industry performance ‐ overall score ...... 109
Chart 7.8 ‐ Defence industry performance ‐ schedule score ...... 109
Chart 7.9 ‐ Defence industry performance ‐ cost score ...... 110
Chart 7.10 ‐ Defence Contract Procurement Method (Number of Contracts) ...... 113
Chart 7.11 ‐ Defence Contract Procurement Method ($ Value of Contracts) ...... 114
Chart 7.12 ‐ Defence Contract Procurement Method ($ Difference between Competitively and Non‐Competitively Procured Contracts) ...... 115
Chart 7.13 ‐ Department Of Defence Contract Procurement Method ($ Difference between Competitively and Non‐Competitively Procured Contracts – Major Contract Adjustment) ...... 118
Chart 7.14 ‐ Defence Procurement Induced Competition vs. Defence Industry LP ...... 119

Chart A.1 ‐ Australian Defence Industry Labour Productivity Index – Sensitivity Testing Results ...... 139

Chart B.1 ‐ ADI MFP, Input and Output Indexes ...... 140
Chart B.2 ‐ ADI LP, Input and Output Indexes ...... 141
Chart B.3 ‐ ASC MFP, Input and Output Indexes ...... 142
Chart B.4 ‐ ASC LP, Input and Output Indexes ...... 143
Chart B.5 ‐ AA MFP, Input and Output Indexes ...... 144
Chart B.6 ‐ AA LP, Input and Output Indexes ...... 145
Chart B.7 ‐ BAE MFP, Input and Output Indexes ...... 146
Chart B.8 ‐ BAE LP, Input and Output Indexes ...... 147

Chart B.9 ‐ Boeing MFP, Input and Output Indexes ...... 148
Chart B.10 ‐ Boeing LP, Input and Output Indexes ...... 149
Chart B.11 ‐ CAE MFP, Input and Output Indexes ...... 150
Chart B.12 ‐ CAE LP, Input and Output Indexes ...... 151
Chart B.13 ‐ GDLS‐A MFP, Input and Output Indexes ...... 152
Chart B.14 ‐ GDLS‐A LP, Input and Output Indexes ...... 153
Chart B.15 ‐ Raytheon MFP, Input and Output Indexes ...... 154
Chart B.16 ‐ Raytheon LP, Input and Output Indexes ...... 155
Chart B.17 ‐ Tenix MFP, Input and Output Indexes ...... 156
Chart B.18 ‐ Tenix LP, Input and Output Indexes ...... 157
Chart B.19 ‐ Thales MFP, Input and Output Indexes ...... 158
Chart B.20 ‐ Thales LP, Input and Output Indexes ...... 159

Chart C.1 ‐ ADI LP Comparison (VA Output vs. Performance Output) ...... 163
Chart C.2 ‐ ASC LP Comparison (VA Output vs. Performance Output) ...... 163
Chart C.3 ‐ AA LP Comparison (VA Output vs. Performance Output) ...... 164
Chart C.4 ‐ BAE LP Comparison (VA Output vs. Performance Output) ...... 164
Chart C.5 ‐ Boeing LP Comparison (VA Output vs. Performance Output) ...... 165
Chart C.6 ‐ CAE LP Comparison (VA Output vs. Performance Output) ...... 165
Chart C.7 ‐ GDLS‐A LP Comparison (VA Output vs. Performance Output) ...... 166
Chart C.8 ‐ Raytheon LP Comparison (VA Output vs. Performance Output) ...... 166
Chart C.9 ‐ Tenix LP Comparison (VA Output vs. Performance Output) ...... 167
Chart C.10 ‐ Thales LP Comparison (VA Output vs. Performance Output) ...... 167
Chart C.11 ‐ Defence Industry LP ‐ VA vs. Performance Measure of Output ...... 168

Chart D.1 ‐ ADI Profit Margin ...... 170
Chart D.2 ‐ ASC Profit Margin ...... 171
Chart D.3 ‐ AA Profit Margin ...... 172
Chart D.4 ‐ BAE Profit Margin ...... 173
Chart D.5 ‐ Boeing Profit Margin ...... 174
Chart D.6 ‐ CAE Profit Margin ...... 175
Chart D.7 ‐ GDLS‐A Profit Margin ...... 176
Chart D.8 ‐ Raytheon Profit Margin ...... 177
Chart D.9 ‐ Tenix Profit Margin ...... 178
Chart D.10 ‐ Thales Profit Margin ...... 179

Chart E.1 ‐ ADI ‐ Changes in Revenue and Employees ...... 181
Chart E.2 ‐ ASC ‐ Changes in Revenue and Employees ...... 182
Chart E.3 ‐ AA ‐ Changes in Revenue and Employees ...... 182
Chart E.4 ‐ BAE ‐ Changes in Revenue and Employees ...... 183
Chart E.5 ‐ Boeing ‐ Changes in Revenue and Employees ...... 183


Chart E.6 ‐ CAE ‐ Changes in Revenue and Employees ...... 184
Chart E.7 ‐ GDLS‐A ‐ Changes in Revenue and Employees ...... 184
Chart E.8 ‐ Raytheon ‐ Changes in Revenue and Employees ...... 185
Chart E.9 ‐ Tenix ‐ Changes in Revenue and Employees ...... 185
Chart E.10 ‐ Thales ‐ Changes in Revenue and Employees ...... 186


List of figures

Figure 2.1 ‐ The Production Function (Single Input (X) and Output (Y)) ...... 25
Figure 2.2 ‐ The SFA Frontier (Single Input (X) and Output (Y)) ...... 33
Figure 2.3 ‐ The DEA Frontier (Single Input (X) and Output (Y)) ...... 35


List of tables

Table 4.1 ‐ Australian Defence Industry Deflator – Cost of Goods Sold ...... 58
Table 4.2 ‐ Procurement Methods Categorised as Competitive or Non‐Competitive ...... 66
Table 6.1 ‐ MFP Cycle Analysis (Annual % Change) – Defence, Manufacturing and Mining Industries ...... 79
Table 6.2 ‐ MFP Cycle Analysis V2 (Annual % Change) – Defence (X2), Manufacturing and Mining Industries ...... 83
Table 6.3 ‐ Australian Defence Industry, Manufacturing, Mining ‐ Labour and Capital Shares ...... 85
Table 6.4 ‐ LP Cycle Analysis (Annual % Change) – Defence (X2), Manufacturing and Mining Industries ...... 88
Table 7.1 ‐ Defence and Manufacturing Industry Comparison ‐ Profit vs. Labour Productivity ...... 94
Table 7.2 ‐ Industry Comparison – Employee Wages and Productivity (2001‐2009) ...... 100
Table 7.3 ‐ Defence Industry Comparison – Average Annual Change (Productivity, Profitability, Wages) ...... 102
Table 7.4 ‐ Defence Industry Comparison ‐ Performance vs. Labour Productivity ...... 106
Table 7.5 ‐ Procurement Methods Categorised as Competitive or Non‐competitive ...... 111
Table 7.6 ‐ Defence Contract Procurement Method (% Total Contract by Value ($) and Total Number of Contracts) ...... 112
Table 7.7 ‐ Significant Competitively Procured Defence Contracts ...... 116
Table 7.8 ‐ Changes in Defence Procurement Induced Competition vs. Defence Industry LP ...... 120

Table A.1 ‐ Defence Company ScoreCard Performance Parameters ...... 128
Table A.2 ‐ Defence Company ScoreCard Assessment Ratings ...... 129
Table A.3 ‐ Australian Defence Industry Sample Size and Scope ...... 129
Table A.4 ‐ Australian Defence Industry Employee Generated Data Observations ...... 130
Table A.5 ‐ Australian Defence Industry Deflator – Sales Data ...... 131
Table A.6 ‐ Australian Defence Industry Deflator – Cost of Goods Sold Data ...... 132
Table A.7 ‐ Australian Defence Industry Labour and Capital Shares – Individual Firm ...... 133
Table A.8 ‐ Individual Firm MFP Changes Relative to Previous Observation (MFP, Output, and Input) ...... 134
Table A.9 ‐ Individual Firm LP Changes Relative to Previous Observation (LP, Output, and Input) ...... 135
Table A.10 ‐ Australian Defence Industry Productivity Index – Company Weights (Pre‐Data Cleanse) ...... 136
Table A.11 ‐ Australian Defence Industry Productivity Index – Company Weights (Post‐Data Cleanse) ...... 136
Table A.12 ‐ Firm Profit Results (Annual Changes EBDIT relative to previous years) ...... 137
Table A.13 ‐ Average Annual Employee Compensation (Current Values) ...... 137
Table A.14 ‐ Company ScoreCard Statistics ...... 138

Table B.1 ‐ Australian Defence Industry – Individual Firm MFP Estimates...... 160


Table D.1 ‐ Summary Firm Average Profit Margin (Accounting and Economic)...... 180


Chapter 1 : Introduction

Background

The Department of Defence budget for military equipment acquisition for the period 2010 to 2014 is approximately $21 billion (Australian Government 2010 p. 20). By 2030 it is estimated that the Department of Defence will have spent around $77 billion on new acquisitions (Defence 2009a p. 1) of which, based on historical spending patterns between 2001 and 2009, approximately 80%2 is likely to be spent within Australia.

The Australian Government has announced that a major part of future funding for military equipment acquisitions will be obtained in the form of ‘savings’ derived from the Department of Defence Strategic Reform Program (SRP) (Defence 2009b p. 3). To meet its share of the overall Department of Defence portfolio’s savings, the Defence Materiel Organisation (DMO) has determined that a portion of this cost reduction will be achieved through “productivity improvements in industry” (ANAO 2008 p. 49).

To achieve this outcome, the DMO aims to provide an operating environment that encourages productivity growth in defence related industries as well as one that allows the Department of Defence to capture an equitable portion of the benefits derived from increased industry productivity. From a Department of Defence perspective, these benefits potentially include a reduction in product cost as well as improvements in product quality, specification and availability3. In terms of cost savings alone, if the Department of Defence could capture an additional 0.25% share of defence industry productivity growth in the form of a product price reduction, this could result in theoretical savings to the Department of Defence of $192.5 million up until 20304.
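As a rough check, the arithmetic behind the theoretical savings figure quoted above (footnote 4) can be sketched as follows. The numbers are taken directly from the text: the $77 billion projected acquisition spend to 2030 and a hypothetical additional 0.25% share of productivity growth captured as a price reduction.

```python
# Worked check of the theoretical savings figure in footnote 4.
# Both inputs come from the surrounding text; this is an illustration,
# not part of the thesis methodology.

projected_spend_bn = 77.0   # $77 billion projected new-acquisition spend to 2030
captured_share = 0.0025     # an additional 0.25% captured as a price reduction

# Convert billions to millions, then apply the captured share.
savings_m = projected_spend_bn * 1000 * captured_share
print(f"Theoretical savings: ${savings_m:.1f} million")  # $192.5 million
```

The calculation is linear, so each additional 0.25% captured adds the same $192.5 million to the theoretical savings.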

In order to make informed decisions in the future on policies and procedures aimed at improving both the generation of productivity growth in the defence industry and the distribution of productivity benefits to the Department of Defence, the Chief Executive Officer of the DMO has posed the following question in an attempt to determine historically how much

2 Data obtained from the Interim Department of Defence Contract Register which records all contracts issued by the Department of Defence (refer to Chapter 4 for detailed information on the Department of Defence Contract Register).
3 Availability refers to the period in which a particular piece of equipment is available for military use; commonly stated in days per year, aircraft flying hours, etc.
4 $192.5m = 0.25% x $77bn.

productivity growth has been generated within the defence industry and how any productivity benefits are being distributed.

“Productivity capture and apportionment ‐ Who captures the capital and labour and technology productivity in Defence supply, how much, and why? ‐ the supplier, the company’s staff, the Government, or the ADF warfighters?” (Defence 2009c).

Objectives of the research

Motivated by the above question, the objectives of this research are to:

1. Measure the productivity performance of the Australian defence industry, and
2. Determine how the Australian defence industry productivity benefits are being distributed amongst defence industry firms, defence industry employees and the Department of Defence5.

Significance of the research

Several prior studies identified within the literature examine the productivity performance of defence industries outside Australia (Stansberry 1985, Chan 1997, Barros 2001, 2004, 2005), but none attempts to determine the actual distribution of productivity benefits, either internally or externally. This research is therefore significant in two respects. First, it is the only known study concerned with measuring the productivity performance of the Australian defence industry. Second, it is the only known attempt to determine and quantify the flow of productivity benefits amongst Australian defence industry stakeholders, which is also a relatively new and untested concept.

Thesis organisation

Chapter 1 provides a brief introduction to the thesis topic and defines the Australian defence industry for the purpose of this research. The chapter also details the unique characteristics of the Australian defence industry market. Chapter 2 provides a detailed literature review of productivity concepts and commonly used measurement methods, and identifies potential issues that may influence the measurement of productivity in a defence industry environment. A literature review of the

5 For the purpose of this research, only productivity benefits being distributed to the Department of Defence as a whole entity will be considered, which by definition include any benefits being obtained by the ADF war fighter. No attempt will be made to decompose any measure related to determining what productivity benefits are received by the Department of Defence in order to identify the specific portion obtained by the ADF war fighter.

distribution of defence industry productivity benefits amongst stakeholders is detailed in Chapter 3, with potential apportionment measures investigated, including profitability, employee wages, performance and defence industry competition levels. Chapter 4 then details the methodology used in both the productivity measurement and the distribution of productivity benefits, with associated data and measurement issues being the focus of Chapter 5. Chapter 6 presents the empirical results for the defence industry multifactor and partial factor productivity measurements, with the empirical results relating to the productivity distribution measures presented in Chapter 7. Chapter 8 provides an overall summary of the results and possible policy implications of the findings for the Department of Defence.

Australian defence industry classification

Since the Australian Bureau of Statistics (ABS) does not have an official classification for the Australian defence industry, it is not reported in the ABS statistical framework as a separate industry. Rather, defence oriented firms are included in broader ABS industry categories such as manufacturing. This lack of official classification appears to be due to the fact that the term “defence industry” is commonly used to encapsulate any firm capable of producing goods and services used for military purposes domestically or abroad (Markowski and Hall 2006 p. 43). The identification and separate reporting of such firms is likely to be beyond the current capability of the ABS or any other statistical agency, especially considering that an Australian defence industry would only account for approximately 1.3%6 of the ABS defined Australian manufacturing industry as a whole.

In lieu of an ABS defence industry definition, ACIL Tasman (2004 p. 4) proposed the following definition of the Australian defence industry in a report that was commissioned by several Australian Government Agencies, including the Department of Defence:

“The term ‘defence industry’ denotes those Australian industries that are actually or potentially involved in supplying Australian Defence Force capability and which are influenced by Defence business policies or purchasing decisions. Implicit in this definition are some important judgements about the ambit of defence industry. This definition includes selected defence oriented elements of the manufacturing sector (including shipbuilding, aerospace, automotive, chemicals, electrical and electronic

6 Based on a comparison of 2009 sales revenue of the Australian manufacturing industry against the sales revenue earned by the ten Australian firms included in the definition of an Australian defence industry.

equipment, other fabricated metal products and machinery and equipment). The definition excludes from the defence industry profile those industries supplying goods or services which, while perhaps critical to ADF functionality (e.g. petroleum, oil and lubricants; civil roads, harbours and airports; commercial information technology), are not significantly affected by Defence policy or purchasing.”

For the purpose of this study, the definition of the Australian defence industry will be based on the ACIL defence industry definition stated above, with additional emphasis on the fact that the individual firms used in this research rely primarily on the Australian Department of Defence for their business.

Using data obtained from the Department of Defence’s Interim Defence Contract Register (IDCR), together with consideration of the availability of individual firm data, the scope of the Australian defence industry for this research will be limited to the following ten firms and their controlled entities:

1. ASC Pty Ltd (ASC).
2. ADI Limited (ADI).
3. Australian Aerospace Limited (AA) (formerly Eurocopter International Pacific Limited).
4. BAE Systems Australia Holdings Limited (BAE).
5. Boeing Australia Limited (Boeing).
6. CAE Australia Pty Ltd (CAE) (formerly CAE Electronics (Australia) Pty Limited).
7. General Dynamics Land Systems Australia Pty Ltd (GDLS‐A).
8. Raytheon Australia Pty Ltd (Raytheon).
9. Tenix Defence Pty Limited (Tenix).
10. Thales Australia Holdings Pty Ltd7 (Thales) (formerly Thales International Pacific Holdings Pty Limited).

Historically, the ten companies listed above have accounted for approximately 77%8 of the Department of Defence’s domestic spend on military equipment and sustainment9.

7 Thales Underwater Systems Pty Limited was also consolidated manually from 2004 to 2007 prior to it being included as part of the consolidated entity of Thales Australia Holdings Pty Limited in 2008.
8 Based on data obtained from the Interim Defence Contract Register.
9 Sustainment is the military term used to define the service support required for military equipment. Typical services include repair and maintenance, engineering, logistics, configuration management and disposal action (DMO 2010a p. 178).

Several other Australian firms that satisfied the criteria for inclusion as part of an Australian defence industry, such as SAAB Systems Pty Ltd (SAAB), were considered but excluded from the final sample due to a lack of usable financial data.

Defence industry market characteristics

Unlike a traditional competitive market where there are many buyers and sellers, the defence market consists of one purchaser and only a few suppliers, leading Agapos and Gallaway (1970 p. 1094) to conclude that the defence marketplace “cannot be adequately described by models of the competitive, oligopolistic, monopolistic or even bilateral monopoly type”. Nevertheless, if a single characteristic had to be chosen to define the defence marketplace, the industry appears more monopolistic than competitive in orientation.

In 2001 the Australian Minister for Defence stated that the Department of Defence is the sole purchaser of defence equipment and services in Australia (Reith 2001 p. 17). Because of this unique position, changes in the magnitude of the Department of Defence’s spending directly affect the overall size and structure of the defence industrial base (Chan 1997 p. 3). In addition to this influence on the overall defence market structure, the Department of Defence’s ultimate choice of weapon system, combined with its ability to determine the method of procurement and to enter directly into contract negotiations with one or more firms, can indirectly control the entry, exit and growth of the firms it chooses to do business with (Agapos 1971 p. 37). This ability of the Department of Defence to control the defence market was highlighted in a speech by the Australian Minister for Defence at the 2011 Defence and Industry Conference, where he stated that the Department of Defence had the ability to exclude companies from any future tender process if deemed appropriate (Smith 2011).

This reliance of firms in the sector on a single buyer has been perceived as detrimental to the long‐term viability of a successful defence industry (Chan 1997 pp. 23‐24). To reduce this fundamental reliance on the Department of Defence as their sole customer, the possibility of defence firms producing competitively in commercial markets in addition to the defence market has been raised. While this may be plausible for some firms, such as Thales Australia, which has been expanding into commercial Air Traffic Management and transportation, for other defence industry firms that provide only military‐specific equipment and services, a transition into mainstream commercial markets may not be possible. Several


possible reasons for this inability to transition have been proposed in a 1997 United States Government Accountability Office report on trends in defence spending, industrial productivity and competition (Chan 1997 p. 24). Reasons detailed in the report include the possibility that the production processes and related infrastructure used by defence firms to supply goods and services to the Department of Defence are significantly different from those used by commercial companies producing competitive products for the average consumer, as well as the possibility that some military products are simply not readily transferable to commercial use.

The supply side of the defence market consists of companies that provide goods and services ranging from weapons to clothing (Fonfria and Correa‐Burrows 2010 p. 177). Historically, the number of firms participating in domestic defence industries globally has been decreasing. During the period 1946 to 1994, in the United States defence market alone, the number of aircraft contractors decreased from 26 to 7, the number of missile contractors from 22 to 9, and the number of tank contractors from 16 to 2 (Chan 1997 p. 4). Of the firms that remain in the defence market, Hartley (2003 p. 108) found that many have tended to form strategic alliances, primarily as a result of real‐term decreases in government spending on defence equipment and services. Based on a study of the United Kingdom defence industry, Bishop (2003 p. 1965) went so far as to suggest that collaboration between firms at the international level had also increased, especially between firms that were traditionally direct competitors. This apparent increase in defence firm alliances and collaboration has the undesirable effect of whittling away whatever effective level of competition remains as defence firms merge or exit the defence market.

In the Australian defence industry, ten firms have historically accounted for approximately 77%10 of the market, with eight of the ten being wholly owned subsidiaries of foreign multinationals. Similar to the experience in the United States defence market, a reduction in the number of defence firms operating in Australia has also been particularly evident in the last decade, with the acquisitions of ADI by Thales in 2006 and Tenix by BAE in 2008. These two acquisitions alone meant the two resulting firms, Thales and BAE, accounted for approximately 50%11 of the domestic defence market in 2009.

10 Based on the Department of Defence Interim Contract Register between 1999 and 2009.
11 Based on a comparison of revenue earned in 2009 by the ten Australian defence industry firms.

The inability of new firms to enter the defence market also exacerbates the issue of a shrinking defence industry. Some of the issues considered to inhibit new entrants include the high initial capital investment generally required, the perceived administrative and contractual burdens inherent in a highly regulated industry such as defence (Arena and Birkler 2009 p. 5), and the instability of defence market demand (Stansberry 1985 p. 156).

As a consequence of the declining number of firms operating in the defence market, the supply side of the market is best described as one with a few near-monopolistic firms rather than as a competitive market (Suarez 1976 p. 399). Fonfria and Correa-Burrows (2010 p. 178) state that in the Spanish defence industry there are many examples of individual major weapon systems being controlled by monopolists. This situation also exists in the Australian defence market and is particularly apparent when the market is categorised in terms of the five main defence industry sectors, being Electronics, Maritime, Aerospace, Land (i.e. vehicles) and Weapons/Munitions (Defence 2010 p. 27), with individual firms dominating supply to the different Department of Defence capabilities.

To combat the further shrinking of the Australian defence market, the Australian Government is actively pursuing measures to increase firm competition within the sector (Defence 2009d). However, increasing competition is not the only policy measure that the Australian and other Governments are actively pursuing.

For years, Governments have been regulating all aspects of defence industry based on the premise that a level of regulation is beneficial in enhancing the return on government defence spending. However, the actual benefits and underlying motivation of government regulation relating to the defence industry have frequently been questioned. Bishop (2003 p. 1965) claimed that some government policies can create barriers for new firms entering the market. More recently, a former Australian Minister for the Department of Defence indirectly conceded that government actions can impact negatively on the defence industry; he acknowledged that project‐by‐project acquisition commonly undertaken by the Department of Defence results in a “boom and bust” environment in which defence firms must operate (Reith 2001 p. 12). Direct regulation is not the only way in which Governments can influence the defence market.

As the sole consumer of defence products, the government can also directly influence Department of Defence acquisition policy through the budget. This method of influencing the market is questionable, particularly if the underlying motivation for a specific budget proposed by the Government is concerned more with “buying votes” than with the impact that changes in Department of Defence spending patterns may have on the defence industry (e.g. budget motivations may change during the election cycle and upon a change of Government). The importance of political motives in influencing defence industry policy is highlighted by Kovacic (1999 pp. 450–451), who states that one of the two main obstacles to allowing non-United States defence suppliers access to the United States defence market was directly politically motivated, as United States defence firms are a key employer in many electorates. Poole and Bernard (1992 p. 449) also identified the role that politics played in Canadian defence industry policy management and stated that “defence as economics” and “defence expenditure as vote getters” constitute bad industrial policy.

Despite any potential political motives to secure votes for the incumbent government, the official justification given by the Australian Government for changing defence industry policy generally relates to increasing competition in the domestic defence market with the aim of generating productivity growth in defence firms, encouraging the growth of Australian firm involvement in defence supply, and ensuring that capabilities considered vital to Australia’s strategic military capabilities can be produced in Australia (Defence 2009d pp. 125–130).


Chapter 2: Literature review – productivity measurement

The chapter commences by outlining the concept of productivity, the benefits that can be obtained from increases in productivity, and the difference between productivity and efficiency, two terms that are commonly conflated. The chapter goes on to detail the difference between partial and multifactor measures of productivity, followed by an analysis of three methods of measurement commonly used in productivity research. The chapter concludes with a comparative analysis of these three methods in relation to their suitability for measuring changes in defence industry productivity.

Productivity concepts

As defined in the literature, productivity relates to the change in output(s) produced by a nation, industry or firm compared to the change in input(s) used (Coelli et al. 2005 p. 2) and is generally represented by a simple output/input ratio, as follows:

Productivity = Output(s) / Input(s)

Productivity growth (or decline) describes the change in productivity over several periods and is the difference between the rate of growth of output and the rate of growth of input. Productivity growth occurs when the increase in the level of output(s) is greater than the corresponding increase in the total amount of input(s) used during a period. While productivity growth is commonly associated with increasing output, it can also be achieved with a decrease in output, provided that inputs decrease at a proportionally higher rate over the same period (Van Biesebroeck 2007 p. 532).
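The point above can be sketched numerically. The following is a minimal illustration with hypothetical figures, showing that productivity can grow even when output falls, provided inputs fall proportionally faster:

```python
# A minimal sketch (hypothetical figures) of the definition above:
# productivity can grow even when output falls, provided inputs fall
# proportionally faster.

def productivity(output, inputs):
    """Simple productivity ratio: output / input."""
    return output / inputs

def growth_rate(current, previous):
    """Period-on-period growth rate."""
    return current / previous - 1.0

# Period 0: 100 units of output from 50 units of input.
# Period 1: output falls 10%, but input falls 20%.
p0 = productivity(100.0, 50.0)   # 2.0
p1 = productivity(90.0, 40.0)    # 2.25

print(round(growth_rate(p1, p0), 3))  # 0.125 -> productivity grew 12.5%
```

Although output fell by 10%, the 20% fall in input leaves the output/input ratio higher than before.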

In 1957, Solow described the portion of productivity growth between two periods that could not be attributed to changes in inputs as the “residual”, which under the assumptions of constant returns to scale12 and perfect competition in input and output markets, would theoretically be a measure of the technical change13 (also known as technical progress) that occurred during the period (Solow 1957 p. 319).

12 Constant returns to scale is where a proportional increase in inputs results in the same proportional increase in output (Coelli et al. 2005 p. 17).
13 Technical change is defined by Morrison (1993 p. 81) as a shift in the production frontier so that any output level may be produced with a smaller amount of input. While the term “technical change” is generally associated with advances in technology and innovation, the term also encompasses improvements in corporate knowledge and experience as well as improved processes and procedures (Cowing and Stevenson 1981 p. 5).

Expanding on Solow’s concept, Abramovitz (1962 p. 763) stated that Solow’s residual is not just the result of technical change but could also be attributed to “the advance of knowledge, scale effects, and gains or losses from reductions or enlargements of market restraints, lags in application of knowledge, and similar causes”. If all these factors were correctly identified and accounted for, Jorgenson and Griliches (1967) proposed, the Solow residual should completely disappear. While this may be theoretically plausible, Hulten (2001) highlights that unavoidable practical issues such as measurement errors, unmeasured gains in product quality and the environmental costs of growth will always result in the residual being a combination of technical change and unexplainable error. Separating and identifying the makeup of the residual is commonly referred to as the decomposition of productivity change (Denny et al. 1981 p. 180).
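Solow’s residual can be illustrated with a standard growth-accounting calculation under the assumptions stated above (constant returns to scale and competitive factor markets, so income shares sum to one). All figures below are hypothetical:

```python
# A hedged numerical sketch of Solow's growth-accounting residual.
# Assumptions: constant returns to scale and competitive factor markets,
# so the capital and labour income shares sum to one. Figures hypothetical.
import math

def solow_residual(q_growth, k_growth, l_growth, capital_share):
    """Residual = output growth minus share-weighted input growth (log terms)."""
    labour_share = 1.0 - capital_share  # CRS: shares sum to one
    return q_growth - capital_share * k_growth - labour_share * l_growth

# Log growth rates over the period (hypothetical):
q = math.log(1.05)   # output grew 5%
k = math.log(1.03)   # capital grew 3%
l = math.log(1.01)   # labour grew 1%

residual = solow_residual(q, k, l, capital_share=0.3)
print(round(residual, 4))  # the portion of growth not explained by inputs
```

The residual captures everything not attributed to measured input growth, which is precisely why Abramovitz and later authors questioned how much of it is genuinely technical change.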

Productivity benefits From a national economic perspective, productivity growth is beneficial, as it results in the conservation or savings of scarce resources per unit of output (Kendrick 1977 p. 1). Other benefits of productivity growth identified by Kendrick (1977 p. 1) include its influence on mitigating the effects of inflation by offsetting rising employee wages and other input price rises, as well as increasing the competitiveness of production.

At the industry and firm levels, productivity growth results in more income being available for distribution to stakeholders. Some of the different ways in which benefits from productivity growth can be distributed are through increased profits to shareholders, through better wages and conditions to employees, through lower prices to customers, and through increased tax payments to Government (Productivity Commission 2009). In relation to the Department of Defence as the sole consumer of defence goods and services, the potential productivity benefits are not restricted to lower product prices but may also be realised in the form of better quality products and improved project delivery performance.

While productivity growth results in economic benefits, during periods of productivity slowdown or decline the opposite can occur, with adverse economic effects being observed. In a study of productivity movement in the United States in the 1960s and 1970s, Kendrick (1977 p. 2) determined that a slowdown in productivity growth was associated with intensified inflationary pressures and a reduction in real income growth and the international competitiveness of American produced goods.


Productivity and efficiency

Productivity and efficiency are closely related concepts that are often incorrectly used interchangeably. As previously outlined, the study of productivity is concerned with the identification and measurement of technical change (which generally involves the assumption that a unit is fully efficient), whereas the study of efficiency relates to how effectively a unit transforms inputs into outputs given current technology. This concept is illustrated below in Figure 2.1 for a simple production process with a single input (X) and output (Y).

Figure 2.1 – The Production Function (Single Input (X) and Output (Y))

[Diagram: production frontiers F0 and F1 in input–output space, with observation points 1 to 5; points 2, 4 and 5 lie on F0, point 1 lies below F0, and point 3 lies on F1.]

The production function (F0) represents all feasible input–output combinations based on current technology. Firms operating at points 2, 4 and 5 on F0 are considered fully efficient, whereas firm 1, operating below F0, is considered inefficient, as it could theoretically increase output (Y) to point 2 while maintaining the same level of input (X) for the given technology.

If technology advances over a period and the production function shifts from F0 to F1, all firms can now produce more output using the same level of input. If a firm moves from point 2 to point 3 over the period, this shift would be attributed to a change in technology (i.e. a true productivity increase). If, however, a firm moves from point 1 to point 3 during the period, the increase in production would be a combination of an increase in efficiency (the distance between points 1 and 2) and an increase in productivity (the distance between points 2 and 3).
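The decomposition described above can be sketched with hypothetical linear frontiers, where F0(x) = 2x is the period-0 frontier and F1(x) = 3x the frontier after technical change:

```python
# A small numerical sketch of the decomposition in Figure 2.1, using
# hypothetical linear frontiers: F0(x) = 2x (period 0) and F1(x) = 3x
# (period 1, after technical change).

def frontier0(x):
    return 2.0 * x

def frontier1(x):
    return 3.0 * x

x = 1.0
point1 = 1.5            # firm operating below F0 (inefficient)
point2 = frontier0(x)   # 2.0 -> fully efficient on F0
point3 = frontier1(x)   # 3.0 -> fully efficient on F1

# A move from point 1 to point 3 combines both effects:
efficiency_gain = point2 - point1   # catching up to the old frontier
technical_gain = point3 - point2    # outward shift of the frontier itself

print(efficiency_gain, technical_gain)  # 0.5 1.0
```

Only the second component is a true productivity (technical change) gain; the first is the firm eliminating its own inefficiency.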


The importance of firm efficiency was stressed by Farrell (1957 p. 253), as it gives an indication of how much further an entity can “be expected to increase output by simply increasing its efficiency, without absorbing further resources”. The efficiency of a firm also gives an indication of its competitiveness compared to others in the same industry (Farrell 1957 p. 262).

Measures of productivity

A review of the literature found that productivity can be measured in several different ways, with the measure chosen generally based upon the availability of appropriate data. According to the Organisation for Economic Co-operation and Development (OECD) (Schreyer 2004 pp. 3–4), the most commonly used measures of productivity are:

1. Partial measures of labour and capital productivity.

Labour (L) productivity = Output (Q) / (L)

Capital (K) productivity = (Q) / (K)

2. Labour‐capital multifactor productivity (MFP) measurements.

MFP (K, L) = Q / I(K, L)

3. Capital, labour, energy, materials, and services (KLEMS) MFP measurements (also commonly referred to as a total factor productivity (TFP) measurement).

MFP (KLEMS) = Q / I(K, L, E, M, S)

In the early stages of productivity theory development, partial measurements of productivity were common, as they were relatively easy to compute compared to the more complicated MFP measures. However, partial measures of productivity can be misleading if considered in isolation (Coelli et al. 2005 p. 3), due primarily to the factor substitution effect (Morrison 1993 p. 31), whereby an increase in a partial productivity measure such as labour productivity may simply be due to the substitution of another input, such as capital, for labour.
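The factor substitution effect can be illustrated with hypothetical figures in which labour productivity appears to double while multifactor productivity is unchanged, because labour has simply been replaced with capital:

```python
# A hedged illustration (hypothetical figures) of the factor substitution
# effect: labour productivity rises although multifactor productivity is
# unchanged, because labour has been substituted with capital. The combined
# input index here is a simple Cobb-Douglas style geometric mean, chosen
# purely for illustration.

def labour_productivity(q, labour):
    return q / labour

def mfp(q, labour, capital, labour_share=0.5, capital_share=0.5):
    """Output over a geometric combined index of labour and capital."""
    return q / (labour ** labour_share * capital ** capital_share)

# Period 0: output 100 from 25 units of labour and 4 units of capital.
# Period 1: same output, half the labour, but double the capital.
lp0, lp1 = labour_productivity(100, 25), labour_productivity(100, 12.5)
mfp0 = mfp(100, 25, 4)
mfp1 = mfp(100, 12.5, 8)

print(round(lp1 / lp0, 6))    # 2.0 -> labour productivity "doubled"
print(round(mfp1 / mfp0, 6))  # 1.0 -> multifactor productivity is unchanged
```

Considered in isolation, the partial labour measure would wrongly suggest a large productivity gain.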

To avoid partial measurement issues and to ensure that all production inputs and outputs are considered, a complete measure of productivity such as a TFP measure is recommended (Kendrick 1965 p. 122). The practical application of TFP measures has been found by researchers to be problematic due to the inherent difficulties in obtaining practical measurements for all identified outputs and inputs. As a consequence, researchers have tended to use MFP measures in the practical application of productivity research as a substitute for a theoretical TFP measure. These MFP models range from two-factor models, such as a labour–capital model, to the more detailed KLEMS model.

Measurement variables

Fundamentally, the measurement of productivity is a simple output/input ratio. Determining the units of measurement for output and input, however, is not as straightforward. As stated in the literature, there are numerous issues that need to be considered when determining a practical unit of measurement for outputs and inputs; these are discussed below.

Measures of output

The unit of measurement used to describe a firm’s output is commonly expressed in non-monetary terms, such as the number of widgets per year. While a non-monetary unit of measurement would appear logical in the measurement of productivity, as it may more accurately define a firm’s output, there are several reasons why this does not occur in practice and why a financial measurement of output is used instead.

For a single‐product firm where the product is homogeneous over time (e.g. bricks, coal, electricity, etc), identifying a non‐monetary measure of output may be relatively simple. However, in the case of a firm that has heterogeneous output (i.e. multi‐product or where the quality of a single product changes dramatically over time), comparability issues arise. That is, the ability to compare products decreases over a period of time unless significant adjustments are made for differences in product specification and quality. These issues are further compounded when output data are aggregated at an industry level, as adjustments are required not only for differences within individual firms but also for differences between firms. For example, two firms could produce the same product but with vastly different specifications and levels of quality.

The 1997 Government Accountability Office report (Chan 1997 p. 4) highlights an example of the quality of a product changing in a defence industry environment; the report determined that the output trend experienced in the United States defence industry between 1973 and 1993 was a decrease in output volume while output value increased. The Government Accountability Office study found that there were 65% fewer aircraft produced in 1993 compared to 1973, while 1993 inflation-adjusted budgets for aircraft procurement were more than double those of 1973. These findings highlight the misleading results that a productivity measurement may produce if quality is not taken into account. Hypothetically, if the Government Accountability Office findings were used to measure productivity in the defence aircraft industry and no adjustments for quality were made, the likely outcome would be a decline in productivity, as output (measured by aircraft numbers) had decreased while input (measured by cost) had increased over time. These results are likely to be inaccurate, as the doubling in aircraft real prices found by the Government Accountability Office supports the fact that one aircraft in 1993 is not necessarily equivalent to one aircraft in 1973. As a consequence of this difference, although the number of aircraft had decreased, productivity in the defence aircraft industry may actually have increased rather than decreased, despite the apparent increase in cost, had the quality issues been considered in the calculation. While there are various methods of measurement that attempt to deal with these types of issues, such as hedonic measurement techniques (Coelli et al. 2005 pp. 140–141), Morrison (1992 p. 133) points out that conventional data-gathering techniques do not generally provide the appropriate level of detail to allow these issues to be adequately dealt with in practice.

In lieu of a non-monetary measurement of output, researchers and statistical agencies use a monetary value of output based on market value as a unit of measurement. The market value of output is used on the assumption that differences in output quality and specification will be automatically adjusted for in the price the market is willing to pay. Where comparisons over time are required, market values are deflated to an appropriate base year to reduce the possibility of changes occurring simply due to price movements (Business Council of Australia 1986 p. 13). The two financial output measures identified in the literature that are commonly used to calculate MFP, known as gross output and value added (OECD 2001a p. 24), are defined below.

1. Gross output (GO) is a measure of all output (goods or services) produced by an entity that is made available for external use. For a MFP calculation it is represented as:

Gross output based MFP = YGO / I(KLI)

2. Value added (VA) is the gross output less the cost of intermediate materials. For a MFP calculation it is represented as:

Value added based MFP = (YGO – I) / I(KL)


Where: YGO is gross output, I is intermediate inputs, I(KLI) is a combined index of labour, capital and intermediate inputs, and I(KL) is a combined index of capital and labour.

In both the GO and VA models, adjustments are made to account for changes in inventory levels over the period.
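The two definitions above can be sketched with hypothetical firm data. The combined index I(·) is taken here as a simple weighted geometric mean purely for illustration; statistical agencies use more elaborate index formulas:

```python
# A minimal sketch of gross-output (GO) and value-added (VA) based MFP,
# using hypothetical firm data. The combined input index I(.) is a simple
# weighted geometric mean, chosen for illustration only; the weights are
# assumed cost shares.

def combined_index(quantities, weights):
    """Weighted geometric mean of input quantities (illustrative only)."""
    index = 1.0
    for q, w in zip(quantities, weights):
        index *= q ** w
    return index

gross_output = 1000.0      # YGO
intermediates = 400.0      # I
capital, labour = 200.0, 50.0

# Gross-output based MFP: all of YGO over capital, labour and intermediates.
go_mfp = gross_output / combined_index(
    [capital, labour, intermediates], [0.3, 0.3, 0.4])

# Value-added based MFP: (YGO - I) over capital and labour only.
va_mfp = (gross_output - intermediates) / combined_index(
    [capital, labour], [0.5, 0.5])

print(round(go_mfp, 2), round(va_mfp, 2))
```

The two measures are not directly comparable in level terms; the choice between them depends on whether intermediate inputs are treated as part of the production process or netted out of output.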

Measures of input

As detailed in the literature, the inputs commonly considered in multifactor productivity measurement are labour, capital, energy, materials and services (Schreyer 2004). For the purposes of this research, only labour and capital inputs are considered, primarily due to the lack of consistent data available on the other inputs.

Labour

Commonly used measurements of labour include the number of employees, the number of hours worked, the number of full-time employees and the total wages and salaries expense (Coelli et al. 2005 p. 142). Of these, the OECD (2001a p. 40) recommends the number of hours worked as the preferred unit of measure, with the total number of employees being the least recommended measure, for two main reasons.

Firstly, the total number of employees figure does not capture the change in actual hours worked by both full‐time and part‐time employees and secondly, the total number of employees figure is generally reported in practice at the end of the reporting period and may not accurately represent the labour force for the entire period. For example, employees who are dismissed or resign during the period are not considered.

Whatever the unit of measure, significant issues that should be considered where possible are differences in skill level, education, qualifications, age and gender, all of which have been shown to affect productivity (Jablonski et al. 1998 pp. 34–35). The potential effect of these factors on changes in productivity is evident in a study undertaken by Denison in the United States, which showed that adjustments for changes in the quality of labour accounted for approximately 1% of growth in the calculated US national product per year (Solow 1963 p. 39).

Capital

Cobb and Douglas (1928 p. 140) defined fixed capital as “(1) machinery, tools, and equipment and (2) factory buildings”. Since then, the definition of capital has expanded to include any given type of asset, tangible or intangible, where there is a “flow of productive services from the cumulative stock of past investments” (OECD 2001a p. 52). Several other costs related to capital investment, such as costs associated with Research and Development (R&D) and lease payments for capital equipment, are also captured by the OECD in the definition of capital. R&D expenditure is considered an input into the production process as an alternative form of capital (Morrison 1993 p. 254), with several studies finding that changes in R&D spending significantly affect productivity growth and decline (Griliches 1988, Nadiri 1993, Cohen 1995, Hall et al. 2009). Similarly, lease payments for capital equipment not owned by a firm but “present at production sites and capable of being used in production” (OECD 2001b p. 31), commonly referred to as operating lease payments, also represent a direct investment in capital equipment used in the production process and should be considered in determining the total capital services utilised (Rogers and Tseng 2000).

Conceptually, identifying capital is relatively simple; in practice, however, the measurement of capital is problematic and complicated. The main issue associated with obtaining a measure of capital is that, unlike other inputs such as materials and labour, which are consumed or utilised during a specific period of production, capital assets are utilised over several periods (Coelli et al. 2005 pp. 144–145). In addition to spanning several periods over the life of an individual capital asset, the efficiency of the asset is likely to decline as it ages. For non-homogeneous assets the efficiency usage patterns will vary, and for a firm with more than one type of asset the aggregation of productive service flows becomes an issue, as it is extremely unlikely that all assets will be at the same point in their lifecycle. As a result, a simple aggregation of capital stock value is an inappropriate unit of measurement to use in a productivity calculation. To account for this difference in usage patterns, rather than using the asset’s value as a measure of capital, a measure of the flow of capital services produced by each asset type is commonly used. If more than one type of capital asset is used, the capital services of each asset type are aggregated to form a complete capital services total for the firm. The current method of aggregation used by the OECD and ABS was first developed by Jorgenson and Griliches (1967 pp. 254–255) and uses the relative share of the asset’s implicit rental value (also referred to as the user cost of capital) as a weight for the aggregation of capital services.
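The user-cost weighting described above can be sketched as follows. This is a simplified level aggregation with hypothetical figures; in practice the OECD and ABS aggregate the growth rates of service flows with Tornqvist-style user-cost share weights rather than the levels themselves:

```python
# A hedged sketch of user-cost weighted aggregation of capital services,
# in the spirit of Jorgenson and Griliches as described above. This is a
# simplified level aggregation with hypothetical figures; in practice,
# growth rates of service flows are aggregated with user-cost share weights.

def aggregate_capital_services(service_flows, rental_values):
    """Weight each asset type's service flow by its share of total
    implicit rental value (user cost)."""
    total_rental = sum(rental_values)
    shares = [r / total_rental for r in rental_values]
    return sum(s * flow for s, flow in zip(shares, service_flows))

# Three hypothetical asset types: machinery, buildings, leased equipment.
flows = [120.0, 80.0, 40.0]    # capital service flows per asset type
rentals = [60.0, 30.0, 10.0]   # implicit rental values (user costs)

print(round(aggregate_capital_services(flows, rentals), 2))  # 100.0
```

Assets with a higher user cost of capital contribute proportionally more to the aggregate, reflecting their greater share of the productive service flow.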

An aspect critical to the calculation of capital services that has not yet been considered is capital utilisation. The utilisation of capital equipment may vary over time independently of efficiency deterioration patterns. How to account for changes in capital utilisation has been explored since Cobb and Douglas (1928 p. 146) undertook their production research, and it is considered relevant in the context of an Australian defence industry productivity study. Since 1928, researchers have found that attempting to adjust for capital usage has remained not only problematic but virtually impossible in the practical application of a productivity analysis. Despite this, in their study of firm-level productivity in the Australian manufacturing sector based on 1683 manufacturing firms, Rogers and Tseng (2000) have shown that capital utilisation adjustment is possible, but only where the data are available. Using ABS Growth and Performance Survey data, Rogers and Tseng (2000) were able to make adjustments for capital utilisation using the reported equipment operation hours per week.

Methods of productivity measurement

The literature identifies several different methods of calculating productivity measures; three commonly used methods detailed by Diewert (1981 pp. 18–28) are listed below:

1. Index number approaches (e.g. Divisia, Malmquist, Tornqvist),

2. Econometric estimation of production and cost functions (e.g. the Stochastic Frontier Approach (SFA)), and

3. Non‐parametric methods of linear programming (commonly known as the Data Envelopment Analysis (DEA) approach).

While the above three methods can all be used to measure productivity, Norsworthy (1984 p. 322) showed that, due to the various assumptions and characteristics of each method, different measurement methodologies using the same data can produce significantly different results and conclusions. Considering this finding, Charnes et al. (1988 p. 1) and Bauer et al. (1998 p. 111) recommend that where policy is to be determined from the results of empirical studies, it may be prudent, where possible, to compare and analyse the results of several measurement methods. As the scope of this research only allows the use of one method of measurement, the advantages and disadvantages of using one method over another are detailed below.

Index number approach

Index numbers such as the Laspeyres, Paasche, Fisher and Tornqvist are commonly used to measure high-level changes in economic and financial variables, such as changes in the price of consumer goods (the Consumer Price Index) and the Australian Stock Exchange’s All Ordinaries Index (Coelli et al. 2005 p. 85). In relation to the measurement of productivity, Coelli et al. (2005 pp. 85–86) outline the two main ways in which index numbers are used: directly to measure productivity changes (e.g. the Hicks-Mooresteen index number) and indirectly in the aggregation of data when methods such as DEA and SFA are utilised. For productivity calculations, the Divisia, Malmquist and Hicks-Mooresteen indexes are commonly used, with the latter two generally using Tornqvist or Fisher output and input indexes in MFP index creation (Coelli et al. 2005).

Original approaches to theoretical measurements of productivity growth, such as those of Solow (1957) and Jorgenson and Griliches (1967), were derived using Divisia indexes (Diewert 1992 p. 211). The main issue with this approach, however, was that Divisia indexes rely on economic data being provided in continuous time series form, whereas in reality these data are only available in discrete form (Diewert 1981 p. 21). Despite this, Diewert (1981 p. 21) concedes that this is only a minor flaw, as there are ways to approximate the index using discrete data.

The idea to use a Malmquist index in productivity measurement was introduced by Caves, Christensen and Diewert (1982 p. 1394), who developed a method14 using Malmquist input and output distance functions. One desirable quality of the Malmquist productivity index over other productivity indexes is its ability to be decomposed into measures of technical change and technical efficiency change (Fare et al. 1992 pp. 292–296). Caves, Christensen and Diewert (1982) also show that under certain conditions a Malmquist index can be approximated using Tornqvist input and output quantity indexes.

The Hicks‐Mooresteen Index (Diewert 1992 p. 240) is regarded as a simple approach to measuring MFP in which measures of output and input growth are calculated using output and input quantity index numbers (Coelli et al. 2005 p. 66). Originally, Diewert (1992 p. 243) provided strong justification for the Fisher input and output indexes to be used but Tornqvist indexes have also been found to be suitable (Coelli et al. 2005 p. 120).
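A Tornqvist quantity index of the kind used in the MFP indexes above can be sketched as follows, using hypothetical two-input data; each item’s log quantity change is weighted by its average value share across the two periods:

```python
# A hedged sketch of a Tornqvist quantity index of the kind used to build
# MFP indexes. Data are hypothetical two-input (labour, capital) figures;
# the assumed output index is illustrative only.
import math

def tornqvist_index(q0, q1, v0, v1):
    """Tornqvist quantity index between two periods.

    q0, q1: quantities of each item in each period; v0, v1: corresponding
    values (price x quantity), used to form the average value shares that
    weight each item's log quantity change.
    """
    t0, t1 = sum(v0), sum(v1)
    log_change = 0.0
    for i in range(len(q0)):
        avg_share = 0.5 * (v0[i] / t0 + v1[i] / t1)
        log_change += avg_share * math.log(q1[i] / q0[i])
    return math.exp(log_change)

# Labour and capital quantities and costs in two periods (hypothetical).
input_index = tornqvist_index(q0=[100, 50], q1=[104, 55],
                              v0=[70, 30], v1=[72, 33])
output_index = 1.06  # assume measured output grew 6% over the period

print(round(output_index / input_index, 4))  # Tornqvist-style MFP index
```

An MFP index greater than one indicates that output grew faster than the combined input index over the period.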

Stochastic frontier approach

The SFA to estimating production functions was developed simultaneously by Aigner et al. (1977) and Meeusen and van den Broeck (1977). Prior to 1977, the practical implementation of the production frontier work pioneered by Farrell (1957) had been criticised by Aigner et al. (1977 p. 21) as resulting in average production functions being calculated rather than the maximum production function theoretically possible. This was partially due to the

14 Commonly known as the CCD method.

deterministic models commonly in use, which assumed that all deviations from the production frontier were due to inefficiency and made no compensation for measurement error or other statistical noise (Ondrich and Ruggiero 2001 p. 434). The SFA attempted to address this shortfall by introducing an error term into the production function estimation methodology that would distinguish productive inefficiency, controlled by the entity, from other random “noise” outside the entity’s control (Aigner et al. 1977 p. 25). As depicted in Figure 2.2 below, introducing a random error variable affects the resultant production frontier calculated using the SFA in that the frontier will not encompass all observations (indicated by the circles).

Figure 2.2 – The SFA Frontier (Single Input (X) and Output (Y))

[Diagram: estimated stochastic production frontier in input–output space, with some observed firms (circles) lying above the frontier due to random noise.]

This stochastic treatment of errors is considered by many to be the main advantage of the SFA over non-parametric approaches (Kuosmanen 2006 p. 2). However, this unique characteristic is also the area in which the SFA receives most criticism. As detailed in Kumbhakar et al. (2007 p. 2), unlike a non-parametric approach, the functional form of the production function chosen (i.e. Cobb-Douglas, translog, etc.) and the specific distributional assumptions made for the inefficiency and noise error terms have a significant impact on the resultant estimated production function. Theoretically, the literature shows that there are a number of plausible distributions that could be used, but in practice choosing the correct one has proven difficult (Olson et al. 1980 p. 67). In a Monte Carlo study, Olson et al. (1980) compared three estimators of an SFA model (the corrected least squares estimator, the two-step Newton-Raphson method and maximum likelihood) and found that overall the corrected least squares estimator was preferred. In any event, the SFA requires parametric form and distributional assumptions to be imposed, and this is considered a limitation of the method. However, in an attempt to negate the conventional SFA model’s distributional assumption, Kalirajan and Obwona (1994) propose an SFA method of measuring productivity in which this assumption is relaxed.

Although the distributional assumptions imposed on the errors have been found to affect efficiency scores produced using the SFA method, they do not affect the efficiency ranking of firms, since the SFA uses cost function residuals to rank the efficiency of firms; as a result, firms with lower costs for a given output will always be ranked as more efficient than firms with higher costs (Bauer 1998 p. 93).
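The frontier idea can be sketched with a deterministic corrected-OLS estimator, a simpler relative of the SFA estimators discussed above: fit an average log-linear production function by least squares, then shift the intercept up so the frontier envelops every observation. The data are hypothetical single-input observations, and unlike a full SFA no noise term is modelled:

```python
# A hedged sketch of a deterministic corrected-OLS production frontier,
# a simpler relative of the stochastic frontier estimators above. Fit an
# "average" log-linear function by ordinary least squares, then shift the
# intercept up by the largest residual so the frontier envelops all
# observations. Data are hypothetical; no noise term is modelled.
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]   # input
y = [1.8, 3.9, 5.2, 7.5, 8.8]   # output

lx = [math.log(v) for v in x]
ly = [math.log(v) for v in y]
n = len(x)

# Ordinary least squares for ln y = a + b ln x (the average function).
mx, my = sum(lx) / n, sum(ly) / n
b = sum((lx[i] - mx) * (ly[i] - my) for i in range(n)) / \
    sum((lx[i] - mx) ** 2 for i in range(n))
a = my - b * mx

# "Correct" the intercept: shift it up by the maximum residual so that all
# observations lie on or below the estimated frontier.
residuals = [ly[i] - (a + b * lx[i]) for i in range(n)]
a_frontier = a + max(residuals)

# Every deviation below the frontier is now read as inefficiency -- the
# very assumption the stochastic approach was designed to relax.
inefficiency = [a_frontier + b * lx[i] - ly[i] for i in range(n)]
print(all(e >= -1e-12 for e in inefficiency))  # True: frontier envelops data
```

The SFA replaces this deterministic correction with a composed error term, splitting the deviation into one-sided inefficiency and symmetric noise.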

Data envelopment analysis approach

The term DEA was first used by Charnes et al. (1978 p. 432) while developing a production function model for evaluating public programs. Being non-parametric, the DEA method is ultimately a measure of “relative efficiency” (Charnes et al. 1978 p. 430) against a “best practice” production function rather than the theoretical maximum production function. Figure 2.3 provides a basic diagrammatic representation of the DEA-derived production function based on a sample of firms with one input (X) and one output (Y). Essentially, DEA uses all input/output combinations of firms within the sample to calculate a production frontier that is feasible for these observations. In Figure 2.3 the observations represented by the circles are considered fully efficient and are operating on the production frontier. The observations below the production frontier, however, are considered inefficient and could theoretically improve output production by altering the mix of inputs.


Figure 2.3 – The DEA Frontier (Single Input (X) and Output (Y))

[Figure: production frontier with output (Y) on the vertical axis and input (X) on the horizontal axis]

Figure 2.3 also highlights why the DEA production function is effectively a “best practice” production function, as it assumes that some of the firms are operating at full efficiency. In reality, however, this may not be the situation as all firms could be operating below full efficiency, which would not be reflected in a DEA production frontier.
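The frontier construction described above can be sketched as a small linear program, assuming the input-oriented, constant-returns (CCR) formulation in the one-input/one-output setting of Figure 2.3. The firm data below are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical single-input (x), single-output (y) data for five firms,
# mirroring the one-input/one-output setting of Figure 2.3.
x = np.array([2.0, 4.0, 6.0, 3.0, 5.0])
y = np.array([1.0, 4.0, 6.0, 1.5, 3.0])
n = len(x)

def ccr_input_efficiency(o: int) -> float:
    """Input-oriented, constant-returns (CCR) efficiency of firm o: minimise
    theta such that some non-negative combination of the observed firms uses
    at most theta * x[o] input while producing at least y[o] output."""
    c = np.r_[1.0, np.zeros(n)]      # decision variables: [theta, lambda_1..n]
    a_input = np.r_[-x[o], x]        # sum_j lam_j * x_j - theta * x_o <= 0
    a_output = np.r_[0.0, -y]        # -sum_j lam_j * y_j <= -y_o
    res = linprog(c,
                  A_ub=np.vstack([a_input, a_output]),
                  b_ub=[0.0, -y[o]],
                  bounds=[(0, None)] * (n + 1))
    return float(res.x[0])

scores = [round(ccr_input_efficiency(o), 3) for o in range(n)]
print(scores)  # firms with a score of 1.0 lie on the frontier
```

Note that the best-performing firms receive a score of exactly 1.0 regardless of their absolute performance, which is precisely the “best practice” (rather than theoretical maximum) property discussed above.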

The DEA’s non-parametric frontier approach is considered both its greatest strength and its greatest weakness. The advantage of being a non-parametric method is that no particular functional form is imposed (Kuosmanen 2006 p. 2) and few assumptions are required about firm behaviour, such as cost minimisation or profit maximisation, as DEA allows for non-optimising behaviour (Cherchye and Post 2003 p. 411). According to Cherchye and Post (2003 p. 411), not having to make these assumptions is the main advantage of the DEA approach, considering that economic theory in general tends to lack strong hypotheses about production technology and underlying assumptions. Worthington (2000 p. 211) also highlights that, given its non-parametric basis, the measurement of inputs and outputs for the formulation of the production function can also vary considerably from the traditional measures used. These advantages of the non-parametric approach do come at a cost.

Inherent in the DEA approach is that all deviations from the frontier are labelled as inefficiencies, with any stochastic errors that may exist, such as errors in the data, being completely ignored (Kuosmanen 2006 p. 2); this may result in an under- or overstatement of the level of reported inefficiency (Worthington 2000 p. 221). In addition, if the sample size used is small, firms may be incorrectly classified as fully efficient and there is a possibility that the true production frontier is significantly underestimated (Cherchye and Post 2003 p. 419). Because DEA compares firms against linear combinations producing the same or more of every output (given inputs) or using the same or less of every input (given outputs), small sample sizes also increase the risk of self-identifiers and near-self-identifiers when combined with a relatively small number of inputs, outputs and other constraints (Bauer 1998 p. 92). In an attempt to mitigate the effects of small sample sizes, Shestalova (2003) and Lim and Knox Lovell (2009) used a sequential DEA method in which previous-period samples are included in current-period calculations, based on the assumption that in each period all preceding technologies remain feasible (Shestalova 2003 p. 212). As a result, by definition technical regress cannot occur, with decreases in firm productivity being attributed to deteriorations in performance (Shestalova 2003 p. 212). While Shestalova (2003) considered this appropriate for certain industries, such as manufacturing, Shestalova did concede that for other industries, such as mining, technical regress was possible and a sequential DEA approach would therefore not be suitable. Another benefit Shestalova (2003) found in using sequential DEA was that the results are less sensitive to implausible observations. The effect of implausible (or outlier) observations can also be reduced using an outlier-detection method whereby these observations are removed entirely (Burgess and Wilson 1995 p. 348).

As DEA is a non-stochastic approach, Cherchye and Post (2003 pp. 418–419) highlight the general inability to use its efficiency results for statistical purposes, with Charnes et al. (1985 p. 97) going so far as to recommend that DEA is not appropriate for studies based on sample distribution knowledge. Cherchye and Post (2003) do, however, note that there are procedures available, such as bootstrap techniques, which can be used to estimate and consequently correct small sample biases and construct confidence intervals. Banker (1993 p. 1266) found that the statistical properties of the DEA method could be improved if suitable structures were imposed, while Henderson and Simar (2005) developed a non-parametric stochastic model based on kernel regression estimation. Kuosmanen (2006) proposed a stochastic non-parametric envelopment of data model that attempted to combine the virtues of both DEA and SFA in a unified framework. However, despite these attempts to combine properties of parametric and non-parametric approaches, Kuosmanen (2006 p. 4) acknowledged that the conceptual link is still missing and as such these approaches are still not fully accepted.


Comparison of methods

All three methods of productivity measurement have advantages and disadvantages in their application. In relation to the index number approach, the availability of accurate price data will determine its suitability, as this approach relies on both quantity and price data (Coelli et al. 2005 p. 313). Van Biesebroeck (2007 p. 560) found that index number methods produce consistently accurate estimates of productivity growth where there is a low level of expected error in the underlying data, regardless of the actual index number used. While an index number approach is commonly used to measure productivity growth, it imposes some strict assumptions: that competitive behaviour exists (Diewert 1981 p. 26) and that all firms are fully efficient (Coelli et al. 2005 p. 6). Both of these assumptions are unlikely to apply to the majority of Australian defence industry firms, noting the defence market characteristics discussed in Chapter 1. The index number approach in its basic form also does not allow for the efficiency measurement of a firm relative to a “norm” or the ability to determine if the selection and combination of inputs represent the least-cost combination (Kendrick 1965 p. 121). However, as both these measures are outside the scope of this research, the inability to determine them is not a major factor in selecting an appropriate method.

Of the other productivity measurement approaches discussed, both DEA and SFA assume that not all firms are fully efficient, with various DEA models developed to account for different scales of operations (Seiford and Thrall 1990). The main advantage of the non-parametric DEA approach over the parametric SFA is that DEA does not require the functional form of the production function or a distributional form for the inefficiency term to be specified. The main disadvantage of DEA relative to SFA is that it does not take into account random errors or “noise”. Regardless of which method is used, Perelman (1995) found that the DEA and SFA approaches produce similar productivity growth figures, although there tend to be significant differences between the two productivity decomposition results.

The main disadvantage in using either a DEA or SFA approach relates to the fact that the firms chosen for the Australian defence industry study are not homogeneous in nature and vary significantly in their relative scale of operations. In addition, as the focus of the study is related to the overall apportionment of the benefits derived from, and the productivity growth generated by, the Australian defence industry, determining the relative efficiency between firms is also not as critical.


Chapter summary

This chapter provided a detailed analysis of the common measures and methods of measurement used in productivity research. The use of these measures will enable the calculation of productivity changes in the Australian defence industry but will not provide an indication of how the benefits associated with productivity growth are distributed. The following chapter investigates several methods taken from the literature to determine how productivity benefits are distributed and in what proportion.


Chapter 3: Literature review – productivity distribution

This chapter explores various measurement techniques to determine the distribution of Australian defence industry productivity benefits. The chapter is divided into three main sections, examining each of the main stakeholders that benefit from defence industry productivity growth: defence industry firms, defence industry firm employees and the Department of Defence.

Productivity benefits distribution

Once the level of productivity growth generated by the defence industry has been identified, the distribution of the resulting benefits amongst the main stakeholders can be addressed. As detailed in Chapter 1, the main stakeholders considered in this study are defence industry firms, defence industry employees, and the Department of Defence (including the war fighter, being the Royal Australian Navy, Army and Air Force).

The literature appears to contain little information relating to the internal or external apportionment of productivity benefits at the firm or industry level, with the literature identified dealing primarily with how productivity performance is affected by changes in the operating environment of an industry or firm, rather than how changes in productivity influence the distribution of actual productivity benefits. Using firm profit as an indirect measure of productivity, Caves (1974) attempted to provide evidence of the “spill over” benefits that Foreign Direct Investment (FDI) in several firms had on the Australian manufacturing industry as a whole. Globerman (1979) undertook similar research in the Canadian manufacturing industry but used a direct measure of labour productivity to determine the existence of any improvements resulting from changes in the level of FDI experienced within Canadian manufacturing firms. Although the focus of the research by Caves (1974) and Globerman (1979) was related to the effect on productivity performance of changes in external factors, their approach may provide some guidance for research being undertaken on productivity in the Australian defence industry. Rather than determining the effect of external influences on the productivity performance of an industry, it may be possible to determine how productivity benefits are distributed by identifying how changes in productivity performance affect certain measures of productivity benefits received by the stakeholders.


Based on the Productivity Commission’s (2009) assessment of the way in which productivity benefits are thought to be distributed, some direct measures that may be considered relevant in Australian defence industry productivity distribution research relate to profit, employee wages, product price, and government taxes. Although not explicitly stated in the Productivity Commission’s assessment, another form in which productivity benefits could be distributed to customers is via an increase in the quality and specification of goods and services received. In addition to a direct measure comparison approach to determine the distribution of productivity benefits, a measure of defence market competition may also provide an indirect indication of the potential magnitude of productivity benefit distribution to each recipient.

Defence industry firms

The Australian Productivity Commission (2009) identified profits as the form in which productivity benefits are likely to be distributed to firms. Measuring changes in the profitability of a firm relative to changes in the productivity it generates should provide an indication of the extent to which productivity benefits are distributed to the firm. For a publicly listed firm, this comparison may be relatively easy to make, as profit data are publicly available and are generally subjected to a high level of scrutiny by shareholders who want to maximise their share value. For a firm wholly owned and controlled by a foreign multinational, the task of calculating profit levels may prove more problematic. Depending on the motivation of the parent company, there is a possibility that firm costs and revenues are manipulated to make the parent company appear more profitable at the expense of the subsidiary. Similarly, as there are no shareholders other than the parent company, the subsidiary may attempt to manipulate its reported profit to reflect a level it perceives the Government would consider appropriate for a defence industry firm.

Profitability of defence industry firms

Due to the oligopolistic and, more often, monopolistic market in which Australian defence firms operate (see Chapter 1), it is plausible to anticipate that levels of profitability realised should be higher than those experienced by non-defence related manufacturing firms. In an analysis of 9,300 industry segments from 1983–89, Lichtenberg (1992 p. 751) found evidence to suggest that defence contractors are “substantially more profitable than other segments”, with profit rates measured by return on assets being up to three times higher than firms operating in non-government segments. These findings have been supported by McGowan and Vendrzyk (2002 p. 949), who also found that defence contractors experienced “abnormally high profitability on their government work”. Conversely, based on a sample of 36 American defence contractors, Bohi (1973 p. 728) found no evidence to support the notion that defence work is any more or less profitable than non-defence work undertaken by the same contractor; however, Bohi (1973) did stipulate that the study did not attempt to determine whether profits earned by defence firms overall were abnormally high.

There are several reasons identified in the literature as to why defence firms are thought to experience comparatively higher levels of profitability than non‐defence firms. One reason is that defence firms are expected to earn high profits due to the high bargaining power they exercise during contract negotiations as a result of the non‐competitive environment in which they operate (Agapos and Gallaway 1970 p. 1094). Agapos and Gallaway (1970 p. 1095) also highlight the possibility that during the negotiation process contractors may artificially inflate their proposal costs in anticipation that the “military service will ruthlessly bargain for a familiar percentage adjustment downwards”.

Another common justification for the defence industry earning higher profits is the idea that defence business involves a higher level of risk than non-defence business. In a study of large United States defence contractors in the 1950s and 1960s, Stigler and Friedland (1971) found evidence indicating that defence business did carry a higher level of risk than non-defence business. This level of risk is not necessarily identical across the industry as a whole and may vary by company or even by project. Burns (1972 p. 18) found that a firm with a higher portion of its own working and fixed capital allocated to a project would experience a higher risk level than a firm with little or none of its own capital being used. The idea of defence business being riskier than non-defence business is not supported by all researchers. Suarez (1976 p. 400) states that there is no risk involved in operating in the defence market, as significant amounts of working capital, commonly in the form of upfront payments and plant and equipment,15 are provided to the companies by the Defence Department. In addition, Suarez (1976 p. 400) suggests that any unanticipated costs experienced by the firm can simply be transferred to the Defence Department rather than being absorbed by the firm itself. Suarez (1976) also stated that one way in which defence firms maintained existing profit levels was by simply being awarded new defence contracts

15 Plant and equipment provided to Australian defence contractors by the Australian Department of Defence is commonly referred to as Government Furnished Equipment.

despite performing poorly in terms of expected cost and schedule on their existing defence contracts.

In relation to the Australian defence industry, ACIL Tasman (2004 p. XV) found that overall defence industry firm profitability was similar to that of the general manufacturing and services sectors, of which defence-oriented firms are considered to be a part. While ACIL Tasman did not comment on whether these profits were considered excessive, they did state that the profits were sufficiently large to encourage continuing investment by defence firms to meet future Department of Defence requirements. It should be noted, however, that ACIL Tasman (2004 p. 12) stipulated that, in relation to the data used, “a large proportion of the larger, overseas owned prime contractors declined to answer questions about their profitability”, which may have resulted in the profitability data being biased toward locally owned small to medium enterprises. As eight of the ten companies analysed in this research are likely to be the “large overseas owned prime contractors” to which ACIL Tasman (2004) referred, any comparisons between the ACIL Tasman study and this research should be made cautiously.

Measuring profitability

Measures of profit are generally expressed in terms of either accounting or economic profit (Gow and Kells 1998 pp. 18–20). Accounting profit is based on explicit financial factors and is generally reported by companies in their Financial Statements as the difference between revenue and expenses over a period, while economic profit is based not only on explicit financial factors but also on implicit ones, such as the opportunity cost to the firm of investment. As a result, economic profit is also considered a measure of monopolistic or oligopolistic profit (i.e. excess profit) because, in theory, in a competitive market long-term economic profit would be zero: if economic profit were positive there would be an incentive for firms to enter the market, while if it were negative firms would leave.

In determining accounting profit, certain expenses, such as expenditure on plant and equipment, may be capitalised and then expensed in the form of depreciation over a certain time period. In contrast, a calculation of economic profit would recognise these expenses at the time they occur. Economic profit also includes other costs not considered in the accounting calculation, such as the opportunity cost of holding capital. As a result, accounting profit is effectively equal to economic profit plus the opportunity cost of a firm’s investment (Gow and Kells 1998 pp. 18–20). This definition therefore allows a firm with a positive accounting profit also to experience a negative economic profit during the same period.
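The relationship can be illustrated with hypothetical figures, showing a firm whose accounting profit is positive while its economic profit is negative once the opportunity cost of capital is deducted. All numbers below are invented.

```python
# All figures below are hypothetical.
revenue = 10_000_000.0
explicit_expenses = 9_200_000.0     # wages, materials, depreciation, etc.
invested_capital = 8_000_000.0
required_return = 0.12              # opportunity cost rate of invested capital

# Accounting profit: revenue less explicit expenses.
accounting_profit = revenue - explicit_expenses

# Economic profit: accounting profit less the implicit opportunity cost of
# the capital the firm has tied up.
opportunity_cost = invested_capital * required_return
economic_profit = accounting_profit - opportunity_cost

print(round(accounting_profit, 2))  # 800000.0  (positive accounting profit)
print(round(economic_profit, 2))    # -160000.0 (negative economic profit)
```

Here the firm reports an accounting profit of $800,000 yet earns less than its capital could have returned elsewhere, so its economic profit is negative.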

Any measure of profitability undertaken is only as reliable as the data presented in the firm’s Financial Statements. These data essentially represent the profit information that a firm is required to disclose in compliance with the national accounting standards, which may not necessarily reflect its true profit potential. Feeny and Rogers (1998 p. 9) raise the idea that firms may choose to recognise profits in other ways. For example, a firm may choose to increase spending on items such as employee bonuses or non-essential equipment purchases (e.g. new office fit-outs) during periods of high profitability.

The adoption of the Australian equivalent to the International Financial Reporting Standards (AIFRS) in 2005 may also have an effect on the reported profits of Australian defence industry firms, simply due to a change in the Financial Statement reporting requirements. Initial studies on the effect on profits of the introduction of these new standards have been mixed. A survey conducted by Liverpool University found that the average overall impact on United Kingdom firms was to inflate post‐tax profits by 39% (Garvey 2007), while Hellman (2011 p. 61) found that among a sample of 132 large Swedish listed companies the adoption of the new standards caused “material increases of both net profit and balance sheet numbers”, whilst clarifying that these increases were more likely caused by the adoption of standards not already used in Sweden. The opposite effect on profits was found by Perramon and Amat (2006 p. 17), who calculated a mean value change of ‐3.9% in the period post adoption of the new accounting standards, based on a sample of 28 non‐financial firms in Spain. In relation to Australian firms, the expectation was that the transition to AIFRS would increase profit by 6% in the first comparative year (Pawsey 2008 p. 4) but no practical study was found in the literature to confirm this expectation.

In relation to the firms comprising the defence industry, as eight are wholly owned subsidiaries where the parent company is also the only shareholder, profits may also be passed from the subsidiary to the parent via transfer pricing and consequently may not be captured in a profit analysis. Transfer pricing occurs where the subsidiary purchases services and/or products from the parent at inflated prices, which the subsidiary then passes on to the end user with little or no profit margin added. The actual extent to which this occurs within the Australian defence industry is currently unknown but should be noted in considering the results presented in Chapter 7.

Defence industry employees

While profit was identified as the main way in which productivity benefits are distributed to firms, the primary form in which productivity benefits are distributed to employees is via increased wages (Productivity Commission 2009). Measuring changes in employee wages relative to changes in defence industry generated productivity may provide an indication of the distribution of productivity benefits being obtained by defence industry employees.

Wages and productivity

Typically, the literature identified relating to workers’ salaries and firm productivity focuses on either determining the link between performance-based pay and firm productivity or the existence of wage discrepancies between male and female workers relative to the productivity generated by each sex. Although neither of these lines of enquiry is specifically concerned with the apportionment of benefits derived from productivity growth, the research on performance-based pay does provide some related concepts that can be used in the study of the distribution of productivity benefits.

Improving employee motivation and morale has long been considered one of the major avenues for improving productivity within a firm (Doucouliagos and Laroche 2003 p. 653), with one of the main incentives used to motivate employees being wage related (Cadsby et al. 2007). In a study of different types of employee compensation schemes, Cadsby et al. (2007 p. 387) found significantly higher productivity in firms that engaged in performance-based pay rather than in firms with standard fixed-salary arrangements. Cadsby et al. (2007) determined that the main reason for the higher levels of productivity in firms with performance-based pay was that such arrangements provide greater incentives for motivated employees than fixed-salary schemes. Booth and Frank (1999 p. 447) went so far as to suggest that firms with performance-based pay schemes actually attract inherently more productive workers, which naturally results in higher levels of productivity growth for the firm.

Thus, it would be reasonable to postulate that, in the absence of all influences other than those caused by changes in productivity, industries that experience relatively higher levels of productivity growth should also experience relatively higher levels of growth in employee wages. The exact extent to which wages change in the defence industry relative to changes in productivity, compared to similar industries such as manufacturing, may give some indication of whether the benefits of productivity gains are being apportioned to firms’ employees.
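One rough way to operationalise this comparison is to relate year-on-year wage growth to year-on-year productivity growth. The sketch below uses invented index numbers purely for illustration; it is not a result of this research.

```python
# Hypothetical annual index numbers (base year = 1.00) for industry
# productivity and employee wages.
productivity = [1.00, 1.03, 1.07, 1.10]
wages = [1.00, 1.02, 1.05, 1.09]

def growth(series):
    """Year-on-year growth rates of an index series."""
    return [series[i] / series[i - 1] - 1 for i in range(1, len(series))]

prod_growth = growth(productivity)
wage_growth = growth(wages)

# Ratio of wage growth to productivity growth in each year: a crude
# indicator of the share of productivity gains flowing to employees.
ratios = [round(w / p, 2) for w, p in zip(wage_growth, prod_growth)]
print(ratios)  # [0.67, 0.76, 1.36]
```

A ratio below one in a given year suggests wages grew more slowly than productivity (benefits retained elsewhere), while a ratio above one suggests wages outpaced productivity in that year.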

The Department of Defence

The three main areas in which the Department of Defence can benefit from defence industry-generated productivity are product cost, schedule and overall quality. While all three areas are considered important, based on the objectives of the Australian Government’s recently imposed Strategic Reform Program (Defence 2009b), the Department of Defence’s priorities are more likely to be achieving real cost savings while maintaining appropriate schedule and quality performance levels. Two measures that may provide insight into the relationship between changes in defence industry productivity and changes in the cost, schedule and quality of services received by the Department of Defence are a direct measure of industry performance and an indirect measure of the level of competition within the defence industry market.

Firm performance16

There are numerous measures that have been used by researchers to assess the performance of a firm or industry. As discussed previously in this chapter, the most commonly used measures of a firm’s performance, such as return on assets, return on equity and return on sales, are based on accounting variables and are generally associated with measures of profit performance (Gow and Kells 1998). In addition to profit measures of performance, some researchers have attempted to use other measures as indicators of firm performance. Pandya and Rao (1998 pp. 77–78) used the level of product diversification offered by firms as a measure of performance, while Tsai and Wang (2005 p. 92) used value added as a measure of firm performance. While these measures of firm performance differ in their approach, they all provide a measure based on the firm’s perspective and not that of its customers. For example, a performance measure based on sales or profit may show a firm to be a high performer, but if the firm is a monopolist then the possibility exists for the firm’s performance to increase, based on a sales or profit measure, at the expense of the quality of customer service, as the customers generally have no option but to use the firm’s services or products. Measuring changes in the quality of services being provided to the consumer by the firm is a measure of performance that may provide an indication of productivity benefits other than

16 For the purpose of this Thesis the term “performance” relates to the quality of services and products being provided by defence industry firms to the Department of Defence.

those in the form of lower product costs being distributed to the consumer. In the context of this thesis, the other benefits include improvements in the quality of products and services being provided to the Department of Defence.

In an attempt to obtain “better value for money for Australia”, in 2001 the DMO introduced a performance reporting program known as the Company ScoreCard program (DMO 2005 p. 2), which provided a mechanism to internally assess the performance of defence industry contractors as well as the ability to provide performance feedback to defence industry firms to assist them in improving their services and products. Every six months, commencing in April and October17, the DMO rates the performance of prime contractors and significant subcontractors for contracts where either (DMO 2010b p. 3.8.1):

1. The value of a DMO capital acquisition contract (including contractual costs and options) exceeds $10 million;

2. The value of a DMO in-service support contract or standing offer (including any extensions provided for in the initial contract) is $5 million or greater in a single contract or cumulative over the life of the contract; or

3. A DMO contract is considered to be operationally sensitive, militarily significant or may lead to subsequent contracts.

Reported contracts are given a performance score based on the individual assessment rating allocated to each of the performance parameters by DMO project staff. A complete list of assessment ratings and performance parameters is provided in Annex A, Table A.1 – Company ScoreCard Performance Parameters and Table A.2 – Company ScoreCard Assessment Ratings. Each assessment rating descriptor and performance parameter is allocated a numeric score, and these scores are then used to produce an overall company scorecard score. This calculation is explained in detail in Chapter 4.
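As an illustration only, a weighted scorecard calculation of the general kind described here might look as follows. The parameter names, ratings and weights below are entirely hypothetical; the actual DMO parameters and scoring rules are those given in Annex A and Chapter 4.

```python
# Hypothetical performance parameters, ratings and weights; the actual
# DMO parameters and scoring rules are given in Annex A and Chapter 4.
ratings = {          # parameter -> numeric assessment rating (assumed 0-10)
    "cost": 7,
    "schedule": 5,
    "quality": 8,
}
weights = {          # parameter -> relative weight (assumed to sum to 1)
    "cost": 0.4,
    "schedule": 0.3,
    "quality": 0.3,
}

# Overall score as the weighted sum of the parameter ratings.
overall_score = sum(ratings[p] * weights[p] for p in ratings)
print(round(overall_score, 2))  # 6.7
```

The point of the sketch is simply that qualitative assessment descriptors, once mapped to numeric ratings and weights, can be collapsed into a single comparable score per contract.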

Comparing changes in performance over the period against movements in productivity may provide an indication of the possible distribution of defence industry generated productivity benefits. If productivity benefits are distributed evenly, it would be expected that an increase in productivity would result in an improvement in the contract performance of the firm. Based

17 Six-monthly reporting commencing April and October only started from round 5 (Apr 2002) onwards. Prior to round 5 the dates for each round were as follows: round 1 (Jan–Dec 99), round 2 (Jan–Dec 00), round 3 (Jan–Jun 01), round 4 (Jul 01–Mar 02).

on the same rationale, a decrease in productivity would likely result in a decrease in the level of contract performance.

Competition within the Australian defence industry

The level of competition in a market is considered a significant factor directly affecting productivity, with perfect competition being a neoclassical assumption in the measurement of productivity (Balk 2007 p. 3). In general, the theory is that increased competition within an industry results in more efficient practices (Baily and Solow 2001 p. 159) and higher rates of productivity growth (Nickell 1996).

While it is generally accepted that the relationship between competition and productivity is positive (Rogers 2003), there are some dissenting views. Under a Schumpeterian framework, increased competition is thought to lower monopolistic profit, which is considered necessary as an incentive for investment in resources and innovation (Rogers 1999 p. 102), even though there is relatively little empirical evidence to support this view (Productivity Commission 1999 p. 57). Despite this apparent lack of empirical evidence, as the Australian defence industry appears to rely on technologies developed overseas, the relevance of the Schumpeterian view to this research is likely to be low (Productivity Commission 1999 pp. 56–57). As such, the Australian Government has determined that increasing competition within the defence industry is vital for enhancing productivity and developing the industry as a whole (Reith 2001) and, in principle, should result in “Defence being supplied with high quality goods and services at the best possible price” (Victorian Government 2006 p. 6).

The perceived necessity for governments to actively participate in enhancing competition within defence industries has been the subject of several studies. Agapos (1971) proposed that United States Government initiatives actually hampered competition. Thomson (2006 p. 36) suggested that, due to the unique characteristics of a defence market, a “well managed monopoly” may be more appropriate than full competition. Arena and Birkler (2009 p. 7) provided arguments for and against increased competition within the defence sector, while Camm (1993) supported increased competition with caution. This cautious approach has also been proposed within Australia, based on the rationale that it may be inefficient to promote competition within certain sectors of the Australian defence industry (Victorian Government 2006 p. 7). Despite these potential inefficiencies, the Australian Government has favoured a more traditional approach and has openly promoted competition in all sectors of the Australian defence industry through programs such as the Australian Industry Capability Program (Clare 2011) and via statements of intent detailed in various government publications, such as the 1992 Defence Policy and Industry Report and the various Defence White Papers released over the past decade. While these government initiatives and publications are not solely concerned with increasing competition in the Australian defence industry, the level of government involvement in defence industry competition policy is evident in the 2001 Defence White Paper, where a new Department of Defence policy for industry was detailed, a major component of which was directly related to “changing Defence’s competition policy” (Australian National Audit Office 2003 p. 27).

The success of government competition policy has often been questioned. In 2006 the United States Government Accountability Office found that, despite the existence of the Competition in Contracting Act 1984, only 41% of Defence Department contracts in the 2005 fiscal year were awarded using full and open competition (Walker 2006 p. 6). In addition, it found that approximately two thirds of the US Army spend on security guard contractors was via sole‐sourced contracts, despite knowledge that these contracts were 25% more expensive than those it had previously awarded competitively. This trend appears to be similar in Australia, where it was reported in 2006 that less than half18 of Australian Department of Defence contracts were allocated using a competitive process (Victorian Government 2006 p. 6).

Researchers have used various techniques to determine a measure of competition in an industry and the effect that competition has on productivity. Nickell (1996), in a study of companies in the United Kingdom, concluded that higher rates of productivity growth were linked to high levels of competition, based on measures of the number of competitors in the market and the levels of market rents. Barros (2004 p. 2) notes that market share has also been used as an indication of a firm’s competitiveness. Despite an apparent lack of “consensus on how to measure competition” in a defence industry environment, a report by the United States Government Accountability Office used the method of defence procurement as a measure of the competition that existed in the United States defence industry (Chan 1997 pp. 16–17).

To determine the level of competition likely to exist, the various methods of procurement used by the United States Department of Defence were categorised as either “full and open competition” or “other than full and open competition”, based on the inherent level of competition induced by the use of each procurement method. For example, a sole‐sourced method of procurement was considered non‐competitive, as only one supplier is approached, while an open tender procurement method was considered competitive, as any supplier satisfying the requirements was theoretically able to participate in the procurement process. Acknowledging the nature of the defence market and the real possibility that an open tender process may receive only one response, the Government Accountability Office did consider another indicator of competition, which was the number of offers received in response to solicitations in each procurement category. However, this measure of competition was not pursued, as the database used for the analysis did not capture this information (Chan 1997 p. 17).

18 This statement was also confirmed during the course of this research and is detailed in Chapter 7.

In relation to the Australian defence industry, changes in the level of market competitiveness may provide an indication of the likelihood and possible extent to which defence industry‐generated productivity benefits are distributed to the Department of Defence. An increase in competition is expected to result in an increased flow of benefits distributed to the Department of Defence, whereas a decrease in competition is likely to result in a higher level of productivity benefits being retained by the firm and/or its employees.

Chapter summary
This chapter provides information on measures that can be used to determine the distribution of productivity benefits generated by the defence industry. To determine the benefits distributed to defence industry firms, a measure of profit is considered, while wages received by defence industry employees are used to measure the distribution of productivity benefits obtained by employees. A measure of the quality of services and products provided by the defence industry is proposed to determine the level of productivity benefits distributed to the Department of Defence. Finally, a measure of the level of competition within the defence industry market may provide an indication of the likely extent to which these benefits are apportioned among stakeholders. The following chapter outlines in detail the methodology used to calculate each of the abovementioned benefits.


Chapter 4: Methodology
This chapter details the methodology employed to achieve the two main objectives of this research as stated in Chapter 1. The first part of the chapter details the process used to calculate a multifactor and partial measure of productivity at the industry level. The second part of the chapter deals with the productivity apportionment aspects of the research and lists the procedures followed to calculate defence industry measures of profit, employee wage growth, performance and the level of competition within the Australian defence market.

Productivity measurement
A labour‐capital multifactor productivity (MFP) measure and a separate partial labour productivity (LP) measure were calculated for each Australian defence industry firm, with the individual firm results aggregated to form defence industry MFP and LP indexes. A labour‐capital MFP measure of productivity was used in preference to a more complete KLEMS MFP model primarily due to data availability, as only two inputs were consistently identifiable during the period. While only a partial factor productivity measure, an LP measure was also calculated to enable a more complete comparative analysis between the Australian defence industry and the overall Australian manufacturing industry, of which the defence industry is essentially a sub‐sector.

As detailed in Chapter 2, any of the three methods of productivity measurement could be used to produce a productivity measure for the Australian defence industry. The previous studies identified as specifically focusing on defence industry productivity performance (Barros 2001, 2004, 2005) used the DEA method twice and the SFA method once. It should be noted, however, that the focus of Barros’s studies was the Portuguese defence sector, which has little similarity to the Australian defence industry, which is substantially more complex and varied in its structure19. Ultimately, as the data available was considered reasonable and as the focus of the study was not concerned with the relative efficiencies between Australian defence industry firms, an index number approach was used based on the Tornqvist MFP index. When using an index method to measure productivity, it is important to remember that “productivity indexes are not precision tools, but they do indicate general orders of magnitudes of change, given the theoretical and conceptual framework within which they have been constructed and in terms of which they must be interpreted” (Kendrick 1973b Ch2 p. 16).

19 The Portuguese defence industry is relatively narrow, with a tendency for the major suppliers to specialise in one or two products (Barros 2001 pp. 2‐5), whereas the large Australian defence suppliers used in this research tend to have a number of different product lines.

To create the defence industry MFP and LP indexes, the individual results from the ten defence industry firms were combined using a weighted average of the annual productivity changes experienced by each firm (OECD 2001a pp. 142–144). The annual value added generated by each firm was used as its weight, which is consistent with the value added measure of output used in the productivity calculation, as detailed below.
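The value-added weighting described above can be sketched as follows. This is an illustrative sketch only, not the author's actual calculation; the firm names and figures are invented.

```python
# Illustrative sketch only (not the thesis's actual calculation): aggregate
# firm-level annual productivity growth into an industry index, weighting
# each firm by its annual value added. Firm names and figures are invented.

def industry_index(firm_growth, firm_va, base=100.0):
    """firm_growth: {firm: [annual growth rates]};
    firm_va: {firm: [annual value added]}, used as weights.
    Returns the industry index level for each year (base year = base)."""
    years = len(next(iter(firm_growth.values())))
    index = [base]
    for t in range(years):
        total_va = sum(va[t] for va in firm_va.values())
        # Value-added-weighted average of the firms' growth rates.
        weighted = sum(firm_growth[f][t] * firm_va[f][t] / total_va
                       for f in firm_growth)
        index.append(index[-1] * (1.0 + weighted))
    return index

# Two hypothetical firms over two years:
growth = {"FirmA": [0.02, 0.01], "FirmB": [-0.01, 0.03]}
va = {"FirmA": [60.0, 65.0], "FirmB": [40.0, 35.0]}
idx = industry_index(growth, va)
print([round(v, 2) for v in idx])
```

Under this scheme a large firm's productivity change moves the industry index more than the same change at a small firm, mirroring its larger share of industry value added.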

Productivity index formula
The productivity measures used were based on a value added index approach using the following Tornqvist TFP index in its logarithmic form:

\ln \text{TFP Index}_{st} = \ln\left(\text{Output Index}_{st} / \text{Input Index}_{st}\right)

= \ln \text{Output Index}_{st} - \ln \text{Input Index}_{st}

= \tfrac{1}{2}\sum_{i=1}^{M}(r_{is}+r_{it})(\ln q_{it}-\ln q_{is}) - \tfrac{1}{2}\sum_{j=1}^{K}(s_{js}+s_{jt})(\ln x_{jt}-\ln x_{js})

Where: q = output quantity, x = input quantity, r = revenue share of output, s = cost share of input; subscripts s and t denote the two time periods, i the i‐th output commodity and j the j‐th input commodity.

(source: Coelli et al. 2005 p. 119)
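As an illustrative sketch (not the thesis's actual computation), the two-period Tornqvist calculation above can be implemented as follows; all quantities and shares are invented for the example.

```python
import math

# Illustrative sketch of the two-period Tornqvist TFP index in log form,
# following the formula above. Quantities and shares are invented; r and s
# are the revenue and cost shares in periods s and t.

def tornqvist_ln_tfp(q_s, q_t, r_s, r_t, x_s, x_t, s_s, s_t):
    """q: outputs, x: inputs (lists over commodities); r/s: revenue and
    cost shares in the two periods. Returns ln(TFP index)."""
    ln_output = sum(0.5 * (r_s[i] + r_t[i])
                    * (math.log(q_t[i]) - math.log(q_s[i]))
                    for i in range(len(q_s)))
    ln_input = sum(0.5 * (s_s[j] + s_t[j])
                   * (math.log(x_t[j]) - math.log(x_s[j]))
                   for j in range(len(x_s)))
    return ln_output - ln_input

# One output (value added) up 5%; labour flat, capital up 2%, with equal
# cost shares in both periods:
ln_tfp = tornqvist_ln_tfp([100.0], [105.0], [1.0], [1.0],
                          [50.0, 50.0], [50.0, 51.0],
                          [0.5, 0.5], [0.5, 0.5])
print(round(math.exp(ln_tfp), 4))
```

Because the shares are averaged over the two periods, the Tornqvist index accommodates changing input and output mixes without requiring an explicit production function.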

The methods used to calculate each of the productivity index formula’s variables are detailed below.

Output
For the purpose of an Australian defence industry productivity analysis, a Value Added (VA) measure of output was used, where VA is equal to gross output (Y_GO) less intermediate inputs (I).

In addition to the financial VA output measure, an experimental measure of productivity was also calculated using a measure of output based on the perceived value of supply obtained by the Department of Defence. This experimental measure of output was based on the annual performance score allocated to each defence industry firm via the Defence Materiel Organisation’s (DMO) Defence Company ScoreCard performance rating program, which is explained in detail in the latter half of this chapter. As a fundamentally qualitative measure, the use of the DMO‐allocated performance score as a measure of output has theoretical and practical issues, some of which have already been discussed in Chapter 2. Despite these known issues, the purpose of undertaking this additional experimental measure of productivity using firm performance ratings for output is to provide an insight into the use of an alternative measure of output that is not reliant upon financial data supplied by the firm; rather, the desired supply benefits perceived by the Department of Defence are used as the measure of output. The results of this experimental measure will not be included in the main body of this thesis; instead, a separate comparison of the measured results based on both the VA and performance measures of output will be undertaken at the firm and industry level and presented in Annex C. The results based on the performance measure of output will also not be included in the overall defence industry analysis and final conclusion.

Input
The two measures of input used in the productivity calculations were labour and capital.

Labour
Due to data availability, the unit of measurement used for this research on Australian defence industry productivity was the total number of employees reported at the end of each reporting period by each firm. While this unit of labour measure is not ideal, the Organisation for Economic Co‐operation and Development (OECD) deems it acceptable (OECD 2001a p. 40). In an attempt to improve the quality of the labour measure, the total number of employees was adjusted to account for the change in actual hours worked over the period, which may not have been captured in the financial data available (e.g. unpaid overtime). No other adjustments were made to the original raw number of employees data.

Capital
To calculate the flow of capital services, the perpetual inventory method (PIM) was used to develop a productive stock measurement, with the User Cost of Capital (UCC) also calculated for use as a weight in the aggregation of individual asset categories. The methodology used to calculate each component of the overall capital services measurement is detailed below.


As mentioned previously, to calculate a capital services measurement, a measure of the productive stock of capital for each firm was required. To simplify the process, assets were divided into the following three classes, which are commonly listed in firms’ Financial Statements:

1. Plant and Equipment,

2. Land and Buildings20, and

3. Intangibles (excluding categories that do not contribute to production, such as Goodwill and Customer order books).

For each of the three categories of productive stock listed above, a financial measure was required. The three main ways in which such a measure is commonly obtained are the PIM, various survey methods (i.e. based on acquisition prices) or the “balance of fixed assets” method (OECD 2001b p. 39). As stated at the beginning of this section, the PIM was used, which required a deflated time series of investment expenditure for each category as well as estimates of retirement and usage patterns for each category (Coelli et al. 2005 pp. 145–148). Ideally, the investment profile for each class of asset should cover the entire life of the asset; however, this was not possible, as the available data did not cover the entire useful life allocated to each asset category. To mitigate this issue, the opening net book value for the year in which each firm’s productivity analysis commenced was used as an estimate of the current value of the productive stock used by the firm. The annual investment in each asset category was then added to this opening value to determine the total investment series for the period of analysis. The investment series for each asset category was then deflated to a constant 1999 value, with the flow of capital services for each category calculated using the following hyperbolic age‐efficiency profile, consistent with that used by the ABS (Trewin 2000 p. 139):

20 While it would have been preferable to consider land separately, as land is generally considered an appreciating not depreciating asset, data availability prohibited this from occurring.


Vt = Vo(T‐(t‐1))/(T‐ β (t‐1))

Where:

Vt = the market value of an asset at the beginning of year t.

Vo = the market value of an asset at year 0.

β = the slope‐coefficient, which has been set to 0.5 for machinery and equipment and 0.75 for structures (Trewin 2000 p. 253). Machinery and equipment were allocated a lower value than structures, as it is assumed that their efficiency deteriorates faster than in the case of structures.

T = Mean asset lives (years): P&E = 12.1, Land and Buildings = 38, Intangibles = 8 (Trewin 2000 pp. 265‐273).

t = years 1, 2, ..., T.

(source: OECD 2001b p. 61)
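The profile above can be sketched in code as follows. This is an illustrative sketch under stated assumptions, not ABS or thesis code; the mean life (12.1 years) and beta (0.5) for plant and equipment are taken from the text, while the investment series is invented.

```python
# Sketch (not ABS or thesis code) of the hyperbolic age-efficiency profile
# above, Vt = Vo*(T - (t-1))/(T - beta*(t-1)), applied to a constant-price
# investment series to build a productive stock via the PIM.

def age_efficiency(T, beta, t):
    """Relative efficiency of an asset aged t years (t = 1, 2, ...)."""
    if t > T:
        return 0.0  # asset fully retired beyond its mean life
    return (T - (t - 1)) / (T - beta * (t - 1))

def productive_stock(investments, T, beta):
    """investments: deflated investment by vintage, oldest first. Each
    vintage contributes according to its current age-efficiency factor."""
    n = len(investments)
    # The most recent vintage has age 1; the oldest has age n.
    return sum(inv * age_efficiency(T, beta, n - k)
               for k, inv in enumerate(investments))

# Plant and equipment (mean life 12.1 years, beta = 0.5, per the text),
# with three invented annual investments of 100 in constant 1999 prices:
stock = productive_stock([100.0, 100.0, 100.0], 12.1, 0.5)
print(round(stock, 1))
```

The hyperbolic shape means an asset loses efficiency slowly when new and increasingly quickly as it approaches the end of its mean life.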

Once a measure of capital services was calculated for each category, UCC was required to assist in the aggregation of the three asset categories. The UCC consists of a depreciation charge21 for the decline in value of equipment and an investment charge to account for the opportunity cost of the capital investment. The standard expression for the user cost of an asset is:

User cost = Vt (dt + rt)

Where: Vt = market value of a new asset at constant price, dt = the depreciation on a

new asset, and rt = a measure of investment charge.

(source: OECD 2001b p. 86)

The depreciation charge was calculated on the gross capital stock for each category (i.e. the actual deflated investment series) using the diminishing value method, with the same effective lives as those used to calculate productive stock (Trewin 2000 p. 263). The investment charge was calculated on the net capital stock for each category, after accumulated depreciation was considered, using a rate of 8%, which was the average rate of return for manufacturing calculated by the ABS between 1985 and 2006 (Pink 2007 p. 108). Using the UCC as weighting, the total productive stock for each firm was then calculated. To ensure all capital‐related investments were considered, operating lease costs were also added to the aggregated asset categories to form the total annual capital stock for each firm.

21 Also known as the consumption of fixed capital (OECD 2001b p. 63)
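The user-cost weighting step can be sketched as follows. This is an illustrative sketch only: the 8% investment charge follows the text, but the asset categories' values, depreciation rates and productive stocks are invented.

```python
# Sketch of the user-cost weighting described above: user cost = V*(d + r),
# with r the 8% investment charge. All asset values and depreciation rates
# below are invented for illustration.

INVESTMENT_CHARGE = 0.08  # ABS average manufacturing rate of return, 1985-2006

def user_cost(net_value, dep_rate, r=INVESTMENT_CHARGE):
    """Standard user cost of capital: V * (d + r)."""
    return net_value * (dep_rate + r)

# Hypothetical categories: (net stock, diminishing-value depreciation rate,
# productive stock in constant prices).
categories = {
    "plant_equipment": (500.0, 0.124, 520.0),
    "land_buildings":  (900.0, 0.039, 910.0),
    "intangibles":     (200.0, 0.188, 210.0),
}

total_uc = sum(user_cost(v, d) for v, d, _ in categories.values())
# User-cost shares aggregate the three categories into one capital measure:
capital_services = sum(user_cost(v, d) / total_uc * stock
                       for v, d, stock in categories.values())
print(round(capital_services, 1))
```

Weighting by user cost rather than by market value gives short-lived, fast-depreciating assets (which deliver their services quickly) a proportionately larger influence on the aggregate.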

It should also be noted that, while it is highly likely that the capital utilisation of defence industry firms varied during the period covered by the study, as various Department of Defence projects were commenced and completed, the lack of data available on the usage patterns of capital equipment has resulted in no capital usage adjustment being made in this study. As such, an underlying assumption of full capital utilisation has been made in this analysis.

Capital and labour shares of input
Once the units of measurement are known for all inputs, a method for combining these inputs into a single measure is required, as not all inputs have the same influence on the production process. One of the simpler methods is based on the cost of each input over the period. Where only capital and labour inputs are considered, the formulae applied under this method are as follows:

Sk = KPk / (KPk + LPl)

Sl = LPl / (KPk + LPl)

Where: Sk = Capital share, Sl = Labour share, Pk = rental price of capital services, Pl = price of labour.

To ensure that Sk and Sl sum to unity, three primary assumptions are made: constant returns to scale; that the marginal products of capital and labour are equal to their respective real market prices (i.e. perfect competition); and neutral technological change (Kendrick 1973b Ch2 p. 12). This method of input aggregation is used by both the United States Bureau of Labor Statistics (BLS) (BLS 2007 p. 7) and the ABS (Trewin 2000 p. 368).

Data deflation
To remove inflationary effects when measuring productivity, it is recommended that all data used be deflated to a constant baseline. The selection of appropriate deflators is therefore crucial in the development of a productivity calculation, as they transform the raw data, commonly presented in current figures, into constant expenditure and revenue series (Coelli et al. 2005 p. 146). All data used in the productivity calculations were deflated to a base year of 1999, despite some companies’ MFP measurement not commencing until one to three years later. This ensured that all figures were constant when developing the overall defence industry MFP and LP indexes.
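The mechanics of deflating to a 1999 baseline can be sketched as follows; the nominal series and price index values are invented for illustration.

```python
# Sketch of deflation to a constant 1999 baseline: divide each year's
# nominal figure by the ratio of the price index to its 1999 value.
# The series below are invented for illustration.

def deflate(nominal, price_index, base_year):
    """nominal, price_index: {year: value}. Returns a constant-price
    series expressed in base_year prices."""
    base = price_index[base_year]
    return {year: value * base / price_index[year]
            for year, value in nominal.items()}

nominal_sales = {1999: 100.0, 2000: 110.0, 2001: 121.0}
ppi = {1999: 100.0, 2000: 104.0, 2001: 108.0}
constant = deflate(nominal_sales, ppi, 1999)
print({y: round(v, 2) for y, v in constant.items()})
```

In the example, the apparent 10% nominal growth in 2000 shrinks to roughly 5.8% once the 4% price rise is removed.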

Output deflator
At the industry or national level, common deflators used by statistical agencies such as the ABS in their analysis of productivity are high‐level implicit price indexes such as the Consumer Price Index (CPI) or the Gross Domestic Product (GDP) deflator (Pink 2007). While this is reasonable at an aggregate national level and possibly at the industry level, it is not preferable at the firm level. As stated by Morrison (1992 p. 136), the deflator used should be one that represents the materials used and produced by firms; as such, where possible, sub‐sector Producer Price Indexes (PPIs) are more appropriate for deflationary purposes than the high‐level indexes. The importance of applying a deflator at the lowest (or most appropriate) level possible is illustrated in Chart 4.1 and Chart 4.2 below, which highlight the significant differences in the Australian manufacturing industry sub‐sector indices.

Chart 4.1 ‐ ABS 6427.0 Producer Price Indexes ‐ Aircraft Manufacturing and Repair Services and Shipbuilding and Repair Services

(source: ABS 6427.0 Producer Price Indexes, Australia)


Chart 4.1 above is a comparison of the PPIs for aircraft manufacturing and repair services (series ID A2307734X) and shipbuilding and repair services (series ID A2307728C), taken from Tables 10 and 11 of ABS 6427.0 Producer Price Indexes, Australia. Over the period 1999–2009, there is a significant difference in the movement of these indexes, which should be taken into account considering the different outputs produced by each defence industry firm. For example, the Australian Submarine Corporation’s primary output would be classified as shipbuilding, whereas Boeing Australia’s primary output is aircraft oriented. Similarly, a separate index should be considered for the input side (i.e. cost of goods sold) in addition to the direct output side (i.e. sales), as there is also a significant difference between the “articles produced” and “materials used” indexes for the same manufacturing divisions, as can be seen in Chart 4.2 below.

Chart 4.2 ‐ ABS 6427.0 Producer Price Indexes ‐ Materials Used In and the Articles Produced By Manufacturing Division 22 Fabricated Metal Product Manufacturing

(source: ABS 6427.0 Producer Price Indexes, Australia)

Chart 4.2 shows the ABS 6427.0 PPI for the materials used in and the articles produced by manufacturing Division 22 fabricated metal product manufacturing. It is evident that the two indexes vary over the period and, to ensure that any inflationary component is removed, this difference should be taken into consideration when calculating the VA annual figure.


In an attempt to account for the significant differences in sub‐sector PPIs for manufacturing, unique firm indexes were created for the deflation of the main variables used in calculating the VA output measure of the productivity calculations (see Annex A Table A.5 ‐ Australian Defence Industry Deflator – Sales Data and Annex A Table A.6 ‐ Australian Defence Industry Deflator – Cost of Goods Sold Data). In 2004, as part of a defence industry study, ACIL Tasman identified the Australian and New Zealand Standard Industrial Classification (ANZSIC) codes used by the ABS that were most likely to capture Australian defence industry firm data (ACIL Tasman 2004 p. D‐1). Using this information, in addition to data sourced from various Annual Reports and Financial Statements (such as the principal activities paragraph contained in the Director’s report of a Financial Statement), different combinations of ABS PPI indexes were used for each firm to create unique firm indexes for the deflation of the variables used to calculate each firm’s annual value added figures.

A basic weighting system was also used to account for differences between the goods sold and services provided by each firm, because for the majority of firms the revenue earned from services, in addition to that earned from products, was found to be significant. Table 4.1 below details the various weights used for each firm.

Table 4.1 ‐ Australian Defence Industry Deflator – Cost Of Goods Sold

Firm      Goods weighting  Services weighting  Data source
ADI       75%              25%                 2005 Annual Financial Statement
ASC       67%              33%                 2009 Annual Report
AA        50%              50%                 Australian Aerospace internet site
BAE       50%              50%                 BAE Systems internet site
Boeing    34%              66%                 2009 Annual Financial Statement
CAE       64%              36%                 2009 Annual Financial Statement
GDLS‐A    100%             0%                  2008 Annual Financial Statement
Raytheon  50%              50%                 2009 Annual Financial Statement
Tenix     66%              34%                 Tenix internet site
Thales    66%              34%                 2005 Annual Financial Statement

(source: firm goods and services weightings calculated by author)

The method used to determine the weightings detailed in Table 4.1 differed depending on the data available for each firm. Where stated in the latest available annual Financial Statement, the split between revenue earned from the sale of goods and revenue earned from services was used22. Where the revenue breakdown was not specified in the firm’s Financial Statement, the weightings were based on the principal activities listed in the Director’s report of the Financial Statement, in consideration of the activities and projects being undertaken by the firm at the time. While this method of weighting may not be entirely precise, it is an attempt to deflate the raw data while acknowledging that a significant portion of revenue for the majority of firms is earned via the supply of services in addition to the sale of products.
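Blending two price indexes by revenue mix can be sketched as follows. The 34%/66% split mirrors the Boeing row of Table 4.1, but the index values themselves are invented.

```python
# Sketch of the firm-specific deflator weighting above: blend a goods PPI
# with a services price index using the firm's goods/services revenue mix.
# The 34%/66% split mirrors the Boeing row of Table 4.1; the index values
# are invented.

def blended_deflator(goods_index, services_index, goods_weight):
    """Revenue-mix-weighted combination of two price indexes for one year."""
    return goods_weight * goods_index + (1.0 - goods_weight) * services_index

goods_ppi = 112.0       # e.g. an aircraft manufacturing PPI value
services_index = 118.0  # e.g. a relevant services price index value
firm_deflator = blended_deflator(goods_ppi, services_index, 0.34)
print(round(firm_deflator, 2))
```

Applying this per firm and per year yields the unique firm indexes described above, so a services-heavy firm's deflator tracks services prices more closely than the goods PPI alone would.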

Labour deflator
Labour expense data for all firms was deflated using an unweighted combination of two ABS Labour Price Indexes relating to the manufacturing and the professional, scientific and technical services industries. Unlike general manufacturing, defence manufacturing involves a significant proportion of specialised workers due to its high‐tech nature. The combination of the two ABS labour indexes was used in an attempt to obtain a labour deflator that more closely resembled this mix, rather than using a general industry or national labour deflator.

Capital deflator
Capital data for all firms was deflated using the ABS Non‐farm Gross Domestic Product implicit price deflator. Since capital includes a wide variety of items, ranging from computers and cars to industrial machinery and completed buildings, the use of a broader economic deflator was considered appropriate.

22 For BAE and Thales, the weights assigned were based on activities immediately prior to their acquisition of Tenix and ADI, respectively.

Productivity distribution
To determine the distribution of defence industry generated productivity benefits to stakeholders, several measures were calculated. The distribution of productivity benefits being obtained by defence industry firms was determined using a measure of profit. For defence industry employees, the distribution of productivity benefits was determined using a measure of wages. To determine whether productivity benefits were being distributed to the Department of Defence, a measure of the performance of the defence industry in relation to the quality of goods and services it provides to the Department of Defence was used. In addition to these three measures investigating the distribution of productivity benefits to specific stakeholders, a measure was also calculated to determine the level of competition in the defence industry market and provide insight into the potential magnitude of total benefits distributed to each stakeholder.

Each of the measures outlined below was compared with the productivity results calculated for the defence industry during the period to determine the effect of changes in productivity on the distribution of productivity benefits to stakeholders. Comparisons were also made between the defence industry results and similar results calculated for ABS industries such as the manufacturing industry. The comparison between industries provided a reference point for expectations of how productivity benefits should be distributed compared to the observable distribution of productivity benefits within a competitive market.

Defence industry firm profits
To determine the level of productivity benefits being distributed to firms, a measure of profit was used. However, comparing absolute values of profit to determine the relative performance of one firm against another can be misleading, as a large inefficient firm may appear more profitable than a smaller efficient firm simply due to its size. To account for this potential discrepancy, numerous profitability ratios have been developed, with the most commonly used based on a firm’s revenue, assets, equity or market/shareholder value (Gow and Kells 1998).

Ratios based on equity or market/shareholder values were not used in this research for two main reasons. Firstly, as the firms examined in this study are not listed on a stock exchange, there is no simple way to determine a market or shareholder value, and attempting to calculate a value in lieu is beyond the scope of this thesis. Secondly, as eight of the ten companies are wholly owned by foreign multinationals, which have provided guarantees to cover any operational costs the firm may not be able to pay in a particular year, the equity value is likely to be a poor basis for a profitability measure. This is highlighted by the fact that several of the Australian defence industry firms experienced prolonged periods of “negative equity” as a result of accumulated losses being greater than the original contributed equity obtained from their parent company. The use of a negative equity figure could also prove misleading if it coincided with a negative profit figure for the same period, which would result in a positive rate of return.

The return on assets ratio is also commonly used as a measure of profitability, as it provides an indication of how well a firm can generate income relative to its asset base. In addition, the return on assets ratio can be used as a proxy for the firm’s “capital productivity”, in the sense that an increase in the ratio could be the result of an increase in productivity (Gow and Kells 1998 p. 11). However, due to the lack of comparable manufacturing statistical information available to allow a comparative analysis between a defence industry return on assets result and the manufacturing industry, this ratio was not pursued in the research.

The ratio considered appropriate in the context of this research, and the one used, is based on the firms’ revenue and estimated profit before depreciation, interest and tax (EBDIT), as follows:

EBDIT Profit Margin = EBDIT /revenue

A profit ratio using EBDIT in preference to other measures, such as profit after tax (PAT) or profit before tax (PBT), is preferable, as it provides not only an indication of the dollar amount retained by the firm, which is then available to distribute to shareholders as net profits or to the government in the form of taxes, but also the amount available to debt holders in the form of interest and to employees in the form of increased wages. EBDIT is also preferred over the more commonly used earnings before interest and tax (EBIT), as it removes possible “accounting fictions” (Gow and Kells 1998 pp. 9–12) caused by creative accounting relating to the treatment of depreciation and amortisation, as well as abnormal “one‐off” non‐cash items, such as project impairments, which can also affect reported profit figures.

In addition to the accounting profitability measures above, firms’ economic profits were also calculated. The firm’s economic profit was then used to calculate the economic profit margin, defined as the economic profit divided by total revenue.


The economic profit measure for each firm used in the above formula was calculated as follows:

Economic profit = EBIT – Capital Charge (i.e. invested capital x WACC)

To calculate the capital charge used in the economic profit calculation above, the following method was applied for each firm. Firstly, an invested capital amount was calculated based on the firm’s balance sheet, adjusted to account for items that are capital in nature but not listed on the balance sheet, such as operating lease payments. A weighted average cost of capital (WACC), consisting of a debt and an equity component, was then developed to calculate a nominal capital charge on the invested capital. The cost of debt component was based on the Australian Government ten‐year bond rate, while the cost of equity component was calculated using the capital asset pricing model (CAPM):

Cost of equity = Risk‐free rate + (beta x equity premium)

Where: Risk‐free rate = Australian Government ten‐year bond rate, beta = 1, and equity premium = 4.5%.

The beta used in the cost of equity calculation above is a measure of investment risk relative to the broader market. A beta of 1 indicates that the risk associated with the firm is on par with the market, while a beta higher (lower) than 1 implies greater (less) risk. As the literature reviewed in Chapter 4 is mixed in relation to the risk associated with defence business, a conservative approach was adopted, with a beta of 1 used. The choice of beta also determined the equity premium, as the higher the risk (beta), the higher the return on investment expected by firms. Since a beta of 1 was used, an equity premium of 4.5% was applied, estimated as the average Australian market risk premium in 2005 (Capital Research 2005 p. 1). The economic profit results were then used to provide an indication of whether or not the defence industry is experiencing excessive, or monopolistic, levels of profitability.
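The economic profit calculation can be sketched as follows. This is an illustrative sketch with invented inputs: the risk-free rate stands in for the Australian Government ten-year bond rate, beta = 1 and the 4.5% equity premium follow the text, and the 30/70 debt/equity split is a hypothetical capital structure.

```python
# Sketch of the economic profit calculation above, with invented inputs.
# The risk-free rate stands in for the ten-year bond rate; beta = 1 and
# the 4.5% equity premium follow the text. The 30/70 debt/equity split is
# a hypothetical capital structure, not a figure from the thesis.

def cost_of_equity(risk_free, beta=1.0, equity_premium=0.045):
    """CAPM: risk-free rate + beta * equity premium."""
    return risk_free + beta * equity_premium

def economic_profit(ebit, invested_capital, wacc):
    """EBIT less a capital charge on invested capital."""
    return ebit - invested_capital * wacc

risk_free = 0.055                 # hypothetical ten-year bond rate
ke = cost_of_equity(risk_free)    # 0.055 + 0.045 = 0.10
kd = risk_free                    # cost of debt = bond rate, per the text
wacc = 0.3 * kd + 0.7 * ke        # hypothetical 30% debt / 70% equity mix
ep = economic_profit(ebit=50.0, invested_capital=400.0, wacc=wacc)
print(round(ep, 2))
```

A positive result indicates earnings above the opportunity cost of the capital employed; sustained large positive values across the industry would suggest the excessive, or monopolistic, profitability the measure is designed to detect.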

After the individual firm accounting and economic profit measures were calculated, the results were aggregated to create an overall defence industry profitability result. Similar to the aggregation method used to determine the defence industry productivity indexes, a weighting system based on value added was used in the development of the annual defence industry profit figures.

While the profit ratios discussed above are useful in comparing the profitability of various firms and industries, they cannot be compared directly to associated changes in the productivity of a firm or industry. As discussed above, an EBDIT profit margin provides a measure of profitability relative to revenue earned during a particular period, whereas a measure of productivity growth relates to the change in productivity over two or more periods. To enable a direct comparison with changes in productivity, changes in the aggregate profit levels for the defence industry were used. This enabled an analysis to determine the potential effect of changes in productivity on changes in profits. Using an appropriate industry, such as the manufacturing industry, as a baseline, changes in profits relative to changes in productivity should provide an indication of the level of productivity benefits being retained by the defence industry in the form of additional profit.

Defence industry employee wages

Changes in the annual wages of defence industry employees relative to changes in defence industry productivity were used to provide evidence of the distribution of productivity benefits to defence industry employees. To calculate average annual wages, the aggregated annual employee expense for the defence industry was divided by the total number of employees disclosed in the defence industry firms' Financial Statements, with no adjustment made to the employee expense or employee number figures provided by the firms. Adjusting the employee numbers for changes in hours worked over the period was considered; however, as the relevant ABS statistical data are not specific to the defence industry, it was determined that, rather than risk introducing error into the initial calculation, the ABS data would be better used in explaining any subsequent analysis.

To help ensure comparability between the defence industry and the ABS-defined industries used as comparators, the same method used for calculating annual wages for the defence industry was applied to these three ABS-defined industries: total annual employee compensation divided by the number of employees.
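The wage calculation applied identically to the defence industry and the ABS comparators is a simple ratio; a minimal sketch with hypothetical expense and headcount figures:

```python
# Sketch of the average annual wage calculation: aggregate employee
# expense divided by total employees, applied identically to the defence
# industry and an ABS-defined comparator. All figures are hypothetical.

def average_annual_wage(total_employee_expense: float, total_employees: int) -> float:
    return total_employee_expense / total_employees

defence_wage    = average_annual_wage(840e6, 12_000)       # $70,000
comparator_wage = average_annual_wage(52_000e6, 800_000)   # $65,000
print(defence_wage, comparator_wage)
```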

Defence industry performance

To determine the extent of productivity benefits being distributed to the Department of Defence, the performance of the defence industry in relation to the quality of goods and services provided was measured.

Since 1999, the Defence Materiel Organisation Company ScoreCard program has been used to monitor the performance of significant23 prime and sub-contractors, with a performance score allocated at the end of each round to each of the firm's major contracts in progress. An overall performance score was calculated by including all the Company ScoreCard performance parameters (Table A.1 - Defence Company ScoreCard Performance Parameters), and this was used as the primary measure of the quality of goods and services provided by the Australian defence industry. Separate performance scores were also calculated on the contract cost parameter and the contract schedule parameter individually, to obtain measures of cost performance and schedule performance, respectively. A weighted average of the three individual performance scores calculated for each firm was then taken to produce overall performance, cost performance and schedule performance scores24 for the defence industry. These three defence industry performance scores were then converted into indexes for analytical purposes. The weighting system used for all calculations was based on the value added generated by each firm.

Defence industry competition

To determine the level of competition in the Australian defence industry, the measure used was the same as that used by the United States Government Accountability Office in 2005, when the United States Defense Department procurement method type was used as a proxy measure of the level of competition in the United States defence industry market (Chan 1997). Applying the same methodology, the following methods of procurement25 used by the Department of Defence (DMO 2010b), defined below, were categorised as either competitive or non-competitive:

1. Open tender – an open tender is one in which a request for tender is published and all submissions received from firms before the submission deadline are considered, provided they satisfy the conditions for participation.

2. Closed/Restricted Tender – a closed / restricted tender is where only a certain number of firms are invited to participate in the tender process. This process may be used where a

23 A detailed list of criteria used to determine whether a contractor is significant is contained in Chapter 3. 24 The actual numerical weights applied to each of the performance categories, as well as the numerical assessment rating scores calculated, are not presented because this information is Department of Defence Commercial-in-Confidence. As the results are presented in index form, the omission of the actual scores has no impact on the overall performance analysis presented in Chapter 7. 25 Method of procurement terminology is based on the methods of procurement historically used in the Department of Defence Interim Defence Contract Register.

licence or specific legal requirements may restrict potential suppliers, where a standing offer panel arrangement is already in place, or when an open tender expression of interest has been issued, with only highly ranked suppliers being requested to continue with the tender process.

3. Select tender – this is the same as a closed / restricted tender.

4. Staged procurement – a staged procurement process is generally used for high‐value complex procurements and involves the use of a staged or structured acquisition strategy where the procurement process is broken into more manageable parts.

5. Orders against standing offer / orders placed against standing offer – standing offers are used to facilitate repetitive acquisition of goods and services over a specific period according to set terms and conditions. Orders against standing offers are made with suppliers on a standing offer panel, where the panel may have been created under an open, restricted or sole‐sourced tender process.

6. Sole source – a sole-source procurement process is one in which only one supplier is considered. Sole sourcing is generally used in the absence of competition for technical reasons, where market research has identified only one possible supplier, or when unforeseen events require the urgent acquisition of an item26.

7. Direct source – same as sole source.

26 A complete list of the conditions for sole sourcing is contained in the Commonwealth Procurement Guidelines Para. 8.33.

Based on the above definitions, the various Department of Defence procurement methods were categorised as either competitive or non‐competitive, with the results listed in Table 4.2 below.

Table 4.2 ‐ Procurement Methods Categorised as Competitive or Non‐Competitive

Competitive methods of procurement: Open Tender; Closed/Restricted Tender; Select Tender; Staged Procurement; Orders against standing offer / Orders placed against standing offer.

Non-competitive methods of procurement: Single supplier direct source; Direct source; Sole source; Orders against standing offer / Orders placed against standing offer.

In categorising the methods of procurement in Table 4.2 above, the inherently non-competitive nature of the defence market was taken into consideration, as even a slight increase in the level of competition could have a noticeable impact. This led to the decision to classify the 'closed/restricted tender' and 'select tender' methods of procurement as competitive in nature, despite the possibility that in a competitive market these methods might be considered more non-competitive than competitive. Additionally, the category 'orders against standing offer / orders placed against standing offer' was classed as both competitive and non-competitive, because this particular procurement method allows either a competitive or a non-competitive approach. During the actual categorisation of procurement data, contracts in this category were reviewed to determine whether or not they were competitive. This process is discussed in further detail in Chapter 7.
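The categorisation logic described above and summarised in Table 4.2 can be sketched as follows. The per-contract flag for standing-offer orders is a hypothetical stand-in for the contract-level review described in Chapter 7.

```python
# Sketch of the competitive / non-competitive categorisation in Table 4.2.
# The standing-offer category requires a contract-level review, represented
# here by a hypothetical per-contract flag.

COMPETITIVE = {"open tender", "closed/restricted tender", "select tender",
               "staged procurement"}
NON_COMPETITIVE = {"single supplier direct source", "direct source", "sole source"}

def is_competitive(method: str, standing_offer_competitive: bool = False) -> bool:
    m = method.lower()
    if m in COMPETITIVE:
        return True
    if m in NON_COMPETITIVE:
        return False
    if "standing offer" in m:
        # Classified either way; resolved by reviewing the contract itself.
        return standing_offer_competitive
    raise ValueError(f"Unknown procurement method: {method}")

print(is_competitive("Open Tender"))   # True
print(is_competitive("Sole source"))   # False
print(is_competitive("Orders against standing offer", standing_offer_competitive=True))
```

Applied over the contract register, a classifier of this shape yields the competitive share of contract value used as the competition measure in Chapter 7.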

Chapter summary

This chapter detailed the various methods used in the calculation of the productivity measurement and apportionment empirics. The data used in the measurement methods are outlined in Chapter 5.


Chapter 5: Data and measurement issues

This chapter outlines the sources of the data used in the productivity measurement and distribution calculations. It details the development of the variables used in the empirical work, as well as outlining various assumptions made in the practical application of the theory. Measurement issues associated with the data used are also addressed, as these issues are likely to affect the quality of the results to some extent. The chapter is divided into two main sections: the first relates to the calculation aspects of the defence industry productivity measurement, while the second relates to the productivity distribution measurement calculations.

Productivity measurement – data

Scope

As detailed in Chapter 1, the scope of this study is limited to ten Australian defence-oriented firms, which account for approximately 77% of the Australian defence industry, with the period of analysis being 1999 to 2009 inclusive. While it would have been preferable to use a longer period of analysis, the lack of data for the main variables listed below precluded this. During the period selected, data availability and quality issues resulted in only 87 (84%) of the total 104 possible observations27 being usable (refer Annex A Table A.3 - Australian Defence Industry Sample Size and Scope).

Measurement of variables

For the purpose of this study, three primary variables were used in the calculation of multifactor and single-factor productivity measures for the defence industry: value added for output, and capital and labour for inputs.

Unless otherwise stated, the data used to calculate each of the productivity measurement variables discussed below were taken from the individual firm annual Financial Statements submitted to the Australian Securities and Investment Commission or published on firms’ respective internet sites, as listed in Annex F – Firm Annual Reports and Financial Statements.

Output variable

The primary method of output measurement for the Australian defence industry productivity analysis discussed in Chapter 3 was the VA output measure, as follows:

27 All firms had 11 possible observations, with the exception of ADI and Tenix, which had a maximum of 7 and 9 observations, respectively, due to mergers with other firms that occurred during the period.

VA = YGO − I

where YGO = gross output and I = intermediate inputs.

The practical application of the VA output measure results in the following Financial Statement line items28 being used as proxy measures for the gross output and intermediate input:

YGO = gross output = sales = “cash receipts from customers”, and

I = intermediate inputs = cost of supplies = “cash payments to suppliers and employees” ‐ “employees expense”.

In developing the VA figure, the "cash" items detailed above were obtained from the Cash Flow Statement, while the employee expense figure was obtained from the Statement of Financial Performance, as it was not identifiable in the Cash Flow Statement. One effect of obtaining the employee expense item from the Statement of Financial Performance rather than the Cash Flow Statement is that a slight discrepancy may exist, as the actual amount paid to employees (cash) may vary from the amount listed as the annual employee expense (accrual). This discrepancy is unlikely to be more than 2% (or seven days' wages), depending on when the final salary run for the year occurs relative to the end of the reporting period. In any event, it is likely to have little if any impact on the actual productivity calculation.
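The proxy construction above can be sketched as follows; the dollar amounts are hypothetical.

```python
# Sketch of the value added proxy built from Cash Flow Statement items,
# with the employee expense taken from the Statement of Financial
# Performance. All dollar amounts are hypothetical ($m).

def value_added(cash_receipts_from_customers: float,
                cash_payments_to_suppliers_and_employees: float,
                employee_expense: float) -> float:
    """VA = YGO - I, where I = payments to suppliers and employees
    less the (accrual) employee expense."""
    intermediate_inputs = (cash_payments_to_suppliers_and_employees
                           - employee_expense)
    return cash_receipts_from_customers - intermediate_inputs

print(value_added(500.0, 420.0, 180.0))  # 500 - (420 - 180) = 260.0
```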

Input variables

Labour variable

The line item "employees expense" was used as a proxy for the cost of employees, while the annual number of employees for each firm was taken from the line item "number of employees at year end", generally contained in the Notes to the Financial Statements. Of the 89 possible labour observations during the period, 10 (11%) of the number-of-employees figures were not available (refer Annex A Table A.4 - Australian Defence Industry Employee Generated Data). Where this occurred, a figure was calculated using the previously listed average employee expense after adjusting it using the labour index outlined further in this chapter (e.g. Company X 2007 "number of employees" =

28 The actual line item descriptor used by individual firms may differ slightly in wording from the ones used in this thesis. Where this occurred, the actual figures being reported were checked to ensure consistency between the firm data used.

2007 "employee expense" / 2006 average "employee expense" inflated to 2007 using the ABS 6345.0 Labour Price Index, Australia).

As stated in Chapter 4, the raw employee numbers were adjusted using the ABS 5204.0 Australian System of National Accounts (Table 15; Manufacturing; Hours worked) index to account for the change in actual hours worked over the period, which may not have been captured in the financial data available (e.g. unpaid overtime). No other adjustments were made to the original number-of-employees data.
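The two labour-data adjustments described above might be sketched as follows; all index values and dollar figures are hypothetical.

```python
# Sketch of the two labour-data adjustments described above: imputing a
# missing employee count from the prior year's average employee expense
# inflated by a labour price index, and scaling employee numbers by an
# hours-worked index. All figures are hypothetical.

def impute_employees(expense_t: float, avg_expense_prev: float,
                     lpi_prev: float, lpi_t: float) -> float:
    """expense_t / (prior-year average expense inflated to year t)."""
    inflated_avg = avg_expense_prev * (lpi_t / lpi_prev)
    return expense_t / inflated_avg

def hours_adjusted(employees: float, hours_index_t: float,
                   hours_index_base: float) -> float:
    """Scale headcount by the change in hours worked (ABS 5204.0 style)."""
    return employees * (hours_index_t / hours_index_base)

n = impute_employees(expense_t=105e6, avg_expense_prev=100_000.0,
                     lpi_prev=100.0, lpi_t=104.0)
print(round(n))                          # ~1010 employees
print(hours_adjusted(n, 102.0, 100.0))   # headcount scaled for hours worked
```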

Capital variable

The line items used in the calculation of the productive capital stock were "Plant and Equipment", "Motor Vehicles", "Land and Buildings", "Leasehold Improvements", and intangible line items considered relevant to the production process, such as "Software" and "Licences and Patents".

The investment series for each category, required for the capital services measure detailed in Chapter 4, was calculated using the combined annual cash expense amounts listed in the Cash Flow Statements plus the closing Net Book Value (NBV), for the year prior to the first year analysed, listed in the Balance Sheet section of the Financial Statements. For the three asset categories used, the resulting investment formulae were:

1. Plant and Equipment = NBV (base year) + annual cash “Purchase of Property, Plant and Equipment” (listed in the statement of cash flows).

2. Land and Buildings = NBV (base year).29

3. Intangibles = NBV + annual identified purchase + R&D Expenditure.

The investment series for each asset category was then deflated to a constant 1999 value, with the productive stock for each category calculated using a hyperbolic age-efficiency profile consistent with that used by the ABS (Trewin 2000 p. 139). The deflated expense item "Operating lease rental – Minimum lease payments" was added to the aggregated asset categories to form an overall annual capital stock for each firm.
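A sketch of the productive stock calculation under a hyperbolic age-efficiency profile is given below; it is consistent in form with the ABS approach cited above, but the asset life of 8 years and curvature parameter b = 0.5 are illustrative assumptions, not values taken from the thesis.

```python
# Illustrative productive capital stock using a hyperbolic age-efficiency
# profile. The asset life (8 years) and curvature parameter (b = 0.5) are
# assumptions for the example only.

def age_efficiency(age: int, life: int, b: float = 0.5) -> float:
    """Hyperbolic profile: (life - age) / (life - b * age), zero past life."""
    if age >= life:
        return 0.0
    return (life - age) / (life - b * age)

def productive_stock(investment_by_year: dict, year: int,
                     life: int = 8, b: float = 0.5) -> float:
    """Sum of past constant-price investments weighted by remaining efficiency."""
    return sum(inv * age_efficiency(year - y, life, b)
               for y, inv in investment_by_year.items() if year - y >= 0)

investment = {1999: 100.0, 2000: 50.0, 2001: 80.0}   # constant 1999 prices
print(round(productive_stock(investment, 2001), 1))  # → 212.4
```

The hyperbolic shape means an asset loses little efficiency early in life and declines rapidly towards the end, in contrast to straight-line depreciation.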

29 There were no additional investments in Land and Buildings during the period of analysis for any of the Australian defence industry firms used.

Data deflators

Output variable

The ABS 6427.0 Producer Price Indexes Tables 10, 11 and 22 were used to create the unique individual firm sales data deflators (Annex A Table A.5 - Australian Defence Industry Deflator – Sales Data), while the indexes for the corresponding categories in ABS 6427.0 Producer Price Indexes Table 14 were used to create the individual firm deflators for the cost of goods sold data (Annex A Table A.6 - Australian Defence Industry Deflator – Cost of Goods Sold Data).

Labour variable

The average of the ABS 6345.0 Labour Price Indexes series A2712246X (Financial Year Index; Ordinary time hourly rates of pay excluding bonuses; Australia; Private; Manufacturing) and series A2712236V (Financial Year Index; Ordinary time hourly rates of pay excluding bonuses; Australia; Private; Professional, scientific and technical services) was used in the deflation of the labour expense.

Capital variable

Capital data for all companies were deflated using the ABS 5206.0 Australian National Accounts; Non-Farm; Gross domestic product; implicit price deflators.
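Deflation to constant base-year prices follows the standard pattern sketched below; the index values are hypothetical, whereas the actual deflators are the ABS series listed above.

```python
# Minimal sketch of deflating a nominal series to constant 1999 prices
# with a price index. Index values are hypothetical; the actual deflators
# are the ABS series listed above.

def deflate(nominal: float, index_t: float, index_base: float) -> float:
    """Convert a nominal value to base-year prices."""
    return nominal * index_base / index_t

index           = {1999: 100.0, 2000: 103.0, 2001: 106.5}
nominal_capital = {1999: 40.0, 2000: 45.0, 2001: 47.0}   # $m, current prices

real = {y: deflate(v, index[y], index[1999]) for y, v in nominal_capital.items()}
print({y: round(v, 2) for y, v in real.items()})
```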

Measurement issues

The primary use of accounting data taken from Financial Statements raises some theoretical and practical measurement issues, primarily because the theory of productivity measurement is based on economic data, while the practical application is generally based on financial or accounting data, which is not necessarily the theoretical equivalent. For example, accounting profit and economic profit can vary considerably, as accounting profit can be influenced by factors other than production, such as the taxation regime, financial structure and accrual assumptions (e.g. provisions), to name a few (Van Biesebroeck 2007 p. 532). In contrast, economic measures are not affected by these factors.

Another aspect of productivity theory where the accounting and economic practices diverge is in relation to asset valuation and usage. For accounting purposes there are several different methods for valuing assets (i.e. at historical cost, market value or depreciated cost), with the method used chosen by the firm. In theory, the method chosen by the firm should reflect the way in which the asset is consumed, similar to the economic usage; however, in practice, the


chosen method is more likely to be accounting motivated to benefit the profit figure rather than complying purely with economic theory (Van Biesebroeck 2007 p. 523). The asset values included in Financial Statements also do not include a measure of “opportunity cost” for the capital invested.

Where possible, to limit the effect of the accounting issues identified above, data from the Cash Flow Statement were used in preference to data available in the Statement of Financial Performance. The rationale behind this decision was that actual cash flow is a more accurate indicator of the flow of resources in and out of the firm than the accrual figures presented in the Statement of Financial Performance.

In addition to the theoretical issues related to accounting and economic data, the practical application of the theory was also found to be challenging, especially with regard to the presentation and consistency of individual line items disclosed in the individual firm Financial Statements. In some instances the costs of production were combined into the expense category 'cost of goods sold' rather than being individually disclosed. The composition of this category also varies between firms during the period of analysis; for example, some firms include all costs related to production, including salaries and wages, while others include only the cost of materials. To complicate matters, over several years the composition of the 'cost of goods sold' category varied even within the same firm, in response to changing Accounting Standards and disclosure requirements.

Comparative ABS industry data

ABS industry data for the manufacturing and mining industries used in the comparative analysis were obtained from ABS 5260.0.55.002 Experimental Estimates of Industry Multifactor Productivity, 2009-10.


Productivity benefit distribution – data

Defence industry profit measure

Data used for the individual firm and defence industry profitability measures were obtained directly from the firms' Financial Statements (Annex F). Manufacturing industry data used for the comparative analysis were obtained from the ABS catalogues 8155.0 – Australian Industry 2008-09, 8155.0 – Australian Industry 2005-06 and 8155.0 – Australian Industry 2002-04.

Defence industry employee wage measure

Data used to calculate the defence industry average annual wages were obtained from the firms' annual Financial Statements (Annex F), with only those observations used where both the number of employees and the employees' expenses were listed in the Financial Statements. This resulted in 49 (55%) of the total 88 possible labour-related observations being used (refer Annex A Table A.4 - Australian Defence Industry Employee Generated Data).

Employee compensation data for the ABS‐defined manufacturing, professional, scientific and technical services and mining industries were obtained from ABS 5204.0 Australian System of National Accounts, Table 48. Compensation of Employees, by Industry ‐ Current prices, while the number of employees for each industry was obtained from ABS 6291.0 Labour Force, Australia, Detailed, Quarterly, Table 04. Employed persons by Industry ‐ Trend, Seasonally adjusted, Original.

Defence industry performance measure

Data used to create the three defence industry performance indexes were taken directly from the Defence Materiel Organisation's Company ScoreCard database, with data relating only to the ten Australian defence industry firms used in the analysis.

Defence industry competition

Data used to calculate the measure of competition in the defence industry market were obtained from the Interim Defence Contract Register (IDCR). The IDCR is considered the statutory record for Department of Defence contract reporting and is the primary reporting tool used to comply with Australian Government Senate Order 192 relating to Departmental and Agency Contract reporting requirements. A download of the IDCR was obtained for all contracts over $1 million for the period 2001 to 2009, with the data restricted to the ten defence industry firms. It should be noted that the data obtained for 2001 and 2002 are likely to be incomplete, as the IDCR has only been used for official purposes since January 2003. In relation to the data obtained for 2003 to 2009, as at 15 April 2010, 96.16% of IDCR data had been reconciled with the Department of Defence financial management system (J. Maranan, personal email, 14:52 15 April, 2010).

Based on the analysis of these data, the overall coverage obtained was approximately 77% ($30 billion) by contract value of the total contracts reported in the IDCR relating to military equipment procurement and sustainment, comprising 1,078 individual contracts.

Chapter summary

This chapter provided information on how data were obtained and used in the various productivity measurement and distribution-related calculations. The measurement issues associated with the choice and manipulation of data were also discussed. Finally, the chapter provided the ABS sources of data used in the comparative analysis of the defence industry results against the various ABS-defined industries. The empirical results of the productivity and distribution measures are detailed in Chapters 6 and 7.


Chapter 6: Productivity measurement empirical results and analysis

This chapter presents the empirical results of the defence industry multifactor productivity (MFP) and labour productivity (LP) calculations. Included in this chapter are details of the data cleansing process carried out to remove inconsistent observations from the final indexes, as well as the testing undertaken to assess the robustness of the productivity calculations and results. Finally, a comparative analysis is undertaken of the defence industry results against data obtained from the ABS on the manufacturing and mining industries.

As the focus of the thesis is on the defence industry rather than individual firm results, a detailed presentation of the empirical results for the individual defence industry firms used to calculate the defence industry MFP and LP indexes is not included in this chapter and instead is available in Annex B.

Defence industry results

An Australian defence industry MFP index was created by aggregating the individual productivity calculations of the ten defence industry firms. This index was then compared against MFP indexes for several other Australian industries, as calculated by the ABS.

Chart 6.1 below shows the MFP index calculated for the Australian defence industry prior to any data cleansing activities to remove the effect of abnormal productivity measurement results. Due to the lack of firm observations in 1999 and 2000, the base year for the index was changed to 2001 to ensure that the overall defence industry index was representative of the industry as a whole rather than of one or two firms (see Annex A Table A.3 - Australian Defence Industry Sample Size and Scope).


Chart 6.1 ‐ Defence industry MFP index

(source: Defence industry MFP index calculated by author)

As observed in Chart 6.1, the defence industry MFP index indicates a 61.3% total increase over the period 2001 to 2009, which represents an annual increase of 6.2%. On further analysis of the observations used to create the defence industry index, two observations were considered outliers and subsequently removed: Thales 2004 and BAE 2009, for the following reasons.

Prior to 2008, Thales Underwater Systems Pty Ltd was not part of the consolidated Thales entity. From 2004 to 2008, Thales Underwater Systems Pty Ltd was available only for manual consolidation, before being officially included in the consolidated Thales entity in 2009. The manual consolidation of Thales Underwater Systems Pty Ltd in 2004 resulted in a calculated MFP change for 2004 of 3.76. As this result is not consistent with the other Thales MFP observations and is likely to be due primarily to the inclusion of Thales Underwater Systems Pty Ltd rather than to any change in Thales's productivity, it was deemed appropriate to remove this observation from the defence industry MFP calculation.

Further, the BAE MFP change calculated for 2009 was 1.88, which is considered an outlier relative to the other BAE MFP results calculated over the period. It is likely that this abnormal result is related to BAE's acquisition of 100% of the issued share capital of Tenix and its controlled entities on 27 June 2008. While it is possible that only a portion of the 2009 MFP calculation is related to BAE's acquisition, and that BAE did in fact experience some MFP growth during 2009, BAE's 2009 weighting in the defence industry index calculation is approximately 41%, so a conservative approach was adopted and the observation was removed.

The effect of the removal of the 2004 Thales and 2009 BAE MFP observations from the defence industry MFP calculation is displayed in Chart 6.2 below.

Chart 6.2 ‐ Defence industry MFP (Thales and BAE observations removed)

(source: Defence industry indexes calculated by author)

The removal of the 2004 Thales MFP observation reduced the measured increase in the 2004 defence industry MFP index from 23.3% to 11.6%, while the removal of the BAE 2009 observation resulted in the defence index decreasing by 13.1% in 2009, rather than increasing by 24.2% had it been included. These omissions resulted in the defence industry MFP index indicating a 2% total increase over the period 2001 to 2009, which represents an annual increase of 0.3%.
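The total and annual rates quoted in this chapter are linked by compound (geometric) averaging over the eight annual movements between 2001 and 2009; for example, the 61.3% total increase reported for the uncleansed index corresponds to roughly 6.2% per year. A minimal sketch:

```python
# Converting a total index change over n years into a compound annual rate.

def annualised(total_growth: float, years: int) -> float:
    """Compound annual rate implied by a total growth over `years` years."""
    return (1.0 + total_growth) ** (1.0 / years) - 1.0

print(round(annualised(0.613, 8) * 100, 1))  # 6.2 (per cent per year)
```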

In addition to the two observations detailed above, there were three other events that could have potentially affected the overall defence industry MFP result. Firstly, on 29 November 1999 ADI shares were acquired from the Commonwealth of Australia through a joint venture between Transfield Pty Ltd and Thomson-CSF. During 2000, post-acquisition, ADI engaged in significant restructuring activity that, combined with abnormal accounting losses on the sale, resulted in a one-off net loss of $123.8 million in 2000 (ADI 2000 p. 3). These abnormal items were the main factors contributing to the measured MFP change of 1.47 for the year, which is considered a potential outlier compared with the MFP results for the entire period.

Secondly, on 13 December 2002 Boeing Australia Ltd disposed of its investment in two significant subsidiary companies (Hawker de Havilland Holdings Pty Ltd and Aerospace Technologies of Australia Limited) to Boeing Australia Holdings Ltd. The abnormal items associated with the sale of these subsidiaries combined with the drop in employees in Boeing’s consolidated financial statements for 2002 resulted in input and output decreasing by 57% and 67%, respectively30.

Thirdly, on 16 October 2006 Thales acquired 100% of ADI, which resulted in significant changes in the output and input indexes during 2006 that are more likely a result of the consolidation than any fundamental change in productivity.

It was decided to retain these three observations in the overall calculation for the following reasons. Since the base year for the defence industry MFP calculation was changed from 1999 to 2001, as discussed previously in this chapter, the 1999 observation was automatically excluded from the results as out of scope. The Boeing 2002 and Thales 2006 observations were not considered abnormal relative to the other MFP observations calculated for those firms and had little influence on the overall defence MFP index for 2002 and 2006, so it was decided to leave them in the defence industry calculation and, if necessary, use these events in the analysis of movements in the defence industry indexes detailed later in this chapter.

Chart 6.3 below compares the Australian defence industry MFP index calculated by the author with the ABS Cat. No. 5260.0.55.002 Experimental Estimates of Industry Multifactor Productivity for the manufacturing and mining industries. Although neither the manufacturing nor the mining industry replicates exactly the output produced or the market environment in which the defence industry operates, they are potentially the closest comparators available at the

30 It should be noted that on 15 December 2002 Boeing Holdings Pty Limited acquired a 100% interest in Boeing Australia Limited. As Boeing Australia Limited is the main Boeing Group entity that predominantly undertakes defence-oriented business, the analysis was undertaken on Boeing Australia Limited rather than Boeing Holdings Pty Limited.

industry level. The manufacturing industry is similar in terms of output to the extent that defence industry firms are included in the aggregate output calculations for the sector as a whole, while the oligopolistic environment of the mining industry more closely represents the non‐competitive environment apparent in the defence industry.

Chart 6.3‐ MFP Comparison – Defence Industry vs. Manufacturing and Mining Industries

(source: Defence industry MFP index calculated by author; manufacturing and mining industries MFP indexes taken from ABS 5260.0.55.002 Experimental Estimates of Industry Multifactor Productivity, 2009-10)

As observed above in Chart 6.3, the defence industry MFP index indicates a 2% total increase over the period 2001 to 2009, which represents an annual increase of 0.3%. During the same period, the manufacturing industry experienced a 3.7% total decrease in MFP, while the mining industry experienced a 33.6% decrease in MFP. This represents an annual decrease in MFP for the manufacturing and mining industries of ‐0.5% and ‐5.0%, respectively.

As part of its experimental estimates of MFP, the ABS identified distinct productivity cycles, three of which fell within the period of analysis: 1999 to 2004, 2004 to 2008, and 2008 onwards (Australian Bureau of Statistics 2009). These are evident to a certain extent in the defence industry index. Table 6.1 below shows the annual defence industry MFP rate


compared to the ABS manufacturing and mining industries annual MFP rates during these periods31.

Table 6.1‐ MFP Cycle Analysis (Annual % Change) – Defence, Manufacturing and Mining industries

Period   Year            Defence industry   Manufacturing industry   Mining industry
                         (annual MFP)       (annual MFP)             (annual MFP)
1        2001 to 2004     9.4%               1.8%                    -4.7%
2        2004 to 2008    -2.7%              -0.9%                    -4.2%
3        2008 to 2009   -13.2%              -5.4%                    -8.9%
Total    2001 to 2009     0.3%              -0.5%                    -5.0%

(source: Defence industry MFP index calculated by author; manufacturing and mining industries MFP indexes taken from ABS 5260.0.55.002 Experimental Estimates of Industry Multifactor Productivity, 2009-10)

The annual MFP growth rate for both the defence and manufacturing industries is highest in period 1 and declines during periods 2 and 3. While this comparison indicates that the Australian defence industry does experience changes in productivity growth and decline similar to manufacturing, the magnitude of MFP growth and decline does differ significantly between the defence and manufacturing industries. During the period of growth (period 1), measured MFP for the defence industry was substantially higher than that of the manufacturing industry, while the decline in MFP during periods 2 and 3 was also greater in the defence industry than in the manufacturing industry.

With regard to the mining industry, the change in productivity between periods 2 and 3 was similar to that for the defence and manufacturing industries; however, the change between periods 1 and 2 is the opposite of that experienced in the defence and manufacturing industries. The defence and manufacturing industries both realised a negative change in MFP between these periods, whereas the mining industry experienced a positive change (i.e. the rate of MFP decline for mining decreased between periods 1 and 2). The defence and mining industries also appear more

31 As the defence industry index commences in 2001, the first period of analysis will also commence from 2001 rather than 1999.

volatile than the manufacturing industry, experiencing higher levels of MFP growth and decline during the three periods analysed.

The significant difference between the defence industry MFP and manufacturing industry MFP could be a result of the market environments in which the industries operate, or could be influenced by numerous factors not considered in the initial MFP calculations. While the methodology used to calculate a measure of defence industry MFP is based on the methodology used by the ABS in its MFP calculations, two aspects of the productivity calculation used in this research were modified in an attempt to customise the measurement specifically for the defence industry. These two customised variables relate to the choice of deflator indexes used and the decision to adjust the labour input data for changes in hours worked over the period. Another notable difference between the ABS MFP calculations and the defence industry MFP calculation is the significant difference in the shares of capital and labour used in the MFP calculation for each of the three industries (see Table 6.3).
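The growth-accounting relationship underlying both the ABS and the defence industry MFP calculations can be sketched as output growth less the factor-share-weighted growth of labour and capital inputs (illustrative numbers; the function and variable names are assumptions, not the author's code):

```python
def mfp_growth(output_growth, labour_growth, capital_growth, labour_share):
    """MFP growth as a residual: output growth less share-weighted input growth.
    Growth rates are period-to-period changes; factor shares sum to one."""
    capital_share = 1.0 - labour_share
    return output_growth - labour_share * labour_growth - capital_share * capital_growth

# Illustrative: 4% output growth, 2% labour growth, 5% capital growth,
# under a defence-like labour share of 0.88 (see Table 6.3).
print(round(mfp_growth(0.04, 0.02, 0.05, labour_share=0.88), 4))  # 0.0164
```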

Sensitivity testing was conducted to determine the effect of using different deflator indexes, as well as the decision to adjust labour data for changes in hours worked. The chart below shows the results of the sensitivity testing conducted on the defence MFP index in relation to the choice of deflator indexes used (i.e. Non‐Farm GDP implicit price deflator compared to the uniquely created individual firm indexes) and the adjustments made for changes in hours worked.


Chart 6.4 ‐ MFP Sensitivity Testing – Defence Industry

(source: Defence industry MFP indexes calculated by author)

As shown in Chart 6.4 above, the choice of deflator and the adjustment for hours worked resulted in the total change in the calculated defence industry MFP indexes ranging from ‐3.1% to 8.1% over the period. This equates to an annual growth rate of between ‐0.39% and 0.98%.

While the overall difference in the annual growth rate of the four indexes was 1.37% points during the period between 2003 and 2009, the indexes were found to diverge significantly in magnitude and, to a lesser extent, in direction. The cause of this difference is predominantly the choice of deflator used, with the original customised deflator indexes resulting in a more volatile MFP index than the Non‐Farm GDP implicit price deflator. This is not surprising considering the discussion in Chapter 4, where significant differences in the output Producer Price Indexes relating to aircraft and shipbuilding manufacturing and repair services were highlighted (see Chart 4.1).

Also evident in the four MFP indexes detailed in Chart 6.4 above is the rapid growth experienced by the defence industry between 2002 and 2003. This 21.4% growth in MFP during 2003 was due primarily to the two firms Tenix and ASC. If these observations were to be removed, the measured defence industry annual MFP growth for 2003 would be reduced from


an increase of 21.4% to an increase of 4.3%. However, as these two observations are not inconsistent with the MFP change results calculated for the entire period for Tenix and ASC (see Annex A Table A.8 ‐ Individual Firm MFP Changes Relative to Previous Observation (MFP, Output, and Input)), there is no justification to remove them from the overall defence industry MFP calculation.

The obvious question now becomes which MFP measure better represents the actual MFP experienced by the defence industry over the period. The use of uniquely created input and output deflator indexes, which attempted to account for the differences in output of the firms chosen, has a strong theoretical justification. Similarly, as the labour input was based on the number of employees and not hours worked, attempting to improve these data by adjusting for changes in average hours worked was also reasonable. However, as both these adjustments are judgement based, it may be worth heeding the advice of Kendrick (1973 Ch 2 p. 26): rather than introducing speculative elements into the MFP measurement, use this information as a possible explanatory variable.

The following chart and table introduce the defence industry MFP index measured using a Non‐Farm GDP implicit price deflator with no adjustment for hours worked into the comparative analysis with the manufacturing and mining industries' MFP indexes.


Chart 6.5 ‐ MFP Comparison V2 – Defence Industry (X2) vs. Manufacturing and Mining Industries

(source: Defence industry indexes calculated by author; manufacturing and mining industries MFP indexes taken from ABS 5260.0.55.002 Experimental Estimates of Industry Multifactor Productivity, 2009‐10)

Table 6.2 ‐ MFP Cycle Analysis V2 (Annual % Change) – Defence (X2), Manufacturing and Mining Industries

Period   Year            Defence industry     Defence industry   Manufacturing industry   Mining industry
                         (annual MFP)         (annual MFP)       (annual MFP)             (annual MFP)
                         Non‐Farm GDP/no      Original
                         labour adjustment
1        2001 to 2004    6.1%                 9.4%               1.8%                     ‐4.7%
2        2004 to 2008    ‐1.7%                ‐2.7%              ‐0.9%                    ‐4.2%
3        2008 to 2009    ‐7.8%                ‐13.2%             ‐5.4%                    ‐8.9%
Total    2001 to 2009    0.3%                 0.3%               ‐0.5%                    ‐5.0%

(source: Defence industry MFP indexes calculated by author; manufacturing and mining indexes taken from ABS 5260.0.55.002 Experimental Estimates of Industry Multifactor Productivity, 2009‐10)

Chart 6.5 and Table 6.2 indicate that despite the removal of the uniquely created firm deflators and the removal of the hours worked adjustment, there remains a significant difference


between the ABS‐created MFP indexes for manufacturing and mining and the measured defence industry MFP index, especially during 2002 and 2008. This difference, however, may be plausible considering that, in an analysis of 30 industry groups in the United States between 1948 and 1966, Kendrick (1973 Ch 1 p. 8) found a considerable degree of dispersion among industry groups over the period, with some industries increasing on average by over 8% per year, while during cyclic peaks several industries experienced annual surges in measured productivity growth of over 10% per year. This is also supported by the magnitude of change in the MFP calculated by the ABS for the manufacturing and mining industries, which experienced annual changes of up to 5.4% and 8.9%, respectively.

While the measured MFP for the defence industry may be plausible, the above discussion does not explain why the defence industry exhibits a substantially higher overall measured MFP growth rate during the period than the manufacturing and mining industries nor why the defence industry index appears more volatile than the manufacturing and, to a lesser extent, mining industries. In relation to the volatility evident in the defence industry MFP index, there are two likely reasons for this volatility occurring, both of which relate to the inherent nature of defence procurement discussed in Chapter 1.

Revenues earned by firms in the manufacturing industry are likely to be continuous between periods, with expenses for items matched to the revenue earned for those items. Defence industry firms, however, may experience periods in which there is a significant time lag between when expenses are incurred for a product and when payment is received. This difference is also exacerbated by the magnitude of individual payments for individual firms, as well as the relatively small sample size used to create the defence industry MFP index. For example, an individual firm may have a material effect on the overall defence industry MFP index result if it experienced a significant mismatch between its revenue earned and expenses incurred for a product over two or more periods (an example of this is likely to have occurred in the MFP results for CAE in 2001).
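The timing-mismatch effect can be illustrated with a hypothetical two-period example: the same total revenue, measured against constant inputs, produces a large apparent productivity swing when payment lags the expense (all figures invented for illustration):

```python
# Hypothetical firm with constant input costs; a delayed milestone payment
# shifts revenue from period 1 to period 2 without any real change in output.
inputs = [100, 100]
revenue_matched = [110, 110]   # revenue recognised when earned
revenue_lagged = [60, 160]     # same total revenue, payment received a period late

for label, revenue in [("matched", revenue_matched), ("lagged", revenue_lagged)]:
    productivity = [r / i for r, i in zip(revenue, inputs)]
    change = productivity[1] / productivity[0] - 1
    print(f"{label}: {change:+.0%}")
# matched: +0%   (no measured productivity change)
# lagged: +167%  (an apparent surge caused purely by payment timing)
```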

Another possible reason for the high volatility evident in the defence industry MFP index may relate to possible labour and capital hoarding. This practice commonly involves firms operating at under capacity during temporary periods of low production rather than permanently changing workforce or capital levels. This factor may also provide some insight into why the defence productivity indexes appear more volatile than manufacturing in the short term (especially when compared to the original defence industry LP index based on the unique firm‐specific deflators). As discussed in Chapter 1, the ad hoc nature of defence military procurement, which usually involves relatively short production runs, could create a situation where defence firms engage in capital and labour hoarding activities during periods where contracts are ending or in anticipation of new contracts being awarded. When activity increases, apparent output would naturally increase, whereas inputs would remain relatively unchanged until the additional capacity is absorbed by existing resources. Given the way productivity is measured (i.e. using revenue and expense line items in the firms' Financial Statements), if such hoarding is not accounted for then short‐term productivity results will show periods of rapid growth as projects commence and decline as projects are wound down and completed. However, long‐term averages would be less affected and would better reflect MFP changes.

Differences in measured MFP between the three industries are also likely to have arisen due to the different effects of macroeconomic factors, as evident in the decline in MFP experienced in 2009 by all three industries after the global recession of 2008.

Despite these factors, the most influential single factor that may explain the variation between industries is the difference between shares of capital and labour used in the MFP calculation for each industry. The table below compares the average labour and capital shares for the defence industry to that provided by the ABS for the manufacturing and mining industries.

Table 6.3 ‐ Australian Defence Industry, Manufacturing, Mining ‐ Labour and Capital Shares

Year    Defence          Manufacturing    Mining
        L       K        L       K        L       K
1999    90%     10%      53%     47%      29%     71%
2000    78%     22%      54%     46%      27%     73%
2001    81%     19%      54%     46%      23%     77%
2002    82%     18%      54%     46%      21%     79%
2003    85%     15%      53%     47%      23%     77%
2004    85%     15%      52%     48%      25%     75%
2005    87%     13%      51%     49%      23%     77%
2006    87%     13%      52%     48%      19%     81%
2007    88%     12%      53%     47%      18%     82%
2008    89%     11%      53%     47%      19%     81%
2009    88%     12%      53%     47%      19%     81%

(source: Defence industry shares calculated by author; manufacturing and mining shares taken from ABS 5260.0.55.002 Experimental Estimates of Industry Multifactor Productivity, 2009‐10)


As seen in Table 6.3 above, the Australian defence industry exhibits a significantly higher share of labour than both the manufacturing and mining industries. There are several reasons for this. Firstly, the research found that a significant portion of the total revenue earned by Australian defence firms related to the provision of services, with the remainder associated with the actual sale of manufactured goods. Secondly, the manufacturing of Australian defence industry goods is generally associated with low‐volume, highly sophisticated items (e.g. Collins Class submarines, the Air Warfare Destroyer), which rely predominantly on a large number of highly skilled, highly paid employees rather than the capital‐intensive manufacturing plants and lower‐paid employees commonly used by non‐military goods manufacturers.

Due to the high labour shares used in the individual firms’ MFP calculations, the Australian defence industry MFP index created may also be considered similar to an Australian defence industry LP index. Conversely, the mining industry has a relatively low labour share and high capital share, which would cause the measured MFP to be more heavily influenced by changes in capital productivity than labour. The manufacturing industry has a relatively equal labour and capital share and therefore both capital and labour productivity have a similar effect on the MFP calculated.
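That the high labour share makes the defence MFP index track labour productivity can be checked with the share-weighting arithmetic, applying each industry's approximate shares from Table 6.3 to one identical set of illustrative growth rates (the growth rates themselves are invented for the demonstration):

```python
def mfp(output_growth, labour_growth, capital_growth, labour_share):
    # Growth-accounting residual with shares summing to one.
    return output_growth - labour_share * labour_growth - (1 - labour_share) * capital_growth

# One illustrative set of growth rates applied under each industry's shares.
output_g, labour_g, capital_g = 0.03, 0.01, 0.06
labour_productivity = output_g - labour_g    # 0.02
capital_productivity = output_g - capital_g  # -0.03

for industry, share in [("defence", 0.88), ("manufacturing", 0.53), ("mining", 0.19)]:
    print(industry, round(mfp(output_g, labour_g, capital_g, share), 4))
# defence        0.014   (pulled towards labour productivity, 0.02)
# manufacturing -0.0035  (roughly midway between the two)
# mining        -0.0205  (pulled towards capital productivity, -0.03)
```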

Due to these differences in labour and capital shares and the resultant effect of each factor component on the overall MFP calculation for each industry, a direct comparison of MFP changes between the defence, manufacturing and mining industries may not be entirely appropriate. To test this theory, the LP index calculated for the defence industry was compared to the LP indexes for the manufacturing and mining industries, as seen in Chart 6.6 below. In addition to the original LP index calculated for the defence industry, the chart also includes a revised defence industry LP index, which is based on a Non‐Farm GDP implicit price deflator and is not adjusted for changes in hours worked over the period. This revised defence industry index was the result of sensitivity testing similar to that conducted on the defence industry MFP result. Annex A Chart A.1 ‐ Australian Defence Industry Labour Productivity Index – Sensitivity Testing Results provides the full results of the defence industry LP sensitivity testing.


Chart 6.6 ‐ LP Comparison – Defence (X2), Manufacturing and Mining Industries

(source: Defence industry LP indexes calculated by author; manufacturing and mining industries LP indexes taken from ABS 5260.0.55.002 Experimental Estimates of Industry Multifactor Productivity, 2009‐10)

Table 6.4 below compares the annual defence industry LP indexes with the manufacturing and mining industries LP indexes using the ABS productivity cycles discussed previously in this chapter.


Table 6.4 ‐ LP Cycle Analysis (Annual % Change) – Defence (X2), Manufacturing and Mining Industries

Period   Year            Defence industry     Defence industry   Manufacturing industry   Mining industry
                         (annual LP)          (annual LP)        (annual LP)              (annual LP)
                         Non‐Farm GDP/no      Original
                         labour adjustment
1        2001 to 2004    6.3%                 9.8%               3.4%                     ‐6.7%
2        2004 to 2008    ‐1.2%                ‐2.2%              1.0%                     ‐5.2%
3        2008 to 2009    ‐5.6%                ‐10.9%             ‐1.7%                    ‐11.3%
Total    2001 to 2009    1.0%                 1.0%               1.6%                     ‐6.5%

(source: Defence industry LP indexes calculated by author; manufacturing and mining indexes taken from ABS 5260.0.55.002 Experimental Estimates of Industry Multifactor Productivity, 2009‐10)

Over the entire period, both the original and the Non‐Farm GDP implicit price deflator / no labour adjustment defence industry LP indexes increased at an annual rate of 1.0%, which was 0.6% points lower than the 1.6% annual LP change calculated by the ABS for the manufacturing industry. These LP results for the defence and manufacturing industries represent a higher annual average growth rate than was calculated for MFP, indicating that in both industries the capital component had a negative effect on the overall MFP calculation.

These LP results are somewhat expected considering the inherent nature of defence procurement and the defence market previously discussed in Chapter 1. Demand for Australian defence military equipment by the Australian Department of Defence generally involves small batches of specialised equipment with long fielding intervals between successive generations of weapon platforms and systems. These procurement practices combined with a non‐competitive marketplace in which the Department of Defence procurement occurs do not generally provide an environment conducive to productivity growth.

With regard to the results presented for the mining industry, the LP index trend shown is substantially different from that which occurred in the defence and manufacturing industries. It is therefore doubtful that any meaningful comparison is possible, and as such the mining industry LP will not be considered in the overall LP analysis. In addition, based on the premise discussed previously that it is better to use speculative factors to explain the results rather than to introduce them into the measurement itself, the comparisons between LP and productivity benefit distribution measures detailed in Chapter 7 onwards are based on the Non‐Farm GDP implicit price deflator / no labour adjustment defence industry LP results compared to the manufacturing industry LP.

Chapter summary

This chapter presented the MFP and LP results calculated for the Australian defence industry, with the results showing that over the period from 2001 to 2009 the annual MFP and LP growth rates were 0.3% and 1.0%, respectively. The chapter also provided a comparative analysis of the defence industry productivity results against those calculated for the manufacturing and mining industries by the ABS. The comparative analysis showed that based on MFP calculations the defence industry had the highest annual growth rate, while based on LP calculations the manufacturing industry had the highest annual growth rate. The comparative analysis also revealed that, due to the differences in the shares of labour and capital inherent in each industry, the use of LP as a measure of comparison between the defence and manufacturing industries is more appropriate than the use of an MFP measure.


Chapter 7 : Productivity benefits distribution empirical results and analysis

This chapter presents the empirical results of the measures used to determine how the benefits of defence industry productivity growth were being distributed. The first measure presented relates to the profitability experienced by the defence industry as a whole, with individual defence industry profit results presented in Annex D. The second measure is concerned with changes in the wages of defence industry employees during the period, while the third and fourth measures relate to defence industry performance and the level of competition existing in the Australian defence market.

In addition to presenting the actual empirical results for these measures, a comparative analysis has been undertaken for each measure. Where possible, this comparative analysis has been conducted against similar measures calculated for several ABS‐defined industries, as well as analysing changes between the four defence industry distribution measures. The comparative analysis also includes a comparison of changes in the measures presented in this chapter with the LP measure calculated for the defence industry, which was presented previously in Chapter 6.

Empirical results and analysis – profitability

To enable a comparison at the industry level, the individual firm results (see Annex D for individual firm profit results) were aggregated to the defence industry level. Chart 7.1 below presents the calculated defence accounting (EBDIT) and economic profit margins compared to those calculated for the manufacturing industry.


Chart 7.1 ‐ Defence Industry Annual Accounting (EBDIT) and Economic Margin

(source: Defence industry economic and accounting profit margins calculated by author; manufacturing accounting profit margins calculated by author based on ABS data)

The average annual profit margin for the defence industry between 2001 and 2009 was calculated at 10.93%, which is 1.08% points higher than the manufacturing industry average annual profit margin of 9.85%. While there is no direct comparison available with the manufacturing industry, the average annual economic profit margin calculated for the defence industry was 2.18%, indicating that the defence industry as a whole earned monopolistic profits over the period. The higher‐than‐average profit margin experienced by the defence industry and the positive annual economic profit margin were not unexpected, considering the low level of competition in a monopolistic market such as the defence market. What was unexpected, however, was the sudden decrease in profit margin from 2006 onwards. The exact reason for this decrease is not known, but several factors may have influenced it, as follows.

Firstly, the reduction in profit from 2006 onwards may be the result of a fundamental shift in the relative scarcity of labour, as a result of a drop in the Australian unemployment rate from 6.9% in 2001 to 4.7% in 2006 (ABS 6202.0 Labour Force, Australia Table 01. Labour force status by Sex – Trend), coupled with the Australian mining boom from 2001 to 2008, which also increased the demand for employees whose skill set is required by both the mining and


defence industries (Nicholson and Dodds 2011). This increase in demand for labour is likely to have increased the overall wages expense for the defence industry, which would have a negative effect on profits.32 The extent to which changes in available labour can adversely affect firm profits was evident in the “profit squeeze” that occurred in the US and Germany in the 1950s (Solow 1963 p. 94), where profits in the US fell by as much as 45% (Malloy and Post 1999) as a result of a significant decrease in the availability of suitably qualified labour.

Secondly, in 2003, 2004 and 2005, final payments for the completion of a number of major procurement projects may have artificially inflated profits prior to 2006. For large defence projects, the majority of the profits earned by a firm for a project (as measured by net cash flow) are generally obtained in the latter half of the lifecycle of the project (Schmidt 1992 pp. 20–22). A number of significant defence projects were completed during this period, including Tenix's ANZAC Ship project, which had a substantial effect on the overall defence industry profitability during these three years. As indicated in the 2005 Tenix Annual Report (Tenix Defence 2005 pp. 1–2), while the whole‐of‐project profit margin for the ANZAC Ship project was around 10%, a significant amount of this profit was realised in the final years leading up to the completion of the project in 2006, as evident in Tenix's accounting profit margins for 2003, 2004 and 2005 of 15.8%, 15.5% and 25.9%, respectively (see Annex D Chart D.9 ‐ Tenix Profit Margin). In the aggregation of the firm profitability results to create a defence industry profitability measure, these three Tenix accounting profit margin results alone directly account for over one quarter of the defence industry profitability results for 2003, 2004 and 2005. Other major programs completed during this period that are also likely to have inflated the overall defence industry profitability include the High Frequency Modernisation Program Phase 1 and the Airborne Early Warning and Control – Wedgetail Modifications.

Thirdly, another significant event that had an adverse effect on profitability calculations post 2005 was the merger of Thales and ADI. Historically, these two firms alone have accounted for approximately 25% of the defence industry market.33 Prior to the acquisition, the average profit margins for ADI and Thales between 2001 and 2005 were calculated as 13.15% and 22.98%, respectively, while after the ADI acquisition the average profit margin for the consolidated Thales/ADI entity was 7.74% (see Annex D). This decrease alone accounted for approximately half of the decrease in profitability calculated in 2006 compared to 2005.

32 This idea is discussed in detail in the presentation of the wages empirical results later in this chapter.
33 Based on value added (see Annex A Table A.10 ‐ Australian Defence Industry Productivity Index – Company Weights (Pre‐Data Cleanse)).

Finally, the decline in productivity in 2005 (see Table 6.3) may have had delayed adverse effects on profits in 2006. The relationship between productivity and profitability is discussed below.

Profit vs. productivity comparison

An indication of whether or not the higher average level of profit experienced by the defence industry in comparison to the manufacturing industry is reasonable may be obtained by considering the change in profits compared to the change in defence industry productivity. If the manufacturing industry's profit and productivity results are used as a baseline that is considered equitable in a competitive market, then the comparison of the defence industry profit and productivity results may provide an indication of the level of distribution of productivity benefits between the defence industry and the Department of Defence in its role of sole customer. As the productivity and profit margin results cannot be compared directly because they are fundamentally two different measures (i.e. productivity is a measure of change between two periods, whereas the profit margin is a measure for one particular period), the change between actual annual constant accounting profits was calculated for each firm (refer Annex A Table A.12 ‐ Firm Profit Results (Annual Changes EBDIT relative to previous years)) and then aggregated to obtain an annual change in defence industry profits for the period 2002 to 2009.34 Changes in the average annual economic profit figures for each firm were not calculated, as there are no comparative figures available for the ABS manufacturing industry. The results are presented in Table 7.1 below.

Table 7.1 ‐ Defence and Manufacturing Industry Comparison ‐ Profit vs. Labour Productivity

Period          Defence industry                                Manufacturing industry
(1)             change in       change     profit/              change in       change     profit/
                annual          in LP      productivity         annual          in LP      productivity
                accounting      (3)        ratio (4)            accounting      (6)        ratio (7)
                profit (2)                                      profit (5)
2002 to 2004    11.51%          9.45%      1.2                  2.10%           2.54%      0.8
2004 to 2008    ‐13.25%         ‐1.21%     10.8                 1.96%           1.00%      1.9
2008 to 2009    15.54%          ‐5.57%     ‐2.7                 ‐12.16%         ‐1.70%     7.1
2002 to 2009    ‐2.91%          1.07%      ‐2.7                 ‐0.15%          1.05%      ‐0.1

In Table 7.1 above, the periods (column 1) are based on the ABS‐identified productivity cycles previously discussed in Chapter 6. Columns 2 and 5 are the average annual change in total accounting (EBDIT) profits for the defence and manufacturing industries, while columns 3 and 6 are the average annual change in LP rates that occurred during each period. Columns 4 and 7 are the calculated profit/productivity ratios indicating the percentage of profit realised for every 1% change in LP generated by each industry.
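The ratio in columns 4 and 7 is therefore just the annual profit change divided by the annual LP change, e.g. for the 2002 to 2004 period (a reconstruction of the table arithmetic; small differences from the published figures reflect rounding of the inputs):

```python
def profit_productivity_ratio(profit_change_pct, lp_change_pct):
    """Percent change in accounting profit per 1% change in labour productivity."""
    return profit_change_pct / lp_change_pct

# Defence industry, 2002 to 2004 (columns 2-4).
print(round(profit_productivity_ratio(11.51, 9.45), 1))  # 1.2
# Manufacturing industry, same period (columns 5-7).
print(round(profit_productivity_ratio(2.10, 2.54), 1))   # 0.8
```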

34 Due to the lack of ABS EBDIT profit data available for the manufacturing industry, the period of analysis of the profit and productivity results commenced in 2002 rather than 2001.

Between 2002 and 2004, during the period of rapid LP growth, an annual 1% increase in LP generated by the defence industry equated to an increase in annual profit by 1.2%. During the same period, for every 1% of LP generated annually by the manufacturing industry, the annual increase in profit was 0.8%. This equates to an annual difference of an additional 0.4% points in profit for every 1% LP generated in favour of the defence industry for the period.

During the period from 2004 to 2008, a 1% annual decrease in LP generated by the defence industry equated to an annual decrease in profits of 10.8%. During the same period, for every 1% of LP generated annually by the manufacturing industry, the profits increased annually by 1.9%. This equates to an annual difference of 12.7% points in profits for every 1% LP generated in favour of the manufacturing industry.

During the period between 2008 and 2009, for every 1% decline in LP, defence industry profits increased by 2.7%, whereas manufacturing decreased by 7.1% for every 1% decline in LP.

Over the entire period from 2002 to 2009, the average annual profit for the defence industry decreased by 2.7% for every 1% increase in LP, while in comparison the manufacturing industry average annual profits decreased by only 0.1% for every 1% annual increase in LP. These results highlight that, relative to the LP growth experienced by both industries, the manufacturing industry outperformed the defence industry, with a 1% annual increase in LP corresponding to an additional 2.6% points of annual profit relative to the defence industry.

In the context of the distribution of productivity benefits, with all other factors excluded, the above results indicate that during the period of analysis defence industry firms retained a below‐average portion of productivity benefits in the form of profits, if the manufacturing industry results are considered as the standard. The results also show a certain level of volatility during the period, indicating that the level of apportionment of benefits is also likely to vary, as evident in the drop in defence industry profits between 2004 and 2008.

Despite these initial findings, the results also present the possibility that the period of analysis may be too restricted for an appropriate comparison to occur, as seen in Chart 7.2 below, where the actual profits earned by the defence and manufacturing industries are presented in index form.


Chart 7.2 ‐ Australian Defence and Manufacturing Industries Profit Indexes (Based on Current Profits)

(source: Defence industry profit index calculated by author; Manufacturing industry profit index calculated by author using data obtained from ABS catalogues 8155.0–Australian Industry 2008‐09, 8155.0–Australian Industry 2005‐06 and 8155.0–Australian Industry 2002‐04.)

As discussed previously in this chapter, the majority of profits earned by a defence firm are commonly realised in the latter half of the lifecycle of a project (Schmidt 1992 pp. 20–22). As a number of major Australian defence projects, such as the ANZAC Ship project, were completed in 2004 and 2005, the effect on overall defence industry profits was a rapid increase in profits earned, as evident in Chart 7.2. Due to this sudden influx of profits as major projects were completed, between 2001 and 2005 the defence industry significantly outperformed the manufacturing industry. From 2005 to 2007, the defence industry profit level decreased, which would be expected as several major projects were completed and new projects negotiated. From 2007 to 2009 the defence industry profits index increased, and it is likely to continue to do so as several major defence projects, such as the Air Warfare Destroyer and Landing Helicopter Dock ships, both of which commenced in 2007 (IDCR 2010), transition from the advanced development phase to the production phase. As the production phase for these new defence projects matures, the rate at which defence industry firms realise profits is also likely to increase (Schmidt 1992 pp. 20–22) and, based on the trend evident between 2002 and 2005, the defence industry will potentially outperform the manufacturing industry once again.

Considering the above discussion, it is likely that the comparative profit analysis undertaken between the defence and manufacturing industries from 2002 to 2009 may not show the long‐term trend, since the period of analysis is unlikely to have captured an entire defence industry production cycle. Had this occurred, it is possible that the results may have led to a different conclusion relating to the relative performance of the two industries.


Empirical results and analysis – employee wages

Changes in the wages of defence industry employees during the period have been calculated to provide an indication of the productivity benefits obtained by defence industry employees. To provide some meaning to the results, the calculated wage increases were compared to increases in the manufacturing, professional scientific and technical services, and mining industries.

The ABS manufacturing index was chosen, as it would give an indication of what the normal wage growth should be for a typical manufacturing firm. The professional scientific and technical services wage index was also included, as it is likely to more closely represent the actual workforce composition of a defence industry firm, while the mining wage index was included due to the high level of comparability between the skill sets required for the defence and mining industries, as evidenced by recent claims that skilled defence workers are “switching over to the mining industry” (Nicholson and Dodds 2011). Any wage growth above these benchmarks that is not explained by differences in productivity growth could be due to defence employees receiving an above‐average share of their firms' productivity benefits.

Chart 7.3 below is the comparison of the calculated defence industry wage index against three ABS labour price indexes for the manufacturing, professional scientific and technical services, and mining industries.


Chart 7.3 ‐ Industry Comparison – Employee Compensation Growth – 1999 to 2009

(source: defence industry wage index calculated by author; manufacturing, professional scientific and technical services, and mining industries wage indexes calculated by author based on ABS data)

As observed in Chart 7.3 above, defence industry employee wages increased by 5.53% per annum, with the defence industry three‐year moving average wage index increasing at 5.08% per annum. The annual increases for the manufacturing, professional scientific and technical services, and mining industries were 4.56%, 3.61% and 2.17%, respectively.
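The per-annum figures above are compound annual growth rates derived from index endpoints. The sketch below illustrates the calculation only; the index values are hypothetical placeholders chosen to land near the defence wage result, not thesis data.

```python
# Hypothetical illustration of a compound annual growth rate derived
# from two index observations (values are illustrative, not thesis data).
def annual_growth(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two index observations."""
    return (end / start) ** (1 / years) - 1

# An index rising from 100 to about 171.3 over ten years implies
# roughly 5.53% growth per annum, comparable in scale to the defence
# industry wage result reported above.
rate = annual_growth(100.0, 171.3, 10)
print(f"{rate:.2%}")
```

The same calculation applies to each of the industry wage indexes compared in Chart 7.3.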

Wage vs. productivity comparison

A comparison of employee wages in the defence, manufacturing, professional scientific and technical services, and mining industries is presented in Table 7.2 below. Similar to the industry comparison presented in the profitability analysis earlier in this chapter, a ratio of annual wage growth to annual productivity growth has been included in an attempt to provide evidence of whether defence industry employees are likely to be sharing in the benefits of defence industry productivity growth.


Table 7.2 ‐ Industry Comparison – Employee Wages and Productivity (2001‐2009)

                                               Annual wage   Annual LP   Ratio (wage growth
                                               growth        growth      / LP growth)

Defence industry                               5.53%         1.0%        5.53
Defence industry ‐ moving average (3 yr)       5.08%         1.0%        5.08
Manufacturing industry                         4.56%         1.55%       2.94
Professional, scientific and technical
  services industry                            3.61%         n.a.        n.a.
Mining industry                                2.17%         ‐6.52%      ‐0.33

Based on the wage/LP ratio in Table 7.2 above (column 4), it appears that defence industry employees experience higher wage growth relative to generated productivity than manufacturing industry employees. This may indicate that, relative to the LP generated, defence industry employees gain an above-average share of the productivity benefits when compared to the manufacturing norm. However, the fact that mining industry employees received a 2.17% average annual wage increase for the period despite mining industry LP declining by 6.52% per annum introduces the possibility that changes in industry productivity may not be the main influence on the wage growth rates experienced by employees, and that other factors such as unemployment levels and job specialisation may have a more dominant effect.
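The wage/LP ratio in column 4 of Table 7.2 is simply annual wage growth divided by annual LP growth. A minimal sketch reproducing the table's figures:

```python
# Wage/LP ratio: annual wage growth divided by annual labour
# productivity (LP) growth, using the figures from Table 7.2.
industries = {
    "Defence industry": (5.53, 1.0),       # (wage growth %, LP growth %)
    "Manufacturing industry": (4.56, 1.55),
    "Mining industry": (2.17, -6.52),
}
for name, (wage_growth, lp_growth) in industries.items():
    ratio = wage_growth / lp_growth
    print(f"{name}: {ratio:.2f}")
```

A ratio above 1 means wages grew faster than productivity; the negative mining ratio reflects wages rising while productivity fell.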

Between 2001 and 2008, the official Australian unemployment rate dropped steadily from 6.9% to 4.2% (ABS 6202.0 Labour Force, Australia, Table 01. Labour force status by Sex – Trend). This drop in unemployment alone would have a positive effect on wage increases, as there is increased demand for employees. Defence industry employees would also be considered more specialised than the average manufacturing industry employee in the sense that they are not only highly qualified professionals in their respective fields, but they also tend to be experts in very sophisticated and expensive weapon platforms and systems. The fewer people there are with this level of expertise, the higher the remuneration bargaining power they are likely to have. This demand is further enhanced if the employees in demand are highly specialised in certain fields, like those required in the defence and mining industries. The increased competition between the defence and mining industries as a result of the mining “boom” during the last decade is likely to have been a key factor in the rapid increase in the total compensation paid to defence industry employees (see Annex A Table A.13 ‐ Average Annual Employee Compensation (Current Values)) as both industries compete for suitable employees (Nicholson and Dodds 2011).

In addition to wage rates being inflated by a shortage of skilled labour, the ability of employees to demand higher wages potentially increases as the overall level of market competition within the defence industry market decreases35. Hirsch and Addison (1986) proposed that employee unions have a relatively easier time obtaining higher wages for employees if the firm is operating in an industry with little or no competition, such as the Australian defence industry, and is therefore potentially able to earn monopolistic profits. Regardless of why the defence industry has experienced a higher level of annual wage growth, Hirsch (1991) points out that higher wages should also have a negative effect on profitability unless the increasing wage cost can be offset by an increase in productivity or by passing increasing wage costs on to the consumer.

The following chart shows a comparison of annual percentage changes in defence industry LP, accounting profits36 and wages.

35 Discussed previously in Chapter 3.
36 In 2002 the change in defence industry accounting profit was 126%. This is not obvious in Chart 7.4 due to the profit percentage axis maximum being set at 30% to allow a better scale for the other observations.

Chart 7.4 ‐ Defence Industry Comparison – Productivity, Profitability, Wages

(source: Defence industry LP, accounting profit and wage changes calculated by author)

While there is no obvious relationship between changes in LP, profitability and wages on an annual basis, when analysed over the two periods 2002 to 2005 and 2005 to 2009 there are some interesting results for these variables, as shown in Table 7.3 below.

Table 7.3 ‐ Defence Industry Comparison – Average Annual Change (Productivity, Profitability, Wages)

Period        LP       EBDIT Profit   Wages

2002‐2005     4.4%     12.5%          6.1%
2005‐2009     ‐1.4%    ‐13.0%         5.0%

The results presented in Table 7.3 above are the average annual percentage changes for defence industry LP, accounting profit and wages. Considering Hirsch’s (1991) statement, hypothetically, if wage increases between 2001 and 2005 were paid for by increased productivity rather than profits, then the drop in profitability between 2005 and 2009 would be somewhat expected, noting the significant drop in average annual LP growth while wage growth remained relatively constant. However, it is possible that the decrease in profitability does not cover the entire cost of maintaining the defence industry wage growth rate and that a portion of the wage cost is being paid for indirectly by the Department of Defence, in its role as the sole consumer, through higher supply costs. As movements in the actual cost of Department of Defence procurements will not be calculated, this point cannot be verified. However, insight may be provided later in this chapter when the performance of defence firms is analysed, as cost performance is a major factor.

Overall, the wage analysis results do provide evidence to indicate that defence industry employees are likely to be obtaining a portion of the benefits generated by productivity gains in the defence industry, although the extent is somewhat unknown. The results also indicate that the other factors discussed briefly above may have a greater effect on changes in the defence industry wage rate than changes in productivity performance. Finally, the results raise the possibility that the source of the benefits received by employees (i.e. from firm profit or as a cost passed on to the Department of Defence) changes over the period, the reasons for which will become more apparent later in this thesis.


Empirical results and analysis ‐ defence industry performance

While an analysis of changes in defence industry profit and defence industry employee wages may provide an indication of the level of productivity-generated benefits being distributed to the defence industry and its employees, these measures do not provide information on the distribution of productivity-generated benefits being transferred to the Department of Defence. To determine the distribution of productivity benefits to the Department of Defence, an analysis of changes in the quality of services, more commonly referred to by the Department of Defence as the level of contractor performance, was conducted.

Three performance indexes were calculated for the defence industry based on the defence industry’s overall performance, cost performance and schedule performance, as detailed in Chart 7.5 below.

Chart 7.5 ‐ Defence Industry Scorecard Performance Index

(source: Defence industry performance indexes calculated by author)

As seen in Chart 7.5, over the entire period the three performance indexes differ in the magnitude of annual change but appear to trend in the same direction, with the overall and schedule performance indexes increasing by 5.0% and 5.1% per annum over the period, respectively, while the cost performance index increased by 2.0% per annum. Over the entire period of analysis all three performance indexes increased substantially, but it is noted that for all three indexes the majority of this growth was experienced between 2001 and 2002, with the overall, schedule and cost performance indexes increasing by 31.4%, 42.8% and 16.1%, respectively, during this period.

The low performance scores calculated in 2001 for all three indexes were influenced primarily by the performance scores allocated to three defence industry firms regarding three major projects in which these firms were involved. Two of the projects were the Collins Class Submarine build and upgrade projects with the third project being the Bushranger Project. The performance ratings associated with the Collins Class Submarine alone accounted for over one third of the growth evident between 2001 and 2002. While these three firms were the most influential overall, all ten defence firms included in the defence industry sample experienced substantial improvement in their performance scores between 2001 and 2002.

It is unlikely that the entire growth experienced between 2001 and 2002 is due solely to a radical improvement in the firms’ performances. From its inception in 1999 until 2001, the Company ScoreCard program was still considered to be evolving and maturing (D. Jones, personal email, 11:10 May 18, 2011). As a result, it is likely that some of the initial ScoreCard scores were inconsistent between periods, as DMO expectations of defence industry performance levels changed and matured. The influence of this maturing process on the performance results appears to have tapered off around 2002, as evident in the comparison of the performance results with defence industry productivity detailed below.

Performance vs. productivity comparison

To determine the potential effect of changes in defence industry productivity on the performance of services the defence industry delivers to the Department of Defence, the changes in performance displayed in Chart 7.5 above are compared with the changes in LP performance. Chart 7.6 below provides a graphical comparison of the three performance indexes and the defence industry LP index over the period, while Table 7.4 details the changes in the three performance indexes compared to changes in LP generated by the defence industry during the three ABS productivity cycles detailed previously in Chapter 6.


Chart 7.6 ‐ Defence Industry Comparison – Performance (overall, schedule, cost) vs. Productivity

(source: Defence industry performance and LP indexes calculated by author)

Table 7.4 ‐ Defence Industry Comparison ‐ Performance vs. Labour Productivity

Period               Overall        Schedule       Cost           Defence
                     Performance    Performance    Performance    Industry LP

(1) 2001 to 2004     11.3%          16.3%          6.9%           6.3%
(2) 2004 to 2008     ‐0.1%          ‐2.8%          ‐1.5%          ‐1.2%
(3) 2008 to 2009     7.7%           6.0%           1.9%           ‐5.6%
2001 to 2009         5.0%           5.1%           2.0%           1.0%

As seen in both Chart 7.6 and Table 7.4 above, there are similarities between movements in defence industry LP and movements in the three performance indexes. During period 1, a high annual LP growth rate coincides with a large increase in the overall, schedule and cost performance rates. In period 2, a decline in the annual LP growth rate also coincides with a decline in the three performance indexes. In period 3, however, despite the similarities in index movements exhibited during periods 1 and 2, a decline in defence industry productivity coincides with a substantial increase in the three performance results. An exact reason for why the indexes diverge during the period 2008 to 2009 is not entirely obvious but may be related to the exclusion of the 2009 LP observation of BAE.

A second defence industry LP index labelled Defence industry – labour productivity (BAE 2009 inclusion) has been included in Chart 7.6 to indicate what the defence LP index would have been had the 2009 BAE LP observation been included. Rather than a decrease in LP in 2009, the inclusion of the BAE observation would have resulted in a significant increase in LP. While the change in productivity calculated for BAE in 2009 was considered improbable, the fact remains that even a slight increase in BAE’s productivity growth during 2009 would have been sufficient to result in a positive LP growth calculation for the defence industry for 2009. If this were the case, evidence for a positive relationship existing between defence industry LP and the overall defence industry performance would be strengthened.

In addition to the overall defence industry performance results detailed above, the Department of Defence is also interested in two specific aspects of the overall performance measure relating to the schedule and cost aspects of defence industry performance also included in Chart 7.6 above. While there are evident similarities in the direction of changes associated with the three defence industry performance indexes and the defence industry LP index, one particular difference is the magnitude of change between the overall and schedule performance indexes compared to the magnitude of change in the cost performance index. The difference in the performance indexes may be related to the type of contract used in the Department of Defence’s procurement process.

The three main types of contracts used by the Department of Defence in its procurement of goods and services are as follows:

1. Fixed-price contracts (also referred to as firm-price contracts) are contracts that have a firm base for labour costs and materials. The definition of a fixed-price contract also encompasses variable-price contracts that have provisions for price variation or other mechanisms over the period of the contract;

2. Time and Materials contracts (also referred to as variable-price contracts) are contracts where the actual labour hours incurred are claimed at an agreed fully burdened rate and actual materials are charged as incurred. The contract may contain a negotiated mark-up and a price cap or a “Not to Exceed” price. Price variation may also be applied to the labour wage rate and material costs if the contract is long term; and

3. Combination contracts, which have both fixed and variable elements.

Based on a survey of 303 major contracts entered into by the Department of Defence between July 2009 and January 2010 relating to military equipment procurement and sustainment (C. Hilder, personal email, 16:59 15 February, 2010), it was found that 79% (240 contracts) were categorised as fixed-price contracts, 15% (46 contracts) were categorised as combination contracts, and 6% (17 contracts) were categorised as time and materials contracts. With such a high proportion of fixed-price contracts being used, there is little motivation for defence industry firms to share potential cost savings during periods of productivity growth. Conversely, during periods of productivity decline, under the conditions of this contract type it is reasonable for the Department of Defence to expect that any cost escalations are absorbed by the defence industry firms. Although the cost component is relatively inflexible under fixed-price contracts, the results presented in Chart 7.6 above suggest that the schedule component may be more flexible.
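The survey percentages above follow directly from the contract counts; a short sketch of the arithmetic, using the figures from the survey cited above:

```python
# Contract-type shares from the survey of 303 major Department of
# Defence contracts (counts as reported in the text above).
contracts = {"fixed-price": 240, "combination": 46, "time and materials": 17}
total = sum(contracts.values())  # 303 contracts in the survey
for kind, n in contracts.items():
    # Shares rounded to whole percentages: 79%, 15% and 6% respectively.
    print(f"{kind}: {n / total:.0%} ({n} contracts)")
```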

At a project level, schedule, cost and quality are the three main areas of interest. These areas are generally oppositional in the sense that an improvement in one area usually occurs to the detriment of one or both of the other areas; however, if there is an improvement in productivity this could improve all areas simultaneously. As discussed above, if the cost component is fixed due to the type of contract, then changes in defence industry firm productivity will potentially have a disproportional effect on the other two components, being schedule and quality. This appears to be the case in the defence industry, where changes in generated productivity have a greater influence on movements in schedule and the overall performance levels than they do on changes in cost performance.

Regarding the magnitude of productivity benefits distributed to the Department of Defence, an indication of the potential size of these benefits may be inferred from an analysis of the actual scores calculated for each performance measure relative to the expected level of service required by the Department of Defence. The three charts below detail the performance scores37 calculated for the defence industry relative to the performance ratings set by the Department of Defence (DMO 2010b p. 3.8.3), as indicated by the horizontal lines labelled VG (very good), AC (as contracted), MA (marginal), and UI (unacceptable but improving).

37 As the Scorecard scores are considered confidential, they are not included in the charts. The omission of these figures does not impact the actual analysis undertaken.

Chart 7.7 ‐ Defence industry performance ‐ overall score

(source: Defence industry performance score calculated by author)

Chart 7.8 ‐ Defence industry performance ‐ schedule score

(source: Defence industry performance score calculated by author)


Chart 7.9 ‐ Defence industry performance ‐ cost score

(source: Defence industry performance score calculated by author)

As seen in Chart 7.7 above, for the majority of the period the overall defence industry performance was considered not to meet the contracted requirements. This pattern is also evident in Chart 7.8 and Chart 7.9 above, which show that the schedule and cost performance levels are actually rated worse than the overall performance level, with all scores allocated during the period being less than the ‘as contracted’ rating. These results suggest that the magnitude of productivity benefits distributed to the Department of Defence are likely to be low, considering the contracted performance requirements are not being met.

Overall, the results indicate that changes in defence industry productivity coincide positively with changes in the quality of services being provided to the Department of Defence, with an annual 1.0% growth in defence industry LP coinciding with annual increases in the overall, schedule and cost performance indexes of 5.0%, 5.1% and 2.0%, respectively, over the period. These findings provide evidence that the Department of Defence benefits from defence industry generated productivity growth; however, as the actual performance scores relative to the performance ratings show, the magnitude of the benefits received is potentially low.


Empirical results and analysis ‐ defence industry competition

The previous empirical results presented in this chapter related to direct measures of the distribution of productivity benefits to stakeholders. The empirical results presented in this section are indirect measures of productivity distribution, aimed at providing evidence of the potential magnitude of productivity benefits distributed between defence industry stakeholders and the Department of Defence. To determine the level of competition in the defence market, a measure of competition was developed based on the Department of Defence method of procurement. Analysing this measure enables consideration of the extant theories, discussed previously in Chapter 3, on the effect of competition on productivity generation and benefit distribution, and is expected to provide an indication of the magnitude of productivity benefits obtained by stakeholders. The expectation is that a high level of competition will increase the distribution of productivity benefits received by the Department of Defence, while a low level of competition will increase the distribution of productivity benefits to defence industry firms and their employees.

As detailed in Chapter 4, a measure to determine the level of competition existing in the defence market was used based on the Department of Defence procurement method, with the various procurement methods being classified as either competitive or non‐competitive as follows:

Table 7.5 ‐ Procurement Methods Categorised as Competitive or Non‐competitive

Competitive methods of procurement:
- Open Tender
- Closed/Restricted Tender
- Select Tender
- Staged Procurement
- Orders against standing offer / Orders placed against standing offer

Non-competitive methods of procurement:
- Single supplier direct source
- Direct source
- Sole source
- Order against standing offer / Orders placed against standing offer

Table 7.6 below displays the results of the analysis of Department of Defence procurement data from 2001 to 2009, inclusive. The table is divided into competitive and non-competitive methods of procurement and shows the percentage of contracts procured under each method by total contract value and by total number of contracts. In regard to the procurement method category “Order against standing offer / Orders placed against standing offer”, where the initial method of procurement may have been either competitive or non-competitive, a review of individual contracts in this category determined that, by contract value, 50% of the contracts were competitive in nature while the remaining 50%, relating primarily to the sustainment of the Collins Class Submarines, were considered non-competitive.

Table 7.6 ‐ Defence Contract Procurement Method (% Total Contract by Value ($) and Total Number of Contracts)

                                              Value ($)   Number
Competitive methods of procurement
  Open Tender                                   39.5%       9.7%
  Closed/Restricted Tender                       3.3%      12.6%
  Select Tender                                  2.6%       7.8%
  Staged Procurement                             0.2%       0.5%
  Orders against standing offer /
    Orders placed against standing offer         3.5%       0.3%
  Total                                         49.1%      30.9%

Non-competitive methods of procurement
  Direct source                                 17.9%      30.2%
  Sole source                                   29.4%      30.8%
  Order against standing offer /
    Orders placed against standing offer         3.5%       7.9%
  Single supplier direct source                  0.1%       0.2%
  Total                                         50.9%      69.1%

(source: Department of Defence Interim Defence Contract Register)

For the entire period from 2001 to 2009, based on total contract value, 49.1% of procurement undertaken by the Department of Defence was deemed competitive while 50.9% was deemed non‐competitive. When measured by the total number of contracts, only 30.9% of contracts were procured competitively, with 69.1% procured non‐competitively. These results are presented on an annual basis in Chart 7.10 and Chart 7.11 below. Chart 7.10 displays the changes in the procurement method based on the number of contracts, while Chart 7.11 displays the changes in the procurement method based on total contract dollar value.


Chart 7.10 ‐ Defence Contract Procurement Method (Number of Contracts)

(source: Department of Defence Interim Defence Contract Register)

As seen in Chart 7.10 above, the number of non-competitively procured contracts has always exceeded the number of competitively procured contracts, with 69.1% (745 contracts) of contracts being procured non-competitively and 30.9% (333 contracts) procured competitively. Between 2002 and 2004, the difference between the number of competitively versus non-competitively procured contracts increased significantly, before decreasing in 2005. From 2005 to 2007 the difference remained fairly constant, before a drop in the number of competitively procured contracts in 2008 increased the difference between the two. As detailed in Chapter 5, the low number of contracts recorded prior to 2003 is due to the Interim Defence Contract Register only becoming compulsory for official use from January 2003 (DMO 2011).

Using contract value as the unit of measurement, the difference between non-competitive and competitive procurement is not as large. Chart 7.11 below details the total contract value of competitive and non-competitive contracts procured each year from 2001 to 2009.


Chart 7.11 ‐ Defence Contract Procurement Method ($ Value of Contracts)

(source: Department of Defence Interim Defence Contract Register)

Over the entire period, based on contract value, 50.9% ($15.046 billion) of contracts procured were non-competitive, with the remaining 49.1% ($14.482 billion) being considered competitive. Excluding the spike in non-competitive contracts in 2004 and in competitive contracts in 2007, which are discussed in detail later in this chapter, the average difference between the two procurement categories is $730 million each year in favour of the non-competitively procured contracts.

The annual change in the dollar-value difference between competitively and non-competitively procured contracts was calculated and used as a measure of relative changes in “competition” in the defence industry market. The results are displayed in Chart 7.12 below.
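The competition measure behind Chart 7.12 can be sketched as the yearly difference between competitive and non-competitive contract values. The yearly figures below are hypothetical placeholders, not IDCR data:

```python
# A minimal sketch of the competition measure: the per-year dollar
# difference between competitively and non-competitively procured
# contract values. Yearly figures are hypothetical placeholders.
competitive_value = [1.2, 1.5, 1.1, 2.0]      # $bn per year (hypothetical)
non_competitive_value = [1.6, 1.4, 1.9, 1.8]  # $bn per year (hypothetical)

# Positive values indicate relatively more competitive procurement
# in that year; negative values indicate the reverse.
difference = [c - n for c, n in zip(competitive_value, non_competitive_value)]
print([round(d, 1) for d in difference])
```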


Chart 7.12 ‐ Defence Contract Procurement Method ($ Difference between Competitively and Non‐Competitively Procured Contracts)

(source: Calculated by author)

Chart 7.12 above illustrates the annual change in the direction and magnitude of the level of industry competition. Positive changes indicate a relative improvement in the level of competition, with positive annual observations indicating a relatively higher-than-average level of competitive procurement. Conversely, a negative change indicates a reduction in the relative level of market competition, with negative observations indicating a relatively lower-than-average level of competitive procurement during that particular year.

The results presented in Chart 7.11 and Chart 7.12 above give the impression that, based on overall contract value, there has been a relatively even split between competitive and non-competitive procurement. This result may be somewhat misleading in the sense that while it is relatively easy to form a definition of non-competitive procurement (i.e. where only one supplier is considered), the definition of a competitively procured contract in the defence industry environment is not as clear. An “open tender” procurement method is generally used to encourage market competition, but if there is only one firm capable of providing the service then the actual method of procurement becomes less effective. Similarly, defence industry firms may choose to form alliances to bid for contracts rather than bid against each other to ensure profits are maximised. In such a case, the use of an “open tender” procurement method to encourage competition may also be negated. Other factors involved in government procurement may also have an adverse effect on promoting competition. Decisions may ultimately be based on political imperatives, such as maintaining strategic alliances with allies or winning favour in marginal electorates by preferencing firms who use a high percentage of local industry.

The following table outlines the main contracts extracted from the IDCR comprising the majority of the competitive contracts shown in Chart 7.11 above.

Table 7.7 ‐ Significant Competitively Procured Defence Contracts

Year   Contract                                               Value ($m)   % of total competitive contracts

2002   Armed Reconnaissance Helicopter through life support   $610         4.2%
2005   Procurement of Troop Lift Helicopter                   $1,756       12.2%
2007   Procurement of Air Warfare Destroyer                   $4,323       29.3%
2007   Procurement of Landing Helicopter Dock                 $2,334       16.2%
       Total                                                  $9,023       61.9%

(source: Department of Defence Interim Defence Contract Register)

The four contracts detailed in Table 7.7 above account for 61.9% of all competitively procured contracts sampled over the entire period and were all classified by the Department of Defence as being procured under open tender arrangements. Because of this classification, these four contracts were automatically categorised as competitively procured contracts; however, on closer analysis the actual level of competition created in the defence market through the use of an open tender arrangement for these contracts is questionable38.

In relation to the Armed Reconnaissance Helicopter Through Life Support contract, the firm that was awarded this contract was the same firm from which the Department of Defence acquired the original helicopters in 2001 under a direct source procurement arrangement (IDCR 2010). Furthermore, the most recent Armed Reconnaissance Helicopter Through Life Support contract awarded in 2009 was classified by the Department of Defence as a direct sourced procurement and was actually awarded to the same firm39 (IDCR 2010). Both these

38 The remaining 38.1% consists of 329 individual contracts. A detailed analysis of these contracts was not carried out to determine their inherent competitiveness, as their individual dollar values are immaterial.
39 It should be noted that winning the initial sustainment contract could potentially result in the firm having a natural competitive advantage over competitors in the subsequent retendering of the sustainment contract.

events support the assertion that the level of true competition in the procurement of the 2002 through life support contract was relatively low or even non‐existent.

Regarding the Air Warfare Destroyer and the Landing Helicopter Dock contracts, any potential competition may have been stifled by the fact that the Australian Government openly acknowledged the importance of ensuring the “best possible outcome for Australian industry” in its decision to “consider the Amphibious Ship and Air Warfare Destroyer proposals in concert” (Nelson 2007a). This decision to prioritise Australian industry does not necessarily align with the value-for-money concept, considering the Government had previously acknowledged that, in relation to shipbuilding activities conducted in Australia, there was evidence to suggest that “Australia may not be as productive as overseas producers” (The Senate 2006 p. 196) and that Australian naval build projects have attracted a local build premium in the past (The Senate 2006 pp. 247‐249). The prioritising of Australian industry in tender selection was also evident in the two announcements made by the Minister of Defence in 2007 regarding the Landing Helicopter Dock (Nelson 2007a) and Air Warfare Destroyer (Nelson 2007b), where a significant portion of the respective speeches focused on the extent of Australian industry involvement. Further, the number of tenders for both ship projects indicates that the level of competition within the market was relatively low. For the Air Warfare Destroyer project only two tenders were received, while for the Landing Helicopter Dock project the Department of Defence actually restricted potential competition by only approaching two European shipbuilders, with American shipbuilders automatically excluded from bidding for not meeting the Department of Defence’s capability requirements (Borgu 2004 pp. 5‐7).

The effectiveness of the open tender process for the procurement of the Troop Lift Helicopters was also questionable, in that the Department of Defence’s recommended supplier was rejected by the Defence Minister in preference for a different supplier (Forbes 2004). While the specific reason for the Government’s decision is somewhat unclear, the fact that it was predicted to create “over 150 new jobs” in Australia and assure the “continuation of over 250 more jobs that were coming to an end with the completion of other Defence projects” (Combet 2007) lends support to the idea that the decision may have been more politically motivated than simply left to the competitive tender process. Additionally, while it may be coincidental, the firm that was awarded the Troop Lift Helicopter contract in 2004 was the same firm that was awarded the Armed Reconnaissance Helicopter contract in 2001 (via a


direct sourced procurement (IDCR 2010)), with both aircraft having similarities in build and equipment (Combet 2007).

Based on the above discussion, if the four contracts listed in Table 7.7 were considered as having more non‐competitive attributes than competitive ones, then the relative measure of competition in the Department of Defence procurement would be as follows.

Chart 7.13 ‐ Department of Defence Contract Procurement Method ($ Difference between Competitively and Non‐Competitively Procured Contracts – Major Contract Adjustment)

(source: Calculated by author)

This result is substantially different from the previous measure, with the level of competition within the Department of Defence procurement process being relatively low. The inclusion of the four contracts listed in Table 7.7 above results in the overall percentage of competitive compared to non‐competitive contracts being 18% and 82% (based on contract value), respectively. These results are comparable to previous studies relating to the United States Defence Department, where 80% of total contract dollars were found to have been spent on sole‐sourced procurement methods between 1986 and 1994 (Chan 1997 p. 17).
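The 18%/82% split reported above is a straightforward value‐weighted share. As a minimal sketch of the calculation (the contract values and classifications below are hypothetical, not the thesis data):

```python
def competitive_share(contracts):
    """Value-weighted share of contracts procured competitively.

    contracts: list of (value, procured_competitively) tuples.
    """
    total = sum(value for value, _ in contracts)
    competitive = sum(value for value, is_comp in contracts if is_comp)
    return competitive / total

# Hypothetical portfolio: reclassifying a few large contracts as
# non-competitive sharply lowers the competitive share.
portfolio = [(180, False), (450, False), (90, True), (180, True)]
share = competitive_share(portfolio)  # 270 / 900 = 0.30
```

Because the share is weighted by dollar value, reclassifying even a handful of very large contracts (such as the four in Table 7.7) can move the measure substantially.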

While Chart 7.13 does indicate a fairly low level of procurement‐induced competition in the defence market, these results should be considered in the context of inherent defence market characteristics. Fundamentally, due to the limited number of suppliers the level of competition within the defence market is extremely low, and the introduction of even a low level of

competition, which may be considered effectively nil in a competitive market, could potentially have a relatively high impact. This theory is investigated in greater detail below.

Competition vs. productivity comparison

As outlined in Chapter 3, increased competition is generally associated with positive productivity growth. While the results presented in this section indicate that the level of procurement‐generated competition is low, it may be possible that slight changes in competition have a noticeable effect on defence industry‐generated productivity. Chart 7.14 below compares the defence industry LP to procurement‐generated competition, as measured by the difference between competitive and non‐competitive contracts shown in Chart 7.13.

Chart 7.14 ‐ Defence Procurement Induced Competition vs. Defence Industry LP

(source: Defence industry LP and competition measure calculated by author)

Chart 7.14 indicates a potential relationship between changes in procurement‐induced competition and changes in defence industry LP, with a possible one‐year delay between changes in the level of procurement‐induced competition and the flow‐on effect of these changes on defence industry LP. Table 7.8 below details the periods of growth and decline in relative competition and the potential corresponding periods of change in LP growth.


Table 7.8 ‐ Changes in Defence Procurement Induced Competition vs. Defence Industry LP

Period (competition)   Change in relative defence industry competition   Period (LP)   Change in LP
2001‐2002              Increase                                          2002‐2003     Increase
2002‐2004              Decrease                                          2003‐2005     Decrease
2004‐2006              Increase                                          2005‐2007     Increase
2006‐2007              Decrease                                          2007‐2008     Decrease

As detailed in Table 7.8, throughout the period 2001 to 2007 each major directional change in procurement‐induced competition coincided, one year later, with a major directional change in defence industry‐generated LP. This trend would also have been evident from 2007 to 2009 if the BAE 2009 LP observation had been included (see Chart 7.5), where an increase in competition between 2007 and 2008 coincided with an increase in LP between 2008 and 2009. The apparent effect of changes in competition on productivity is likely to result in a shift in the magnitude of productivity benefits distributed to various stakeholders.
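The one‐year‐lag pattern described above can be checked mechanically by comparing the direction of each change in competition with the direction of the LP change one period later. A minimal sketch (the series values below are illustrative only, not the thesis data):

```python
def change_directions(series):
    """Direction (+1, -1 or 0) of each period-on-period change."""
    return [(b > a) - (b < a) for a, b in zip(series, series[1:])]

def lagged_direction_matches(competition, lp, lag=1):
    """Count how often a change in competition is followed, `lag` periods
    later, by an LP change in the same direction (direction only, not
    magnitude, mirroring footnote 40)."""
    comp_dir = change_directions(competition)
    lp_dir = change_directions(lp)[lag:]
    pairs = list(zip(comp_dir, lp_dir))
    return sum(c == l for c, l in pairs), len(pairs)

# Illustrative series shaped like Table 7.8: competition shifts up, down,
# up, down, and LP repeats that pattern one period later.
competition = [10, 14, 9, 13, 11]
lp = [100, 101, 104, 99, 103, 100]
matches, total = lagged_direction_matches(competition, lp)  # (4, 4)
```

A formal test of the relationship (including magnitudes) would require techniques beyond this directional comparison, as the thesis itself notes.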

Chapter summary

This chapter presented the empirical results of the measures associated with the productivity benefits distribution aspect of this research. The results show that when the measures are individually compared to changes in defence industry LP, evidence suggests that defence industry productivity benefits are shared amongst defence industry firms, defence industry firm employees and the Department of Defence. The competition results presented indicate that a positive relationship exists between changes in competition and productivity growth, which suggests that competition also has an effect on the distribution of productivity benefits to stakeholders.


Chapter 8 – Conclusions and recommendations

This chapter summarises the main empirical findings of this research and outlines the contributions of the research to knowledge, as well as the limitations of the research. Suggestions for future research on the productivity performance of the Australian defence industry are also discussed, in addition to the policy implications of the study.

Summary of research findings

The two main objectives of this research were to measure the change in productivity of the Australian defence industry between 2001 and 2009 and to determine how the associated benefits derived from productivity in the Australian defence industry are distributed.

Productivity measurement

Using a Tornqvist index number approach, this study empirically measured the productivity of the Australian defence industry using a labour‐capital multifactor productivity (MFP) measure and a partial labour productivity (LP) measure. The empirical results show that between 2001 and 2009 the defence industry experienced MFP growth of 0.3% per annum and LP growth of 1.0% per annum. A comparative analysis of these results against the ABS‐calculated MFP and LP for the manufacturing and mining industries revealed that the Australian defence industry experienced the highest MFP growth rate of the three industries, outperforming the manufacturing and mining industries by 0.8 and 5.3 percentage points, respectively. On further analysis it was noted that, due to the high share of labour evident in the defence industry, a LP comparison with the other two industries was more appropriate than a MFP comparison. Based on a comparison of the LP results, the manufacturing industry was the highest performer with an annual LP growth rate of 1.6%, which is 0.6 and 8.1 percentage points higher than the defence and mining industries, respectively.
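For reference, a Tornqvist MFP index of the kind summarised above takes the standard translog form: output growth less share‐weighted input growth. The notation here is a generic sketch (not necessarily that of the earlier chapters), with output $Y$, inputs $X_i$ for labour and capital, and cost shares $s_i$:

```latex
\ln\frac{MFP_t}{MFP_{t-1}}
  = \ln\frac{Y_t}{Y_{t-1}}
  - \sum_{i \in \{L,K\}} \tfrac{1}{2}\left(s_{i,t} + s_{i,t-1}\right)
      \ln\frac{X_{i,t}}{X_{i,t-1}},
\qquad
\ln\frac{LP_t}{LP_{t-1}}
  = \ln\frac{Y_t}{Y_{t-1}} - \ln\frac{L_t}{L_{t-1}}
```

The LP measure is the special case in which labour is the only input considered, which is why the high labour share in the defence industry makes the LP comparison more informative.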

Productivity benefits distribution

To determine the distribution of productivity benefits to stakeholders, several measures were calculated and then compared to changes in defence industry LP. A measure of profit was calculated to determine the distribution of productivity benefits retained by defence industry firms, while changes in gross defence industry employee wages were used to measure productivity benefits distributed to defence industry employees. Since a measure of product cost was considered unattainable, a measure of the quality of military specific goods and services provided to the Department of Defence by the Australian defence industry was used to determine the productivity benefits distributed to the Department of Defence. Finally, a


measure of the level of competition existing in the defence industry was used to indicate the likely magnitude of the apportionment of productivity benefit distribution between stakeholders.

During the period 2002 to 2009, the average annual accounting (EBDIT) and economic profit margins calculated for the defence industry were 10.93% and 2.18%, respectively, with the most profitable years occurring between 2001 and 2005, inclusive. Despite these profit margins and an increase in the scale of the defence industry over the period, evident in a 77% increase in revenue, actual profit (EBDIT) for the defence industry was found to have declined by 18.6% in total over the period. In comparison, although the manufacturing industry’s average annual accounting (EBDIT) profit margin was slightly lower than the defence industry’s at 9.85%, it experienced only a 1.1% total decline in real profit (EBDIT), with the size of the manufacturing industry growing by 38% during the period based on changes in revenue. This overall profit result for the defence industry is somewhat unexpected given the monopolistic environment in which defence industry firms operate. However, as discussed in Chapter 7, because the period of analysis may not have captured an entire defence industry production cycle, a direct comparison with the manufacturing industry may not represent the true long‐term trend between the two industries. Despite this, based on the results alone, compared to the LP generated by each industry it was found that the 1% growth in LP calculated for the defence industry coincided with a 2.7% annual decline in profits, while for manufacturing a 1% growth in LP coincided with a 0.1% annual decline in profits. These results indicate that during the period of analysis the defence industry potentially retained a below‐average share of productivity benefits in the form of profit.
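The interplay of rising revenue and falling total EBDIT described above implies a compressing margin over the period. A minimal sketch using the reported percentage changes (77% revenue growth, 18.6% EBDIT decline) applied to hypothetical base figures, not the thesis data:

```python
def ebdit_margin(ebdit, revenue):
    """Accounting (EBDIT) profit margin as a percentage of revenue."""
    return 100.0 * ebdit / revenue

# Hypothetical base-year figures of $120m EBDIT on $1,000m revenue.
start_margin = ebdit_margin(120.0, 1000.0)                     # 12.0%
# Apply the reported changes: EBDIT down 18.6%, revenue up 77%.
end_margin = ebdit_margin(120.0 * (1 - 0.186), 1000.0 * 1.77)  # ~5.5%
```

Whatever the base figures, the same percentage changes always roughly halve the margin, which is what makes the result unexpected for a monopolistic market.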

To determine the distribution of productivity benefits to defence industry employees, the growth in annual wages was calculated and compared with other related ABS industries. The results show that between 2001 and 2009, defence industry employees experienced an annual 5.53% increase in wages, which was 0.97 percentage points higher than that experienced by manufacturing employees. Relative to the productivity growth generated, defence industry employees also experienced a higher level of wage growth than manufacturing employees, with a 5.53% increase in wages for every 1% increase in productivity, while manufacturing employees received only a 2.94% increase in wages for every 1% increase in productivity generated. While these results indicate that defence industry employees appear to be


receiving an above‐average portion of productivity benefit distribution, comparing the defence wage results with mining industry wage results indicates that factors other than productivity may have more influence on changes in industry wages.
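The “wage growth per unit of productivity growth” comparison above is a simple ratio of annual growth rates. A sketch using the defence figures reported in the text (the function name is my own, not the thesis’s):

```python
def wage_growth_per_lp_point(wage_growth_pct, lp_growth_pct):
    """Percentage wage growth received per 1% of labour productivity growth."""
    return wage_growth_pct / lp_growth_pct

# Defence industry: 5.53% annual wage growth against 1.0% annual LP growth.
defence_ratio = wage_growth_per_lp_point(5.53, 1.0)  # 5.53
```

A ratio well above 1 indicates wages growing faster than the productivity that might justify them, which is why the mining comparison in the text suggests factors other than productivity are at work.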

In relation to the distribution of productivity benefits to the Department of Defence, a measure of defence industry performance was calculated and compared to changes in defence industry LP. The results show that defence industry performance improved during periods of productivity growth and declined during periods of productivity regress. When the performance of the defence industry was related to its cost and schedule components, it was found that schedule performance was more sensitive to changes in productivity than cost performance. These results indicate that while some productivity benefits appear to be distributed to the Department of Defence, a significant portion of these benefits is likely to be non‐financial. In regard to the magnitude of benefits distributed to the Department of Defence, while no other industry comparator was available, the actual performance scores calculated relative to the internal performance benchmark set by the Department of Defence suggest that it is unlikely that a high portion of productivity benefits is distributed to the Department of Defence.

The relative magnitude of productivity benefits distributed to stakeholders was also investigated by measuring the level of competition present in the defence market. Using a measure of competition based on the method of procurement employed by the Department of Defence, it was initially determined that by total dollar value, 49.1% of contracts entered into by the Department of Defence from 2001 to 2009 were procured competitively. On further analysis of four major contracts categorised by the Department of Defence as being procured via a method considered competitive, it was found that the actual level of competition involved was questionable. Taking this into account, the actual percentage of competitive contracts entered into during the period of analysis could be as low as 18%. Even if this were the case, the results do provide evidence to suggest that small changes in the level of competition in the highly monopolistic market in which defence industry firms operate have the potential to affect productivity generation and distribution. A pattern emerged in the results where one year after a major directional shift in measured defence industry market competition, the same directional shift was observed in the LP measurement40. Although no

40 This shift was directional only; no testing was undertaken to determine whether there were any similarities in the magnitude of the changes.

direct evidence was found in regard to the magnitude of productivity benefit distribution, based on accepted theory it could be inferred that as competition increases, the portion of benefits distributed to the Department of Defence would increase, whereas when competition decreases the defence industry is likely to retain an increased portion of productivity benefits.

Contributions of the research to knowledge

This sub‐section details the two main areas in which this research has contributed to the economic literature. The first contribution relates to the measurement of productivity, while the second concerns the attempted measure of productivity benefit distribution amongst defence industry stakeholders.

While many studies have been concerned with the measurement of productivity at all levels of the economy, few have attempted to measure the productivity performance of a nation’s defence industry, and no known studies deal specifically with the Australian defence industry. As such, the first major contribution of this research to knowledge is that it is the first known attempt to measure the productivity performance of the Australian defence industry.

The second contribution of this research to knowledge relates to the attempt made to determine the distribution of productivity benefits. Although the measures used in this study are common in productivity‐related research, the concept of using them in combination to determine the distribution of productivity benefits is relatively new and untested. The framework developed for determining the distribution of productivity benefits within an industry is considered the second contribution of this research to knowledge.

Research limitations

There are several limitations of this research. Firstly, the use of an index number approach to the measurement of productivity imposes some strict theoretical assumptions, discussed in Chapter 2, including the existence of competitive behaviour between firms, firms operating at constant returns to scale and all firms being fully efficient. It is unlikely that these assumptions strictly hold for the Australian defence industry; however, attempting to account for them would require advanced amendments to the index number formula used, which is considered beyond the scope of this research.


Another limitation of this research is the limited period of analysis, with the original ten‐year period (1999 to 2009) reduced to eight years (2001 to 2009) as the research progressed and data issues were identified. Such a short period of analysis may not have captured the entire production cycle of defence industry manufacturing, which is likely to be greater than eight years. In fact, Solow (1963 p. 72) suggests that productivity data should be analysed over several decades to gain an understanding of productivity cycles, as annual changes in productivity are heavily influenced by other factors such as shifts in the composition of output or changes in total demand. The effect of changing the period of analysis is evident in the ABS manufacturing productivity indexes, which show that over the entire 23‐year period for which manufacturing productivity has been calculated, the annual growth rates for manufacturing MFP and LP are 0.3% and 1.3%, respectively (Australian Bureau of Statistics 2009). However, when the period is limited to 2001 to 2009, as it was in this study, the annual manufacturing productivity growth rates become ‐0.5% for MFP and 1.6% for LP.
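The sensitivity of an annual growth rate to the chosen window can be sketched with a compound annual growth calculation; the index levels below are hypothetical, chosen only to show that the same cumulative change yields very different annual rates over different windows:

```python
def annual_growth_pct(index_start, index_end, years):
    """Compound annual growth rate (%) implied by two index levels."""
    return ((index_end / index_start) ** (1.0 / years) - 1.0) * 100.0

# The same cumulative 10% rise in a productivity index implies very
# different annual growth over an 8-year versus a 23-year window.
short_window = annual_growth_pct(100.0, 110.0, 8)   # ~1.2% p.a.
long_window = annual_growth_pct(100.0, 110.0, 23)   # ~0.4% p.a.
```

This is one reason the 2001 to 2009 manufacturing results differ so markedly from the 23‐year results quoted above.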

Similar to the restriction imposed as a result of the period of the study, the scope of firms included in the definition of the Australian defence industry also imposes a limitation, especially regarding the extrapolation of the research findings for other defence‐oriented firms not included in this study. As detailed in Chapter 1, the ten firms included in the research account for approximately 77% of the Australian defence industry, with several large firms such as SAAB Systems Australia, Qantas Defence Services and Sikorsky Aircraft Australia being excluded due to data availability issues. As a result, the extrapolation of these results to firms not considered in this sample may not be appropriate without further analysis.

Finally, the quality and availability of the data used will affect the overall results and conclusions. With official statistical data unavailable at the defence industry level, individual firm data were obtained from firms’ Financial Statements. As detailed in Chapter 5, there were consistency issues with the presentation of data, not only between firms but within the same firm over the period of analysis. This resulted in not all observations being used and, in the case of labour data, some observations being constructed (refer to Annex A, Table A.4 ‐ Australian Defence Industry Employee Generated Data Observations) to increase the number of usable observations available for the measurement of productivity. Another data‐imposed limitation is that the data used were based on the financial information the companies chose to make public, albeit under strict accounting regulation. These company accounts have been taken at face value and no consideration has been given to any factors that may influence the


data. For example, activities such as transfer pricing between the subsidiary and the parent could have occurred to manipulate the subsidiary’s or parent’s profit figures. Additionally, the number of consolidated entities included in each firm’s report may vary from year to year, and while an attempt was made to adjust for any major entity consolidation or disposal, there are other less obvious inclusions that may have influenced the results. Chapter 5 also details the issues associated with the use of accounting data based on an economic theory of measurement.

Suggestions for future research

The results and research limitations sections suggest some areas of future research that would allow for a more accurate understanding of Australian defence industry productivity performance and productivity benefit distribution. The clear recommendations are to increase the number of firms used in future productivity studies and to extend the period of analysis significantly. The use of alternative methods of productivity measurement may enable the relaxation of some of the assumptions imposed by an index number approach, while also allowing the productivity calculation to be decomposed. This would allow for a more accurate measure of productivity, while enabling changes associated with other factors such as scale economies and efficiency to be identified and removed.

Policy implications

As a major user of the Australian Government’s fiscal resources, the Department of Defence has a fiduciary duty to ensure these resources are properly managed. With a limited number of suppliers able to provide military goods and services within Australia, policy makers need to ensure that policies aimed at improving defence industry productivity also achieve the equitable distribution of the resultant productivity‐generated benefits. In the context of the defence market, an equitable distribution is not necessarily an even split between stakeholders, but rather is a situation where sufficient benefits are retained by the industry to ensure continual growth and progress, while the Department of Defence also receives a fair distribution of benefits as the sole consumer of military goods and services.

The results presented in this research provide empirical evidence on which future policy decisions can be based, as well as indicating how effective past policies have been. The research provides evidence to suggest that improvements in competition within the defence industry not only coincide with increased productivity growth but also with improvements in the quality of goods and services received by the Department of Defence. The research also

demonstrates relationships between changes in defence industry productivity and profitability. This information will provide policy makers with the ability to foresee the effect of policies related to specific aspects of the defence industry on the performance of the industry as a whole.

Concluding remarks

Despite the limitations outlined above, this research addressed its two main objectives. The study was the first to calculate a measure of productivity for the Australian defence industry, with the results indicating that from 2001 to 2009 the defence industry experienced positive productivity growth. The study was also the first known attempt to determine the distribution of Australian defence industry productivity benefits amongst stakeholders, with the methodology employed being transferable to other industries. Ultimately, this research was instigated to assist the Department of Defence in meeting its obligations to the government by achieving savings in the conduct of its business. It is anticipated that the results of this research will provide government policy makers with a better understanding of the defence industry’s productivity performance and the way in which productivity benefits are distributed, especially those distributed to the Department of Defence.


Annex A – Tables and charts

Tables

Table A.1 ‐ Defence Company ScoreCard Performance Parameters

Performance Parameter – The assessment of the contractor’s:

Technical Performance – Ability to deliver a product or provide a service that meets the requirements of the contract.

Cost – Effectiveness in forecasting, managing and controlling contract costs.

Contract Schedule – Timely achievement of the contracted task, milestones, delivery schedules and administrative requirements against the original contracted baseline.

Round Schedule – Achievements at Contract Schedule above for the current reporting period.

Contracting – Compliance with the conditions of the contract; the company’s willingness to deliver contracted outputs; management of the contract under the agreed terms and conditions; performance monitoring; and checking of progress made by subcontractors.

Intellectual Property – Willingness to identify, register and/or exploit intellectual property requested by the Commonwealth in accordance with the contract, and the contractor’s willingness to advise the Commonwealth on its intellectual property requirement.

Relationships (Defence/Contractor) – Willingness and performance in behaving reasonably and cooperatively in business relations with the project office.

Relationships (Prime/Subcontractor) – Willingness and performance in undertaking cooperative behaviour and business relations with its significant subcontractors.

Quality Systems – Adherence to, and achievement of, the agreed quality plan objectives in accordance with the contract.

Earned Value Management – Achievement of the project’s ongoing earned value requirements.

(Source: Defence Material Organisation 2010 p. 3.8.2)


Table A.2 ‐ Defence Company ScoreCard Assessment Ratings

Assessment Rating – Indication:

Very Good – Contractor performance meets all contracted requirements and exceeds some or all requirements, providing benefit to Defence.

As Contracted – The contractor is meeting all contractual requirements.

Marginal – The contractor is not meeting some contractual requirements.

Unsatisfactory (showing improvement) – The contractor is failing to meet contractual requirements, but there is improvement and the possibility of recovery.

Unsatisfactory – The contractor is failing to meet contractual requirements and there is a low likelihood of recovery.

Not Applicable – The assessment category is not applicable to the project or contract.

(Source: Defence Material Organisation 2010 p. 3.8.3)

Table A.3 ‐ Australian Defence Industry Sample Size and Scope

Company                            Period of analysis   Reason for chosen period
ADI                                2000‐2005            Purchased by Thales in 2005.
Australian Submarine Corporation   1999‐2009
Australian Aerospace               2001‐2009            Prior to 2001, Australian Aerospace was effectively a different company.
BAE Australia                      2001‐2009
Boeing Australia                   2001‐2009
CAE Australia                      2000‐2009            Insufficient usable data prior to 2000.
GDLS‐A                             2002‐2008            Insufficient usable data prior to 2002 and not available for 2009.
Raytheon                           2000‐2009            Insufficient usable labour data prior to 2000.
Tenix                              2001‐2007            Insufficient usable labour data prior to 2001. In 2008 Tenix was purchased by BAE.
Thales Australia                   2001‐2009            Insufficient data prior to 2000.

(Source: individual firm Financial Statements)


Table A.4 ‐ Australian Defence Industry Employee Generated Data Observations

Company                            Periods analysed   Number of “# of employees” generated   Number of “employee expense” generated
ADI                                7                  0                                      7
Australian Submarine Corporation   11                 0                                      1
Australian Aerospace               9                  2                                      0
BAE Australia                      9                  0                                      4
Boeing Australia                   9                  4                                      4
CAE Australia                      10                 1                                      5
GDLS‐A (note 1)                    7                  2                                      1
Raytheon (note 1)                  10                 1                                      4
Tenix                              7                  0                                      0
Thales Australia                   9                  0                                      3

(Source: Individual firm Financial Statements)

Note 1 – No data were available for Raytheon in 2003 or GDLS‐A in 2005. All values used (i.e. VA, capital and labour) were the average of the adjoining years.

Note 2 ‐ There were no instances where both the “# of employees” and the “employee expense” were unavailable for the same year.


Table A.5 ‐ Australian Defence Industry Deflator – Sales Data

ABS 6427.0 Producer Price Indices: Tables 10 and 11 – Articles produced by manufacturing industries; Table 22 – Selected output of Division M professional, scientific and technical services, index numbers

Company Sales Data

ADI 1892 Explosives manufacturing 2231 Boiler, tank and other heavy gauge metal container manufacturing 2299 Other fabricated metal product manufacturing 2311 Motor vehicle manufacturing 2391 Shipbuilding and repair service 2462 Mining and construction machinery manufacturing 2499 Other machinery and equipment manufacturing nec 6923 Engineering design and engineering consulting services 700 Computer system design and related services

Australian Submarine Corporation 2231 Boiler, tank and other heavy gauge metal container manufacturing 2391 Shipbuilding and repair service 6923 Engineering design and engineering consulting services

Australian Aerospace 2394 Aircraft manufacturing and repair services 2419 Other professional and scientific equipment manufacturing 2422 Communication equipment manufacturing 6923 Engineering design and engineering consulting services 700 Computer system design and related services

BAE Australia 2394 Aircraft manufacturing and repair services 2419 Other professional and scientific equipment manufacturing 2422 Communication equipment manufacturing 6923 Engineering design and engineering consulting services 700 Computer system design and related services (Note: after 2007, due to BAE’s acquisition of Tenix, the BAE and Tenix indexes were combined at a ratio of 45% and 55%, respectively, based on their 2007 individual gross revenues)

Boeing Australia 2394 Aircraft manufacturing and repair services 2419 Other professional and scientific equipment manufacturing 2422 Communication equipment manufacturing 6923 Engineering design and engineering consulting services 700 Computer system design and related services

CAE Australia 2419 Other professional and scientific equipment manufacturing 700 Computer system design and related services

GDLS‐A 2311 Motor vehicle manufacturing


Raytheon 2419 Other professional and scientific equipment manufacturing 2422 Communication equipment manufacturing 6923 Engineering design and engineering consulting services 700 Computer system design and related services

Tenix 2231 Boiler, tank and other heavy gauge metal container manufacturing 2391 Shipbuilding and repair service 2419 Other professional and scientific equipment manufacturing 2422 Communication equipment manufacturing 2499 Other machinery and equipment manufacturing nec 6923 Engineering design and engineering consulting services

Thales Australia 2419 Other professional and scientific equipment manufacturing 6923 Engineering design and engineering consulting services 700 Computer system design and related services (Note: after 2005, due to Thales’s acquisition of ADI, the Thales and ADI indexes were combined at a ratio of 20% and 80%, respectively, based on their 2005 individual gross revenue)

Table A.6 ‐ Australian Defence Industry Deflator – Cost of Goods Sold Data

ABS 6427.0 Producer Price Indices: Table 14 – Materials used in manufacturing industries, index numbers for subdivisions

Company Cost of goods sold

ADI 18 Basic chemical and chemical product manufacturing 22 Fabricated metal product manufacturing 23 Transport equipment manufacturing

Australian Submarine Corporation 22 Fabricated metal product manufacturing 23 Transport equipment manufacturing 24 Machinery and equipment manufacturing

Australian Aerospace 23 Transport equipment manufacturing 24 Machinery and equipment manufacturing

BAE Australia 22 Fabricated metal product manufacturing 23 Transport equipment manufacturing 24 Machinery and equipment manufacturing

Boeing Australia 23 Transport equipment manufacturing 24 Machinery and equipment manufacturing

CAE Australia 24 Machinery and equipment manufacturing

GDLS‐A 24 Machinery and equipment manufacturing

Raytheon 23 Transport equipment manufacturing 24 Machinery and equipment manufacturing

Tenix 23 Transport equipment manufacturing 24 Machinery and equipment manufacturing


Thales Australia 18 Basic chemical and chemical product manufacturing 22 Fabricated metal product manufacturing 23 Transport equipment manufacturing 24 Machinery and equipment manufacturing

Table A.7 ‐ Australian Defence Industry Labour and Capital Shares – Individual Firm

Values are shown as labour share (L) / capital share (K); “‐” denotes no observation.

Year    ADI        ASC        AA         BAE        Boeing     ABS
1999    ‐          90%/10%    ‐          ‐          ‐          53%/47%
2000    80%/20%    88%/12%    ‐          ‐          ‐          54%/46%
2001    81%/19%    87%/13%    55%/45%    87%/13%    97%/3%     54%/46%
2002    80%/20%    87%/13%    66%/34%    94%/6%     91%/9%     54%/46%
2003    81%/19%    89%/11%    85%/15%    93%/7%     92%/8%     53%/47%
2004    81%/19%    90%/10%    85%/15%    92%/8%     92%/8%     52%/48%
2005    80%/20%    90%/10%    89%/11%    91%/9%     90%/10%    51%/49%
2006    ‐          90%/10%    88%/12%    90%/10%    89%/11%    52%/48%
2007    ‐          90%/10%    90%/10%    90%/10%    89%/11%    53%/47%
2008    ‐          89%/11%    92%/8%     92%/8%     87%/13%    53%/47%
2009    ‐          83%/17%    93%/7%     94%/6%     82%/18%    53%/47%

Year    CAE        GDLS‐A     Raytheon   Tenix      Thales     ABS
1999    ‐          ‐          ‐          ‐          ‐          53%/47%
2000    57%/43%    ‐          86%/14%    ‐          ‐          54%/46%
2001    51%/49%    ‐          89%/11%    87%/13%    95%/5%     54%/46%
2002    47%/53%    96%/4%     83%/17%    87%/13%    93%/7%     54%/46%
2003    46%/54%    95%/5%     86%/14%    87%/13%    93%/7%     53%/47%
2004    49%/51%    94%/6%     88%/12%    86%/14%    97%/3%     52%/48%
2005    60%/40%    93%/7%     89%/11%    87%/13%    96%/4%     51%/49%
2006    61%/39%    92%/8%     91%/9%     86%/14%    98%/2%     52%/48%
2007    69%/31%    92%/8%     92%/8%     85%/15%    96%/4%     53%/47%
2008    76%/24%    90%/10%    93%/7%     ‐          94%/6%     53%/47%
2009    74%/26%    ‐          93%/7%     ‐          93%/7%     53%/47%

(Source: Defence firm labour and capital shares calculated by author; ABS figures taken from ABS 5260.0.55.002 Experimental Estimates of Industry Multifactor Productivity, 2009‐10.)


Table A.8 ‐ Individual Firm MFP Changes Relative to Previous Observation (MFP, Output, and Input)

ADI ASC AA BAE
MFP Output Input  MFP Output Input  MFP Output Input  MFP Output Input
1999
2000  1.34 1.17 0.87
2001  1.45 1.64 1.12  1.80 1.54 0.85
2002  0.92 0.88 0.95  0.77 0.74 0.96  1.56 2.42 1.55  0.79 0.94 1.19
2003  0.95 1.04 1.10  1.67 1.95 1.17  0.51 1.42 2.78  1.15 1.11 0.96
2004  0.95 0.95 1.00  0.74 0.74 1.00  1.32 1.62 1.23  1.43 1.34 0.94
2005  0.94 0.93 0.99  0.97 1.04 1.07  0.59 0.79 1.35  0.80 0.89 1.11
2006  0.92 1.00 1.09  1.50 1.95 1.30  1.06 1.08 1.02
2007  1.37 1.48 1.09  1.03 1.25 1.22  1.02 1.07 1.05
2008  0.95 1.01 1.06  1.16 1.45 1.25  0.75 1.39 1.86
2009  0.99 1.21 1.22  0.58 0.68 1.17  1.69 1.94 1.15

Boeing CAE GDLS‐A Raytheon
MFP Output Input  MFP Output Input  MFP Output Input  MFP Output Input
1999
2000
2001  4.02 3.37 0.84  0.85 1.15 1.35
2002  0.71 0.33 0.47  0.59 0.51 0.86  1.37 1.28 0.94
2003  1.26 1.56 1.24  1.45 1.39 0.96  1.37 1.48 1.08  0.94 1.28 1.36
2004  1.15 1.26 1.10  1.64 1.65 1.01  0.68 0.60 0.88  1.01 1.22 1.21
2005  0.88 0.88 1.00  1.02 1.16 1.14  1.88 1.88 1.00  0.92 1.36 1.49
2006  1.02 1.14 1.12  0.87 0.95 1.09  1.38 1.30 0.95  1.31 1.19 0.91
2007  0.84 0.99 1.18  1.04 1.39 1.34  0.49 0.44 0.89  1.18 1.30 1.10
2008  0.84 0.86 1.02  1.49 1.62 1.09  1.06 1.01 0.95  0.89 0.98 1.11
2009  1.17 0.91 0.78  1.21 1.13 0.93  0.95 1.00 1.06

Tenix Thales
MFP Output Input  MFP Output Input
1999
2000
2001
2002  1.30 1.37 1.05  0.76 0.83 1.09
2003  1.45 1.51 1.04  0.76 0.89 1.17
2004  0.96 0.92 0.95  2.67 7.45 2.79
2005  1.10 1.12 1.02  0.86 0.90 1.05
2006  1.08 1.13 1.05  0.80 4.78 5.99
2007  0.76 0.65 0.86  0.81 0.93 1.15
2008  1.24 1.07 0.87
2009  0.70 0.80 1.15
(Source: Calculated by author)


Table A.9 ‐ Individual Firm LP Changes Relative to Previous Observation (LP, Output, and Input)

ADI ASC AA BAE
LP Output Input  LP Output Input  LP Output Input  LP Output Input
1999
2000  1.37 1.17 0.85
2001  1.43 1.64 1.15  1.84 1.54 0.84
2002  0.94 0.88 0.93  0.77 0.74 0.96  1.52 2.42 1.60  0.78 0.94 1.20
2003  0.93 1.04 1.12  1.64 1.95 1.19  0.38 1.42 3.74  1.16 1.11 0.95
2004  0.95 0.95 1.00  0.74 0.74 1.01  1.28 1.62 1.26  1.44 1.34 0.93
2005  0.95 0.93 0.98  0.96 1.04 1.08  0.57 0.79 1.40  0.81 0.89 1.11
2006  0.91 1.00 1.10  1.51 1.95 1.29  1.06 1.08 1.02
2007  1.38 1.48 1.08  1.01 1.25 1.24  1.02 1.07 1.05
2008  0.96 1.01 1.05  1.14 1.45 1.27  0.72 1.39 1.94
2009  1.06 1.21 1.14  0.57 0.68 1.19  1.69 1.94 1.15

Boeing CAE GDLS‐A Raytheon
LP Output Input  LP Output Input  LP Output Input  LP Output Input
1999
2000
2001  4.54 3.37 0.74  0.84 1.15 1.37
2002  0.76 0.33 0.43  0.66 0.51 0.78  1.52 1.28 0.84
2003  1.24 1.56 1.26  1.46 1.39 0.95  1.38 1.48 1.08  0.88 1.28 1.45
2004  1.15 1.26 1.09  1.55 1.65 1.07  0.69 0.60 0.87  0.99 1.22 1.23
2005  0.90 0.88 0.98  0.90 1.16 1.28  1.89 1.88 0.99  0.87 1.36 1.56
2006  1.04 1.14 1.10  0.80 0.95 1.19  1.38 1.30 0.95  1.32 1.19 0.91
2007  0.84 0.99 1.18  0.87 1.39 1.60  0.49 0.44 0.89  1.17 1.30 1.11
2008  0.85 0.86 1.01  1.42 1.62 1.14  1.06 1.01 0.95  0.88 0.98 1.12
2009  1.24 0.91 0.74  1.25 1.13 0.91  0.94 1.00 1.06

Tenix Thales
LP Output Input  LP Output Input
1999
2000
2001
2002  1.30 1.37 1.05  0.77 0.83 1.07
2003  1.46 1.51 1.04  0.77 0.89 1.16
2004  0.98 0.92 0.94  2.56 7.45 2.91
2005  1.10 1.12 1.02  0.86 0.90 1.04
2006  1.08 1.13 1.04  0.78 4.78 6.11
2007  0.77 0.65 0.84  0.83 0.93 1.12
2008  1.26 1.07 0.85
2009  0.70 0.80 1.14
(Source: Calculated by author)


Table A.10 ‐ Australian Defence Industry Productivity Index – Company Weights (Pre‐Data Cleanse)

Weights (based on value added) ADI ASC AA BAE Boeing CAE GDLS‐A Raytheon Tenix Thales 1999 79% 21% 0% 0% 0% 0% 0% 0% 0% 0%

2000 73% 14% 0% 0% 0% 1% 0% 12% 0% 0%

2001 22% 7% 1% 20% 30% 1% 0% 5% 12% 2%

2002 27% 6% 4% 22% 12% 1% 1% 7% 19% 2%

2003 21% 10% 4% 19% 14% 1% 1% 7% 22% 1%

2004 17% 6% 5% 22% 15% 1% 1% 8% 17% 8%

2005 16% 6% 4% 20% 14% 1% 1% 10% 20% 7%

2006 0% 5% 7% 17% 13% 1% 1% 10% 18% 27%

2007 0% 8% 9% 19% 13% 2% 1% 13% 12% 25%

2008 0% 8% 13% 26% 11% 2% 1% 13% 0% 27%

2009 0% 8% 8% 43% 9% 2% 0% 11% 0% 19%

(Source: Calculated by author)

Table A.11 ‐ Australian Defence Industry Productivity Index – Company Weights (Post‐Data Cleanse)

Weights (based on value added) ADI ASC AA BAE Boeing CAE GDLS‐A Raytheon Tenix Thales 1999 0% 0% 0% 0% 0% 0% 0% 0% 0% 0%

2000 0% 0% 0% 0% 0% 0% 0% 0% 0% 0%

2001 22% 7% 1% 20% 30% 1% 0% 5% 12% 2%

2002 27% 6% 4% 22% 12% 1% 1% 7% 19% 2%

2003 21% 10% 4% 19% 14% 1% 1% 7% 22% 1%

2004 19% 7% 6% 23% 16% 1% 1% 8% 19% 0%

2005 16% 6% 4% 20% 14% 1% 1% 10% 20% 7%

2006 0% 5% 7% 17% 13% 1% 1% 10% 18% 27%

2007 0% 8% 9% 19% 13% 2% 1% 13% 12% 25%

2008 0% 8% 13% 26% 11% 2% 1% 13% 0% 27%

2009 0% 15% 13% 0% 15% 4% 0% 20% 0% 33%

(Source: Calculated by author)

Table A.12 ‐ Firm Profit Results (Annual Changes in EBDIT Relative to Previous Year)

ADI ASC AA BAE Boeing CAE GDLS‐A Raytheon Tenix Thales
2001  ‐1.3%  ‐23.1%  ‐383.9%  ‐85.7%
2002  24.7%  ‐65.1%  ‐29.9%  ‐168.0%  195.5%  ‐34.4%  1639.6%  34.1%  89.9%
2003  2.4%  88.9%  117.1%  3.7%  ‐63.5%  156.5%  404.2%  18.4%  36.9%  52.6%
2004  ‐35.3%  96.4%  131.7%  ‐0.1%  147.3%  52.7%  ‐138.9%  10.8%  ‐10.6%  153.6%
2005  ‐30.1%  ‐19.0%  220.8%  46.5%  ‐10.9%  18.3%  ‐132.4%  25.3%  76.5%  ‐20.2%
2006  19.8%  95.3%  6.7%  ‐153.7%  ‐27.8%  393.7%  19.5%  ‐29.7%  66.4%
2007  48.1%  ‐42.6%  ‐27.7%  ‐131.9%  59.9%  ‐71.6%  18.6%  ‐69.6%  ‐42.0%
2008  ‐2.3%  25.6%  ‐64.2%  148.3%  ‐47.2%  ‐23.5%  ‐8.5%  104.8%
2009  ‐32.8%  ‐8.1%  370.1%  ‐33.6%  ‐32.3%  2.3%  20.3%
(Source: Calculated by author)

Table A.13 ‐ Average Annual Employee Compensation (Current Values)

Year  Mining industry  Manufacturing industry  Professional industry  Defence industry
1999  $ 97,419.64  $ 40,417.21  $ 47,360.93  $ 64,123.92
2000  $ 86,173.93  $ 38,607.06  $ 49,418.80  $ 66,874.11
2001  $ 92,068.02  $ 41,805.04  $ 52,275.69  $ 68,887.39
2002  $ 94,276.55  $ 42,334.21  $ 54,881.16  $ 77,389.02
2003  $ 94,115.28  $ 45,950.98  $ 55,431.73  $ 77,672.90
2004  $ 83,762.24  $ 46,763.41  $ 57,618.63  $ 83,412.30
2005  $ 79,543.75  $ 50,391.80  $ 55,326.13  $ 87,261.26
2006  $ 88,027.94  $ 52,628.68  $ 56,369.33  $ 86,315.03
2007  $ 100,545.00  $ 55,410.35  $ 60,795.96  $ 98,564.14
2008  $ 97,621.55  $ 56,277.22  $ 65,610.05  $ 92,500.68
2009  $ 109,359.87  $ 59,742.62  $ 69,438.79  $ 105,996.32
(Source: Calculated by author)


Table A.14 – Company ScoreCard Statistics

Year  Round  Number of contracts reported  Total contract value reported ($ millions)
1999  1  36  $3,296.852
2000  2  54  $15,221.139
2001  3  77  $17,767.767
2001  4  78  $21,354.262
2002  5  89  $23,045.528
2002  6  83  $22,347.564
2003  7  80  $23,051.639
2003  8  77  $24,145.338
2004  9  86  $21,406.977
2004  10  90  $21,610.205
2005  11  84  $24,079.469
2005  12  94  $24,907.966
2006  13  105  $26,531.728
2006  14  94  $26,222.407
2007  15  98  $26,651.127
2007  16  96  $31,213.654
2008  17  96  $33,047.845
2008  18  95  $34,626.219
2009  19  104  $41,372.982
2009  20  103  $35,483.529
(Source: Department of Defence Interim Defence Contract Register)


Charts

Chart A.1 ‐ Australian Defence Industry Labour Productivity Index – Sensitivity Testing Results

(Source: LP indexes calculated by author)


Annex B – Individual defence industry firm MFP and LP empirical results

Annex B presents the empirical results of the individual defence industry firm MFP and LP calculations, which were aggregated to form the Australian defence industry productivity indexes presented in Chapter 6. The following charts display the cumulative labour‐capital MFP and partial LP indexes calculated for each defence industry firm. For each firm, the first chart displays the cumulative MFP index together with cumulative indexes of the firm's input and output, while the second chart displays the cumulative LP index together with the same input and output indexes.
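The cumulative indexes described above are simply chained annual change ratios. The following sketch (the helper name `cumulative_index` is this illustration's own, not code from the thesis) chains ADI's year‐on‐year MFP ratios from Table A.8; small differences from the totals reported in the text reflect rounding in the published ratios.

```python
# Sketch: a cumulative productivity index is built by chaining annual
# change ratios. The ratios are ADI's year-on-year MFP changes from
# Table A.8; the function name is this illustration's own.

def cumulative_index(annual_ratios, base=100.0):
    """Chain annual change ratios into a cumulative index (base = 100)."""
    index = [base]
    for ratio in annual_ratios:
        index.append(index[-1] * ratio)
    return index

adi_mfp_ratios = [1.45, 0.92, 0.95, 0.95, 0.94]  # ADI, 2001-2005 (Table A.8)
idx = cumulative_index(adi_mfp_ratios)
print([round(v, 1) for v in idx])  # [100.0, 145.0, 133.4, 126.7, 120.4, 113.2]
```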

Australian Defence Industries

Chart B.1 below shows the MFP index for Australian Defence Industries (ADI) increased by 12.9% in total over the period of analysis (2.47% per annum).

Chart B.1 ‐ ADI MFP, Input and Output Indexes

(Source: Indexes calculated by author)

Between 2000 and 2001 measured MFP productivity increased by 45.4% in total, while between 2001 and 2005 measured productivity decreased by 22.3% in total (‐6.11% per year). The sudden increase in measured MFP in 2001 may be related to the sale of ADI by the Australian Government in 1999 and the significant corporate restructuring that occurred following the sale (ADI 2003).


On 29 November 1999 all ADI shares were acquired from the Commonwealth of Australia by a joint venture between Transfield Pty Ltd and Thomson‐CSF. During 2000 ADI engaged in significant restructuring activity, which, combined with abnormal accounting losses from the sale, resulted in a one‐off net loss of $123.8 million in 2000 (ADI 2000 p. 3). It is likely that this event caused the spike in measured productivity in 2001, which therefore does not reflect actual changes in MFP. If the 2000 observation is excluded as a result of the restructuring activity that occurred after the 1999 sale, measured MFP for ADI between 2001 and 2005 declines at 6.11% per annum. This decrease is primarily due to an increase in input of 4% over the period while output decreased by 19%.
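The per‐annum figures quoted throughout are geometric averages of the total change over the relevant span. A quick check of the ADI figure (the helper name is this sketch's own):

```python
# Check of the annualisation used in the text: a total change over a
# multi-year span is converted to a geometric per-annum rate. Figures
# are ADI's (-22.3% in total over the four annual steps, 2001-2005).

def per_annum_rate(total_ratio, periods):
    """Geometric average annual growth implied by a total change ratio."""
    return total_ratio ** (1.0 / periods) - 1.0

rate = per_annum_rate(1.0 - 0.223, 4)
print(round(rate * 100, 2))  # -6.11 (% per annum), matching the text
```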

Chart B.2 below shows the LP index for ADI increased by 12.4% in total over the period of analysis (2.37% per annum).

Chart B.2 ‐ ADI LP, Input and Output Indexes

(Source: Indexes calculated by author)

The ADI LP results are almost identical to the ADI MFP result, which is expected considering the labour share of input for the MFP calculation was 80%.
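Why LP tracks MFP so closely when labour dominates input can be seen in a simplified growth‐accounting sketch. This is an illustration only: the output, labour and capital growth figures are assumptions, and the simple share‐weighting shown is a simplification of the index method used in the thesis.

```python
# Simplified growth-accounting sketch: MFP growth is output growth less
# share-weighted input growth, while LP growth ignores capital entirely.
# With a labour share of 0.80 (ADI, Table A.7) the two move together.
# The output/labour/capital growth figures are assumptions for the demo.

def mfp_growth(dy, dl, dk, labour_share):
    """Output growth minus share-weighted growth of labour and capital."""
    return dy - (labour_share * dl + (1.0 - labour_share) * dk)

def lp_growth(dy, dl):
    """Partial labour productivity growth: output growth minus labour growth."""
    return dy - dl

dy, dl, dk = 0.04, 0.01, 0.03   # assumed annual growth rates
s_l = 0.80                      # ADI labour share of input (Table A.7)

print(round(mfp_growth(dy, dl, dk, s_l), 3))  # 0.026
print(round(lp_growth(dy, dl), 3))            # 0.03
```

With a labour share this high, the capital term contributes little, so the two measures rarely diverge by much.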


Australian Submarine Corporation

Chart B.3 below shows the MFP index for Australian Submarine Corporation (ASC) increased by 263% in total over the period of analysis (10.18% per year).

Chart B.3 ‐ ASC MFP, Input and Output Indexes

(Source: Indexes calculated by author)

While overall measured MFP increased by 263%, there appear to be two distinct MFP cycles. Between 1999 and 2004 measured MFP increased by 230%, due to an increase in output of 192% while input decreased by 16.4% over the same period. This rapid increase in measured productivity prior to 2003 coincides with the completion of the production phase of the Collins Class submarines and the transition of ASC into predominantly submarine maintenance‐oriented activity. Although theory anticipates that productivity generally increases towards the end of a production run as production methods become more effective, it is likely that a large portion of the measured productivity growth during the 1999 to 2004 period is due to final completion payments received by ASC, combined with a 17.9% (177 personnel) decrease in employees over the same period.

From 2004 to 2009 measured productivity increased by only 14.52% due to inputs increasing by 64.62%, while output increased by 88.52%. During this period there was also a notable 38% increase in productivity during 2007, which coincides with the commencement of payments


for a $2.7bn Air Warfare Destroyer contract announced on 31 May 2005 (ASC 2007 p. 44). As such, this measured increase in productivity reflects not only changes in productivity but changes caused by a sudden influx of funds as the Air Warfare Destroyer project commenced, as evidenced by the increase in output measured by value added, of which sales revenue is a major component.

Chart B.4 below shows the LP index for ASC increased by 288% in total over the period of analysis (11.16% per annum).

Chart B.4 ‐ ASC LP, Input and Output Indexes

(Source: Indexes calculated by author)

The ASC LP results are almost identical to the ASC MFP result, which is expected considering the average labour share of input during the period for the MFP calculation was 89%.


Australian Aerospace

Chart B.5 below shows the measured MFP index for Australian Aerospace (AA) decreased by 35.9% in total over the period of analysis (‐5.42% per year).

Chart B.5 ‐ AA MFP, Input and Output Indexes

(Source: Indexes calculated by author)

Although measured MFP for AA shows a total 35.9% decrease for the entire period, the majority of this decrease occurred between 2008 and 2009. From 2001 to 2008 measured MFP increased by 10.8% (1.5% per annum), with MFP declining 42.2% in 2009. The decline in MFP in 2009 was caused by a decrease in output of 32.1%, while input increased by 17.3%. Although the increase in input was consistent with the long‐term trend, the drop in output was not. There is no apparent explanation as to why output declined in 2009 but noting that a similar event occurred in 2005, it may simply be part of the AA business cycle.

During the period of analysis AA also experienced rapid growth in its scale of operations. This is evident in the increase in revenue of 1200% combined with an increase in the number of employees from 129 to 2905 between 2001 and 2009. This rapid growth in output and input is likely partially due to AA being awarded over $5.7bn in new contracts, primarily related to the production and maintenance of the Armed Reconnaissance and Troop Lift helicopters. Despite this growth in the size of the firm, it appears that AA is operating at constant returns to scale, based on the output and input indexes trending fairly consistently during the period.

Chart B.6 below shows the measured LP index for AA decreasing by 58.2% in total over the period of analysis (‐10.34% per annum).

Chart B.6 ‐ AA LP, Input and Output Indexes

(Source: Indexes calculated by author)

The AA LP results differ from the AA MFP results, which is likely due to the change in labour share over the period used in the MFP calculation, as seen in Annex A Table A.7 ‐ Australian Defence Industry Labour and Capital Shares – Individual Firm. The share of labour increased from 55% in 2001 to 93% in 2009, indicating that low labour productivity was likely being offset by relatively higher capital productivity during the period.


BAE Australia

Chart B.7 below shows the MFP index for BAE Australia (BAE) increased by 40.7% in total over the period of analysis (4.37% per year).

Chart B.7 ‐ BAE MFP, Input and Output Indexes

(Source: Indexes calculated by author)

The sudden increase in the input and output indexes from 2007 onwards evident in Chart B.7 above is a result of BAE's acquisition of 100% of the issued share capital of Tenix Defence (Tenix) on 27 June 2008 (BAE Systems Australia Holdings 2009 p. 5), which was first reflected in BAE's 2008 Financial Statements. Prior to BAE's acquisition of Tenix (2001 to 2007), BAE's measured MFP increased by 11.6% in total (2.23% per year), with a further 26% total increase in MFP after the 2008 acquisition (approximately 12.2% per year).

Post acquisition, BAE also experienced an increase in the scale of its operations, which is evident in the changes in employee numbers and revenue over the period. Prior to the Tenix acquisition, from 2001 to 2007, BAE experienced a 25% increase in employee numbers (an increase of 550 positions) and a 23% increase in revenue. After 2007, with the consolidation of Tenix, the size of BAE increased by 350% (based on cash revenue), with the number of employees also increasing by 200% (an increase of 2167 positions). As there is nothing unusual relating to the MFP calculations prior to 2007, it is likely that BAE did experience some productivity growth of between 0% and 2.23% per annum over the period. However, as discussed above, it is


unlikely that BAE experienced a 12.2% growth in the two years following its acquisition of Tenix, with this growth more likely attributable to financial anomalies in its post‐acquisition Financial Statements as a result of the consolidation of the two firms.

Chart B.8 below shows the LP index for BAE increased by 38.7% in total over the period of analysis (4.18% per annum).

Chart B.8 ‐ BAE LP, Input and Output Indexes

(Source: Indexes calculated by author)

Similar to the BAE MFP index discussed above, the rapid increase in the input and output indexes is attributed to the consolidation of BAE and Tenix. Prior to 2008 the total increase in LP for BAE was 14.5% (an annual increase of 2.75%), which is 2.9% higher than the MFP result. This indicates that despite labour being allocated a 90% share of the input in the MFP calculation, the capital component had a slightly negative effect on the overall result.


Boeing Australia

Chart B.9 below shows the MFP index for Boeing Australia (Boeing) decreased by 23.8% in total over the period of analysis (‐3.35% per annum).

Chart B.9 ‐ Boeing MFP, Input and Output Indexes

(Source: Indexes calculated by author)

The obvious decrease in measured MFP in 2002 is likely to be the result of Boeing’s disposal of its investment in two significant subsidiary companies (Hawker de Havilland Holdings Pty Ltd and Aerospace Technologies of Australia Limited) on 13 December 2002. This resulted in the gross asset base of Boeing decreasing by 97%, while the number of employees decreased by 55% (1586 positions) between 2001 and 2002. The disposal also resulted in Boeing’s input and output indexes decreasing by 57% and 67% respectively, with the measured MFP decreasing by 29.1%.

Although overall the measured MFP for Boeing indicates a decrease of 23.8%, if the 2001 observation is excluded as an anomaly caused by the disposal of the subsidiaries, measured MFP for Boeing increased by 7.39% in total between 2002 and 2009 (1.02% per annum). However, even during the 2002 to 2009 period two MFP patterns are evident, with Boeing experiencing MFP growth between 2002 and 2004, a steady decline in MFP from 2004 to 2008, and an increase in 2009.


Chart B.10 below shows the measured LP index for Boeing decreasing by 10.3% in total over the period of analysis (‐1.36% per annum).

Chart B.10 ‐ Boeing LP, Input and Output Indexes

(Source: Indexes calculated by author)

As discussed above, excluding the 2001 observation as an abnormal event, between 2002 and 2009 measured LP for Boeing increased by 17.8% (2.33% per annum). Compared to the measured MFP results, the measured LP results are better, indicating that capital productivity was having a negative effect on overall MFP.


CAE Australia

Chart B.11 below shows the measured MFP index for CAE Australia (CAE) increased by 842% in total over the period of analysis (28.4% per year).

Chart B.11 ‐ CAE MFP, Input and Output Indexes

(Source: Indexes calculated by author)

The increase in measured MFP is primarily due to measured output increasing by 1011% over the period, while input increased by 17.9%. The increase in input was attributable to an increase in employee numbers of 47% (47 positions), while the increase in output reflects cash revenue increasing by 541% and materials increasing at the lower rate of 368%. This difference between the two major components of the value added (i.e. output) calculation drove the large increase in output over the period.
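The mechanics here follow from value added being revenue less intermediate inputs such as materials: when revenue grows faster than materials, VA grows faster than either component. A sketch, in which the base‐year levels are assumptions for illustration and only the 541% and 368% growth rates come from the text:

```python
# Value added (VA) = revenue minus intermediate inputs (materials).
# The base-year levels below are assumptions; the growth factors apply
# the text's figures (cash revenue +541%, materials +368%).

def value_added(cash_revenue, materials):
    """Value added: revenue less intermediate (materials) costs."""
    return cash_revenue - materials

va_base = value_added(100.0, 60.0)                 # assumed base year
va_later = value_added(100.0 * 6.41, 60.0 * 4.68)  # end of period
print(round(va_later / va_base, 3))  # VA grows far faster than revenue alone
```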

Similar to other firms discussed above, CAE experienced rapid growth in measured MFP between 1999 and 2004, followed by a decline in MFP between 2004 and 2007. From 2007 to 2009 CAE experienced a significant increase in MFP, which may be a result of changes in economies of scale, as evident in CAE's cash revenue increasing by 148% between 2007 and 2009 and its capital asset base increasing by 100% between 2006 and 2009. In relation to the MFP increase in 2001, it is likely that this measurement was caused by a misalignment in the timing of revenue and expense recognition across 2000 and 2001, with the data indicating that expenses were recognised in 2000 for revenue that was not recognised until 2001.


Chart B.12 below shows the measured LP index for CAE increased by 649% in total over the period of analysis (25.1% per annum).

Chart B.12 ‐ CAE LP, Input and Output Indexes

(Source: Indexes calculated by author)

The primary reason CAE's measured LP is comparatively lower than its measured MFP is that CAE's measured cost of capital component decreased by 27% over the period, combined with the share of capital allocated to the MFP calculation also decreasing by 39% during the period. In the early 2000s the share of capital used in the MFP calculation averaged 50% before gradually decreasing to 26% in 2009.

Overall, both the measured MFP and LP results could also be highly influenced by scale economies operating within CAE as the business expands, which is evident in CAE's cash revenue increasing by 393% over the period and the number of employees peaking at an increase of 63% (62 personnel) in 2008.


General Dynamics Land Systems Australia

Chart B.13 below shows the measured MFP index for General Dynamics Land Systems Australia (GDLS‐A) increased by 26.1% in total over the period of analysis (3.95% per annum).

Chart B.13 ‐ GDLS‐A MFP, Input and Output Indexes

(Source: Indexes calculated by author)

As seen in Chart B.13 above, the measured MFP for GDLS‐A was found to be volatile. The volatility was caused primarily by increases in output in 2003 and again between 2005 and 2006. The exact reason for this is unknown, but it may be partially due to the timing of revenue recognition: during the period cash expenses remained consistent while cash revenue was volatile. This volatility would be negated in the long term but has a significant effect on an annual basis. In contrast to the output index, the input index displays a gradual decrease of 24%, primarily due to a 25% decrease in the number of employees (22 personnel). The impact of capital on the input index is negligible considering the share of labour in the MFP calculation remained over 90% (averaging 93%) during the period of analysis.

Chart B.14 below shows the measured LP index for GDLS‐A increased by 30% in total over the period of analysis (4.5% per annum).


Chart B.14 ‐ GDLS‐A LP, Input and Output Indexes

(Source: Indexes calculated by author)

This result is slightly better than the measured MFP result, indicating that the capital component of the MFP calculation had a negative overall effect on productivity.


Raytheon Australia

Chart B.15 below shows the measured MFP index for Raytheon Australia (Raytheon) increased by 32.9% in total over the period of analysis (3.21% per annum).

Chart B.15 ‐ Raytheon MFP, Input and Output Indexes

(Source: Indexes calculated by author)

While the measured MFP for Raytheon increased by 32.9% in total over the period, the majority of this growth occurred after 2005. Between 2000 and 2005 MFP increased by 1.8% in total (0.4% per year), while between 2005 and 2007 MFP increased by a total of 54.4%. This increase during 2005 to 2007 is likely to have been influenced by the announcement of the Air Warfare Destroyer contract in 2005, in which Raytheon was a prime partner. A decrease in measured MFP of 15.4% between 2007 and 2009 resulted in a more modest total increase of 30.5% between 2005 and 2009 (5.5% per annum).

Between 2000 and 2005 the size of Raytheon increased by 380% (based on revenue), with the number of employees increasing by 330% (759 personnel). This increase in size resulted in a 300% increase in both input and output. After 2005, employee numbers only increased by 25% (increase of 227 personnel), while revenue continued to increase by 186%.

Chart B.16 below shows the measured LP index for Raytheon increased by 24.9% in total over the period of analysis (2.51% per annum).


Chart B.16 ‐ Raytheon LP, Input and Output Indexes

(Source: Indexes calculated by author)

The overall result is 5.9% lower than the measured MFP result, indicating that the capital component of the MFP calculation had a positive overall effect on measured MFP productivity.


Tenix Defence Holdings

Chart B.17 below shows the measured MFP index for Tenix Defence Holdings (Tenix) increased by 62.8% in total over the period of analysis (8.46% per annum).

Chart B.17 ‐ Tenix MFP, Input and Output Indexes

(Source: Indexes calculated by author)

The change in measured MFP primarily reflects changes in output over the period, as input remained stable due to a 4% (76 personnel) decrease in the number of employees being offset by a comparatively similar sized increase in the cost of capital.

The two obvious changes in measured MFP occurred in 2003, when measured MFP increased by 45.3%, and in 2007, when measured MFP decreased by 24.3%. The increase in 2003 was due to cash revenue increasing by 14% while the other factors used to calculate output remained constant; the decrease in 2007 was due to cash revenue decreasing by 15% while the other factors used to calculate output remained constant.

Chart B.18 below shows the measured LP index for Tenix increased by 70.6% in total over the period of analysis (9.32% per annum).


Chart B.18 ‐ Tenix LP, Input and Output Indexes

(Source: Indexes calculated by author)

The overall result is 1.0% better than the measured MFP result, indicating that the capital component of the MFP calculation had a negative effect overall on measured MFP productivity.


Thales Australia

Chart B.19 below shows the measured MFP index for Thales Australia (Thales) decreased by 26.2% in total over the period of analysis (‐3.73% per annum).

Chart B.19 ‐ Thales MFP, Input and Output Indexes

(Source: Indexes calculated by author)

While not entirely evident in the chart above due to the scale used, the measured MFP index for Thales exhibits a high level of volatility. An increase in MFP in 2004 corresponds with Thales Underwater Systems data being manually consolidated with Thales Australia Holdings data for the first time, while the increase in the output and input indexes in 2006 is the result of Thales's acquisition of ADI and the subsequent consolidation of ADI and Thales financial data.

The acquisition of ADI on 16 October 2006 resulted in the number of employees reported by Thales increasing by 630% (2689 positions) and cash revenue increasing by 421%. Despite this increase in scale, from 2005 to 2009 MFP decreased by 44% (approximately ‐13.5% per year). A decrease in measured MFP is not entirely unexpected considering that, prior to being acquired in 2006, ADI as a separate entity experienced a decrease in MFP of 6.11% per year between 2001 and 2005. Considering the consolidated company consists of approximately 20% Thales and 80% ADI (based on each firm's individual 2005 cash revenue), a decreasing MFP appears reasonable. Prior to the ADI acquisition, Thales's measured MFP actually increased by 31.9% in total (7.17% per year).


Chart B.20 below shows the measured LP index for Thales decreased by 25.1% in total over the period of analysis (‐3.55% per annum).

Chart B.20 ‐ Thales LP, Input and Output Indexes

(Source: Indexes calculated by author)

It was anticipated that the measured LP index would be similar to the measured MFP index, considering the labour share allocated to the MFP calculation averaged 95% during the period.


Summary of results

Table B.1 below provides a summary of the measured annual MFP and LP growth or decline experienced by each firm, based on the individual firm results presented above.

Table B.1 ‐ Australian Defence Industry – Individual Firm MFP Estimates

Company  Average annual MFP results  Average annual LP results  Comments (relating only to figures in brackets)

ADI 2.47% (‐6.11%) 2.37% (‐5.81%) Post Commonwealth disposal (2001‐2005)

ASC 10.18% 11.16%

AA ‐5.42% ‐10.43%

BAE 4.37% (2.23%) 4.18% (2.75%) Pre Tenix acquisition (2001‐2007)

Boeing ‐3.35% (1.02%) ‐1.36% (2.33%) Post subsidiary disposal (2002‐2009)

CAE 28.31% 25.09%

GDLS‐A 3.95% 4.5%

Raytheon 3.21% 2.51%

Tenix 8.46% 9.32%

Thales ‐3.73% (7.17%) ‐3.55% (6.84%) Pre ADI acquisition (2000‐2005)

The first column lists the ten defence industry firms examined in the analysis, while the second and third columns present the average annual productivity results calculated for each firm based on an MFP and LP measure, respectively. The comments in the fourth column relate to the productivity results contained within brackets in columns two and three. The bracketed results are the measured productivity results if certain events (as detailed in the individual firm analysis previously in this annex) are excluded, such as company acquisitions and disposals.

As detailed in Table B.1, the individual annual measured MFP results were similar to the annual measured LP results for all firms. This was expected considering the overall average share of labour allocated to the MFP calculation was 85% (see Annex A Table A.7 ‐ Australian Defence Industry Labour and Capital Shares – Individual Firm).

The individual MFP calculations indicate seven companies experienced MFP growth ranging from 2.47% to 28.31% per annum, while three companies experienced MFP decline ranging

from ‐3.35% to ‐5.42% per annum. If abnormal events listed in Table B.1 were excluded, the results would change to indicate eight companies experienced MFP growth while two companies experienced MFP decline.

The individual LP calculations indicate seven companies experienced LP growth ranging from 2.37% to 25.09% per annum, while three companies experienced LP decline ranging from ‐1.36% to ‐10.43% per annum. If abnormal events were excluded the results would change to indicate eight companies experienced LP growth while only two companies experienced LP decline.

It is important for these results to be interpreted in the context of how they were calculated and the assumptions imposed on the methodology used. As discussed previously, the measured productivity indexes are not calculated using “precision tools” but rather provide an indication “of the general orders of magnitudes of change” (Kendrick 1973b Ch 2 p. 12), as they not only measure changes in technology but also include changes in other factors such as efficiency, scale economies and other unaccounted for noise and error (Balk 2001).

The influence of changes in the scale of operations is especially apparent in the case of the Australian defence industry. All of the firms included in the analysis showed signs of a substantial increase or decrease in their scale of operations, which is evident in the employee numbers and revenue displayed in Annex E. Other factors detailed in previous chapters, such as units of measurement, data errors, assumptions (such as labour and capital utilisation) and even the deflators used in the MFP calculations, will all affect the MFP figure calculated for each firm.

Annex summary

On an annual basis the changes in productivity measured at the firm level appear volatile. This would be expected, especially for firms engaged in the manufacture of only a few major products, where productivity gains may not be realised until late in the production process. In addition, individual firms may be more sensitive to external influences and changes in their operating environment, such as mergers and acquisitions. When aggregated to the industry level, however, it is anticipated that many of these unique firm factors will be negated and the overall industry trend will be revealed.


Annex C – Defence industry experimental productivity measurement (value added vs. performance output measure) – empirical results and analysis

The results presented in this annex relate to the defence industry experimental productivity measures that were calculated using a measure of defence industry firm performance, based on the quality of goods and services provided to the Department of Defence, rather than the value added (VA) financial measure of output commonly used in productivity measurement (OECD 2001a).

Method

The experimental calculations of labour productivity (LP) using performance as the measure of output were conducted after the multifactor productivity (MFP) and LP analyses of the defence industry, in which VA was used as the measure of output. The method used for the calculation of the individual firm and industry LP based on performance output was the same as that used in the VA‐based measures detailed in Chapter 4.

A defence industry index was then calculated by aggregating the individual firm results using the VA of each firm as a weight. The VA measure was used as the weighting rather than the performance measure to ensure consistency between the original defence industry LP index and the performance‐based defence industry LP index for comparative purposes.41
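The aggregation step described above can be sketched as follows. This is a minimal illustration, assuming a simple VA‐share weighting of annual firm LP growth rates; the function name and all figures are hypothetical and are not thesis data.

```python
# Sketch of the VA-weighted aggregation described above: each firm's
# labour-productivity (LP) growth is weighted by its share of total
# industry value added (VA). All figures below are illustrative.

def industry_lp_growth(firm_lp_growth, firm_va):
    """Aggregate firm LP growth rates into an industry rate using VA shares."""
    total_va = sum(firm_va.values())
    return sum(g * firm_va[f] / total_va for f, g in firm_lp_growth.items())

lp_growth = {"FirmA": 0.02, "FirmB": -0.01, "FirmC": 0.03}      # annual LP change
value_added = {"FirmA": 400.0, "FirmB": 250.0, "FirmC": 350.0}  # $m VA weights

print(round(industry_lp_growth(lp_growth, value_added), 4))  # → 0.016
```

Using the same VA weights for both the VA‐based and performance‐based industry indexes means any divergence between the two reflects the output measure itself, not the weighting scheme.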

Results – Individual firm

The following charts compare the two LP measures for all ten defence industry firms. The first index is based on a VA measure of output, while the second is based on a Company ScoreCard performance rating assigned by the Department of Defence (refer to Chapter 4 for details on how the Company ScoreCard performance rating was calculated). It should also be noted that the base dates for some firms vary due to the availability of performance data.

41 Testing conducted to determine the difference in the defence industry index calculated using VA rather than the performance score as a weight found the difference to be 0.23% per annum.

Chart C.1 ‐ ADI LP Comparison (VA Output vs. Performance Output)

(Source: Indexes calculated by author)

Chart C.2 ‐ ASC LP Comparison (VA Output vs. Performance Output)

(Source: Indexes calculated by author)


Chart C.3 ‐ AA LP Comparison (VA Output vs. Performance Output)

(Source: Indexes calculated by author)

Chart C.4 ‐ BAE LP Comparison (VA Output vs. Performance Output)

(Source: Indexes calculated by author)


Chart C.5 ‐ BOEING LP Comparison (VA Output vs. Performance Output)

(Source: Indexes calculated by author)

Chart C.6 ‐ CAE LP Comparison (VA Output vs. Performance Output)

(Source: Indexes calculated by author)


Chart C.7 – GDLS‐A LP Comparison (VA Output vs. Performance Output)

(Source: Indexes calculated by author)

Chart C.8 ‐ RAYTHEON LP Comparison (VA Output vs. Performance Output)

(Source: Indexes calculated by author)


Chart C.9 ‐ Tenix LP Comparison (VA Output vs. Performance Output)

(Source: Indexes calculated by author)

Chart C.10 ‐ Thales LP Comparison (VA Output vs. Performance Output)

(Source: Indexes calculated by author)

As seen in the individual firm charts above, the use of performance as a measure of output produced different results from LP calculated using VA as the measure of output. For eight of the ten firms, the overall measured LP using a performance measure of output was lower than when a VA output measure was used. For many of the firms, however, while the magnitudes of the measures differed considerably, there appeared to be some similarities in the overall trend of LP change over the period. The apparent relationship between LP and performance ratings was discussed in detail in Chapter 7 and will not be discussed further here.

Results – Defence industry

The following chart compares the experimental measure of defence industry LP based on a performance measure of output against the defence industry LP index42 calculated with VA as the measure of output. In addition to the two observations removed in the defence industry LP calculation presented in Chapter 6, the 2006 Thales observation was also removed from the defence industry LP (performance output measure) calculation, as it was considered to be influenced more by Thales's acquisition of ADI than by any real change in productivity.

Chart C.11 ‐ Defence Industry LP ‐ VA vs. Performance Measure of Output

(Source: Indexes calculated by author)

Over the entire period, the defence industry LP (VA output measure) index decreased by 1.24% per annum, while the defence industry LP (performance output measure) index decreased by 3.72% per annum. Aside from the spike in the LP (performance output measure) index evident in 2002, the two indexes are similar in their movement over the period of analysis. These results are somewhat unexpected considering the substantial differences between the two indexes at the firm level.

42 The defence industry LP index presented here differs from the one presented in Chapter 6, as the observations relating to GDLS‐A 2002‐2005 and Tenix 2002‐2003 have been removed to allow a direct comparison with the defence industry LP (performance output measure), these observations not being available in the performance index calculation.

Summary

A financial VA measure of output is generally used in the calculation of productivity for two primary reasons. Firstly, it is generally the only type of output measurement for which data are available in a consistent format across firms and industries over time. Secondly, it relies on the basic assumption that changes in the quality and specification of output will be automatically adjusted for via the price consumers are willing to pay for products in a competitive market. Even when a non‐financial measure of output is available, adjusting the measure for differences in the quality and specification of output produced by different firms still presents an issue. In the case of the performance output measure used in the defence industry LP calculation presented above, the results suggest that the performance ratings assigned to each firm via the Defence Material Organisation Company ScoreCard Program43 have taken the quality issue into account. Whether or not this process is conducted knowingly, the Department of Defence's underlying expectations of the quality of the services and products it receives appear to be accounting for changes in quality and specifications in the same way the market generally does when a VA measure of output is used. This process is also likely to have been assisted by the fact that, as the sole customer for defence products, the Department of Defence is able to assess the performance of the industry as a whole, rather than being exposed only to select portions of the market, as is the case for most customers in other industries.

Overall, the results above show that, based on a measure of output derived from the perceived benefits received by the Department of Defence as the consumer of defence industry products, defence industry LP has declined by 3.72% per annum. This decline is 2.48 percentage points greater than the annual decline of 1.24% per annum indicated by the defence industry LP index based on the VA measure of output.
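The per annum figures quoted above can be reproduced from index start and end values as follows. This is a minimal sketch assuming compound annual rates; the function name and index values are illustrative, not thesis data.

```python
# Sketch of the per annum figures quoted above: the compound annual rate
# implied by an index's start and end values, and the gap between two such
# rates expressed in percentage points. Index values are illustrative.

def annual_rate(start, end, years):
    """Compound annual growth rate of an index over `years` periods."""
    return (end / start) ** (1.0 / years) - 1.0

years = 10
# End values constructed so the indexes decline 1.24% and 3.72% per annum.
va_rate = annual_rate(100.0, 100.0 * (1 - 0.0124) ** years, years)
perf_rate = annual_rate(100.0, 100.0 * (1 - 0.0372) ** years, years)

gap_ppts = (va_rate - perf_rate) * 100  # difference in percentage points
print(round(va_rate, 4), round(perf_rate, 4), round(gap_ppts, 2))
```

The gap between two growth rates is expressed in percentage points (here 2.48), since a ratio of the two rates would misleadingly suggest a relative comparison.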

43 Refer to Chapter 3 for a detailed explanation of the Defence Material Organisation Company ScoreCard Program.

Annex D – Defence industry firm profit results – accounting (EBDIT) and economic profit margins

The following charts show the results of the accounting and economic profitability measures calculated for each defence industry firm. These results were used in the aggregation of the Australian defence industry profitability measure presented in Chapter 7.
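The two margins reported below can be sketched as follows, under common definitions: the accounting margin is EBDIT over revenue, while the economic margin further deducts an imputed charge for capital employed. Whether this matches the exact method of Chapter 7 is an assumption; the function names and all figures are illustrative only.

```python
# Sketch of the two profit margins reported in this Annex, under common
# definitions. The economic margin charges EBDIT for the opportunity cost
# of capital (capital employed * required rate of return). All figures are
# illustrative, not taken from any firm's accounts.

def accounting_margin(ebdit, revenue):
    """EBDIT as a share of revenue."""
    return ebdit / revenue

def economic_margin(ebdit, revenue, capital, cost_of_capital):
    """EBDIT less an imputed capital charge, as a share of revenue."""
    economic_profit = ebdit - capital * cost_of_capital
    return economic_profit / revenue

revenue, ebdit = 500.0, 55.0   # $m
capital, wacc = 200.0, 0.10    # capital employed and its required return

print(round(accounting_margin(ebdit, revenue), 3))               # → 0.11
print(round(economic_margin(ebdit, revenue, capital, wacc), 3))  # → 0.07
```

This illustrates why a firm can report a healthy accounting margin yet a negative economic margin, as several firms below do: the accounting measure ignores the return investors require on the capital tied up in the business.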

Australian Defence Industry

Chart D.1 below shows the Australian Defence Industry (ADI) experiencing an average profit margin of 11.06%, with annual profit margins ranging between 4% and 16% during the period of analysis. The average economic profit margin was 3.15%, with annual margins ranging from ‐2% to 6%.

Chart D.1 ‐ ADI Profit Margin

(Source: Profit margins calculated by author)


Australian Submarine Corporation

Chart D.2 below shows the Australian Submarine Corporation (ASC) experiencing an average profit margin of 12%, with annual profit margins ranging between 5% and 21% during the period of analysis. The average economic profit margin was ‐0.34%, with annual margins ranging from ‐6% to 5%.

Chart D.2 ‐ ASC Profit Margin

(Source: Profit margins calculated by author)


Australian Aerospace

Chart D.3 below shows Australian Aerospace (AA) experiencing an average profit margin of 4.36%, with annual profit margins ranging between 1% and 9% during the period of analysis. The average economic profit margin was 2.17%, with annual margins ranging from ‐2% to 7%.

Chart D.3 ‐ AA Profit Margin

(Source: Profit margins calculated by author)


BAE Australia

Chart D.4 below shows BAE Australia (BAE) experiencing an average profit margin of 5.53%, with annual profit margins ranging between ‐16% and 13% during the period of analysis. The average economic profit margin was ‐1.41%, with annual margins ranging from ‐20% to 5%. An important point to note is that prior to BAE's acquisition of Tenix in 2008, BAE's average profit margin and economic profit margin were slightly better at 5.98% and ‐0.08%, respectively.

Chart D.4 ‐ BAE Profit Margin

(Source: Profit margins calculated by author)


Boeing Australia

Chart D.5 below shows Boeing Australia (Boeing) experiencing an average profit margin of 8.9%, with annual profit margins ranging between ‐10% and 21% during the period of analysis. The average economic profit margin was 0.35%, with annual margins ranging from ‐20% to 10%.

Chart D.5 ‐ Boeing Profit Margin

(Source: Profit margins calculated by author)


CAE Australia

Chart D.6 below shows CAE Australia (CAE) experiencing an average profit margin of 17.78%, with annual profit margins ranging between ‐5% and 35% during the period of analysis. The average economic profit margin was 4.33%, with annual margins ranging from ‐24% to 19%.

Chart D.6 ‐ CAE Profit Margin

(Source: Profit margins calculated by author)


General Dynamics Land Systems Australia

Chart D.7 below shows General Dynamics Land Systems Australia (GDLS‐A) experiencing an average profit margin of 20.6%, with annual profit margins ranging between ‐8% and 60% during the period of analysis. The average economic profit margin was 5.53%, with annual margins ranging from ‐13% to 43%.

Chart D.7 ‐ GDLS‐A Profit Margin

(Source: Profit margins calculated by author)


Raytheon Australia

Chart D.8 below shows Raytheon Australia (Raytheon) experiencing an average profit margin of 12.94%, with annual profit margins ranging between 1% and 21% during the period of analysis. The average economic profit margin was 4.58%, with annual margins ranging from ‐7% to 11%.

Chart D.8 ‐ Raytheon Profit Margin

(Source: Profit margins calculated by author)


Tenix Defence Holdings

Chart D.9 below shows Tenix Defence Holdings (Tenix) experiencing an average profit margin of 14.97%, with annual profit margins ranging between 6% and 26% during the period of analysis. The average economic profit margin was 4.97%, with annual margins ranging from ‐1% to 13%.

Chart D.9 ‐ Tenix Profit Margin

(Source: Profit margins calculated by author)


Thales Australia

Chart D.10 below shows Thales Australia (Thales) experiencing an average profit margin of 14.20%, with annual profit margins ranging between 4% and 25% during the period of analysis. The average economic profit margin was 7.41%, with annual margins ranging from ‐4% to 18%. Prior to Thales's acquisition of ADI in 2006, Thales's average profit and economic margins were higher at 19.38% and 13.27%, respectively.

Chart D.10 ‐ Thales Profit Margin

(Source: Profit margins calculated by author)


Annex summary

Table D.1 below is a summary of the individual average firm profit results for the period presented above, relative to manufacturing industry profit data obtained from the ABS.

Table D.1 ‐ Summary Firm Average Profit Margin (Accounting and Economic)

Firm                     Average profit margin    Average profit margin
                         (accounting EBDIT)       (economic)
GDLS‐A                   20.60%                   3.52%
CAE                      17.78%                   3.94%
Tenix                    14.97%                   3.16%
Thales                   14.20%                   6.06%
Raytheon                 12.94%                   4.58%
ASC                      12.00%                   ‐0.34%
ADI                      11.06%                   2.00%
Manufacturing industry    9.85%                   n.a.
Boeing                    8.90%                   ‐1.29%
BAE                       5.53%                   ‐1.16%
AA                        4.36%                   1.78%

(Source: Results calculated by author)

As seen in Table D.1 above, seven of the ten defence industry firms have average accounting profit margins higher than the manufacturing industry accounting average profit margin. While an economic profit margin for the manufacturing industry was not available, seven of the ten defence industry firms had positive average economic profit margins between 1.78% and 6.06%. This level of average economic profit is not unexpected considering the monopoly characteristics evident in the Australian defence market.

These individual results have been aggregated to form a defence industry profit margin, which is presented in Chapter 7.


Annex E – Australian defence industry firms – changes in revenue and employees

The following charts detail changes in the revenue (expressed in constant values) and the number of employees of the ten defence industry firms over the period of analysis.

Chart E.1 ‐ ADI ‐ Changes in Revenue and Employees

(Source: ADI Financial Statements)


Chart E.2 ‐ ASC ‐ Changes in Revenue and Employees

(Source: ASC Financial Statements)

Chart E.3 ‐ AA ‐ Changes in Revenue and Employees

(Source: AA Financial Statements)


Chart E.4 ‐ BAE ‐ Changes in Revenue and Employees

(Source: BAE Financial Statements)

Chart E.5 ‐ Boeing ‐ Changes in Revenue and Employees

(Source: Boeing Financial Statements)


Chart E.6 ‐ CAE ‐ Changes in Revenue and Employees

(Source: CAE Financial Statements)

Chart E.7 ‐ GDLS‐A ‐ Changes in Revenue and Employees

(Source: GDLS‐A Financial Statements)


Chart E.8 ‐ Raytheon ‐ Changes in Revenue and Employees

(Source: Raytheon Financial Statements)

Chart E.9 ‐ Tenix ‐ Changes in Revenue and Employees

(Source: Tenix Financial Statements)


Chart E.10 ‐ Thales ‐ Changes in Revenue and Employees

(Source: Thales Financial Statements)


Annex F – Firm Annual Reports and Financial Statements

Australian Submarine Corporation Pty Ltd Financial Statements and Reports 30 June 1999

Australian Submarine Corporation Pty Ltd Financial 30 June 2001

Australian Submarine Corporation Pty Ltd Annual Report 2003

ASC Pty Ltd Annual Report 2004

ASC Pty Ltd Annual Report 2005

ASC Pty Ltd Annual Report 2006

ASC Pty Ltd Annual Report 2007

ASC Pty Ltd Annual Report 2008

ASC Pty Ltd Annual Report 2009

ADI Limited Financial Report 30 June 2000

ADI Limited Financial Report 31 December 2001

ADI Limited Financial Report 31 December 2003

ADI Limited Financial Annual Report 31 December 2005

ADI Limited Financial Annual Report 31 December 2006

Australian Aerospace Limited and its controlled entities General Purpose Financial Report for the year ended 31 December 2004

Australian Aerospace Limited and its controlled entities General Purpose Financial Report for the year ended 31 December 2005

Australian Aerospace Limited and its controlled entities General Purpose Financial Report for the year ended 31 December 2006

Australian Aerospace Limited and its controlled entities General Purpose Financial Report for the year ended 31 December 2007

Australian Aerospace Limited and its controlled entities Special Purpose Financial Report for the year ended 31 December 2009

BAE SYSTEMS Australia Limited Annual Financial Report 31 December 2002

BAE SYSTEMS Australia Holdings Limited Annual Financial Report 31 December 2004

BAE SYSTEMS Australia Holdings Limited Annual Financial Report 31 December 2006

BAE SYSTEMS Australia Holdings Limited Annual Financial Report 31 December 2007

BAE SYSTEMS Australia Holdings Limited Annual Financial Report 31 December 2009

Boeing Australia Limited (formerly Eurocopter International Pacific Limited and Controlled entity) Annual Report for the year ended 31 December 2002

Boeing Australia Limited Annual Report for the year ended 31 December 2003

Boeing Australia Limited Annual Report for the year ended 31 December 2004

Boeing Australia Limited Annual Report for the year ended 31 December 2005

Boeing Australia Limited Annual Report for the year ended 31 December 2006

Boeing Australia Limited Annual Report for the year ended 31 December 2007

Boeing Defence Australia Limited Annual Report for the year ended 31 December 2009

CAE Electronics (Australia) Pty Limited Financial Report 31 March 2000

CAE Electronics (Australia) Pty Limited Financial Report 31 March 2002

CAE Australia Pty Ltd (formerly CAE Electronics (Australia) Pty Limited) Annual Report 31 March 2003

CAE Australia Pty Ltd Annual Report 31 March 2004

CAE Australia Pty Ltd Annual Report 31 March 2005

CAE Australia Pty Ltd Annual Financial Report 31 March 2006

CAE Australia Pty Ltd Annual Financial Report 31 March 2007

CAE Australia Pty Ltd Annual Report 31 March 2009

Eurocopter International Pacific Limited and Controlled entity General Purpose Financial Report for the year ended 31 December 2002

General Dynamics Land Systems – Australia PTY LTD Financial Statements for the year ended 31 December 2002

General Dynamics Land Systems – Australia PTY LTD Financial Statements for the year ended 31 December 2003

General Dynamics Land Systems – Australia PTY LTD Financial Statements for the year ended 31 December 2004

General Dynamics Land Systems Australia Pty Ltd Annual Report ‐ 31 December 2007

Raytheon Australia Pty Ltd Financial Report – 31 December 2000

Raytheon Australia Pty Ltd Annual Report – 31 December 2002

Raytheon Australia Pty Ltd Annual Report – 31 December 2005

Raytheon Australia Pty Ltd Annual Report – 31 December 2006

Raytheon Australia Pty Ltd Annual Report – 31 December 2007

Raytheon Australia Pty Ltd Annual Report – 31 December 2009

Tenix Defence Systems Pty Limited and Controlled Entities Special Purpose Financial Report for the year ended 30 Jun 2000

Tenix Defence Pty Limited and Controlled Entities Special Purpose Financial Report for the year ended 30 Jun 2002

Tenix Defence Pty Limited and Controlled Entities Special Purpose Financial Report for the year ended 30 Jun 2004

Tenix Defence Pty Limited and Controlled Entities Special Purpose Financial Report for the year ended 30 Jun 2005

Tenix Defence Pty Limited Special Purpose Financial Report for the year ended 30 Jun 2006

Tenix Defence Pty Limited Special Purpose Financial Report for the year ended 30 Jun 2007

Thales International Pacific Holdings Pty Limited Financial Report for the year ended 31 December 2002

Thales Australia Holdings Pty Limited (formerly Thales International Pacific Holdings Pty Limited) Financial Report for the year ended 31 December 2003

Thales Australia Holdings Pty Limited and its controlled entity Special Purpose Financial Report for the year ended 31 December 2005

Thales Australia Holdings Pty Limited and its controlled entity Special Purpose Financial Report for the year ended 31 December 2007

Thales Australia Holdings Pty Limited and its controlled entity Special Purpose Financial Report for the year ended 31 December 2009

Thales Underwater Systems Pty Limited Special Purpose Financial Report for the year ended 31 December 2005

Thales Underwater Systems Pty Limited Special Purpose Financial Report for the year ended 31 December 2006

Thales Underwater Systems Pty Limited Special Purpose Financial Report for the year ended 31 December 2007


Bibliography

AASB – see Australian Accounting Standard

Abramovitz, M 1962, ‘Economic Growth in the United States’, American Economic Review, vol. 52, pp. 762‐82.

ACIL Tasman, 2004, A Profile of the Australian Defence Industry: Helping align defence industry, defence industry policy, and defence strategic planning, Canberra, ACT.

Agapos, AM 1971, ‘Competition in the Defense Industry: An Economic Paradox’, Journal of Economic Issues, vol. 5, no.2, pp. 41‐55.

Agapos, AM & Gallaway, LE 1970, ‘Defense Profits and the Renegotiation Board in the Aerospace Industry’, The Journal of Political Economy, vol. 78, no. 5, pp. 1093‐1105.

Aigner, DJ, Knox Lovell, CA & Schmidt, P 1977, ‘Formulation and Estimation of Stochastic Frontier Models’, Journal of Econometrics, vol. 6, pp. 21‐37.

ANAO – see Australian National Audit Office

Arena, MV & Birkler, J 2009, ‘Determining When Competition Is a Reasonable Strategy for the Production Phase of Defense Acquisition’, Occasional Paper prepared for the Office of the Secretary of Defense, RAND Corporation, Santa Monica, CA.

Assad, SD 2006, ‘Contract Management: DOD Vulnerabilities to Contracting Fraud, Waste, and Abuse’, GAO‐06‐838R DOD Contracting, United States Government Accountability Office, Washington, DC.

Australian Accounting Standard 2001, Employee Benefits, AASB 1028, Australian Accounting Standards Board, viewed 12 December 2010,

Australian Bureau of Statistics, 2009, Experimental Estimates of Industry Multifactor Productivity, Australia: Detailed Productivity Estimates, Cat. No. 5260.0.55.002, viewed 14 September 2010,

Australian Government, 2010, Portfolio Budget Statements 2010‐11, Budget Related Paper No. 1.5A & 1.5C, BSPG, Canberra.

Australian National Audit Office, 2003, Australian Industry Involvement Program, The Auditor‐General Audit Report No. 46 2002‐03 Performance Audit, Canberra, ACT.

190

Australian National Audit Office, 2008, Defence Material Organisation Major Projects Report 2007‐08, The Auditor‐General Audit Report No. 9 2008‐09 Assurance Report, Canberra, ACT.

Aw, BY & Hwang, AR 1995, ‘Productivity and the export market: A firm‐level analysis’, Journal of Development Economics, vol. 47, no. 2, pp. 313‐332.

Palangkaraya, A, Stierwald, A & Yong, J 2009, ‘Is Firm Productivity Related to Size and Age? The Case of Large Australian Firms’, Journal of Industry, Competition and Trade, vol. 9, pp. 167‐195.

Baily, MN & Solow, RM 2001, ‘International Productivity Comparisons Built from the Firm Level’, Journal of Economic Perspectives, vol. 15, no. 3, pp. 151‐172.

Baltagi, BH, Griffin, JM & Rich, DP 1995, ‘The Measurement of Firm‐Specific Indexes of Technical Change’, The Review of Economics and Statistics, vol. 77, no. 4, pp. 654‐663.

Balk, BM 2001, ‘Scale Efficiency and Productivity Change’, Journal of Productivity Analysis, vol. 15, no. 3, pp. 159‐183.

Balk, BM 2007, ‘Measuring Productivity Change without Neoclassical Assumptions: A Conceptual Analysis’, Working paper series no. WP04/2007, Centre for Efficiency and Productivity Analysis, University of Queensland, St. Lucia, Qld.

Banker, RD 1993, ‘Maximum Likelihood, Consistency and Data Envelopment Analysis: A Statistical Foundation’, Management Science, vol. 39, no. 10, pp. 1265‐1273.

Barker, T, Dunne, P & Smith, R 1991, ‘Measuring the Peace Dividend in the United Kingdom’, Journal of Peace Research, vol. 28, pp. 345‐358.

Barros, CP 2002, ‘Small Countries and the Consolidation of the European Defense Industry: Portugal as a Case Study’, Defence and Peace Economics, vol. 13, no. 4, pp. 311–319.

Barros, CP 2004, ‘Measuring Performance in Defense‐Sector Companies in a Small NATO Member‐Country’, Journal of Economic Studies, vol. 31, no.2, pp. 112–128.

Barros, CP 2005, ‘Governance and Incentive Regulation in Defence Industry Enterprises: A Case Study’, European Journal of Law and Economics, vol. 20, no.1, pp. 87‐97.

Barrios, S, Gorg, H & Strobl, E 2005, ‘Foreign direct investment, competition and industrial development in the host country’, European Economic Review, vol. 49, no. 7, pp. 1761‐1784.

Bauer, PW, Berger, AN, Ferrier, GD & Humphrey, DB 1998, ’Consistency Conditions for Regulatory Analysis of Financial Institutions: A Comparison of Frontier Efficiency Methods’, Journal of Economics and Business, vol. 50, pp. 85‐114.

Bishop, P 2003, ‘Collaboration and firm size: some evidence from the UK defence industry’, Applied Economics, vol. 35, pp. 1965‐1969.


Bitzer, J, Geishecker, I & Gorg, H 2007, ‘Productivity spillovers through vertical linkages: Evidence from 17 OECD countries’, Economics Letters, vol. 99, pp. 328‐331.

Black, SE & Lynch, LM 1996, ’Human Capital Investments and Productivity’, American Economic Review, vol. 86, no.2, pp. 263‐267.

Bohi, DR 1973, ‘Profit Performance in the Defence Industry’, The Journal of Political Economy, vol. 81, no. 3, pp. 721‐728.

Booth, AL & Frank, J 1999, ‘Earnings, Productivity, and Performance‐Related Pay’, Journal of Labor Economics, vol. 17, no. 3, pp. 447‐463.

Borgu, A 2004, Capability of First Resort? Australia’s Future Amphibious Requirement, Australian Strategic Policy Institute, 23 July, viewed 1 April 2011,

Bureau of Labor Statistics 2007, Technical Information About the BLS Multifactor Productivity Measures, viewed 3 March 2011,

Burgess, JF & Wilson, PW 1995, ‘Decomposing Hospital Productivity Changes, 1985‐1988: A Nonparametric Malmquist Approach’, The Journal of Productivity Analysis, vol. 6, pp. 343‐363.

Burns, AE 1972, ‘Profit Limitation: Regulated Industries and the Defense‐Space Industries’, The Bell Journal of Economics and Management Science, vol. 3, no. 1, pp. 3‐25.

Business Council of Australia, 1986, The Measurement and Distribution of Productivity, Better Printing Services, Queanbeyan, NSW.

Cadsby, CB, Song, F & Tapon, F 2007, ‘Sorting and Incentive Effects of Pay for Performance: An Experimental Investigation’, Academy of Management Journal, vol. 50, no. 2, pp. 387‐405.

Camm, F 1993, ‘DoD Should Maintain Both Organic and Contract Sources for Depot Level Logistics Services’, Issues Paper, RAND Corporation, Santa Monica, CA.

Cappelen, A, Gleditsch, NP & Bjerkholt O 1984, ‘Military Spending and Economic Growth in the OECD Countries’, Journal of Peace Research, vol. 21, no. 4, pp. 361‐373.

Caves, DW, Christensen, LR & Diewert, WE 1982, ‘The Economic Theory of Index Numbers and the Measurement of Input, Output, and Productivity’, Econometrica, vol. 50, no. 6, pp. 1393‐1414.

Caves, RE 1974, ‘Multinational Firms, Competition, and Productivity in Host‐Country Markets’, Economica, vol. 41, no. 162, pp. 176‐193.

Chan, KC 1997, ‘Defence Industry: Trends in DOD Spending, Industrial Productivity, and Competition’, GAO/PEMID‐97‐3, United States Government Accountability Office, Washington, DC.


Charnes, A, Clark, CT, Cooper, WW & Golany, B 1985, ‘A Developmental Study of Data Envelopment Analysis in Measuring the Efficiency of Maintenance Units in the U.S. Air Forces’, Annals of Operations Research, vol. 2, pp. 95‐112.

Charnes, A, Cooper, WW & Rhodes, E 1978, ‘Measuring the Efficiency of Decision Making Units’, European Journal of Operational Research, vol. 2, no. 6, pp. 429‐444.

Charnes, A, Cooper, WW & Sueyoshi, T 1988, ‘A Goal Programming/Constrained Regression Review of the Bell System Breakup’, Management Science, vol. 34, no. 1, pp. 1‐26.

Cherchye, L & Post, T 2003, ‘Methodological Advances in DEA: A survey and an application for the Dutch electricity sector’, Statistica Neerlandica, vol. 57, no. 4, pp. 410‐438.

Clare, J (Minister for Defence Material) 2011, Reforms to strengthen Australian Defence Industry, media release, 29 June, Department of Defence, Canberra, viewed 19 July 2011,

Clare, R & Johnston, K 1993, ‘Financial Performance of Government Business Enterprises: An Update’, EPAC Background Paper No. 25, Australian Government Publishing Service, Canberra, cited in Gow, ID & Kells, S 1998, ‘The Theory and Measurement of Profitability’, Melbourne Institute Working Paper No. 7/98, Melbourne Institute of Applied Economic and Social Research, University of Melbourne, Melbourne, Vic.

Cobb, CW & Douglas, PH 1928, ‘A Theory of Production’, The American Economic Review, vol. 18, no. 1, pp. 139‐165.

Coelli, TJ, Prasada Rao, DS, O’Donnell, CJ & Battese, GE 2005, An introduction to efficiency and productivity analysis, 2nd edn, Springer, New York.

Cohen, JS, Stevenson, R, Mintz, A & Ward, MD 1996, ‘Defense Expenditures and Economic Growth in Israel: The Indirect Link’, Journal of Peace Research, vol. 33, no. 3, pp. 341‐352.

Cohen, W 1995, ‘Empirical Studies of Innovative Activity’, in P Stoneman (ed.), Handbook of the Economics of Innovation and Technological Change, Blackwell, Oxford.

Combet, G (Parliamentary Secretary for Defence Procurement) 2007, AIR9000 Program – Australian Multi‐role Helicopters Achieve In‐service Milestone, Department of Defence, 18 December, viewed 1 April 2011,

Cowing, TG & Stevenson, RE (eds) 1981, Productivity Measurement in Regulated Industries, Academic Press, New York.

Defence – see The Department of Defence.

Defence Material Organisation, 2005, DMO Scorecard Programs: Company Scorecard and 360° View Scorecard, Defence Publishing Services, Canberra, ACT.


Defence Material Organisation, 2008, Industry Survey 2008, Industry Division, Defence Material Organisation, Canberra, ACT.

Defence Material Organisation, 2010a, Portfolio Budget Statements 2010‐11: Defence Material Organisation, Defence Publishing Services, Canberra, ACT, viewed 18 August,

Defence Material Organisation, 2010b, Defence Procurement Policy Manual, Defence Publishing Services, Canberra, ACT.

Defence Material Organisation, 2011, Defence Material Organisation, Canberra, ACT, viewed 8 August 2011, <http://www.defence.gov.au/dmo/id/cic_contracts/cic_contracts.cfm>.

Denny, M, Fuss, M & Waverman, L 1981, ‘The Measurement and Interpretation of Total Factor Productivity in Regulated Industries, with an Application to Canadian Telecommunications’, in TG Cowing & RE Stevenson (eds), Productivity Measurement in Regulated Industries, Academic Press, New York, pp. 179‐212.

Diewert, WE 1981, ‘The Theory of Total Factor Productivity Measurement in Regulated Industries’, in TG Cowing & RE Stevenson, Productivity Measurement in Regulated Industries, Academic Press, New York, pp. 17‐44.

Diewert, WE 1992, ‘Fisher Ideal Output, Input, and Productivity Indexes Revisited’, The Journal of Productivity Analysis, vol. 3, pp. 211‐248.

DMO – see Defence Material Organisation.

Doucouliagos, H & Laroche, P 2003, ‘What Do Unions Do to Productivity? A Meta‐Analysis’, Industrial Relations, vol. 42, no.4, pp. 650‐691.

Doucouliagos, H & Laroche, P 2009, ‘Unions and Profits: A Meta‐Regression Analysis’, Industrial Relations, vol. 48, no.1 pp. 146‐184.

Fare, R, Grosskopf, S & Knox Lovell, CA 1992, ‘Indirect Productivity Measurement’, The Journal of Productivity Analysis, vol.2, pp. 283‐298.

Farrell, MJ 1957, ‘The Measurement of Productive Efficiency’, Journal of the Royal Statistical Society Series A, vol. 120, no. 3, pp. 253‐282.

Feeny, S & Rogers, M 1998, ‘Profitability in Australian Enterprises’, Melbourne Institute Working Paper No. 21/98, Melbourne Institute of Applied Economic and Social Research, University of Melbourne, Melbourne, Vic.

Fonfria, A & Correa‐Burrows, P 2010, ‘Effects of Military Spending on the Profitability of Spanish Defence Contractors’, Defence and Peace Economics, vol. 21, no. 2, pp. 177‐192.

Forbes, M 2001, ‘Army’s copter choice rejected’, The Age, 1 September, viewed 4 May 2011,

Garvey, A 2007, ‘The impact of IFRS on UK companies’, MSI Global Alliance, viewed 13 May 2011,

Globerman, S 1979, ‘A Note on Foreign Ownership and Market Structure in the United Kingdom’, Applied Economics, vol. 11, pp. 35‐42.

Gow, ID & Kells, S 1998, ‘The Theory and Measurement of Profitability’, Melbourne Institute Working Paper No. 7/98, Melbourne Institute of Applied Economic and Social Research, University of Melbourne, Melbourne, Vic.

Griliches, Z 1988, ‘Productivity Puzzles and R&D: Another Nonexplanation’, The Journal of Economic Perspectives, vol. 2, no. 4, pp. 9‐21.

Griliches, Z 1995, ‘R&D and Productivity: Econometric Results and Measurement Issues’, in P Stoneman (ed.), Handbook of the Economics of Innovation and Technological Change, Blackwell, Oxford, pp. 52‐89.

Griliches, Z 1996, ‘The Discovery of the Residual: A Historical Note’, Journal of Economic Literature, vol. 34, no. 3, pp. 1324‐1330.

Hall, BH, Lotti, F & Mairesse, J 2009, ‘Innovation and Productivity in SMEs: Empirical Evidence for Italy’, Small Business Economics, vol. 33, pp. 13‐33.

Hall, BH, Mairesse, J & Mohnen, P 2010, ‘Measuring the Returns to R&D’, CIRANO Working Papers 2010s‐02, CIRANO.

Hartley, K 2003, ‘The Future of European Defence Policy: An Economic Perspective’, Defence and Peace Economics, vol. 14, no. 2, pp. 107‐115.

Hellman, N 2011, ‘Soft Adoption and Reporting Incentives: A Study on the Impact of IFRS on Financial Statements in Sweden’, Journal of International Accounting Research, vol. 10, no. 1, pp. 61‐83.

Henderson, DJ & Simar, L 2005, ‘A Fully Nonparametric Stochastic Frontier Model for Panel Data’, Discussion Paper 0417, Institut de Statistique, Universite Catholique de Louvain.

Hirsch, BT 1991, Labor Unions and the Economic Performance of Firms, W.E. Upjohn Institute for Employment Research, Kalamazoo, MI, cited in Doucouliagos, H & Laroche, P 2009, ‘Unions and Profits: A Meta‐Regression Analysis’, Industrial Relations, vol. 48, no. 1, pp. 146‐184.

Hirsch, BT & Addison, J 1986, The Economic Analysis of Unions: New Approaches and Evidence, Allen & Unwin, London, cited in Doucouliagos, H & Laroche, P 2009, ‘Unions and Profits: A Meta‐Regression Analysis’, Industrial Relations, vol. 48, no. 1, pp. 146‐184.

Hulten, CR 2001, ‘Total Factor Productivity: A Short Biography’, Working Paper 7471, National Bureau of Economic Research, Cambridge, MA.

IDCR – see Interim Defence Contract Register.

Interim Defence Contract Register, 2010, Department of Defence, updated daily.

Jablonski, M, Rosenblum, L & Kunze, K 1988, ‘Productivity, age and labor composition changes in the U.S.’, Monthly Labor Review, vol. 111, no. 9, pp. 34‐35.

Jorgenson, DW & Griliches, Z 1967, ‘The Explanation of Productivity Change’, Review of Economic Studies, vol. 34, no. 3, pp. 249‐280.

Kalirajan, KP & Obwona, MB 1994, 'Frontier production function: The stochastic coefficients approach', Oxford Bulletin of Economics and Statistics, vol. 56, no. 1, pp. 87‐96.

Kendrick, JW 1965, ‘Summary and Evaluation of Recent Work in Measuring the Productivity of Federal Agencies’, Management Science, vol. 12, no. 4, pp. B120‐B134.

Kendrick, JW 1973a, ‘Recent Productivity Trends in the U.S’, Vital Speeches of the Day, vol. 39, no. 18, p. 562.

Kendrick, JW 1973b, Postwar Productivity Trends in the United States: 1949‐1969, National Bureau of Economic Research, New York, NY.

Kendrick, JW 1977, Understanding Productivity, The Johns Hopkins University Press, Baltimore, MD.

Klette, TJ & Kortum, S 2004, ‘Innovating Firms and Aggregate Innovation’, Journal of Political Economy, vol. 112, no. 5, pp. 986‐1018.

Kovacic, WE 1999, ‘Competition policy in the post consolidation defense industry’, Antitrust Bulletin, vol. 44, no. 2, pp. 489‐556.

Kumbhakar, SC, Park, BU, Simar, L & Tsionas, EG 2007, ‘Nonparametric stochastic frontiers: A local maximum likelihood approach’, Journal of Econometrics, vol. 137, pp. 1‐27.

Kuosmanen, T 2006, ‘Stochastic Nonparametric Envelopment of Data: Combining Virtues of SFA and DEA in a Unified Framework’, MTT Discussion Papers 3/2006, MTT Agrifood Research Finland, Luutnantintie, Finland.

Lichtenberg, FR 1992, ‘A Perspective on Accounting for Defense Contracts’, The Accounting Review, vol. 67, no. 4.

Lim, SH & Knox Lovell, CA 2009, ‘Profit and Productivity of US Class I Railroads’, Managerial and Decision Economics, vol. 30, pp. 423‐442.

Malloy, MC & Brenner, R 1999, ‘A reply to Robert Brenner’, Against the Current, vol. 79, viewed 18 May 2010,

Markowski, S & Hall, P 2006, ‘The economic benefits of defence industries’, CEDA Growth, vol. 57, pp. 40‐49.

McGowan, AS & Vendrzyk, VP 2002, ‘The Relation between Cost Shifting and Segment Profitability in the Defense‐Contracting Industry’, The Accounting Review, vol. 77, no. 4, pp. 949‐969.

Meeusen, W & van den Broeck, J 1977, ‘Efficiency Estimation from Cobb‐Douglas Production Functions with Composed Error’, International Economic Review, vol. 18, pp. 435‐444.

Miller, EM 1978, ‘The Extent of Economies of Scale: The Effects of Firm Size on Labour Productivity and Wage Rates’, Southern Economic Journal, vol. 44, no. 3, pp. 470‐487.

Miller, EM 1981, ‘What Do Labor Productivity Data Show about Economies of Scale: Reply’, Southern Economic Journal, vol. 47, no. 3, pp. 847‐851.

Monczka, RM, Handfield, RB, Giunipero, LC & Patterson, JL 2009, Purchasing & Supply Chain Management, 4th edn, South‐Western Cengage Learning, Mason, OH.

Morikawa, M 2010, ‘Labor unions and productivity: An empirical analysis using Japanese firm‐ level data’, Labour Economics, doi:10.1016/j.labeco.2010.02.009

Morrison, C 1993, A Microeconomic Approach to the Measurement of Economic Performance, Springer‐Verlag, New York, NY.

Nadiri, M 1993, ‘Innovations and Technological Spillovers’, Working paper 4423, National Bureau of Economic Research, Cambridge, MA.

Nelson, B (Minister for Defence) 2007a, $3 Billion Amphibious Ships Will Strengthen ADF, Boost Australian Industry, released 20 June, Department of Defence, Canberra, viewed 16 March 2011,

Nelson, B (Minister for Defence) 2007b, Australia’s Next Generation Air Warfare Destroyer, released 20 June, Department of Defence, Canberra, viewed 16 March 2011,

Nicholson, B & Dodds, M 2011, ‘Warning on job losses in defence industry’, The Australian, 17 February, viewed 6 July 2011.

Nickell, SJ 1996, ‘Competition and Corporate Performance’, The Journal of Political Economy, vol. 104, no. 4, pp. 724‐746.

Nicolini, M & Resmini, L 2010, ‘Which Firms Create Them and Which Firms Really Benefit? FDI Spillovers in New EU Member States’, Economics of Transition, vol. 18, no. 3, pp. 487‐511.

Norsworthy, JR 1984, ‘Growth Accounting and Productivity Measurement’, Review of Income and Wealth, vol. 30, no. 3, pp. 309‐29.

OECD – see Organisation for Economic Co‐operation and Development.

Organisation for Economic Co‐operation and Development, 2001a, Measuring Productivity: Measurement of Aggregate and Industry‐Level Productivity Growth, OECD Manual, Paris, France.

Organisation for Economic Co‐operation and Development, 2001b, Measuring Capital: Measurement of Capital Stocks, Consumption of Fixed Capital and Capital Services, OECD Manual, Paris, France.

Olson, JA, Schmidt, P & Waldman, DM 1980, ‘A Monte Carlo Study of Estimators of Stochastic Frontier Production Functions’, Journal of Econometrics, vol. 13, pp. 67‐82.

Ondrich, J & Ruggiero, J 2001, ‘Efficiency measurement in the stochastic frontier model’, European Journal of Operational Research, vol. 129, pp. 434‐442.

Palangkaraya, A, Stierwald, A & Yong, J 2009, ‘Is Firm Productivity Related to Size and Age? The Case of Large Australian Firms’, Journal of Industry, Competition and Trade, vol. 9, pp. 167‐195.

Pandya, AM & Rao, NV 1998, ‘Diversification and firm performance: an empirical evaluation’, Journal of Financial and Strategic Decisions, vol. 11, no. 2, pp. 67‐81.

Pawsey, ML 2008, ‘Australian Preparer Perceptions towards the Quality and Complexity of IFRS’, Faculty of Law and Management, La Trobe University, viewed 13 May 2011,

Perelman, S 1995, ‘R&D, Technological Progress and Efficiency Change in Industrial Activities’, Review of Income and Wealth, vol. 41, no. 3, pp. 349‐366.

Perramon, J & Amat, O 2006, ‘IFRS Introduction and Its Effect on Listed Companies in Spain’, Universitat Pompeu Fabra, Barcelona, viewed 13 May 2011,

Pink, B 2007, ‘Information Paper: Experimental Estimates of Industry Multifactor Productivity’, Australian Bureau of Statistics Catalogue Number 5260.0.55.001, Canberra, ACT.

Pink, B 2009, ‘Information Paper: Consumer Price Index: Concepts, Sources and Methods’, Australian Bureau of Statistics Catalogue Number 6461.0, Canberra, ACT.

Poole, E & Bernard, JT 1992, ‘Defence Innovation Stock and Total Factor Productivity’, Canadian Journal of Economics, vol. 25, no.2, pp. 438‐52.

Productivity Commission 1999, Microeconomic Reform and Australian Productivity: Exploring the Links, Research Paper, AusInfo, Canberra.

Productivity Commission 2009, The Australian Government, Canberra, ACT, viewed 24 June 2011,

Reith, P (Minister for Defence) 2001, ‘Australia Needs a Strategic Approach to Defence Industry Policy’, paper presented to the Defence National Procurement Conference, National Convention Centre, Canberra, ACT, 26 June.

Rogers, M 1998, ‘Productivity in Australian Enterprises: Evidence from the ABS Growth and Performance Survey’, Melbourne Institute Working Paper No. 20/98, Melbourne Institute of Applied Economic and Social Research, University of Melbourne, Melbourne, Vic.

Rogers, M 1999, ‘Monopoly Power, Innovation and Economic Growth’, The Australian Economic Review, vol. 32, no. 1, pp. 96‐104.

Rogers, M 2003, ‘Competition, Agency and Productivity’, Melbourne Institute Working Paper No. 20/03, Melbourne Institute of Applied Economic and Social Research, University of Melbourne, Melbourne, Vic.

Rogers, M & Tseng, YP 2000, ‘Analysing Firm‐Level Labour Productivity Using Survey Data’, Melbourne Institute Working Paper No. 10/00, Melbourne Institute of Applied Economic and Social Research, University of Melbourne, Melbourne, Vic.

Rogerson, WP 1989, ‘Profit Regulation of Defense Contractors and Prizes for Innovation’, The Journal of Political Economy, vol. 97, no. 6, pp. 1284‐1305.

Schmidt, R 1992, ‘Defense Profit Policy and Capital Investment’, A RAND Graduate School Dissertation, RAND, Santa Monica, CA.

Schreyer, P 2004, ‘Challenges for productivity measurement in OECD countries’, paper presented at the 8th OECD – NBS Workshop on National Accounts, OECD Headquarters, Paris, 6‐10 December 2004.

Seiford, LM & Thrall, RM 1990, ‘Recent Developments in DEA: The Mathematical Programming Approach to Frontier Analysis’, Journal of Econometrics, vol. 46, pp. 7‐38.

Shestalova, V 2003, ‘Sequential Malmquist Indices of Productivity Growth: An Application to OECD Industrial Activities’, Journal of Productivity Analysis, vol. 19, pp. 211‐226.

Smith, S (Minister for Defence) 2011, paper presented at the 2011 Defence and Industry Conference, Adelaide, 28‐30 June, viewed 19 July 2011.

Snowdon, W (Minister for Defence Science and Personnel) 2008, Australian Innovation to Drive Defence Capability, media release 079/2008, released 20 June, Department of Defence, Canberra.

Solow, RM 1957, ‘Technical Change and the Aggregate Production Function’, The Review of Economics and Statistics, vol. 39, no. 3, pp. 312‐320.

Solow, RM 1963, Capital Theory and the Rate of Return, North‐Holland Publishing Company, Amsterdam.

Stansberry, W 1985, ‘New productivity incentive for defense contractors’, Harvard Business Review, vol. 63, no. 1, pp. 156‐158.

Stigler, GJ & Friedland, C 1971, ‘Profits of Defense Contractors’, The American Economic Review, vol. 61, no. 4, pp. 692‐694.

Suarez, JM 1976, ‘Profits and Performance of Aerospace Defense Contractors’, Journal of Economic Issues, vol. 10, no. 2, pp. 386‐402.

The Department of Defence, 2008, Defence Annual Report 2007‐08, viewed 21 June 2011,

The Department of Defence, 2009a, Defence Capability Plan 2009, DPS, Canberra.

The Department of Defence, 2009b, The Strategic Reform Program Delivering Force 2030, DPS, Canberra, ACT.

The Department of Defence, 2009c, Expressions of Interest to Complete a Master of Philosophy in Research for the Defence Materiel Organisation with the University of New South Wales, Department of Defence DEFGRAM No. 577/2009, Canberra.

The Department of Defence, 2009d, The Defence White Paper 2009, DPS, Canberra, ACT.

The Department of Defence, 2010, Building Defence Capability: A policy for a smarter and more agile defence industry base, DPS, Canberra, ACT.

The Senate, 2006, Blue water ships: consolidating past achievements, Standing Committee on Foreign Affairs, Defence and Trade, Canberra, viewed 3 May 2011,

Thomson, M 2006, ‘Competition in Australian Defence Procurement’, Growth, vol. 57, pp. 32‐39.

Trewin, D 2000, ‘Australian System of National Accounts: Concepts, Sources and Methods’, Australian Bureau of Statistics Catalogue no. 5216.0, Canberra, ACT.

Trewin, D 2005, ‘Australian Consumer Price Index: Concepts, Sources and Methods’, Australian Bureau of Statistics Catalogue no. 6461.0, Canberra, ACT.

Tsai, KH & Wang, JC 2005, ‘External technology acquisition and firm performance: A longitudinal study’, Journal of Business Venturing, vol. 23, no. 1, pp. 91‐112.

Van Biesebroeck, J 2007, ‘Robustness of Productivity Estimates’, The Journal of Industrial Economics, vol. LV, no. 3, pp. 529‐569.

Victorian Government 2006, Defence Industry Policy Review: Submission by the Victorian Government, Victorian Government Department of Innovation, Industry and Regional Development, Melbourne, Vic.

Walker, DM 2006, ‘DOD Acquisitions: Contracting for Better Outcomes’, Government Accountability Office GAO‐06‐800T, Washington, D.C.

Worthington, AC 2000, ‘Cost Efficiency in Australian Local Government: A Comparative Analysis of Mathematical Programming and Econometric Approaches’, Financial Accountability & Management, vol. 16, no. 3, pp. 201‐223.
