
THE INTERRELATIONSHIPS BETWEEN TECHNICAL STANDARDS AND

INDUSTRY STRUCTURES: ACTOR-NETWORK BASED CASE STUDIES OF THE

MOBILE WIRELESS AND TELEVISION INDUSTRIES IN THE US AND THE UK

by

DAVID ALBERT TILSON

Submitted in partial fulfillment of the requirements

For the degree of Doctor of Philosophy

Dissertation Advisor: Dr. Kalle Lyytinen

Information Systems Department

Weatherhead School of Management

CASE WESTERN RESERVE UNIVERSITY

May 2008

CASE WESTERN RESERVE UNIVERSITY

SCHOOL OF GRADUATE STUDIES

We hereby approve the dissertation of

David Albert Tilson

candidate for the Ph.D. in Management degree*.

Kalle Lyytinen (signed) (chair of the committee)

Bo Carlsson

Carsten Sørensen

Youngjin Yoo

December 7, 2007 (date)

* We also certify that written approval has been obtained for any proprietary material contained therein.


Copyright © 2008 by David Albert Tilson

All rights reserved


For my wife, Vera, and our children, Hannah and Leah, with love . . . .


Table of Contents

List of tables ...... vii

List of figures ...... viii

Acknowledgements ...... xi

Abstract ...... xii

I. Introduction ...... 1

II. Prior research ...... 6

The economic perspective ...... 6

Non-economic perspectives ...... 27

III. Proposed Theoretical Framework ...... 40

Selecting the primary theoretical perspective ...... 40

Unpacking translation during standards making and adoption ...... 42

Revisiting the research questions and objectives ...... 46

IV. Research Design ...... 50

Research Setting ...... 50

Research Method ...... 57

V. The Early Wireless Industry and mobile wireless systems ...... 71

The early history of radio ...... 71

Fundamentals of radio communications ...... 76

Coordination of spectrum usage and telecom standards creation ...... 80

First generation (1G) analog cellular radio services ...... 87

Second generation (2G) digital cellular radio services...... 103

Discussion of wireless industries in the US and the UK ...... 128


VI. Changes in the Wireless Industry with the transition to 3G ...... 150

Overview of changes in the conceptualization of third generation (3G) wireless ...... 150

New mobile services – SMS and 2.5G packet data ...... 152

The creation of 3G standards in Europe ...... 158

The creation of 3G standards in the US ...... 180

Discussion of 3G standards creation and industry structure ...... 193

VII. The Television Industry and Convergence ...... 236

The early television technology in the UK and the US ...... 236

The television industry in the UK ...... 242

The television industry in the US ...... 247

The Internet as a threat and a new platform for services ...... 253

Discussion about the television industry and convergence ...... 263

VIII. Mobile TV and Video: The emerging story in the USA and the UK ...... 285

The development of mobile video and television ...... 286

Mobile TV and video unicast services ...... 298

Mobile TV and video broadcast services ...... 304

Discussion of mobile TV and video ...... 316

IX. Discussion and Conclusions ...... 325

Revisiting the research questions ...... 325

Contributions, limitations and further research ...... 370

Appendix 1: Interview Guide ...... 379

References ...... 381


List of tables

Table 1. Sub-optimal levels of standardization by coordination mechanism ...... 14

Table 2. Summary of the economic effects of standards ...... 20

Table 3. Example effects of standards upon industry structure (Porter’s five forces) ...... 22

Table 4. Institutional Pressures ...... 36

Table 5. Mapping high-level research questions to sub-questions consistent with the selected actor-network theoretical perspective ...... 48

Table 6. Alignment between research questions, constructs, data collection and analysis ...... 69

Table 7. Common band designations ...... 78

Table 8. Launch of analog cellular systems in major European countries and US (adapted from Garrard, 1997) ...... 100

Table 9. US Case study - details of interviews and interviewees ...... 194

Table 10. UK Case study - details of interviews and interviewees ...... 195

Table 11. Content producer/on-line distributor partnerships (Amobi & Donald, 2007) 262

Table 12. Bundled products offered by major UK service providers, March 2007 (Ofcom, 2007b) ...... 277

Table 13. Channel bundles offered on Orange TV (as of March 2007) ...... 299

Table 14. Channel bundles offered by Sky Mobile TV on Vodafone (March 2007) ...... 302

Table 15. Summary of findings across cases for Research Question 1 ...... 335

Table 16. Summary of findings across cases for Research Question 2 ...... 346


List of figures

Figure 1. Categorization of standards creation processes 12

Figure 2. Standardization strategy formulation for actors in the problematization and interessement phases 44

Figure 3. Central role of standards in wireless industry (Lyytinen & King, 2002) 52

Figure 4. Scatterplot showing the relationship between profitability and competition 56

Figure 5. Venture capital investment by country of management and destination, 57

Figure 6. "2x4" longitudinal case study design 63

Figure 7. Electromagnetic Spectrum ("Electromagnetic radiation," 2007) 73

Figure 8. Technical developments and application of radio leading up to 1G cellular mobile services (adapted from Garrard, 1997, p. 2) 77

Figure 9. Basic components of a cellular radio system 89

Figure 10. Early US cellular handsets 92

Figure 11. Time line of major events in first generation cellular services in USA 93

Figure 12. Time line of major events in first generation (1G) cellular services in UK 102

Figure 13. High-level GSM System Architecture 110

Figure 14. Trend in share of UK mobile subscribers (Source: Financial Times) 115

Figure 15. Time line of major events in second generation (2G) mobile services in UK / Europe 116

Figure 16. Time line of major events in second generation (2G) cellular services in US 127

Figure 17. Key actors and actions in the development of 1G cellular in the US 130


Figure 18. Key actors and actions in the development of 1G cellular in the UK 131

Figure 19. Key actors and actions in the development of 2G in the UK / Europe 135

Figure 20. Key actors and actions in the development of 2G systems in the US 137

Figure 21. Major organizational actors in the wireless industry 138

Figure 22. Migration path for GSM operators to UMTS 170

Figure 23. 3GPP Technical Specification Groups 176

Figure 24. Timeline of major events in third generation (3G) standards creation and adoption in UK/Europe 179

Figure 25. IMT-2000 frequency allocations with respect to existing allocations 180

Figure 26. Migration path for D-AMPS operators to 3G 182

Figure 27. 3G Migration path for cdmaOne operators 185

Figure 28. Timeline of major events in third generation (3G) standards creation and adoption in the USA 192

Figure 29. Major organizational actors in the wireless industry (Tilson & Lyytinen, 2006) 203

Figure 30. Summary of translations involving the network operator 222

Figure 31. Strategy formulation of network operator on introduction of data capability 223

Figure 32. Changes in the US wireless industry during the transition to 3G (Tilson & Lyytinen, 2006) 235

Figure 33. Key actors and actions in the development of the British TV standard 240

Figure 34. Time line of major events in UK television 245

Figure 35. Timeline of major events in US television 252

Figure 36. Percentages of broadband users viewing video on their PCs 258


Figure 37. Overview of TV industry in the UK in 2007 266

Figure 38. Overview of TV industry in the US in 2007 267

Figure 39. Key actors and actions in the development of multi-channel TV and service convergence in the UK 270

Figure 40. Summary of key actors and actions leading to the convergence of residential and computing industries 273

Figure 41. Portable TV receivers 287

Figure 42. In-car video entertainment systems from 1990s/2000s 290

Figure 43. Portable DVD player and example of in-car installation 290

Figure 44. Portable media players 291

Figure 45. "Location Free" base station and wireless tablet 293

Figure 46. SlingBox and Sling Media’s viewer application on a laptop and a mobile phone 295

Figure 47. SCH-x820 CDMA handset with integrated analog TV 297

Figure 48. Handsets supporting Orange's mobile TV offering (at May 2005 launch) 300

Figure 49. Examples of the Vodafone 3G handsets that support "Sky Mobile TV" 301

Figure 50. BT Movio mobile TV architecture (BT_Movio, 2007) 307

Figure 51. Virgin Lobster 700 TV phone 308

Figure 52. UK version of the IMT-2000 / UMTS band plan 312

Figure 53. Model of actor-network and actor-network building in the mobile wireless and television industries 357

Figure 54. Upward flexibility 361

Figure 55. Downward flexibility 362


Acknowledgements

I want to take this opportunity to thank my advisor Kalle Lyytinen for his support and friendship over the last several years. If it wasn’t for finding Kalle on the “same wavelength” I might never have set out on this doctoral research and certainly would not have completed it.

Sincere thanks also to the other members of my committee, Bo Carlsson, Carsten Sørensen, and Youngjin Yoo. Thank you for all your guidance and support. Special thanks to Youngjin and Carsten for their support and participation in the collection of interview data. Thanks also to Carsten’s colleague at the London School of Economics, Jonathan Liebenau, for providing introductions to several executives in the UK telecom industry.

I would also like to express my gratitude to the companies that participated in the study and to the interviewees for their insightful contributions and openness. I also want to thank Case Western Reserve University’s IS department faculty, particularly Matt Germonprez, and its PhD students for their comments on earlier drafts. I am also grateful to Colleen Gepperth and Tedda Nathan for their assistance over the years.

Finally, I must thank my family for their limitless support and encouragement: my parents from my days in Ballycarry onwards, my wife Vera for getting me started on the Ph.D. path, and Lisa Powers for helping Vera and me stay on track.


The Interrelationships between Technical Standards and Industry Structures: Actor-Network Based Case Studies of the Mobile Wireless and Television Industries in the US and the UK

Abstract

by

DAVID ALBERT TILSON

Technical standards ensure compatibility among the components of complex systems. Economists and others have studied standards selection, their effects on competition, and how sub-optimal standardization outcomes vary by the mechanisms used to create them. Actual standards creation has received less attention and their wider effects on industry structures are less understood. This research addresses three questions: (i) how standards making and adoption plays out in the design and implementation of large systems, (ii) how organizational and other actors coordinate with one another, with technology, and with standards, and (iii) how the creation and adoption of standards relates to these patterns of coordination.

These questions are explored using in-depth case studies of the US and UK mobile wireless and television industries. Two cases examine the development of early cellular radio standards and the data capabilities that helped transform the mobile phone into a computing and multimedia platform. Two other cases look at the TV industry, its convergence with telecom, and the emergence of mobile TV services. The cases draw upon archival sources and interviews with 42 executive level interviewees.

The cases show that standards, along with the characteristics of natural phenomena, shape the coordination of technologies and organizations in these industries. Economic and some social theoretical perspectives exhibit too much technological or social determinism to satisfactorily explain the relationships between standards and industry structures observed. The actor-network based process model presented conceptualizes industry changes as the dynamic interactions among actors pursuing their standardization and other strategies. This is extended to incorporate the analytical domains proposed by Lyytinen and King (2002): the innovation space, the marketplace, and the regulatory regime. The resulting model provides a high-level view of the actor-network, and of actor-network building, in both the mobile wireless and television industries.

Analog technologies sustained stable industry structures in the telecom, TV, and other industries by limiting the potential for interaction among them. The digital transition brought many industries closer together as well as providing platforms for service innovation and increased competition. The effects of digitization and the use of digital computing in these industries are conceptualized as a radical increase in the flexibility that key interfaces offer for actor-network building. This vastly increased flexibility is used to explain convergence, the explosion in the number of interfaces requiring standardization, and other industry and standardization changes observed in the cases.


I. Introduction

Public computing services entered the IS research mainstream in the mid-1990s with the growth of the Internet, e-commerce and related services. These services became ever more distributed, personal and mobile as they were delivered first on desktop, laptop and tablet PCs, PDAs and most recently on mobile phones. It is usually difficult to discern the requirements of the (potential) users of public computing services, and because those users are geographically dispersed they are difficult to train and support. The large scale information systems required to offer these services increasingly rely upon heterogeneous technical components (e.g. extensive communication and information infrastructures) developed or operated by autonomous organizations.

The coordination of the technical components cannot be readily achieved using proprietary specifications defined by a single firm and the design, implementation and operational activities cannot be achieved through vertical integration and hierarchical control. Coordination issues are becoming even more complex as web services provided by agents play more important roles in large scale information systems.

The technological components are necessarily coordinated using technical standards1 and the activities of organizations by processes embedded in the relationships among them – which vary from industry wide practices to unique bilateral arrangements. Technical standards have profound and lasting effects in defining markets and structuring competition (David, 1995), and existing industry structures in turn shape technical standards (Markus, Steinfield, Wigand, & Minton, 2006). Examples of the interrelationship between technical standards and industry structure include standard gauges in railways (Gordon, 1996; Puffert, 2000), de facto standard architectures in personal computers (Grove, 1996) and air interfaces for wireless phones (Funk, 2002; Lyytinen & King, 2002).

1 Defined as “a set of technical specifications adhered to by a producer, either tacitly or as a result of a formal agreement” (David & Greenstein, 1990).

The main objective of this dissertation is to develop a theoretical perspective for improving our understanding of how standards making and adoption relates to the patterns of coordination that emerge in an industry involved in the creation of large scale information systems. This objective is expressed in the following research questions:

1. How does technical standards creation and adoption play out in the construction of large scale information systems?

2. How do organizations build their relationships and coordinate with one another and with technology during the construction of large scale information systems?

3. How does standards creation and adoption interact with the ways that organizations build relationships and coordinate with one another and technology? In other words, how do standards interact with industry structure and technical infrastructure?

4. How do existing technical and inter-organizational coordination mechanisms affect the design and implementation of large scale information systems?

In this dissertation we review what economic, social and institutional perspectives have to tell us about the interactions between technical standards and industry coordination patterns. We find that each has significant shortcomings and propose an actor-network based model for how human and non-human actors align their interests to build large scale (socio-technical) information systems.

Many new public computing services are being designed for delivery via mobile wireless platforms. This is happening on a wide scale in Japan and Korea, and many are expecting similar growth in the US and Europe during the coming decade. Classic IS adoption studies (e.g. TAM or diffusion of innovation theory) have primarily looked at how individuals adopt services within organizations. Yet, provisioning mobile services requires the building and integration of extremely complex heterogeneous systems regulated by technical standards and inter-organizational coordination mechanisms (Houssos, Gazis, & Alonistioti, 2004; Olla & Atkinson, 2004). Therefore, before analyzing how individuals can adopt such services we need to examine the creation of the large scale information systems required to offer them.

The interrelatedness of the technical components comprising mobile systems requires strict compatibility at each interface for correct overall operation. The nature of the services requires new forms of coordination to be established among and between organizations in the wireless communication, computing and content industries (Tilson & Lyytinen, 2006). We apply an actor-network based model to the deployment of public computing services delivered to data capable mobile wireless handsets in the US and the UK. Studying the provisioning of these services provides a unique opportunity to examine the dynamics around a disruptive technological shift and the reordering of formerly stable industry structures.

This dissertation contributes to IS research in four ways: First, the actor-network based models presented go beyond much of the standardization literature by examining the dynamics of standards creation and adoption. They provide a systematic way of describing standardization and its role in the building of large socio-technical systems in the mobile wireless and television industries. Secondly, the wireless industry case studies provide rich descriptions of how new classes of large scale public information systems are developed and of the importance of starting conditions for the resulting standards and industry structures. Thirdly, the effects of digitization are conceptualized as a radical increase in the flexibility that key interfaces offer for actor-network building. This vastly increased flexibility in actor-network building is used to explain convergence, the explosion in the number of interfaces requiring standardization, and other industry and standardization changes observed in the case studies. Lastly, by exploring technological and industry change in such a dynamic and uncertain global setting, the study expands the IS field’s knowledge of how the actor-network perspective can be applied at the macro scale with many actors.

The next chapter introduces prior research on standards creation and adoption from both economic and non-economic perspectives. In the third chapter we present a more detailed discussion of the actor-network perspective and present an actor-network based process model of standards creation and adoption. The research setting and the case based research methodology are developed in the fourth chapter along with details of data collection and analysis.

First (1G) and second generation (2G) mobile systems essentially provided a mobile version of traditional fixed telephony. The extension of these systems to include mobile data capabilities and the provision of data retrieval, messaging, and mobile networked computing is a major transition for the mobile wireless industry. Studying the transitions to these third generation (3G) systems in two countries allows us to compare how the transitions are affected by different initial industry structures (chapter 6). To provide an understanding of these initial structures we also examine the creation of the first and second generation wireless systems and services in the US and Europe in some detail (chapter 5).

Given the breadth of actual and potential services that can be delivered on mobile platforms it is not feasible to address them all in detail. Rather than provide superficial coverage of a range of services this dissertation devotes two chapters to developing a deep understanding of the emergence of video services on mobile devices. These services are particularly interesting as they rely on video and radio interface standards, as well as bringing together the well established television and mobile wireless industries. The evolution of the US and UK TV industries and their convergence with the telecom industries is explored in chapter 7. The development of mobile TV and video services is described in detail in chapter 8.

The four research questions are addressed extensively for each of the four industry level case studies (chapters 5 through 8). The final chapter discusses and synthesizes the findings across all the industries. A model of the actor-network, and of actor-network building, in the mobile wireless and television industries is elaborated. The digitization of each of the industries studied had profound effects on their structures and interconnections. A conceptualization of digitization in actor-network terms is presented and its wider implications for practice and research explored.


II. Prior research

Before the publication of classic articles by economists Farrell & Saloner (1985) and Katz & Shapiro (1985), there was surprisingly little in the literature about technical standards (Swann, 2000). Since then considerable research has been published – particularly in economics. Here we review the economics of standards literature and three other streams of literature (Social Construction of Technology, Actor-Network Theory and Institutional Theory) that provide contrasting perspectives on the relationships between technical standards and the ways that organizations coordinate with one another and with technology.

The economic perspective

Much of the economics literature on standards deals directly with the creation and adoption of standards although not specifically within the context of building large scale information systems. In this review of prior economic research we first introduce the concepts of network externalities, switching costs and lock-in which pervade the economics of standards literature. We then examine what economics has to say about each of the research questions posed in the introduction.

Network externalities, switching costs and lock-in

Network externalities (or network effects) are said to be present when the value of a product to a user depends on the number of other users of the product. Leibenstein (1950) first looked at this effect, which he referred to as the "bandwagon effect," by which he meant (Besen, 1999) “the extent to which the demand for a commodity is increased due to the fact that others are also consuming the same commodity. It represents the desire of people to purchase a commodity in order to get into the swim of things; in order to conform with the people they wish to be associated with; in order to be fashionable or stylish; or, in order to appear to be one of the boys." Rohlfs (1974) later applied this perspective to understand how telecommunications networks developed. A telephone’s value increases with the number of other telephone users. Technologies exhibiting positive network externalities, like telephones, experience the “bandwagon effect” when the utility created by a critical mass of users exceeds the costs associated with adopting the technology. Getting the bandwagon rolling for a new technology may require coordination among prospective buyers or underpricing by producers to make the technology more attractive to early adopters. This is especially problematic where the new technology is aimed at replacing an existing technology with existing positive network externalities (Farrell & Saloner, 1985).
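The critical mass idea can be made concrete with a minimal simulation (my illustration, not from the dissertation): assume each user's utility is a standalone value plus a benefit proportional to the installed base, and adoption proceeds only while that utility exceeds the adoption cost. The function name and parameter values below are invented for the example.

```python
# Sketch of a bandwagon: utility(n) = standalone_value + network_benefit * n.
# Adoption continues only while utility exceeds the adoption cost.

def simulate_bandwagon(standalone_value, network_benefit, adoption_cost,
                       population, seed_users, periods=50):
    """Return the installed base over time for a simple adoption model."""
    installed = seed_users
    history = [installed]
    for _ in range(periods):
        utility = standalone_value + network_benefit * installed
        if utility > adoption_cost and installed < population:
            # A fixed fraction of the remaining non-adopters joins each period.
            installed += max(1, int(0.1 * (population - installed)))
        history.append(installed)
    return history

# Critical mass here is (2.0 - 1.0) / 0.02 = 50 users: seeding below it stalls,
# seeding above it starts the bandwagon rolling.
print(simulate_bandwagon(1.0, 0.02, 2.0, population=1000, seed_users=10)[:8])
print(simulate_bandwagon(1.0, 0.02, 2.0, population=1000, seed_users=100)[:8])
```

The sketch also shows why producers might underprice early on: lowering the effective adoption cost moves the critical mass within reach of a small seed population.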

There are two sources of network externalities. Direct network externalities exist where the value of the network increases directly with the number of other users of compatible products or services. For example, the value of fax machines, telephones and computer applications depends in a very direct way on the number of other users of these technologies (Rohlfs, 1974). As direct network externalities affect demand they are sometimes referred to as demand-side externalities. Indirect network externalities are said to exist where each user must possess two or more components to derive the benefits of a network e.g. computer hardware and software (Michael L. Katz & Shapiro, 1994) or VCRs and prerecorded tapes. In a static model the value of hardware (e.g. computer or VCR) does not depend directly on the number of other users of the hardware. However, in a dynamic model indirect network effects emerge from the incremental effect of each user’s adoption of the technology on the future supply of software or compatible hardware. Automobile owners benefiting from having a repair network and a ready supply of parts is another example of indirect network effects (Swann, 2000). Indirect network externalities affect the supply-side via economies of scale and the demand-side (indirectly) by lowering costs.

Prior to committing to a standard, companies and individuals have considerable flexibility. However, once they select a standard they generally stick with it as to do otherwise would incur switching costs (Farrell & Shapiro, 1988). Such customers are said to be subject to lock-in (Cowan, 1990; David, 1985). Others have questioned the sustainability of such lock-in effects (e.g. Liebowitz & Margolis, 1990).

Many papers (e.g. Farrell & Saloner, 1992) examine the use of converters and adapters as a means of bridging networks and accessing network externalities of the dominant network or standard. However, the use of such techniques can slow the process of a single standard achieving dominance (David & Steinmueller, 1990).

Standard creation and adoption (RQ1)

The coordination mechanisms used to create standards include market leadership, negotiation in committees, and government regulation (David & Greenstein, 1990; Farrell & Saloner, 1988). While these mechanisms exhibit differing characteristics each can lead to sub-optimal levels of standardization (too much or too little).


Standards that emerge from adoption in the marketplace are referred to as market or de facto standards. Unsponsored de facto standards are public specifications freely available to all (David & Steinmueller, 1994) whereas sponsored standards are promoted by one or more firms. Under-standardization may occur where the costs of creating the standard would fall unfairly on some firms and where they could not internalize the positive network effects (i.e. the shared benefits are not large enough for a firm to invest in creating a standard).

When standards compete in the marketplace it is not necessarily the technology with the best technical performance that wins. The QWERTY keyboard layout (David, 1985), personal computers and audio/video recording formats are frequently cited examples (Swann, 2000). The winner “is the one that has been most effective at building a wide network of followers, and of support products from third party producers (e.g. software) that conforms to his standard” (Swann, 2000). Lock-in to existing inferior standards (excess inertia) leading to under-standardization (Farrell & Saloner, 1985) is a source of market failure where a superior standard is available.
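The structure of this argument can be expressed as a small coordination game. The payoffs below are hypothetical, chosen only to reproduce the situation Farrell and Saloner describe: compatibility is valuable, the new standard is superior, yet no firm wants to switch alone.

```python
# Hypothetical payoffs illustrating excess inertia: two firms each Stay on the
# old standard or Switch to a superior one; network benefits only accrue when
# both make the same choice.
import itertools

payoffs = {  # (firm1_choice, firm2_choice) -> (firm1_payoff, firm2_payoff)
    ("stay", "stay"):     (2, 2),   # compatible on the old standard
    ("switch", "switch"): (3, 3),   # compatible on the better standard
    ("stay", "switch"):   (1, 0),   # incompatible: network benefits lost
    ("switch", "stay"):   (0, 1),
}

def is_nash(profile):
    """A profile is a Nash equilibrium if no firm gains by deviating alone."""
    for i in (0, 1):
        for alt in ("stay", "switch"):
            deviant = list(profile)
            deviant[i] = alt
            if payoffs[tuple(deviant)][i] > payoffs[profile][i]:
                return False
    return True

for p in itertools.product(("stay", "switch"), repeat=2):
    print(p, "Nash equilibrium" if is_nash(p) else "-")
# Both ("stay","stay") and ("switch","switch") are equilibria: the industry can
# remain locked in to the inferior standard (excess inertia). Excess momentum
# is the mirror case, where a bandwagon tips everyone onto a worse standard.
```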

Conversely, over-standardization can occur where positive network externalities created by the choices of the few start an unstoppable bandwagon rolling (referred to as excess momentum by Farrell and Saloner (1985)) despite its reducing overall social welfare – another source of market failure. A firm may take strategic actions (e.g. price subsidies) to have a proprietary technology become a sponsored standard. The owner of such a standard may be able to extract monopoly rents and leverage its greater knowledge of the standard in the development of complementary products – charges frequently leveled against Microsoft, for example. On the other hand sponsored standards provide strong incentives for technological innovation (Blind, 2004).

Inter-firm cooperation in standards committees can provide a forum for overcoming the coordination problems associated with market standards. However, over-standardization or premature standardization can occur if a committee is rewarded for the quantity of standards produced rather than their economic or social impacts. Under-represented parties (e.g. consumers, small firms or new actors that will later rely on the standard) may be disadvantaged by a standardization agenda driven by strong firms. On the other hand under-standardization (or delayed standardization) can result where committee participants have conflicting interests. A late or inadequate standard can result in the dissipation of potential positive network effects and economies of scale.

A distinction is also commonly made between two types of committee in which standards are developed – official national and international Standards Development Organizations (SDOs) which develop officially recognized de jure standards, and industry consortia which develop industry standards. Industry developed standards can be officially recognized by an SDO and given de jure status (e.g. ANSI recognizes many standards created by the IEEE).

In international SDOs participation is typically at the national level with membership open to all countries in a region or globally. Conflicts of interest result from differing industrial policies or national interests (Schmidt & Werle, 1998). Participation in industry consortia is usually at the organizational or even individual level. Conflicts of interest result from firms playing out differing strategies to internalize the benefits of standardization. Where different alliances of organizations establish competing industry consortia, choosing the appropriate alliance to join can be an important strategic question (Axelrod, 1995).

Adoption of a standard can be promoted by a hybrid of committee coordination and marketplace competition (Farrell & Saloner, 1988). For example, the Open Software Foundation (OSF) and Unix International, Inc. (UII) were competing forums for increasing software compatibility across Unix operating systems in the 1980s (Axelrod, 1995). Participants in these competing standards committees continued to compete in the marketplace. More generally market concerns are likely to play an important role within standards committee negotiations.

Individual rational standards selection decisions based on expectations of future network effects can lead to outcomes that are sub-optimal for an industry or society. These negative effects could, in principle, be overcome using government regulation (e.g. by mandating the use of de jure standards for participation in a particular marketplace). However, government has been described as a “blind giant” in its attempts to influence standards for the public good (David, 1987). Blind, since it cannot foresee the future impact of its decisions on the path of technological development. There may be but a small, or non-existent, “window of opportunity” in which government overcomes its blindness before it loses its power to influence outcomes. Lobbying of government by angry orphans locked in to older standards could also contribute to under-standardization or late standardization of promising new technologies. Recognizing such limitations governments may choose to restrict their involvement to creating frameworks for standardization – leaving standardization itself to private committees. The range of coordination mechanisms used to create standards is illustrated in Figure 1 and the possible sub-optimal outcomes summarized in Table 1.

[Figure 1 depicts a tree: standards creation divides into market (de facto) standards – sponsored (e.g. MS Windows) or unsponsored (e.g. TCP/IP) – committee (formal) standards from standards development organizations (de jure, e.g. GSM) and industry consortia (industry standards, e.g. IEEE 802.3), and government regulation (de jure, e.g. EMC regulations)]

Figure 1. Categorization of standards creation processes

While there has been considerable work in economics on standards creation and adoption, the focus has largely been on how network externalities lead to winner-take-all de facto standards (Weitzel, Beimborn, & König, 2006). There is a bias towards rigorous theoretical approaches using highly stylized models that abstract away almost all the details of the technology and the wider context. The applicability of findings to real world standards creation settings, particularly where standards creation relies on committees, is limited. Even where the performance of committee and market mechanisms is compared (e.g. Farrell & Saloner, 1988) the focus has been on standards selection rather than creation.


Model based theoretical economics suffers from technological determinism (Howcroft, Mitev, & Wilson, 2004) – i.e. since technology/standards are treated as exogenous they are taken to be causes of economic and social outcomes. The identification of the need for standards and the social processes involved in their creation are not addressed except in very limited ways – e.g. alliance building is abstracted to a choice between pre-existing alliances (Axelrod, 1995).

The high degree of interconnectedness of the subsystems in large scale information systems makes it difficult to assume that important compatibility standards are developed independently of one another. The desire for variety in one domain may well be dependent on standardization in a complementary domain or sub-system e.g. the diversity of Internet applications relies on standardized TCP/IP and other low level communication protocols.

In summary, the most important gaps in the economics literature on standards creation and adoption include (a) the omission of the actual creation of the technical standards selected, (b) the lack of an explanation for coordination mechanism selection, and (c) the disregard for the interrelatedness of choices in standards creation.

Relationships among organizations and with technology (RQ2)

Next we review two of the classic economic ways of viewing the patterns of coordination in an industry – the key element of our second research question. From a transaction cost economics (TCE) perspective (Williamson, 1985) the relationships among industry participants and the extent of vertical integration are determined by the cost of the transactions among industry participants. Transaction costs are categorized as (a) search and information costs, (b) bargaining costs, and (c) policing and enforcement costs. Higher transaction costs generally favor vertical integration and hierarchical coordination while low transaction costs favor outsourcing and market based coordination.
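As a toy illustration of this decision logic (the cost figures and function names are invented for the example, not drawn from Williamson), the choice can be framed as picking the governance mode that minimizes the sum of production costs and the three categories of transaction cost named above:

```python
# Sketch: compare the total cost of governing a transaction via the market
# versus in-house (hierarchical) coordination.

def total_cost(production, search, bargaining, enforcement):
    return production + search + bargaining + enforcement

def choose_governance(market, hierarchy):
    """Pick the coordination mode with the lower total cost."""
    return "market" if total_cost(**market) <= total_cost(**hierarchy) else "hierarchy"

# Low market transaction costs favor outsourcing to specialist suppliers...
print(choose_governance(
    market={"production": 80, "search": 5, "bargaining": 5, "enforcement": 5},
    hierarchy={"production": 100, "search": 0, "bargaining": 2, "enforcement": 3}))

# ...while high transaction costs (e.g. around relationship-specific assets)
# favor vertical integration despite higher in-house production costs.
print(choose_governance(
    market={"production": 80, "search": 15, "bargaining": 20, "enforcement": 25},
    hierarchy={"production": 100, "search": 0, "bargaining": 2, "enforcement": 3}))
```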

Market (de facto) standards
– Over-standardization or premature standardization: positive feedback from the network externalities of early choices influences the final outcome; predatory price setting or other strategic action results in a sponsored standard and monopolistic behavior.
– Under-standardization or late standardization: the costs of creating a standard cannot be recovered as the resulting positive externalities cannot be internalized; co-ordination problems; lock-in to old standards.

Committee (SDOs)
– Over-standardization or premature standardization: incentive to produce too many standards; dominant firms or individuals shape the standard.
– Under-standardization or late standardization: interests of dominant firms or individuals hinder the standard.

Committee (industry consortia)
– Over-standardization or premature standardization: interests of dominant firms or individuals shape the standard.
– Under-standardization or late standardization: competing consortia unable to get a bandwagon rolling.

Government institutions
– Over-standardization or premature standardization: blind giant; narrow window for informed action.
– Under-standardization or late standardization: poor understanding of technology development; angry orphans lobby for the old standard.

Table 1. Sub-optimal levels of standardization by coordination mechanism used for standards creation (adapted from Blind, 2004)

Porter’s (1980) methodology for analyzing industry structure includes examining the relationships among firms in the industry as well as with its suppliers and customers. The relationships among firms in the same industry revolve around rivalries shaped by the industry’s growth, product differences, concentration, diversity of competition, fixed costs and the presence of exit barriers. The relationships with customers and suppliers focus on bargaining power. For customers the factors are customer concentration and volume, customer information, impact on quality and performance, ability to backward integrate and price sensitivity. For the relationship with suppliers the factors are the impact of inputs on cost or differentiation, differentiation of inputs, supplier concentration, and the importance of volume to the supplier. The other “forces” influencing the profitability of an industry identified by Porter are barriers to entry, and the availability of substitutes.

In Porter’s discussion of industry rivalry the industry is defined narrowly as those firms producing fairly close substitutes competing with rivals, suppliers and customers.

The five forces model does not consider cooperation among players – for example it omits complementors and strategic alliances (Brandenburger & Nalebuff, 1997). The idea of substitutes in the model is one element of an organization’s relationship with technology – but again it misses products that do, or could conceivably, complement the firm’s current offering (e.g. hardware / software, distribution / content).

The central role of the firm and its most direct competitors in the model also limits the consideration of the relationships among the other participants in the wider industry. The committee and government coordination mechanisms used in standards creation are also largely missing from the model.

Both the transaction cost economics and competitive strategy perspectives are static. Porter’s approach to competitive strategy focuses on identifying and defending a profitable market position, and TCE focuses on firm boundaries in an otherwise fixed value system. The emphasis is not on how technical or process innovation changes industry structures, or how firms can stake out attractive positions in a new structure – despite the importance of innovators in economic history (Chandler, 1977).

By treating technology and standards as exogenous these perspectives can only consider their impact on aspects of industry structure. They are unable to consider the influence of the industry structure or wider aspects of society on the technology or standards. Thus, TCE and competitive strategy can be criticized, along with much of the neoclassical economic perspective, as suffering from technological determinism.

Relationships between standards and the patterns of coordination (RQ3)

Having reviewed the main economic perspectives on (a) standards creation and adoption, and (b) the relationships among organizations and technology, we can now examine the interrelationships among them.

We have noted that model based theoretical economics focuses upon winner-take-all de facto standards. We can deduce that commercially oriented firms have strong incentives to create and sponsor such standards – at least where network externalities can be internalized. Conversely, their competitors (and regulators) may have strong incentives to prevent them. Economists draw upon this theory to provide practitioners with practical advice about how to win standards wars (e.g. Shapiro & Varian, 1998).

While rigorous economics models do not consider the influence of industry structure or industry participants on standards creation, more empirically based economics research does offer some additional insight. In this section we first review what the economic literature has to say about the economic effects of standards in general and on competition in particular. We then map these findings to the transaction cost and Porter’s competitive strategy perspectives. Finally, we consider the interrelationship between innovation and standards, the role of path dependence and look towards a more dynamic conception of the relationship between standardization and industry structure.


Standards and their economic effects

David (1987) proposed three categories of economic problems that standards are intended to solve: compatibility, quality and variety reduction, but later added a fourth, reference (or informational), category (David & Steinmueller, 1994). Real world standards usually fulfill more than one of these functions and often cannot be placed in a single category. Nevertheless the ideas behind the categories provide an excellent means of analyzing and understanding the economic effects of standards.

The compatibility between products based on standards has several benefits that can speed the diffusion of a standard. Increased network externalities allow the use of products in a wider system and promote the production of complementary products and services within the system. The compatibility also reduces obsolescence risk by facilitating the substitution of more advanced components in the future. Compatibility standards promote adoption by reducing switching costs and buyers’ fear of lock-in and can provide a platform for continued technical innovation by reducing uncertainty and risk.

Standards can increase the network externalities for both producers and consumers. Standards can reduce or eliminate the switching costs and lock-in effects associated with proprietary technologies. Standards addressing compatibility are particularly important for information and communications technologies (Swann, 2000).

Minimum quality standards can reduce the information asymmetry between buyers and sellers (Akerlof, 1970). By defining at least the minimum performance of the product or service the standard reduces the uncertainty associated with it and reduces both search and transaction costs. Some minimum quality standards (e.g. safety and environmental standards) can also be used to correct for negative externalities. They also bring with them the danger of “regulatory capture” where high cost producers lobby for standards imposing high quality levels that restrict supply, eliminate lower cost rivals, and thereby increase profits (Leland, 1979).

The reduction in variety prompted by standards allows economies of scale to be realized and focuses the actors in the industry on a particular technological solution. This reduces uncertainty for buyers and sellers, reduces development costs and lowers prices by enabling mass production (David, 1987). However, reductions in price and in uncertainty come at the expense of choice (Dixit & Stiglitz, 1977; Farrell & Saloner, 1986) and lower utility for consumers whose needs are not satisfied by standards based products. Producers competing “within” a standard rather than “between” incompatible standards may well also face increased competition (Swann, 1985).

Standards can also provide information confirming that a product is what it is supposed to be e.g. gasoline grades (Swann, 2000) are information standards. By adhering to an information standard a producer can confirm that a product is what it is supposed to be so that a buyer does not have to perform his own tests (Tassey, 2000). This reduces risks and transaction costs for both producer and buyer. Measurement standards provide similar benefits (Barber, 1987; Swann, 1999). The gasoline example also highlights that such standards can be seen as hybrids of the other three categories as gasoline grades also provide the benefits associated with compatibility (e.g. drivers can fill up at any gas station), variety reduction (e.g. economies of scale in production and distribution of just a few grades) and provide an assurance of minimum quality.


Technical standards documents are also informational as they aid the diffusion of codified technical knowledge. The economic effects of standards are summarized in Table 2.

The technical interrelatedness of the elements in a telecommunications or other network requires strict compatibility at each interface for the correct overall operation of the network. Coordination of the interfaces can be achieved using proprietary specifications defined by a single firm. Alternatively, standards based specifications can be used to allow complementary sub-systems from many firms to be combined within telecommunications networks.

David and Steinmueller (1994) examined the economics of compatibility standards in the telecommunications industry. Standards are seen as reducing transaction and switching costs as well as increasing competition between vendors for compatible sub-systems. They lower barriers to entry by allowing smaller firms to specialize on only parts of complete networks. The potential anti-competitive effects of standards largely revolve around the ability of dominant firms to shape the standards in their own interests and against competitors’ interests. Nevertheless, compatibility standards are seen as a precondition for the deregulation of network based “natural monopolies” such as telecommunications even if there remains the possibility of dominant firms using standards to limit competition.


Compatibility
– Positive economic effects: promotes positive network externalities; avoids lock-in and reduces the possibility of suppliers imposing switching costs (unsponsored standards); reduces entry barriers – firms can specialize on part of a complete system; opens up more possible combinations of compatible products; can provide a platform for innovation.
– Negative economic effects: bandwagon momentum (installed base effects) may give owners of de facto standards unwarranted market power; procurement standards of big purchasers may drive de facto standards (excess inertia/momentum); the incentive to establish a large installed base may lead to predatory pricing; risk of standards setting dominated by major vendors with more resources for R&D and committee participation.

Minimum quality
– Positive economic effects: certification of conformance tends to promote consumer surpluses (reduces information asymmetries and transaction costs); can correct negative externalities.
– Negative economic effects: regulatory capture by high cost producers can limit competition; non-tariff barriers to foreign competition.

Variety reduction
– Positive economic effects: economies of scale; builds focus and critical mass for new technologies; reduces non-functional differentiation and the dimensions of product comparison; reduces monopoly rents; reduces opportunities for leveraging monopoly power through interface manipulation.
– Negative economic effects: less choice may lower utility for some; market concentration; dominant firms can raise entry barriers by channeling innovation towards areas where they retain advantages or IPR; performance requirements can remain ambiguous if only interfaces are defined – residual interoperability uncertainty can increase design and production costs.

Informational
– Positive economic effects: confirms that a product is what it is supposed to be, reducing information asymmetries and the need for buyers’ own testing; reduces risks and transaction costs for producers and buyers; standards documents aid the diffusion of codified technical knowledge.
– Negative economic effects: as for compatibility standards, installed base effects may give the owners of de facto standards unwarranted market power.

Table 2. Summary of the economic effects of standards2

2 Extension of table from Swann (2000)

It is clear that the economic effects of standards (summarized in Table 2) are not overwhelmingly positive or negative. In evaluating the effects of standards on competition David and Steinmueller (1994) stated that “generalizations about standardization being pro-competitive or anti-competitive are virtually certain to be untenable” (p. 220). The economic outcomes depend upon the particular characteristics of the standards and the industries into which they are introduced. So, while the focus continues to be on the economic and competitive effects of standards, we can start to see that commercially oriented firms and regulators have incentives to influence the creation of standards and that the ways they will try to influence them are at least somewhat dependent upon the existing industry structure (e.g. whether there are dominant firms).

Standards, TCE and Porter’s Five Forces

From a transaction cost economics perspective (Williamson, 1985) technologies and standards affect the boundaries of, and the relationships among, industry participants by changing the cost of the transactions among the industry participants (Wigand, Steinfield, & Markus, 2005). Similarly, a competitive strategy perspective would expect the effects of standards on industry structure to appear in the factors underlying industry rivalry, threat of substitutes, entry barriers as well as the bargaining power of suppliers and customers.

The effect of technology on transaction costs has received some attention in the IS literature. For example, the introduction of interorganizational systems (IOS) has the potential to reduce transaction costs. Some literature points to cases where such systems have promoted industry consolidation (Clemons & Row, 1988), increased outsourcing (Kraut, Steinfield, Chan, Butler, & Hoag, 1999; Malone, Yates, & Benjamin, 1987) and more specialization in the value chain (Wigand et al., 2005). In addition information systems generally reduce the barriers to entry in information industries (Clemons, Gu, & Lang, 2003). Standards play their part by reducing the risks for all parties and the need for relationship specific technology investment.

Despite its shortcomings, Porter’s five forces model of industry structure remains useful for conceptualizing the effect of technology and standards on at least some aspects of industry structure (Porter & Millar, 1985). As with the analysis of the general economic effects of standards it is not possible to state simple relationships between standards and industry structure – see Table 3 for a variety of examples of the effects of standards upon industry structure.

Intensity of rivalry among existing competitors
– Increase attractiveness of industry: network externalities of standards drive industry growth; infrastructure standards provide an innovation platform (e.g. TCP/IP); firms differentiate on quality rather than minor functional differences.
– Decrease attractiveness of industry: decreased product differentiation; open standards open closed systems to competition; standards make complex infrastructure investments feasible which, once made, create high fixed costs and exit barriers.

Barriers to entry
– Increase attractiveness of industry: can increase economies of scale; proprietary standards protect differentiation; decreased capital requirements for providers of partial solutions.
– Decrease attractiveness of industry: open standards can erase product differences and reduce switching costs; increased access to suppliers; infrastructure standards can increase costs for network builders; a proprietary standard may attract the attention of regulatory agencies.

Pressure from substitutes
– Increase attractiveness of industry: control over a de facto standard may protect the market from substitutes; the ability to deliver highly customized products from even standard components may not be easy to emulate (e.g. Dell’s strategy).
– Decrease attractiveness of industry: open standards can reduce switching costs; standards based solutions can be more easily imitated.

Bargaining power of customers
– Increase attractiveness of industry: increases a firm’s ability to bypass the influence of the channel; more credible threat of forward integration.
– Decrease attractiveness of industry: easier to compare product prices/features; more credible threat of backward integration; support for the aggregation of customer orders.

Bargaining power of suppliers
– Increase attractiveness of industry: promote the ability to aggregate orders; promote commoditization.
– Decrease attractiveness of industry: threat of forward integration more credible.

Table 3. Example effects of standards upon industry structure (Porter’s five forces)


Standards and Innovation

Innovative technologies can have more radical effects upon the pattern of relationships in an industry than those considered by TCE or the five forces model. They can influence the scope of where firms choose to compete along several dimensions i.e. which geographies, market segments and industry segments to participate in and the level of vertical integration (Porter & Millar, 1985). As standards shape the direction of technological developments through limiting variety, providing a platform for further innovation or reducing uncertainty and risk, they play a role in this broad conception of industry change. Thus, innovative technologies and standards can have a variety of economic effects that impact competition and industry structure in a broad sense. Not only can firm and industry boundaries change, but whole new industries can emerge.

The impact of standards on the direction of technological change increases as they become less a codification of regularities of practice than an early stage of design specification (David, 1995; Lyytinen, Keil, & Fomin, 2008). The creation of anticipatory standards has the character of shared R&D and plays a part in coordinating the roles of organizations in advancing technology development. The creation of anticipatory standards is important in the development of modern information systems – particularly for large scale and heterogeneous information systems, where a myriad of technical components of different sorts and diverse organizations need to be coordinated. Such anticipatory standards shape the subsequent trajectories of technologies and may well also shape the relationships among individuals and organizations. As existing industry structures are likely to play a large part in determining who participates in anticipatory standards making and the interests that they pursue, we can anticipate that industry structure has an influence upon standards making.

The effect of initial conditions (RQ4)

Much of theoretical economics takes a static view of standardization – focusing on finding equilibria for standards selection problems. The dynamic nature of modern technologies, particularly information and communication technologies, limits the usefulness of such perspectives for understanding the creation and diffusion of standards – the narratives around standards creation seldom reflect steady progress towards equilibrium. The concept of path dependence (W. B. Arthur, 1989; David, 2000) recognizes that outcomes are dynamically dependent upon historical events and even the order of these events. David (1985) argued that three conditions lie behind path dependent allocations: technical interrelatedness of system components (e.g. keyboard layout and typist’s skills), increasing returns (e.g. increasing value of the dominant keyboard standard) and quasi-irreversibility of investment (e.g. switching to a different keyboard layout standard). In disputing path dependence, and particularly its implications for market failure, Liebowitz and Margolis (1990; 1995) argue that the “purposeful behavior” of economic actors needs to be accounted for.
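A stylized simulation in the spirit of Arthur’s (1989) increasing-returns model makes the path dependence argument concrete. This is an illustrative sketch with arbitrary parameters, not Arthur’s exact formulation: agents with random natural preferences arrive one at a time and choose between two intrinsically symmetric technologies, each offering returns that grow with its installed base.

```python
# Sketch of increasing-returns adoption: early random choices tip the market
# and lock one technology in, so the "winner" differs from run to run.
import random

def adopt(n_agents=10_000, base=1.0, returns=0.01, rng=None):
    rng = rng or random.Random()
    counts = {"A": 0, "B": 0}
    for _ in range(n_agents):
        prefers = rng.choice("AB")  # agent's natural preference
        payoff = {
            "A": (base if prefers == "A" else 0) + returns * counts["A"],
            "B": (base if prefers == "B" else 0) + returns * counts["B"],
        }
        counts[max(payoff, key=payoff.get)] += 1
    return counts

for seed in range(5):
    print(adopt(rng=random.Random(seed)))
# Once one technology's lead exceeds base/returns (here, 100 adopters), even
# agents who prefer the other technology adopt the leader: quasi-irreversible
# lock-in driven by the order of historical events, not intrinsic merit.
```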

We have found evidence in the economics literature that technical standards can have a range of effects upon industry structure and argued that existing industry structures and patterns of coordination can influence standards making and adoption. We further note that anticipatory standards are particularly important in building large scale information systems. We therefore consider the coordination of technologies through technical

24

standards and the coordination of multiple organizations involved in building large scale

information systems as a dynamic path dependent process. We conceive of path

dependence resulting from the interactions among the underlying positive-feedback

mechanisms, the purposeful, or strategic, action of economic actors, and the occurrence of unforeseeable or accidental events (Puffert, 1999).
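To make the interplay of positive feedback and accidental events concrete, the following minimal sketch (our own illustration; the function, payoff form and parameter values are assumptions, not drawn from the literature cited above) simulates an Arthur-style adoption race between two competing standards:

    import random

    # A minimal, illustrative sketch of an increasing-returns adoption
    # process in the spirit of Arthur (1989). All names and parameter
    # values are assumptions chosen for clarity.
    def simulate(n_adopters=1000, network_bonus=0.05, seed=0):
        rng = random.Random(seed)
        installed = {"A": 0, "B": 0}
        for _ in range(n_adopters):
            # Each adopter draws a random intrinsic preference for each
            # standard and adds a bonus proportional to its installed base.
            payoff = {s: rng.uniform(0, 1) + network_bonus * installed[s]
                      for s in installed}
            choice = max(installed, key=lambda s: payoff[s])
            installed[choice] += 1
        return installed

    # Different seeds reorder the early, essentially accidental choices.
    for seed in range(3):
        print(seed, simulate(seed=seed))

Runs with different seeds lock in to different standards: once one installed base is large enough, the network bonus swamps intrinsic preferences and the allocation becomes quasi-irreversible, which is precisely the path dependent behavior described above.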

Limitations of the economic view of standards creation and adoption

The focus of rigorous theoretical economics on de facto standards and the use of

models with highly restrictive assumptions (Swann, 2000) limits its usefulness for

addressing questions dealing with standards making in the design and implementation of large scale information systems. The stylized models are unable to address how

coordination mechanisms used in standards creation are selected, change with industry or

product maturity, differ according to who or what is being coordinated, or how they relate

to other coordination mechanisms used in the industry. The focus on finding equilibria

leaves a major gap in our understanding of the ways organizations participate in the

dynamic multi-player games that take place in both standards committees (Markus et al.,

2006) and the marketplace. By taking account of only a single choice, the models do not

consider interactions among standards making or selection throughout a value-network,

or the industry level outcomes from a string of decisions. The high degree of subsystem

interconnectedness in very large scale information systems makes it difficult to assume that compatibility standards are developed independently of one another. The desire for

variety in one domain may well be dependent on the standardization in a complementary


domain or sub-system, e.g. the diversity of Internet applications relies upon TCP/IP and

other low level standard protocols (Hanseth, Monteiro, & Hatling, 1996).

More empirically based economics provides evidence of, and a basis for arguing for, the

dynamic, path dependent interrelationship between standards making and adoption, and

the patterns of coordination among organizations (or industry structure). Commercially

oriented firms strive to create and capture the economic benefits associated with

standards or to avoid suffering the potential negative effects. Regulators or governments

may be motivated to overcome the negative social welfare outcomes of under- or over-standardization. Firms and governments interact in the shaping of the standardization and competitive landscapes, e.g. through regulatory capture. Although it does not appear

possible to describe simple relationships between standards making and industry structure, it is apparent that complex and significant relationships do exist and that they are important to industry participants and to any regulators of those industries. Despite comprehensive theorizing about the effects of standardization, there remains a lack of systematic empirical research on whether different approaches to participation in standardization affect firm competitiveness, profitability, knowledge acquisition or other

dependent variables (Mansell, 1995).

There are other important gaps in what economics can tell us about standards creation

in the context of building large scale information systems. The identification of the need

for standards and the social processes involved in their creation in particular have not

been addressed (Mansell & Steinmueller, 2002; Schmidt & Werle, 1998). By treating

technology and standards as exogenous, economics is unable to account for the influence

of the social dynamics and structure upon technological development.


Non-economic perspectives

The two main perspectives dealing with the social shaping of technology (Howcroft

et al., 2004) are Social Construction of Technology (SCOT) and Actor-Network Theory

(ANT). Both attempt to explain the interrelationships between technology and society during the creation and use of technology. They critique the dominance of technological

determinism – views where technology is treated as a given and where social and

organizational change is largely or wholly determined by technology (Bijker, 1995;

Howcroft et al., 2004). In these approaches the technological “black box” is opened and

the role of the social within technology examined.

The Social Construction of Technology

The Social Construction of Technology (SCOT), introduced above as one of the main perspectives dealing with the social shaping of technology (Howcroft et al., 2004), focuses on how the social environment shapes technology during its creation and use. In doing so it opens the technological “black box” and examines the role of the social.

In SCOT relevant social groups play the central role in defining and solving the problems during technology development. The meanings attributed to the developments differ by group, as do their understandings of the problem and their criteria for success and failure. The technology stabilizes (or reaches closure) when the relevant social groups perceive that their problems have been solved. The development of a technology requires on-going re-negotiation among the groups. Where diverse meanings of a supposedly unproblematic technology exist, it is said to possess interpretive flexibility (Pinch & Bijker, 1987).

Bijker (1997) later added technological frames to SCOT’s vocabulary. “A technological frame refers to the structure of rules and practices that enable and constrain the interactions among the actors of a relevant social group. A technological frame is built up when interaction around an artifact builds up. It comprises heterogeneous elements (goals, problem-solving strategies, scientific theories, tacit knowledge, testing procedures, design methods) that influence the interaction with relevant social groups and lead to the attribution of meanings to technical artifacts – and thus to constituting technology” (Howcroft et al., 2004).

Standards creation is a particularly visible example of social shaping of technology.

The negotiation involved in committee standardization allows the interests of many more parties to be considered more quickly than would be the case in market based standardization (Lyytinen et al., 2008; Schmidt & Werle, 1998). At least for standards creation in committees, some of the concepts associated with SCOT are easier to identify than in more general settings. The institutional setting helps explain how some groups become the relevant social groups, while voting mechanisms tend to bring closure and promote stabilization.


SCOT, unlike the economic perspectives, therefore allows us to examine the

influence of the social and the context upon the development of standards and technologies. However, the perspective has been accused of a form of social determinism

since it focuses almost exclusively upon the social and the context at the expense of the

content and structure of the technology.

Actor-Network Theory

In addition to rejecting technological determinism Actor-Network Theory also rejects

the social determinism of SCOT. ANT strives to avoid both types of determinism by

removing the distinction between the social and the natural/technical – regarding them

symmetrically and as “phases of the same essential action” (Latour, 1991, p. 129).

Next we introduce this perspective and consider its application to our research questions.

ANT views the world as networks of technical, natural and social actors (or elements)

and treats them symmetrically. For Latour (1998) “there is nothing but networks.” He

(1992) describes modern societies as having a “fibrous, thread-like character” and argues

that actors are defined solely by their ties to other actors. ANT does not distinguish

between macro and micro actors (i.e. individuals, groups, or organizations). Actors can

also be technical artifacts ranging from the smallest component to the largest system3.

The building of actor-networks is the process of overcoming the resistance of all sorts of

actors and weaving them into networks with other actors (J. Law, 1992). The challenge is

to explore how actor-networks come to generate effects like organizations, power,

innovations, standards, or industry structures.

3 As actors can be human or non-human we purposely use the pronoun “it” rather than “him” or “her.”


The core of ANT analysis is the process of translation (Callon, 1986; Latour, 1987)

where actors align the interests of others with their own. Translation follows three phases.

During problematization, a focal actor frames the problem and defines the identities and

interests of other actors to be consistent with its own interests. The focal actor renders

itself indispensable by defining an Obligatory Passage Point (OPP) under its control that

other actors must pass through to achieve their interests (Callon, 1986). The OPP is typically in the focal actor’s direct path while others may have to overcome obstacles to pass through it (Callon, 1986; Sidorova & Sarker, 2000). For example, control of the

Windows API (an OPP), and the resulting huge actor-network aligned with its interests, gives Microsoft considerable power.

The definition of others’ interests and of the OPP is part of an actor’s strategy for

aligning others’ interests with its own. Other elements might include creating incentives to encourage others to overcome obstacles to passing through the OPP. In the second translation phase, interessement, the focal actor executes these strategies to convince other actors to accept its definition of their interests. The final phase, enrollment, is the moment when another actor accepts the interests defined by the focal actor. Enrollment also includes the definition of roles for actors in the newly created actor-network.

During translation the focal actor assigns interests, projects, desires, strategies, reflexes and afterthoughts (Callon, 1991) to others. Enrollment implies a degree of acceptance of the assigned roles, and this plays a large part in how certain relationships among human and technological actors become inscribed in technical standards and work practices. However, actors may not fully assume the assigned role, and the possibility of resistance through interpretive flexibility allows for re-inscription (Howcroft et al., 2004, p. 346). Thus the outcomes of actor-network building and creating inscriptions can be

unpredictable. For example, a classification scheme for nursing work intended to improve

professional recognition for nurses may also have unforeseen consequences for their

relationships with other actors in healthcare (Bowker, Timmermans, & Star, 1996).

Actor-networks with strong, stable ties can become taken for granted and used as

“packages” or “resources” in the continued construction of actor-networks (Latour,

1987). These “black-boxes” can include agents, devices, texts, relatively standardized

sets of organizational relations, social technologies, boundary protocols or organizational

forms (J. Law, 1992). For example, Bowker et al. (1996) found that a classification

scheme of nursing work acted as a black-boxed political actor. Boland and Schultze

(1996) explained how Activity-Based Costing became black-boxed through the

enrollment of allies. However, black-boxes continue to face resistance – while they are

maintained by being performed and reproduced, no organization, innovation or standard is

ever complete as actors can defect at any time (Callon, 1986).

Black-boxes can exhibit the property of irreversibility – “the extent to which it is

subsequently impossible to go back to a point where that translation was only one

amongst others; and the extent to which it shapes and determines subsequent

translations” (Callon, 1991, p. 150). Irreversibility not only makes it difficult to undo previous translations, but also constrains future possibilities4.

The actor-network perspective, and the growing stream of ANT based research into intangible technologies like standards, provide some insight into the research questions. Several studies have explored the effects of technical interface standards,

4 Irreversibility bears a close resemblance to what David (1985; 2000) referred to as path dependence, in which accidental or serendipitous historical choices limit subsequent economic decisions. Arthur (1989) added that even the order of small events can have a significant effect. One of the key conclusions from path dependency is that while one can’t predict system behavior, it is possible in retrospect to trace the reasons for why it behaved as it did.


process standards and standardized classification schemes on the configuration of socio-technical actor-networks (e.g. Bloomfield, Coombs, Cooper, & Rea, 1992; Bowker et al.,

1996; Hanseth & Monteiro, 1997).

ANT provides the network-building metaphor and a vocabulary for describing the process of standards creation and adoption (RQ1). For example, the creation of an EDI data format standard as part of a national health information infrastructure required the alignment of a wide range of actors, institutional arrangements, work practices and existing standards (Hanseth & Monteiro, 1997).

The network-building for a technical standard starts with, say, an initial idea or the recognition of the need for such a standard. A focal actor or set of actors strives to enroll others: first to agree that a standard is required, then perhaps to engage in designing or negotiating the standard in a committee setting, and finally to adopt it. The initial idea, the interim drafts, and the final standard are not transmitted unaltered (Latour, 1986). Rather they move through space and time in the hands of actors that react to them in different ways (modify, deflect, betray, augment, appropriate or drop). If the actors that come into contact with them do nothing, the standardization process or the diffusion of the standard ceases. The transformation by actors to suit their own needs often entails some loss of control by the initiating actor. For example, Bloomfield et al. (1992) highlighted the interpretive flexibility of information systems by observing the variation in outcomes at different locations due to dissimilar translations of the same system.

Sidorova and Sarker (2000) used ANT to analyze the reasons for the failure of a reengineering project. This, along with Vidgen and McMaster’s (1996) examination of how weak ties can prevent stakeholders from black-boxing an information system, reminds us that actor-network building does not always succeed. The widespread adoption of a standard or its failure to diffuse can be determined by the extent to which focal actors can align the interests of many types of actors (e.g. entrepreneurs and regulators in Lyytinen & Fomin, 2002).

ANT also provides a vocabulary for describing the dynamic relationships among organizations, technical actors and standards (RQ2 and RQ3). In the case of the EDI format the resulting standard was seen to have involved the inscription of behaviors in complex and non-transparent ways (Hanseth & Monteiro, 1997). An information system for the categorization of plants came to represent both plants and taxonomists (Hine,

1995). The black-boxing of the activity-based costing (ABC) approach to management accounting through the enrollment of allies (Boland & Schultze, 1996) highlights the existence of standardized patterns of coordination among social actors. In the case of the classification of nursing work the outcomes for relationships with other actors were seen as uncertain (Bowker et al., 1996). Despite a lack of research at the industry level, these findings provide some indication that standards may indeed influence the ways that organizations coordinate with one another and shape industry structure in general, albeit in unpredictable ways. Of course the standards often emerge from the very same context.

So the relationships among standards and organizations are interactive and dynamic.

ANT also provides a useful way of conceptualizing and describing how standards stabilize and can become irreversible. The concept of irreversibility in particular provides a characteristic in network-building for exploring the effect of initial conditions (RQ4) on the standardization process and its outcomes. For example, infrastructure standards, like the near-ubiquitous Internet protocols, can limit flexibility as they become irreversible


(Hanseth et al., 1996; Monteiro & Hanseth, 1996). Fomin and Lyytinen (2000) explain

the success of the NMT wireless standard in terms of the pre-existing Scandinavian actor-

network configuration. The alignment of interests in the development of a geographical information system in India (Walsham & Sahay, 1996) also indicates that initial conditions, or existing actor-network configurations, can vary by cultural setting.

The ANT literature on standards, while insightful, does not address the actual creation of standards in a systematic way. Most studies explore established standards and their associated stable actor-networks. ANT is generally used as a framework for historical descriptions. There is a distinct lack of attention to the problematization and interessement phases and little discussion of how the actor-networks come to be built through the strategic action of actors in pursuit of their own interests while relating to other actors to make it possible. Moreover, there is a paucity of IS research that uses

ANT at the industry level (Howcroft et al., 2004; McLean & Hassard, 2004; Walsham,

1997).

Institutional Theory

By taking an open system perspective, Institutional Theory allows for organizations to be influenced by their environments. In addition to rational/performance pressures, the institutional perspective also conceives of socially constructed belief, norm and rule systems as exerting considerable influence on organizations.

One focus of institutionalism has been to explain how these institutional pressures have contributed to the homogenization of structures and practices within organizational fields (DiMaggio & Powell, 1983). The main argument is that the adoption of accepted


organizational structures confers legitimacy and that the institutional pressures to adopt

them are transmitted by “network connections.” An institutional perspective would offer

IT researchers a vantage point for conceptualizing the digital economy as an emergent, evolving, embedded, fragmented, and provisional social production that is shaped as much by cultural and structural forces as by technical and economic ones (Orlikowski &

Barley, 2001). As the belief systems and norms underlying these pressures vary over time and from place to place, institutional theory could conceivably provide a lens for studying the emergence of, and change in, the patterns of intra- and inter-organizational coordination

(i.e. industry structures).

Economists and political scientists have focused on regulative institutional pressures

(e.g. Moe, 1984; Williamson, 1975). While sociologists first stressed normative pressures, more recent work in this area has favored cultural-cognitive pressures (e.g. Douglas, 1986; Zucker, 1977). These are Scott’s (2001) pillars of institutions. These pressures, with their differing bases of order and compliance, varying mechanisms and logics, diverse empirical indicators and alternative rationales for establishing legitimacy claims, are shown in Table 4 (W. Richard Scott, 2001; W. R. Scott, 2005).

The role of institutions in shaping organizations has been criticized as being too deterministic. Barley’s (1986) study of the varying impact of the introduction of CT scanners on the organizational structure within radiology departments helped bring human agency back into institutional theory (DiMaggio, 1988). Oliver (1991) suggested

that organizations may well respond to regulative and normative institutional pressures in

many ways (e.g. compromise, avoidance, defiance and manipulation) rather than just

passively complying with them. Institutional forces came to be seen as guiding rather


than determining organizational actions. Subsequent institutional theory has relied more

on interactive and recursive models and has been particularly influenced by Giddens’

structuration theory (Barley & Tolbert, 1997) as can be readily seen by comparing his

modes of structuration with Scott’s pillars (Table 4).

                    | Regulative             | Normative                    | Cognitive-cultural
Basis of compliance | Expedience             | Social obligation            | Taken for granted
Mechanisms          | Coercive               | Normative                    | Mimetic
Logic               | Instrumentality        | Appropriateness              | Orthodoxy
Indicators          | Rules, laws, sanctions | Certification, accreditation | Prevalence, isomorphism
Basis of legitimacy | Legally sanctioned     | Morally governed             | Culturally supported, conceptually correct

(a) Scott’s three pillars of Institutions (W. Richard Scott, 2001)

Structure   | Domination | Legitimation | Signification
(Modality)  | Facility   | Norm         | Interpretive Scheme
Interaction | Power      | Sanction     | Communication

(b) Giddens’ modes of Structuration (Giddens, 1984, p. 29)

Table 4. Institutional Pressures

Committee based standardization forums typically bring with them rules, procedures

and behavioral norms which clearly favor an institutional perspective. For example, in the

telecommunications industry participation in Standards Development Organizations

(SDOs) is at the national level, and this brings with it institutional norms and rules

typically found in bodies dealing with international issues. These differ considerably from those found in industry consortia with organizational or individual level participation (Schmidt & Werle, 1998).


The logics used to define “rational” are themselves socially constructed. The rules, behavioral norms and institutional frameworks bound and define rational arguments and approaches. For example, the Internet Engineering Task Force’s (IETF) “design culture” and the “technical aesthetic” of many of its members may be part of the reason for the failure of business oriented proposals for standards (Nickerson & zur Muehlen, 2006).

The institutional perspective has not been widely applied to the creation or adoption of technical standards. However, its consideration of the creation and use of information systems gives us some idea of how the perspective could handle issues around standards.

For example, SEI’s Capability Maturity Model (CMM) has been seen as an institution that influences, sometimes detrimentally, software development in organizations seeking

CMM certification (Alder, 2005). More generally institutional pressures can lead to isomorphism in development approaches (Nicolaou, 1999). Institutions may also take active roles in shaping technologies. For example, government agencies can take active roles in promoting innovation (King et al., 1994), industry associations can promote technology adoption and diffusion (Damsgaard & Lyytinen, 2001; Damsgaard &

Scheepers, 1999). It has been suggested that Enterprise Information Systems are both objects of institutionalization and carriers of institutional logics (Gosain, 2004;

Kallinikos, 2005).

The structures of the relationships among organizations and between organizations and technologies have an institutional component. The challenge is to identify the relevant institutions and then try to understand how they influence industry structure.

The institutions at this level of analysis include government agencies, trade and industry associations, other organizations in the firm’s value chain (i.e. customers and suppliers),


trend-setting corporations, professional organizations and educational institutions

(Nicolaou, 1999).

In using institutional theory to examine how standards making and adoption interacts

with relationship building, we could examine the pressures created by standards, i.e. treat

standards as institutions themselves. We could examine how the introduction of a

standard (or technologies based upon it) changes existing institutional pressures – say through a process of structuration (Barley & Tolbert, 1997). This latter analysis would also allow us to examine the impact of the initial institutional environment as the introduction of a standard technology may play out differently in different settings

(Barley, 1986). Such a study would probably have to deal with conflicting pressures from different institutions. For example, information systems have been viewed as institutions that can conflict with existing institutions such as organizational structures (Avgerou,

2000) or older information systems (Alvarez, 2001). Globalization also means organizations have to confront multiple institutional environments emanating from differing countries.

There are important limitations in using an institutional perspective as the sole lens for addressing our research question. It does not incorporate the rational/performance pressures that influence standards making/adoption. It looks more at how coercive, mimetic and normative isomorphic processes produce organizations that have very similar structures – “rationalized myths” that give organizations legitimacy. Its literature has relatively little to say about how institutions form, change, stabilize, or dissolve

(Nickerson & zur Muehlen, 2006). Questions about when it is possible to create new rules or norms are not addressed.


We have reviewed what economic, sociological and institutional perspectives have to tell us about how standards making and adoption relates to the patterns of coordination that emerge in an industry involved in the creation of large scale systems. This provides a platform on which to develop our own theoretical framework for these research questions.


III. Proposed Theoretical Framework

In the previous section we critically considered what several theoretical perspectives

can contribute to answering our research question. In this section we summarize the key limitations of each perspective and present our rationale for selecting Actor-Network

Theory (ANT) as our primary theoretical lens. We present an actor-network based

theoretical perspective that provides a conceptual framework for incorporating what we

know about standards making and adoption while providing a methodological basis for

exploring further.

Selecting the primary theoretical perspective

Economic perspectives provide considerable insight into the economic and competitive effects of technical standards (albeit with limited generalizability). However, as we are examining the creation of standards and the construction of large scale information systems, technologically deterministic perspectives are not suitable as a primary lens. Technology cannot be taken as a given but must be endogenous to the selected approach. As our research question also deals with the socio-economic effects of standards, it would also be inappropriate to adopt approaches exhibiting social determinism (e.g. SCOT). Basing the research on such perspectives would not allow us to consider the dynamic interactions among industry participants and technologies.

Some institutional theory based studies have examined the impact of technology upon social relations and others the institutional pressures of technology. However, there is a


lack of empirical work that looks at the dynamic two-way interactions among technologies and organizations. Its sole focus on long term institutional pressures limits its applicability, as we cannot so readily ignore rational/performance pressures. The theory’s relatively limited ability to deal with how institutions form, change, stabilize or dissolve, along with its underdeveloped view of technology, also makes it poorly suited to looking at the creation of complex large scale information systems.

Actor-Network Theory (ANT) is based on a viewpoint that it is not possible to study the social or the technical in isolation (Latour, 2005). We deploy ANT as the primary theoretical lens to examine the socio-technical means through which agreements are reached during standards making and adoption as well as the wider structuring of the relationships among organizational actors. It lends itself to the consideration of hybrids of human and non-human elements (Walsham, 1997) and avoids both technological and social determinism. It provides theoretical and methodological bases for incorporating the persistent social and technical pressures associated with institutional actors as well as short-term rational/performance pressures (including those at the center of economic perspectives).

However, ANT is just a conceptual and methodological framework. It does not in itself tell us anything about standardization, its effects on the relations among organizations building large scale information systems, or about the types of connections that exist among actors. Next we unpack the translation processes actors undertake in standards making and adoption by focusing on how actors formulate diverse strategies to pursue their own interests and how they relate to others to make it possible.


Unpacking translation during standards making and adoption

Technical standards making entails multiple translations that include: initiation of the standardization effort, agreement on objectives and specifications, adoption by organizational actors and, finally, the diffusion of products and services based on the standard.

Translations among human and non-human actors must take place at each stage for standardization to progress. Emerging standards have many characteristics (e.g. scope, flexibility, procedure for definition, legal status, and enforcement procedure) which are subject to multiple translations among many actors as the standard, and its characteristics, are incorporated into the actor-network and subsequently influence on-going translations.

A standard is both shaped by translations with, and among, other actors, and it in turn shapes the actor-network in which it becomes embedded. Standardization is thus an on-going interactive process of translating and discovering new Obligatory Passage Points

The three phases of translation are central to our analysis of how standards making and adoption plays out in the design of very large scale information systems.

Problematization By considering the benefits it hopes to realize, an actor decides whether it wants to initiate a standardization effort. During problematization the actor assigns the identities, interests, values, projects, desires, strategies, reflexes and afterthoughts (Callon, 1991) (hereafter referred to simply as “interests”) to other actors to be consistent with its own. The actor tries to render itself indispensable by defining an

OPP under its control that other actors must pass through to achieve their interests


(Callon, 1986). The OPP is typically in the actor’s direct path while others may have to overcome obstacles to pass through it (Callon, 1986; Sidorova & Sarker, 2000). The

definition of others’ interests and of the OPP is part of the actor’s standardization

strategy. Other elements might include creating incentives to encourage others to

overcome obstacles to passing through the OPP. We define the set of strategies created to

enroll other actors as a standardization strategy.

In the case of an actor trying to create a standard using market based mechanisms, its

standardization strategy includes defining the interests of the actors it wants to adopt the

technology. There may well be other heterogeneous elements, including strategies to enroll technical actors, which make up the standardization strategy. For committee

standards initial strategies include ways of defining other actors’ interests so that they

adopt the objectives of the standardization effort and participate in the committee

process. Prior to this at least one actor must have problematized the selection of either

market or committee based mechanisms for standards creation.

In formulating standardization strategies actors hypothesize or imagine alternative

actor-network configurations (Callon, 1986; Latour, 1995). The actors’ interests shape

their preferences for alternative configurations. Their perception of existing actor-

network configurations provides an understanding of the translations required to realize

alternative future configurations. Thus we can conceive of a model of strategy

formulation in which the actors’ interests are mediated by how they imagine future actor-

network configurations and the translations required to bring them about (Figure 2).

Standardization strategies may be based on more, or less, complete or sophisticated

models of existing and imagined actor-network configurations. We in no way imply full


knowledge of actor-network configurations nor infallible strategy formulation and

execution. We allow for actors’ recognizing the uncertainty around actor-network building and for their simultaneously imagining multiple possible futures and creating strategies they believe will make the more attractive futures (i.e. those more aligned with their own interests) more likely and unattractive futures less so.

We only conceive of actors with at least some human component as creating standardization strategies. Non-human actors are targets of enrollment and act within the heterogeneous engineering effort that is standards making by cooperating (or not) with human actors’ strategies or by acting out the inscriptions embedded within technical, legal, ideological or other artifacts.

[Figure 2 diagram: the actor’s perceived interests, its perception of existing actor-network configurations, and the imagined future actor-network configurations it entertains influence and constrain one another and together shape the (standardization) strategy; the strategy plays out through problematization, interessement and (possible) enrollment, while interactions among actors alter these perceptions in an on-going feedback loop.]

Figure 2. Standardization strategy formulation for actors in the problematization and interessement phases


Interessement In the second translation phase, interessement, the actor executes its

strategies to convince other actors to accept its definition of their interests. Ensuing

negotiations, interactions and translations may eventually lead to the creation and

diffusion of a standard.

Interessement and other interactions among both human and non-human actors can

modify human actors’ perceptions of (a) existing and future actor-network

configurations, (b) the interests of other actors and (c) the translations required to bring

about possible future actor-network configurations. The feedback inherent in this process

(Figure 2) allows the refinement and alignment of actors’ perceptions of actor-network configurations and interests. We conceive of problematization (including strategy formulation) and the interactions among actors (including interessement or strategy execution) as an on-going dynamic process at the heart of aligning the diverse interests and imagined futures of heterogeneous actors.

Proposed standards are not transmitted unaltered (Latour, 1986). They move through time and space in the hands of actors that react to them in different ways (modify, deflect, betray, augment, appropriate or drop) (e.g. Lyytinen & Fomin, 2002). If the actors that come into contact with them do nothing, the standardization process or the diffusion of the

standard ceases.

Enrollment The final phase of translation, enrollment, is the moment when another actor

accepts the interests defined by an actor. In a successful standardization effort the

interests of a range of actors are translated into an agreement on the scope and content of

a standard. The standard is inscribed in documents, software or technical artifacts, which


bolster its durability. The standard may be framed as the solution (OPP) to certain industry problems and the actor-network widened in time and space as more actors adopt it. If the standard builds a strong enough network of actors (implementations, users, and ratification by regulators), it can become black-boxed and irreversible. By becoming taken for granted and used as “packages” in the continued construction of actor-networks, standards shape and constrain future possibilities.

While enrollment implies a degree of acceptance of assigned roles, actors may not fully assume these roles. Even where translations are achieved as envisaged, the actor-network configuration may not turn out as expected (e.g. due to translations accomplished among others). The transformation of the original proposed standard by multiple actors to suit their own needs often entails some loss of control by the initiating actor, changes in OPPs and many actors’ interests. Thus the outcomes of actor-network building and creating inscriptions can be unpredictable.
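Purely as a reading aid, the flow of the three phases can be caricatured in a few lines of code. This is our own toy formalization, not a model proposed in this dissertation or in the ANT literature; the actors, the OPP string and the substring test standing in for negotiation are all hypothetical:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Actor:
        name: str
        own_interest: str
        assigned_interest: Optional[str] = None  # set during problematization
        enrolled: bool = False

    @dataclass
    class FocalActor:
        name: str
        opp: str  # the Obligatory Passage Point the focal actor controls
        network: List[Actor] = field(default_factory=list)

        def problematize(self, actor: Actor, framing: str) -> None:
            # Define the other actor's interests to be consistent with one's own.
            actor.assigned_interest = framing

        def interessement(self, actor: Actor) -> bool:
            # Execute the strategy; a crude substring overlap stands in for
            # the negotiations that align assigned and own interests.
            return (actor.assigned_interest is not None
                    and actor.own_interest in actor.assigned_interest)

        def enroll(self, actor: Actor) -> None:
            # Enrollment can fail: the actor may refuse the assigned role.
            if self.interessement(actor):
                actor.enrolled = True        # acceptance of the assigned role
                self.network.append(actor)   # the actor-network grows

    focal = FocalActor("operator", opp="air interface specification")
    vendor = Actor("handset maker", own_interest="sell handsets")
    focal.problematize(vendor, "sell handsets that implement the specification")
    focal.enroll(vendor)
    print(vendor.enrolled, [a.name for a in focal.network])  # True ['handset maker']

The only point of the sketch is the control flow: problematization assigns interests, interessement tests whether that assignment can be made to stick, and enrollment, which may fail, is what actually grows the actor-network.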

Revisiting the research questions and objectives

Our actor-network based process model of standards creation and adoption is summarized in Figure 2. It is the playing out of the standardization and other strategies of multiple actors that changes the actor-network configuration and/or actors’ perceptions of the configuration. These actor-network configuration changes, which include changes in industry structure and in the relationships among non-technological and technological actors more generally, are at the heart of our research questions.


Having adopted this actor-network based process model for conceptualizing our phenomena of interest, we can revisit our research questions. In Table 5 the four initial research questions are broken down into a series of sub-questions aligned with our actor-network based process model and overall theoretical perspective. While these are very abstract questions, the objective is to build down-to-earth descriptions of the dynamic interactions among human and non-human actors involved in standards making and adoption during the implementation of large scale information systems. If we are able to live up to Latour’s vision for actor-network theory, the descriptions themselves will provide the explanations (Latour, 2004) of the dynamic processes that play out to produce the observed standardization, competitive, or marketplace outcomes.

The empirically based descriptive explanations of standards creation and adoption in the wireless and television industries presented in chapters 5 through 8 cover much of the same space already addressed by prior literature on the economic effects of standards and differences in standardization outcomes. However, we strive to create descriptions that advance beyond the explanations provided by prior theory. One way that this is achieved is by providing rich explanations for why particular economic outcomes of standardization were observed and why others were not – thereby moving beyond the ambivalence surrounding some prior theoretical explanations. The descriptions explain how the important problematizations, strategies and OPPs, enrollments, connections between actors, and changes in actor-network configurations, led to a particular outcome while preventing other outcomes from happening. Such explanations allow us to incorporate the complex dynamics of real world interactions between numerous human and technological actors – in stark contrast to theoretical approaches subject to


technological or social determinism, or that ignore interactions among technologies and standards.

Research question 1. How does technical standards creation and adoption play out in the construction of large scale information systems?
Sub-questions in actor-network terminology:
• What actors are involved and what roles do they play?
• What are actors’ interests, identities, values, projects and desires?
• What interests, identities, values, projects and desires are assigned to others?
• What imagined futures are envisaged?
• What strategies are created and what OPPs are established?
• How are strategies related to imagined futures?
• How are the strategies formulated and executed?
• In what ways do actors interact?
• How are OPPs accepted or rejected?
• What are the phases of actor-network building?

Research question 2. How do organizations build their relationships and coordinate with one another and with technology during the construction of large scale information systems?
Sub-questions:
• What sorts of relationships exist?
• What sorts of coordination mechanisms are used?
• How are new relationships established?
• How do relationships change?
• How are relationships terminated?

Research question 3. How does standards creation and adoption interact with the ways that organizations build relationships and coordinate with one another and technology? In other words, how do standards interact with industry structure and technical infrastructure?
Sub-questions:
• In what ways does (a) standards creation, and (b) standards adoption, change the relationships among actors?
• In what ways do the relationships among actors shape (a) standards creation, and (b) standards adoption?

Research question 4. How do existing technical and inter-organizational coordination mechanisms affect the design and implementation of large scale information systems?
Sub-questions:
• How do actors perceive existing actor-network configurations?
• What characteristics of existing actor-networks shape (a) standards creation, (b) standards adoption, and (c) actor relationships?
• What are the mechanisms through which existing actor-networks affect (a) standards creation, (b) standards adoption, and (c) actor relationships?

Table 5. Mapping high-level research questions to sub-questions consistent with the selected actor-network theoretical perspective

Another way we strive to advance beyond prior theory is to explain phenomena that have not been comprehensively addressed e.g. the selection of standardization mechanisms (e.g. market versus committee) or the selection of forums (e.g. established standards development organization versus industry consortia) for committee based


standardization. Such explanations necessarily go well beyond the simplifying assumptions embedded in some economic theoretical models, e.g. that standards creation is simply an exercise in selecting among a fixed number of well defined technological alternatives with well known utilities. Again the explanations comprise descriptions of the problematizations, strategies and OPPs, enrollments, connections between actors and changes in actor-network configurations that led to particular forum selections, or patterns of interactions within forums.

It is these sorts of descriptions, albeit at a higher level, along with details of the actors involved, that are used to explain the industry level interactions between standards making and industry structure, i.e. the wider conceptualization of the socio-technical actor-network configuration.

The research questions presented in Table 5 are studied using industry level case studies of the mobile wireless and television industries in the US and the UK. The primary sources of data will be an archival study of the changes in the industries and interviews with people directly involved in organizations participating in the industries.

Comparisons between the problematizations, strategies, enrollments, inter-actor connections, and the changes in actor-network configuration for differing outcomes in similar circumstances, within or across cases, aid in the isolation of possible causes for the differences. Further details of the types of data collected and analyzed are presented in the next chapter.


IV. Research Design

To answer our research questions we have conducted a study of the wireless industry,

using an embedded multiple-case study design. Case studies are particularly appropriate

for the sort of research question posed in the introduction and reiterated in actor-network

terms in the previous section, i.e. ‘how’ and ‘why’ questions about contemporary events

where the investigator has little or no control over the events and where the phenomenon of interest (standards creation and industry structure) and the context can not be easily distinguished from one another (Yin, 2003). In taking an actor-network perspective we assume that there is no distinction between the phenomenon and the context.

Case studies can be descriptive, exploratory or explanatory in nature. The ‘how’ research questions addressed in this dissertation imply an explanatory case study.

However, seeking to create an explanation of the relationship between technical standards and the structure of the wireless industry also requires an extensive description of the evolution of the industry and the technical standards domain as well as an exploration of

the competing explanations from the existing economic and other literature. We first

introduce the research setting and then elaborate on the research method.

Research Setting

The wireless industry has offered telephony services since the early 1980s (Bekkers,

2001; Funk, 2002). Standards play a vital role in the industry by facilitating the interoperation of the automated systems that make wireless services possible (e.g.


handsets, towers and base stations to support the radio links, mobile switching centers to provide switching and interconnection with the public telephone network, and backend systems for provisioning, customer service and billing).

The wireless industry makes an ideal setting to address our research questions since wireless services are critically dependent upon the creation and implementation of standards (Funk, 2001; Funk & Methe, 2001). A series of studies (Telecom_Policy, 2002) on the evolution of 1G and 2G wireless services also highlighted the importance of the relationships among industry participants and the central role of standards in the diffusion of wireless services. Synthesizing the implications of these studies, Lyytinen & King

(2002) conjectured that (a) the evolution of wireless services is critically dependent upon the creation and implementation of standards, (b) many of the critical industry relationships were organized around standards, and (c) the diffusion of the services is enabled and shaped by the dynamics of the relationships among three analytically distinct

domains (illustrated in Figure 3):

• The Innovation system is the interlinked network of sites, competencies, ideas and

resources, which is capable over time of developing novel technologies and

solutions based on research and development activity. Exploitation of these

innovations and technologies in wider systems often requires the creation of

standards5.

• The marketplace is a set of organizational actors that produce some

telecommunications services or technologies (within a value network) exploiting

the technological potential defined within telecom standards.

5 This definition of innovation system is not intended to bring with it any of the diverse meanings attributed to the term by other literatures. See (Carlsson, 2007; Carlsson, Jacobsson, Holmén, & Rickne, 2002) for a review of some of these literatures.


• The regulatory regime is any type of authority (industrial, national, international),

which can influence, direct, limit or prohibit any activity in the innovation system,

the marketplace or the regulatory regime itself6.

[Figure 3 diagram: Standards at the center, linked to the Innovation System, the Marketplace and the Regulatory Regime.]

Figure 3. Central role of standards in wireless industry (Lyytinen & King, 2002)

Yoo et al. (2004) used the three institutional domains as constellations of actors in their actor-network based description of wireless service diffusion in Korea. Fomin et al.

(2004) adopted the framework to study the Danish wireless industry. It is adopted here to organize the examination of how translation plays out in the building and transformation of actor-networks in the US and UK wireless industries.

The transition from analog 1G to digital 2G systems was motivated by a need for more efficient use of radio spectrum, due to increasing demand (in the US case) and to provide a single pan-European standard to replace a dozen national standards (in the

European case). Although digital, 2G standards remained voice-centric and were based on the then dominant ISDN circuit-switched technology. The 2G transition brought some new operators, and the first successful data service, SMS, brought a few new

6 Definitions adapted from (Yoo, Lyytinen, & Yang, 2004)


players (e.g. banks and airlines), albeit in a peripheral way. The main industry

participants remained the network operators, national and regional regulators, and the

manufacturers of infrastructure, handsets and semiconductors (Funk, 2002). The overall

structure of the industry’s actor-networks remained fundamentally unchanged.

Lyytinen & King highlighted two key interfaces in 1G and 2G. By specifying how

mobile devices operate within the wireless infrastructure, air interfaces played an

important role in defining the relationship between infrastructure and device

manufacturers. The licensing/pricing policy established by regulators influenced the relationships between network operators and their customers (e.g. whether the caller or

recipient pays for calls to wireless devices has affected the operators’ marketing

strategies and user adoption and usage patterns).

The wireless industry’s actor-networks have exhibited properties of irreversibility

(Tilson & Lyytinen, 2005) and wireless telephony development has followed divergent paths in different national contexts (Fomin et al., 2004). The different technological trajectories and patterns of use have been shaped by regulation (West, 2000; West &

Fomin, 2001), market structure (Lera, 2000), and socio-cultural settings (J. E. Katz,

2003) among other factors. These differences influenced the emergence of focal actors and OPPs, and constrained the range of future actor-network configurations imagined by industry actors.

There have been diverse patterns of diffusion of data services around the world, e.g. Japan and Korea, the US, and Europe have behaved differently from one another. We argue that differences in the sorts of services that emerge and how they are adopted can be explained by the types of actor-networks established, the ways in


which they are configured, the relationships among actors, and the way specific standards

are defined.

Conducting multiple case studies aids the external validity of case study based

research. Given the unique nature of the wireless industry in each country, it is not really viable to conduct a literal replication, i.e. one where we expect similar conditions to lead to similar outcomes. In this dissertation we follow a theoretical replication, where we expect different environments to lead to differing outcomes and where we strive to explain the outcomes and the differences between the cases. We examine the changes in the actor-networks in the US and UK to identify at least some of the reasons for inter-regional differences.

The selection of the UK and US limits the differences between the two cases in some areas – both have advanced economies, strong democratic traditions, similar population age profiles and use legal systems based on the English common law. There are highly competitive marketplaces (Figure 4) for mobile communication services in both the UK and the US. In both countries the number of mobile cellular phone subscribers now exceeds the number of fixed telephone lines, per capita expenditure on mobile communication services is similar, and at least until the late 1990s the penetration of mobile services was similar as well (ITU, 2005).

The regulatory regimes in the UK and the US have a strong market bias and exercise a relatively “light touch” in terms of intervening in the operation of the marketplace. In the US, AT&T was broken up at the start of 1984. The UK led the way in telecommunication deregulation in Europe with the privatization of British

Telecommunications in 1984 and the introduction of competition for telecommunications


services. It took several years for other European countries to follow suit. Vehicle based

cellular services commenced in 1983 and 1985 in the US and the UK respectively. The

UK issued national licenses, while the US licensed operators in over three hundred

geographical markets.

Some of the most striking differences between the US and UK are in the innovation system. While the innovation systems are very active in both countries, the focus is different. The US is home to several of the major mobile wireless infrastructure (Lucent, Motorola) and handset/device (Motorola, Kyocera) manufacturers. The UK is not home to any of the major infrastructure or handset manufacturers – perhaps a result of the UK being one of the few large European countries that decided to buy these technologies for its first generation mobile wireless system rather than attempt to nurture national champions and support national systems. However, Lucent and Motorola have their world headquarters for UMTS (Universal Mobile Telecommunications System) in the UK. Siemens and other vendors also have significant research centers for 3G mobile in the UK.

Similarly, the US is home to several of the major semiconductor vendors (Intel, Texas Instruments, Qualcomm) that supply the handset/device and infrastructure manufacturers. While the UK is not a major supplier of semiconductors, one specialist firm, ARM Holdings plc., is a world class developer of microprocessor core designs, which are licensed by semiconductor vendors and incorporated into chipsets for the mobile wireless market – including those from Intel, Texas Instruments and Qualcomm.

Despite some differences, the UK’s regulatory regime is closer to the free market leanings of the US than other major European markets. Both countries contrast with other countries with major mobile wireless industries (e.g. Korea in which the government took


a much more active role in shaping the industry, and Japan where a dominant operator provided a fairly centralized coordinating actor). Thus the selection of these countries allows us to control for regulation as much as is possible in such a

“natural experiment.” The innovation systems in the two countries have a somewhat different focus; the UK’s venture capital funding is closer to that of the US than most other European countries (Figure 5).

Figure 4. Scatterplot showing the relationship between profitability (EBITDA margin, vertical axis) and competition (HHI Index, horizontal axis)7
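For readers unfamiliar with the measures plotted in Figure 4, the Herfindahl-Hirschman Index is simply the sum of squared market shares. A minimal sketch of the computation (the operator shares below are hypothetical, not data from the chart):

    # HHI: sum of squared market shares, in percentage points, ranging
    # from near 0 (an atomistic market) to 10,000 (a monopoly).
    def hhi(shares_percent):
        return sum(s ** 2 for s in shares_percent)

    print(hhi([30, 25, 20, 15, 10]))  # 2250: a moderately concentrated market
    print(hhi([100]))                 # 10000: monopoly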

Looking at the transition to third generation technologies (3G) in both the US and the

UK provides us with an opportunity to study the differing dynamics of actor-network building, and differing outcomes, resulting from dissimilar initial actor-network configurations. Many manufacturers serve both countries and the UK’s largest network

7 The source of this chart is a Deutsche Bank analyst report (Shvets, Coe, & Kieley, 2005). EBITDA margin is a measure of profitability and the Herfindahl-Hirschman Index (HHI) is a measure of market concentration.


operator, Vodafone, has a presence in many countries including the US. Studying how

the same organizations are connected to the actor-networks in each country should go a

long way to answering our fourth research question. Both countries have a common

language, similar cultures and advanced economies. More personally, I am very familiar with the culture in both countries, having lived and worked in both, and I have worked in telecom and related industries for over ten years.

Figure 5. Venture capital investment by country of management and destination, 1999-2001 as a percentage of GDP (Source: OECD)

Research Method

The units of analysis in this case study are the national mobile wireless and television

industries. Within these industries are embedded a wide range of both human (e.g.

commercial, regulatory and individual) and non-human (e.g. technological artifacts, technical standards and the laws of physics) actors. Understanding the unit of analysis is important as it bounds data collection. In this study we have to be


sensitive to the expansion of the relevant set of actors embedded in the industry as the

connections with the computing and content industries become richer over time.

Four criteria have been generally used to judge the quality of positivistic social

research: construct validity, internal validity, external validity and reliability (Yin, 2003,

p. 33). Although the case studies are not strictly positivistic in nature, we still strive to

satisfy these criteria as much as possible to minimize possible criticism from researchers

steeped in the positivistic tradition.

Construct validity and data collection

While we expected more constructs to emerge during the collection and analysis of

the case study data, our research questions and the actor-network based theoretical

framework provide insights into the types of constructs we are dealing with. For example,

technical standards are central to several of the research questions and to the overall

research agenda. Although the operationalizations of the constructs became more refined over the course of the case study, we establish initial operationalizations here, first for technical standards and then for the rest of the constructs associated with each research question.

David and Greenstein’s (1990) definition of a technical standard as “a set of technical specifications adhered to by a producer, either tacitly or as a result of a formal agreement” is a good starting point. In the mobile wireless industry these standards include the air interfaces (e.g. frequency assignments as well as modulation, coding, signaling and power-control schemes) and network interfaces (e.g. billing and signaling protocols). With the expansion into mobile data services, protocols at all levels of

the stack, operating systems and application environments, typically adapted from the

computing industry, all become important standards. Standardized industry processes, e.g. for managing content, are also becoming relevant. As with any process study we must look at change.

So for technical standards we are interested in the events that underlie the creation and

adoption of standards. These events include the initial proposal for a standard, agreement

on the need for a standard, the release of draft and final versions and the pattern of adoption by industry actors or end-users.

Our use of the actor-network perspective, and of an actor-network based process model for strategy formulation (Figure 2), brings several other abstract constructs or concepts. Sub-questions drawing upon these more abstract actor-network based constructs were mapped to the initial high-level research questions (Table 5). To ensure that data collection is aligned with the research questions and the proposed theoretical framework we next consider how data about these abstract constructs is collected, analyzed and used to build explanatory actor-network based descriptions addressing the research questions.

In our first research question (RQ1) we want to understand how technical standards creation plays out in the construction of large scale information systems – specifically the wireless based communications and computing systems. To answer this question we need to capture information about the actors involved in the standardization process. Just who or what these actors turn out to be is an empirical issue but judging from data already collected and previous literature about standards creation (e.g. Bekkers, 2001; Schmidt &

Werle, 1998) these will certainly include organizational actors like firms that develop technology, operate networks or offer services. The rules and procedures in standards forums, the characteristics of technologies, existing technology deployments and


intellectual property rights are also likely actors. We need to understand the influences

these actors exert on one another to bring about particular standardization outcomes. We

have argued that this is a dynamic process and have presented an actor-network based

process model (Figure 2) of a cascade of translations as a way of conceptualizing how

this happens. We therefore have to build up a history of the events during standardization

processes and collect data pertaining to how the translations take place. The

standardization events include a map of the standardization bodies and their activities

over time (e.g. launch of standardization efforts, who initiated them and why, who was

involved, strategies played out, outcomes and adoption patterns). In accordance with our

process model this involves gathering evidence of the human actors’ perceptions of actor-

network configurations, their own perceived interests and their visions of possible future

actor-network configurations wherever possible. We also need to collect evidence about

the standardization and wider strategies they formulate, including obligatory passage

points they try to establish, interactions among actors, and enrollments. Even where

enrollment does not take place we conceptualize that the interactions among (human and

non-human) actors alters perceptions of existing and future actor-network configurations.

As this is an on-going process it is essential that the evidence collected is mapped over time.

Our second research question (RQ2) asks how organizations build their relationships and coordinate with one another and with technology during the construction of large scale information systems. The concept of the relationships among actors presents some difficulties in terms of operationalization. It subsumes the sorts of factors identified by

Porter (1980) as underlying the ‘five forces,’ as well as considerations of the boundaries


between firms as understood by TCE, and the taken-for-granted relationships associated with long-term institutions (irreversible black boxes in actor-network terms). We look for

changes in the relationships among actors based on these traditional measures. However,

our study strives to go beyond this to include the changes in the relationships with, and

among, technical actors and further dimensions found to be important to other actors.

As with the examination of standards making, we gather evidence of human actors’

perceptions of the actor-network, their own perceived interests and their visions of

possible future actor-network configurations, the strategies formulated, the OPPs they try

to establish, the interactions among actors and the enrollments that occur. The actor-

network perspective of cascading translations requires us to build a map of evidence of

changes in the relationships among human and non-human actors – this time the relevant scope is industry-wide or even broader. Again the types of changes in the

relationships cannot be fully known in advance. However, relevant data would certainly

include change in the relationships:

(a) Among industry participants: Numbers of competitors in the industry, major

mergers and acquisitions, technology and market partnerships, new entrants and

new industry participants. Success of new service and product launches. The case

studies are sensitive to changes in the actors that are considered relevant parts of

the industry (see Figure 29).

(b) With the regulatory system: Changes in regulation, creation and passing of

legislation, release or change of operator licenses and electromagnetic spectrum

allocations. We are attentive to changes in the regulatory agenda and the impact

of changing industrial policy.


(c) With technology: New device types, advances in the state of the art (e.g. CPU

performance/watt, display capabilities, memory density), adoption of new

technologies and retiring of old.

In our third research question (RQ3) we ask how standards creation and adoption

interacts with the ways that organizations build relationships and coordinate with one

another and technology. We are in effect expanding the scope of the first two research questions to include one another. The constructs should largely be a combination of those

already identified for those questions although we are attentive to the emergence of

others while examining the interaction between standards and inter-actor relationships.

The fourth research question (RQ4) asks how existing technical and inter- organizational coordination mechanisms affect the design and implementation of large

scale information systems. As we have argued that both standardization and coordination outcomes are dynamic this “existing” actor-network configuration is also something that is changing with time.

The multiple-case study design adopted extends beyond examining the 3G transition in the UK and the US. During the initial collection of data it became apparent that there was considerable path dependence in the creation of standards in the mobile wireless industry. It would be difficult to understand standards making and adoption for third generation systems without having a good understanding of the preceding generations as well as the standards making and adoption activities associated with them. The study design therefore includes separate cases for these earlier systems in both the US and the

UK. The UK’s strong connections with the rest of Europe also made it essential to include some of the wider European story as well.


The data communications capabilities deployed in 3G systems (as well as in

evolutionary enhancements to 2G systems), along with the computing capabilities of

handheld devices led to a wide range of new applications and services. While some of

these are discussed in the 3G case it was not possible to go into great detail. To overcome

this limitation a more in-depth case study of one of the new mobile applications, mobile

TV, was undertaken. As with the wireless industry it became apparent that an

understanding of the traditional television industry was also necessary. So the final

longitudinal multiple-case study used the 2x4 design depicted in Figure 6. The figure lists

the key air-interface standards for each case and the corresponding chapter.

US:
• 1G and 2G Mobile Wireless (Chapter 5): 1G AMPS; 2G D-AMPS, cdmaOne, iDEN, and GSM
• 3G Mobile Wireless (Chapter 6): EDGE; cdma2000 series; UMTS
• Traditional TV (Chapter 7): NTSC 525-line B&W; NTSC 525-line color; ATSC digital
• Mobile TV (Chapter 8): Proprietary unicasting; MediaFlo; DVB-H

UK / Europe:
• 1G and 2G Mobile Wireless (Chapter 5): 1G NMT, TACS, and others; 2G GSM
• 3G Mobile Wireless (Chapter 6): UMTS
• Traditional TV (Chapter 7): 405-line B&W; PAL 625-line color; MAC / D-MAC; DVB-T digital
• Mobile TV (Chapter 8): Proprietary unicasting; MediaFlo; DVB-H; MBMS

Figure 6. "2x4" longitudinal case study design

The case studies draw heavily upon numerous sources of archival data to establish

chains of evidence to bolster construct validity. In addition interviews were an important

source of data. Interviews provide a rich source of data on actors’ interests and

strategizing, the rationales behind their strategies, as well as the ways in which they perceive past, present and future actor-network configurations (i.e. the sub-phases of translation from our model of strategy formulation, illustrated in Figure 2). Initial in-


depth interviews were arranged by approaching decision makers from network operators and equipment manufacturers. We follow the actor-network by asking interviewees who else we should interview. Thus we are applying the idea of network based interviews

(Latour, 1987) rather than a random sampling approach. This strategy enables us to discover the range of actors from each domain (Figure 3) necessary for the delivery of broadband wireless data services. Initial interviews provided us with a more detailed view of the organizational actors from which we need to draw interviewees (Tilson &

Lyytinen, 2004, 2005). We conducted at least one interview for each of these types of actor, but in most cases conducted several for each type. Due to time constraints and the limitations of human memory, interviews are not a good source of data about the timing of specific events. We therefore use the archival data to trace the dynamics of specific events associated with industry changes. The resulting fact base provides a chronology of key industry events. This allows us to answer the who, what, when and where questions about the building and transformation of actor-networks and lays the empirical foundation for answering the research questions. The kinds of archival data sources we explored included analyst reports, books, annual reports, regulatory directives, documentation from standards bodies, the wireless industry press (e.g. Telecom Flash, Wireless Week), the general business press (e.g. Wall Street Journal, The Register) and technology and standards news sources (e.g. IEEE publications and press releases from standards bodies). The EBSCO Business Source Premier and ProQuest ABI/Inform Global databases were the main sources for news and technical press articles. Company, regulator, and standards body websites were also used. The alignment of the


research questions, constructs and the data collection approach and data sources is summarized in Table 6.

The interviews of industry decision makers and observers typically lasted between 90 minutes and two hours. They started with the collection of demographic information about the interviewees and their organization. We asked questions about the organization’s roles in the industry and the standardization arena. We probed the reasons for participating in standards making as well as the rationales for their standardization and wider strategies. A picture of their actor-network was created by encouraging them to talk about their relationships with other actors, how those relationships are changing, as well as how and why they build relationships. The interview guide (Appendix 1) also includes questions about technology and services, to explore how they go about problematizing and making connections with technological actors.

A total of 42 interviewees, predominantly from the US and UK, participated in 27 interviews. They included executive-level employees of network operators; infrastructure, device, and semiconductor manufacturers; middleware vendors; a system integrator; content providers; the main wireless regulators; and leaders of standards development and industry forums. The interviewees were guaranteed anonymity. More details about the interviews and the interviewees are provided in Table 9 and Table 10 in chapter 6.

Internal validity and data analysis

Where we infer explanations for the relationships between technical standards and the relationships among actors we need to deal with threats to internal validity. Claims of causality are primarily based on rigorous analysis of case study documents and


interviews. Transcripts of the interviews were systematically coded to identify

discussions and descriptions of:

• Each stage of translation in the standards arena and more generally

• Actors’ relationships with others

• Current and imagined future actor-network configurations

• Obligatory passage points

• Actors’ interests and their rationales for their standardization and wider strategies

Such coding provides a rigorous way of building up descriptions of actors’ interests and relationships, as well as their perceptions of past, present and future actor-network configurations. It also allowed us to systematically capture actors’ views of the changes in relationships and the rationales behind actors repositioning themselves within the actor-network. The application of the network based interviews approach allowed us to compare and contrast descriptions of the relationship between two actors from the perspectives of both. Descriptions of actors’ perceptions of the current and imagined future actor-network configurations, along with their discussion of the stages of translation, helped us understand how actors formulate their strategies. Where the events have been discussed by one or more of the interviewees we triangulate the findings of the archival and interview analyses.

For each of the first two research questions we build up timelines of the relevant events in the wireless and television industries and the data we have collected about them from both archival sources and interviews. In the analysis of these case histories we strive

to identify the sorts of problematizations, strategies and OPPs, enrollments, and

connections among actors that were responsible for the observed standardization and

industry structure outcomes, i.e. the actual ways that the actor-network configuration was


transformed. The timelines of events provide the empirical foundation upon which to

build the actor-network based explanations for why particular standardization or

coordination outcomes were observed and why others were not.

Analysis of the interactions between standardization and coordination processes and

outcomes (i.e. the third research question) will use a combination of the data collected

and analyzed for questions one and two. Addressing the fourth research question entails

examining translations across the timelines to establish how existing actor-network

configurations affect how translations play out. Detailed explanatory descriptions will be the primary means of answering the research questions (Latour, 2004). The alignment of data analysis to the research questions is summarized in Table 6.

We compare our answers to the research questions in the US and the UK to evaluate the impact of different actor-network topologies (e.g. differing regulatory regimes, innovation systems and markets) on actors’ strategies and relationship building. We look for underlying, or common, mechanisms of actor-network building that play out in both locations and across domains (Figure 3). We also look for mechanisms unique to one location or domain. Where possible we attempt to explain the reasons for differences by location (e.g. different initial actor-network configurations) and domain and use within-case or between-case comparisons to help in isolating causes for differences in outcomes and strengthen the validity of the findings.

We will seek to bolster the internal validity of our findings by showing that other perspectives (e.g. Economics, SCOT, and Institutional Theory (W. Richard Scott, 2001;

Tolbert & Zucker, 1994)) and chains of events cannot satisfactorily explain the outcomes

(Eisenhardt, 1989). In addition we endeavored to strengthen face validity by asking our


key informants to comment critically on our explanations by circulating a near-final draft of the dissertation with them.

External validity and reliability

Yin (2003) rejects comparisons of case studies with surveys where a sample is

statistically generalized to a wider universe. Rather, case studies rely on analytical

generalization of case study findings to broader theory. Case studies can support some

theories while not supporting others. They can be used to revise existing theory or to

propose new theories. We used the actor-network based model as our starting point but

remained open to comparing the ability of alternative perspectives to explain the case

study findings. We further bolster external validity by performing multiple case studies as part of the research design. As there are only so many wireless industries in the world it is not possible to select another case that would allow a literal replication and produce very similar results. Rather we employ a theoretical replication logic where we expect differing initial actor-network configurations to lead to different outcomes.

We strove to minimize errors and biases in the study so that another investigator repeating the same study should arrive at broadly the same findings and conclusions. To achieve this we fully document our procedures e.g. the interview guide is provided in

Appendix 1. In addition interview transcripts and recordings are retained and all archival data sources are explicitly cited.


RQ1. How does technical standards creation and adoption play out in the construction of large scale information systems?
- Constructs: Technical standards; actors’ views over time (perceptions of existing and envisioned actor-network, strategies, OPPs, interactions, enrollments); other actors (firms in the innovation space and marketplace, regulators, forum rules/procedures, technology deployments)
- Data collection: Map of relevant standards and forums (when initiated and by whom; actors’ actions; documents; standardization outcomes; patterns of adoption)
- Data sources: Archival – general business press (e.g. Wall Street Journal, Financial Times), wireless industry specific press (e.g. Telecom Flash, Wireless Week), press releases, standards forum status reports and other documents, IEEE publications (Spectrum, Journal of Communications), existing descriptions of the industry (e.g. Bekkers, 2001; Funk, 2002); Interviews – industry decision makers (network operators; infrastructure, semiconductor and device manufacturers; OS and middleware vendors; content and service providers; system integrators; regulators), standards forum attendees, industry observers and consultants
- Data analysis: 1. Create timeline of standardization activities and outcomes; 2. Map change in actors’ views and connections over time; 3. Map patterns of translations to the standardization outcomes; 4. Write explanatory descriptions of standardization outcomes

RQ2. How do organizations build their relationships and coordinate with one another and with technology during the construction of large scale information systems?
- Constructs: Actors (organizations – firms, regulators, standards bodies; technology; standards; services; regulations; technological limitations); actors’ views over time (as above)
- Data collection: Industry structure over time (market data incl. number of competitors, key financials, customer acquisition, churn, ARPU; mergers/acquisitions and partnerships; new services; new entrants and industry expansion); regulatory changes (legislation, spectrum and licensing, industrial policy); technological developments (CPU performance/watt, display capabilities, storage, battery capabilities)
- Data sources: As for RQ1
- Data analysis: 1. Create timeline of events and outcomes in the innovation space, marketplace and regulatory regime; 2. Map change in actors’ views and connections over time; 3. Map patterns of translations across and within realms; 4. Write explanatory descriptions of coordination outcomes

RQ3. How does standards creation and adoption interact with the ways that organizations build relationships and coordinate with one another and technology?
- Constructs: Combination of constructs for RQ1 and RQ2
- Data collection: Combination of data for RQ1 and RQ2
- Data sources: As for RQ1 and RQ2
- Data analysis: 1. Combine timelines for RQ1 and RQ2; 2. Map patterns of translations among standards and coordination outcomes; 3. Write explanatory descriptions of relationships among standardization and coordination outcomes

RQ4. How do existing technical and inter-organizational coordination mechanisms affect the design and implementation of large scale information systems?
- Constructs: Actor-network configuration
- Data collection: Combination of data for RQ1 and RQ2 along with a short history of the wireless industry before 1990
- Data sources: Existing histories of the wireless industry
- Data analysis: 1. Add historical data to the timelines developed for RQ1 and RQ2; 2. Examine translations right across the timelines; 3. Write explanatory descriptions of how existing actor-network configurations affect outcomes

Table 6. Alignment between research questions, constructs, data collection and analysis

In summary, the first four chapters have introduced research questions that explore

the creation and adoption of technical standards as well as their relationship with the structure of the industries that create and deploy them. While economic theories provide substantial insight into the selection of pre-existing standards, they have little to say about standards creation and they suffer from technological determinism. A review of the

Social Construction of Technology (SCOT) perspective highlights the social nature of

standards creation but also raises the criticism that SCOT exhibits social determinism.

The institutional viewpoint is limited to examining the effects of long term structures.

The actor-network based model developed in chapter 3 provides a more flexible method for conceptualizing the dynamic interactions between technology, business, regulation, and standardization; albeit with the limitation that ANT does not in itself have anything to say about the phenomena of interest. The case studies on the mobile wireless and television industries presented in the following chapters provide the opportunity to compare the ability of these perspectives to explain the observed industry and

standardization outcomes.


V. The Early Wireless Industry and 1G and 2G mobile wireless systems

The radio technologies underlying the services examined in this study rely on the

properties of electromagnetic waves, and the various ways that technologists have learned

to manipulate them. In this section we provide a brief outline of the history of radio before the advent of cellular systems and a description of its key technical

characteristics. Some of the means used for the coordination of technical interfaces in

telecommunications systems and the coordination of radio spectrum usage are also

introduced.

The main body of this chapter presents case studies of the creation and adoption of

first (1G) and second generation (2G) mobile wireless systems in the US and Europe.

The main sources used to outline the history of the development of these systems

are a number of books and dissertations8 rather than interviews. As well as providing the

background for the transition to third generation (3G) systems described in the next

chapter these cases are also used to develop a set of answers to the research questions for

the earlier phases of mobile communication.

The early history of radio

Radio is the transmission and reception of communication signals using electromagnetic energy that propagates at the speed of light. In classical theory the propagation of electromagnetic energy takes place in the form of coupled electrical and

8 For example Calhoun (1988), Garrard (1997), Mölleryd (1999), Manninen (2002), Bekkers (2001), Steinbock (2001), and Funk (2002).

magnetic waves, whereas in quantum theory the energy is seen as being transferred by a

flow of discrete particles called quanta or photons. Radio frequencies are usually considered to include electromagnetic radiation below around 300GHz9 (see Figure 7).

It can be argued that the history of radio started in the early 19th century with Michael

Faraday’s demonstration of electromagnetic induction and Maxwell’s (1865) theory that

an effect could be produced at some distance from the source of an electrical disturbance.

However, it was not until 1888 that Hertz demonstrated Maxwell’s predictions by using

electricity to produce sparks between conductors placed close together at the center of a parabolic reflector. The spark jumping between the conductors caused a smaller spark to jump between conductors about 5 feet away. Marconi increased the range of the “spark gap” transmitter by attaching an elevated wire (i.e. an antenna) to one side of the spark gap and by connecting the other side of the gap to the Earth. Antenna and earth connections were also made at the receiver. By 1901 Marconi had refined the technology

to the point where transatlantic transmission was possible. By 1910 radio communication

between ships and shore stations was routine and air-to-ground radio communication was

established ("Radio," 2007). The rescue of 700 passengers from the Titanic in 1912

highlighted the utility of radio at sea (Millar, 1997).

Improvements in electronics, and the invention of the thermionic valve (or vacuum tube) in particular, improved receiver sensitivity and made the transmission of speech practical. By 1912 the thermionic valve could be configured as an oscillator for producing

radio energies at fairly stable frequencies. This, along with the device’s ability to amplify and perform other signal processing functions, allowed engineers to develop more

9 Hertz (Hz) is the unit of frequency corresponding to one cycle per second. The abbreviations kHz, MHz, and GHz represent frequencies of thousands, millions, and billions of Hertz respectively.

sophisticated radio transmitters and receivers that supported both the radio broadcasting and radio communications industries. The first broadcast station in the US, KDKA, opened in Pittsburgh in 1920 and the BBC was formed in the UK in 1922.

Figure 7. The electromagnetic spectrum, with the radio spectrum indicated ("Electromagnetic radiation," 2007)

Mobile voice communication was used by fishing boats in the UK and Norway and

was an established part of aviation procedures by the 1930s. US and UK police forces

trialed radio communication in various configurations from the early 1920s. Radio was

used by the military in the first and second World Wars at sea, in the air and on land. The

term ‘Walkie-Talkie’ entered the English language around this time as US military two-way radios were reduced to backpack size. The introduction of frequency modulation (FM) in the mid-1930s was part of the reason behind the reduction in weight and size

(Calhoun, 1988; Garrard, 1997).


The importance of patents was recognized early in the commercialization of radio.

There was extensive litigation over patents in the US (nearly 1,500 infringement suits between 1920 and 1941). In the UK patent holders generated considerable revenue from license fees paid by manufacturers (Garrard, 1997).

The commercial use of radio (referred to as Private Mobile Radio or PMR) by small companies (e.g. taxi companies) emerged in the US in the 1940s and in the UK after the Second World War. The PMR architecture typically included a base station controlled by a dispatcher that could communicate with mobile stations within its range. The systems were not routinely connected to the telephone network due to regulatory restrictions.

PMR was much more popular in the US than in the UK. According to Garrard (1997) this was due to more open licensing and better awareness of the benefits of radio by businesses.

In 1948 the transistor was invented by scientists at AT&T’s Bell Labs. This device used the peculiar properties of semi-conducting crystals of germanium, silicon, and other materials. It could also be configured in circuits to perform a wide variety of signal processing functions and was superior to the thermionic valve in almost all respects: size, reliability and physical robustness, lower power consumption, lower heat dissipation, and lower production costs. These advantages, and transistors’ ability to operate on lower voltages, made them more suitable for use in mobile electronic equipment like pocket broadcast receivers and handheld communications transceivers (combined transmitters and receivers). Deployment of PMR by the police and other emergency services increased with the development of transistorized equipment. For example, by 1985 all operational


British police officers were able to keep in radio contact with their stations using pocket

sized two-way radios (Garrard, 1997).

The first mobile radio service connected to the telephone network was launched in 1946 by AT&T in St. Louis, Missouri. Calls were connected to and from the telephone network by operators. By 1983 there were some 150,000 mobile phone users in the country.

The use of VHF frequencies meant that few channels were available (only 12 in New

York) and users could have to wait for a long time to place a call. A similar system was

launched in the UK by the Post Office in 1959. Several generations of the systems were

rolled out in the UK with the number of users peaking at about 14,000 in 1985. Despite

being priced for the rich there were waiting lists for the privilege of becoming a

subscriber. These pre-cellular systems were particularly popular in Scandinavia. The

MTA system introduced in Sweden in 1956 was the first automatic system, i.e. human

operators were not required to establish calls to and from mobile users (Mölleryd, 1999).

Of the 150,000 subscribers in all of Europe in 1983 two thirds were in Nordic countries

(Manninen, 2002, p. 28). The main events in the pre-cellular era of wireless communications are summarized in Figure 8.


Fundamentals of radio communications

Communication using radio is made possible by varying some characteristic of the radio wave. The process of imposing a signal on a radio wave is called modulation and extracting it from a received radio wave is called demodulation. The simplest modulation scheme is the on/off keying of the radio wave. This digital scheme was used to transmit wireless telegraphy using variations of the Morse code developed for use on wired telegraphy networks in the mid-nineteenth century. Advances in electronics made more sophisticated modulation schemes possible. Varying the amplitude of the radio wave in sympathy with an audio signal allowed trans-Atlantic speech to be demonstrated by 1915.

This amplitude modulation (AM) scheme is still used for broadcasting in the Medium Frequency (MF) and High Frequency (HF) bands (see Table 7). Other modulation schemes vary the frequency (Frequency Modulation – FM) or the phase (Phase Modulation – PM) of radio waves in sympathy with the audio signal to be transmitted.

Digital modulation schemes allow digital signals to be communicated. As with analog modulation various combinations of changes in the amplitude, frequency and/or phase of the radio wave are used for the communication of digital information.
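For reference, the standard textbook forms of these analog schemes (my addition, not taken from the case sources) are, for a carrier of amplitude $A$ and frequency $f_c$ carrying a message signal $m(t)$:

$s_{AM}(t) = A\,[1 + k_a m(t)]\cos(2\pi f_c t)$

$s_{FM}(t) = A\cos\big(2\pi f_c t + 2\pi k_f \textstyle\int_0^t m(\tau)\,d\tau\big)$

$s_{PM}(t) = A\cos(2\pi f_c t + k_p m(t))$

where $k_a$, $k_f$ and $k_p$ set the modulation depth or index; digital schemes restrict $m(t)$ to a discrete set of values.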


Technical milestones and applications by decade:
1880s: Generation of radio waves (Hertz)
1890s: Antenna / earth system (Marconi); cross-channel tests (UK)
1900s: Speech transmission (Fessenden); thermionic valve (Fleming); transatlantic communication demonstrated; transmission to automobile (US); merchant shipping (UK); transatlantic telegraph service
1910s: Radio direction finding (UK); valve transmitter (Fleming); military aircraft for artillery spotting (UK); transportables (UK)
1920s: Police (Detroit, US); fishing boats (UK, Norway); first international spectrum conference; aviation – navigation and control
1930s: Frequency modulation (Armstrong); telephones on ocean liners (UK)
1940s: Private mobile radio systems (UK); cellular concept (Bell Labs); transistor (Bell Labs); operator-controlled mobile telephones (US)
1950s–1960s: Digital integrated circuits (various, USA); automatic mobile phones; solid-state telecom switches
1970s: Microprocessors; cellular test (USA); cellular trial (USA); cellular service (Japan)
1980s: Cellular service (Nordic); cellular service (UK)

Figure 8. Technical developments and applications of radio leading up to 1G cellular services (adapted from Garrard, 1997, p. 2)


The radio spectrum is divided into several bands from the lowest to the highest frequencies. The common designation for these bands is shown in Table 7 along with their common uses. The characteristics of radio propagation and other factors make each

band more suitable for some applications than for others and only part of the spectrum is

suitable for mobile communications on a commercial scale. There is comparatively little

spectrum below 300MHz and certainly not enough to support mass market mobile

telephony or data services. In addition the propagation of radio signals below 50MHz can be

somewhat erratic with range varying from approximately line-of-sight to tens of

thousands of miles depending on the ionization of atmospheric layers hundreds of

kilometers above the earth’s surface – making the design of reliable mobile

communications networks challenging.

Band designation (abbreviation) – Frequency range – Common uses
Extremely Low Frequency (ELF) – 3–30Hz – Submarine communications
Super Low Frequency (SLF) – 30–300Hz –
Ultra Low Frequency (ULF) – 300Hz–3kHz –
Very Low Frequency (VLF) – 3kHz–30kHz – Navigation
Low Frequency (LF) – 30kHz–300kHz – Time signals
Medium Frequency (MF) – 300kHz–3MHz – AM broadcasting
High Frequency (HF) – 3MHz–30MHz – Shortwave broadcasting; marine radio
Very High Frequency (VHF) – 30MHz–300MHz – FM broadcasting; aeronautical and marine radio; two-way radio (e.g. emergency services); TV broadcasting (in USA)
Ultra High Frequency (UHF) – 300MHz–3GHz – TV broadcasting; mobile telephony and data services; cordless phones; Wi-Fi (802.11b/g); mobile satellite
Super High Frequency (SHF) – 3GHz–30GHz – Microwave links (backhaul); Wi-Fi (802.11a); satellite up/downlinks (inc. satellite TV)
Extremely High Frequency (EHF) – 30GHz–300GHz –

Table 7. Common radio spectrum band designations


At VHF and higher frequencies the range of radio waves propagating over the earth’s

surface tends to decrease with frequency. The sweet spot for offering mobile

communication services on a commercial scale has proven to be in the UHF band.

Coverage is large enough to make infrastructure build out economic but small enough to

allow frequency reuse to increase overall system capacity (the conceptual breakthrough

of the cellular radio architecture). The lower UHF frequencies (e.g. 800-900MHz used by

most 1G and 2G systems) can be used to provide coverage in rural areas with fewer base

stations than higher UHF frequencies (e.g. 1,800-2,200MHz used by 2G and 3G systems). Conversely, the higher frequencies are more suitable than the lower ones for more intensive frequency reuse in urban settings to provide more capacity. An additional advantage of the UHF frequency bands is that resonant antennas are practical on vehicles and hand portable devices.
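The frequency dependence of coverage can be illustrated with the idealized free-space path loss formula (a standard propagation result included here as my own illustration; real mobile channels attenuate considerably faster, but the trend with frequency is the same):

$L_{FS}(\mathrm{dB}) = 20\log_{10} d + 20\log_{10} f + 20\log_{10}(4\pi/c)$

with distance $d$ in meters and frequency $f$ in Hz. Each doubling of frequency adds about 6dB of loss, which is part of the reason a 1,800MHz network needs a denser grid of base stations than a 900MHz network for the same coverage.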

Much of the early development of cellular wireless focused on ways of reducing interference between cells using the same frequencies as this determined system capacity.

One of the key tools for reducing interference is power control. For example, base station

transmitters in small urban cells should be less powerful than in macro cells in rural

areas. Similarly, mechanisms were designed to ensure that both base stations and mobile

stations only used enough power to ensure reliable communication to minimize the

interference experienced in other cells operating on the same frequencies.
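As a minimal sketch of the closed-loop idea (my illustration; the target level, step size and power limits are hypothetical values chosen for the example, not figures from any 1G or 2G specification), the transmit power can be stepped toward a received-level target each control cycle:

    TARGET_RX_DBM = -95.0               # hypothetical received-level target at the base station
    STEP_DB = 1.0                       # hypothetical per-cycle adjustment step
    P_MIN_DBM, P_MAX_DBM = -10.0, 30.0  # hypothetical mobile transmit power limits

    def adjust_tx_power(tx_dbm, measured_rx_dbm):
        """Step power up when received too weakly, down when too strongly:
        just enough for reliable reception, minimizing co-channel interference."""
        if measured_rx_dbm < TARGET_RX_DBM:
            tx_dbm += STEP_DB
        elif measured_rx_dbm > TARGET_RX_DBM:
            tx_dbm -= STEP_DB
        return min(P_MAX_DBM, max(P_MIN_DBM, tx_dbm))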

Although we have only scratched the surface of radio technology, this overview is sufficient to

allow us to understand the main trade-offs faced in the design of communications

systems. Further concepts will be introduced as required to explain the implications of

system design characteristics and the most recent generations of technology.


Coordination of spectrum usage and telecom standards creation

The coordination of spectrum usage as well as the definition of interfaces for the interconnection of telecommunications networks has been subject to international agreements for over a century. The ITU was formed in 1865 as the International

Telegraph Union to coordinate aspects of international telegraphy. It is one of the earliest intergovernmental organizations ever created – even predating an international organization dealing with postal affairs. By selecting the Morse system as the international standard for telegraphy it created what may be considered the first international telecommunications standard.

Early international coordination in the use of radio spectrum was carried out in the

International Radio Telegraph Convention formed in 1906. It was subsumed into the International Telegraph Union in 1932, which was renamed the International Telecommunication Union (ITU) two years later. The ITU, which became a Geneva-based UN agency in 1947, is a treaty organization with member states holding the highest level of membership. Network operators and manufacturers are referred to as sector members, and do not have all the rights of member states. The ITU has historically had two primary functions relevant to mobile telecommunication services:

• The allocation of radio spectrum for particular services (e.g. Broadcasting, Fixed,

Mobile, Satellite, Amateur, etc.) and the definition of generally high-level

requirements for the use of spectrum to minimize interference among its users. Lower

level system characteristics are only specified where international coordination is

essential or was deemed desirable (e.g. maritime radio and satellite orbital slots). The


ITU’s CCIR10 committee was responsible for these activities (renamed the ITU-R11

after the restructuring of the ITU in 1992). Member countries, which are bound by the

ITU spectrum allocations, are responsible for the actual assignment of spectrum to

users and the granting of licenses. The most important meetings of the ITU-R are the

World Radio Conferences12 (WRC) held every two or three years where changes to

the radio regulations, including spectrum allocations, are made. Between WRCs the

ITU-R’s work takes place in Study Groups, Task Groups, and Working Parties.

• The creation of telecommunications standards including those relating to audio and

video compression, narrowband and broadband ISDN, network signaling, network

management, and data communications, as well as modem and fax protocols. The CCITT13 committee was originally responsible for creating telecommunications standards but was replaced by the ITU-T14 after the 1992 reorganization.

Around the world there are regional bodies where further harmonization of spectrum

usage or coordination of telecommunication standards is performed. The European

Conference of Postal and Telecommunications Administrations (CEPT) was established

in 1959 and was open to all European postal and telecom administrations – the monopoly

PTTs15 responsible for both the provision and regulation of telecom services, as well as

postal services. Standardization was part of the CEPT’s domain of concern and it

coordinated European proposals for ITU standards as well as the selection of options left

10 Comité Consultatif International pour les Radiocommunications (Int. Radio Consultative Committee)
11 ITU-R: ITU Radiocommunication Sector
12 Prior to 1992 the conferences were called World Administrative Radio Conferences (WARC)
13 Comité Consultatif International Téléphonique et Télégraphique (Int. Telephone and Telegraph Consultative Committee)
14 ITU-T: ITU Telecommunication Standardization Sector
15 Postal, telegraph, and telephone (or PTT) is or was a government agency responsible for postal mail, telegraph, and telephone services in most European countries.

open in ITU standards to facilitate more efficient network interconnection within Europe.

It also led the creation of some standards, such as the E1 primary rate standard and the GSM second generation mobile communication standard, and played a role in the harmonization of spectrum usage across Europe, e.g. the identification of the 900MHz band for cellular services in 1978. To understand how the CEPT’s role changed in the

1980s and 1990s it is also necessary to understand something of the politics of Europe and its postwar institutions.

After World War II there were a series of European level institutions established to foster cooperation in Europe, at least partly intended to dilute the strong nationalism that had led to the World Wars. In 1957 the European Economic Community (EEC) was formed with six members. The UK and others joined in 1973. The EEC, which was renamed the European Community (EC), formed the main pillar of the European Union

(EU) formed by the Maastricht Treaty in 1992. The EU (and the EEC before it) has three bodies which are primarily responsible for decision making. The Council of the European

Union, which meets up to four times a year, is the main decision making body. Its meetings are attended by one minister from each EU national government. The European

Parliament (EP) is directly elected by the citizens of the member states and in most policy areas co-decides with the Council on the passage of new legislation. The European

Commission is independent of the member state governments and attends to the interests of the EU as a whole. The Commission drafts new legislation and presents it to the

Parliament and Council16. The European Commission played a lead role in setting the

EEC/EU’s telecommunications policy. The Commission’s 1984 White Paper

(COM(84)277), which was supported by the Council, put forward several objectives:

16 http://europa.eu/institutions/inst/index_en.htm

• Creating a community wide market for telecom equipment based on standards

• Promoting development of pan-European networks and services (e.g. ISDN,

GSM, and broadband)

• Establishing R&D programs for broadband communications (became RACE and

ACTS programs)

• Coordinating positions in international organizations (e.g. CEPT and ITU)

The 1987 Green Paper on telecommunication (COM(87)290) proposed the liberalization of the markets for telecommunication services and terminal equipment, but not networks. After gaining the approval of the Council, the Commission issued a series of directives to liberalize the markets for telecommunications services and terminal equipment. After another review of the telecom sector in 1992 the Commission and

Council went further in the liberalization of the telecom market. This included abolishing all the exclusive and special rights of government-controlled network operators (the PTTs that existed in most European countries) by the start of 1998. The overall objective was to promote a competitive pan-European market for telecommunication – a stark contrast to the national monopoly-based industry structures that had been in place for decades (Bekkers, 2001).

The CEPT’s goals of harmonizing telecommunication standards among European countries were in broad alignment with those of the European Commission. However, the

CEPT’s PTT-only membership did not comply with the more balanced membership and transparency requirements of an official European Standards Development Organization

(SDO). The liberalization of telecommunications services in Europe that started in the


1980s led to the separation of regulatory and operational concerns of PTTs and the emergence of new telecom service providers. The 1987 Green Paper (COM(87)290) called for the creation of a new standards body for telecommunications and in 1988 the

European Telecommunications Standards Institute (ETSI) was formed. Incumbent and new telecom network operators formed their own interest group (ETNO) leaving the

CEPT as an association of European telecommunications regulators.

The formation of ETSI also opened up membership to manufacturers and all

European network operators. National regulators and standard bodies are also members.

It should be noted that the primary geographical area served by ETSI is that of the CEPT and therefore includes countries that are not EU member states. Companies with non-European origins (e.g. Motorola, HP, IBM) were able to join as full members because they have R&D facilities in Europe.

The most important decisions, such as approval of legally binding European

Standards, are made using a weighted national voting process – large countries have more votes than small ones. A separate tally of the votes of EU member countries determines whether a standard will be accepted by the Union. For lesser decisions the individual members of ETSI participate in the voting. The weight of each vote in this case is in proportion to a company’s telecom-related revenue or, in the case of a national regulator, to the nation’s GDP. The rules for calculating revenue, and therefore the number of votes, can disadvantage non-European members as they are only permitted to include the revenue generated by their European presence.

ETSI’s original stance of avoiding the recognition of overlapping standards meant that it was not open to standards that would compete with


those it had created. This clashed with the European Commission’s recognition of the importance of standards created in industry consortia, particularly in the ICT (Information and Communications Technology) industry (CEC, 1996c). ETSI created a process for the approval of these so-called Publicly Available Specifications (PAS). This also

went some way to meeting ETSI’s need to comply with European anti-trust legislation.

The spectrum allocations defined by the ITU are broad and do not usually specify the

technology to be used, and there is considerable flexibility in how national regulators

assign spectrum. The CEPT historically played a key role in harmonizing the use of

spectrum across Europe. Since 1995 this responsibility has fallen to the CEPT’s

European Radiocommunications Committee (ERC)17. National governments can adopt

the ERC recommendations but do not always do so if they conflict with existing spectrum allocations.

The UK’s national standards organization is the British Standards Institute (BSI).

Following the separation of British Telecom from the Post Office in 1981 the BSI was involved in setting standards for equipment that could be connected to the BT’s network

(BT, 2007). The BSI coordinates the UK participation in the global (ISO/IEC18) and European (CENELEC19/CEN20) standards bodies and publishes many of the standards created by these

bodies for use in the UK (BSI, 2007). While BSI does not participate in ETSI Technical

Committees or Projects it is responsible for carrying out the public enquiry for European

standard proposals and establishing the UK’s position for voting on standards. Once

17 Also referred to as the European Radiocommunications Office (ERO), the name of the committee’s permanent offices in Denmark
18 International Standards Organization (ISO), International Electrotechnical Commission (IEC)
19 European Committee for Electrotechnical Standardization / Comité Européen de Normalisation Electrotechnique (CENELEC) http://www.cenelec.org/Cenelec/Homepage.htm
20 European Committee for Standardization / Comité Européen de Normalisation (CEN) http://www.cen.eu/cenorm/homepage.htm

approved and published by ETSI all official standards are endorsed by BSI and

transposed into British Standards.

The national standards body in the US is the American National Standards Institute

(ANSI). ANSI also participates in the global standards bodies (ISO/IEC) and coordinates with the European bodies (CENELEC/CEN). ANSI accredits various organizations, often industry associations, to create standards. There are three accredited organizations that create telecommunications and data communications standards: the Alliance for

Telecommunications Industry Solutions (ATIS), the Institute of Electrical and Electronics

Engineers (IEEE), and the Telecommunications Industry Association (TIA).

TIA is an industry association of telecommunication and IT equipment manufacturers and is active in standards creation among its other activities. It was responsible for defining the AMPS cellular standard which was mandated by the FCC. Competing second generation cellular systems were standardized in its TR-45 and TR-46 committees. The competing standards are typically developed by the companies or consortia backing them.

ATIS is an industry body focused on the rapid development of standards. Unlike the

TIA its membership also includes operators. Its Wireless Technologies and Systems

Committee (WTSC), formerly T1P1, focuses on mobile systems. ATIS coordinates with

ETSI to maintain the PCS-1900 variant of the GSM specification used by some operators in the US.

The IEEE is the professional body of the electrical and electronic engineers in the

USA. It has participated in standards development in a number of domains including telecommunications and IT. The most notable is the 802 series of wired and


wireless local and metropolitan area networks (e.g. 802.11a/b/g or Wi-Fi). There is

coordination among ATIS, the TIA and the IEEE to avoid overlap in activities.

The regulation of telecommunication and broadcasting in the UK was carried out by several

bodies. These included the ITC, Oftel, Radio Authority, Broadcasting Standards

Commission, and the Radiocommunications Agency (RA). The RA was part of the

Department of Trade and Industry. All five of these bodies were merged in December

2003 to form an integrated regulator, Ofcom. Ofcom, which has responsibility for all

aspects of radio in the UK, is an independent body i.e. not part of any government

department21.

In the US the Federal Communications Commission (FCC) is responsible for the regulation of telecommunications and broadcasting. The FCC “is an independent United

States government agency, directly responsible to Congress. The FCC was established by

the Communications Act of 1934 and is charged with regulating interstate and

international communications by radio, television, wire, satellite and cable. The FCC's

jurisdiction covers the 50 states, the District of Columbia, and US possessions.”

First generation (1G) analog cellular radio services

AT&T (known as ‘Ma Bell’) was the dominant telecommunications provider in the

US from 1913 when it was granted immunity from anti-trust prosecution in exchange for

pursuing the provision of universal telephony service in the US (King & West, 2002).

The concept of cellular radio was put forward by AT&T’s Bell Labs in 1947. The simple

21 Before the Radiocommunications Agency and Oftel were formed their functions were performed by the British Ministry of Post and Telecommunications (MPT).

concept behind cellular was to increase the capacity of a mobile radio system by restricting the range of base stations and reusing frequencies intensively. The small coverage ‘cells’ required the use of higher frequencies than were practical at the time. In addition, the reduced coverage of each base station necessitated the handover of calls as a mobile user moved between cells. The valve-based electronics and electromechanical telecommunication switches of the time were not capable of supporting, at least economically, this sort of functionality and signal processing.

However, by the 1970s the development of microprocessors and other ‘integrated circuits’ incorporating multiple transistors, increasingly flexible frequency synthesis techniques, and digital telephone switches made the cellular radio concept technically viable. The main components of a cellular radio system are illustrated in Figure 9. They include base stations, mobile stations, frequency reuse in non-adjacent cells, a subscriber database for authentication and location tracking, and switching equipment with access to the fixed telephone network.

All first generation systems were analog. They used frequency modulation (FM) and shared radio spectrum among users by allocating a pair of 25kHz or 30kHz wide radio channels to a mobile user for the duration of a telephone call. In effect mobile terminals were multi-channel FM transceivers, where channel selection was under the control of the cellular system rather than the user. This approach to sharing spectrum is referred to as Frequency Division Multiple Access (FDMA). Two channels are necessary to allow duplex communication – and this way of providing duplex communications is referred to as Frequency Division Duplex (FDD). The FDMA/FDD approach is both the easiest to understand and to develop. However, it does require that the mobile terminal is


able to transmit and receive simultaneously. The creation of a mobile phone that could

tune to any of hundreds of pairs of channels was a significant challenge in itself in the

1970s – implementing one of the more advanced multiple access schemes (e.g. TDMA or

CDMA, discussed later in this chapter) in a mobile phone would not have been economically or technologically feasible at that time.
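The capacity arithmetic behind FDMA and frequency reuse is straightforward. The sketch below is my own illustration (the reuse factor of 7 is a typical analog cluster size assumed for the example, not a figure from the AMPS specification):

    ALLOCATION_HZ = 20e6   # one direction of a paired 2x20MHz allocation
    CHANNEL_HZ = 30e3      # 30kHz AMPS-style channel spacing
    REUSE_FACTOR = 7       # assumed cluster size: each frequency reused every 7 cells

    channel_pairs = int(ALLOCATION_HZ // CHANNEL_HZ)  # 666 duplex channel pairs
    calls_per_cell = channel_pairs // REUSE_FACTOR    # about 95 simultaneous calls per cell
    print(channel_pairs, calls_per_cell)

Total system capacity therefore grows with the number of cells rather than with spectrum alone, which is why intensive reuse in small cells was the conceptual breakthrough of the cellular architecture.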

[Diagram: a grid of cells reusing frequencies F1–F4 in non-adjacent cells, with base stations connected through a mobile telephone switch and subscriber database to the fixed telephone network]

Figure 9. Basic components of a cellular radio system

The launch of cellular in the US

AT&T unsuccessfully lobbied the FCC for additional spectrum for mobile

applications from as early as the late-1940s. However, much of the attractive UHF

spectrum (470MHz to 890MHz) was allocated to television in 1949 and it was 1970

before the FCC relocated some of this spectrum for mobile services. In 1978 AT&T were

permitted to trial a cellular phone system in Chicago. The success of this and another trial


carried out in Baltimore by a subsidiary of Motorola, led to the FCC putting in place a

mechanism for the granting of operator licenses and to allocate 2x20MHz (825-845 MHz

for the mobile to base station radio link and 870-890MHz for the base station to mobile

link) for cellular services. Two licenses were awarded per market22 to ensure competition.

One license was intended for existing wireline operators and the other for a non-wireline

operator. The licenses were awarded between 1982 and 1989. In some markets there were

hundreds of applications. To handle the process the FCC awarded licenses using a lottery in all but the largest markets. In October 1983 the first public cellular services in the US were offered in Chicago and Baltimore – the locations of pre-existing trials. The geographical fragmentation of the cellular industry was compounded by the allocation of the cellular business to the newly formed Regional Bell Operating Companies (RBOC) during the break-up of ‘Ma Bell’ in 1984 rather than the national inter-exchange carrier

(IXC) that retained the AT&T brand (Calhoun, 1988 Ch. 3; Farquhar, 1996; Garrard,

1997 Ch. 2; King & West, 2002).

The Advanced Mobile Phone System (AMPS) standard developed by AT&T was used by all the wireless operators. AMPS specified an air interface i.e. the modulation scheme (FM) for telephony, and the signaling formats between the mobile terminals and the base stations. The signaling between the mobile terminals and base stations includes alerts to incoming calls, the number dialed for outgoing calls, the allocation of pairs of

channels for calls (and handovers between base stations where necessary).

The AMPS standard did not provide a standardized means of interconnecting separate

cellular systems to facilitate customer roaming while visiting other locations. This

22 The ‘markets’ were 305 Metropolitan Statistical Areas (MSAs) and 428 Rural Statistical Areas (RSAs). MSAs are counties that include at least one town with a population of over 50,000 and a total population of over 100,000.

became an important commercial requirement since the FCC’s policy of awarding

licenses for small geographical regions and the 1984 break-up of ‘Ma Bell’ led to

fragmented system coverage. To offer a more attractive service, operators needed to

establish roaming agreements with one another and to deploy supporting technical

mechanisms. The lack of a technical standard to facilitate roaming led to a range of solutions that were complex and expensive for phone users. In 1984 the Cellular

Telephone Industry Association (CTIA) was established. One of its first tasks was to enable roaming. It developed a set of requirements for roaming that allowed the

Telecommunications Industry Association (TIA) to develop the IS-41 standard for the interface between cellular networks, the first version of which was released in 1988

(Garrard, 1997 Ch. 2; Yu, 1992).

Early cellular customers generated about $4,000 of revenue per year (in 1985) with usage of over 500 minutes per month. As market penetration increased the average revenue per user (ARPU) decreased as tariffs were reduced and average usage was lower for the later adopters. For example, the ARPU by the end of 1991 was under $1,000 per year. Operators practiced price differentiation by allowing customers to self-select usage plans that traded off fixed monthly charges against per minute charges and the geographical coverage offered (Garrard, 1997 Ch. 2).

In the first years of operation all mobile terminals were vehicle based. It was 1986 before the first hand portable handset, the 800g Motorola DynaTAC 8000X, was made available in the US for about $4,000. Hand portables (see Figure 10) became more popular with the launch of the 350g Motorola Microtac in 1989. However, the take-off of the handset format for mobile phones was comparatively slow in the US. While 40% of


sales in the US were for handsets as opposed to car phones by 1992 – the comparable

numbers were about 90% for Europe and 100% for Asia (Garrard, 1997 Ch. 2). A high-

level time line of the main events in the introduction of first generation systems in the US is provided in Figure 11.

From 1987 McCaw Communications, which had started out in cable TV, undertook a program of acquiring cellular licenses and by 1990 had become the largest operator in the US. AT&T (the IXC) reentered the cellular business by completing an acquisition of McCaw in 1994 for $11.5 billion (King & West, 2002).

(a) Motorola DynaTAC8000 (1984) (b) Motorola Microtac (1989)

Figure 10. Early US cellular handsets

It is worth noting that Japan was actually the first country to offer a commercial

cellular service in 1979. However, growth was slow with only 0.15% penetration after 10

years. Nevertheless, Japanese electronics companies were major suppliers of handsets for the

AMPS and TACS systems deployed elsewhere in the world (Garrard, 1997 Ch. 2).


[Figure 11 charts US cellular subscribers in millions (1974-1991, reaching about 3% penetration) above a four-track time line (Regulation, Innovation, Market, Standards creation) of major events.]

Figure 11. Time line of major events in first generation cellular services in USA

The launch of Cellular in Europe

No account of the early years of cellular telephone systems would be complete

without mention of the NMT systems developed in the Nordic countries (Denmark,

Finland, Norway, and Sweden). The Nordic Mobile Telephone (NMT) Group was

formed in 1969, at the instigation of Sweden, to design a system to overcome the capacity

constraints of the existing non-cellular VHF telephone systems, and also to create a larger

market to attract manufacturers. Membership was restricted to the national

telecommunications administrations of the Nordic countries and the group was part of wider inter-administration cooperation on research and operational telecom issues (Haug,

2002; Manninen, 2002; Mölleryd, 1999, p. 87).

The early requirements specified by the NMT group were mostly non-technical in

nature. The service was envisaged as an extension of traditional telephony, with

automatic call connection to and from mobile terminals, and both national and

international roaming. These requirements necessitated the coordination of frequency

allocations across the Nordic countries and the specification of a standardized inter-

switch interface to facilitate roaming. An interim non-cellular system was specified while

the microprocessor technology needed for the handover and roaming functions matured

sufficiently (Mölleryd, 1999, p. 88). The roaming capability was not supported by

Motorola, which tried to enlist Swedish manufacturer SRA in an effort to convince the

NMT Group to abolish this requirement (Mölleryd, 1999, p. 91). The NMT450 system

specification released in 1977 also included a standardized interface between the switches

and the base stations. The standardization of additional interfaces allowed interoperability


between system components from different manufacturers, unlike AMPS, thus avoiding

lock-in to specific suppliers with proprietary interfaces. In addition to standards creation the NMT Group also took the lead in coordinating operational issues such as the user licensing, billing, numbering, and the procurement of infrastructure. The NMT standard was somewhat unusual in that it was the operators that took the lead in the development of the specification of the standard, drawing up manufacturers only as needed (Garrard,

1997 Ch. 2; Haug, 2002; Manninen, 2002).

The system’s original 450-470MHz frequency allocation allowed it to offer wide

coverage with relatively few base stations, albeit at the expense of capacity. The take-up

of the mobile phone service after its 1981 launch was more rapid than had been

envisaged and led to a shortage of capacity even after system upgrades. An updated

NMT900 specification released in 1984 used frequencies in the 900MHz band and included several enhancements. NMT900 addressed the capacity constraints of the earlier specification both by increasing the number of channels available and by supporting smaller cell sizes. These smaller cells, on the other hand, meant that about three times as many base stations were required to provide the same coverage as the earlier system using the 450MHz band. NMT900 networks were launched in late 1986 and supported hand portable phones, which NMT450 had not (Garrard, 1997 Ch. 2; Manninen, 2002). It is also interesting to note that some industry players had promoted the adoption of AMPS/TACS at 900MHz, and in Sweden at least it had been Swedish

Telecom (the PTT) that had insisted on a new Nordic standard (Mölleryd, 1999, p. 95).

In the US cell phone users were charged for ‘air-time’ meaning that they were charged for both incoming and outgoing calls. In the Nordic countries the traditional


charging paradigm of ‘caller pays’ was retained. Unlike their US counterparts Nordic cellular phone users were not reluctant to give others their cell phone numbers. The higher utility that Nordic users were able to get out of their cell phones is one of the reasons put forward to explain the more rapid adoption of the cell phone in the Nordic countries. It is also argued that the Nordic countries had gained an understanding about the more utilitarian uses of mobile communication from wider use of the earlier non-cellular mobile phone systems and PMR deployments. This was also reflected in the service not being viewed as an expensive status symbol in Nordic countries and employers being more willing to pay for the service for all levels of employees where there were operational benefits (Garrard, 1997 Ch. 2; Manninen, 2002).

In the non-Nordic European countries cellular services were launched later. Spain offered a limited service from 1982 and Austria in 1984. These and later systems in the

Netherlands, Belgium, Luxembourg, Iceland, France and Andorra used variations on the

NMT450 system. The variations, typically of the precise frequencies or channel spacings used to fit in with preexisting frequency assignments in that part of the radio spectrum, were enough to limit the economies of scale that could be realized and prevent interoperability. The choice of 450MHz typically caused capacity constraints soon after launch and the Netherlands launched a NMT900 system in 1989 to increase capacity. The big European countries (UK, Germany, Italy, and France) all launched services in 1985.

In all but the UK the services were offered only by the PTTs, using nationally developed standards, and deployed infrastructure and mobile phones from national suppliers. By the end of 1991 the market penetration (Table 8) was much lower than in the Nordic countries or the US.


As early as 1978 the CEPT (an organization used by the European PTTs for

coordination of, among other things, telecommunications standards) agreed upon the

900MHz frequency band for pan European cellular services in Europe. The allocations

890-915MHz and 935-960MHz would support 1000 channels (with 25 kHz spacing).

When the UK selected AMPS as its cellular standard it modified it to align with this

allocation. The resulting specification was called the Total Access Communication

System, or TACS. The UK and Ireland launched TACS networks in 1985. Italy, Spain,

Austria and Malta launched TACS networks in 1990. No new analog cellular networks

were launched after 1990. The following year, 1991, marked the launch of the first second

generation (2G) digital cellular networks in Europe (Garrard, 1997 Ch. 3).

The market penetration in these countries was much lower than in the Nordic

countries and the USA, although penetration in the UK and Switzerland was not too far

behind. No single factor has emerged to explain the varying penetration and growth rates across all these countries. Some of the factors that have been proposed as providing at least partial explanations at the national level included distribution and size of the population, GDP, the prior understanding of the benefits of mobile radio, and the acceptance of other technology more generally (e.g. personal computers). The operators’ strategies and operational effectiveness may have also played a role. The possibly relevant differences include: service coverage, capacity, and quality, customer service, pricing structures, distribution strategies, and the range of approved mobile phones.

While the wireless standards selected may have influenced system capacity and performance there were certainly many other factors influencing the growth of the cellular mobile phone markets in the early years (Garrard, 1997 Ch. 3; Manninen, 2002).


The launch of Cellular in the United Kingdom23

The approach to cellular services in the UK was distinctly different from that taken by the rest of Europe and was a part of a groundbreaking restructuring of the UK’s wider telecom industry. The Conservative government of Margaret Thatcher came to power in

1979 and took a series of steps to introduce competition in telecommunications. The

British Telecommunications Act (1981) established British Telecom (BT) as a public corporation independent of the Post Office – ending an almost continuous monopoly on communications (post, telegraph, and telephony) that had persisted for centuries. The privatization of BT took place in 1984 when the government sold 50% of its shares in the company. The Telecommunications Act (1984) established a competitive market for telecom services by abolishing BT's exclusive right to provide services. Mercury

Communications (a subsidiary of Cable and Wireless) was granted a license to provide fixed line telecom services in the UK. This duopoly (BT and Mercury) was regulated by the newly established Oftel and persisted into the early 1990s when a number of new national operators were licensed.

In contrast to the rest of Europe the UK government decided to license two cellular network operators from the outset. BT was awarded one of the licenses but was required to establish a mobile subsidiary that could not be wholly owned. There were several bids for the other license from which the government selected a consortium led by Racal, an innovative manufacturer of military communications equipment. The operator formed by

BT and its partner Securicor traded as Cellnet, while Racal and its partners traded as

23 Draws upon (Garrard, 1997 Ch. 4).


Vodafone. Both licenses were national and stipulated that the operators were obliged to cover 90% of the UK population by the end of 1989.

The operators were not allowed to make or sell equipment, or to offer value-added services. Cellular service providers (or airtime resellers) were to sell the mobile phones and to provide service using one of the two networks. The operators could form their own service providers as subsidiaries but had to deal with them in the same way as with the other service providers. By the end of 1989 there were 39 service providers with 10 of them selling service from both networks. The arrangements between the service providers and the network operators typically involved an incentive for signing up new customers and a percentage of on-going call revenues. However, the terms left little profit for the service providers. By 1992 consolidation to realize economies of scale left over 80% of subscribers in the hands of the top ten service providers.

A committee comprising both licensees and government representatives was formed to select the cellular standard for the UK. After assessing several options, including

NMT, the committee unanimously chose a modified version of the US AMPS standards which it called the Total Access Communication System (TACS) (Manninen, 2002).

Capacity was made available in the CEPT 900MHz band and the operators launched services in early 1985. Thus cellular services were being offered in the UK before the other large countries in Europe that had been planning services for years. This can be attributed to the commercial basis upon which cellular phone services were established.

The UK government did not use the service as an opportunity to support national champions by developing its own standard (unlike in France, Germany, and Italy). The infrastructure manufacturers selected by the operators were Motorola (US) and Ericsson


(Sweden). The initial mobile phone manufacturers included Motorola, NEC, and Mobira

(later Nokia). The UK benefited from the economies of scale for AMPS since the phones required relatively little modification for use on the TACS system. The prices of the phones (mostly vehicle based) started at around £1,500 but fell rapidly – by 1988 the average was £700 (or £1,400 for a hand portable). The operators and service providers subsidized the cost of phones, which put pressure on the manufacturers to reduce their prices. A high-level time line of the main events in the introduction of first generation systems in the UK is provided in Figure 12.

Country        Standard(s)          Launch date(s)   Penetration (end 1991)
Sweden         NMT 450 / NMT 900    10/81 / 12/86    7.0%
Norway         NMT 450 / NMT 900    11/81 / 12/86    5.5%
Denmark        NMT 450 / NMT 900    01/82 / 12/86    3.5%
Finland        NMT 450 / NMT 900    03/82 / 12/86    5.5%
Spain          NMT 450 / TACS       06/82 / 04/90    0.3%
Austria        NMT 450 / TACS       11/84 / 07/90    1.5%
Netherlands    NMT 450 / NMT 900    01/85 / 01/89    0.8%
UK             TACS                 01/85            2.0% (2 operators)
Germany        Netz-C               09/85            0.8%
Italy          RTMS / TACS          09/85 / 04/90    1.0%
France         R2000 / NMT 450      11/85 / 08/89    0.7% (2 operators from 89)
Ireland        TACS                 12/85            0.9%
Belgium        NMT 450              04/87            0.6%
Switzerland    NMT 900              09/87            2.6%
USA            AMPS                 mid-84           3.0% (many operators)

Table 8. Launch of analog cellular systems in major European countries and US (adapted from Garrard, 1997)


Despite having less technical expertise at the outset, Vodafone is considered to have made early decisions that led to a more scalable network architecture (e.g. higher capacity switches and the use of sectored cells from the outset). The extra capacity, at least for a time, is credited with giving Vodafone a competitive advantage that allowed it to wrest market share from Cellnet despite identical pricing.

Cellnet’s management was tightly controlled by BT and was hampered by BT’s bureaucratic procedures and rigid pay structures. BT, it must be remembered, only faced market competition from 1984 before which it was a state owned monopoly like the other

European PTTs. Cellnet suffered many changes in management in its early years and was not considered as marketing driven as Vodafone. Vodafone consistently outperformed

Cellnet financially.

Garrard (1997, pp. 122-123) argues that there is a fairly strong relationship between the market penetration of cellular services after five years and the cost of the services

(adjusted for purchasing power parity). This is in itself not too surprising as it simply describes the well known demand curve. However, he does point out that the UK falls some way off this curve by managing to achieve fairly high penetration without offering service at particularly low prices. Some of the possible explanations include competition between operators driving the rapid rollout of coverage, which was seen as being necessary for competitiveness (both operators reached the target 90% coverage of the UK population by mid-1987). The resources expended by the operators in promoting the services are also credited with raising awareness of mobile communications despite the low penetration of

PMR in the country compared to the US or the Nordic countries. Entrepreneurial service providers are also credited with driving growth in the UK.


[Figure 12 charts UK cellular subscribers in thousands (1974-1991, reaching about 2% penetration) above a time line of UK regulation, UK market, standards creation/adoption, and other key events.]

Figure 12. Time line of major events in first generation (1G) cellular services in UK

Second generation (2G) digital cellular radio services

Throughout the 1980s the European and national level telecommunications regulators, the telecommunications operators, and the equipment manufacturers worked together to create a second generation (2G) digital cellular telephony standard. The overarching objective was to create a pan-European standard. In the US the push for a second generation standard was only initiated at the end of the decade. In this instance the objective was to increase system capacity in urban areas. The way that the creation and adoption of 2G standards played out on both sides of the Atlantic is discussed in this section.

The development and launch of GSM in Europe

The need for European level coordination on cellular services was recognized in the 1970s, even before first generation systems had been launched. The

CEPT’s first contribution was a 1978 agreement that spectrum should be made available in the 900MHz band (see page 97). This band was also reserved for land mobile services in Region 1 (Europe and Africa) at the ITU World Administrative Radio Conference in

1979 (WARC-79) (Garrard, 1997 Ch. 5; Manninen, 2002). The 900MHz band was used by several countries, including the UK, for first generation analog systems, and provided the basis for the GSM second generation (2G) system supporting international roaming.

In 1980 the French PTT unsuccessfully attempted to interest other European operators in the development of a pan-European mobile telephone system. At this time France and the UK also collaborated on the possibility of adopting an NMT-based


system at 900MHz for first generation cellular systems – other countries were using, or

planning to use, the 450MHz band. While there was no European level collaboration at

this stage the Nordic countries and France both started some work on digital systems

around 1981. The threat of the Franco-British proposal endangering the use of 900MHz for a

future pan-European system helped the Dutch garner support for a 1982 proposal to get

work started on a pan-European 2G mobile system. Thus the CEPT established a standards committee – Groupe Spécial Mobile (GSM) – to specify a new standard for

Europe. This in itself was not necessarily a promising start as the CEPT did not have a particularly strong track record of effective standards creation (e.g. standardization was too late, ambiguous X.25 standards led to interconnectivity problems among national networks, and a lack of manufacturer representation led to unnecessarily expensive solutions) (Garrard, 1997 Ch. 5; Manninen,

2002).

The first GSM meeting held in December 1982 set high-level requirements for the new standard. These requirements included the need for international roaming, high spectral efficiency, support for hand portables, and low total system costs

(particularly low cost mobile terminals). There was an early working assumption that the standard should be digital. This was partly to take advantage of potentially improved speech quality, higher capacity, better security and the ability to support more services, but also to ensure that no country’s existing analog technology would have an unfair head start. Thomas Haug, who had previously chaired the NMT group, became the chairman of the GSM group (Garrard, 1997 Ch. 5; Mölleryd, 1999, p. 130).


The GSM Group was formed under the CEPT’s Coordination Committee for

Harmonization (CCH). The CCH limited the GSM group by preventing it from reaching out to

non-CEPT groups directly (e.g. ITU), limited its scope to technical issues (and not

operational and procurement issues, in contrast to the wider role of the NMT committee), and was resistant to the group setting up its own working parties. However, its position within the CEPT also allowed it to draw upon other resources. By early 1985 the GSM committee was satisfied with the technical and economic feasibility of GSM; the resources committed increased and the CEPT gave the committee more flexibility. For example a “Permanent Nucleus” (PN) that would support the work between meetings became operational in mid-1986 and several Working Parties were formed to deal with specific issues like Services and Facilities (WP1), Modulation (WP2), and Network

Aspects (WP3) (Manninen, 2002).

The GSM committee considered coordinating with a wider set of operators through the ITU. While active coordination with the ITU was rejected because of its slow pace of standards creation the GSM committee did build upon existing ITU standards, e.g. the numbering plan and signaling system number 7 (SS7). In 1984 the GSM committee was also approached by Bellcore (the R&D arm of the seven newly independent Regional

Bell Operating Companies) about cooperating on compatible mobile systems among other areas. However cooperation was seen as being difficult and it was feared that the wider coordination would impact the GSM timetable and the request for formal coordination was declined (Manninen, 2002).

In June 1984 a variety of radio-access schemes were selected by the GSM committee for further study (several variations of FDMA, TDMA, and CDMA were included). In an


effort to support their own manufacturers, political level actors in France and Germany launched their own study program, and companies from both countries formed consortia around various technologies. The consortium led by German company SEL had a head

start in the development of a wide-band TDMA based radio interface, which was

further strengthened when Italy joined the Franco-German cooperation in June 1985 and

put its weight behind it. The UK joined this group of large European countries in 1986 to

form the quadripartite group. Candidate access schemes were trialed from mid-October

1986 to January 1987. The results on agreed technical, operational, and economic criteria

clearly showed that narrow-band TDMA was superior to the other options trialed. In a

February 1987 meeting thirteen out of fifteen administrations backed the narrow-band

approach while Germany and France continued to express a preference for wide-band.

No formal decision could be reached as the CEPT required unanimous agreement.

However, the GSM Committee continued with the development of the narrow-band

TDMA approach by adopting it as a working assumption until it was resolved at the

political level within the quadripartite forum held in May. The more detailed

specification work that now followed required more input from manufacturers and from

Spring 1987 onwards manufacturers were able to participate directly in the GSM

committee, an unusual step for the CEPT which traditionally restricted participation to its

PTT members (Manninen, 2002; Temple, 2001).

The GSM air-interface finally specified uses Time Division Multiple Access (TDMA)

to share a 200 kHz radio channel among eight users. When making calls each mobile

terminal sharing a radio channel takes turns to transmit bursts of data (typically

containing digitized speech) in allocated timeslots. As with 1G systems the duplex


capability is provided using two radio channels – a scheme referred to as Frequency

Division Duplex (FDD). However, the uplink and downlink timeslots are allocated such

that the mobile terminal does not have to transmit and receive simultaneously –

simplifying the design of the radio in the terminals. Supporting only 8 conversations per

200 kHz meant that GSM only supported the same number of calls per MHz as NMT or

TACS. However, overall systems capacity was higher since the digital signals were more

robust to interference among cells and frequencies could thus be reused more intensively

than on the analog systems. GSM system capacity was later doubled as a more advanced

digital speech compression algorithm (the ‘half-rate codec’) supported sixteen users per

channel.
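The timeslot arithmetic can be sketched as follows (a simplified illustration; the slot duration and the three-slot uplink offset reflect the published GSM numbers, but the function names are assumptions and this is not a description of a real implementation):

    # Sketch of GSM-style TDMA/FDD timing (simplified).
    SLOT_US = 577            # one timeslot, in microseconds (approximate)
    SLOTS_PER_FRAME = 8      # eight users share one 200 kHz carrier
    UPLINK_OFFSET = 3        # uplink lags downlink by three slots

    def downlink_window(slot, frame=0):
        start = (frame * SLOTS_PER_FRAME + slot) * SLOT_US
        return start, start + SLOT_US

    def uplink_window(slot, frame=0):
        start = (frame * SLOTS_PER_FRAME + slot + UPLINK_OFFSET) * SLOT_US
        return start, start + SLOT_US

    # A terminal assigned slot 0 receives during [0, 577) us and transmits
    # during [1731, 2308) us -- it never transmits and receives at once.
    print(downlink_window(0), uplink_window(0))

    # Spectral bookkeeping: 200 kHz / 8 users = 25 kHz per call, the same
    # per-call spectrum as a 25 kHz analog channel; the capacity gain came
    # from tighter frequency reuse (and later the half-rate codec).
    print(200 / 8)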

Sophisticated voice compression and radio channel coding techniques were required.

Overall system complexity was greater than expected and by the end of 1990 it was

estimated that ten manufacturers had invested 5,000 man years in the development.

However, once designs were instantiated in hardware and software, digital cellular

systems supported more economical infrastructure and mobile devices than earlier analog

systems. Infrastructure, for example, could be cheaper since eight (or sixteen) users are

supported by one radio. Other advantages of digital systems in general included higher

system capacity, and better audio quality. Better security mechanisms built into the

digital specifications made casual eavesdropping next to impossible and eliminated some

of the ways in which fraud had been perpetrated in first generation systems. In the GSM

case the larger expected market size allowed economies of scale to be realized (Garrard,

1997 Ch. 5).


However, the GSM specification dealt with much more than the air-interface between

the infrastructure and the mobile terminals. For example the interface between the mobile switching center

(MSC) and the base station controllers (BSCs), the so-called ‘A’ interface, was

exhaustively defined. The ‘A-bis’ interface between the BSCs and the base transceiver

stations (BTS) was also defined. This was intended to eliminate the risk of lock-in to the

equipment of a particular infrastructure manufacturer. As the scale of the standardization task became apparent, the effort was split into phases to increase the likelihood of delivering the most important functionality

(telephony and international roaming) by the 1991 launch date stipulated in the MoU. By

the beginning of 1990 the first release of the GSM recommendations contained more than

5,600 pages. It would be late-1995 before the specifications for phase 2 were complete.

Phase 2 and phase 2+ specifications more than quadrupled the number of pages of the

phase 1 recommendations. A high level architecture of the GSM system is depicted in

Figure 13 (Garrard, 1997 Ch. 5).

The UK took the lead in gathering the commitment to GSM that would give manufacturers the confidence to make the large investments required to develop infrastructure and terminals. In addition, a strong signal that there would be a large market would be required for semiconductor vendors to develop the highly integrated chips necessary for economical and power efficient handset terminals (handheld terminals were already gaining popularity in the analog systems operational in the late 1980s). A recommendation from the CEPT was not seen as a strong enough commitment as PTTs routinely ignored recommendations when they were inconvenient (Temple, 2001).


The UK raised the concept of a Memorandum of Understanding (MoU) to be signed by operators across Europe committing them to launching commercial GSM networks by July 1991. This date was in itself a tricky compromise between countries that expected to run out of capacity on analog systems earlier and those that would not need the additional capacity for some years afterwards. The idea of the MoU was first raised within the quadripartite forum in May 1987 and the document was drafted in July.

Sweden was included in the drafting in the hope of winning the support of the Scandinavian countries. The MoU received wide support and was signed by 13 operators from 12 countries in September 1987. An MoU Group made up of the signatories took on the responsibility for coordinating the non-technical issues necessary for the commercial launch of GSM networks, including billing & accounting, marketing, roaming, technical coordination, technical compatibility, and administrative procedures of acceptance (Temple, 2001). The signing of the MoU by operators and regulators provided an unmistakable commitment to manufacturers that there would be a large market for

GSM equipment. It also sent a strong signal to other countries around the world that

GSM would be a major 2G standard – a signal that strongly influenced the adoption of

GSM in other countries (Funk, 2002).

The CEC perceived the creation of a pan-European mobile phone standard to be well aligned with its own vision of a more united Europe. In early 1985 the CEC pressurized the

GSM Committee to bring forward the publication of final specifications by two years to

the end of 1986, and the committee feared that the CEC was trying to take over the

project. The committee came to the conclusion that advancing the schedule so much

would not be feasible unless an existing analog system was selected. By summer the CEC’s


initiative to steer the development of GSM had dissipated (Manninen, 2002). Later the

CEC took on a more supportive role and issued directives mandating the deployment of a single standard and requiring national administrations to free up part of the CEPT 900 MHz band (CEC, 1987a, 1987b).

[Figure 13 depicts the Subscriber Identification Module (SIM) within the Mobile Station (MS), the Base Transceiver Station (BTS), the ‘A-bis’ interface to the Base Station Controller (BSC), and the ‘A’ interface to the Mobile Switching Center (MSC) with its Home Location Register (HLR) and Visitor Location Register (VLR), connected to the fixed telephone and other networks.]

Figure 13. High-level GSM System Architecture

The UK was ahead of the rest of Europe in liberalizing the provision of telecommunications services (see page 98). The Commission of the European

Community (CEC) was the driving force behind opening up the telecom market elsewhere in Europe. A 1988 Commission Directive was intended to liberalize the market for terminal equipment, and a 1990 directive required that the regulation of telecommunications be separated from the operation of telecom networks (CEC, 1988,

1990). In 1996 these directives were amended by a directive that would lead the way to

full competition in telecommunications services in Europe by 1998 (CEC, 1996b). By the

late-1980s these institutional changes along with the increasingly complex nature of

telecom systems meant that the CEPT, with its membership restricted to monopoly PTTs,

was not the ideal forum for standards creation and that broader participation was

required. The European Telecommunications Standards Institute (ETSI) was formed in

1988 and the responsibility for GSM transferred to it in 1989. ETSI membership was

open to manufacturers with R&D facilities in Europe. So Motorola, a US based company, was able to become a full member of ETSI (Garrard, 1997 Ch. 5).

To protect themselves from uncertainty about patents (or IPR) associated with the

GSM standard operators wanted manufacturers to indemnify them against any IPR

infringement claims and to make their IPR freely available to any other manufacturer to

promote rapid diffusion of GSM globally. Only Motorola stood up to these demands and

eventually, IPR was dealt with through bilateral agreements between manufacturers –

which put small manufacturers at a disadvantage. It is believed that the terms for GSM

IPR were prohibitively expensive for non-European manufacturers and by the end of

1995 Motorola, Nokia, and Ericsson held 75% of the market for GSM terminals

(Garrard, 1997 Ch. 5).

Some token GSM services were available on July 1, 1991 in Denmark, Sweden and

Finland, but mobile terminals were not yet available in commercial quantities. The delays in

finalizing the phase 1 specification had a knock-on effect on mobile terminal design and the implementation of a type approval process. Many commercial systems were launched

in mid-1992 once an interim type approval process was put in place. GSM services were


launched in almost all Western European countries by 1995. Neither of

the operators in the UK strictly met their MoU obligations of launching commercial GSM

services by July 1991. Vodafone launched its network in the UK in July of the following

year but did not market it or provide wide coverage. It was mainly adopted by a small

niche of customers attracted to its international roaming capabilities. Cellnet resisted launching its GSM network until July 1994 by which time recovery from the early 1990s recession and demand from consumers driven by the new PCN operators (see page 112) had reinvigorated the market for mobile phone services and put pressure on the capacity of the TACS networks (Garrard, 1997 Ch. 5).

Personal Communications Network (PCN) in the United Kingdom

In early 1989 part of the UK government, the Department of Trade and Industry

(DTI), started to explore the possibility of licensing new operators to offer cheaper mobile phone services than those offered by Vodafone and Cellnet using TACS and

GSM. By the end of the year three Personal Communications Network (PCN) licenses were awarded to consortia that included several US RBOCs and other

operators. The licensees had chosen the GSM standard as the basis for services although

their networks would have to work in the 1,800 MHz PCN band (Garrard, 1997 Ch. 6).

After the awarding of the licenses there was considerable work in establishing the

standard to be used, the appropriate licensing terms, and network planning. During that

time there were significant changes in ownership among the licensees and two of them

merged in 1992. The PCN specification was available by 1993 and had been ratified as

the European standard by ETSI – the standard is referred to as PCN, Digital Cellular


System (DCS1800), and GSM1800. It was essentially the GSM specification operating in

the higher 1,800 MHz band (1710-1785 MHz and 1805-1880 MHz), with more channels,

and lower power. PCN terminals were only to be hand portables. The advantage for

operators was that the system would take less time to develop, but on the other hand it could not

900 MHz. Indeed the higher frequency of operation meant that the PCN operators would need between 4 and 6 times as many cells to provide the same level of geographical coverage as operators using 900 MHz. Although 1,800 MHz provided more network capacity this was not an advantage during the initial phase of establishing a network. The

900 MHz operators had been able to build out their coverage relatively cheaply (£100-

£150 million) and grow capacity as required for a total network investment in the region

of £800 million. It was likely to cost PCN operators around £660 million to create

enough coverage for 60% of the population (Garrard, 1997 Ch. 6).
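A back-of-the-envelope calculation, assuming free-space-like propagation (an assumption made for this sketch, not a model used by the sources), shows where a figure of this order comes from:

    # Free-space loss grows with the square of frequency, so at a fixed link
    # budget the cell radius scales as 1/f and the covered area as 1/f^2.
    def cell_count_ratio(freq_ratio):
        radius_ratio = 1.0 / freq_ratio    # r ~ 1/f at constant link budget
        return 1.0 / radius_ratio ** 2     # cells needed ~ 1 / r^2

    print(cell_count_ratio(1800 / 900))    # 4.0 -- the low end of the range
    # Extra in-building and clutter losses at 1,800 MHz push the practical
    # figure toward the top of the quoted 4-6x range.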

In addition the semiconductor technology for 1,800 MHz was not as well developed as for 900 MHz. This along with the smaller potential market meant that there would be fewer handset types available and that at least initially costs would be higher. To offset

the technical disadvantage the PCN operators were provided with some relief by the

regulator. PCN operators were not required to use service providers and could market

and support their own customers. They were allowed to share network infrastructure under some circumstances, and to own their own backhaul networks. However, these

could only provide temporary relief as the existing operators were to receive the same

concessions by 1993 (Garrard, 1997 Ch. 6).


The growth of the consumer cellular market in the US gave PCN operators some level of confidence that latent consumer demand was likely to exist in the UK as well.

However, the delay between the awarding of the licenses and the launching of the services gave Cellnet and Vodafone plenty of time to prepare for the anticipated increase in competition. Both brought new tariffs to market that offered lower monthly fees and higher per minute charges, aimed at low usage consumers (Garrard, 1997 Ch. 6).

The first PCN operator, Mercury One 2 One, launched its service in September

1993 – almost four years after it obtained its PCN license. It focused on providing high quality regional service (including in-building coverage for hand portables) initially in

London. Its primary innovation was in the marketing of mobile services to consumers. It offered lower prices than the existing operators – even providing free off-peak calls with the intention of getting people to use their mobiles without thinking. This strategy paid off and users increasingly used their mobile phone instead of fixed line ones. Mercury also developed innovative ways of selling the devices in consumer retail outlets ill-suited to the more complex process of registering customers. This was achieved by handling user registration via phone and mail. By mid-1995 One 2 One was forced to accelerate its coverage rollout plans to stay competitive and to raise an additional £1 billion of financing (Garrard, 1997 Ch. 6).

The other PCN operator, branded as Orange, launched in April 1994. They took a different tack than Mercury by striving to emulate Cellnet and Vodafone’s extensive coverage, albeit at the expense of indoor coverage. Orange achieved 90% coverage of the population by the end of 1995. Orange also innovated in the way mobile services were


marketed to consumers (e.g. by bundling minutes or air time with the monthly fixed fee and providing one second billing increments) (Garrard, 1997 Ch. 6).

In July 1996 Cellnet and Vodafone were allocated 2 x 11.5 MHz in the 1800 MHz

band. This allowed these operators to take advantage of the characteristics of the 900

MHz and 1800 MHz bands through the provision of additional capacity in urban areas using the higher band while still retaining the more economical coverage of rural areas.

Dual band 900/1800 MHz phones were developed to support these sorts of network configurations. The market shares of the four UK operators tended to converge in the late

1990s (see Figure 14).

Figure 14. Trend in share of UK mobile subscribers (Source: Financial Times)

A high-level time line of the main events in the introduction of second generation

systems in the UK is provided in Figure 15.


[Figure 15 charts UK cellular subscribers in millions (1987-2004, reaching about 103% penetration) above a time line of UK/European regulation, UK market, and standards events.]

Figure 15. Time line of major events in second generation (2G) mobile services in UK / Europe

The development of 2G standards in the US

The situation in the US cellular market was very different from that in Europe at the end of the 1980s. The US already had a single analog standard and its cellular telephone industry had been competitive from its inception. The incentive for developing a digital

second generation standard in the US came from the pragmatic need to increase capacity

in busy urban areas. As in Europe 2 x 25 MHz were allocated to cellular services (25%

larger than originally allocated). However, there were technical factors that made the

efficient use of spectrum more difficult than in Europe (wider channel spacing, reduced

trunking efficiency of having two operators in every market, and the slightly lower

800MHz band reducing frequency reuse relative to the CEPT 900MHz band) (Garrard,

1997 Ch. 8).
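The trunking-efficiency point can be illustrated with the standard Erlang B formula (an illustrative calculation with round numbers, not the actual US channel counts): splitting a market’s channels between two operators supports less total traffic at a given blocking probability than one pooled allocation would.

    # Erlang B blocking probability via the standard recursion.
    def erlang_b(offered_erlangs, channels):
        b = 1.0
        for m in range(1, channels + 1):
            b = offered_erlangs * b / (m + offered_erlangs * b)
        return b

    # Largest offered load that keeps blocking below the target (bisection).
    def max_traffic(channels, target_blocking=0.02):
        lo, hi = 0.0, float(channels)
        for _ in range(60):
            mid = (lo + hi) / 2
            if erlang_b(mid, channels) < target_blocking:
                lo = mid
            else:
                hi = mid
        return lo

    print(max_traffic(100))     # ~88 Erlangs for one pool of 100 channels
    print(2 * max_traffic(50))  # ~81 Erlangs for two separate pools of 50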

In 1988 the CTIA established a set of requirements for the upgrade of AMPS. The

primary objective was to increase capacity by a factor of ten while supporting a gradual

transition from AMPS. The TIA (a manufacturer’s trade association accredited by ANSI,

the American National Standards Institute) was tasked with developing the standard. This

contrasts with the CEPT forum for the development of GSM where manufacturers could

not be members – even when ETSI took over the development of GSM operators played

a much larger role than in the US (Garrard, 1997 Ch. 9).

The standard created by the TIA built upon the AMPS architecture. Individual 30 kHz

wide channels that supported one telephone conversation using analog frequency

modulation (FM) could support 3 telephone conversations using digital voice

compression and a digital phase modulation scheme. Three digital users were able to

share the same radio channel using TDMA (see page 106 for an explanation of TDMA)


and capacity would be doubled again when a more advanced voice codec was developed.

Transition costs were kept modest by allowing operators to digitize channels as necessary

by upgrading radio modules rather than purchasing and installing new base stations. The

official name for the new air interface standard was IS-54 but it was more often referred

to as D-AMPS (for Digital-AMPS) or just as TDMA. D-AMPS was first used in

congested areas in 1992. The first dual-mode terminals (i.e. AMPS and D-AMPS) were

larger and more expensive than AMPS only models. Sales of the dual-mode terminals

were slow to begin with and reached 7.5% of sales by 1994 (Garrard, 1997 Ch. 9).

The first version of IS-54 was accepted in November 1990. While the creation of the

IS-54 was much quicker than that of the GSM recommendations, it should be noted that IS-54 was a more limited standard than GSM. As with AMPS it did not deal with interfaces between radio equipment and switches, for example. The downside of its backward compatibility with AMPS was that it did not offer any new service capabilities. Work on an enhanced version started before IS-54 was published. The enhanced standard, IS-136, was published in 1994 and could provide capabilities similar to GSM. The intention was to use it as and when the market needed new services (Garrard, 1997 Ch. 9).

IS-54 D-AMPS could have become the single standard for 2G cellular in the US but for the action taken by a small San Diego based company, Qualcomm, that proposed an

alternative 2G technology based on a radically different multiple access scheme called

Code Division Multiple Access (CDMA).

At this point it is worth revisiting some of the ways in which multiple radio users can

share a portion of spectrum (a concept referred to as multiple access). First generation

systems employed FDMA/FDD (see page 89 for an explanation) while GSM, D-AMPS


and several other 2G systems employed TDMA/FDD (see page 106 for an explanation).

These approaches shared limited radio spectrum by allocating either a pair of frequencies

(in the case of FDMA/FDD) or a pair of frequencies and timeslots (TDMA/FDD). The

scheme proposed by Qualcomm for sharing radio spectrum is called Code Division

Multiple Access (CDMA). In this scheme all conversations or data communications are

transmitted on the same pair of frequencies, and at the same time (assuming FDD is used

for duplexing). The signal to be transmitted is combined with a bit stream with a much higher bit rate (known as a chip rate to distinguish it from the bit rate of the signal to be transmitted). This higher rate bit stream is a pseudo random noise sequence referred to as a chipping code (this code is the ‘C’ in CDMA). The resulting radio signal occupies 1.25

MHz in the case of Qualcomm’s 2G system. At the receiver the known chipping code is

used to recover the original signal containing, say, the digitized speech of a telephone

call. The ability of this process to act as a multiple access scheme relies on the existence

of families of chipping codes that are orthogonal to one another. Signals on the same

radio channel using other codes from the family remain as random noise when the desired

signal is recovered using its specific code. Although the CDMA concept pioneered by

Qualcomm was untried in commercial communications systems it had been used by the

military for its resistance to interception and jamming. Qualcomm had experience with

CDMA from contract work for the US military (Garrard, 1997 Ch. 9; Mock, 2005).
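The core spreading and despreading idea can be demonstrated in a few lines (a toy sketch with a chip rate of four and noiseless addition; the code names are assumptions, and real IS-95 signals use far longer codes and continuous waveforms):

    # Toy direct-sequence CDMA sketch: two users transmit on the same
    # frequency at the same time, separated only by orthogonal chipping codes.
    WALSH = {
        "user_a": [1, 1, 1, 1],       # length-4 orthogonal codes (+1/-1 chips)
        "user_b": [1, -1, 1, -1],
    }

    def spread(bits, code):
        """Multiply each data bit (+1/-1) by the full chipping code."""
        return [b * c for b in bits for c in code]

    def despread(signal, code):
        """Correlate against one code; orthogonal codes average to zero."""
        n = len(code)
        return [1 if sum(s * c for s, c in zip(signal[i:i + n], code)) > 0 else -1
                for i in range(0, len(signal), n)]

    # Both transmissions simply add together on the air interface.
    a_bits, b_bits = [1, -1, 1], [-1, -1, 1]
    on_air = [x + y for x, y in zip(spread(a_bits, WALSH["user_a"]),
                                    spread(b_bits, WALSH["user_b"]))]

    print(despread(on_air, WALSH["user_a"]))   # [1, -1, 1]  -> user A recovered
    print(despread(on_air, WALSH["user_b"]))   # [-1, -1, 1] -> user B recovered

Because the codes are mutually orthogonal, the correlation step recovers each user’s bits while the other user’s contribution sums to zero, which is the essence of code division multiple access.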

Theoretically the capacity of a range of radio spectrum is the same regardless of the multiple access scheme used. However, in practical systems sharing using time slots proved more efficient than using individually assigned channels. A major part of the benefit is also tied into the conversion to digital transmission allowing more


intensive frequency reuse. Sharing using orthogonal chipping codes in turn proved more efficient than TDMA, at least in many usage scenarios where the same frequency is reused in all cells. CDMA also makes frequency coordination, cell splitting, and dealing with coverage black spots much easier than with FDMA or TDMA. However, the superiority in terms of spectral efficiency and other characteristics was bitterly contested by the proponents of the competing standards (D-AMPS and GSM).

Once the CTIA had released the requirements for the upgrade of AMPS (in 1988)

Qualcomm started promoting CDMA as a superior alternative. In December 1988 the

FCC gave cellular operators the authority to introduce new technologies without requiring additional approval (FCC, 1988). An implication of this ruling was that the

FCC was not requiring a single national 2G standard (Garrard, 1997 Ch. 8) – a decision that left an opening for CDMA.

Qualcomm undertook a program of trying to convince the industry of CDMA’s ability to increase the capacity of cellular systems by more than an order of magnitude compared to the factor of three or so for TDMA based approaches.

Pacific Telesis (PacTel), which had severe capacity constraints in Los Angeles, was particularly receptive and partially funded Qualcomm’s first demonstration of CDMA based mobile communication in San Diego in November 1989. A further demonstration of the technology in the urban setting of New York City hosted by Nynex further bolstered its credibility and allowed Qualcomm to win the commitment of several operators and manufacturers for larger scale field trials. Despite the setback represented by the CTIA’s adoption of D-AMPS (in November 1990) Qualcomm continued to try to enroll network operators.


The first documented version of the CDMA air interface was developed by

Qualcomm in collaboration with the network operators that it had already convinced of the benefits of CDMA (PacTel, Nynex, and Ameritech). The first version, the Green

Book, was published in July 1990. Successive revisions were made before the Gold Book was finalized prior to larger scale capacity trials performed in collaboration with PacTel in San Diego in November 1991. These trials went a long way in convincing the industry that CDMA could live up to its promises and helped Qualcomm hold a successful IPO in

December of that year. The increasing commitment of PacTel, Ameritech and Nynex led the manufacturers, Motorola, AT&T (later Lucent), and OKI Electric to support wider scale trials of the technology (Mock, 2005).

Field trials of IS-54 and an alternative analog approach (N-AMPS) were also carried out in late 1991. Despite the success of the CDMA trial the CTIA board unanimously endorsed TDMA as its sole choice for digital cellular in January 1992. However, it also asked the TIA to explore CDMA further as a potential alternative. By mid-1992 the

CTIA had asked the TIA to develop a CDMA standard which became known as IS-95.

Despite the delaying tactics of TDMA’s proponents the IS-95 standard was accepted by the

TIA in July 1993. Although IS-95 did not replace IS-54 it became a plausible alternative.

Delays in the introduction of TDMA and its teething problems in the first installations gave CDMA proponents the opportunity to continue to enroll new supporters. US West

New Vector placed orders for CDMA equipment with Motorola and other suppliers in September

1992 i.e. even before the technology had been accepted by the TIA. In March 1993 Bell

Atlantic Mobile Systems also made a commitment to CDMA. Of the major 800 MHz operators AT&T Wireless, Southwestern Bell, and Bell South went with D-AMPS (203


million population covered). While Airtouch, Ameritech, Bell Atlantic/Nynex,

GTE, 360º Communications, US Cellular, and others went with CDMA/IS-95 (256

million population covered) (Garrard, 1997 Ch. 9).

An important international success for Qualcomm was Korea’s adoption of CDMA as

its national standard for wireless services in March 1993. This adoption was part of the

Korean government’s industrial policy to provide domestic manufacturers an opportunity to establish themselves in mobile telephony equipment markets – something that had not been possible with the 1G and the existing 2G standards dominated by European and American manufacturers (Yoo, Lyytinen, & Yang, 2005).

Qualcomm worked with US and Korean operators, and several manufacturers, to further develop the technology in the CDMA Development

Group (CDG) formed in December 1993. The CDG played a leading role in promoting

CDMA around the world (Garrard, 1997 Ch. 9).

The world’s first commercial CDMA network was launched in Hong Kong in 1995.

In the US PrimeCo (now Verizon Wireless) launched CDMA in fifteen US markets in

November 1996.

The development and launch of Personal Communication Services (PCS) in the US

Operators involved in PCN consortia in the UK petitioned the FCC for spectrum and licenses in the 1.9GHz region. The rather bureaucratic process followed by the FCC involved several rounds of asking for submissions regarding a “Notice of Inquiry”

(1990-91), publishing policy papers in 1991 and 1992, and consulting industry and the public about how to auction spectrum in 1993 and 1994 (Garrard, 1997 Ch. 9).


The PCS spectrum (1,850 to 1,910 MHz and 1,930 to 1,990 MHz) was split into blocks of various bandwidths and geographical coverage, with the actual auctions taking place from 1994 to 1996. The most attractive blocks of spectrum were auctioned in 52

Major Trading Areas (MTA) that were much larger than the MSAs and RSAs used for

AMPS licensing process. There were no stipulations about the technologies to be used or the services to be offered (data, voice, and paging services were among those envisaged).

Sprint, AT&T Wireless PCS, PrimeCo, and American Portable Telecommunications

(APT) were the biggest auction winners. The TIA was involved in creating standards for

PCS services. IS-54 TDMA and IS-95 CDMA standards were adapted for operation at the higher frequencies.

Qualcomm and the backers of D-AMPS and GSM competed for the adoption of their standards by these new operators. The claims of higher system capacity and lower capital expenditure for CDMA based networks were important components in the adoption of

CDMA by several of the major PCS operators. The teething problems of the early D-

AMPS installations and the willingness of infrastructure manufacturers, such as Nortel and Qualcomm, to provide vendor financing were also part of the overall commercial arrangements that influenced the standards adoption decision. The extra capacity offered by CDMA also allowed operators to adopt a higher bit-rate (13 kbits/s) voice codec – the resulting near fixed-line voice quality was a differentiator in the marketplace.

After the operators had decided on the technologies they would adopt, CDMA based PCS systems covered 46 MTAs (243 million population). Ameritech Wireless and GTE were among the major operators that chose CDMA for their 800 MHz systems. Two operators that had already adopted TDMA at 800 MHz (AT&T Wireless and Southwestern Bell)


also adopted TDMA for a total of 33 MTAs (114 million population). In a somewhat surprising move several PCS operators chose GSM as a standard (called PCS1900,

GSM1900, or IS-661). This made sense from the perspective of operators without commitments to TDMA or CDMA at 800 MHz since it allowed them to access proven technologies and equipment. More surprising was the adoption of GSM by operators, including Bell South, that had prior commitments to CDMA or TDMA at 800 MHz. Cooperation between

(800/900/1800/1900 MHz) phones were developed to support roaming among the customers of GSM operators using different bands around the world (Garrard, 1997 Ch.

9).

APC launched its GSM based PCS1900 services in November 1995 in Washington/Baltimore. More PCS1900 launches followed throughout 1996, and PrimeCo PCS launched the first CDMA based PCS services in several markets in November 1996 (Garrard, 1997 Ch. 9). Like the PCN operators in the UK, the PCS operators faced the challenge of entering markets served by operators using 800 MHz networks. However, the challenge was even tougher as the cellular operators had already targeted consumers, and many of the innovative marketing approaches used by the PCN operators in the UK had been borrowed from the US in the first place. APC borrowed ideas from Mercury One 2 One for the distribution of handsets, for completing the registration process by phone, and for the bundling of minutes and other services in one monthly fee (Garrard, 1997 Ch. 9). A high-level time line of the main events in the introduction of second generation systems in the US is provided in Figure 16.


Another wireless technology that encroached on the market addressed by the cellular and PCS companies was the proprietary iDEN (Integrated Digital Enhanced Network) system introduced by Motorola in 1994. The services offered by iDEN differed from those of other 2G systems: it combined paging, data, and cellular communications as well as the voice dispatch applications traditionally served by PMR systems. In 1996 a company called Nextel brought iDEN to the market on Specialized Mobile Radio (SMR) frequencies in the 800/900 MHz range, i.e. neither the 800MHz cellular nor the 1900MHz PCS bands. Nextel provided interconnectivity with the telephone network in much the same way as cellular and PCS operators. However, its simulation of traditional two-way half-duplex radio, often referred to as “push-to-talk”, was a market differentiator. By late 2001 Nextel had 7.2 million subscribers (Luna, 2001b; Méndez-Wilson, 2001b).

The adoption of 1G and 2G standards around the world

AMPS was the most successful analog standard globally. It had been adopted by 104 countries worldwide by the end of 1998 (including its TACS variant). Its nearest competitor, NMT, was adopted in only 35 countries (Funk, 2002).

As other countries decided on the adoption of a digital standard, either to address the capacity limitations of analog systems or to introduce competition, they looked to the systems in Europe and the US. Garrard argues that a decisive factor in the success of GSM was that it was guaranteed a large market as it was the single standard for Western Europe, and therefore infrastructure and mobile device options would be plentiful. In contrast the US was bringing two standards to market and it was not certain whether one or both would succeed. Thus selecting one of them would have been a riskier decision. By the end of 1998 GSM had been adopted by 120 countries, while D-AMPS and CDMA were adopted by 35 and 15 countries respectively (Funk, 2002).

The decision to adopt CDMA in Korea was driven by industrial policy. The licensing of the technology from Qualcomm and the development of infrastructure and mobile terminals were seen as a way for Korean companies to enter international mobile telephony equipment markets (Mock, 2005; Yoo et al., 2005).


[Figure: US cellular subscriber growth (in millions, 1987-2004, reaching about 61% penetration) charted above parallel time lines of regulation (FCC upgrade ruling 12/88, PCS consultation 1989-94, PCS auctions 1994-96), market (AMPS launch, CTIA upgrade requirements 9/88, D-AMPS launch 1992, PCS1900 (GSM) launch 1995, CDMA PCS launch 1996), and standards/innovation events (IS-54 1990, IS-136 1994, CDMA trials 1989-90, IS-95 adopted 1993, IS-95 RevB 1999).]

Figure 16. Time line of major events in second generation (2G) cellular services in US

Discussion of wireless industries in the US and the UK

In this section we return to the research questions presented in Table 5 of chapter 3. We use the descriptions of the early days of the cellular wireless industry in the US, the UK, and elsewhere provided in this chapter to develop a range of answers to these questions and to highlight where the theoretical perspectives presented in chapter 2 provide the most insight.

Standards creation and adoption (RQ1)

The first research question (see Table 5) was, “How does technical standards creation and adoption play out in the construction of large scale information systems?” Here we consider the different ways in which standards were created in the first and second generation wireless systems.

The first generation of the cellular wireless industry provides a range of paths for the creation and adoption of standards. The starkest differences can be seen by differentiating among regulatory settings. In the US, the vertically integrated monopoly telecom services provider, AT&T, was regulated by local and federal agencies. The first generation wireless standard developed by AT&T, AMPS, was effectively given de jure status as it was the only system proposal received by the FCC. It is not surprising that the standard did not specify the signaling required for inter-operator roaming – it is unlikely that a monopoly operator would foresee such a need. This under-standardization was later corrected by industry-consortia based committee standards making when the need for inter-operator roaming was fully appreciated in a fragmented and competitive marketplace for cellular services. The operators’ industry association (CTIA) requested the manufacturers’ industry association (TIA) to develop the IS-41 standard to its requirements. The key actions and actors in the creation of the AMPS and IS-41 standards are depicted in Figure 17²⁴.

Prior to the 1990s telecom services in most European countries were provided by, and regulated by, government owned PTTs. In the larger countries (Germany, Italy, and

France) the PTTs chose to create national cellular standards. These choices were driven by public policy interests of supporting domestic telecom equipment manufacturers.

In Scandinavia several PTTs cooperated in the creation of the NMT standard. As roaming across networks had been envisaged from the start, the standard included the specification of inter-switch communications. In contrast to AMPS, a standardized interface between the switch and base stations allowed operators to integrate radio and switching equipment from different manufacturers. In Scandinavia the primary goal of the PTTs was to provide useful wireless services rather than to support domestic manufacturers. The international nature of the standards-setting body, the NMT Group, also served to temper any direct support for national champions.

In the UK, the two licensed operators and government representatives assessed several existing and planned cellular standards before adopting a variation of AMPS.

This contrasted sharply with the industrial policy driven decisions in other large

European countries. The UK government’s political ideology allowed technological and

24 Figure 17 is the first instance of this sort of ‘time-sequence’ diagram in this dissertation. The four analytical domains from Figure 3 (innovation system, marketplace, regulatory system, and standards) are shown as four vertical lines. The horizontal arrows depict major actions emanating from a domain and their effects on other domains (or on the domain itself). The main actor behind each action is given along with a brief description and the date of the action. The actions are presented in chronological order with the earliest event shown at the top of the diagram. The idea is to show the major interactions among the domains and their temporal sequence – something that is not so easily depicted using the triangular framework of Figure 3.


economic criteria to outweigh any temptation to support national manufacturers. Thus

UK operators were permitted to benefit from the economies of scale and lower risk associated with equipment manufactured for the much larger US cellular market. The key actions and actors in the development of 1G in the UK are shown in Figure 18.

[Figure: time-sequence diagram across the innovation system, marketplace, regulatory system, and standards domains, tracing US 1G events from AT&T’s non-cellular mobile phones (1946), the cellular concept (1947), and decades of spectrum lobbying, through the 800 MHz cellular allocation (1974), the AMPS air interface submission to the FCC (1971), the Chicago trial and service launch (1978), fragmented cellular licensing (1982-89), AT&T divestiture (1984), CTIA roaming requirements (1984), and the first IS-41 inter-system standard (1988).]

Figure 17. Key actors and actions in the development of 1G cellular in the US


In the rest of Europe the adoption of TACS or NMT by smaller countries could also be driven by technological and economic criteria (as well as by spectrum availability): their markets were not large enough to support national standards and they usually did not have national champions in the telecom sector.

[Figure: time-sequence diagram of UK 1G events, from the Post Office’s non-cellular VHF mobile phones (1959) and the CEPT identification of the 900 MHz cellular band (1978), through the election of the Thatcher government (1979), the split of BT from the Post Office (1981), the licensing of Vodafone and Cellnet (1982), the selection of the TACS variant of AMPS (1983), the creation of Oftel and the privatization of BT (1984), the launch of cellular services (1985), and TACS coverage of 90% of the UK population (1987).]

Figure 18. Key actors and actions in the development of 1G cellular in the UK

While the network economics based arguments can plausibly explain standards adoption decisions in the UK and smaller European countries, they do not explain the earlier decisions to create standard specifications in the first place, or the choices of the larger European countries to develop their own standards even when other options existed. Scandinavia and the US were at the frontier of cellular technology and there were no existing standards to adopt. The larger European countries had their public policy interests, i.e. supporting domestic manufacturers.

The ANT based conceptualization of standardization strategy formulation summarized in Figure 2 is able to incorporate the wider range of interests and values evident from the descriptions in this chapter. The problematizations behind the standards creation/selection strategies formulated by operators and regulators included not only the perceived need to deal with the capacity constraints of pre-cellular VHF radio systems. They also included support for national champions (large European countries), a desire to use radio as a means of introducing effective competition into the telecom sector (UK), and the wish to realize the benefits of improved coordination in national and regional business (Scandinavia). The enrollment of regulators was a non-issue where they were integrated with the operators (European PTTs) or were enthusiastic backers of mobile services (UK), but it was problematic in the US since the FCC did not share AT&T’s vision of the future use of valuable radio spectrum – preferring to allocate most of the viable UHF spectrum to TV. While the enrollment of the FCC by AT&T took decades, it is also true that several key technologies only matured sufficiently in the 1970s. By the time cellular network operator licensing was contemplated, the preference for a competitive wireless communications industry (and a more competitive telecommunications industry more generally) resulted in a much more fragmented industry structure than the one AT&T would have envisaged. This fragmentation was arguably part of the reason for the slower adoption of cellular services in the US compared to the Nordic countries, and it certainly was the reason why national cellular services took years to develop.

The creation of the GSM specification required a huge feat of coordination across

national and European level regulatory bodies, state owned and private telecom

operators, and manufacturers in many countries spread over several continents, not to mention the extremely complex specification itself (occupying tens of thousands of

pages). European level institutions played a central role in providing the forums for

standards creation (e.g. CEPT and ETSI) and supported the standard through actions like

the creation of the MoU and the issuing of directives aimed at releasing spectrum and

mandating the use of the standard. While manufacturers and many operators had economic interests, the European level institutions, and at least some of the national governments, had political interests tied to a vision of a more unified Europe.

The choices of the UK’s regulators and operators were constrained by European level legislation and Commission directives, which made the adoption of GSM at 900MHz unavoidable. However, it would be unfair to claim that the Commission was the sole coordinator. A fuller explanation must acknowledge that the CEPT countries, and particularly the Quadripartite (QP) group of countries, coordinated among themselves with the Commission supporting the consensus that emerged. While there had been some uncertainty about the standard for the PCN, the major actors favored a GSM based solution due to a desire for a pan-European standard (ETSI and the UK government) as well as reduced development time and cost (manufacturers and operators). The UK played an active role in the GSM group and in the QP. One particularly important contribution was initiating the MoU, the signing of which reduced the uncertainty associated with the eventual market outcomes in Europe. This reduced the risk for operators, equipment and semiconductor manufacturers, and other actors. The relative certainty of the deployment of GSM on a mass scale, versus the possibility of selecting a losing US standard, played to GSM’s advantage when other countries and operators selected their 2G mobile phone standards. The key actions and actors in the development of 2G in the UK are shown in Figure 19.

In contrast to the push for a common pan-European standard, the FCC in the US took a hands-off approach to the standardization of 2G systems that allowed more space for the innovation system to develop and explore alternatives to the evolutionary D-AMPS approach favored by existing operators. This space allowed Qualcomm and network operators, for example, to pursue standards related strategies in their own interests (to monetize expertise in CDMA techniques and to benefit from superior mobile network performance/capacity, respectively), which ultimately led to the commercial success of cdmaOne and the launching of GSM services in the US. Qualcomm’s strategy involved gradually enrolling operators with visions of lower capital and operational expenses, which in turn provided the incentives for manufacturers to enroll in CDMA. The actor-network perspective of CDMA standardization depicts a rich interweaving of the creation and adoption processes that is not possible to conceptualize with technologically deterministic economic perspectives.

Nextel’s deployment of Motorola’s proprietary iDEN technology was a typical business decision in the part of the wireless industry serving businesses that traditionally have used two-way radio. In the US these radio systems generally use proprietary technologies. The inclusion of a telephony service allowed Nextel to capture the wireless telephone business of these customers in addition to their two-way radio business. The key actions and actors in the development of 2G in the US are shown in Figure 20.

[Figure: time-sequence diagram of 2G developments in the UK/Europe, from the CEPT identification of the 900 MHz cellular band (1978) and the UK/France exploration of NMT at 900MHz (~1981), through the formation of the GSM Group (1982), the Franco-German initiative joined by Italy and the UK to form the QP (1984-86), the selection of the narrow band TDMA radio interface (1987), manufacturer participation in GSM standardization (1987), the European Commission directive on a single 2G standard and 900MHz spectrum (1987), the UK-initiated GSM MoU signed by 13 countries (1987), the transfer of GSM standardization to the newly formed ETSI (1989), the GSM Phase 1 specification (1990), the issuing of three UK PCN licenses (1989), and the UK 2G launches by Vodafone (1992), One 2 One (1993), Cellnet (1994), and Orange (1994).]

Figure 19. Key actors and actions in the development of 2G in the UK / Europe


The differing interests of regulatory actors in Europe and the US are consistent with the ANT based model of standards creation and selection summarized in Figure 2. It should also be noted that this is at least partially consistent with an institutional view in which ‘supporting European unity,’ ‘supporting European suppliers,’ and ‘keeping government out of selecting winners’ could readily be viewed as normative or cognitive-cultural institutions (see Table 4) guiding government action. However, the European telecommunications sector underwent radical change from the 1980s onward. A researcher using this perspective would have to be very careful in selecting the ‘institutional pressures’ as many long-standing ways of doing things were in flux. The resulting explanation would be unlikely to apply in the US or elsewhere and would differ for 1G and 2G outcomes – a fact that lends support to the inclusion of more dynamic ‘pressures’, which are referred to as ‘interests’ in the actor-network based perspective. The dynamic nature of the standardization process is amply demonstrated by changes in actors’ key interests as their strategies interact and as perceptions of current and future actor-network reconfigurations change – for example the change in the Commission’s approach from an effort to direct the development of the standard to one of supporting its development and deployment, and the CEPT surrendering its standardization role.


[Figure: time-sequence diagram of 2G developments in the US, from the CTIA’s requirements for an AMPS upgrade (1982) and the FCC’s decision to permit air interface upgrades without new approval (1988), through the TIA’s selection of TDMA (1989), Qualcomm’s CDMA trials in San Diego and New York (1989-90), the FCC’s PCS consultative process (1989-94), the IS-54 (1990), IS-95 (1993), and IS-136 (1994) specification releases, the D-AMPS roll-out (1992), the PCS spectrum auctions (1994-96), the PCS1900 (GSM) launch (1995), the CDMA PCS and Nextel iDEN launches (1996), and the enhanced IS-95(B) release (1999).]

Figure 20. Key actors and actions in the development of 2G systems in the US


Inter-organizational coordination and relationship building (RQ2)

The second research question (see Table 5) was, “How do organizations build their relationships and coordinate with one another and with technology during the construction of large scale information systems?” Here we consider some of the means of coordination and relationship building associated with the first and second generation wireless systems.

The most important of the organizational actors and the key relationships among them are illustrated in Figure 21. The analytical domains (see Figure 3) to which these organizational actors belong are also depicted. While this representation of the 1G and 2G wireless industry is abstract enough to apply fairly well to all of the settings discussed in this chapter, the nature of the relationships among the actors can be quite different.

[Figure: diagram of the major organizational actors in the wireless industry – device and infrastructure manufacturers (innovation system); network operators, airtime resellers (UK), and corporate and consumer customers (marketplace); and government/regulators (regulatory system) providing industrial policy, operator licenses, spectrum allocation, and the regulatory framework.]

Figure 21. Major organizational actors in the wireless industry²⁵

25 This figure is adapted from one published in (Tilson & Lyytinen, 2006).


Regulatory regimes’ relationships with one another and with other actors

For wireless services, access to spectrum is an unavoidable obligatory passage point (OPP) controlled by government agencies. So it is not surprising that the launching of wireless services in all the settings described has involved regulatory regimes in one way or another. However, the

nature of the relationships between regulators and other actors varied widely and has

changed over time.

In Europe during the development and launch of 1G services most operators were

government owned monopolies incorporating the regulatory function. Consequently,

coordination was an internal matter. In the larger countries, where there were close relationships with domestic telecom equipment manufacturers, 1G standards creation and adoption decisions were taken to support these manufacturers.

European level regulators and institutions played a central role during the development and launch of 2G services. While promoting competition in the provisioning of fixed and mobile telecom services through a series of directives, European governmental actors shaped the 2G standardization process to favor European manufacturers over their international competition. The enrollment of national regulators and operators from numerous countries into their vision for GSM and the actor-network building around the standard was made tangible using the MoU – even if operators did not fully comply with their obligations in practice.

From the launch of 1G cellular services in the UK the regulator was not integrated with the dominant telecom provider. The regulator took the lead role in creating a competitive market for mobile wireless services in the 1980s and licensed further (PCN) operators in the 1990s. The regulator’s on-going role was that of policing competition in the telecom industry by, in many cases, imposing constraints on the former telecom monopoly (BT). The UK was not inclined to use the introduction of new services as an industrial policy tool to support domestic manufacturers. Despite these differences the UK was, and still is, subject to European level regulators and was constrained by the same directives that promoted GSM. Both UK cellular operators signed the MoU but did not introduce GSM services on a large scale until it made business sense for them to do so, irrespective of their MoU obligations.

The US regulator resisted AT&T’s lobbying for cellular spectrum for many years. The enrollment of the FCC into a future vision including cellular services was eventually successful. However, not all aspects of AT&T’s vision were realized. By the time cellular services were launched the provision of telecom services was being liberalized, and two cellular operators were licensed in each market. The acceptance of the AMPS standard was fairly uncontroversial at the time as it was the only proposal on the table. The FCC’s consultation process can lead to protracted decision making, as exemplified by PCS licensing. The consultation process mandated by legislation limits the ability of the US regulator to impose its vision of the future configuration of industries – certainly compared to European regulators’ ability to do so. In the US the vision emerges from the regulator’s balancing of other actors’ expressed interests. In the case of PCS, existing cellular operators strategically used the process to delay the launch of the competing services.


Manufacturers’ relationships with other actors

Mobile infrastructure (including switches, base stations, antenna systems, and the databases supporting mobility) represents a major capital investment for operators. The maintenance and upgrading of this infrastructure requires a close on-going relationship between network operators and infrastructure manufacturers. Mobile devices are not as durable as mobile infrastructure, so the relationships between operators and device manufacturers need not be as close.

The specification of open interfaces between major infrastructure components (e.g. between radio base stations and switches) promotes additional competition among manufacturers and offers network operators at least some additional protection from lock-in to proprietary solutions.

Network operators’ relationships with customers

Operators’ coordination with technology was achieved through trials in the first instance. The trials also increased operators’ confidence that customers could be enrolled in sufficient numbers.

The nature of the network operators’ relationships was originally that of a business to business service supplier. The additional marketing undertaken by Vodafone and Cellnet in the UK in the 1980s to educate and attract customers (primarily business customers) has been cited as one reason why the penetration of 1G services there was among the highest in (non-Scandinavian) Europe.

Operators seeking to enroll consumers had to envisage a different configuration of the actor-network around mobile services. This included a radical change in the relationship with the new types of customers. Innovative operators found new ways of distributing devices and providing customer services – a necessity given the lower revenue created by consumers than by business customers, but also one aligned with the greater capacity and resulting lower cost per connection at 1,800MHz. Operators also found creative ways of offering value to the customers (e.g. offering cheaper or even free calls at night or at the weekends) when the fixed-cost mobile infrastructure would otherwise be underutilized.

Other relationships

The US and UK/European case studies also highlight the connections between the wireless industries in the different locations. The TACS standard adopted in the UK originated in the US, and the launch of the PCN initiative in the UK spurred interest in the rest of Europe as well as in the US (albeit at 1,900 MHz rather than at 1,800 MHz). Interaction between the US entertainment and wireless communications industries was also evident, although restricted to the contention for UHF spectrum.

Relationship between standards creation/adoption and inter-organizational coordination/relationship building (RQ3)

The third research question is, “How does standards creation and adoption interact with the ways that organizations build relationships with one another and with technology?” Some of these interactions are discussed in the answers to the preceding questions, and are dealt with further here.


Standards creation/adoption and the regulatory regime

It has been argued (Garrard, 1997 Ch. 7) that the interaction between GSM and the liberalization of the telecom market in Europe was a reciprocal one – the liberalization process aided the diffusion of GSM while GSM acted as a catalyst for liberalization.

Regulators in many countries used the launch of GSM as an opportunity to introduce

competition in the provision of mobile communications. In fact GSM provided many

countries in Europe with their first taste of competition in telecom and had an important

role in shaping the way telecommunications was liberalized. The typical path started with the PTT’s operational and regulatory activities being separated to ensure that the regulatory authority could treat new mobile network operators fairly. Secondly, the PTT was required to form a subsidiary for its mobile business with separate accounts to prevent hidden cross-subsidies.

A new mobile operator typically had to use the incumbent’s network to interconnect the elements of the mobile system (e.g. base stations and switching equipment). Most of the calls placed from mobile phones would also have to be terminated on the incumbent’s network. The incumbent’s control of the fixed telecom network could affect new operators’ businesses in several ways. Offering long lead times for services, providing few interconnect points, or providing unreliable service, for example, could delay network roll-out, increase operational expenses, or result in a poor reputation with customers. The new regulators would try to ensure that new entrants had access to leased-lines (necessary for connecting the elements of a mobile system), and interconnections with fixed telephone and data networks, on the same terms as the incumbent’s own wireless operation.

Dealing with these issues for the mobile telecom market provided a base for the regulation of competitive telecommunications more generally.

It can also be argued that liberalization had a major influence on the success of GSM. The CEC used Article 90 of the Treaty of Rome (which provides a mechanism for the Commission to unilaterally issue directives regarding competition issues) to issue a directive in January 1996 requiring that there be at least two GSM operators and one DCS1800 operator in each member country (CEC, 1996a). The new operators’ use of GSM forced incumbents using analog networks to upgrade to avoid being portrayed as using inferior technology.

The introduction of competitive GSM networks also paved the way for the

internationalization of the telecommunications business – local partners that understood

the local market and foreign operators with experience in building and operating

networks as well as providing cost effective customer service often worked together to

submit bids and build competitive mobile wireless businesses.

It is perhaps ironic that a standard given so much support by European level institutions played an important role in the liberalization of the telecom services market while simultaneously limiting the choice of wireless standards for European network operators to just one. It must also be noted that this was not the pattern in the UK, which liberalized its telecommunications market early and had configured the mobile industry as a competitive one from its inception in 1982.

In the US the single AMPS 1G standard facilitated the steady consolidation of the

industry that was created by the licensing of operators in hundreds of geographically


small markets. The use of several 2G standards in the US has tended to constrain mergers

and acquisitions among operators to those sharing the same technology.

Standardization and equipment manufacturer / network operator relationship

Standards provide equipment manufacturers with the baseline specifications for their products. Given their technical expertise, equipment manufacturers also play a central role in creating standards. The agenda for standardization efforts is often set by network operators (e.g. by establishing requirements) and to some degree by regulatory bodies. Standardization efforts are also shaped by existing standards (e.g. the evolutionary approach to the digitization of AMPS to increase capacity as needed while limiting cost).

The different families of standards (e.g. GSM and cdmaOne) also shape one another – features of one standard are adopted by another. For example, the SMS text messaging capability of GSM was added to other standards, and similarly specified data services were developed for each of the 2G standards.

Network operators usually see standards as a means of limiting the cost of infrastructure, which many do not believe to be a commercial differentiator. The development cost is effectively shared by many network operators when manufacturers build to the same standard. Standards also limit the extent to which operators are locked in to the technology of a single manufacturer.

However, some operators chose to forego the advantages of an open standard. For example, Nextel in the US adopted a proprietary standard – Motorola’s Integrated Digital Enhanced Network (iDEN) – and used it to offer a fast Push to Talk (PTT) capability, attractive to some small businesses, as a differentiator.

One feature of the GSM standard is that the subscriber identity is held within a small smartcard referred to as the Subscriber Identity Module (SIM). This, along with the process that type-approves phones for use on all GSM networks (with matching frequency bands), allows users to easily switch between phones by simply transferring the SIM card from one phone to another. This feature of the standard gives mobile device manufacturers the ability to market devices directly to end users and to build up their own brands and relationships with customers.

The other 2G (and almost all 1G) standards associate the subscriber identity with a single device. Consequently the network operators have much more control over the relationship users have with their devices. In addition, CDMA operators argue that the air interface for the cdmaOne standard is not identical across all CDMA networks, and there certainly is no single type approval for cdmaOne devices. Device models are individually approved by operators. As device manufacturers can only access subscribers via the network operators, cdmaOne operators have more power in their relationships with device manufacturers than GSM operators have in theirs. This is reflected in the fact that it is common for phones in Europe to carry only the device manufacturer’s brand. In the US this is rare – devices usually carry the branding of both the carrier and the manufacturer.


Semiconductor manufacturers and standards creation

While the descriptions of 1G and 2G development have barely touched on semiconductor manufacturers, they are responsible for the most important electrical components in mobile devices and infrastructure. The participants in standardization

efforts take the capabilities of semiconductors and other technologies into account when

defining standards. However, standards are often specified somewhat ahead of the

capabilities of existing semiconductor devices in the expectation that by the time products

are being designed for production research will have surmounted any obstacles.

Technical standards guide semiconductor R&D and provide designs for products for

potentially large markets. First and second generation cellular standards drove the

development of the specialized transistors for radio frequency receiving and transmitting

functions at 800/900MHz and 1800/1900MHz. The processing power required for digital

voice compression and decompression, and other signal processing functions in 2G

mobile devices was challenging to achieve within the constraints imposed by limited battery capacity. However, the challenge was overcome with ever improving

semiconductor technology.

A widely adopted standard, or a standard that is expected to be widely adopted,

encourages larger investments in the development of more highly integrated and higher performing semiconductors that are designed into better performing equipment that often costs less than previous generations.


Role of initial conditions (RQ4)

The fourth research question is “How do existing technical and inter-organizational coordination mechanisms affect the design and implementation of large scale information systems?” The range of answers to the first three questions that emerged from the analysis of the development and launch of first and second generation wireless services was strongly influenced by the particular conditions in each setting.

It is clear that the regulatory environment in each setting had a strong influence on the way in which first and second generation mobile wireless standards were created. In the US, with its regulated monopoly and later competitive environment, it was up to innovators and operators to develop standards. In the PTT era in Europe the national governments drove standardization strategies, and later European level regulators played the central role and instituted the forums for standards creation. In addition to having strong influences on the standards creation and adoption decisions, the different regulatory settings were important determinants of the competitiveness and success of at least the first generation offerings.

The population size of the countries or regions developing standards is important. This, along with the number of standards deployed in these countries/regions and the success of prior radio based offerings, shaped the expected home market for a standard and therefore the economies of scale that could be realized in equipment development and manufacture.

Another important norm that varied by country, in some ways a standard used to coordinate actors, was who pays for calls to mobile device users. In the UK and the rest of Europe the principle of calling party pays (CPP) was maintained for mobile telephone calls. In the US the dominant scheme is that the receiving/mobile party pays (RPP or MPP) for incoming calls to mobile phones. These interface standards, along with the regulation of termination charges, determined the flow of revenue among fixed and mobile operators (Littlechild, 2006; Marcus, 2004). These conventions and regulations are also thought to have influenced the uptake and use of mobile devices in a rather surprising way. The CPP approach led to faster adoption since the higher termination fees charged by mobile network operators were used to subsidize the cost of expensive devices. In addition users tended to keep their devices switched on since they were not being charged for incoming calls. In the RPP countries adoption was lower and, at least in the early days, users would turn off their devices more often to avoid paying for unwanted calls. The operators in the US later overcame this difficulty by offering large bundles of airtime minutes to encourage the adoption and use of the devices. The result is that there are high levels of adoption in CPP Europe but higher per-minute charges and lower usage compared to the US (which uses RPP).

This chapter provided a rich set of 1G and 2G case studies with which to address the research questions. These cases also provide a detailed understanding of the configuration of the wireless industries on both sides of the Atlantic prior to, and during, their transition to third generation (3G) technologies, to which we turn in the next chapter.


VI. Changes in the Wireless Industry with the 3G transition

In this chapter we examine the evolution of the wireless industry from the provision of mobile telephony almost exclusively to the addition of a wide range of data and multimedia services. After outlining how the industry’s understanding of the third generation (3G) concept changed from its inception in the mid-1980s, we examine how key 2G specifications evolved to support new services – primarily text messaging and packet data services. This is followed by descriptions of the cases of 3G standards creation and adoption in the US and Europe (with a particular emphasis on the UK). The final section discusses the research questions in the light of these historical case descriptions. This final discussion also includes the findings from a series of interviews with decision makers in the wireless industries on both sides of the Atlantic. The interviews were carried out between 2003 and 2006 and present a deeper investigation into the industry during its transition to the wider range of service offerings after the creation and adoption of 3G standards.

Overview of changes in the conceptualization of third generation (3G) wireless

The idea of third generation (3G) mobile networks was raised as early as 1986 in the

ITU where the initial vision was that a single air interface would allow a pocket sized

mobile terminal to be used anywhere in the world. The concept was referred to as the

Future Public Land Mobile Telecommunications Service (FPLMTS) and ITU-R Task

Group 8/1 was established to define this unified 3G mobile system. This simple vision of


‘terminal mobility’ did not endure however. Fixed network operators created an all-encompassing vision of ‘personal mobility’ in which a wide range of services, broadband

as well as traditional narrowband services, would be accessible from any terminal

connected to any network. In this concept, dubbed Universal Personal

Telecommunications (UPT) within the ITU, FPLMTS and cellular radio more generally

became only one of many access mechanisms. The advantage of this wider vision for the

fixed telecommunications operators in developed countries was that the intelligence

required would be within the fixed networks they controlled, and this would reduce the competitive threat posed by mobile network operators (Garrard, 1997 Ch. 9).

An early success was agreement on a worldwide frequency allocation for FPLMTS in the 2GHz region for any country that wished to use it. The 2GHz spectrum, originally

identified by the European funded RACE program, was made available for FPLMTS at

the WARC (World Administrative Radio Conference) in 1992. In 1995 the

unpronounceable FPLMTS was renamed International Mobile Telecommunications 2000

(IMT-2000) and at the World Radio Conference in 2000 (WRC2000) its frequency

allocation was expanded (Huber, 2001).

The IMT-2000 vision of a single global third generation standard was not realized. It proved impossible to align the concept with existing 2G systems (e.g. GSM and cdmaOne), cordless systems (e.g. DECT developed in Europe), UMTS, and other as yet undefined 3G system specifications. A group of regulators from Japan, the US, and the EU (Future Advanced Mobile Universal System – or FAMOUS) had held yearly exchanges since the early 1990s. By 1995 this group had fashioned a model of there being an IMT-2000 family, i.e. several 3G standards could be accepted by the ITU.


Several major 2G standards – GSM, cdmaOne, D-AMPS, and PDC (in Japan) – were launched in the 1990s. Operators of all these standards would require economical and technologically attractive paths for transitioning to third generation air interfaces. In addition there were different sets of inter-network protocols for core networks: GSM MAP for the GSM world, and IS-41 for cdmaOne and D-AMPS in the US. While these air interface and network protocols would form the starting points for transitioning to 3G networks, the exploration of 3G technologies in Europe started even before 2G networks had been launched.

New mobile services – SMS and 2.5G packet data

Most of the services provided by GSM and the other first and second generation mobile systems had direct equivalents in fixed telecom networks e.g. voice telephony, fax, and circuit and packet data services. While the provision of those services on mobile terminals was a huge technological achievement the services in themselves were not novel. This approach of mobilizing fixed services certainly had the advantage of allowing mobile operators to benefit from the existing positive network effects associated with the fixed telephone networks, and the existing installed base of fax machines for example.

The Short Message Service (SMS) on the other hand did not have a direct parallel in fixed networks, although the transmission of text certainly had a long heritage in postal mail, telegraph, and more recently electronic mail. SMS implementations were not connected to any of these fixed networks, at least initially, and therefore did not benefit from their positive network effects.


Early GSM specifications for messaging were extremely abstract. The services were simply described as “Short Message Point-to-Point Mobile Terminated, Short Message Point-to-Point Mobile Originated, and Short Message Cell Broadcast.” The cell broadcast mechanism was dropped due to a lack of commercial interest, and the point-to-point service specifications were combined by a new working party (WP4) established in the CEPT in 1987 to specify mobile data services. This combined point-to-point specification provided a two-way messaging capability. However, its use as a way for people to communicate with one another had not been envisaged by all involved – many operators saw it as a machine-to-person mechanism for voicemail alerts and the like (Trosby, 2004).

SMS text messages are transmitted to and from mobile terminals using the signaling channels of the radio link rather than the traffic channels used for voice and data services. There were differences of opinion on whether the SMS traffic should be carried within the signaling channels of the core network or on a separate data network. The inclusion of user services within the signaling system goes against principles of good architecture, and some operators were opposed to this approach. However, mobile-only operators (like Vodafone and Cellnet in the UK) did not operate their own data networks and so would have had to pay extra for X.25 connections to keep short message service data separate from signaling traffic. For these operators carrying short message traffic with signaling was significantly more economic. In retrospect the decision to add SMS traffic handling to the capabilities of the GSM signaling protocols (GSM MAP) facilitated the smooth interconnection of SMS traffic among operators, and international roaming with SMS, since it relied only on capabilities built into the GSM specification. The 160 character limit for the text messages carried by the SMS service is directly related to the capacity of the signaling packets used in GSM MAP and the CCITT Signaling System Number 7 (SS7) upon which it is based (Trosby, 2004). The first draft version of the specification for the SMS service appeared in November 1987 (Hillebrand, 2001c) and the service was included in the Phase 1 GSM specification released in 1990.
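As an illustrative aside, the arithmetic behind the 160 character limit can be made concrete. The short sketch below (in Python) uses the widely documented SMS payload parameters rather than data from our cases, and the constant names are ours: packing fixed-width characters into the 140 octet user-data field of a single short message yields the familiar limits.

    # Sketch of the widely documented SMS payload arithmetic (illustrative).
    MAX_USER_DATA_OCTETS = 140            # capacity of one SMS user-data field

    def max_chars(bits_per_char):
        # Fixed-width characters that fit into 140 octets.
        return (MAX_USER_DATA_OCTETS * 8) // bits_per_char

    print(max_chars(7))    # GSM default 7-bit alphabet -> 160 characters
    print(max_chars(8))    # 8-bit data                 -> 140 characters
    print(max_chars(16))   # 16-bit (e.g. UCS-2)        -> 70 characters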

The resulting SMS service has been a major success story for second generation mobile systems – more successful by far than the fax and data services that had been expected to become significant sources of new revenue. As well as providing the messaging between mobile users it has provided a platform, albeit a very basic one, for a wide range of additional services and capabilities (e.g. mCommerce, handset customization by the downloading of parameters, and content services of all sorts) that were not envisaged when SMS was defined (Trosby, 2004).

It was 2001 before SMS was widely offered in the US by cdmaOne and D-AMPS operators (Méndez-Wilson, 2001a), and 2002 before the largest network operators agreed to offer SMS interoperability (Méndez-Wilson, 2002).

The GSM specification also included data capabilities. The basic capabilities in the Phase 1 specification were synchronous and asynchronous 9.6 kbits/s data bearer services and access to X.25 packet switched networks at the same rate. These were functionally equivalent to using a mobile handset (or dedicated data card) as a modem for dial-up data services. They were circuit switched services and users were charged for connection time (as for voice calls). Even when accessing an X.25 packet network the connection from the mobile device to the X.25 gateway was circuit switched and users were charged for connection time. This arrangement was consistent with the underlying technological approach to supporting data, which was the use of a timeslot within the TDMA frame structure for the data connection. The timeslot could otherwise be used to support one voice call.

Later enhancements to the circuit switched data services included increasing the bit rate to 14.4 kbits/s while still using only one timeslot (the trade-off was a reduction in the level of error correction provided). A High-Speed Circuit Switched Data (HSCSD) service increased capacity by using multiple timeslots – up to 57.6 kbits/s using 4 timeslots. As several timeslots were dedicated to an HSCSD call for its duration, prices per minute were higher than for the 9.6 kbits/s services.
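The rate scaling just described is simple multiplication. A minimal sketch (illustrative only, using the per-timeslot figure quoted above) shows how HSCSD reaches its maximum:

    # Illustrative: GSM circuit switched data rates scale with timeslots used.
    RATE_PER_SLOT = 14.4                  # kbits/s per timeslot (quoted above)

    for slots in range(1, 5):
        print(slots, "timeslot(s):", round(slots * RATE_PER_SLOT, 1), "kbits/s")
    # 4 timeslots -> 57.6 kbits/s, the HSCSD maximum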

At this point it is necessary to clarify the differences between the circuit switched and packet switched paradigms of data communications. In the circuit switched paradigm specific resources are allocated to a voice call or a data connection for its duration – even when data is not being exchanged. The resources allocated typically exhibit a constant bit rate and a constant delay between the end-points of the call (i.e. the key requirements for voice traffic). In packet switching discrete blocks of data (called packets) are routed between network nodes over data links shared with other traffic. Packets are buffered at each node, and the overall bit rate and delay vary depending on the overall level of traffic traversing the network. This statistical sharing of resources allows more economic data services to be marketed. The packet switching paradigm underlies the Internet and older packet networks like X.25. Packet switched data services are particularly well suited to web browsing, email, database access, and other applications where data transmission is intermittent. The other advantage of packet switched services is that they are “always on” since no lengthy call setup and clear down procedures are required.
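The economics of this statistical sharing can be illustrated with a small simulation. The sketch below is ours, not drawn from the sources; the user count and activity factor are arbitrary assumptions chosen only to show why bursty traffic needs far less shared capacity than dedicated circuits would:

    # Illustrative simulation of statistical multiplexing (assumed parameters).
    import random

    random.seed(42)
    USERS = 20          # bursty data users sharing one link (assumption)
    ACTIVITY = 0.10     # each user transmits ~10% of the time (assumption)
    SAMPLES = 10_000    # observation intervals

    # Circuit switching: each user holds a dedicated channel for the whole
    # session, so 20 channels are tied up regardless of actual activity.
    print("dedicated circuits needed:", USERS)

    # Packet switching: capacity is consumed only while a user is sending.
    active = [sum(random.random() < ACTIVITY for _ in range(USERS))
              for _ in range(SAMPLES)]
    print("average simultaneous senders:", round(sum(active) / SAMPLES, 1))  # ~2
    print("peak simultaneous senders:", max(active))        # well below 20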

IBM and Motorola had first suggested the inclusion of a packet switching capability in GSM as early as 1988 but it had been rejected. By the early 1990s ETSI SMG came under pressure from the European Commission to include a packet data capability (for road telematics applications), and the European railways wanted to adopt GSM but also required a packet data capability. There was also competitive pressure from the packet based CDPD²⁶ capability launched by some US operators. The definition of the General Packet Radio Service (GPRS) standard commenced in 1994 and was due for completion in 1994/5. However, the standards development work took longer than expected, and in the meantime efficient support of the Internet and other IP networks emerged as an additional requirement. GPRS was finally included in the 1997 release of the GSM specification, and the specification was considerably enhanced in the 1999 release. GPRS uses one or several timeslots of the TDMA frame for packet transmissions and provides on the order of 100 kbits/s to be shared among several users. It provides an ‘always on’ capability and allows billing by usage rather than connection time (Dupuis, 2001; Hillebrand, 2001b).

26 Cellular Digital Packet Data (CDPD) was an overlay network deployed by several AMPS / D-AMPS operators in the US to provide packet based data services at up to 19.2 kbits/s (Steward, 1995).

The packet switched GPRS service required an overlay packet network to be added to the circuit switched GSM/MSC infrastructure. In addition software and/or hardware changes were required to existing base station controllers and switches.
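A back-of-the-envelope calculation indicates where the ‘order of 100 kbits/s’ figure comes from. The per-timeslot rate below is the commonly cited figure for the GPRS CS-2 coding scheme – an assumption for illustration rather than a number taken from our sources:

    # Illustrative: aggregate GPRS capacity of one GSM carrier.
    TIMESLOTS_PER_CARRIER = 8    # timeslots in one GSM TDMA frame
    CS2_RATE = 13.4              # kbits/s per timeslot, commonly cited (assumed)

    print(TIMESLOTS_PER_CARRIER * CS2_RATE, "kbits/s shared by the cell's users")
    # -> 107.2, i.e. on the order of 100 kbits/s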

A higher bit rate packet data capability called Enhanced Data Rates for GSM Evolution (EDGE) was also included in the 1999 release of the GSM specification. While EDGE also used the GSM TDMA frame structure, different modulation and coding schemes were used to support higher bit rates (up to 384 kbits/s). EDGE requires more significant changes to the GSM infrastructure, including new channel cards in base stations and upgraded links between the base stations and their controllers (Hillebrand, 2001b).
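The source of EDGE’s rate gain can also be sketched from first principles: 8PSK modulation carries three bits per symbol where GSM’s original GMSK modulation carries one, so the same timeslot structure supports roughly three times the raw bit rate. The figures below are standard modulation properties used for illustration, not numbers from our case sources:

    # Illustrative: why a modulation change raises EDGE bit rates.
    from math import log2

    GMSK_BITS_PER_SYMBOL = log2(2)   # GSM's original modulation: 1 bit/symbol
    PSK8_BITS_PER_SYMBOL = log2(8)   # EDGE's 8PSK: 3 bits/symbol

    print("raw per-timeslot rate gain:",
          PSK8_BITS_PER_SYMBOL / GMSK_BITS_PER_SYMBOL, "x")   # -> 3.0 x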

ETSI SMG cooperated with the groups responsible for ANSI-136 TDMA in the US (primarily the UWCC) in the specification of EDGE. The D-AMPS specification and GSM had grown together, and GPRS and EDGE were incorporated into the ANSI-136 evolution path. GPRS and EDGE were seen by European operators as evolutionary steps towards third generation systems based on a new air interface (see next section) and are often referred to as 2.5G technologies. An enhanced version of EDGE supporting 2 Mbits/s was to be the 3G solution for D-AMPS (Bekkers, 2001 Ch. 10; Hillebrand, 2001b).

The CDMA standard (IS-95) supported circuit and packet based data services at speeds up to 14.4 kbits/s. The upgraded version of the standard (IS-95B) approved by the TIA in 1999 increased the maximum data rate to around 115.2 kbits/s (Knisely, Kumar, Laha, & Nanda, 1998).

Using mobile data services at the end of the 1990s was not particularly easy. One typically needed a mobile phone that supported one or more of the data services, a laptop PC, and an expensive data adapter card/cable to connect the phone to the laptop. Special software on the PC was required to set up the data connections, and the user would require an employer or service provider supported gateway to connect to. By 1997 data amounted to only about 0.5% of the total traffic on GSM networks. In the mid-1990s several companies set about defining protocols that could be used to offer data services directly on mobile phones. The proponents of several such initiatives (Ericsson, Nokia, Motorola, and Unwired Planet) decided to work together to develop the Wireless Application Protocol (WAP) and formed the WAP Forum in 1997. WAP and its related protocols were similar to HTTP and the other protocols used on the wired Internet but took into account the limited processing power, screen capabilities, and keyboards of mobile phones as well as the relatively low transmission speeds on mobile networks (Bekkers, 2001 Ch. 10). One feature of note was that the WAP protocol was not tied to one particular air interface. It could work on 2.5G devices with GSM, D-AMPS, or cdmaOne based air interfaces, as well as on 3G based devices. Membership in the WAP Forum was opened up after the initial version of the protocol was published. By May 2000 membership had grown to 2000. At about the same time WAP devices and services were being launched commercially.

The creation of 3G standards in Europe²⁷

The recognition of the telecommunications industry as an important sector for

Europe, and the positive impact of GSM in terms of employment (hundreds of thousands

of jobs) and improved efficiency in economic and private activities, led to 3G being

regarded as an important European Community interest (Niepold, 2001). European level institutions provided extensive support for innovation:

• Support of pre-competitive R&D through the RACE and ACTS programs

27 This section draws on several archival and other secondary sources. Where no specific citation is given, the details of this episode of European telecom standardization draw on chapter 13 of Bekkers’ (2001) thesis.


• Encouragement of the development of a common standard to facilitate roaming and the realization of scale economies. The creation of ETSI in 1988 provided a forum for such standardization. The standards developed by ETSI can be made legally binding by European level legislation

Additionally, European level regulatory activities included:

• Harmonizing spectrum usage to facilitate roaming and encourage scale economies

• Providing a regulatory environment to support the offering of services. This

included aspects of licensing, inter-operator interconnection, and equipment

certification

The creation of the 3G standard in Europe and the central role of European level

actors are described next.

European support for R&D into 3G mobile communications

Some work on third generation (3G) wireless technologies started even before the

Phase 1 GSM specification was released in 1990. After a 'definition phase' initiated by

European industry ministers in 1985, the European Council called, in 1987, for the launch of the Research into Advanced Communications in Europe (RACE) program. The

objectives of the program included the promotion of the European telecom industry, the

introduction of Integrated Broadband Communications (IBC), and the support for the

creation of a single European market for telecom equipment and services. The RACE

Mobile Project R1043 brought together 20 partners from industry and academia. It was

this project that identified the Universal Mobile Telecommunication System (UMTS) as a


class of service intended to support voice and low to medium data rate services over a wide area. It also identified the Mobile Broadband System (MBS) as a service to provide high data rate services in hot spots. The project worked to specify key elements of the

UMTS air interface, signaling, and infrastructure, as well as to identify suitable spectrum.

Deliverables from this work included inputs to ETSI and the ITU. The spectrum identified was made available by ITU WARC 92 (Schwarz da Silva, 2001).

RACE Phase II (1990-1994) included large projects on both UMTS and MBS. The

Code Division Testbed (CODIT) and the Advanced Time Division Multiple Access

(ATDMA) projects explored CDMA and TDMA based air interfaces for UMTS respectively. The MONET project on UMTS network standards focused on integrating the infrastructure for both fixed and mobile services – the vision was the ability to offer compatible services to both mobile and fixed users. Integration with the Broadband

Integrated Services Digital Network (B-ISDN) was foreseen, as was the use of the

Intelligent Network (IN) concept developed by the ITU as a platform for advanced telecommunication services development. The RACE Vision of UMTS report written at the end of the RACE program (Swain, 1995) also emphasized this integration with wired multimedia broadband services. Thus the vision was based on the evolving telecommunications architectures being developed by traditional telecom industry players. The RACE Vision of UMTS report also acknowledged that UMTS would coexist with GSM and DECT28 technologies, and that enhancements to 2G systems would be able to offer some of the same service types envisaged for UMTS.

28 Digital Enhanced (formerly European) Cordless Telecommunications is an ETSI standard for digital cordless phones. DECT cordless phones are used by consumers and corporations.


The RACE program’s successor was the Advanced Communications Technologies

and Services (ACTS) program started in 1994. The UMTS activities within ACTS were

aimed at furthering the services, platforms, and technologies of UMTS. The integration

of UMTS with fixed broadband multimedia services was still central to the vision. There

were projects that addressed multimedia capable mobile devices, APIs for mobile

application development, mobile network planning tools, and infrastructure. In addition

the ACTS FRAMES project included the further development of CDMA and TDMA air

interfaces (Schwarz da Silva, 2001). The ACTS/FRAMES project was particularly well

funded (about 100 million ECUs or US$125 million) with the funding going to major

manufacturers and operators (e.g. Nokia, Ericsson, Siemens, CSEM/Pro and France

Telecom) as well as several universities (Bekkers, 2001 section 13.3.2). FRAMES

created three well elaborated specifications for air interfaces: FMA-1 (without spreading)

based on TDMA, FMA-1 (with spreading) based on a hybrid of TDMA and CDMA, and

FMA-2 based purely on CDMA.
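The TDMA/CDMA distinction at stake in these proposals can be illustrated with a toy example (not any FMA specification): in CDMA, users who transmit at the same time on the same frequency are separated by orthogonal spreading codes, whereas TDMA would assign them different time slots.

import numpy as np

# Minimal CDMA illustration: two users share one channel simultaneously,
# separated by orthogonal 4-chip Walsh codes (values are +1/-1 symbols).
walsh_a = np.array([+1, +1, +1, +1])
walsh_b = np.array([+1, -1, +1, -1])

bits_a = np.array([+1, -1])   # user A's data
bits_b = np.array([-1, -1])   # user B's data

# Spread each bit over the code chips and sum both users onto one signal.
signal = np.concatenate([a * walsh_a + b * walsh_b
                         for a, b in zip(bits_a, bits_b)])

# The receiver despreads by correlating with each user's own code.
recovered_a = [int(np.sign(signal[i:i+4] @ walsh_a)) for i in range(0, len(signal), 4)]
recovered_b = [int(np.sign(signal[i:i+4] @ walsh_b)) for i in range(0, len(signal), 4)]

print(recovered_a, recovered_b)  # [1, -1] [-1, -1]: both users recovered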

It should be noted that one of the air interfaces elaborated by FRAMES included

some novelty in the way that duplexing was achieved. The 1G and 2G standards discussed in chapter 5 used a Frequency Division Duplex (FDD) mechanism to separate the signals received by the mobile device from those transmitted by it. This duplexing scheme allows the simultaneous transmission and reception necessary to carry on a full-duplex telephone conversation (i.e. speaking and listening simultaneously). With the

Time Division Duplex (TDD) scheme, full-duplex communication is enabled by the mobile device alternating between transmitting and receiving on the same frequency. The buffering of data (e.g. digitized speech) until it can be transmitted causes a slight delay.


TDD is usually considered to be only appropriate for communication between base stations and devices that are no more than a few hundred meters apart so that the delay does not become unacceptable for telephony. The advantage of TDD is that cheaper mobile devices can be produced since they do not have to support simultaneous transmission and reception on two frequencies. In addition the flexibility afforded by

TDD in adjusting the ratio of transmit to receive time allows greater efficiencies for data services where bandwidth requirements are often asymmetrical. Of the interfaces defined by the FRAMES project, the FMA-1 proposals supported both FDD and TDD, while the FMA-2 proposal only supported FDD. DECT also used TDD.

The European Commission’s role in standards creation

As the RACE program came to an end there was no technical consensus on the content of the key UMTS air interface, and other outstanding issues. Network operators and ETSI were so focused on the rollout and the evolution of GSM that they paid little attention to UMTS. The SMG5 group within ETSI charged with the standardization of

UMTS suffered from a lack of industry support and made little progress in standards creation. The lack of coordination among national administrations also weakened Europe's ability to influence the ITU TG 8/1 forum dealing with FPLMTS/IMT-2000.

The poor progress in the standardization arena and the lack of a clear policy direction around UMTS, despite 10 years of work, disappointed the European Commission. The leader of a Commission-hosted workshop on UMTS in January 1995 concluded that a

UMTS Task Force should be formed to create a strategy for UMTS (Beijer, 2001;


Fernandes, 2001). The task force was created to clarify the UMTS vision and to

communicate it more widely (Schwarz da Silva, 2001).

The UMTS Task Force was established in February 1995 and consisted of handpicked members with industry, standards creation, and policy expertise (Beijer,

2001). Its final report published in March 1996 (European Commission DG XIII/B, 1996)

recommended:

• “The development and specification of the UMTS such that it offers true 3rd-

generation services and systems.

• UMTS standards must be open to global network operators and manufacturers.

• UMTS will offer a path from existing 2nd-generation digital systems, GSM900,

DCS1800 and DECT.

• Basic UMTS, for broad-band needs up to 2Mb/s, should be available in 2002.

• Full UMTS services and systems for mass-market services in 2005.

• GSM900, DCS1800 and DECT should be enhanced to achieve their full individual

and combinational commercial potential.

• UMTS regulatory framework (services and spectrum) must be defined by the end of

1997 to reduce the risks and uncertainties for the telecommunications industry and

thereby stimulate the required investment.

• Additional spectrum (estimated at 2x180 MHz) must be made available by 2008 to

allow the UMTS vision to prosper in the mass market.”

The UMTS vision included an evolutionary approach to networks and services from those offered on the 2G systems (GSM and DECT) that would continue to be enhanced.

However, the air interface(s) would have to be revolutionary to meet the higher


bandwidth needs and wider service requirements – at this stage it was envisaged that UMTS would have many ‘faces’ including: public mobile network, cordless telephony, wireless PABX, wireless LAN, wireless local loop, cordless terminal mobility, private mobile radio, satellite system, mobile data network, and paging

network. The vision now also framed UMTS as a global standard. Other calls to action

included the requirement for a regulatory framework that would stimulate investment in

UMTS, and the need to secure additional spectrum (European Commission DG XIII/B,

1996).

While the report stated that UMTS would have to support the UPT and IN concepts

that emerged from fixed network operators’ influence at the ITU, it also raised the

possibility of interworking with TCP/IP networks as well as the PSTN, ISDN, B-ISDN,

PSPDN (Packet Switched Public Data Network - typically X.25 based), and the GSM

family of networks. The Internet was mentioned in the report, albeit only once, and only

in passing.

The UMTS Task Force also recommended the establishment of another body, the

UMTS Forum, to coordinate all UMTS activities and to accelerate the creation of a

standard. There was opposition from ETSI and the European Radiocommunications

Office (the ERO was established largely by the CEPT to coordinate European frequency

allocations) which believed that such a forum would impinge on their standard setting

and frequency coordination roles. Operators were also less than keen on any prospect of

being forced into 3G while they were still developing enhanced capabilities for GSM

(e.g. the GPRS packet data capability). Despite this opposition the Commission pressed


ahead with the establishment of the UMTS Forum29 in April 1996 albeit with more

modest goals than those set out by the UMTS Task Force (Bekkers, 2001; Garrard, 1997

Ch. 9).

The UMTS Forum was not set up as a standardization forum, rather as a body to address and coordinate all the other activities needed for the commercial success of

UMTS – a role that the GSM MoU group had performed for GSM. In contrast to the CEPT, or even the GSM MoU group, the UMTS Forum was open to any organization that supported the UMTS vision. The Forum’s Articles of Association, which outlined its

mission, made it clear that regulators and other European and international level

(institutional) actors were the most important – the key objectives included

promoting UMTS at the highest political level, promoting a favorable regulatory environment and ensuring adequate spectrum, as well as accelerating licensing, and coordinating with the ITU and other fora (Beijer, 2001).

The UMTS Forum did not have any direct power but acted as a “pressure group trying to play the role of a glue between different kinds of activities.” It established groups to attend to regulation, spectrum, markets, and technology. Its numerous reports have addressed all these areas. Its first report, “A Regulatory Framework for UMTS,” published in June 1997, was written to influence European level legislation on next generation mobile services. The focus on preparing for European legislation contributed to the low participation by non-Europeans and non-telecom players despite the global aspirations of the UMTS Forum and the desire to involve companies from the ICT industry. In its second report, “The Path towards UMTS Technologies for the Information

29 www.umts-forum.org


Society,” published in 1998, interconnection with the Internet started to play a more

prominent part in the vision (Beijer, 2001).

As well as funding R&D into all aspects of the 3G technology and shaping the

creation of committees and institutions (e.g. UMTS Task Force and UMTS Forum), the

European Commission also introduced legislation and issued several important directives.

The Harmonization Decision (No. 128/1999/EC30) (UMTS Decision, 1999) adopted by the

European Parliament and Council was central in laying out the regulatory environment

for 3G in the EU. It required member states to issue 3G licenses, and asked that at least

one network use UMTS as defined by ETSI.

One article31 of the harmonization decision was interpreted by Qualcomm as

effectively barring non-UMTS 3G systems in Europe. The US Secretary of State

(Madeleine Albright), Secretary of Commerce (William Daley) and FCC Chairman

(William Kennard) protested this mandate for UMTS to the EU Commissioner (Martin
Bangemann). The EU responded that no such mandate existed (Buckley, 1999; "Crossed

lines," 1999).

The 1991 terminal equipment directive (TTE) created a mechanism under which the

devices type-approved to ETSI specifications (e.g. GSM, DCS-1900, ERMES, DECT) in

one country could be marketed in all EU member countries. Type approval for non-

European standards had to be completed on a country by country basis. In 1999 the Radio

and Telecommunications Terminal Equipment (R&TTE) directive relaxed the terminal

30 See http://eur-lex.europa.eu/LexUriServ/site/en/oj/1999/l_017/l_01719990122en00010007.pdf
31 Article 3 paragraph 4 of the Council Decision (No. 128/1999/EC) stated “Given that, in line with efficient use of radio frequencies, it may be necessary to limit the number of UMTS systems authorised in Member States, if it is established in accordance with the procedure laid down in Article 17 of Directive 97/13/EC and in conjunction with CEPT, that potential types of systems are incompatible, Member States shall coordinate their approach with a view to authorising compatible types of UMTS systems in the Community.”


equipment rules by allowing self certification by manufacturers, and removing the

restrictions to ETSI standards – thus making it easier to market terminal equipment

conforming to non-ETSI standards. In 2000 a mandate asked ETSI to create a standard

for IMT2000 equipment to permit UMTS equipment certification. Other directives included ones on interconnection (97/33/EC), licensing (97/13/EC), and terminals

(1999/5/EC). Up to 1999 many national licenses specifically referred to ETSI standards.

In addition, European level procurement rules for governments, which also applied to some incumbent operators, favored the specification of ETSI standards, at least up to

1999.

The European Commission coordinated with the CEPT in the harmonization of frequency allocations for 3G in Europe through a series of mandates. These included the allocation of 155 MHz of spectrum32 from the IMT2000 core band by January 1, 2002,

and the development of proposals for WRC2000 on IMT2000 extension bands (which the conference largely adopted). The European Commission had historically played an additional role in harmonizing spectrum allocations by issuing directives that required member states to reserve certain spectrum for ETSI standards, e.g. DECT and GSM. The Commission did

not do so for UMTS but the member states were sufficiently committed to the standard to

adopt the CEPT/ERC recommendations.
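A quick arithmetic check that the components of the allocation described in footnote 32 sum to the 155 MHz figure cited above:

# The 155 MHz IMT2000 core-band allocation, itemized per footnote 32.
allocation_mhz = {
    "FDD paired uplink": 60,
    "FDD paired downlink": 60,
    "TDD unpaired segment 1": 15,
    "TDD unpaired segment 2": 20,
}
print(sum(allocation_mhz.values()), "MHz")  # -> 155 MHz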

The influence of Japan on 3G standards creation in Europe

Japanese operators, faced with severe capacity constraints, were keen to gain access to

the IMT-2000 band. Japanese manufacturers were also keen to become global players in

32 The allocation includes 2 x 60MHz paired bands for FDD mode, and unpaired segments of 15MHz and 20MHz for TDD mode.


infrastructure and handsets, a role they had missed in 2G (Japan’s PDC 2G standard had not been widely adopted outside Japan). In November 1996, the Japanese Ministry of Posts and

Telecommunications (MPT) established and chaired a study group to generate Japan’s

3G proposal for IMT-2000. In addition to Japanese manufacturers and operators the group included Motorola, Ericsson, Nokia, Samsung, Nortel, Qualcomm, IBM, Lucent, and LG. In early 1997 NTT DoCoMo, the dominant Japanese operator, placed orders for an experimental 3G network based on the FMA-2 Wideband-CDMA (WCDMA) air interface developed in the FRAMES project. Non-Japanese companies Ericsson, Nokia,

Motorola, and Lucent were among the firms awarded orders to widen support for

WCDMA. Indeed these orders were seen as instrumental in gaining Ericsson and Nokia’s support for WCDMA, companies that had previously had a vision of an upgraded TDMA based GSM standard as their preferred basis for 3G services. Ericsson persuaded NTT to base the core network on GSM MAP, a move that could provide GSM operators an attractive migration path to a WCDMA based 3G standard. Other Japanese mobile network operators also placed orders for experimental 3G networks based on WCDMA, even those that had recently adopted cdmaOne based 2G networks which had a different

3G migration path.

The apparent leadership threat from Japan spurred greater involvement in UMTS standardization efforts by European operators and the commercially oriented functions of manufacturers. The increasing involvement of commercial rather than technical planners refocused the vision of UMTS. Rather than seeking to integrate the wide range of potential ‘faces’ of UMTS into one specification, the new vision was more focused on voice, , , and interactive services. UMTS was reframed as


having an evolutionary path from GSM that would build upon GSM Mobile Switching

Centers (MSC), mobility management databases (HLRs and VLRs), and signaling

protocols (GSM MAP) for circuit switched services (like telephony), as well as the GPRS

architecture for packet based data services. Despite the different air interfaces and

frequencies, there was a smooth upgrade path to UMTS from GSM via GPRS/EDGE

infrastructures (see Figure 22).

The Selection of the UTRA (UMTS Terrestrial Radio Access) air interface

In 1997, ETSI set out to finally agree on the UTRA (UMTS Terrestrial Radio Access)

air interface. It defined a series of steps leading up to the final selection. At the twenty-

third SMG meeting (SMG#23), held in June 1997, 13 proposals were grouped into five

concept groups:

Alpha – Wideband CDMA
Beta – Orthogonal FDMA (OFDMA)
Gamma – Wideband TDMA
Delta – Wideband TDMA/CDMA hybrid
Epsilon – Opportunity Driven Multiple Access (ODMA)


[Figure: migration path from 1G analog (FDMA) through GSM, HSCSD, GPRS, and EDGE to UMTS, with indicative data rates rising from 9.6 kbits/s (GSM circuit) to 384 kbits/s – 2 Mbits/s (UMTS packet).]

Figure 22. Migration path for GSM operators to UMTS

The supporters of these concept groups were tasked with further developing them for formal presentation at SMG#24 in December 1997. Of the five options Alpha and Delta were seen as the strongest possibilities and had the largest numbers of supporters.

The Alpha proposal was based on the FMA-2 interface elaborated by the

ACTS/FRAMES project. It was supported by Nokia, Ericsson, Lucent, Motorola, Fujitsu,

NEC, and Panasonic. It had wider operator support than the Delta proposal, including NTT and Telecom Italia Mobile, the world’s largest mobile operators. There were, however, more concerns about the availability of the IPR for this proposal.

Alpha was almost identical to the WCDMA interface adopted by NTT DoCoMo and other operators in Japan for their experimental 3G systems. If ETSI SMG were to select Alpha, the Japanese and European 3G air interfaces would be identical – and potentially allow Japanese manufacturers to play a larger role in the international market for mobile equipment. In November 1997 ETSI allowed Asian operators to become

Associate Members with full voting rights. Qualcomm, the CDMA pioneer, created a

European presence to participate in ETSI. However, since ETSI’s rules allocated votes


according to the revenue produced by companies’ European activities, Qualcomm had but one vote to Ericsson’s sixty-five (Mock, 2005, p. 203).

The Delta proposal was based on the FMA-1 (with spreading) interface elaborated by the ACTS/FRAMES project. This technology was backed by Siemens, Alcatel, Nortel and Italtel (later by Motorola, Bosch and Sony). It is notable that these companies were not suppliers for the Japanese experimental 3G networks, and that Nokia had stopped supporting FMA-1 after becoming a supplier to the DoCoMo experimental network.

At the SMG#24 meeting in December neither the Alpha nor the Delta proposal could win the 71% majority required. Agreement could not be reached at a follow-up meeting held in January 1998 either. However, there was considerable pressure on SMG and the supporters of the various proposals to agree on a UTRA standard. The air interface proposals for the IMT-2000 systems were due to be submitted to the ITU-R

TG8/1 in June 1998. This pressure was behind the key proponents of each proposal reaching an agreement (in side meetings outside ETSI SMG groups) to merge their proposals. DoCoMo confirmed that it would support such a hybrid standard.
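The voting arithmetic behind such deadlocks is easy to sketch; the 71% threshold is from the account above, while the vote totals below are invented for illustration.

# Hypothetical sketch of a weighted ballot under ETSI SMG's 71% rule.
# The threshold is from the account above; the vote totals are invented.

def passes(votes_for: int, votes_against: int, threshold: float = 0.71) -> bool:
    """Return True if the weighted votes in favor meet the threshold."""
    cast = votes_for + votes_against
    return cast > 0 and votes_for / cast >= threshold

print(passes(650, 350))  # False: 65% support falls short of 71%
print(passes(710, 290))  # True: exactly 71% support passes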

The UTRA air interface finally agreed upon used the Alpha FDD proposal on paired

UMTS spectrum, while the Delta option would be used for TDD operation in unpaired spectrum. It was further agreed that technical parameters would be selected to achieve harmonization with GSM, and to allow operation in 2 x 5 MHz allocations to ensure it could be deployed by US operators. This compromise was effectively a victory for the

Alpha proposal since the Alpha air interface would be used in public networks. It was also the only mode specified in the 1999 release of the UMTS specification.


UTRA FDD and TDD formed the basis of Europe’s proposal for an IMT-2000 air-

interface. The ETSI DECT Project also submitted its Digital Enhanced Cordless
Telecommunications (DECT) specification, which addressed the market for cordless and private

networks. The DECT Project’s interest in becoming part of the IMT-2000 family

revolved around the need for access to IMT-2000 unpaired spectrum to ensure its long

term viability.

To facilitate the harmonization of the Wideband-CDMA (WCDMA) based UMTS
standard with the Japanese WCDMA air interface, a new organization was

established – the Third Generation Partnership Project (3GPP).

The Third Generation Partnership Project (3GPP)

A reengineering project at ETSI in the mid-1990s recognized the increasing globalization of standards creation and the need for coordination with other Standards

Definition Organizations (SDO). One way of achieving this was through “partnership” organizations among SDOs, which it referred to as ETSI Partnership Projects (EPP). Of course there had earlier coordination among SDOs. For example ETSI and ANSI T1P1 had coordinated on the specification of GSM1900. While successful the dual meeting and dual approval process was somewhat cumbersome. The coordination task for UMTS would be more complex as at least three regions would be involved: Europe, Japan, and the US (Rosenbock, 2001).

The first steps to create 3GPP took place in spring 1997 within the context of European and Japanese cooperation in the UMTS Forum (Beijer, 2001, p. 160). Later, ETSI created a UMTS Globalization Group (UGG) in February 1998 to address the matter of


globalizing UMTS. This group led the discussions with Japanese, Korean, Chinese, and

US SDOs on possible ways forward. The Global Standards Collaboration (GSC) and

Global Radio Standardization (RAST) had been forums for loose coordination among

SDOs since 1990. These meetings provided opportunities to explore the idea of using a

partnership organization to streamline coordination (Rosenbock, 2001).

The scope for 3GPP was set by discussions within ETSI and initially only included

UTRA (TDD and FDD), the GSM core network, and its evolution to UMTS. The ETSI

SMG continued to exist and was responsible for those aspects of GSM evolution not

covered by 3GPP. The Third Generation Partnership Project (3GPP) collaboration agreement was established in December 1998 and brought together the telecommunications standards bodies as “Organizational Partners.” These partners were official Standards Definition Organizations (SDOs) from Europe, the USA, and Asia. The original 3GPP partners were:

ARIB - Association of Radio Industries and Businesses (Japan)
ATIS - Alliance for Telecommunications Industry Solutions, formerly T1 (USA)
ETSI - European Telecommunications Standards Institute (Europe)
TTA - Telecommunications Technology Association (Korea)
TTC - Telecommunications Technology Committee (Japan)

The Chinese Wireless Telecommunications Standards Organization33 (CWTS) joined in

June 1999. ATIS34 was one of hundreds of SDOs accredited by ANSI. The other US

33 The CWTS’s role was later subsumed by the China Communications Standards Association (CCSA) 34 In 1998 ATIS was known as the T1 committee


SDO active in mobile communications is the TIA. It had a larger role in cdmaOne and

cdma2000 and chose “observer” status in 3GPP. A second category of partnership called

“Market Representation Partners” includes the following organizations:

Global UMTS TDD Alliance USA

IMS Forum USA

mobileIGNITE USA

3G Americas (formerly UWCC35) USA

UMTS Forum UK

Global Mobile Suppliers Association UK

GSM Association Ireland

IPV6 Forum UK

TD-SCDMA Industry Alliance China

TD-SCDMA Forum China

The 331 individual members36 include mobile, telecommunication, and data networking equipment manufacturers as well as network operators. A 3GPP permanent support group called the Mobile Competence Centre (MCC) is based at ETSI’s headquarters in Sophia Antipolis in France. The 3GPP submits proposals to the ITU through its national and regional SDO partners.

The original scope of the 3GPP was to produce globally harmonized 3G specifications based on evolved GSM core networks and the UTRA radio access FDD

35 In 2002 UWCC was relaunched as a more comprehensive trade organization to represent TDMA and GSM firms in the Americas (Albright, 2002)
36 As of October 24, 2007


and TDD technologies. Four Technical Specification Groups (TSG) were formed to address Core Networks, Radio Access, Services and System Aspects, and Terminals

(Andersen, 2001). UMTS work was transferred to 3GPP in early 1999 (Hillebrand,

2001a). The scope of 3GPP was amended in 2000 to include the maintenance and development of the GSM technical specifications including evolved radio access technologies (e.g. General Packet Radio Service (GPRS) and Enhanced Data rates for

GSM Evolution (EDGE)). Thus almost all ETSI SMG work was transferred to the

GERAN (GSM EDGE Radio Access Network) TSG within 3GPP. The structure in 2007 is shown in Figure 23. These groups had more autonomy in approving specifications and setting their own work plans than was the case in ETSI. This, along with more electronic working and the reduced coordination needed in a single forum, was aimed at speeding standards creation.

The formation of the 3GPP was seen by the UMTS Forum chairman as important for the globalization of UMTS, “The formation of 3GPP was no doubt the most important step to ensure the global vision of UMTS and to ensure that at least one of the IMT-2000 family members would have sufficient strength to become a global standard” (Beijer,

2001, p. 163).

The first release of the UMTS specification (Release 99) was built upon the GSM

MSC architecture for circuit switched services and the GSM GPRS core for packet data services. Release 99 was finalized in March 2000. By early 2000 the possibility of an “all

IP” core was being discussed. The benefits were that it was easier to roll out new applications and that they could be rolled out across a range of access technologies. An all-IP core might also be more scalable and cheaper (Andersen, 2001). Subsequent


releases37 of the UMTS specification have added increasing support for IP transport and multimedia as well as higher data rates (Puuskari, 2002). For example, High-Speed Downlink Packet Access (HSDPA) can potentially provide data rates in excess of 10 Mbits/s.

Figure 23. 3GPP Technical Specification Groups (http://www.3gpp.org/Management/OP.htm October 2007)

The UK licenses for the use of IMT-2000 spectrum were auctioned by the Radiocommunications Agency (Ofcom’s predecessor) in 2000. At the conclusion of the auction on April 27, the four existing 2G operators each had a 3G license and a new operator (which would be branded as ‘3’) owned largely by Hutchison Whampoa had also won a license. The auction, described at the time as the

37 See http://www.3gpp.org/specs/releases.htm


biggest ever, had raised £22.5 billion (about 2.5% of the UK’s GNP) (Binmore & Klemperer, 2002; Quigley, 2000; Telecoms Deal Report, 2000). While the operators celebrated their wins at the time, it was not long before the 3G auctions were being blamed for wiping out $700 billion of the value of European telecommunications companies as analysts argued that operators had paid too much (Klemperer, 2002). The debt incurred by BT in the 3G auctions was a major driver of its reorganization in 2001, and the spinning-off of its mobile operations as mmO2 (Cassy, 2001).

The new operator, 3UK, was the first to launch a 3G network in the UK. However, the larger 3G handsets were said to be behind the slow adoption by consumers. The other operators launched their 3G networks in 2004 but initially focused on selling 3G data cards to business users for use with laptop PCs rather than 3G handsets to consumers. All the UK operators had adopted UMTS as their 3G technology. By this time several commercial Wi-Fi networks with thousands of hotspots were also available in the UK

(Ofcom, 2004).

The four existing 2G operators started to roll out 3G-capable handsets from late 2004 as devices that were more acceptable to consumers became available. Initial data services on the handsets were built upon the walled-garden38 content portals that operators had already launched on 2.5G enabled devices (e.g. Vodafone Live! and Orange World) although O2 committed to rolling out i-mode in the UK (Ofcom, 2005a).

By the end of 2005 there were over 4 million 3G subscribers in the UK (mostly with

3UK) and video and music download services were making use of the extra bandwidth the technology provided (in the order of 384 kbits/s). T-Mobile led a move away from

38 The walled-garden approach restricts users to a limited selection of content made available by the network operator. This contrasts with their usual experience on the Internet where they have access to all content at all times.


limiting users to the walled-garden with its Web’n’Walk offering allowing full access to

the World Wide Web. In early 2006, HSDPA was launched by O2 (initially only on the

Isle of Man) providing data rates in excess of 1Mbits/s. City center wide deployments of

Wi-Fi were also being announced in 2006 (Ofcom, 2006a).

All the network operators rolled out HSDPA in their networks during 2006 and 2007

with the focus again being on business users using data cards for laptop PCs. Over 11%

of mobile phone users (7.8 million) were 3G subscribers by the end of 2006. With nearly 80% of handsets sold in early 2006 supporting web browsers, and four out of five operators offering ‘unlimited’ data tariffs for £5 per month, the mobile Internet experience was approaching that available with laptop and desktop computers. In 2006 the revenues of the mobile telecom industry in the UK exceeded those of the fixed line telephone and internet access combined. There were also more than twice as many mobile subscriptions as fixed lines. At the end of 2006 non-voice, non-SMS data made up about 5% of mobile revenues (Ofcom, 2007a).


[Figure: timeline, 1990–2007, of EU-funded 3G research (RACE and ACTS programs), the establishment of the UMTS Forum and 3GPP, ETSI’s selection of the UTRA air interface, ITU IMT-2000 milestones including the Qualcomm–Ericsson IPR settlement, UMTS releases (Rel 99 to Rel 6), and UK 3G auctions and network launches.]

Figure 24. Timeline of major events in third generation (3G) standards creation and adoption in UK/Europe

The creation of 3G standards in the US

The situation in the US was different from that in Europe. Operators were using D-AMPS, cdmaOne, and the GSM based PCS-1900 air-interface standards and IS-41 rather than

GSM MAP for core network signaling. An additional complication was that the IMT-

2000 bands identified by the ITU at the 1992 WARC had been partially allocated to the

PCS services (see Figure 25). It would be 2006 before substantial additional spectrum was made available by the FCC39 (Nolle, 2006; Pappalardo, 2006; Writer, 2006). Thus in the US an additional requirement for 3G standards was that they should be able to coexist with existing 2G standards since operators would have to transition to 3G ‘in-band.’

[Figure: frequency chart, 1720–2160 MHz, showing the IMT-2000 allocations overlapping the US PCS bands, with the European DCS/GSM1800 bands sitting below them.]

Figure 25. IMT-2000 frequency allocations with respect to existing GSM1800 (Europe) and PCS allocations (US)
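The ‘in-band’ constraint can be illustrated by intersecting the frequency ranges involved; the band edges below are approximate published figures used only for illustration.

# Why US operators faced an 'in-band' 3G transition: the PCS allocation
# overlaps the IMT-2000 core band identified at WARC-92 (edges approximate).

def overlap_mhz(a, b):
    """Width of the intersection of two (low, high) frequency ranges."""
    low, high = max(a[0], b[0]), min(a[1], b[1])
    return max(0, high - low)

imt2000_lower_band = (1885, 2025)  # MHz, lower WARC-92 core band
us_pcs = (1850, 1990)              # MHz, US PCS band

print(overlap_mhz(imt2000_lower_band, us_pcs), "MHz of overlap")  # -> 105 MHz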

The 3G migration path for GSM and D-AMPS (IS-54 / IS-136) operators in the US

PCS operator VoiceStream was a major operator of GSM technology at 1900MHz in the US (it was renamed T-Mobile after its merger with Deutsche Telekom in 2001). Its initial technology choice made it almost inevitable that its evolutionary path would be the

39 Paired spectrum from 1,710-1,755 MHz and 2,110-2,155 MHz was auctioned in August/September 2006. Existing wireless operators and cable companies spent about $13.9 billion to gain access to this spectrum.


same as GSM operators in the rest of the world (see Figure 22) – although it would be sharing spectrum with 2G systems at 800MHz and 1900MHz rather than residing in separate spectrum as in Europe.

The US operators that had deployed D-AMPS (AT&T Wireless and Cingular being the largest after various mergers and acquisitions) worked with manufacturers on an upgrade path to 3G. The industry group supporting TDMA technology, the UWCC

(Universal Wireless Communications Consortium), set the standardization agenda with the actual standards creation work taking place in the TR45.3 subcommittee of the TIA.

The first step envisaged by the UWCC was an enhancement of IS-136, called IS-

136+, based on using an 8-PSK modulation scheme in existing 30 kHz channels. A

GPRS based packet data infrastructure would be used (Albright, 2000b).
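The arithmetic behind the IS-136+ step can be sketched as follows; the symbol rate is the approximate D-AMPS figure for a 30 kHz channel, and the gross rates ignore channel coding and other overhead.

import math

# Illustrative arithmetic: keeping the 30 kHz channel and its symbol rate,
# but moving from pi/4-DQPSK (2 bits/symbol) to 8-PSK (3 bits/symbol).
# The symbol rate is approximate; gross rates ignore coding overhead.

symbol_rate_ksym = 24.3  # approximate D-AMPS symbol rate, ksymbols/s

for name, modulation_order in [("pi/4-DQPSK (D-AMPS)", 4), ("8-PSK (IS-136+)", 8)]:
    bits_per_symbol = int(math.log2(modulation_order))
    print(f"{name:20s} {bits_per_symbol} bits/symbol -> "
          f"{symbol_rate_ksym * bits_per_symbol:5.1f} kbit/s gross")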

The next stage involved the adoption of EDGE (Enhanced Data rates for Global
Evolution), co-developed with ETSI and the GSM community. A variety of versions of

EDGE were envisaged including those with bit rates sufficient to qualify as a potential member of the IMT-2000 family (Alleven, 1998; Williams, 1998). However, there was skepticism as to whether EDGE was a viable 3G technology (Smith, 2000a). TDMA based operators were positioning EDGE as a 3G technology despite GSM operators viewing it as a 2.5G evolutionary step to full 3G using UMTS (Bassuener, 2002).

The members of the UWCC included network operators and manufacturers e.g.

Ericsson, Lucent, Nokia, Nortel, Alcatel, and Motorola. Many of these manufacturers were also lead proponents of GSM, and D-AMPS enhancements had increasingly adopted

GSM based solutions. The GSM and D-AMPS standards were growing together and there were proposals for interoperability between the two TDMA based technologies,


including dual mode handsets and supporting voice services over EDGE (Albright,

2000a; Radousky, 1999).

By late 2000 AT&T Wireless announced an indirect route to 3G (Foley, 2000). It

would deploy an overlay GSM/GPRS network using infrastructure that would support a

direct upgrade path to EDGE and UMTS (Smith, 2000b). The announcement on the

migration path was part of NTT DoCoMo’s investment in AT&T Wireless that would also see a version of i-mode40 deployed in the US as mLife (R. Lee, 2000). AT&T started its rollout of GSM/GPRS in July 2001 (Bassuener, 2001). Cingular Wireless, a joint venture between BellSouth and SBC, announced a similar GSM/GPRS overlay strategy in November 2001 (Luna, 2002). Cingular was already using GSM at 1900 MHz in some of its markets. See Figure 26 for an illustration of the primary migration paths for

TDMA operators in the US.

[Figure: migration path from AMPS (1G, FDMA) through D-AMPS/IS-54 and GSM (2G), IS-136+, GPRS, and EDGE (2.5G), to WCDMA (3G), with indicative data rates rising from 9.6 kbits/s to 384 kbits/s – 2 Mbits/s.]

Figure 26. Migration path for D-AMPS operators to 3G

40 i-mode is NTT DoCoMo's popular wireless Internet service which launched in Japan in early 1999


Supporters of the cdma2000 3G technology viewed the transition to 3G as an

opportunity to convert the TDMA operators to cdma2000. For example the CDG

published a report highlighting the difficulties TDMA operators would face migrating to

UMTS/WCDMA and recommending they reconsider cdma2000 (Albright, 2001a). They

had success with some of the smaller US operators (US Cellular and Cellular South) and

in South America (Marek, 2002). The controversy about the best migration path for

TDMA operators continued even after the major operators had already announced their intention to follow the path to UMTS/WCDMA (Clift, 2002; Drucker, 2001).

The 3G migration path for cdmaOne (IS-95) operators in the US

The development of a 3G upgrade path for operators of cdmaOne networks was initiated in mid-1997 by Lucent, Motorola, Nortel, and Qualcomm. Responsibility for this development was transferred to the CDMA Development Group (CDG) soon thereafter with participation widening to include Hughes, Samsung, NEC, Nokia, OKI,

Philips and Sony. The actual creation of CDMA based standards was performed in TIA committee TR45.5 (Bekkers, 2001 Ch. 13).

As with GSM the migration path to 3G for CDMA networks included several possible steps. In 1997 the cdmaOne specification was enhanced (IS-95B) to include a 64 kbits/s packet data service. This upgrade only required a software upgrade for more recent cdmaOne infrastructure. Its performance was broadly comparable to that of GPRS

(Thelander, 2005; West, 2002a, 2002b).


The next migration step for cdmaOne operators was the cdma2000 family of

standards that were submitted to the ITU for consideration as IMT-2000 3G standards.

The cdma2000 approach was much more focused on ease of upgrade for existing

cdmaOne operators than was the case for the upgrade to UMTS for GSM operators. The

first step in the evolution path, known as cdma2000 1xRTT, had backward and forward

compatibility with cdmaOne. A key part of this was retaining 1.25MHz carrier spacing in

the same frequency bands i.e. cdmaOne based operators would be performing an “in-

band” migration to 3G. This meant that IS-95 handsets functioned in 1xRTT networks

and 1xRTT handsets would operate in an IS-95 network – albeit without the more

advanced features. The 1xRTT upgrade increased user data rates to around 153 kbits/s.

While the 1xRTT upgrade required hardware changes to base stations and controllers it

also approximately doubled system voice capacity which made it an attractive upgrade

for operators irrespective of its data capabilities (Thelander, 2005).
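A small sketch of why the 1.25MHz carrier spacing made in-band migration tractable; the 10 MHz operator block is hypothetical, and guard bands are ignored for simplicity.

# How many carriers fit in a hypothetical 10 MHz operator block?
# cdma2000 could be introduced carrier-by-carrier inside existing 2G
# allocations, while WCDMA needed a larger contiguous block per carrier.

block_mhz = 10.0
for technology, carrier_mhz in [("cdmaOne/cdma2000 1xRTT", 1.25),
                                ("UMTS (WCDMA)", 5.0)]:
    print(f"{technology:24s} {int(block_mhz // carrier_mhz)} carriers of "
          f"{carrier_mhz} MHz in a {block_mhz:.0f} MHz block")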

The largest US operator of a cdmaOne network was Verizon Wireless (a joint venture

of Verizon41 and UK based Vodafone). Verizon used AMPS at 850MHz and cdmaOne at

850MHz and 1900MHz. Sprint PCS used cdmaOne in its 1900MHz PCS allocation.

Second-tier operators like Alltel, US Cellular, and others also adopted cdmaOne. Verizon Wireless launched cdma2000 1xRTT in ten markets in January 2002 and Sprint launched their nationwide upgrade to the technology in August of the same

year (Carroll, 2002; Kenedy, 2002; McCall, 2002).

There was a potential branch in the next stage of evolution of cdma2000. An

extension optimized for data traffic, known as cdma EV-DO (Evolution – Data

41 Formerly Bell Atlantic and GTE


Optimized42) was favored by Verizon Wireless and by mid-2001 it had moved through the standardization process. While it could offer up to 2.4 Mbits/s it required that a complete 1.25MHz channel be dedicated to data services. Sprint PCS was inclined to wait for the development of an alternative upgrade path, cdma EV-DV (Evolution Data and Voice), that would support data at up to 5 Mbits/s and voice traffic on the same 1.25MHz channel

(Albright, 2001b; Luna, 2001a). However, by 2004 Sprint PCS had also committed to

EV-DO and was deploying it in many markets by mid-2005 (Pappalardo, 2005). With

Sprint’s decision to deploy EV-DO work on the standardization of EV-DV ceased

(Thelander, 2005). The 3G migration path for cdmaOne operators in the US is shown in

Figure 27.

[Figure: migration path from AMPS (1G, FDMA) through IS-95A cdmaOne (2G) and IS-95B (2.5G) to cdma2000 1xRTT, EV-DO, and the EV-DV and 3x options (3G), with indicative data rates from 14.4 kbits/s to 5 Mbits/s.]

Figure 27. 3G Migration path for cdmaOne operators

The Third Generation Partnership Project 2 (3GPP2)

While SDOs from the US (TIA and T1 Committee) were invited to participate in

3GPP, it became apparent in July 1998 that ETSI would not allow non-ETSI technologies

42 In older articles EV-DO was said to stand for Evolution Data Only


to be addressed within the new organization (Bekkers, 2001 Sec.13.4.2). The 3GPP2 was

established to collaborate on 3G specifications based upon the ANSI/TIA/EIA-41 core network standards and the air interfaces supported by it – particularly cdma2000 (Crowe,

2000). Like 3GPP, 3GPP2 was intended to provide a setting for more rapid delivery of standards than was possible through the traditional ITU route. Specifications from 3GPP2 are still delivered to the ITU via the project’s organizational partners – these are the following five officially recognized SDOs:

ARIB - Association of Radio Industries and Businesses (Japan)

CCSA - China Communications Standards Association (China)

TIA - Telecommunications Industry Association (North America)

TTA - Telecommunications Technology Association (Korea)

TTC - Telecommunications Technology Committee (Japan)

The 3GPP2 requires that participating individual member companies be affiliated with at least one of the Organizational Partners. In addition, the Project has Market

Representation Partners (MRPs) “who offer market advice to 3GPP2 and bring a consensus view of market requirements (e.g. services, features and functionality) falling

within the 3GPP2 scope” (3GPP2, 2007). They are the CDMA Development Group

(CDG), the IPv6 Forum, and the International 450 Association (IA 450).


IMT-2000 proposal and IPR issues

In July 1998, the ITU received 10 proposals for IMT-2000 interfaces. They included

the FDD and TDD mode UMTS air-interfaces, and DECT from ETSI. ARIB also submitted a WCDMA proposal that was almost identical to ETSI’s UMTS proposal. The

TIA submitted several proposals including UWC-136 (EDGE) from the UWCC, and cdma2000 from the CDG. The US also submitted two other CDMA based proposals.

The IPR in the UMTS air interfaces (UTRA) came to dominate the creation of 3G standards from early-1998 until mid-1999. The primary actors in the battle over IPR were

Qualcomm and Ericsson, which had been involved in litigation around CDMA patents for

several years. Qualcomm had built up a considerable portfolio of patents around CDMA

technology and there was little doubt that other manufacturers would need to license

Qualcomm’s patents to develop equipment based on the UTRA interfaces. Ericsson also

had essential IPR, although its portfolio in CDMA technology was not nearly as large as

Qualcomm’s. ETSI and the ITU would be unable to complete their standards creation

activities without all parties declaring that IPR licenses would be available on “a fair,

reasonable, and non-discriminatory basis.” In September 1998 both Ericsson and

Qualcomm declined to send such assurances to the ITU (Bekkers, 2001 Ch. 13; Mock,

2005).

Qualcomm stated that it held “essential IPR to ETSI's proposed …. candidate

submission [i.e. UMTS] and that Qualcomm would license its IPR only on fair,

reasonable and non-discriminatory terms for standards meeting a set of technical criteria

based on three fairness principles which support convergence of all proposed 3G CDMA

technologies. The fairness principles are:


1. A single, converged worldwide CDMA standard should be selected for 3G;

2. The converged CDMA standard must accommodate equally the two dominant

network standards in use today (IS-41 and GSM-MAP); and

3. Disputes on specific technological points should be resolved by selecting the

proposal that either is demonstrably superior in terms of performance, features,

or cost, or, in the case of alternatives with no demonstrable material difference,

the choice that is most compatible with existing technology” (Qualcomm,

1998).

Qualcomm was using its strong IPR portfolio to ensure that its operator customers would not be left as ‘angry orphans’ after having selected cdmaOne for their 2G systems.

The Qualcomm position was in response to what it saw as a series of anti-competitive actions taken by ETSI and EU institutions to perpetuate the dominance of Europe in global mobile markets. Qualcomm believed that ETSI was making choices about the parameters (particularly the chip rate of the spreading code) of the air interface that would make backward compatibility with cdmaOne all but impossible. This, along with

ETSI’s unwillingness to consider supporting the ANSI-41 core network signaling protocol in addition to GSM-MAP, would make cdmaOne operators’ transition to UMTS prohibitively expensive. In addition European legislation and directives appeared designed to keep non-European standards out of the market in the EU. Qualcomm took its grievances to the US government, which had been having ongoing disputes with the EU on open markets, level playing fields, and their respective WTO obligations. In his


testimony before the sub-committee on Trade of the House Committee on Ways and

Means, Qualcomm’s senior VP for external affairs called ETSI’s behavior a “flagrant act of protectionism … creating an artificial monopoly.” In reply to a letter of protest from

the US government, the EU Commissioner declined to get involved in the IPR dispute. In essence manufacturers, particularly Qualcomm and Ericsson, would have to reach their own agreement (see page 166 for more on this letter) (Bekkers, 2001 Ch. 13; Mock,

2005).

The ITU imposed a deadline of Dec 31, 1998 for the resolution of the IPR issues

(later extended to March 31, 1999). Without such a resolution it was likely that UMTS could not have been accepted as a member of the IMT-2000 family. As the final deadline loomed both companies’ incentives to reach an agreement outweighed any possibility of winning definitively in the courts. In March 1999 they agreed to settle their ongoing patent disputes, cross-license their IPR, and give the ITU the assurances that they would

“license their essential patents for a single CDMA standard or any of its modes to the rest of the industry on a fair and reasonable basis free from unfair discrimination”. The

‘single CDMA standard’ they referred to included options for both UMTS and cdma2000. The press release stated that each option would support both GSM MAP and

ANSI-41 core networks. The deal also included the sale of Qualcomm’s infrastructure division to Ericsson (Qualcomm, 1999). After the dispute was resolved Ericsson joined the CDG in May 1999. However, the eleventh hour agreements between Qualcomm and

Ericsson did not arrive in time for the ITU to reach a final decision about IMT-2000 in

March 1999.


Many operators were disappointed that the ITU, partnership projects (3GPP and

3GPP2), and the regional SDOs were unable to reach satisfactory agreements regarding

3G standards. In early 1999 the Operators Harmonization Group (OHG) was established to drive the final stages of standardization. By mid-1999 the OHG had worked with manufacturers to reduce the number of proposals and harmonize key system parameters, particularly CDMA chip rates, to permit cost-effective multimode handsets. To further interoperability the OHG gained a consensus that each of the four proposals it was finalizing would eventually support GSM MAP and ANSI-41 core network signaling protocols (PR Newswire, 1999a, 1999b). This operator driven group had succeeded where the ITU and the SDOs had failed.

Finally, in May 2000 the ITU published recommendation M.1457 which accepted the four radio interfaces harmonized by the OHG along with DECT as official members of the IMT-2000 family:

Air-interface – Standardized by:
FDD mode of the UMTS air-interface – 3GPP
TDD mode of the UMTS air-interface43 – 3GPP
EDGE – proposed by UWCC, now under 3GPP
DECT – ETSI
cdma2000 – 3GPP2

43 Note that the TDD mode of the UMTS air interface also forms the basis of the TD-SCDMA (Time Division – Synchronous CDMA) standard supported by China.


It should be noted that there were in effect two families. UMTS (FDD and TDD modes) and EDGE were standardized by 3GPP and formed the main migration path for operators with GSM/GPRS and D-AMPS based 2G networks (see Figure 22 and Figure

26)44. The cdma2000 set of interfaces formed the migration path for cdmaOne operators

(see Figure 27). Summaries of the main events in 3G standardization are provided as timelines for Europe in Figure 24 and for the US in Figure 28.

In 2007 the 802.16 wireless metropolitan area network (WMAN) air-interface (also known as WiMAX) developed by the IEEE was added as a sixth member of the IMT-

2000 family (ITU, 2007).

44 Also NTT DoCoMo in Japan which had used the TDMA based PDC air-interface as its 2G standard.


[Figure: timeline, 1990–2007, of ITU milestones (WARC-92 FPLMTS bands, ten IMT-2000 proposals in 7/98, adoption of multiple IMT-2000 standards in 11/99), the IS-95/cdmaOne releases and launches, the cdma2000 1x and EV-DO revisions, the establishment of 3GPP and 3GPP2 in 1998, the Qualcomm–Ericsson patent settlement in 3/99, and the WCDMA/UMTS releases (Rel 99 to Rel 6).]

Figure 28. Timeline of major events in third generation (3G) standards creation and adoption in the USA

Discussion of 3G standards creation and industry structure

In this section we return to the research questions presented in Table 5 of chapter 3.

We use the descriptions of the development of third generation mobile wireless standards to provide a range of answers to these questions and highlight where the theoretical perspectives introduced in chapter 2 provide the most insight.

While the high-level case study provided the ‘big picture’ for addressing the research questions, it could only tap the interests and conceptualizations through second hand accounts. To add more depth to these secondary sources, several interviews of decision makers in the wireless and related industries in the US and the UK were performed. Details of the interviews and the interviewees are presented in Table 9 and

Table 10. Their names are not included since anonymity was guaranteed. There were a total of 27 interviews involving 42 interviewees. The Identifier associated with each organization shown in the table (e.g. US8 is a handset manufacturer) is used later to tie quotations back to a specific organization.

The interviews were performed between 2003 and 2006. This was after the creation of 3G air-interfaces but during the industry’s transition to the new technologies and its exploration of new service possibilities. The interview data offered insights into each of the research questions presented in Table 5. However, the data did not map neatly to particular questions. So while the themes and narratives present in the data are presented within the context of one or other of the research questions they generally provide insight into all of them. Some of these findings have already been published in a paper that


appeared in the journal Telecommunications Policy (Tilson & Lyytinen, 2006), or were presented at conferences (Tilson & Lyytinen, 2005; Tilson, Lyytinen, Sørensen, & Liebenau, 2006).

Identifier | Role of organization | Origin | Role of interviewee(s) | Date (length, h.mm)
US1 | Infrastructure manufacturer | USA | Two directors of corporate strategy group; director of mobility standards | Oct 2003 (1.50); Feb 2004 (1.50)
US2 | System integrator | USA | CTO of pervasive computing; CIO Advanced Technology; architect of pervasive computing | Jan 2004 (1.50)
US3 | Wireless network operator | USA | Two executives in strategy; two executives in consumer and business market business development | Mar 2004 (2.15)
US4 | Semiconductor manufacturer & software vendor | USA | Director of engineering – responsible for standards activities | Mar 2004 (1.50)
US5 | Industry consortia | USA | General manager | Mar 2004 (0.45)
US6 | Handset manufacturer | Europe | Two top R&D executives | Mar 2004 (1.25)
US7 | Semiconductor manufacturer | USA | Executive for strategic planning for communication components division | Apr 2004 (1.30)
US8 | Handset manufacturer | USA | VP responsible for global standards team, patents and two advanced solutions teams; VP of Engineering for 3G | Apr 2004 (1.00)
US9 | Regulator | USA | Chief and Associate Chief of the policy and rules relating to wireless communications | May 2004 (1.40)
US10 | Content provider | USA | Executive in corporate strategy; Senior VP of Technology; director of corporate strategy; business development for online services; head of new business development in cable business (including broadband wireless) | Nov 2004 (1.30); Nov 2004 (1.30)

Table 9. US Case study - details of interviews and interviewees


Identifier | Role of organization | Origin | Role of interviewee(s) | Date (length, h.mm)
UK1 | Wireless network operator | UK/Europe | Researcher in programs on business strategy; manager in group strategy group; executive in regulation matters | Nov 2005 (1.30); Nov 2005 (1.30); Feb 2006 (0.55)
UK2 | Wireless network operator | UK/Europe | VP Marketing; project manager – data services projects | Nov 2005 (1.20; 2.15)
UK3 | Wireless network operator | UK/Europe | VP of Corporate Strategy | Nov 2005 (1.30)
UK4 | Industry association | UK/Europe | Strategic initiatives manager | Nov 2005 (1.30)
UK5 | Regulator | UK | Head of R&D; manager – broadband wireless spectrum and regulation | Nov 2005 (1.40); Nov 2005 (1.50)
UK6 | Broadcaster and content provider | UK | Business development – involved in deals with wireless network operators | Nov 2005 (1.00)
UK7 | Consultant | UK | Several consultants that have actively participated in and led engagements in the mobile and wider telecom industry | Nov 2005 (2.30)
UK8 | Fixed network operator with MVNO | UK | Manager, new mobility service development; director, network convergence / innovation | Nov 2005; Dec 2006 (1.50)
UK9 | Customer service solution provider for operators | UK | Head of Product Marketing | Nov 2005 (1.20)
UK10 | Content platform provider | Japan | Sales manager for European market | Dec 2006 (1.35)
UK11 | Broadcaster and content provider | UK | Head of strategy for non-broadcast initiatives | Jan 2006 (1.20)

Table 10. UK Case study - details of interviews and interviewees

Standards creation and adoption (RQ1)

Several visions of the existing and future actor-networks influenced the way that 3G standards were developed. The initial vision of a single global third generation air-interface standard was not realized – although by 2000 only two major families of air-interfaces remained:

• The GSM, GPRS, EDGE, UMTS (FDD and TDD) family standardized by

3GPP

• The cdma2000 family standardized by 3GPP2


During the earliest days of FPLMTS/IMT-2000 the fixed telecom network operators strove to include wireless communications in a wider vision of telecom services provided by operators – a vision that retained the intelligence within the network (and was therefore controlled by operators). This vision of how a physical telecom network should be configured was in alignment with the telcos' interests in how the wider business actor-network should be configured. It influenced the telcos' standardization strategies and the standardization agenda within the ITU and other SDOs. The vision persisted throughout the EU's sponsorship of R&D on a variety of third generation related topics (the RACE and ACTS programs), but could not be maintained when the configuration of another communications network, the Internet, with intelligence at the edge, came to dominate the vision for the future of interactive data services. This change in the future vision of the platform for interactive multimedia services applied to both the fixed and the mobile wireless networks.

The economic perspectives summarized in Chapter 2 can help us understand some of the standardization decisions. For example, the pragmatism of staying with 30 kHz channel spacing meant that D-AMPS operators had to make a fairly big change for their transition to 3G. For this upgrade they mostly chose UMTS, at least partially based on its greater expected indirect network externalities (particularly economies of scale) – despite the best efforts of cdma2000's supporters to enroll them.
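To make the scale-economies reasoning concrete, the following minimal sketch (in Python; the payoffs and installed-base figures are illustrative assumptions of ours, not data from this study) shows how an externality term proportional to the expected installed base can outweigh a higher standalone valuation:

def adoption_payoff(standalone_value, externality_weight, expected_installed_base):
    # Utility of adopting a standard: intrinsic value plus a term that
    # grows with the expected installed base (a proxy for the indirect
    # externalities of scale economies in handsets and infrastructure).
    return standalone_value + externality_weight * expected_installed_base

# Hypothetical figures for a D-AMPS operator weighing its 3G options.
umts = adoption_payoff(5.0, 0.01, 800)      # large expected GSM/UMTS base
cdma2000 = adoption_payoff(6.0, 0.01, 300)  # smaller expected base

print(umts, cdma2000)  # 13.0 9.0 -- the larger expected installed base
# outweighs a higher standalone value, consistent with the choice above.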

However, the economic perspective does not readily provide insight into the changes in visions within the standards creation processes. These changing visions are readily conceptualized within the actor-network based perspective elaborated in Chapter 3, and the dynamics of the effects of the interactions among actors are captured by the model of strategy formulation depicted in Figure 2.

The European Commission created a vision of repeating Europe's success in winning the 2G standards war and invested heavily in R&D for over a decade. Despite these investments the Commission, for many years at least, was not particularly successful in enrolling network operators or the more commercially oriented personnel within manufacturers. It produced a 'UMTS Vision' document to help propagate the vision and created additional actors to focus attention and coordinate activities (e.g. the UMTS Task Force and UMTS Forum). However, it was not this vision of the future that garnered the enrollment of the manufacturers and operators, but rather the alternative vision of the Japanese wireless industry displacing the Europeans as global leaders. This also entailed a simplification of the vision by dropping many of the 'faces' that had been part of the UMTS vision (these had included cordless telephony, wireless PABX, wireless LAN, wireless local loop, private mobile radio, satellite systems, and paging networks). The active role of the European Commission and other European-level institutions in the creation of the UMTS standard is in stark contrast to the 'hands-off' technology-neutral approach taken by the US regulatory regime.

The free trade obligations of the US and Europe, and particularly the tensions between them on trade, played a somewhat belated role in the way that UMTS was supported by European-level institutions. However, while Qualcomm and the US may have been able to claim a victory in principle by limiting how the Commission supported the roll-out of UMTS, it made little practical difference as UMTS became the community-wide 3G standard in any case. In this instance the adoption of 3G technologies was largely dictated by 2G technology choices – the configuration of the existing wireless communication networks and the creation of migration paths tailored specifically to them. While one could think of the principle of free trade as forming an institutional basis for explaining this episode, it would not extend across generations of technology or even explain the uncertainties that raged about the outcome at the time. Rather, the treaty obligations and the state of the European–US relationship can be conceptualized as an actor-network configuration in the regulatory regime that could be strategically used by one set of actors to constrain the actions of others – a conception that fits very well into the actor-network perspective.

The creation of 3G standards highlighted the preference of actors for the creation of a standard, at least in certain circumstances, even if it was not the outcome they preferred. For example, the supporters of the TDD based air-interface for UMTS accepted a compromise that saw their preferred option demoted to a secondary role. Similarly, Qualcomm and Ericsson settled their IPR differences just in time to allow the selection of IMT-2000 air-interfaces. Although we did not cover game theory in the theoretical perspectives summarized in Chapter 2, this sort of decision making could be modeled with some variant of the 'battle of the sexes' game. However, the underlying interests of the players / actors can just as easily be incorporated into the actor-network perspective, which is flexible enough to incorporate network economics, international trade, backward compatible technology, or other interests and actor-network configurations that do not conform to the assumptions of more restricted theoretical perspectives.
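For readers unfamiliar with the game, the following minimal sketch (in Python, with purely illustrative payoffs of our own choosing, not a claim about the actual valuations of the 3G camps) shows why the 'battle of the sexes' structure makes agreement on some standard the only stable outcome:

# Illustrative 'battle of the sexes' payoffs: each camp prefers its own
# air-interface, but both prefer any agreed standard to fragmentation.
# payoffs[(row_choice, col_choice)] = (row_payoff, col_payoff)
payoffs = {
    ("WCDMA", "WCDMA"): (3, 1),        # row camp's favorite prevails
    ("cdma2000", "cdma2000"): (1, 3),  # column camp's favorite prevails
    ("WCDMA", "cdma2000"): (0, 0),     # no agreement: worst for both
    ("cdma2000", "WCDMA"): (0, 0),
}
choices = ["WCDMA", "cdma2000"]

def is_nash(r, c):
    # Neither camp can gain by unilaterally switching its choice.
    row_ok = all(payoffs[(r, c)][0] >= payoffs[(alt, c)][0] for alt in choices)
    col_ok = all(payoffs[(r, c)][1] >= payoffs[(r, alt)][1] for alt in choices)
    return row_ok and col_ok

print([(r, c) for r in choices for c in choices if is_nash(r, c)])
# [('WCDMA', 'WCDMA'), ('cdma2000', 'cdma2000')] -- only the coordinated
# outcomes are equilibria, so an actor may accept a standard that is not
# its preferred one rather than hold out for no standard at all.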


The interviews with decision makers in the wireless industry provided more insight into their views of standardization and other aspects of change in the industry. While the case histories presented earlier in this chapter provide a broad overview spanning decades, the interviews provide a snapshot of a particular point in time.

The US interviews (see Table 9), performed in 2004, highlighted three main areas of change in the standardization arena: what is being standardized, what is considered to be most strategically important, and where standardization efforts are taking place (Tilson & Lyytinen, 2006).

What is being standardized?

Data services increased the complexity of wireless systems and introduced many more inter-related components into both the infrastructure and the wireless devices. Thus the range of interfaces and technologies subject to coordination and possible standardization moved beyond air interfaces, voice codecs and signaling protocols to include specifications higher in the stack, including data representation and transmission (e.g. cHTML, WAP), application platforms (e.g. Java, BREW, PalmOS, Symbian, and WinCE) and user interfaces (e.g. Symbian Series 60).

“As the need for the number of standard interfaces increases, the problem of interoperability increases—probably exponentially.”

(Semiconductor manufacturer – US4)


In addition, coordinating interactions among new and old industry participants has created a need for new interfaces. For example, from the network operator’s perspective, standards are needed to handle the management and aggregation of data flowing from content providers and service providers. An interviewee from a system integrator believed that a lack of standards in this area was holding back some applications.

As the wireless device took on more of the characteristics of a computer there was increased attention given to the modularization of the device and standardization of the internal interfaces (e.g. the Mobile Industry Processor Interface Alliance). Bluetooth also created a cross-manufacturer standard for interconnecting handsets with other devices.

Strategically important interfaces

While the air interface remained critical for interoperability in 3G systems and for realizing economies of scale, there was a general consensus among the interviewees that standards battles had migrated to higher layers in the stack and shifted to domains that were once the sole concern of the computing industry.

“….that’s one thing that has happened the last few years… it’s not a war anymore [between 3G air-interface standards]”

(Semiconductor manufacturer – US4)

“We are actually very active in contributing to [the IETF], because that’s critical to the success of wireless mobility. Now we’re dealing with a whole different group of people—content providers, the IT community and that’s relatively new for us. It has become significantly more complex.”

(Infrastructure manufacturer – US1)

A number of proprietary and open platforms for the delivery of content to devices emerged, and the role of the operating system and middleware on devices and backend systems also became more important in the industry. Open higher layer interface definitions for 3G, which mostly concern interactions with new industry participants, were being actively designed and negotiated in an industry-wide forum: the Open Mobile Alliance45 (OMA).

In Japan, DoCoMo’s dominant position allowed it to set the technical standards for all layers of the stack and to more or less dictate how industry participants were coordinated. In contrast, there remained numerous service delivery platforms, device operating systems, and user interfaces in the US and the UK. There was little prospect that a dominant data services platform would emerge in the US or the UK in the near future.

Where standardization takes place

The advent of 3G greatly complicated the scope of the standardization effort. There were over 100 standards bodies, and participants now also came from the computing, data networking, and content industries. In addition to the traditional standards development organizations (SDOs; e.g. ITU-R, ETSI, TTA, TTC, ARIB, TIA, ATIS) there were new operator and vendor driven industry consortia (e.g. 3GPP, 3GPP2, OMA) as well as forums that cross the wired, wireless and computing domains (e.g. IEEE, IETF and W3C). Even the biggest players in the industry only attended about half of them, and some companies worked with partners to allow them to monitor forums they did not attend. The role of the SDOs changed. For example, the primary forum for WCDMA standardization moved to an industry consortium (3GPP) as the coordination of activities in the ITU and four regional SDOs became too difficult and resource intensive.

45 www.openmobilealliance.org

“I think about it this way. ETSI and TIA no longer have meetings to do standardization. 3GPP and 3GPP2 meet very frequently and are well attended. They create the specifications and ETSI and TIA approve them. They rubberstamp them at that point.”

(Semiconductor manufacturer – US4)

Inter-organizational coordination and relationship building (RQ2)

Certain organizational level actors emerged from the US interviews (see Table 9) as being particularly important wireless industry participants. An attempt to map the critical relationships among the traditional and new industry participants around 2004 is depicted in Figure 29 (Tilson & Lyytinen, 2006). The expansion in the number and types of industry participants introduced multiple new relationships (compare with the industry structure for 1G/2G shown in Figure 21). However, the connections among players go well beyond those in Figure 29. Industry participants strive to influence many other industry participants – well beyond those with which they have direct ‘supply chain’ relationships. In addition, at least some of the existing relationships among traditional industry participants are changing as the wireless service portfolio widens.


“If you look at the multiple value chains… you want to make sure that you are influencing all the parts of this stack, so to speak. Otherwise, it’s not going to work.”

(Semiconductor manufacturer – US7)

[Figure content: three domains. The innovation system: semiconductor manufacturers, device manufacturers, infrastructure manufacturers, and OS and middleware vendors. The market place: network operators, content providers, service providers, system integrators / solution providers, and corporate and consumer customers. The regulatory system: government / regulators, covering spectrum allocation, operator licenses, industrial policy and the regulatory framework, with new dimensions of DRM, privacy and security policy.]

Figure 29. Major organizational actors in the wireless industry (Tilson & Lyytinen, 2006) (New types associated with data services are shown in dashed boxes)

This view of influencing change fits well with the model of actor-network based strategy formulation shown in Figure 2. Successive interactions among actors alter their perceptions of the current and possible future actor-network configurations. The strategies enacted to enroll new actors, and to maintain or alter existing relationships, in turn change these perceptions. The interactions of the strategy formulation loops of numerous actors bring about the dynamic reconfiguration of inter-organizational structures, product offerings and the other aspects of the industry’s structure.

From an analysis of the 2004 US interview data (see Table 9) four themes about the main industry-level changes during the 3G transition emerged: services, industry participants, alternative technologies and the regulator’s changing role.

Services

The voice market in the developed world was already heavily penetrated and the average revenue per user (ARPU) for voice was declining due to fierce competition. While voice ‘‘is still King’’, as one operator put it, operators were looking to data services as a way of maintaining revenue and income growth.

“our belief is that 50% of the handsets would have data usage… the 20% discount [to corporate customers] will be mitigated by additional power usage [of data services]… it helps with keeping an element of growth… it’s overcoming ARPU reduction”

(Wireless network operator – US3)

“[voice] is very competitive nowadays, with number portability, virtual network operators, flat rates and free calls between users in the same network. . . we have no alternative, we have to be able to make these sorts of [data] services become real and generate more revenue for the operators”

(Handset manufacturer – US8)


However, in 2004 there was considerable uncertainty around the demand for data services delivered to handsets. The patterns of adoption of initial offerings around the world had been very different. The uncertainty remained despite considerable market research in the US.

“there will not be one or two applications that will solve the [poor initial uptake of 3G data service in the US and Europe]… so you just will have to have a lot of services”

(Handset manufacturer – US8)

“the services business is a kind of a new space for the telecom providers… no one knew how to sell [wireless data services]”

(Wireless network operator – US3)

There was doubt about whether there was a need for truly broadband wireless data connectivity to support consumer applications, and a recurring thread of uncertainty about the willingness of customers, particularly consumers, to pay for data services. An interviewee (US2) from one of the major system integrators expressed the view that the then-current data transport offerings (2.5G and 3G) were too expensive to attract customers, or to support compelling business cases for many corporate customers. In addition, US consumers and service providers had not been responsive to the provision of just a general data service capability on handsets (e.g. a WAP browser).


“[offerings] where we’ve been financially successful have all been with real tight integration”

(Wireless network operator – US3)

The industry had to contend with developing different offerings for corporate customers and consumers. While some corporate applications were common across industries (e.g. ‘‘email and Personal Information Management applications’’), many others were tied to specific industries and required extensive customization (e.g. package delivery and manufacturing applications).

Industry participants

There was a broad consensus among interviewees that the technical potential of wireless broadband data capabilities and new business models were bringing many new participants into the industry—particularly from the computing and content industries.

New industry participants from the computing industry included operating system and middleware vendors. Many such vendors were offering platforms for the delivery of web-style data, multimedia and other content based services. Service integrators were playing a role in the integration of mobile broadband applications into corporate backend systems. Computer game developers were also targeting mobile communication devices as gaming platforms. Content providers included the traditional creators and distributors of news, entertainment and music. By 2004 music, in the form of downloadable ring tones, had generated the greatest revenue. Additionally, new kinds of service providers were staking out positions in the industry (e.g. mobile email solution providers) as the traditional content providers tried to figure out what roles they should play in an industry that may become a major distributor of their content.

Alternative technologies

The traditional network operators faced threats from alternative wireless data transport technologies. Wi-Fi46 (802.11) hotspots were being deployed rapidly. Wi-Fi support was integrated into many laptop PCs and PDAs, and was seen as likely to be a feature of future handsets. WiMAX47 (802.16) promised wider coverage and higher data rates and would be integrated into chip sets for mobile devices in the near future.

The industry could not quite make up its mind as to whether this was an opportunity or a threat to established network operators. While these lower cost options operating in unlicensed spectrum threatened to steal data traffic from the 3G network operators, there was also an appreciation of their ability to accelerate the take-off of broadband wireless in general, to support the efficient use of spectrum, and to reduce the investments required. There were also opportunities for network operators to offer integrated billing solutions for the fragmented Wi-Fi market, and for solution providers to devise a means of abstracting away the transport technology to offer seamless roaming from Wi-Fi hotspots to 3G and back again; a sketch of this idea follows below. Some operators with more capital intensive migration paths to full 3G capability had decided to offer Wi-Fi connectivity as a cheaper alternative to address both the corporate (e.g. in convention centers and airports) and the consumer markets (e.g. in coffee shops).

46 http://www.wi-fi.org
47 http://www.wimaxforum.org
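The kind of transport abstraction mentioned above can be sketched as follows (a hypothetical interface of our own devising in Python, not any vendor’s actual product): applications see one connection while the device roams between bearers.

from dataclasses import dataclass

@dataclass
class Bearer:
    name: str           # e.g. "wifi" or "3g"
    available: bool     # currently in coverage?
    cost_per_mb: float  # illustrative tariff
    kbps: int           # nominal data rate

def select_bearer(bearers):
    # Pick the cheapest bearer in coverage, breaking ties by speed.
    usable = [b for b in bearers if b.available]
    if not usable:
        raise ConnectionError("no bearer in coverage")
    return min(usable, key=lambda b: (b.cost_per_mb, -b.kbps))

bearers = [Bearer("wifi", available=True, cost_per_mb=0.01, kbps=2000),
           Bearer("3g", available=True, cost_per_mb=0.10, kbps=384)]
print(select_bearer(bearers).name)  # "wifi" while in a hotspot; calling
# select_bearer again after leaving coverage would hand the session to "3g".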


Finally, there was an increase in the range of user terminals used by customers for accessing data services (e.g. PDAs, laptops, tablets and devices customized for specific industrial applications). Laptops allowed users to access the same data services and applications available to desktop PCs connected via wired (dial-up, DSL, cable, LAN) or wireless (Wi-Fi or other) networks. The simplest solution for many business applications was to use VPN technology to securely extend LAN-based applications to mobile users at home (on, say, DSL), using Wi-Fi hotspots or connecting via 3G data services (e.g. cdma2000 1xEV-DO offerings in major US cities).

Regulatory regime

Traditionally the US regulators had been responsible for the allocation of the radio spectrum necessary for the provision of wireless services and for issuing licenses to network operators. Several interviewees highlighted that the industry’s interactions with regulators gained additional dimensions with the advent of data services and the transmission of copyrighted content: namely privacy and digital rights management. The allocation of spectrum for unlicensed applications had spurred the development of some of the alternative data transport mechanisms (e.g. Wi-Fi and WiMAX).

The remainder of the discussion of this research question provides details of specific themes regarding inter-organizational coordination and relationship building that emerged from interviews with decision makers in the mobile wireless and content industries in the UK (see Table 10). These interviews were carried out in late 2005 and early 2006 and therefore represent a snapshot of industry thinking at that particular time, over a year after the US interviews had been completed. Four themes are presented. The first, around walled gardens and Internet services, provides a good example of where one set of actors (network operators) attempted to create an OPP (their wireless networks and walled gardens) that would shape the actor-network in their own favor, but where it actually inhibited the building of large actor-networks. This contrasts with our second theme, in which a multiplicity of potential technical and business model based OPPs for video services resulted in ambiguity about how a stable actor-network could be built. The third theme, around convergence with fixed services, offers an opportunity to examine how actors in the mobile wireless industry were developing differing visions of how to build actor-networks that either connect to, or compete with, strong existing actor-networks. Finally, we present some elements of the relationships among network operators, handset manufacturers, and consumers.

Walled Gardens and Internet Services

The services offered on mobile wireless platforms had diversified and used richer media as the capabilities of the devices and the bandwidths available had improved. Telephony continued to be the ‘killer app’ and SMS accounted for the vast majority of data service revenues. Network operators had offered ‘walled garden’ data services to customers (e.g. Vodafone Live and T-Zones) for several years. Operators looked to retain control of content and took 50-60% of service revenues. Content typically included news, sports and basic entertainment services using text, graphics and more recently video clips, as well as ring tones and wallpaper downloads. While ring tones had been a particular success, there was a growing realization that content creation was not really the operators’ strength and that there was a role for brands associated with quality content. News or sport branded by a network operator had not proven to be as compelling as, say, BBC News or Sky Sports. The uptake of what was thought to be a mature sports-alert service offering tripled when it was re-launched with a major content provider’s branding. There had been a general retreat by operators from creating content and a willingness to let content creators do what they are good at. This of course lessened the operators’ control of content.

While walled gardens could provide customers with the most popular content, network operators were unable to make deals with all the content providers that their customers would like to use (e.g. preferred news sites, financial information providers or sites covering specialized interests). T-Mobile focused on this ‘long tail’48 of content by making the full Internet available on handsets through its “Web’n’Walk” offering. While Internet access had been available on handsets for some time, the compromised experience on tiny screens and prohibitive tariffs had discouraged widespread use. T-Mobile claimed to have solved some of these problems by improving rendering on PDA sized screens and by offering more compelling pricing (UK Interviews – see Table 10).
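A back-of-envelope sketch (in Python, assuming purely for illustration that demand over content is Zipf distributed) indicates why the tail can rival the head that a walled garden captures:

# Compare the demand captured by a 100-site walled garden with the demand
# spread across the rest of a large catalog, under an assumed Zipf law.
catalog_size = 100_000   # sites reachable on the open Internet (assumed)
walled_garden = 100      # content deals an operator can manage (assumed)

def zipf_demand(rank):
    return 1.0 / rank

head = sum(zipf_demand(r) for r in range(1, walled_garden + 1))
tail = sum(zipf_demand(r) for r in range(walled_garden + 1, catalog_size + 1))

print(f"head share: {head / (head + tail):.0%}")  # ~43%
print(f"tail share: {tail / (head + tail):.0%}")  # ~57%
# Under these assumptions the top 100 sites capture less than half of the
# total demand; the rest is reachable only by opening up the full Internet.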

Another operator, O2, introduced the i-mode service that had proven extremely popular in Japan. An i-mode service launched in the UK in late 2005 with about 100 sites adopted the more generous 90/10 revenue split in the hope of better aligning the interests of the content providers49.

48 ‘The Long Tail’ is a reference to an article published by Chris Anderson in the October 2004 issue of Wired magazine. Anderson points out that businesses like Netflix and Amazon allow customers to access many more DVDs and books than would be economically viable with physical stores. Each DVD or book in the ‘long tail’ of the sales distribution represents only a tiny fraction of the sales of ‘hits.’ Nevertheless, the cumulative sales of all the products in the tail is a huge revenue opportunity for business models freed from the ‘tyranny of physical space.’ The interviewee who mentioned this concept was drawing a parallel with the limited content that a network operator could maintain in a walled garden compared to the much wider range of content available on the wider Internet.


It is evident that the industry had yet to find or create a dominant actor-network configuration for data services to handsets. Operators were trying a range of configurations for delivering data services to handsets. Each operator was offering some combination of walled gardens with a variety of business models or relinquishing control of content entirely by facilitating Internet surfing on handsets (UK Interviews – see Table 10).

In the early days of offering content-based data services to wireless handsets, network operators tried to establish themselves, their wireless networks and their walled gardens as OPPs for content and service providers. While some were certainly enrolled, there was a limit to the growth of the actor-network because of the tight control network operators retained over access to their OPPs, i.e. there were only so many content deals that could be made and managed. This approach to actor-network building also made it more difficult for small innovative content and service providers to enter walled gardens (UK Interviews – see Table 10).

Customers however did not enroll in large numbers and network operators had been disappointed with the take-up of these early services. Operators revised their visions of the possible actor-network configurations around data services to handsets, and the configurations envisaged were more diverse than before. One vision was that content providers’ incentives would be better aligned by the more generous revenue split associated with the i-mode business model. Another was that providing full Internet access on handsets allowed a network operator to benefit from passively joining the strong actor-networks among customers and the immense range of existing Internet content. Yet another was allowing strong existing content and service brands more prominence within walled gardens. Each of these options entailed network operators relinquishing control of content to a greater or lesser extent. This shift in power was also reflected in the major content providers’ reluctance to enter into exclusive relationships with network operators. Content providers preferred to, and were able to, strengthen their existing broadcast or web based connections with customers across most or all mobile networks. Services on mobile devices were seen as extending the existing links between customers and content providers to just one more distribution channel.

49 O2 dropped its support for i-mode in 2007, by which time it had a disappointing 260,000 users (Wray, 2007b).

Video Services

As device capabilities improved, video services became possible. Handset based video calling made its debut in the UK with 3’s launch of a 3G network in 2003. One reason proposed for the slow uptake of the service was that it was restricted to other ‘3’ customers – a severe limitation in the early phase of network development. At the end of 2005 Vodafone and Orange started to offer video content to their subscribers. Time-critical news and sport channels were transmitted live while less time-critical content was looped. The wireless technologies for video on handsets relied on non-scalable unicast mechanisms, but it was understood that these mechanisms would be replaced if video on the move became popular. However, there was considerable uncertainty about the broadcast technologies that would eventually dominate mobile video services. While digital terrestrial television (DTT) broadcasts existed, the modulation and coding schemes used did not lend themselves to reception on handheld devices due to battery constraints. A standard specifically intended for broadcasting to handheld devices (DVB-H) had been successfully trialed in the UK – however there was no spectrum allocated for such transmissions and a new transmission network would be required. Piggybacking on the digital audio broadcasting (DAB) network was conceivable. This had the advantage of existing spectrum allocations and transmission networks. The downside for operators of DVB-H or DAB broadcast mechanisms was that there was no clear revenue opportunity. A multicast enhancement to the UMTS air interface specification (MBMS) would allow operators a more efficient mechanism for delivering popular video content to their users, and to collect revenue (UK Interviews – see Table 10).
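A back-of-envelope sketch (in Python, with assumed per-stream and cell-capacity figures) illustrates why the unicast mechanisms were described as non-scalable:

# Unicast radio load grows linearly with the audience; one broadcast or
# multicast stream serves every viewer in a cell. Figures are assumptions.
stream_kbps = 128           # assumed per-stream video rate
cell_capacity_kbps = 2000   # assumed downlink capacity of one 3G cell

def unicast_load(viewers):
    return viewers * stream_kbps

def broadcast_load(viewers):
    return stream_kbps if viewers > 0 else 0

for viewers in (1, 5, 15, 50):
    print(viewers, unicast_load(viewers), broadcast_load(viewers))
# A single cell saturates at about 15 concurrent unicast viewers
# (15 * 128 = 1920 kbit/s of the assumed 2000), while the broadcast load
# stays at 128 kbit/s no matter how large the audience grows.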

Content providers were keen to provide some means for customers to legitimately access video content on-line to forestall the emergence of illegal mechanisms. This was increasingly occurring via fixed broadband access to desktop PCs and laptops – the ‘second screen.’ The screen on handsets was seen as a ‘third screen’ that could compete with, or complement, TV and PC screens. Video services on this ‘third screen’ were seen as ways of accessing an audience, particularly younger generations, that were watching less television. They could also help with customer acquisition, churn and average revenue per user – key pay-TV industry metrics (UK Interviews – see Table 10).

Because their customers were not tied to one mobile wireless operator, major content providers were loath to make exclusive deals with just one network operator – doing so would alienate most of their customers. The BBC’s universal service obligation effectively prevented it from reaching exclusive content distribution deals in the UK, at least for initial broadcasts, as the audience had already paid for the content through the annual television license fee.


As with other content, brand was important for video. For example, Vodafone’s video service was branded as “Sky Mobile TV” and was therefore easier for customers to understand than a “Vodafone TV” offering might have been (UK Interviews – see Table 10).

Services offering television to handsets had been launched using non-scalable 3G unicast mechanisms and trials of other technologies had been carried out. The brands of established video content providers were playing a more important role in the launch of mobile video services than was the case for the initial walled garden data services. However, there remained a great deal of uncertainty about the future configuration of the actor-networks that might be built around television type services to handsets. The adoption of digital terrestrial television (DTT) as the primary delivery mechanism would require the enrollment of technologists and nature to solve the energy storage density constraints of existing battery technologies. The DVB-H option did not have spectrum allocations, and building an actor-network around this technology would require the enrollment of regulatory actors and other spectrum users – which operators were reluctant to undertake given the widespread perception that, at £22.5 billion, the UK’s five network operators had overpaid for their 3G spectrum in 2000. A DMB (DAB-based) solution would benefit from the existing actor-network of spectrum allocations and national transmission infrastructure. However, network operators were unlikely to champion these technologies as they would be difficult to use as exclusive OPPs and there was no clear business model. The UMTS based multicast mechanism (MBMS) appeared to be the technology that operators would favor, given the central role it gives their network infrastructure. However, the future actor-network configurations around television on handheld devices remained unclear, as there were also non-wireless models for video content distribution (e.g. device synchronization with a PVR or a broadband connected PC) and innovations like the Slingbox50 that could provide access to home based video content almost anywhere with Internet access. These technologies offered differing combinations of time and/or location shifting capabilities for broadcast video content, or access to a ‘long tail’ of video content that might never be popular enough for broadcast. The nature of the content favored by mobile customers could be an important actor, e.g. time-sensitive material would favor the enrollment of broadcast or multicast mechanisms. Mobile TV and video services are covered in more depth in Chapter 8.

Convergence of mobile and fixed-services

Wireless technology was not solely used for mobile applications. The fixed-wireless ISP ‘UK Broadband’ (a PCCW subsidiary51) used UMTS-TDD (a ‘3G’ technology) to provide service in a limited geographical area. Most interviewees doubted that the economics of providing an alternative to DSL or cable based Internet access were viable – particularly since recent 50-75% reductions in the regulated price for local loop unbundling. The same would go for service providers contemplating the use of WiMAX or similar technologies (UK Interviews – see Table 10).

Nevertheless network operators were cognizant of the importance of fixed broadband and saw playing in this space as strategically important. However, their views on how to do so differed. One operator viewed wireless Internet access using 3G UMTS as a pale imitation of the fixed broadband experience. Even the megabit bandwidths promised by upgrading 3G networks to HSDPA in the next few years were considered unlikely to be competitive, as fixed technologies would also have improved. This operator’s strategy was therefore to use DSL as its main offering in the broadband market. Another operator believed that HSDPA might well provide a viable alternative to fixed broadband in the near future and had no plans to offer a fixed broadband alternative in the UK.

50 www.slingmedia.com
51 PCCW is a South East Asian communications company

Mobile phones had a major impact on fixed-line telephony in terms of both fixed-line substitution and the migration of voice traffic to mobile networks. Mobile network operators differed in their visions of the relationships between fixed and mobile offerings for data services; their assessments of the viability of HSDPA as the basis for a broadband offer to compete with DSL or cable, and of the need to diversify into offering fixed services, differed considerably. So operators had different visions of how they could enroll technologies and customers (UK Interviews – see Table 10).

Another difficulty for mobile network operators offering broadband data services at rates competitive with fixed offers was the potential for users to use VoIP to bypass the traditional voice services that made up the bulk of mobile network operator revenues (UK Interviews – see Table 10). Again operators appeared to be taking different stances – ‘3’ had announced its intention to support a VoIP service on its network (, 2006) while there was persistent press coverage concerning other operators potentially taking action to block these services (Charny, 2006). This conflict paralleled the wider controversy around the extent to which network neutrality would be maintained by fixed network operators (Geist, 2005).

Since we conducted the interviews there has been increasing merger activity to bring about converged offerings in the UK. The main cable operator, NTL, purchased Virgin Mobile so that it could offer the ‘quadplay’ of cable TV, broadband, and both fixed-line and mobile telephone services. Since Virgin Mobile was an MVNO without its own wireless network (it used T-Mobile’s), NTL was buying a brand and a customer service capability (Thompson, 2006). Other players also made deals that extended their industry roles. British Telecom had an MVNO and was using in-home broadband access points to provide lower cost use of mobile phones at home. It also planned to offer television services via its fixed offer, and longer term its 21st Century Network (21CN) promised to provide an IP based platform for multiple services. BSkyB purchased an ISP in October 2005 and ‘Carphone Warehouse’ announced in April 2006 that it would offer fixed broadband and telephone services. Thus there was a considerable increase in the number of actor-network configurations being explored, including various combinations of fixed and mobile voice and data services as well as broadcast and web services. Again brand was proving an important actor. The on-going convergence in the delivery of communications and entertainment services is covered in more depth in Chapters 7 and 8.

Relationships among handset manufacturers, network operators and customers

There was broad agreement that UK customers tended first to make purchasing decisions about the handset they wanted and then to pick a network operator. So offering a wide range of handsets to customers was important for operators’ competitiveness. While the operators purchased the bulk of their handsets from ‘tier 1’ manufacturers, they also worked with smaller ODMs to develop products where they saw gaps in the handset market or opportunities to differentiate their device offer for a particular set of customers (e.g. O2’s X-Series and Vodafone’s Simply devices). However, when it came to offering services beyond simple voice and messaging, a diversity of devices was an operational disadvantage. The application environment, browser characteristics and multimedia capabilities varied by device – even the same handset on different operators could behave differently. The wider the range of devices supported, the more transcoding needed to be done and the greater the possibility of support issues. One content provider reported that they had to generate between 50 and 60 video formats to support a video service in the UK. So operators faced trade-offs between offering an attractive portfolio of devices while having to support a wider range of devices in the field – an example of trade-offs between interests on multiple dimensions in actor-network building52. While network operators, device manufacturers and content providers all expressed the need for effective standards higher in the stack, the industry as a whole had not been able to create and implement such standards (UK Interviews – see Table 10).
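A minimal sketch (in Python, with hypothetical capability profiles of our own devising) shows how quickly device diversity multiplies the transcoding burden to numbers like those quoted:

from itertools import product

# Each distinct combination of screen, codec and bitrate that a device
# portfolio exposes needs its own encode; per-operator quirks multiply
# the count further. All profile values below are assumptions.
screens = ["128x160", "176x220", "240x320"]
codecs = ["H.263", "MPEG-4", "Real"]
bitrates_kbps = [64, 128]
operator_variants = 3  # same handset behaving differently per operator

formats = len(list(product(screens, codecs, bitrates_kbps))) * operator_variants
print(formats)  # 3 * 3 * 2 * 3 = 54 encodes for one piece of content,
# in the 50-60 range the content provider reported.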

Tier 1 device manufacturers had very strong brands in their own right and were protective of the ‘look and feel’ of their devices. Operators were generally only allowed to apply a fairly thin ‘skin’ to the devices’ user interfaces, although larger operators were able to drive some level of device standards to limit the number of capability profiles they had to support in the field. For walled garden data services the operator’s brand and navigation preferences usually dominated once the customer entered the browser (UK Interviews – see Table 10). It is interesting to note that several actors were striving to establish and build brand ties with customers through handsets: device manufacturers, network operators and content providers.

52 Another example was network operators’ conflicting considerations in striking deals with mobile virtual network operators (MVNOs). The network operator traded off increased competition against leveraging a strong MVNO brand to make indirect connections with a wider range of potential customers.


Relationship between standards creation/adoption and inter-organizational coordination/relationship building (RQ3)

Changes to the ways that standards were created and the relationships among the widening network of actors in the wireless industry have already been described in the discussion of the preceding research questions. It would be unrealistically ambitious to try to chart all the interactions between the changes in the standardization processes and the changes in the industry structure. Instead this section examines some specific examples of the dynamic interactions in a few episodes discussed in particular detail by interviewees. These were also particularly important episodes since they were directly concerned with the transition to new flexible 2.5G and 3G data services.

The first examples deal with a series of episodes of network upgrading and business building at a major US based network operator (US3 in Table 9). They come from a 2004 interview with several high-level employees directly involved in setting the operator’s strategic direction. The episodes deal with standards making and adoption during this period as well as the interactions between operators, manufacturers, and potential content and service providers. The last example looks at the interrelationship between the standards for interfaces internal to the handset and a manufacturer’s relationships with its suppliers. This is a new area for standardization in the wireless industry.


Adoption of cdma2000 1xRTT by a US based network operator

Interviewees from a network operator described the technological migration path offered by standards bodies and infrastructure manufacturers. For cdmaOne the migration path included increasing bandwidths for data services. The initial step was called 1xRTT, which in practice offered about 60 kbits/s (see Figure 27).

The network operator’s technology vendors had an economic interest in selling more products and the vendors strove to enroll operators by proposing imagined futures that included increased revenues from new services enabled by network upgrades. The network operator initially attempted to create a business case for the network upgrade based on 1xRTT’s enhanced data capabilities. Ties to the broader industry actor-network, through employees that had worked in other companies, highlighted the difficulty of achieving commercial success with data services in the US (e.g. AT&T Wireless’s early PocketNet service based on a CDPD overlay packet data network). The upgrade to 1xRTT was finally justified on the near doubling of network capacity for voice traffic.

“[1xRTT] was kind of a very easy decision because there were voice gains. . . . . at first we spun up a big business case around wireless high-speed data [but] every couple of years there was a big push to do wireless data and then nothing ever came of it. So we got more conservative on the business case.”

(Wireless network operator – US3)

Once the decision was made to upgrade to 1xRTT the network operator thought about the possibilities enabled by the new data networking capabilities. The operator put a “lot of energy [into] trying to figure out what people wanted to do with their devices.” The operator introduced a range of services (e.g. picture mail, downloadable games, ringers and screensavers). The operator’s data services at the time of the interview were iterations of these concepts.

The exploration of the technological potential of 1xRTT highlighted the network operator’s economic interests in data and voice services. However, the imagined future of profitable higher speed data services was considered highly uncertain. The more concrete imagined future of a more efficient telephony network turned a difficult enrollment decision into an easy one. Again the wider range of services led the operator to establish relationships with actors previously outside the wireless industry (e.g. game developers and software vendors with email solutions). It is also interesting to note the irreversibility associated with the previous decision to adopt the CDMA air-interface. The operator’s choice of technology upgrades was constrained by this early decision.

The interview with this network operator highlighted the potential for technological network upgrades to be cast as obligatory passage points (OPPs). There were examples where the network operator problematized the technological potentiality (e.g. the “all-digital network”) and took on the role of focal actor. There were also examples where an infrastructure manufacturer took on that role (e.g. profitable “network upgrades”). The sorts of processes that played out in the translations of technical potentialities in both standards making and adoption scenarios are summarized in Figure 30. In each case the network operator took the leading (focal) role during the interessement phase – even when the manufacturer had initially problematized the new technology. Thus Figure 30 reflects the network operator’s understanding of part of the actor-network configuration – specifically its relationship with manufacturers and technical standards around 2004 – and how interactions in these relationships play out as it strives to shape other parts of the actor-network.

[Figure content: two panels, ‘Standards making’ and ‘Standards adoption’. In standards making, an infrastructure manufacturer strives to have its technology accepted as a standard while the operator uses its influence to minimize lock-in to manufacturer specific technology; alternatively the operator, as initial focal actor, identifies a technological potential or perceives a market need, explores imagined futures, asks manufacturers for proposals, and selects an option to push through standards bodies with a manufacturer. In standards adoption, a manufacturer proposes technology it would like to sell, the operator assesses it and explores imagined futures before an adoption decision, which may lead to additional standards making to facilitate interfacing with new actors; alternatively the operator faces a need for new functionality, chooses between standards based and proprietary options, and may drive a standardization process where no standard exists.]

Figure 30. Summary of translations involving the network operator and its infrastructure manufacturers

US network operator’s experience with the early introduction of data capability

The network operator’s historical focus on the consumer market led it to concentrate on the potential of content-based data services for this market segment (as opposed to a data transport only offering for corporate users). In the midst of the “dotcom craze” the operator was sought out by major portals and other Internet content providers keen to develop alternative outlets for their content. They viewed the relatively new operator with its digital network as an attractive partner.

“[We were] bombarded by partners who wanted to do data with us . . . . [The] perception was that there was a lot of business that could be done”

(Wireless network operator – US3)


The network operator identified its own economic interests to exploit the technological potential of its wireless network and imagined a future with consumers accessing a range of data services on wireless devices. The consumer focus was the result of the irreversibility of the operator’s entry into the wireless telephony market. The wireless data capability can be thought of as an obligatory passage point (OPP) through which content providers had to pass. The attractiveness of the content providers’ imagined futures meant that the network operator did not have to actively convince them to enroll. The strategy formulation of the network operator during this phase is captured in Figure 31 using the actor-network based model elaborated in Chapter 3 and summarized in Figure 2.

Figure 31. Strategy formulation of network operator on introduction of data capability

The technical part of the actor-network was also expanded with the creation of an end-to-end service delivery platform (from content providers to user devices). Control of the wireless network and the initially unique data transport capability gave the network operator the upper hand in negotiations with content providers. Other actors were also enrolled to make the changes to the wireless network. Middleware providers supplied the WAP (Wireless Application Protocol) infrastructure and the WAP browser. The infrastructure and device manufacturers worked with the middleware providers to integrate WAP.

However, enrolling customers proved more difficult.

“We didn’t really know how to sell data. I mean we know how to position ourselves and launch wireless Web. But no one knew how to sell it.”

(Wireless network operator – US3)

An actor-network around data services has endured despite not expanding as imagined. However, the actor-network has not stabilized (i.e. it has not been black-boxed). The organizational actors are still searching for a configuration that will result in the enrollment of customers in large numbers.

At least in the US the actor-network of customers and service providers was not dense enough for the network operator to simply offer a platform for a range of services. The interviewees noted the contrast with the computer industry.

“[We] learned . . . that people buy applications . . . they don’t buy broadband. The places where we’ve been financially successful have all been in real tight [end-to-end] integration . . . and where we have done really poor is where we try to have a generic enabler like a WAP browser. Which is kind of the opposite of what you would expect based on what’s happened in the desktop world.”

(Wireless network operator – US3)

Interrelationship between operator/customer relationship and broadband 3G network upgrade

This example concerns the interrelationship between one part of the industry structure and a standards adoption choice. The specific relationship was the one between a network operator (US3 in Table 9) and its customers. The standards adoption choice related to the potential upgrade to a higher data rate radio technology.

Large network operators have two distinct markets: consumer and enterprise. Operators with wired businesses have historically been in a particularly strong position in the enterprise market, where wireless voice and data services are but one part of overall telecommunications offerings. In contrast, operators with no fixed offerings tend to target the consumer market. Network operators’ market positions affect their approach to data services and their relationships with other industry participants.

“Wireless is one of our large door openers for enterprise accounts…we’re able to get them to talk to us about other [service offerings] as well.”

(Wireless network operator – US3)

Investing in full 3G broadband capability makes the most sense for operators with a strong enterprise focus. The provision of a secure and reliable data transport offer to business users is seen as one of the keys to the enterprise segment. The first US operator to bring a broadband 3G service (based on cdma2000 1xEV-DO technology) to market is seen as targeting corporate customers.

“From my understanding of what that technology can provide, EV-DO appears to me more of a business play”

(Wireless network operator – US3)

“We just believe the money is in the enterprise. Less price elasticity in general and a sense [that] it’s easier to quantify the value than [for consumer services]”

(Infrastructure manufacturer – US1)

Operators focused on the corporate market compete on cost (steep discounts are needed to win contracts) and coverage. There is less emphasis on cutting-edge handset features and the consumer side of the business receives less focus. Content services to handsets are considered less important and operators are more likely to outsource elements of their consumer offering, e.g. email solutions, service portals and application platforms.

Operators with a consumer focus face a more uncertain demand for content-based services, and broadband 3G data transport is considered too expensive for most consumers. However, 2.5G upgrades providing reasonably fast data transport mechanisms using existing spectrum were considered more cost effective. The upgrade to cdma2000 1xRTT was seen as a ‘‘no regrets’’ move for operators of cdmaOne-based 2G networks since it doubled voice capacity and provided a reasonably fast (~60 kbits/s) data transport mechanism (within a standard 1.25 MHz channel). Upgrading networks to broadband 3G remains an option as market uncertainty is resolved. In the meantime cdmaOne operators with a smaller presence in the corporate segment can use the cdma2000 1xRTT capability to target verticals with more modest data requirements.

“Pricing of these [data] services is very high. Higher than even what corporate users would particularly like to pay for. As a result, uptake has been slowed.”

(System integrator – US2)

Consumer-focused operators are willing to invest more effort in working with manufacturers to offer more advanced handset features. The network on the other hand is considered less of a differentiator. The focus is on reducing cost and hence pushing for standards-based solutions. Content services are considered differentiators and have a high level of visibility with customers. Consumer-focused operators are more likely to retain tighter control of content and their application delivery platform.

“Handsets tend to be a differentiator, they’re customer touched. Customers don’t really touch the infrastructure equipment… it just needs to support whatever we need to. Then we say how do we control cost? Our solution to controlling cost is standardization.”

(Wireless network operator – US3)


Operators note that close cooperation with other industry participants is becoming increasingly critical to offering data services. For example, systems integrators bring a great deal of experience of corporate customers’ industries.

US operators that have used D-AMPS/GSM-based technology face a more difficult challenge as migrating to full 3G capabilities entails a much more capital intensive overlay network. As an alternative, these operators have invested more in establishing Wi-Fi-based hotspots. Hotspot locations are concentrated according to customer focus, e.g. in airports and convention centers for corporate customers and coffee shops for consumers.

The first network operators with data service capabilities had a strong bargaining position with content and service providers. Major Internet portals and dotcom companies were very keen to have a wireless presence prior to the dotcom bust in 2000. Since then the standardization of data access mechanisms has reduced the operators’ power.

“At one time we were kind of running the show, picking and choosing [which content/service providers] we wanted. Today, we don’t really have as much control. Although to some extent that’s just the general IT move from proprietary to open.”

(Wireless network operator – US3)

The network operator’s earlier air-interface standard adoption decision and the possible network upgrade paths, its historical consumer market focus, and the perceived uncertainty of demand and price sensitivity for a range of services from different market segments all shaped the operator’s understanding of the current and possible future actor-network configurations. The bias of the interviewees in 2004 was to avoid the technology upgrade, or at least to delay it until some of the market uncertainty had been resolved. From the high-level case study presented earlier in this chapter we know that all the major operators eventually committed to broadband 3G network upgrades. Presumably this operator’s vision of the future configuration of the industry and the changing nature of its relationships with its customers made the technology upgrade an acceptable risk.

Handset manufacturer strategizing to avoid undesirable imagined future

An interviewee from a handset manufacturer (US6 in Table 9) described a transition to a more horizontal structure for the device market, i.e. the emergence of small numbers of market leaders that are dominant in the production of key handset components. He pointed out that while the manufacturer was no longer able to produce all the major components, it must be very careful in making its make-or-buy decisions. The threat for manufacturers is that handsets will go the way of the personal computer, where Intel, Microsoft and more recently Dell came to control key parts of the architecture and value network and are able to extract much of the value created in the industry.

Control of key interface specifications has historically been a crucial factor in

determining the ability of industry participants to capture value (e.g. the PC industry).

While decisions on the level of participation in standards forums are often based upon the

perceived importance of the standard to the company’s products the desire to mitigate the

risks of important initiatives being dominated by others also plays a part. In terms of the

actor-network perspective an actor may engage in actor-network building to impede other


actors’ heterogeneous engineering efforts, to prevent other actors establishing OPPs, or to

limit the plausible imagined futures that shape strategy formulation.

Summary of changes in the wireless industry with the 3G transition

The wireless industries in the US and the UK were continuing to undergo major

change. The sources of change came from each of the domains (Figure 3) as well as from

outside the industry. From the innovation system came improvements in mobile data

transmission, increased processing power, higher storage capacities, and better displays for

mobile devices. The innovation system expanded to include start-ups as well as

companies from the data networking and computing industries. The expanded innovation

system created the technological potential that made possible a broad range of mobile

wireless and computing devices, services and applications.

In the marketplace, the success of the Internet highlighted the commercial potential for wireless data transport and content-based services. For the first time network operators had to market and support multiple services, to operate in an environment of uncertain demand for services, and to collaborate with other players in the marketplace

(e.g. content providers and system integrators) to deliver end-to-end services.

By making unlicensed spectrum available the regulatory regime provided the impetus for the development of alternative wireless technologies (e.g. Wi-Fi and WiMAX). The lack of regulation concerning how operators use their licensed spectrum allocations has allowed US operators flexibility in just how and when they have chosen to implement

2.5G and 3G technologies. At the same time they have faced challenges as decisions in


Pennsylvania regarding community-based Wi-Fi networks show53. As with the other

domains the regulatory regime has expanded in scope as it has had to deal with privacy

and digital rights management issues. Regulators are also reexamining their role in the

light of the increasing convergence of the telecommunications, content, and computing

industries, as well as the emergence of novel hybrids of traditionally unregulated and

regulated technologies or services (e.g. VoIP based telephony over cable, Wi-Fi or

WiMAX networks).

The interviews provided evidence for changes in the industry resulting from

technology-push, market-pull and even from initiatives originating in the regulatory

regime. However, the changes in each domain have occurred concurrently and influenced

one another over several years. Change in the industry is better understood by

considering it as the outcome of the on-going dynamic interactions among the domains

(as illustrated by the time-sequence diagrams54). While increasing complexity during the

transition is creating huge challenges for industry participants, their responses to changes

in each domain create a sort of rolling coordination mechanism that limits the extent to

which one domain can be out of step with the others.

The interviewees recognized the importance of standards—but in different ways.

Manufacturers see standards as key to the building of products, and to future market growth through economies of scale and the management of expectations. Network operators see standards as an important means of constraining infrastructure costs through economies of scale and network externalities. System integrators see standards as

53 Legislation allows local network operators to veto community-based commercial offerings of Wi-Fi services in the state of Pennsylvania. 54 For example Figure 17, Figure 18, Figure 19, and Figure 20.


a way of building platforms for the delivery of services that cut across wired and wireless infrastructures by creating both economies of scope and scale.

While interviewees generally voiced support for open standards, their understanding of what should be open and what should be left for differentiation and competition differed. For example, a system integrator expressed a desire for common interfaces across all wireless data platforms (e.g. Wi-Fi, 2.5G, 3G) and mechanisms for seamless switching between platforms to facilitate the development of cross-platform applications.

Network operators, on the other hand, are likely to resist initiatives that would drive commoditization of wireless data capacity.

While standards play an important role in the coordination of the wireless industry, other mechanisms are also important. These include the ways in which specific capabilities are combined in particular organizations, the nature of the relationships among organizations (contracts, alliances, joint-ventures or mergers), and the industry practices and processes that develop over the years.

The on-going reorganization of the wireless industry value network brought about by the transition to 3G has already resulted in a much greater reconfiguration of the industry than was evident in the transition to 2G. This is shown in increased concentration, new types of operators and uncertainty about new players (e.g. new commercial or community-based operators using Wi-Fi to provide Internet access). The reconfiguration is continuing as new patterns of relationships, particularly those involving new participants, have yet to stabilize—in part reflecting the uncertainty about the demand for services.


There have been changes in the relationships among traditional industry participants.

For example, the consolidation of the US network operators has increased their power in

the industry’s value network. There is also increasing complexity and modularization of

the handset. The inclusion of sophisticated operating systems, or other application environments, and standardized interfaces between hardware and software components, raises the possibility of further horizontalization of this segment, and the redistribution of

the value captured to industry participants other than handset manufacturers.

A summary of the changes in the wireless industry with the transition to 3G is

provided in Figure 32 using the triangle framework conceptualized by Lyytinen and King

(2002) and presented earlier in Figure 3. This understanding reflects the findings of the

series of US interviews (see Table 9) carried-out in 2004 and 2005. Presenting the

findings in this format does not allow the temporal sequence of specific actions to be

depicted (unlike the ‘time-sequence’ diagrams such as Figure 17). However, it does allow

more general findings about the changes in the domains, and relationships among them, to be shown more clearly.

Role of initial conditions (RQ4)

This research question was intended to clarify the impact of the initial configuration of the actor-networks at the start of the case study period. Since chapter 5 provides a deep explanation of the background to the events covered in this chapter, there will be no further discussion on this question here.


Many new types of services became possible with the inclusion of effective data

transmission capabilities in 2.5G and 3G wireless standards. These include WAP-based

information retrieval, multimedia messaging (MMS), mobile email, whole-

track music downloads, and Internet access, to name a few. To illustrate how getting into

these new services brought wireless network operators into more complex actor-

network configurations (both inter-organizational and technological), we delved deeper

into one particular type of service: Mobile TV and video services (these services are

explored in chapter 8). To understand the actor-network configurations around mobile

TV and video services, it became necessary to understand the evolution of the television industry and how it has been converging with the telecommunication and computing industries. This is explored in the next chapter.


[Figure 32 (omitted here): a four-panel summary – Innovation System, Marketplace, Standardization Arena, and Regulatory Regime – of the changes in each domain, and the relationships among them, during the transition to 3G.]

Figure 32. Changes in the US wireless industry during the transition to 3G (Tilson & Lyytinen, 2006)

VII. The Television Industry and Convergence

The preceding chapters dealt with changes in the mobile wireless telecommunications

industry as it transitions to its third generation (3G) technologies. One of the most recent additions to the types of services offered on mobile devices is mobile TV and

video. The still emerging story of mobile TV in the USA and the UK is examined in

detail in the next chapter. To provide a better platform for understanding mobile TV a

thorough examination of the history of television is provided in this chapter.

In the first sections the early history of television is explored and overviews of the TV

industries in the UK and the US are provided. The emergence of video services delivered

over IP based networks in recent years is also examined in some detail. The research

questions posed in Table 5 are applied to this mini case study of the TV industry and

three key episodes of convergence with the computing and telecom industries are given

particular attention.

The early history of television technology55 in the UK and the US

The history of the development of television started in the nineteenth century.

Although there were earlier proposed systems, it was Paul Nipkow in Germany who first

patented a complete workable television system in 1884. Nipkow’s

transmitter relied upon the photoconductive properties of the element selenium and a

55 This subsection draws upon the following sources: (Glen, 2006), (Pemberton, 2006), ("television," 2007), (Walker & Ferguson, 1997)


spinning disk with a series of holes. This equipment was used to encode the brightness of strips of images projected upon the disk. The electrical signal so produced could be used

to vary the brightness of a light source at the receiver. Another rotating disk,

synchronized with the one at the transmitter, was used to reconstruct the picture for

viewing. In the decades that followed there were a series of technological developments

that brought mechanical television closer to being a practical technology. These included the cathode ray tube (1897), the thermionic triode vacuum tube (1906) that allowed amplification, and the potassium hydride-coated cell with improved photosensitivity and responsiveness (1913).

In 1926 John Logie Baird gave the first demonstration of electrically transmitted moving pictures in the UK. The technology was still based upon Nipkow’s rotating disk and was limited to 30 lines and about 10 frames per second. This technology formed the basis of experimental wireless broadcasts in the UK from 1929.

Vladimir Zworykin, working with the support of RCA in the US, developed an electronic television sensor that built upon the “electron gun” invented by independent inventor Philo Farnsworth. RCA held the dominant position in radio broadcasting in the

US in those years and played a leading role in the introduction of television to the marketplace. Television was the focal point of RCA’s exhibit at the New York World’s

Fair in 1939 and the company started experimental TV broadcasts in 1940.

The world’s first regular TV broadcast was launched by the BBC from a transmitter in London in 1936. Initially, broadcasts alternated between an electronic system developed by EMI (which had been privy to Zworykin’s research at RCA) and a mechanical system from Baird’s company. The UK government decided to drop the


mechanical system because of reliability and performance issues after just a few months.

EMI’s 405 line / 50 fields per second system became the British TV standard and remained the only standard until the 1960s.

Broadcasts were suspended in 1939 amid concerns that a TV signal could act as a navigational beacon for German bombers.

The EMI system took advantage of the properties of human visual perception in a number of ways. The illusion of smooth moving pictures is possible by rapidly presenting a series of still pictures – 24 frames per second being the frame rate used by the film industry. However, the flicker perceptible at this sort of frame rate can induce severe visual fatigue. The film and TV industries got around this limitation in different ways.

Film projectors expose each film frame two or three times to reduce the perception of flickering while maintaining the slower frame rate. In analog TV a process called interlacing is used in which the odd numbered lines of the image are scanned, transmitted and displayed first, followed by the even numbered lines. These two halves of the picture each have 202.5 lines and are referred to as fields. The British and European systems use

50 fields per second. Interlacing allows the illusion of motion to be maintained at only 25 frames per second while maintaining the higher resolution (405 lines in the case of the

EMI system), and allowing the display to be updated 50 times per second to reduce the perception of flicker. As interlacing allows the problem of flicker to be overcome without having to double the bandwidth required to transmit a full 50 frames per second, it can be said to be a form of analog video compression.
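The arithmetic of interlacing can be made concrete with a short calculation. The sketch below, a minimal illustration in Python using the 405 line parameters described above (the variable names are ours, not drawn from any broadcast specification), shows how interlacing halves the transmitted frame rate, and hence the bandwidth, while preserving the 50 Hz display refresh.

```python
# Interlacing arithmetic for the British 405 line / 50 fields-per-second system.
# A minimal illustration; the parameter names are ours, not from any standard.

LINES_PER_FRAME = 405      # total scan lines in a full picture
FIELD_RATE_HZ = 50         # fields per second (matched to 50 Hz AC mains)
FIELDS_PER_FRAME = 2       # odd lines in one field, even lines in the other

frame_rate = FIELD_RATE_HZ / FIELDS_PER_FRAME          # 25 full frames per second
lines_per_field = LINES_PER_FRAME / FIELDS_PER_FRAME   # 202.5 lines per field
line_rate = LINES_PER_FRAME * frame_rate               # 10,125 lines scanned per second

# A progressive system refreshing the display 50 times per second would have to
# transmit 50 full frames per second: twice the line rate, and hence roughly
# twice the bandwidth, for the same perceived flicker performance.
progressive_line_rate = LINES_PER_FRAME * FIELD_RATE_HZ

print(f"Frames per second:      {frame_rate:.0f}")
print(f"Lines per field:        {lines_per_field}")
print(f"Interlaced line rate:   {line_rate:,.0f} lines/s")
print(f"Progressive equivalent: {progressive_line_rate:,.0f} lines/s (2x)")
```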

The 50Hz frame rate used by the EMI and other European video standards was influenced by an earlier technological decision. To avoid a strobe effect while capturing video under lights powered by the 50Hz alternating current used in Europe the field rate


was set to the same value. In turn, the number of lines in each frame was constrained both by the limitations of the video sensors of the day and by the need to have the field rate related to the number of lines to simplify the electronics in the receiver. So the key technical characteristics of the system were shaped by human physiology, the properties of an existing infrastructure, the scarcity of a natural resource (radio spectrum), and the economics of consumer electronics. Government played its part by selecting among television systems, allocating radio spectrum, and through support from the state owned broadcaster (the BBC).

The key actors in the development of a British TV standard were heterogeneous, encompassing a series of technical developments, existing infrastructure, the performance of electronic and mechanical systems, and the government acting both directly and indirectly

(through the state broadcaster). The key actors and actions that led to the development of the British 405 line TV standard are shown in Figure 33.

The 405 line television system used VHF frequencies between 41.25 MHz and 221

MHz. Each channel occupied 5MHz of spectrum. When the second BBC channel was launched in 1964 it used a 625 line / 50 fields per second system, which had become the standard in Europe, occupying 8 MHz per channel. BBC2 was only transmitted on UHF channels

(470MHz to 862 MHz in the UK).
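A rough back-of-the-envelope check, sketched below in Python, shows why per-channel bandwidth mattered so much for spectrum planning. It assumes contiguous channel allocation, which real band plans only approximate.

```python
# How many TV channels fit in a band? A rough check assuming contiguous
# allocation (real band plans leave gaps, so this is an upper bound).

def channels_in_band(band_start_mhz, band_stop_mhz, channel_width_mhz):
    """Number of whole channels that fit between two band edges."""
    return int((band_stop_mhz - band_start_mhz) // channel_width_mhz)

# UK UHF band (470-862 MHz) with the 8 MHz channels of the 625 line system.
print(channels_in_band(470, 862, 8))   # -> 49 channels

# The same band carved into the 5 MHz channels of the 405 line system
# would have yielded more, but lower-definition, channels.
print(channels_in_band(470, 862, 5))   # -> 78 channels
```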

Despite RCA pressing to have its 441 line system adopted as the US standard, the

FCC established the National Television System Committee (NTSC) in 1940 to define a standard. In 1941 the FCC approved the committee’s 525 line / 60 interlaced fields per second system that occupied 6 MHz of spectrum (note that 60Hz is the national standard for AC power in the US). Amplitude modulation (AM) and frequency


modulation (FM) were specified for the transmission of video and sound respectively.

World War II also delayed the further development and deployment of TV in the US.

[Figure 33 (omitted here): a diagram tracing key actors and actions across the innovation system, marketplace, regulatory system, and standards system – from the Nipkow disk (1884), CRT (1897), triode (1906), and KH cell (1913), through Baird’s 1926 demonstration of electrically transmitted TV, the 50 Hz field rate selection, and the parallel running of the EMI electronic and Baird mechanical systems from 1936, to the selection of the 405 line system as the British TV standard (1937).]

Figure 33. Key actors and actions in the development of the British TV standard

Television transmissions in the US first used 12 channels in the VHF range (between

44MHz and 216 MHz). In 1948 the FCC froze the licensing of TV stations while it


studied how to allocate spectrum for TV. In 1952 the FCC allocated 70 channels in the

UHF range (from 470 MHz to 890 MHz). The FCC also set aside 11% of channel

allocations to “educational television” at the instigation of the Commission’s first female

commissioner, Frieda Hennock – a decision that later made public television possible.

A side-effect of the way spectrum was released in 1952 was that the dominant

networks, NBC and CBS, which already had established relationships with affiliate

stations broadcasting on VHF channels, retained a competitive advantage. The newer

networks, ABC and Du Mont, could only find affiliate stations with UHF transmissions.

The installed base of VHF only TV sets, the fact that new sets were not required to cover

the UHF channels until 1962, and the more limited coverage of UHF transmitters created

a distinct disadvantage for the newer networks. More generally TV stations, the number

of which was constrained by government licensing and the availability of radio spectrum,

remained obligatory passage points for content providers and advertisers for decades to

come. The allocation of so much UHF spectrum for television also severely limited what

could be made available for cellular services.

A standards battle between different color TV systems took place in the US between

CBS and RCA – the dominant radio and TV broadcasters. The RCA system had the

advantage of being backwardly compatible with the installed base of black-and-white

receivers and was adopted by the reformed NTSC and the FCC in 1953. The first color

TV sets were produced by RCA in 1954 but were at least three times as expensive as monochrome sets. RCA’s NBC network broadcast 700 hours in color while the other networks, CBS and ABC, broadcast almost none. It was the late 1960s before color TV really took off in the US.


The UK, and most of Western Europe, adopted a modified version of the NTSC color system called PAL, which was announced in 1963 by Telefunken in Germany. These two color systems, along with the SECAM system developed in France, were adopted by other countries around the world during the 1970s. The first official color TV transmissions started in the UK with BBC2 in 1967, with BBC1 and the ITV channel converting from 1969. Color was only made available on the UHF 625 line channels. The last 405 line transmitter in the UK was turned off in 1985 and the VHF spectrum reallocated to other services.

The selection of the American standard for television was also shaped by the same sorts of technical, economic, infrastructural, and governmental actors as in the

UK. The influences played out in broadly similar ways. However, one substantial difference was that much of the technical innovation in the US was directly sponsored by the dominant commercial radio broadcasters. The key technology innovators in the UK,

EMI and Baird, were not broadcasters themselves and the dominant broadcaster was state owned. The US and UK innovation systems were connected as EMI in the UK was aware of RCA’s work on electronic television.

The television industry in the UK

While the UK was a pioneer in television (the BBC launched the world’s first regular

TV broadcasts in 1936) it was almost another twenty years before commercial television was licensed. It was the mid-1980s before multichannel cable TV became available – and then only in parts of the country (about a decade after the US).


The UK was the first country in the world to allow cable operators to offer telephony

(Parsons & Frieden, 1998, p. 310) and by 1998 the industry was providing over 3 million

residential and 400 thousand business lines (Wheatley, 1999, p. 60). The cable companies

in the UK were backed by many of the major US cable and telecommunications

companies (Parsons & Frieden, 1998). However, the industry was plagued by financial

problems and by 2005 had fully consolidated – the remaining cable operator, NTL, was

rebranded as Virgin Media in 2007 (Telecomworldwire, 2006a, 2006b). In 2007 Virgin

Media was still disappointing investors (FT, 2007a).

In 1989 multichannel satellite television became available in the UK from Sky

(owned by Rupert Murdoch’s News International). It delivered several channels using

medium power Astra satellites on frequency allocations not specifically set aside for

broadcasting. It also used the traditional PAL television standard. Its competitor, BSB,

had been forced to adopt the new D-MAC television standard that was supposed to be the

European standard for higher definition television. While the high-power satellite used by

BSB allowed the use of smaller receiving antennas, and used frequencies set aside for broadcasting, it only supported 5 channels (compared to 32 channels on two co-located

Astra satellites). With both satellite operators bleeding money they merged in November

of 1990. The failure of BSB has been attributed to the delays caused by the adoption of

the D-MAC standard (Henry, 1991). BSkyB has been a commercial success – by 2006

there were more homes receiving television via satellite than via the analog broadcast network (SatelliteNews, 2006a).

The UK’s five so called ‘main’ or ‘mainstream’ television channels (BBC1, BBC2,

ITV1, Channel 4 and Five) are available free to air on the nation’s


broadcast networks. They are also available on the digital terrestrial service (Freeview),

as well as on the BSkyB digital satellite, and NTL digital cable platforms.

The BBC is a government owned broadcaster funded at arm’s length by an annual TV

license fee of £131.50 (as of March 2007) payable by all TV households irrespective of

whether they view BBC or not. Independent Television (ITV) is actually a network of

commercial broadcasters licensed by the regulator to broadcast in fourteen geographical

regions. The revenue for these broadcasters comes primarily from the sale of advertising

spots. Consolidation in the 1990s led to one company, ITV plc, holding all of the regional

franchises except those in Scotland, Northern Ireland and the Channel Islands (ITVplc,

2007). The other advertising supported mainstream channels, Channel 4 and Five, were

launched in 1982 and 1997 respectively. The pay-TV broadcasters (i.e. BSkyB, Virgin

Media, and newer players using broadband connections to deliver

television services) raise much of their revenues directly from viewers in the form of subscriptions.

The timeline in Figure 34 summarizes the major events in the history of UK television.

The ITV, Channel 4, and Five channels are transmitted on a broadcast network owned

by a commercial company, Arqiva56. In April 2007 the parent company of Arqiva,

Macquarie, also purchased the broadcasting infrastructure used to broadcast the BBC

from National Grid Wireless – a transaction that is under review by the UK authorities for its effect on competition (Snoddy, 1997; Spikes, 2007). BSkyB leases its satellite

capacity from satellite operators. Virgin Media owns and operates its own fiber and coax network on which it offers telephony and broadband services as well as pay-TV services.

With the addition of the mobile wireless offerings it gained through its acquisition of

56 Arqiva (formerly NTL Broadcast) was originally established to transmit ITV channels in the mid-1950s. It was privatized in 1991. http://www.arqiva.com/server.php?show=nav.6324 has a summary of its history.


Virgin Mobile UK, Virgin Media offers what is referred to as a ‘quad-play’ (i.e. a bundle of cable TV, broadband, fixed-line phone, and mobile).

[Figure 34 (omitted here): a timeline of UK television from the 1950s to the 2000s – the BBC’s first regular broadcasts (1936) and their postwar restart (1946), the launch of commercial ITV (1955), the first transatlantic transmission via Telstar (1962), BBC2 (1964), color TV on ITV and BBC1 (1969), VCRs on sale (1974), Channel 4 (1982), cable TV franchises (1983), Sky’s multi-channel satellite pay-TV (1989), Five (1997), digital satellite and digital terrestrial launches (1998), Ofcom taking over regulation of telecom and broadcasting (2003), mobile TV offerings from Orange and BSkyB/Vodafone (2005), BT’s VoD service over DSL (2006), and the analog-digital switchover (2007-2012).]

Figure 34. Timeline of major events in UK television57

The five mainstream channels in the UK produce much of their own content. BSkyB has over thirty of its own channels and the main cable operator58 has ten channels. These channels appear on their owners’ broadcast platforms and on competing platforms. As well as producing their own content, most channels commission content from independent content providers. Movie broadcasting rights, sports broadcasting rights, and programming from other countries are further sources of content.

The world’s first digital terrestrial television (DTT) network was launched in the UK in 1998 utilizing a European standard called DVB-T (BBC News, 1998). The original

57 Dates from http://www.tvhistory.btinternet.co.uk/index.html, (Fox, 1990), (Ofcom, 2006b), and http://en.wikipedia.org/wiki/Timeline_of_the_BBC 58 Content creator, Flextech, was part of the cable company. After NTL acquisition of Telewest and Virgin Mobile, and the subsequent Virgin rebranding it becameVirgin Media TV (as of March 2007).


offering (OnDigital and later ITV Digital) failed commercially in 2002. DTV Services

Limited59 took over the broadcast license and the Freeview service was launched in

October of the same year. In the first quarter of 2006 BSkyB was the largest pay TV

operator with 7.7 million subscribers while Virgin Media (the cable operator) had 3.3

million. Another 7.1 million households received digital TV via Freeview (Ofcom,

2006a, p. 199). Two former BSkyB executives set up “Top Up TV” to augment Freeview

with ten additional pay channels with content delivered as a data stream in the digital

multiplex and stored on a set top box (STB). Subscribers can access the stored content at

any time (Anonymous, 2004; J. Lee, 2004). The UK analog television network will be

phased out between 2007 and 2012.

When TV was first digitized the linear scheduling and centrally managed

programming models dictated by the spectrum scarcity and the broadcast architectures

still dominated. However, increased capacity in satellite and cable systems made Near

Video on Demand (NVoD) commercially viable (e.g. a popular movie broadcast on

several logical channels with offset starting times means that viewers should not have to

wait very long for the start of the movie). Interactivity made possible by upgraded cable

TV systems made true Video on Demand (VoD) possible. This adds to the on-going

challenges to the linear scheduling and centralized programming model. This model was first

challenged in the 1970s by the video cassette recorder (VCR) and the associated movie

rental businesses (e.g. Blockbuster) using physical stores to distribute video tapes. From

the late 1990s DVD movie rentals were being ordered online and delivered by mail (e.g.

59 As of March 2007 DTV Services Limited has five shareholders: BBC, BSkyB, National Grid Wireless, ITV and Channel 4. In April 2007 National Grid Wireless was purchased by Macquarie UK Broadcast Ventures, the parent company of Arqiva.


Netflix). More recently personal video recorders (e.g. TiVo) and on-line delivery of video content (e.g. YouTube) have added further to the challenges.
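The economics of NVoD mentioned above rest on simple arithmetic: the more logical channels devoted to a title, the shorter the maximum wait for the next start. A minimal sketch, with illustrative numbers of our own choosing:

```python
# Near Video on Demand (NVoD): one movie broadcast on several logical channels
# with offset start times. Maximum wait = running time / number of channels.
# The figures below are illustrative only.

def max_wait_minutes(running_time_min, logical_channels):
    """Longest a viewer must wait for the next start of the movie."""
    return running_time_min / logical_channels

movie_minutes = 120
for channels in (2, 4, 8):
    wait = max_wait_minutes(movie_minutes, channels)
    print(f"{channels} logical channels -> a start every {wait:.0f} minutes")
# Eight logical channels give a start every 15 minutes: near, but not true, VoD.
```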

The new offerings, the ability of viewers to skip adverts using PVRs, and reductions in the number of hours spent viewing traditional television (Milmo, 2006), particularly among the young (BBC News, 2006; Ofcom, 2006a, p. 257), have led to challenges for commercial TV’s advertising based business model (SatelliteNews, 2006b) and to what has been referred to as the “collapse” of ad revenue for ITV (Fluendy, 2006).

The television industry in the US

The television industry in the United States followed a different path. Right from the start TV was commercial. Broadcast TV stations are licensed by the FCC to serve one of over 200 markets around the country and TV stations typically have their own

transmitters. In the first few years after the war major TV stations created their own

content. But by 1948 the economies of sharing content creation costs across larger

audiences led to the first major network TV season along the East Coast. The dominant

commercial TV networks (NBC and CBS) in the US were previously the dominant radio

broadcasters. Even the third network, which became ABC, had been an NBC network

until the FCC forced its sale. The networks borrowed heavily from their radio

programming and brought their radio advertisers with them. This competitive advantage

also meant that the two dominant networks had 78% of all affiliated stations by 1954.

The failure of one of the weaker networks, DuMont, in 1955 saved the other, ABC, by

freeing stations to become ABC affiliates (Walker & Ferguson, 1997). Some TV stations


are still locally owned but most are part of companies that own several stations, and some

are directly owned by the major networks.

Network affiliated TV stations broadcast programming supplied by the networks.

Advertisers seeking national coverage purchase ‘spots’ in network shows. While the TV

stations receive some ‘compensation’ for carrying the network programming, most of

their revenue comes from selling advertising space around their own programming (local news, sport, weather and other content of local interest) and small slots in the network programming. Non-affiliated TV stations have to purchase their programming. The mix usually includes movies, reruns of network programming – so called off-network

syndication – as well as first run syndications of shows from independent producers.

Non-affiliated stations are usually less profitable than affiliated stations (Walker &

Ferguson, 1997).

The Corporation for Public Broadcasting (CPB) was created by an Act of Congress in

1967 – and the Public Broadcasting Service (PBS) incorporated two years later. As of

April 2007 PBS has 348 member TV stations across the country. Funding for PBS comes

from voluntary contributions from the public, corporate underwriters, and Federal

funding (through CPB) (Walker & Ferguson, 1997).

The US television industry structure with three major networks and PBS lasted until the 1980s when a new network, Fox, was launched. As in the UK, winning sports broadcast rights was a major part of the Fox Network’s strategy. In fact one could say that in at least this case it was the same actors, Rupert Murdoch and News Corporation, enacting similar strategies with the Fox Network in the US and with BSkyB in the UK. Two more studio-owned broadcast networks, UPN and The WB, were launched in 1994.


In 2007 the main networks (ABC, NBC, CBS, and Fox) had in excess of 200

affiliates each60. The weaker networks, “The CW” (formed from a merger of UPN and

The WB) and MyNetworkTV, had significantly fewer. Some Spanish language networks

(e.g. Univision) had tens of affiliates, while numerous shopping and religious networks

ranged from having just a few, to many tens of affiliates. Some more specialized

networks had affiliates that broadcast their channels only on digital sub channels (e.g.

Tube Music Network and AccuWeather Channel).

During the 1970s two changes occurred that allowed the cable TV industry to move

beyond offering only local TV stations to homes that were unable to receive a clear

broadcast signal. The first was the partial deregulation of the satellite industry. This

allowed newly established cable networks (e.g. HBO, ESPN, CNN, MTV, USA Network

etc) to economically distribute their programming to cable headends across the country.

The other major change was brought about as the cable industry chipped away at Federal,

state, and local restrictions on the types of content cable companies were allowed to offer.

The impact of new networks, and cable TV, is evident in the slide of the three largest

networks’ control of the primetime audience to 60% at the end of the 1980s from nearly

90% at the start of the decade (Walker & Ferguson, 1997).

From the mid-1990s digital cable and direct broadcast satellite (DBS) operators spurred the creation of hundreds of new networks. The proliferation of niche programming led to there being in excess of 500 cable networks by 2006 (Amobi &

Donald, 2006). Although there were many more networks most of them were owned by the same few major media companies (Walker & Ferguson, 1997). There was considerable merger and acquisition activity in the broadcast industry in the 1980s and

60 According to each network’s, or parent company’s, most recent annual report (accessed May 10, 2007)


1990s with ABC, NBC and CBS all changing hands or experiencing changes in control.

More widely media companies became fewer, larger and more international. For example, in 2007, News Corporation, Time Warner, Disney, and CBS Corp were active in movies, music, publishing, radio, television, and theme parks. Between them they owned the major networks and many of the TV stations in large markets, as well as major cable and satellite broadcasting platforms.

In September 2006 there were 1,373 licensed commercial TV stations in the U.S., with about 23% of them owned by ten groups (including the networks) that accounted for 65% of the $18 billion in local spot advertising. The top English language networks (ABC,

CBS, NBC, Fox, Disney, CW and MNTV) generated the vast majority of the network television advertising (about $25 billion). Advertising revenues generated by cable TV were less than half those generated by broadcast TV (note that the main broadcast networks were also carried on cable systems). Cable companies had about 65 million subscribers while DBS operators had about 28 million. About 20 million US homes only received free-to-air broadcast television (Amobi & Donald, 2006).

The Advanced Television Systems Committee (ATSC) was established by the FCC in

1987. It evaluated numerous proposals for an improved television standard. After testing several systems the committee decided in 1993 that the final standard should be digital but that the existing proposals all had deficiencies. The proponents of these digital systems formed the Digital HDTV Grand Alliance to create a single standard. The ATSC accepted a specification in 1995, and the FCC adopted it as the US standard in 1996 (ATSC, 2007; FCC, 1996). While both use the MPEG-2 video compression algorithm, ATSC uses a different modulation scheme than DVB-T to


superimpose the digitized video signal upon a radio frequency carrier. The ATSC standard fits within the 6MHz channel allocations used by the NTSC analog standard and supports either HDTV within that bandwidth or multiple standard definition programs.

In 1997 the FCC adopted a plan for the allocation of spectrum to digital television. In that plan most existing TV stations were given 6MHz for digital television. After the final transition to digital technology the improved performance of the digital standard will allow some 138MHz of UHF spectrum to be recovered and reallocated for alternative uses (FCC, 1997). The first digital terrestrial television in the US was launched at about the same time as in the UK (Sallie, 1998). The plan for the transition to digital television envisages that analog TV broadcasting will cease in February 2009. Digital video technology was adopted earlier by the direct broadcast satellite and cable

industries in 1994 and 1997 respectively. The cable and satellite industries use their own

digital TV standards. A timeline of the major events in the US television industry is

presented in Figure 35.


[Figure 35 (omitted here): a timeline of US television from the 1950s to the 2000s – RCA’s experimental broadcasts in New York (1940), FCC approval of the NTSC standard (1941), the UHF and VHF spectrum allocations for TV (1952), the NTSC color standard (1953), the first transatlantic transmission via Telstar (1962), 75% of households owning color TVs (1976), the removal of major FCC restrictions on cable TV (1977), the Fox network launch (1986), analog direct-to-home satellite from Primestar (1991), digital satellite from DirecTV (1994), the UPN and The WB launches (1994), digital cable from TCI (1997), mobile TV from MobiTV on SprintPCS (2003) and Verizon Vcast (2005), and the planned completion of the analog to digital transition (Feb 2009).]

Figure 35. Timeline of major events in US television


The Internet as a threat and a new platform for video services

Broadband data connections have also emerged as a means of bringing video services

to market. ADSL, ADSL2, ADSL2+, VDSL, and VDSL2 are standards offering from 8 to 100 Mbit/s maximum downlink speeds. These bandwidths are sufficient to offer video services. Fiber optic technologies that can support high-

definition video have been rolled out to consumers in some locations in the US.
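As a rough feasibility check on that claim, the sketch below compares representative stream bitrates against those downlink speeds. The bitrates are ballpark figures we have assumed for the codecs of the period; they are not taken from the sources cited in this chapter.

```python
# Can a broadband downlink carry a video stream? Illustrative ballpark bitrates
# (Mbit/s) assumed for mid-2000s codecs; not drawn from the sources cited here.

STREAM_MBITS = {
    "standard definition (MPEG2)": 4.0,
    "standard definition (H.264)": 2.0,
    "high definition (H.264)": 8.0,
}

DOWNLINK_MBITS = {"ADSL": 8, "ADSL2+": 24, "VDSL2": 100}

for link, capacity in DOWNLINK_MBITS.items():
    for stream, rate in STREAM_MBITS.items():
        verdict = "yes" if rate <= capacity else "no"
        print(f"{link:6} ({capacity:3} Mbit/s) carries {stream}: {verdict}")
```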

MPEG2 has been the video compression standard for terrestrial digital TV and DVD.

It has also been the most common format for cable and digital satellite TV, although

H.264 has also been used in satellite systems. A wider range of proprietary and open

standards have been used for IPTV and broadband video. These included MPEG2 and

H.264/MPEG4, Flash, Apple’s Quicktime, and Microsoft’s VC-1. Microsoft’s codec was popular with content providers as it had robust digital rights management (DRM) capabilities.

Video over broadband has been delivered using two distinct mechanisms – referred to as IPTV (Internet Protocol Television) and Broadband TV. IPTV is usually offered by the

broadband data connection provider and allows viewing on a TV through the use of a set

top box (STB). Like cable television IPTV uses a closed network that allows the provider

to manage end-to-end service quality. Programming uses the same ‘walled garden’ model

used by broadcast, cable and satellite television. Typically, an electronic program guide

(EPG) is used to navigate broadcast and on-demand content. Subscriber numbers are

limited by the reach of the physical network.


Broadband TV services are delivered over the public Internet, and as such there are no walls as viewers can skip from service to service. However, as Internet data is delivered on a ‘best effort’ basis there cannot be the same quality guarantees as for IPTV.

While this is not a problem for video downloads it is less acceptable for streamed mass market live TV. Nevertheless, since they are using the public Internet, broadband TV providers can potentially offer service to all broadband Internet users.

For most Broadband TV users viewing is confined to PC screens. This is often a less than ideal user experience since PCs are often located in parts of a home that are inconvenient for TV viewing and far from soft furnishings (Ofcom, 2006a). Operators and manufacturers have been trying to address this shortcoming using wireless devices that can stream content from PCs for viewing on TVs (e.g. Apple TV released in early

2007). Game consoles (e.g. Sony’s Playstation 3 and Microsoft’s Xbox360) were also at the center of plans for Video on Demand (VoD) services (Rothman, 2006).

Broadcasting is an extremely efficient way of reaching many viewers or listeners with the same content. The marginal cost of one more listener or viewer is zero. In contrast,

IPTV and Broadband TV individually send a stream of compressed video to each viewer, i.e. unicasting rather than broadcasting technology is used. With unicasting, extra bandwidth and server capacity are required for each additional listener or viewer. On the other hand, unicasting makes viable the offering of niche content that simply would not be economic using a high fixed cost broadcast platform. Unicasting approaches can also, at least in theory, support much more targeted advertising than is possible via broadcasting.
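The contrast in cost structure can be captured in a few lines of arithmetic. The sketch below, with an illustrative stream bitrate of our own choosing, compares the aggregate capacity a distributor must provision under broadcasting, where cost is flat, with unicasting, where it grows linearly with the audience.

```python
# Aggregate distribution bandwidth: broadcast vs unicast.
# The 2 Mbit/s standard definition stream is an illustrative assumption.

STREAM_MBITS = 2.0

def broadcast_mbits(viewers):
    """One transmission serves every viewer: cost is independent of audience."""
    return STREAM_MBITS

def unicast_mbits(viewers):
    """Each viewer receives an individual stream: cost grows linearly."""
    return STREAM_MBITS * viewers

for audience in (1_000, 100_000, 10_000_000):
    print(f"{audience:>10,} viewers: broadcast {broadcast_mbits(audience):>3,.0f} "
          f"Mbit/s, unicast {unicast_mbits(audience):>12,.0f} Mbit/s")
# At ten million viewers, unicast needs 20 Tbit/s of aggregate capacity, which
# is why niche content suits unicast while mass live TV suits broadcast.
```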

Traditionally, packet data networks have offered high bandwidth to customers while sizing shared parts of networks and backbone capacity based on an assumption that not


all users are continuously accessing content – an assumption that becomes increasingly

problematic if accessing bandwidth hungry video services becomes the norm (Ofcom,

2006a, pp. 114-115). IP multicasting techniques could take some of the strain off backbone

networks. It can be used in IPTV providers’ controlled networks but is not widely

supported on the public Internet – a disadvantage for Broadband TV providers.
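To make the mechanism concrete: a receiver joins a multicast group, and routers inside the provider’s managed network replicate the stream only towards subnets with group members. The Python sketch below shows such a group join using the standard socket API; the group address and port are placeholders chosen for illustration, not values from any deployed IPTV system.

```python
# Joining an IP multicast group, roughly as a set top box in an IPTV provider's
# managed network might. The group address and port are illustrative placeholders.
import socket
import struct

MCAST_GROUP = "239.1.1.1"   # administratively scoped multicast address (assumed)
MCAST_PORT = 5004           # assumed port for the video stream

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", MCAST_PORT))

# Tell the kernel (and, via IGMP, the nearest router) to deliver this group.
membership = struct.pack("4s4s",
                         socket.inet_aton(MCAST_GROUP),
                         socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)

# The network now carries one copy of each packet per subscribed subnet,
# rather than one copy per viewer as with unicast delivery.
packet, sender = sock.recvfrom(65536)
```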

Hybrid STBs can support broadcast TV reception as well as provide PVR, VoD, or

‘push VoD’ capabilities. With push VoD, broadcasters ‘push’ popular content onto the hard

disks in customers’ PVRs. Customers can view the content at any time until it is replaced

by something new. Push VoD and user initiated recording of broadcast programming on a

STB hard disk distributes storage and eases the load on the unicast network.
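The storage side of this trade-off is easy to quantify. The short calculation below, using an illustrative bitrate of our own choosing, estimates the disk space behind an 80 hour PVR of the kind discussed later in this chapter.

```python
# Disk space needed for a given number of recorded hours at a given bitrate.
# The 2 Mbit/s figure is an illustrative assumption for standard definition.

def storage_gigabytes(hours, stream_mbits):
    """Gigabytes of storage for `hours` of video at `stream_mbits` Mbit/s."""
    seconds = hours * 3600
    bits = seconds * stream_mbits * 1_000_000
    return bits / 8 / 1e9   # bits -> bytes -> gigabytes

# An 80 hour PVR at 2 Mbit/s needs roughly 72 GB: modest enough to place a
# hard disk in every subscriber's set top box and distribute the storage.
print(f"{storage_gigabytes(80, 2.0):.0f} GB")
```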

As of 2007 increasing access and backbone bandwidths made streaming high quality

video increasingly practical but it still appeared that it would be some time before unicasting live TV to millions would be feasible. BitTorrent and other peer-to-peer networks make it easy to share illicit copies of movies and TV shows. However,

BitTorrent has launched a company to offer a legitimate distribution platform to media companies (Netherby, 2007). The founders of the Skype VoIP business have created a broadband TV service, Joost, built upon peer-to-peer technology to improve system scalability (Fine, 2007).

More generally the Internet had a significant impact on TV viewership and other media. According to a survey carried-out in October 2006 by Ofcom a third of broadband users in the UK believed they watched less TV since they started using the Internet (21% for the US), and 27% believed they spent less time reading national newspapers (23%

for the US). Over three quarters of young adults (19-24 yrs) in both countries reported


having downloaded music videos to a PC and well over half reported having downloaded

part or whole TV programs. The figures for older age groups are lower (see Figure 36).

However, when it comes to news clips the disparity by age group was much less (Ofcom,

2006b).

An avalanche of user generated content further illustrated the potential of the Internet as a medium for accessing video content – particularly among the young (Figure 36).

YouTube for example is based on user generated (or user pirated) content. Globally,

YouTube is the most popular of these websites with over 100 million clips viewed and

65,000 uploaded daily (MacQueen, 2006). As of June 2007 YouTube was the 5th most popular website in the US and the 9th most popular in the UK61. As of June 2007,

YouTube was also supported by Apple TV – thus extending its user generated content to

the TV and strengthening Apple’s foothold as an aggregator of movies, TV shows, and

user generated content. One assumes that Apple is striving to repeat its success in the

aggregation of music for its iPod music players. Google acquired YouTube in November

2006 for $1.65 billion and by late 2007 was experimenting with ways of extending its

advertising supported business model to user generated video on the site (WSJ, 2007).

IPTV and Broadband TV in the United Kingdom

In the UK Channel 4 was first to offer broadband TV with its “FourDocs”

documentary service in mid-2005, and the BBC, BSkyB, MTV, and AOL brought IPTV

trials or launches to market in 2005-6 (Ofcom, 2006a). ITV launched its portal in mid-

2007 (Gibson, 2007). These offerings gave viewers a way of catching up with missed

episodes (typically free of charge for seven days after the original broadcast) and offered

61 According to www.alexa.com


their archives or other content on a pay per view (PPV) or advertising supported basis

(FT, 2007b).

British Telecom (BT), the former telecom monopoly in the UK, has experienced

traditional voice revenues falling by about 20% per year. Providing DSL based

broadband service has been an increasing part of their business. By making BT’s physical

links into homes available to competitors (LLU or Local Loop Unbundling), Ofcom

stimulated greater competition in the broadband market. This has allowed broadband

service to be bundled with other services by non-traditional competitors – thereby increasing competition and pricing pressure on BT. For example, in 2006 BSkyB offered a triple-play bundle of satellite TV, broadband and fixed-line phone, having bought an

ISP, Easynet, in 2005 (FinancialTimes, 2006). Carphone Warehouse, Vodafone, and O2 traditionally offered only mobile services but by 2007 were offering bundles of fixed broadband along with mobile services. BT were signing up fewer broadband customers and experiencing more churn as a result (Goodway, 2006).

BT was the first to use broadband to offer a national alternative to satellite and cable pay-TV. BT Vision, launched in December 2006, used a set top box (STB) with

Microsoft software (Glover, 2006b). It allowed viewers to access VoD content on their

TVs, and included a receiver for the free to air digital terrestrial service in the UK

(Freeview) and an 80 hour personal video recorder (PVR) to support time shifting alongside the VoD capability. BT struck deals to offer a wide range of content (TVBus, 2006) including sport (Thomas, 2006). Content was offered as subscriptions as well as on a

PPV basis. While no subscription was required for TV services, customers were required to sign-up for an annual broadband contract (Wray & Tryhorn, 2006). The most likely


target customers for BT were seen by analysts as those who have yet to make the switch

to digital TV or those who would like to pay less, or avoid subscribing to pay-TV

services (Wray & Tryhorn, 2006). BT hired experienced broadcast and media executives

(MarketingWeek, 2005, 2006) to run BT Vision.

[Figure 36 (omitted here): a bar chart comparing, for the UK and the USA, the percentages of broadband users – all age groups, 18-24, 25-44, and 45-64 – who reported viewing music videos, TV shows, user generated video, and news video on their PCs.]

Figure 36. Percentages of broadband users viewing video on their PCs, from an international consumer survey carried out in October 2006 (Ofcom, 2006b)

BT had long sought the opportunity to offer video services using its network. It first persuaded British regulators to allow it to offer VoD services in 1993, and then to carry conventional TV channels in 2002 (Bannister, 1993; Europemedia, 2002; Guardian,

1998). It carried-out VoD trials using DSL technology as far back as 1994 (NYT, 1994) but did not consider the business viable at that time. The company partnered with broadcasters to support satellite and digital terrestrial TV (Barrie, 1998, 1999; Teather,

2001). This was part of a strategy to defend its core telephony revenues from cable


operators bundling telephony along with cable TV. Analysts saw BT’s move into IPTV

as a way of defending their broadband revenues since expected revenue from PPV

content would be a relatively small part of BT’s business (Curtis, 2006; Thomas, 2006).

BT was not alone for long in using broadband connections to deliver VoD. Tiscali, an

ISP, also launched a VoD service in March 2007 and Orange planned an offering that would also use a set top box incorporating a Freeview receiver and PVR (informitv,

2007; Wray, 2006c). BSkyB also planned to offer its own set top boxes capable of providing VoD services (Wray, 2006c) and already offered an STB incorporating a PVR.

BSkyB’s PVR could be programmed remotely using a mobile phone and it was expected that the Orange offering would also have this capability (FT, 2007b).

IPTV and Broadband TV in the US

The telephone industry in the US, dominated by AT&T, was prohibited from delivering television services by legislation and regulatory action in 1956 and 1972. This

prohibition was reaffirmed by the provisions of the Modification of the Final Judgment

(MFJ) that broke up AT&T in 1982 and created the RBOCs (Regional Bell Operating

Companies). The 1996 Telecommunications Act was the biggest overhaul of telecom and broadcast regulation since the Communications Act of 1934 that established the FCC.

One of the major changes was to break down many of the regulator imposed boundaries

between the telecom and broadcast industries (e.g. cable companies could offer telecom

services and the telcos could deliver television services).

During the 1990s most of the RBOCs got involved in the television business to some

degree through the acquisition of cable operators, winning cable franchises, or launching


DSL based VoD systems (Parsons & Frieden, 1998, pp. 129-134, 261-122). AT&T (the long-distance operator created by the MFJ) acquired large cable operators TCI and

MediaOne in 1999. The bold initiative cost AT&T in the region of $110 billion but only realized $75 billion when it was sold to Comcast in 2001 (Stern, 2001).

Telcos started offering video in 2004. In March 2004 SBC and EchoStar started to offer a cobranded satellite TV service that was bundled with SBC’s local, long-distance, and data offerings (Rosenbluth, 2007). By Sept 2006, SBC (rebranded as

AT&T after the purchase of the long-distance carrier) had 583,000 video subscribers and by July it had added VoD to the offering. Verizon was also one of many telecom carriers that had similar arrangements with the other major US satellite TV operator, DirecTV.

By 2006 most telecom operators had partnerships with one of the satellite TV operators.

More recently, in the mid-2000s, the remaining RBOCs made concerted efforts to enter the TV business using IPTV. Verizon rolled out a fiber-optic based service in parts of several states and hoped to pass 12-15 million homes by 2009. AT&T (formerly SBC) had a strategy of extending fiber to the block – which allowed it to provide ~20Mbit/s to the home. It hoped to pass 18 million homes by mid-2008. Thus the RBOCs were competing against cable TV just as the cable operators gained traction with their telephony services (Amobi & Donald, 2006). Analysts at iSuppli were predicting that telcos would have over 22 million TV subscribers by 2011, up from just 1.3 million at the end of 2006 (Telephony, 2007).

By 2007 CBS, ABC, NBC, and MTV were providing free streaming access to several full episodes of popular TV shows on their websites (Fox offered this only in certain markets). Advertisements that cannot be skipped are usually included in the streams.


However, convincing viewers to come to network websites was not proving easy. CBS’s

Internet strategist was quoted as saying that the web address for its service should be

“CBS.com/nobody-comeshere” and the president of CBS Interactive said, “We can’t

expect consumers to come to us. It’s arrogant for any media company to assume that.” It

may have been particularly difficult for CBS as its audience is older and less web savvy

than those of the other networks. CBS later adopted a strategy of syndicating its material

widely across the web – its video content was available for free to viewers on as many as

10 websites including, Time Warner’s AOL and Joost. CBS had relatively little direct

ownership of cable networks – which may have given it more leeway to explore alternative

distribution without affecting a cable revenue stream.

The major television networks and Hollywood studios had also been exploiting online

distribution with partners – and many broadband TV outlets had been launched. The

partnerships between the major content providers and on-line distributors in early 2007

are summarized in Table 11. Both advertising and pay for download business models

were being used. CBS, NBC, and Fox, as well as many cable networks also sold shows

on Apple’s iTunes. A more recently launched on-line video distributor, Veoh, claimed to

be drawing from 20,000 video sources including PBS, Paramount, CBS, NBC and CNN

(Schonfeld, 2007).

As well as the on-going battle for viewers the networks and web properties were also

battling for the relationship with advertisers. While CBS in 2007 retained 90% of

advertising revenue for content distributed with web partners (Barnes, 2007) there were

indications that Google expected to retain 30% and to handle the relationship with advertisers (Siklos, 2007). NBC and Fox announced a joint-venture to create a site to


compete with YouTube, which would also syndicate content to AOL, MSN, MySpace,

TV.com, and Yahoo (Barnes, 2007).

Table 11. Content producer/on-line distributor partnerships (Amobi & Donald, 2007)

Service | Major Content Providers

Amazon Unbox Online | Most major film / TV studios
AOL Video | Warner Bros., MTV, TNT, Fox, Sony, Universal
BitTorrent | Warner Bros.
CinemaNow | Most major film / TV studios
Google Video | CBS, MTV, Fox, NBA
Guba | Warner Bros.,
In2TV/In2Movies | Warner Bros.
iTunes | Major TV Networks, Disney, Paramount
Joost | Viacom
Movielink | Most major film / TV studios
MSN Video | Fox, Disney, NBC
Vongo | Most major film studios
Wal-Mart | Most major film / TV studios
Yahoo! TV | CNN, MTV, E!
YouTube | Paramount, Weinstein, NBC, CBS, Warner Music

Social networking sites like Facebook and MySpace also appeared to have growing importance for video content distribution as users linked to YouTube and other on-line content. In July 2005 News Corporation (which owns the Fox network) purchased

MySpace for $580 million (Amobi & Donald, 2007).


Discussion about the television industry and convergence

In this section we return to the research questions presented in Table 5 of chapter 3.

We use the descriptions of the television industries in the US and the UK to provide a

range of answers to these questions and highlight where the theoretical perspectives

introduced in chapter 2 provide the most insight. To avoid unnecessary duplication, additional details not presented earlier in the chapter are included as necessary and the sources cited.

Standards creation and adoption (RQ1)

The creation of television broadcast formats followed patterns similar to the creation of other wireless standards. In the US, committees of actors from the innovation system were formed to define and develop the monochrome, color, and digital standards.
Competition between alternatives played out but, unlike in the mobile wireless case, single standards were adopted and approved by the FCC – a single standard being seen by all actors as essential, at least for terrestrial television. Each standard exhibited significant path dependency: the monochrome standard's frame rate was shaped by the existing frequency of AC power, and the color standard maintained backward compatibility with the monochrome standard.
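To make this path dependency concrete, the arithmetic below (an illustrative sketch added here, not drawn from the case sources) shows how the US color frame rate follows from the 60 Hz power-derived monochrome standard: the color standard kept the monochrome signal structure but scaled its timing by a factor of 1000/1001 so that the new color subcarrier would not interfere with the existing audio carrier.

    # Illustrative arithmetic: backward compatibility in the US color TV standard.
    mono_field_rate = 60.0                   # fields/s, matched to 60 Hz AC power
    mono_frame_rate = mono_field_rate / 2    # two interlaced fields -> 30 frames/s

    # NTSC color scaled the timing by 1000/1001 to avoid interference between
    # the new color subcarrier and the existing sound carrier, while remaining
    # decodable by installed monochrome receivers.
    color_frame_rate = mono_frame_rate * 1000 / 1001
    print(round(color_frame_rate, 3))        # 29.97 frames/s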

In the UK the monochrome standard was strongly influenced by US developments (as was the choice of a variant of AMPS for 1G wireless some decades later). By the time color TV was being standardized, compatibility with Europe had become important, although the PAL standard selected is very similar to the system developed in the US by the


NTSC. The development of the enhanced MAC (Multiplexed Analog Component) analog

formats and the DVB family of standards occurred at the European level under the

auspices of the European Broadcasting Union (EBU) based in Geneva. The DVB family of standards goes further than the equivalent US standards (ATSC) by specifying signal formats for satellite and cable broadcasting as well as terrestrial broadcasting.

Both the US and European digital television standards (ATSC and DVB respectively) are based in turn on the MPEG-2 video coding standards developed within the international context of the ISO. ATSC and DVB standards define profiles with frame rates, resolutions, and interlacing modes to retain backward compatibility with content from the analog era.

The telecom and cable industries have developed a range of standards for the delivery of Internet access services. The DSL Forum with wide telecom and computing industry representation drove the standardization agenda through ANSI, ETSI, ATIS and the ITU.

It also undertook further activities to promote equipment compatibility. The US cable industry developed its own series of cable modem standards to provide Internet access services: DOCSIS (Data-Over-Cable Service Interface Specification). After initial attempts to develop its own cable modem standard, the European industry adopted
DOCSIS with minor modifications. Various versions of DOCSIS have been converted into formal standards by the SCTE[62], ANSI, ETSI, and the ITU (CableLabs, 1998; Catto,

2000; Taylor, Martens, & Jaeger, 2004).

The network economics perspective undoubtedly explains the compelling economic reasons for ensuring compatibility of national broadcast infrastructures with TV receivers

[62] The Society of Cable Telecommunications Engineers (SCTE) is an ANSI-accredited forum for standards creation for the cable industry.


and the economies of scale in the development of compatible semiconductors. The DSL and cable modem specifications are also classic examples of telecommunications industry standardization where interoperability and economies of scale are vital. Both provide platforms for the delivery of IP services to PCs and other devices. The provision of video services over IP has not driven an industry-wide process of settling on a single standard. Companies like Adobe, Apple, Microsoft, Real Media, and others all have their own proprietary video coding and application interfaces. These standards, and the services built on them, compete in the marketplace. The flexibility of software on
PCs allows multiple proprietary approaches to be supported and frequent updates to be applied. This sort of flexibility has not historically been available for TVs.

Inter-organizational coordination and relationship building (RQ2)

An overview of the relationships among the content creators, the content aggregators

(channel owners and broadcasters), the broadcast networks, and the TV viewers in the

UK and US is presented in Figure 37 and Figure 38 respectively. For the sake of clarity, equipment manufacturers and regulators are not shown.

The relationships among the various actors were explained in some depth in the main body of the TV industry case study, and will not be repeated here. In both countries the number of options for delivery has grown over time, as has the breadth of content. A couple of the episodes of change in the industry are discussed in further depth as we next consider the third research question.


[Figure 37 is a diagram of content and payment flows in the UK: content creators supply channel owners and broadcasters – the BBC (funded by the TV license fee), ITV, C4 and Five (funded by advertisers), Top Up TV, BSkyB, Virgin Media, and various IPTV and broadband providers – whose services reach TV viewers via the analog and digital terrestrial broadcast network (PAL and Freeview DVB-T over UHF, operated by Arqiva/Macquarie), the Astra satellites (DVB-S), Virgin Media's cable network (DVB-C over coax), and broadband connections (e.g. DSL) to IP set-top boxes/PVRs and PCs.]

Figure 37. Overview of TV industry in the UK in 2007

Relationship between standards creation/adoption and inter-organizational coordination/relationship building (RQ3)

In the case study of the UK and US television industries there were a number of specific instances where standards influenced industry structure. For example, frequency assignments (VHF vs. UHF) in the US influenced broadcasters' economics (as they have done for mobile wireless operators). Broadcasters with UHF channel assignments were disadvantaged for many years. Sky's use of frequency assignments not explicitly reserved for broadcasting and its selection of an established television


signal format contributed to its circumventing the visions of other actors (somewhat like

Nextel’s use of SMR frequencies in the US).

[Figure 38 is a diagram of content and payment flows in the US: independent producers supply the networks – CPB/PBS with local PBS stations (funded by membership subscriptions); ABC, CBS, NBC, Fox, The CW, MyNetworkTV, Univisión, plus shopping and religious networks, reaching viewers through affiliate and independent commercial stations (funded by advertisers/sponsors) – as well as cable/satellite channel owners. Delivery paths include NTSC/ATSC terrestrial broadcasting over V/UHF, direct broadcast satellite (DirecTV, DishTV), cable systems (NTSC/SCTE07 over fiber/coax), telco-provided digital multichannel TV (fiber/DSL), and broadband IPTV to set-top boxes and PCs, with viewers paying for subscription TV.]

Figure 38. Overview of TV industry in the US in 2007

Rather than continue with small-scale examples of this type I have chosen to address a number of large-scale episodes that highlight the growing importance of convergence among the content, telecommunications, and computing industries. To avoid undue repetition the descriptions of these episodes of convergence and change in the industry


build upon the descriptions already provided but delve deeper. Finally, the role of digitization and the Internet Protocol (IP) suite in facilitating convergence is considered.

Multi-channel TV and cheaper telephone calls

The first episode we discuss is the development of multi-channel TV in the UK and the first commercial example of the convergence of broadcast and telecommunications services in the marketplace. The way these industries came together in the UK is contrasted with the story in the US. The key actors and actions of the UK story are depicted in Figure 39.

The UK government was not able to impose its vision of the future of the telecom and television industries. A satellite broadcaster, Sky TV, entered the market using the older PAL standard and lower-power satellites. The cheaper, and easier to produce, PAL receivers shortened Sky's time to market. This, along with a more compelling content offer
(mainly soccer), contributed to Sky beating BSB in the marketplace.

BT responded to cable’s bundling of TV and telephony services by promoting satellite TV (Liska, 1987). One way it did this was to sell Sky’s offerings from its retail stores in British retail districts. The success of Sky and its impact on the roll out and adoption of cable was in BT’s interests – so supporting satellite TV was in its interest.

In the mid-1980s several commercial and regulatory actors had visions for the future of telecommunications and multi-channel TV in the UK. These actors took actions to build or transform heterogeneous networks according to those visions. The UK government achieved some level of success in increasing competition in telephony but probably less than it had hoped. Sky got around the government’s licensing of BSB by


using Astra satellites licensed by Luxembourg. Sky also benefited from other unencrypted programming transmitted from the two satellites at the same 19.2ºE orbital slot – referred to as a "hot-slot" in the satellite industry. The success of Sky was a setback for proponents of the MAC family of video standards, who had envisaged them as the basis for enhanced and high-definition television in Europe. In the end it was the visions and actions of the owner of the Astra satellites (SES), the Grand Duchy of Luxembourg, and
Rupert Murdoch's Sky that came closest to succeeding in the heterogeneous engineering tasks they had set themselves.

The UK government, along with its system of regulators, was a major actor in this episode. It had a vision of increasing competition in the residential telephony market by allowing the bundling of telephony with cable TV. It also had a vision of introducing more competition into the television market with satellite broadcasting and cable TV. It tried to control the basis for broadcast competition by licensing a satellite operator, BSB, and constraining its strategy by insisting upon the new D-MAC video standard and the use of a high-powered satellite. While this allowed BSB to be received on very small antennas
(~30cm in diameter), the trade-off was that only a handful of channels could be offered.

The characteristics of satellite broadcasting acted on the way the UK market played out by offering satellite broadcasters national coverage from day one – in contrast to cable TV systems, which took years to build out. The fact that effective multi-channel satellite TV arrived just a few years after the launch of cable TV put UK cable operators at a disadvantage relative to their US counterparts, which had an extra decade to penetrate the marketplace before an effective mass-market satellite alternative was launched (see

Figure 34 and Figure 35).


[Figure 39 is a timeline diagram (mid-1980s to mid-1990s) mapping actors and actions across the innovation system, marketplace, regulatory system, and standards: the government awards cable franchises and prohibits BT from video; medium- and high-power satellites make direct-to-home satellite TV possible; the government selects D-MAC as the satellite standard and awards the high-power satellite broadcasting license to BSB (high power allowing small antennas); Luxembourg licenses the medium-power Astra satellites, whose "hot-slot" co-locates additional channels (e.g. MTV) and offers UK/Ireland-wide service from the start; Sky TV launches in 1989 using the tried-and-tested PAL standard on Astra, offering more channels than BSB and securing key sports broadcast rights, while BSB's use of D-MAC delays its technology development and market launch; Sky merges with BSB; BT supports Sky satellite TV through its widespread retail network as a defensive response to cable's TV-and-telephony bundles; Sky is adopted more rapidly than cable TV and becomes dominant in pay-TV, while cable TV struggles financially into the mid-1990s.]

Figure 39. Key actors and actions in the development of multi-channel TV and service convergence in the United Kingdom

The two key actions that drove the initial growth of the US cable industry were

regulators’ decisions to allow cable systems to carry more than local broadcast TV, and

the deregulation of a satellite industry that could distribute programming to cable systems

around the country. This highlights how the same actor, in this case satellite
broadcasting, can affect the configuration of relationships in an industry depending on the

timing of the action. While satellites made multi-channel cable TV possible in the US

they played a major role in stunting the growth of cable TV in the UK. The meaning of

the ‘same actor’ is somewhat problematic here as obviously the relationships with the rest

of the TV and telecommunications industry actor-networks were not the same in the two

instances – an example of a 'mutable mobile' (De Laet & Mol, 2000; Law & Singleton, 2005).

Data communications and the PC reconfigure telecom and broadcasting industries

Next we examine the convergence of the computing and data networking industries on one side, and, on the other, the companies that deliver residential communications services – which, by the mid-1990s, had started to include the cable operators as well as the traditional telecommunications operators. The key actors and actions that led to the convergence of

the telecommunications services providers and the computing industries are outlined in

Figure 40. It was driven by the demand for broadband access to the Internet, or more

specifically to the World Wide Web and email applications, from home-based personal

computer users in the mid-1990s. Although the telcos had been providing corporations

and other organizations with data communications services for decades, the use of dial-up

data services (e.g. CompuServe and AOL) from home remained a niche activity and
Videotex services (e.g. the French Minitel system) did not take off in the US or the UK.

This early use of the telephone network for data services did, however, provide the
technical know-how that was later enrolled to provide dial-up Internet access.


The Internet protocol suite (including TCP/IP) provided the communications platform

for the development of applications (such as file transfer and email). In turn Web

protocols (including HTML and HTTP) and the server and browser software that

communicated using these protocols provided a platform for the explosion of the 'visions of the future' of start-ups, existing businesses, and other organizations, as well as of individual computer users.

Just as important was the availability of personal computers and modems that enrolled the traditional telephone network as a means of connecting to the Internet via

Internet Service Providers (ISPs) – albeit at modest speeds. The circuit-switched
telephony network, together with the V.series of backwardly compatible modem
standards, provided the initial platform upon which the diffusion of the
Internet among consumers relied. Both cable operators and telcos saw the opportunities

Internet among consumers relied. Both cable operators and telcos saw the opportunities

to use their connections into the home to provide broadband Internet access. This was

more of an opportunity for the cable operators than for the telcos, which stood to lose the

second lines that had been purchased specifically for dial-up Internet access by many

customers. This challenge for the telcos was coming at the same time that revenues from

their core telephony services came under increasing pressure from alternative long

distance carriers and the deployment of cable telephony (Fransman, 2002). Later Voice

over IP (VoIP) and substitution of fixed lines with mobile phones exacerbated this trend.

In the early days of data services the computing industry enrolled the telephony

network on the latter’s terms i.e. by encoding data as tones that could be transported by

the voice network and by complying with its physical and electrical characteristics. As


the large demand for data communications offerings became clearer the relationship

between the computer and telecom industries changed. Communications offerings

became more data-centric as the relationship shifted to being more on the computing
industry's terms. The telcos, and their infrastructure suppliers, repurposed DSL technology that had initially been developed to allow TV to be delivered via telcos' local loop[63] infrastructures, and the cable TV industry developed its own cable broadband

data capabilities.
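To make the 'on the latter's terms' point concrete, here is a minimal sketch (ours, with assumed Bell 103-style parameters, not a description of any specific product) of how early dial-up modems encoded bits as audible tones, one frequency per bit value, so that data looked to the network like any other voice-band signal:

    import math

    # Illustrative frequency-shift keying (FSK), as used by early ~300 bit/s
    # dial-up modems. The frequency pair below is the Bell 103 originate-mode
    # pair; treat the exact figures as assumptions for illustration.
    SAMPLE_RATE = 8000     # samples/s, comfortably inside the voice band
    BAUD = 300             # bits per second
    FREQ_SPACE = 1070.0    # Hz, tone for a 0 bit
    FREQ_MARK = 1270.0     # Hz, tone for a 1 bit

    def modulate(bits: str) -> list[float]:
        """Return audio samples (floats in [-1, 1]) encoding the bit string."""
        samples_per_bit = SAMPLE_RATE // BAUD
        audio = []
        for bit in bits:
            freq = FREQ_MARK if bit == "1" else FREQ_SPACE
            for n in range(samples_per_bit):
                audio.append(math.sin(2 * math.pi * freq * n / SAMPLE_RATE))
        return audio

    # ~26 samples of pure tone per bit; a real modem would add start/stop bits
    # and keep the phase continuous across bit boundaries.
    samples = modulate("01001000")   # the ASCII bits of "H"
    print(len(samples))              # 208 samples, i.e. ~26 ms of audio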

[Figure 40 is a timeline diagram (late 1960s to early 2000s) mapping actors and actions across the innovation system, marketplace, regulatory system, and standards: the US government funds Internet development; Kahn, Cerf and others at universities standardize the Internet protocol suite (IP, TCP), which becomes a platform for application development; IBM, Intel, Microsoft, and clone makers standardize hardware and software interfaces (OS APIs, ISA, RS232), promoting on-going innovation and scale economies in personal computing and servers (1980s); Berners-Lee and Andreessen create the World Wide Web (1989) and the web browser (1993), built on HTML and HTTP; dial-up ISPs using the V.series standards, and corporations such as Yahoo and Amazon, make the web a major business, information, entertainment, and commerce platform (mid-late 1990s); infrastructure providers put forward DSL and cable modem (DOCSIS) standards as broadband solutions; and by the early 2000s telcos offer DSL, telephony, and (satellite TV) bundles while cable TV operators offer TV, cable data, and cable telephony.]

Figure 40. Summary of key actors and actions leading to the convergence of residential telecommunications and computing industries

[63] The local loop (or subscriber line) is the physical link between the customer's premises and the telecommunications service provider's facilities. Traditionally, the local loop was a pair of copper wires. Modern local loop configurations may include fiber optics (fiber-in-the-loop) and active electronics.


In both the US and the UK cases the regulatory regimes played pivotal roles by

allowing or disallowing competition among service providers – keeping the telecom and

broadcasting industries separate, and later lowering the barriers between them. The

timing of these decisions shaped how the industries played out in each country (e.g. the

impact of satellites on cable TV just discussed). The governments have largely stayed

away from the regulation of the computing industry and of the Internet – the Microsoft

antitrust cases notwithstanding. Despite this relatively hands-off approach to the Internet

itself the regulation of the telecommunications industry has affected the ways in which

consumers access the Internet.

In the US the 1996 Telecommunications Act created CLECs[64] and forced incumbents
to make their local loops available to them at attractive wholesale rates – an arrangement referred to as Local Loop Unbundling (LLU). Despite regulatory changes leading to higher wholesale prices, CLECs accounted for about 18% of the US's 175 million switched access lines as of December 2005 (Rosenbluth, 2007). An FCC decision in August 2005 eliminated the requirements for facilities sharing of broadband infrastructure by incumbents and other facilities-based wireline broadband Internet access providers.

While this policy change allowed DSL to compete on a level playing field with cable modems, it greatly reduced the number of companies that competed with incumbent DSL providers – although facilities-based competitors like Covad were not affected (Duffy &

Pappalardo, 2005; Nolle, 2005). By 2007 US residential broadband access was

[64] Local Exchange Carriers are telecommunications companies that own and operate the telephone switches and local loop infrastructure in the US market. The incumbents, referred to as ILECs, were typically the regional Bell operating companies (RBOCs) formed by the breakup of the original AT&T in 1984. Competitive local exchange carriers (CLECs) are companies that compete with the incumbents by either building out their own infrastructure (facilities-based) or by reselling the incumbents' loops.


predominantly provided by facilities-based providers – cable had 54% of broadband customers while the other 46% were served using DSL (Rosenbluth, 2007).

In the UK the path taken by the regulatory regime was almost the direct opposite. As part of Ofcom's Strategic Review of Telecommunications in 2005, BT, under threat of being broken up, agreed to changes in its relationships with its competitors. BT undertook to offer competitors access to its local loops and wholesale communications products on the same basis as they were provided to its own retail divisions (Ofcom,

2005b). Prior to this, competitors were essentially restricted to offering rebranded versions of BT’s DSL offerings that left little opportunity for differentiation. Thus broadband competitors were guaranteed access to the incumbent’s local loops (i.e. LLU) at favorable rates not long after this right was removed in the US market. At the end of

2006 there were 1.3 million unbundled broadband lines (10% of all connections, compared to 2% a year earlier). Growth continued, with 1.7 million broadband lines unbundled by around 26 LLU operators by February 2007 (Kennedy, 2007; Ofcom,

2007b). In 2007 LLU operators included O2, T-Mobile, Orange, Carphone Warehouse and BSkyB. Vodafone offered DSL-based broadband using a BT wholesale product as opposed to LLU. Given the late start of cable TV in the UK (relative to the US), only about 24% of broadband connections were provided by cable – the remaining 76% being DSL (Ofcom, 2007b).

The successive waves of adoption of personal home computers, narrowband Internet access, and broadband Internet access brought telecom and cable TV operators into the world of providing IP connectivity on a mass market scale. Their primary service offerings turned from telephony and cable TV respectively to bundles of these services


with Internet access. The differences in the regulatory approaches stimulated many more companies to provide broadband service, either singly or as bundles with other services, in the UK than in the US. As of 2007 there were over 500 ISPs providing broadband Internet access in the UK, and bundled 'free' broadband was used to decrease consumer churn for other services; e.g. BSkyB and Carphone Warehouse offered 'free' broadband in bundles with other services.

Emergence of triple-play bundles and higher-level convergence platforms

The convergence of services provided by flexible platforms has continued and more complex sets of relationships have emerged. The cable TV operators in the US and UK were able to offer ‘triple-play’ bundles of fixed-telephony, multi-channel TV, and broadband Internet access on their cable platform. Examples of product bundles offered in the UK are listed in Table 12.

We can think about the ‘cable-TV’ networks as being key technological actors that facilitated their owners’ entry into a wider range of businesses once market demand was apparent and regulatory constraints had been removed. However, these communications networks had to be extensively upgraded to support two-way communication, data traffic, digital TV, and Video-on-Demand. Thus this important actor – the cable TV network – was gradually, or not so gradually, changing and shifting its relationships over time and becoming a significantly different actor along the way.


Table 12. Bundled products offered by major UK service providers, March 2007 (Ofcom, 2007b)

[Table 12 is a grid indicating which of twelve providers (AOL, Be, BT, Orange, Pipex, Plusnet, Sky, TalkTalk, Tesco, Toucan, Virgin Media, Vodafone) offered each combination: broadband and fixed (all twelve); broadband and mobile (seven); broadband, fixed & TV (three); broadband, fixed & mobile (six); broadband, TV and mobile (two); and broadband, fixed, TV & mobile (two). The three-service rows are labeled "triple play" and the four-service row "quad play".]

Technological limitations of the twisted-pair based telco networks of the late 1990s and early 2000s prevented the telcos from directly emulating the cable operators' triple-play bundles. The interim solution was partnering with satellite TV operators to provide a

triple-play bundle in a sort of marketing and billing convergence rather than one based on

a common platform. Technical advances and increased deployment of fiber-optics have

overcome some of the technical obstacles, allowing at least the largest US telcos
(Verizon and AT&T[65]) to bring platform-based triple-plays to market. The addition of
wireless services to telephony, data, and TV produces 'quad-play' bundles. These are more

like examples of billing/marketing convergence as they do not use the same technological

platforms even when owned by the same companies as is the case for Verizon and

AT&T, although this could conceivably change for IP based or WiMax based

wireless networks. The cable companies in the US are able to offer mobile wireless

65 This version of AT&T was most recently SBC. After SBC’s acquisition of the AT&T long distance company it adopted the AT&T brand. This is another example of changing actors. The “AT&T” we have referred to from chapter 5 onwards has changed radically over the period studied.


services as part of their bundles through an arrangement with Sprint-Nextel – another example of billing/marketing convergence. In the UK the cable operator purchased an

MVNO to allow it to offer a ‘quad-play’ bundle.

The role of digitization and the Internet Protocol (IP) suite

At the beginning of the 1990s the telephony, cable TV, and satellite TV networks were predominantly analog – although the cores of the telephony networks were transitioning to digital transmission and switching. These networks were optimized to carry analog telephony or TV. The early bundling of analog satellite TV and telephony

(e.g. BT and Sky in the UK) was therefore an example of billing or marketing convergence as both services could not be delivered on the same technological platforms.

In the late-1990s cable, satellite and terrestrial broadcasting started to transition to digital delivery technologies. While telephony is usually still delivered to the home as an analog service, the core networks are completely digital (albeit circuit-switched). In addition, a packet-switched digital overlay using DSL technology is widely deployed on the same twisted pairs of copper wires. Thus by the mid-2000s several potential one-way and two-way digital pipes were available to consumers in the US and the UK.

Digitization is the representation of any codifiable information (e.g. audio, video, text, and images) using combinations of binary digits (bits). It has lowered the technological walls between the broadcast and telecommunications industries. Bits can be transmitted by electrons or photons, and can be stored as electrical charges, magnetic fields, or the presence or absence of material (punched cards, or pits on compact discs).


As the bit is also the basic unit of information processing in computers, digitization also brought the computer industry increasingly into the delivery of multimedia and the provision of communications services. Advances in the compression of digital audio, images, text and video, and computer processing power and network bandwidth made it increasingly feasible for services using any or all of these media to be offered using the same networks. Thus digitization was a key actor in platform convergence across broadcasting, computing, and communications, and we will have more to say about the conceptualization of digitization in actor-network terms in chapter 9.
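A back-of-the-envelope illustration (our rounded figures, not taken from the sources cited in this chapter) of why advances in compression were central to this convergence: raw digitized video is far too large for the broadcast and broadband pipes described above, while MPEG-2-style compression brings it within their reach.

    # Illustrative arithmetic: uncompressed vs compressed standard-definition video.
    width, height = 720, 576        # luma samples per frame (625-line systems)
    frames_per_second = 25
    bits_per_pixel = 16             # 8-bit samples with 4:2:2 chroma subsampling

    uncompressed_bps = width * height * bits_per_pixel * frames_per_second
    print(f"Uncompressed SD video: ~{uncompressed_bps / 1e6:.0f} Mbit/s")  # ~166

    # Broadcast-quality MPEG-2 SD video is typically carried at a few Mbit/s.
    mpeg2_bps = 4e6
    print(f"Compression ratio: ~{uncompressed_bps / mpeg2_bps:.0f}:1")     # ~41:1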

The twisted-pair, coax, fiber, fixed-wireless, mobile-wireless, and satellite networks still have their own characteristics in terms of bandwidth, cost, coverage, and reliability, as well as their own particular broadcast and unicast capabilities. While digitization has not made these networks completely equivalent, each can transport any type of codifiable information (audio, video, images, and text) – they are more flexible than the earlier analog networks. These digital networks do not have to be optimized for any single type of information since they are optimized to transmit bits, which can, at least in principle, be used to represent any type. The ways in which these networks have acted in industry change have varied as their capabilities have morphed as a result of digitization and other technological advances. Digital networks are also being used in combination to provide new types of offering (e.g. BT Vision set-top boxes receive free-to-air digital TV as well as supporting downloads of VoD content).

So far we have focused on the fiber/coax and fiber/twisted-pair networks of the cable and telecommunications companies as the key platforms for the convergence of service delivery. However, there is increasing evidence of convergence moving ‘up the stack.’


The Internet Protocol (IP) suite was built as a platform to facilitate a wide range of

computing applications. As bandwidths and adoption have increased, broadband IP

connectivity is being seen as a viable platform in its own right for many of the same services that previously required their own optimized networks (e.g. cable TV and telco

networks for TV and voice respectively). Voice over IP (VoIP) service providers (e.g.

Vonage and Skype) and broadband TV providers (e.g. Joost and Google’s YouTube)

deliver voice and video services over IP connections without regard to how users obtain

their IP connectivity.
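The 'without regard to how users obtain their IP connectivity' point can be illustrated with a minimal sketch (ours, not drawn from the sources): an application written against the standard sockets API sees only IP connectivity, and nothing in it depends on whether the packets travel over DSL, cable, fiber, or a wireless link.

    import socket

    # Minimal application-layer client over TCP/IP. The IP layer abstracts the
    # physical network away, which is what lets VoIP and broadband TV services
    # run over any broadband connection.
    def fetch_headers(host: str, port: int = 80) -> str:
        with socket.create_connection((host, port), timeout=10) as conn:
            request = f"HEAD / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
            conn.sendall(request.encode("ascii"))
            response = b""
            while chunk := conn.recv(4096):
                response += chunk
        return response.decode("ascii", errors="replace")

    print(fetch_headers("example.com"))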

BT in the UK has committed to converting its core telecommunication network to an

IP-based architecture. All its services, including telephony, will be underpinned by IP

technology (on its own IP network rather than the public Internet). Thus there are two

visions of convergence at the IP layer. The first abstracts away the underlying networks

and envisages the offering of services as being disassociated from the provision of basic

IP connectivity. The major TV networks in the US and the mainstream TV channels in

the UK as well as BSkyB are at least experimenting with broadband TV offerings –

implying that they share this vision or at least see it as a plausible outcome they need to

prepare for. The second vision foresees network operators remaining as the gatekeepers

for the provision of services using their increasingly flexible networks. As we have

discussed in this chapter commercial actors are enacting strategies that are consistent with

one or other of these visions, or even both of them. In either case the licensing of TV

stations or the allocation of cable TV franchises are not the strong obligatory passage

points for content producers and advertisers they once were.


However, the changes are not limited to video distribution technologies.

Technological changes in turn reshape the obligatory passage points for content providers

and advertisers. The actor-networks around TV and other video services may experience

considerable reconfigurations as both existing and new actors strive to enact strategies

and bring about visions of the future that put them in control of new obligatory passage

points. New aggregators of video content are emerging (e.g. Joost and Veoh), and

for Video on Demand (e.g. Amazon and Walmart’s movie download businesses or

YouTube and others for user generated content). Social networking websites and blogs

provide individuals the ability to aggregate video content on the web for their on-line

networks of friends. These new ways of packaging content replicate the traditional TV

networks in some sense but with many more potential channels tailored to any taste or

interest and not limited to a linear programming format. With all of these possible outlets

for video, effective search mechanisms (or content discovery) become particularly

important. At least one start-up business, VodPod, envisages aggregating the aggregators

by allowing anyone to create specialized channels that pull together video content from

all over the Internet – a process that at least one article has referred to as

“hyperaggregation” (Malik, 2007).

Existing aggregators of video content, TV networks and channels, still have strong

connections with consumers (through their brands). Their increasing willingness to place

their content on the Internet using advertising-supported and pay-for-download

arrangements provides evidence that they can change their shape by modifying how they

reach customers i.e. through the Internet and at least in some cases through new

aggregators. At the time of writing we are going through what may well turn out to be


another major episode of convergence – the convergence of broadcasting and the

Internet. This episode differs from the earlier convergence where cable TV networks were enrolled for the delivery of broadband Internet access. In this episode broadband

Internet access, however provisioned, acts as the platform for the delivery of an

increasing proportion of mass market video content to add to how the Internet has already been drawing audiences and advertising away from traditional distribution channels.

Role of initial conditions (RQ4)

The question of the impact of initial conditions depends on the choice of starting

point and where in the actor-network we choose to draw the boundary. In this case study of the TV industry we started in the nineteenth century and have extended the study

into the now-related telecommunications industry. This section goes further still and

looks at a couple of historical events that had a significant impact on the wider actor-

network configuration in which the episodes described in the previous section occurred.

Luxembourg’s ability to become a focal actor in the satellite industry was at least in part facilitated by the rules established by the ITU, which was established in 1865 to facilitate the coordination of the emerging international telegraphy business (Standage,

1999). It is interesting to consider the possibility that the echoes of the actions of actors that shaped the first international electronic communications network affected the way that the television and telecommunications industries played out more than a century later.

The existence of an industry capable of providing communications and broadcasting services using artificial satellites by the 1970s is in itself a fascinating story. The space


race between the US and the Soviet Union was used to demonstrate national superiority

in science and technology. Wernher von Braun, the German-born engineer at the center of
the US space program, argued that "we are competing for allies among the many have-not

nations for whose underfed multitudes the Communist formula of life has a great appeal"
(Slotten, 2002). So the way that pay-TV developed first in the US and later the UK is

directly attributable to the actions of government leaders in the Cold War, and going

further back to the German development, under von Braun, of the V2 missile

(Vergeltungswaffe 2 or Vengeance Weapon 2) during the Second World War ("Braun,

Wernher von.," 2007) and more generally to the changing relationships among national

actors throughout the twentieth century[66].

Without delving too deeply into its history, digital computing is another industry

whose development was spurred by the military needs of the Second World War and the

Cold War. For example, the British Colossus was used to automate calculations for code

breaking in WWII, and the American ENIAC was used for calculating artillery firing

tables shortly afterwards. In the 1950s the US developed real-time interactive computing

in support of its SAGE Cold War air defense system (Hughes, 1998). The Internet's
technologies and institutions were created with US Department of Defense funding

starting in the 1960s (Leiner et al., 2003). The details of how the computing industry

developed from these beginnings are covered in depth by Chandler (2005) and include the

rise and fall of (seemingly unassailable) actor-networks built around mainframes, mini-

computers, PCs, operating systems and office productivity applications, to name a few[67].

[66] For more on the history of rocketry, satellite communications, and the Cold War see Slotten (2002).
[67] See the multi-page timeline of key events in the computing industry from the 1940s through the mid-1990s in Appendix 4.1 of Chandler (2005).


The provision of telephony services by cable operators is an example of platform convergence – where two services, previously delivered by different companies and technologies, are delivered by a single company and network. While the cable industry started later in the UK than in the US, the decision to lower the regulatory barriers between the broadcast and telecommunications industries came earlier. UK cable companies were permitted to offer telephony services from the mid-1980s while their US counterparts had to wait for the 1996 Telecommunications Act.

As well as providing an overview of the television industry, this chapter has examined the story behind the convergence of the media, telecommunications, and computing industries – each built upon the deployment of technical standards. The wireless segment of the telecommunications industry and the broadcasting industries were also built upon the use of radio spectrum and heavily shaped by licensing and other regulatory actions. The governments' reticence to take an active role in the regulation of the computing industry and the Internet stands in stark contrast – the absence of connections in an actor-network can perhaps be just as important as the presence of others. Next we examine another example of convergence, this time device convergence, as video and television services are provided on mobile wireless devices.


VIII. Mobile TV and Video: The emerging story in the USA and the UK

The convergence of the telecommunications, computing, and media industries has

been described in preceding chapters. Chapter 6 described how the possibilities opened

up by the digitization of mobile phone platforms and increasing capabilities of mobile

devices led to increasing convergence of mobile communications, mobile computing, and

the mobilization of entertainment and information services. Chapter 7 described another

part of the overall convergence story, that of the convergence of fixed-computing and the

Internet, with the television and telecommunications industries. In this chapter we

examine the mobilization of video entertainment and information services on mobile

devices at the convergence of mobile wireless communications, media, and computing.

The focus is on the development of mobile television and video clip based services.

Mobile wireless communications and broadcasting have been major users of the radio spectrum since the earliest days of radio. The need to coordinate the use of spectrum, and the evolving perceptions of the public interest in communications and broadcasting, have

meant that national and international regulation has been a constant feature of these

industries. As we have seen in the preceding case studies, technical standards have been

central to the creation of the technological systems used to deliver broadcasting and

communications services. Mobile TV and video services bring together commercial,

regulatory, and technological actors in even more ways than has already been described

in the preceding chapters.

The advent of 3G has made the transmission of digital video to and from mobile

phones feasible. Wireless network operators have started to offer a variety of video based


services including video telephony between 3G customers and video streams of

transportation infrastructure hotspots to help commuters. Entertainment and news services to mobile devices rely on video clip downloads and/or streamed video channels that emulate traditional broadcast television channels.

Mobile television was around before the launch of cellular communication systems, and advances in electronics and computing have provided many more ways of watching video on the move. In the following sections we examine these alternatives before turning to the development of mobile TV and video services on mobile wireless devices in the

US and the UK.

The development of mobile video and television

A prototype portable television (weighing 27 pounds) was demonstrated not long after the invention of the transistor (Washington Post, 1952) but it was many years before truly mobile television was possible. The British inventor Clive Sinclair was the first to bring a pocket-sized, albeit a fairly large pocket, TV to market in 1978. Sony launched its first Watchman in 1982 and the first color pocket TV (Epson ET-10) appeared in 1984.

These receivers are shown in Figure 41 along with a more modern example.

The analog TV standards (e.g. PAL in the UK and NTSC in the US) were never designed with mobile reception in mind. The continuous reception and signal processing required for analog TV reception was associated with high power consumption and thus short battery life in mobile devices. This was only partly ameliorated as manufacturers switched to LCD display technologies in the late-1980s. Antenna limitations on pocket


TVs also limited their ability to pull in signals and provide good quality pictures from

broadcast infrastructure designed with home based receivers and larger antennas in mind

(Yoshida, 2006).

(a) Sinclair MTV1 (1978) (b) First Sony Watchman (1982) (c) Epson ET-10 (1984)[68]

(d) Sony FDL-PT222 Watchman Portable TV (2001)

Figure 41. Handheld television receivers

The introduction of pocket required little coordination of actors in the existing television industry. While there was undoubtedly significant innovation required

to shrink the electronics into a handheld device, the broadcast signals and content were

exactly the same as for traditional broadcast television. No coordination with content

providers or with broadcast network operators was required. However, the deficiencies in

the viewing experience and battery life, along with the lack of a preexisting pattern of

behavior for the use of such devices in public are given as among the reasons why pocket

[68] Pictures from http://www.taschenfernseher.de/e-history.htm


TVs have remained a small niche – never reaching the ubiquity of the cassette-based portable music players in the 1980s or of iPods and other mp3 players more recently

(Birkmaier, 2006).

The low cost of pocket-sized analog TVs (less than $100) indicates that price is not the main barrier to their mass-market adoption. The following extracts from several reviews of analog pocket TVs reveal some of the limitations of viewing TV on these devices, as well as some of the reasons why a few users were willing to tolerate them.

“This works just fine but the picture is so small that it gave me a headache after 45 minutes of viewing. Not for those susceptible to migraines.” “I can't imagine settling down to watch this for several hours unless you want some serious eyestrain.” “You can't make out ANY picture outdoors with any amount of natural light.” “The screen is too small. if you want to watch it in your car, you have to cover it because the back light is too week.”

“Also, it goes through batteries like a rabid raccoon goes through a dumpster.” “Also the battery life is no more than just over 2 hours”

“TV does not work, whilst I get the picture, it is in black and white and there is no voice or sound apart from hissing noise.” “It is almost impossible to get a clear picture on any channel and the sound quality is poor.” “You can't get a perfect reception on it all of the time: traveling … in the car, I nearly had the driver's eye out with the aerial twice as I tried to pick up a signal.”

“A pocket TV. It's all a bit pathetic, isn't it? Have you ever had to go to a wedding on FA Cup final day?” “This may only be good for kids or sports nuts who can't be away from their tvs.” “Yesterday I watched game 3 of Yankee - Red Sox playoffs from a hammock in the back yard.” “it did fill the bill - to see a sporting event live while watching one of our kids' activities.”

“I got this for our frequent power outages here in the Florida Keys.” “Very handy product. Works fine when cable and/or electricity are out.” “I bought this TV in July of 2003 after a massive storm knocked out power to over 75% of the Memphis area.”

[Quotes from a number of reviews of portable TV products on Amazon.com and Amazon.co.uk, retrieved March 20, 2007]


The screen size (typically about 2 inches diagonally) is a problem and some

implementations of the LCD technology can make outdoor viewing impossible.

Receiving the desired TV station can be difficult and short battery life is a recurring

theme. The use of a telescopic antenna to improve reception can be inconvenient and add

to the self-consciousness of using a pocket TV in public. Nonetheless, some
people are willing to accept these limitations to see sporting events that they would

otherwise miss, or to have access to TV broadcasts in the aftermath of severe weather.

Watching video tapes in vehicles became possible in the 1980s but often involved an

expensive custom installation. In the late-1980s portable VHS based players with

integrated CRT screens were available from $600-$1400 and the VF-3000 with a

3.3 inch LCD screen sold for about $1400 and weighed 5lb (Forbis, 1988). By the late-

1990s manufacturers started to offer LCD screens embedded in seat headrests or designed

to drop down from the roof in larger vehicles (vans and SUVs). VHS cassette players

were gradually superseded by DVD for these in-car entertainment systems in the 2000s

(Mateja, 1998; Reports, 1999; Wright, 1999). See Figure 42 for examples of in-car

video entertainment systems.

Portable DVD players appeared shortly after the format was launched in 1997. Early battery-powered portable DVD players cost around $1300 (, 1998) but prices declined rapidly. Players with 7 inch screens could be purchased for less than
$100 by 2007. Portable DVD players and the DVDs themselves are much smaller than their VHS cassette equivalents. This, along with being battery powered, meant that they could be used in a wider range of locations, e.g. in aircraft and on trains as well as in cars.


A typical player and an example of a temporary installation in a car are shown in Figure

43.

Figure 42. In-car video entertainment systems from 1990s/2000s

Figure 43. Portable DVD player and example of in-car installation


Portable media players that added video capabilities to audio MP3 players started to

appear around the end of 2002 (Muther, 2002). One of the first, the Archos Jukebox

Multimedia 120, had a very small screen (about 1.5 inches) and only sold in small numbers. Later models featured increased screen sizes (for example the Archos AV380 had a 3.8 inch screen). Finding, transcoding, and transferring video content for viewing on these players took a certain amount of technical know-how. Video on mobile devices became more popular with the launch of the video version of the iPod in October 2005

(Bray, 2005). Legal and illegal infrastructures (e.g. iTunes and eDonkey respectively) for downloading video content, including TV shows and movies, made finding content much easier. An alternative source for some users was to transfer video recordings made with a

PVR to their media players (e.g. "TiVo to Go" and "Dish to Go").

(a) Archos Jukebox Multimedia 120 (2002) (b) Archos AV380 (2003) (c) Apple video iPod (2005)

Figure 44. Portable media players

In 2004 Sony launched a multimedia wireless tablet in the US called “Location Free.”

Wi-Fi was used to send digitized video from a base station connected to the video sources

in the home. It allowed people to watch TV anywhere in and around their homes without


being constrained by the location of traditional televisions in the house (the base station

and tablet are shown in Figure 45). The downsides were its weight (5lbs) and price

($1,500). Sony also struggled to figure out how to market its many features (Colker,

2004). The tablet could also access home based video content while away from home if a

broadband Internet connection was available. Later versions allowed access from laptops

and Sony’s Play Station Portable (PSP) mobile gaming and entertainment device, in

addition to the original tablet. The PSP is an example of another type of convergence –

device convergence where a single device has multiple purposes – in this case a platform

for both mobile video gaming and mobile entertainment.

In 2005 a small California-based start-up, Sling Media, launched the 'SlingBox' device that connected to a customer's cable/satellite set-top box or personal video recorder (PVR). It digitized and compressed the audio and video signals for streaming over the Internet for viewing on laptops, cellphones or PDAs running Sling Media's software. The software on the remote client devices allowed the user to control set-top boxes and PVRs (the SlingBox's transmitter mimicked their remote controls).

Like the Sony’s “Location Free” the SlingBox provided a solution for ‘placeshifting’ television content. In combination with a PVR it allowed both placeshifting and timeshifting. A user with an appropriate 2.5G or 3G wireless data plan could also access the content using a smartphone (Figure 46). The SlingBox, originally priced at US$250, proved to be much more popular than Sony’s Location Free offering.

A software-only solution for placeshifting and timeshifting of video has been made available by Orb Networks. Software on the user's home PC serves up streaming video from a TV tuner or from video files stored on the PC. A laptop, PDA or smartphone with


a web browser and a media player (e.g. Windows Media Player or Real Media Player) is

used to access the content. The same Orb software also makes the music, photos and

other content on the home PC available remotely. Orb originally charged $10 per month

for the service but in March 2005 made the basic service free of charge, hoping to generate revenue from add-on services and advertising (Baig, 2005).

The examples of technological solutions for timeshifting and placeshifting television and other video content discussed so far have, for the most part, not involved the active

participation of the content industries or the fixed/mobile telecommunications industry.

Portable TV manufacturers strove to provide devices that took advantage of the same free-to-air broadcasts intended for fixed receivers. Likewise, portable video cassette players mostly used the same VHS cassettes used by home-based players[69]. Both of these

approaches for mobilizing video content had parallels in audio-only applications. In-car
and battery-powered radios of all sizes receive the same analog radio signals as home-based
receivers. Personal audio cassette players, epitomized by Sony's Walkman, became

a hit consumer electronics product in the 1980s.

Figure 45. Sony "Location Free" base station and wireless tablet

[69] There were also smaller video cassette players and recorders available from the early 1990s that used cassettes containing 8mm magnetic tape. This did not become a mainstream format except for use in camcorders, and few pre-recorded tapes were released in it.


However, video is different from audio – both in technological terms and in the ways that people make use of each medium. While a TV receiver could be miniaturized, power consumption and picture quality remained showstoppers. The size of VHS cassettes and the mechanisms required to handle them confined mobile use to in-vehicle applications. Music, audio books, comedy shows, news and other types of audio content can be, and are, listened to by people engaged in many mobile activities (e.g. driving, walking/running, and while using public transport). It is not practical, and in some instances simply unsafe, to view video in many of these mobile scenarios. While drivers may never take to watching video (we hope), car, train, and plane passengers have. At the time of writing this is most often achieved using portable DVD players and more recently with portable media players. Despite DVD and media players incorporating much more complex technologies than the preceding portable video cassette players, advances in semiconductor integration, data storage, and the compression of digital video have led to significantly more practical, economical and portable equipment and storage media. It is notable that many of these technological advances were not driven by the requirements of portable devices (e.g. DVDs and hard disks). However, some of the advances were primarily aimed at mobile applications (e.g. batteries and LCD displays). Both technological limitations and different usage patterns explain why the mobilization of television and video players did not follow the pattern set by broadcast and recorded audio content.


Figure 46. SlingBox and Sling Media’s viewer application on a laptop and a smartphone

Video-capable media players evolved out of audio-only devices as storage
capacity and display technologies advanced. This coincided with the digitization of broadcast video, and with the increasing processing power and storage capacities of PCs and broadband Internet access technologies, making the processing, storage and downloading of high-quality video a reality for many home users in the US and UK.

Technologically, these advances made mobile video practical. The launch of placeshifting and timeshifting digital technologies like Sony's Location Free, the Slingbox, and Orb Networks' software-only solution provides evidence that mobile access to digital video had become practical by the mid-2000s. However, the more limited set of scenarios for viewing video content, in comparison to audio-only content, remained.

Many of these technologically diverse solutions for mobilizing digital video have

required little active participation of the content providers or players in the wireless

telecommunications industry. Wireless data links can be used to access video on laptops,

PDAs, or smartphones, and in principle (but only sometimes in practice[70]) the

[70] In the mid-2000s wireless network operators routinely blocked the use of streaming video services on the data plans they offered.


downloading or streaming of video should be like any other kind of data. Several of the time- and placeshifting technologies are only intended to allow users to have remote access to video content that they have already paid for, and should be covered by fair use and legal precedent. However, once users seek to download legitimate content specifically targeted at mobile devices, content providers must be directly involved. For example, TV studios have made some, but only some, of their content available for the video iPod through Apple's iTunes service.

Viewing analog television on mobile phones was first realized in Asia with

Samsung’s launch of a handset (see Figure 47) with an analog TV tuner for the Korean market in June 2003 (George, 2004). This allowed people to watch free to air programming using their devices. However there was not much incentive for wireless network operators to support such devices as the TV feature does not generate revenue for the operator. Use of the handset to watch TV was limited to about two hours with a standard battery. It is unlikely that such handsets will ever be launched in the US or the

UK as analog TV will be phased out in February 2009 in the US and by 2012 in the UK.

Besides, several digital technologies designed with mobility in mind were already being developed and deployed.

With technological advances in multiple domains (semiconductor integration, mobile processors, flash and magnetic storage capacity, color displays, data communications capacity, processing, and even battery technology), many new ways of watching video on the move have emerged. The approaches to providing mobile television and video discussed so far have not required the active participation of the wireless network operators. Next, the mobile TV and video services that have been trialed


or launched commercially with the active participation of the operators in the UK and the

US are described. Services offered using unicast 3G technology (i.e. where each viewer individually downloads or streams video content) are examined first, followed by the options for broadcasting video to handheld devices.
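The difference between the two delivery models can be made concrete with some rough arithmetic (our assumed audience figure, not from the sources): under unicast, network load grows linearly with the audience, whereas a broadcast carries one stream regardless of how many watch. The per-stream rate below is derived from footnote 71's figure that 20 hours of viewing corresponds to about 1 gigabyte.

    # Illustrative arithmetic: unicast vs broadcast mobile TV delivery.
    # Footnote 71: 1 GB over 20 hours -> (1e9 * 8) bits / (20 * 3600) s
    stream_kbps = 8e9 / (20 * 3600) / 1e3    # ~111 kbit/s per viewer
    viewers = 10_000                         # assumed concurrent audience

    unicast_load = stream_kbps * viewers     # every viewer gets an individual stream
    broadcast_load = stream_kbps             # one transmission serves everyone

    print(f"Per-stream rate: {stream_kbps:.0f} kbit/s")
    print(f"Unicast total:   {unicast_load / 1e6:.1f} Gbit/s")   # grows with audience
    print(f"Broadcast total: {broadcast_load:.0f} kbit/s")       # independent of it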

Figure 47. Samsung SCH-x820 CDMA handset with integrated analog TV tuner (Source: www.mobileburn.com)


Mobile TV and video unicast services

3 UK has been offering video clips since 2003 (UMTS_Forum, 2003) and T-Mobile provided a service for the delivery of video clips of goals from the Euro 2004 soccer tournament to mobile phones (BBC News, 2004). The first wireless network operator in the UK to offer multi-channel mobile TV commercially was Orange. The service, branded as Orange TV, was launched in May of 2005 with nine channels, including ITN News, CNN International, Extreme Sports Mobile, and "24/7 access to Channel 4's Big Brother". In October of the same year Orange made 3G services, including Orange TV, available to pay-as-you-go (PAYG) customers (Telecomworldwire, 2005) and added nine channels including a mobile channel dedicated to cricket (PRNewswire,

2005). As of March 2007 Orange’s 3G based mobile TV services offered 26 channels in several bundles offered at £5.00/month (see Table 13). In addition, users could subscribe to all the channels for £10.00/month. Viewing was limited to 20 hours/month (Ashley,

2005)71.

Orange used the MobiTV mobile television platform first launched in the US in

November 2003. MobiTV’s US partners included Sprint PCS and Cingular. The MobiTV brand was also prominent on the Orange TV website and the Orange TV portal on the phones themselves. As well as providing the mobile TV platform, MobiTV also dealt directly with the channels and rights owners, and packaged the content for Orange and its other operator customers – "We acquire rights, we encode and digitise the content, transport it and deliver it through a wireless operator's gateway to a consumer's device.

Then we wrap that all up as a managed service" ("MobiTV brings mobile video channels

71 20 hours of video corresponds to about 1 Gigabyte according to Victor (2005)


to Europe," 2005). One restriction was that at launch viewing Orange TV was restricted to 3G customers using one of two Nokia handsets (see Figure 48). By March 2007 all 3G handsets were said to support Orange TV although some better than others72.

Table 13. Channel bundles offered on Orange TV (as of March 200773)

£5 mix pack 1: Aardman; My Movies; Living tv; Channel 4 mobile; ITN News; E!; Bravo; Channel 4
£5 mix pack 2: Kiss fm; Kerrang!; British EuroSport; FHM TV; Aardman; ITN News; Comedy Time; Toon World
£5 family pack: Smashhits; My Movies; ITN News; Channel 4; Gong; Living tv; Cartoon Network; Bravo; Disney
£5 Sky entertainment pack: Bravo; Living tv; Cartoon Network; Sky One; E!; Discovery lifestyle
£5 Sky sports football pack: Sky Sports News; Sky Sports Soccer Special; Champions League; Soccer AM
£5 music pack: Kiss fm; Kerrang!; Smash hits; Trace; Magic

In October 2005 some statistics about usage were released (PRNewswire, 2005). These showed users watching during work breaks (36%), while traveling (19%), while waiting for friends or in a queue (12.6%), or at home (10.1%). The CNN and ITN news channels accounted for a 34% audience share. Big Brother, an extremely popular TV show in the UK, topped a 30% audience share at the beginning and end of the series.

72 The Orange TV website in March of 2007 was somewhat ambiguous about compatibility and contained contradictions.
73 As detailed at http://www1.orange.co.uk/entertainment/tvMobi/channels.php March 27, 2007.


Figure 48. Handsets supporting Orange's mobile TV offering (at May 2005 launch)

The “Sky Mobile TV” offering for Vodafone 3G customers was launched November

1, 2005. The service was available to Vodafone’s 3G customers for free for the first three months. Some channels were broadcast ‘live’ while others were “dedicated ‘made for mobile’ channels, featuring regularly updated blocks of programming” played in loops.

A total of 19 channels were offered in two bundles that cost £5.00 per month after the free promotional period (www.3g.co.uk, 2005). The central role of sport was made evident by the emphasis on the availability of exclusive live cricket coverage in the press release announcing the launch (Vodafone, 2005). The following channels were to be part of the offering at launch or shortly thereafter:

• News, Sport & Factual Bundle: Sky News; CNN; Bloomberg; Sky Sports News; At The Races; Discovery Factual; National Geographic Channel; History Channel.

• Entertainment & Music Bundle: Sky One; Sky Movies; MTV (two channels); Living tv; Discovery Lifestyle; Nickelodeon; Paramount Comedy; Cartoon Network; Bravo; Biography Channel.

The “Sky Mobile TV” offering was available on all Vodafone 3G handsets from the outset. Three compatible models are show in Figure 49.

(a) Sony Ericsson V800 (b) Motorola E1000 (c) Sharp 903

Figure 49. Examples of the Vodafone 3G handsets that support "Sky Mobile TV"

As of March 2007 the Vodafone offer included 32 channels sold in three packages (Table 14). The first two bundles were priced at £5.00/month while the third "variety" bundle was charged at £3.00/month (the cheaper bundle was not Sky branded and did not include any Sky channels). A subscription to all three bundles was also offered at £10.00/month.

The mobile TV offerings from Orange and Vodafone were severely constrained by the 3G communications mechanism used: the digitized video was transmitted to each user individually. Even if several users in the same cell were simultaneously watching exactly the same content, the video was streamed to each of them separately.

This mode is referred to as unicasting – which contrasts with the traditional broadcasting approach where a single transmission can be picked up by any number of receivers within the transmitter's range. The unicasting approach and the relatively high bandwidth requirements for video74 severely limit capacity. One analyst calculated (New Media Age, 2005a) that if 40% of subscribers watched just eight minutes of video content a day the 3G network would "grind to a halt." The use of wireless spectrum for unicast mobile TV is also orders of magnitude less profitable than other uses, e.g. one estimate is that network operator revenue for a megabyte of SMS texting is about £268.00 but only about £0.20 for a megabyte of video data (New Media Age, 2005a).
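The arithmetic behind these constraints is easy to reproduce. The short Python sketch below works through it; the per-cell downlink rate and the per-message SMS price are assumptions introduced here only for illustration (the former broadly consistent with BT's claim, noted later in this chapter, that only about six users per cell could watch unicast mobile TV at once; the latter chosen to match the quoted £268/MB estimate).

    # Back-of-the-envelope unicast mobile TV arithmetic (illustrative assumptions).

    # Per-stream bit rate implied by Orange's figures (see footnote 74):
    # ~20 hours of viewing corresponds to ~1 Gbyte of data.
    bytes_per_20_hours = 1e9
    viewing_seconds = 20 * 3600
    stream_bps = bytes_per_20_hours * 8 / viewing_seconds    # ~111,000 bit/s (~110 kbit/s)

    # Concurrent viewers per cell, assuming a shared early-UMTS downlink on the
    # order of 700 kbit/s (an assumed round figure, not a measured one).
    cell_downlink_bps = 700e3
    viewers_per_cell = int(cell_downlink_bps / stream_bps)   # ~6

    # SMS revenue per megabyte: a message carries 140 bytes; ~3.75p per message
    # is assumed here because it reproduces the ~£268/MB estimate quoted above.
    revenue_per_mb_sms = (1e6 / 140) * 0.0375                # ~£268

    print(f"stream rate      : {stream_bps / 1e3:.0f} kbit/s")
    print(f"viewers per cell : {viewers_per_cell}")
    print(f"SMS revenue / MB : £{revenue_per_mb_sms:.0f}")

On such numbers a single cell saturates with a handful of simultaneous viewers, which is the crux of the capacity argument against 3G unicasting for mass market mobile TV.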

Table 14. Channel bundles offered by Sky Mobile TV on Vodafone (March 2007)

Sky News, Sports & Factual pack (£5/mth): Sky News (live); Sky Sports News (live); CNN (live); At the Races; Bloomberg (live); Discovery Factual; The History Channel; National Geographic
Sky Entertainment Pack (£5/mth): E!; MTV Trax; MTV Snax; Sky One; Sky Movies; Living TV; Paramount Comedy; Bravo; Cartoon Network; Nickelodeon; Discovery Lifestyle; The Biography Channel
Vodafone Variety Pack (£3/mth): Channel 4; Big Brother; HBO Mobile; Extreme Sports Channel; ITN News; ITN Weather; British Eurosport; UEFA Champions League; Chilli TV; Fashion TV; GMTV; Fox 24

T-Mobile trialed mobile TV in Germany and the Czech Republic in 2005, and launched a mobile TV service in Hungary in November 2005 (BusinessWire, 2005;

Cable&Sat, 2005). During the Euro 2004 soccer tournament T-Mobile UK offered video clips of goals to handsets just a few minutes after they were scored (Norris, 2004; Rigby,

2004). Despite these early forays into mobile video, it was June 2007 before T-Mobile

74 Video was transmitted at around 110 kbit/s (calculated from the Orange TV website, which stated that 20 hours of viewing corresponds to about 1 Gbyte of data).


launched its mobile TV service in the UK. Users were charged £1 for 24 hours of access or £3.50 for a month. There were nine channels in the initial offering: Big Brother, Channel 4 favorites (3-5 minute made-for-mobile clips of popular shows), ITN news, Live Eurosport, MTV (music videos), MTV (made-for-mobile programs), Pocket Comedy, Paramount Comedy, and Nickelodeon. By late 2007 it was also offering BSkyB branded news, entertainment, and music bundles for £5/month each75.

The other UK operators, 3UK and O2, also had video services. As of November 2007 3UK offered a bundle of over 20 TV channels for £5/month, and it claimed that "customers have downloaded more than a million reality TV clip[s] in the last year" (C. Arthur, 2007). O2 provided a range of video content that could be downloaded or streamed to handsets, with each clip priced individually.

In the US Sprint PCS was the first wireless operator to offer a video service on its handsets. It deployed the MobiTV solution; content from MSNBC, CNBC, Discovery Channel, The Learning Channel, CSTV: College Sports Television, California Music Channel, CMC Beat Lounge, CMC-USA, Independent Music Network, CNET, Discovery Kids, ToonWorld TV Classics and Discovery en Espanol was all part of the initial package (MobiTV, 2003). The service was available to customers with an appropriate handset for $10 per month. Further channels, including sports content, were added in the following years, and in August 2004 a video on demand (VoD) capability was added that provided subscribers access to about 600 new video clips per day covering news, sport, weather and entertainment (MobiTV, 2004). Cingular Wireless launched the MobiTV service on its network in January 2005 with 22 channels (MobiTV, 2005b) for $10/month (plus data charges) and added 40 commercial stations in November 2005 (MobiTV, 2005a).

75 From http://www.t-mobile.co.uk/services/mobile-tv-video-services/mobile-tv/ retrieved Nov 18, 2007.

Verizon Wireless launched its Vcast service in January 2005 (Noguchi, 2005). Rather than stream video content to handsets, Vcast offered video clips for download. The initial press release promised "more than 300 daily updated videos from leading content providers" for $15/month and access to other premium content at additional cost (Verizon_Wireless, 2005). Three handsets supported the service at launch.

By early 2006 there were signs that the mobile video services were selling well in the US – the Director of Wireless Data Services at Sprint-Nextel mentioned that its subscriber numbers for the service were on the order of those of a top-ten multi-system cable operator (implying between 700,000 and a million subscribers) (Marek, 2006). The same article quoted rumors that Verizon Wireless's subscriber numbers for mobile video services were also in the many hundreds of thousands. These were seen as impressive figures given the limited number of handsets on which video services were available.

Mobile TV and video broadcast services

An Alcatel study concluded that consumers were prepared to pay 10 Euros for mobile TV, but that 3G based solutions were inadequate (Glover, 2006a) for a mass market with potentially 200m users demanding some 50 channels – beyond the capability of the unicasting solutions then provided by 3G. The recognition that 3G unicasting technologies could not support a mass market was behind moves by mobile operators to explore and deploy broadcast architectures (Wray, 2006b). Several technological options for the broadcast of digital TV to handheld devices were proposed, trialed, and in a few cases launched commercially. These options included DMB, DVB-H, MediaFLO and ISDB-T, most of which were extensions of existing broadcast technologies. There was also an extension to the UMTS 3G technology that supported multicasting (including broadcast video) called MBMS. With the exception of ISDB-T (the Japanese format for digital television and radio broadcasting) all these options were used, trialed, or explored in the UK.

Digital Multimedia Broadcasting (DMB) built upon the European Eureka-147 standard for Digital Audio Broadcasting (DAB). The first phone to integrate DMB television reception was launched in November of 2004 (PhoneContent.com, 2004). DMB was developed in South Korea and the first commercial services using satellite and terrestrial broadcasting were launched in May 2005 and December 2005 respectively (Korea_Times, 2005; Min-hee, 2005). In July 2005 DMB was also accepted as ETSI standards (TS_102_427 and TS_102_428). In the short term at least, a DMB/DAB based approach to mobile TV broadcasting benefited from existing spectrum allocations in the UK (i.e. the allocations for DAB digital radio) as well as from a broadcast infrastructure already supporting the digital radio services.

British Telecom (BT) and Virgin Mobile trialed a DMB based mobile TV system in the London area in the latter half of 2005. Other partners in the trial included Microsoft, HTC (a Taiwanese handset manufacturer) and Digital One (a joint venture backed by Arqiva and GCap Media that operated the national commercial DAB multiplex in the UK) (MobieCommIntl, 2005). The trial used the 20% of Digital One's multiplex set aside for multimedia. The channels broadcast during the trials were: Sky News, Sky Sports


News, E4, ITV2 and the Blaze music channel. Over 50 digital radio stations were also available. One thousand of Virgin Mobile’s customers were selected as trialists.

Results of the trial showed (Wray, 2006b) that people used the radio portion of the trial more than the television (95 minutes per week on average compared to 66 minutes). The most popular viewing times were early morning and late evening, and quite a lot of viewing happened at home (Ofcom, 2006a, pp. 112-113). Virgin Mobile's research showed that most trialists were willing to pay around £5/month and some up to £8/month.

BT subsequently set up a business to support mobile TV, marketed as BT Movio (formerly BT Livetime). Although its first commercial offering was built on DAB-IP76 technology, it positioned itself as being able to support any mobile broadcast TV technology (see Figure 50). BT said that its calculation that only six users per cell could view unicast mobile TV simultaneously lay behind its decision to bid for DAB spectrum that supported true broadcasting (New Media Age, 2005a). BT Movio wholesaled the service to all UK operators (BT_Movio, 2007).

Virgin Mobile launched the UK's first broadcast mobile TV service on October 1, 2006 using a DAB/DMB based solution and BT Movio's broadcast technology. While the commercial advantage was that the digital radio spectrum and DAB/DMB broadcasting infrastructure were already in place, the downside was limited capacity. Although the service could offer up to six channels (Wray, 2006b) it offered only four by the end of 2006. A review of the first phone available for the service (the HTC Lobster 700 shown in Figure 51) noted problems with signal availability and the inconvenience of having to use a headphone cable as an antenna – this seemed clumsy in an era of Bluetooth headsets (Miles, 2006). By December 2006 take-up appeared modest (Wray, 2006a) and sales of the Lobster phone poor (Maitland, 2007; Wray, 2007c). In July 2007 it was announced that the service was to be closed down – the lack of attractive handsets was given as part of the reason behind its failure (C. Arthur, 2007; Ray, 2007).

76 Part of the existing DAB standard (EN 300 401 v1.4.1)

Figure 50. BT Movio mobile TV architecture (BT_Movio, 2007)

By mid-2007 about 3.9% of the 45 million UK mobile phone users were using their handsets more than once a month to watch user generated content sent by friends or family members. However, just over 1.2% were viewing operator provided on-demand or broadcast services (C. Arthur, 2007).

The DVB-H standard was developed as an extension of the DVB-T terrestrial digital television standard used in Europe and elsewhere. It was developed by the European Broadcasting Union (EBU), with Nokia a particularly strong supporter. The standard77 was ratified and published by ETSI in November 2004 as EN 302 304 v1.1.1. The technology was deployed in a six month trial in Oxford, UK by the wireless network operator O2 and the broadcast network operator Arqiva. The handsets and the broadcast management system used were provided by Nokia. The video content came from some of the country's largest broadcasters (BBC, ITV, Channel 4, CNN, Sky News and MTV). There were 375 volunteer trialists (all between 18 and 44 years old).

Figure 51. Virgin Lobster 700 TV phone

Trialists watched over three hours per week on average, with each viewing session lasting an average of 23 minutes. The much higher level of weekly viewing than in the Virgin trial (66 minutes/week) bolstered arguments that customers demanded a wider range of content than was available with a six channel DMB based system (Wray, 2006d).

77 A version for transmission by satellite, DVB-SH, was approved in February 2007; see http://www.dvb.org/news_events/press_releases/press_releases/DVB_pr154%20SVB-SH.final.pdf


The most popular viewing times were before 9 am, between 12 pm and 2 pm, and between 6 pm and 8 pm (i.e. commuting and lunch times). Perhaps surprisingly, 36% of trialists said they used it most often at home. Over 80% were satisfied with the service and 76% said that they would take it up within 12 months – although some did not like the handsets. While O2 saw the results as an indication that there was demand for a multiple channel national mobile TV service, others were not so certain. One analyst hypothesized that there could have been a novelty effect (New Media Age, 2006). Trialists were volunteers, were not charged for the service, and 31% of them did not have multichannel television at home – so willingness to pay may have been overstated (New Media Age, 2006; Ofcom, 2006a, pp. 112-113).

Capacity constraints of 3G based unicasting were, as noted, behind mobile operators' search for a broadcast based solution (Wray, 2006b). The five main operators (3UK, O2, Orange, T-Mobile and Vodafone) favored the open DVB-H broadcasting standard. The challenge for UK operators was that the UHF spectrum most suitable for mobile TV was used for analog television and would not be available for other uses nationally until 2012. The five operators asked Ofcom for early use of Channel 36 (590 – 598 MHz) for a DVB-H network, writing to Ofcom for approval to work together in the construction of a national mobile broadcast network. Arqiva, the broadcast network operator that partnered with O2 for the Oxford trial, also tried to form a consortium of mobile operators to persuade Ofcom to release spectrum for mobile TV (Wray, 2006d).

In March 2006, Viviane Reding, the EU Commissioner for Information Society and Media, made statements taken to mean that the Commission would not only harmonize European mobile TV frequency allocations (Reding, 2006) but also select a single standard if the 'mobile TV industry' did not select one itself (Ward, 2007). In July 2007 Reding announced that the Commission was backing the use of DVB-H for mobile TV within the EU. The Commission was to add it to the official list of European standards, encourage its use in member states, and could take steps to make it a mandatory standard (European Commission, 2007). In justifying the action Reding cited the Commission's support of GSM: "Mobile broadcasting is a tremendous opportunity for Europe to maintain and expand its leadership in mobile technology and audiovisual services. Europe is today at a crossroads. We can either take the lead globally – as we did for mobile telephony based on the GSM standard developed by the European industry – or allow other regions take the lion's share of the promising mobile TV market. 'Wait-and-see' is not an option. The time has come for Europe's industry and governments to switch on to mobile TV." Some operators and broadcasters, as well as proponents of alternative systems, condemned the move – preferring to leave technology selection to the market (Austerberry, 2007; Lilley, 2007; O'Halloran, 2007; Wray, 2007a).

A broadcast mechanism for UMTS networks, Multimedia Broadcast Multicast Service (MBMS), was defined as part of the 3GPP Release 6 UMTS specification. As well as bringing the scalability of a broadcast mechanism, it has the advantage that it can use the unpaired spectrum that was part of many UK operators' 3G licenses (see Figure 52). Its integration with UMTS/WCDMA should also make management of the system and billing easier for wireless operators. "TDtv" is an MBMS based solution from a company called IPWireless (Wireless_News, 2006). All the UK network operators (except T-Mobile) cooperated in trialing TDtv in Bristol for several months at the end of


2006. This technical trial found that the technology was not as mature as MediaFLO or DVB-H (Wireless_News, 2007).

In the US two companies had spectrum and plans to deploy DVB-H (Fitchard, 2006). Hiwire Technology (owned by Aloha Partners) had 12 MHz of spectrum (UHF channels 54 and 59) and planned to launch a trial in Las Vegas in collaboration with T-Mobile (Smith, 2007). Modeo (owned by Crown Castle) had 5 MHz of L-band spectrum (1.670 – 1.675 GHz). The company trialed the technology in Pittsburgh in 2005 and, according to the company's website (accessed April 25, 2007), ran a six week beta trial in New York City. Hiwire's UHF spectrum should allow it to create a nationwide network more quickly and for much less money than Modeo – $450 million as opposed to $2.2 billion (Kharif, 2006) – although an FCC waiver allowing Modeo to increase transmission power would reduce capital and operational expenses somewhat (Ramke, 2007). The trials included advanced features like 'push VoD' and DVR functionality (Ramke, 2007).

The first widely available mobile TV service in the US based on a broadcast

architecture was launched by Verizon Wireless in March 2007. It was initially available

in 20 markets (Smith, 2007). Content on eight channels came from CBS, Comedy

Central, ESPN, Fox, NBC News, NBC Entertainment and Nickelodeon (Wickham,

2007). Two compatible handsets from LG and Samsung were available at launch.

Verizon Wireless used Qualcomm's MediaFLO network and service. Qualcomm argued that by being 'unencumbered by legacy terrestrial or satellite delivery formats' it was able to optimize the design of the MediaFLO air interface for the broadcast of video and other multimedia content to the mass market. Competing technologies (ISDB-T, T-DMB, S-DMB, and DVB-H), it argued, were derived from standards developed for terrestrial or satellite television or radio. The claimed advantages included being able to incorporate more modern channel coding algorithms (i.e. turbo codes as opposed to convolutional ones). MediaFLO had 6 MHz of UHF spectrum (channel 55) with capacity for 20 mobile video channels (Wickham, 2007). The MediaFLO platform was positioned as being able to deliver both local and national streamed video content (e.g. traditional TV channel content) using its Single Frequency Network (SFN). It also delivered video clips (e.g. weather, music or news clips) to devices in the background so that they were available on-demand (Push VoD), and the delivery of, say, financial information was also possible (MediaFLO, 2007). In early 2007 AT&T Wireless also adopted MediaFLO as its solution for broadcast mobile TV (Kharif, 2007; Qualcomm & AT&T, 2007).

Figure 52. UK version of the IMT2000 / UMTS band plan

BSkyB completed two trials of MediaFLO technology with Qualcomm in the UK. The first trial was carried out in Cambridge during the summer of 2006 and the second in Manchester the following winter (Qualcomm, 2007). The UK network operators were said to be fearful that they would be cut out of mobile TV in the UK if the dominant pay-TV broadcaster, BSkyB, went it alone with a MediaFLO network. BSkyB beat out a consortium of the mobile operators for a three-year deal for mobile broadcast rights for Premiership soccer matches (Wray, 2006a).

Television industry perspective on mobile TV and video

Articles in the UK television industry press during 2005 and 2006 showed significant uncertainty about how to approach mobile TV: about the right sort of content for the extremely small screens on handsets, and about the shorter viewing sessions that trial users seemed to prefer. There had been mixed signals from trials and operational services about whether people wanted the same content they got on traditional TV or shorter pieces (snacks) that took account of the limitations of handset screen size.

“A big part of our strategy is trying to find the kind of content that will help the

[mobile TV] market take off.”

[Quote from Senior VP of FremantleMedia Licensing Worldwide in (TV Business

Int., 2005)]

The following quote sums up some of the technological uncertainty and the uncertainty around content.

Despite the launch of Orange TV, trials of DVB-H and imminent commercial

launches from BT Livetime and Sky, ITV's Fell says we have yet to see what

mobile TV will look and feel like when it becomes mass market.


"I wish we could get everyone to just go with one system," he says. "Instead we're

getting competing technologies being trialled and launched and it's just so

confusing. You have 3G that clearly won't be able to handle mass broadcast, you

have DVB-H that has no spectrum, and then you've got DAB that's meant for radio,

which uses less than a quarter of the bandwidth needed for a TV channel, so

presumably you'll have fewer channels. "So these are fascinating times, but we're

far from seeing what mobile TV will look like. Who's to say it needs to look like

normal TV but on a mobile. Why shouldn't it be a mix of mobisodes, clips,

highlights and, say, a channel broadcasting what you missed last night?"

[Quote from ITV employee in (New Media Age, 2005a)]

One vision of the role of mobile TV was that it could allow viewers to catch up with missed episodes – i.e. fulfilling the role of a PVR (New Media Age, 2005a) – and experiments were used to understand whether mobile TV would cannibalize or complement traditional TV viewing.

“Before long I think we’ll see dedicated content that reflects the medium’s

strengths. It will be short-form, self-contained stories with an aesthetic tailored to

the mobile screen’s size. There’ll also be more emphasis on mobile’s interactive

capabilities.”

[Quote from Senior manager of licensing at MobiTV in (TV Business Int., 2005)]

Successful experiments with mobile TV content in the UK included allowing fans to download clips of The X Factor and one-minute episodes of the Fox TV drama 24. These offerings were linked to popular TV shows and to other mobile offerings – such as "news and gossip via SMS and MMS alerts, ring tones and wallpaper downloads, competitions, games and quizzes" (New Media Age, 2005c; TV Business Int., 2005).

The Cartoon Network's offerings were aimed predominantly at preteens. Games based on its cartoon characters were very popular on the Cartoon Network website. However, the availability of games and video content to preteens was limited by the fact that children tend to be on pre-pay mobile plans and use cheap basic handsets. The Cartoon Network on Orange TV was seen, at least by one VP, as "an effective means of reaching out to an older audience. . . . we're finding plenty of adult businessmen are more than willing to watch an episode of Scooby Doo while they wait for a train" (New Media Age, 2005b).

TV viewing patterns changed with the advent of multichannel television and the increasing popularity of the Internet. The audience for the mainstream terrestrial channels was falling (BBC News, 2001; Ofcom, 2006a, p. 259) due to competition from pay TV services, and broadcasters were concerned that fewer young people were watching television (BBC News, 2006; Ofcom, 2006a, pp. 255-258). Broadcasters and advertisers saw mobile as a possible way to stay with people when they were out and about (New Media Age, 2005c). There was uncertainty about how adverts would fit into short pieces of content – and thoughts that perhaps sponsoring channels might be more effective (New Media Age, 2005a).

Broadcasters noted significant challenges in obtaining mobile rights, and digital rights more generally, for existing content. Some commented that creating new content shot specifically for mobile was easier than renegotiating rights to existing content (New Media Age, 2005a, 2005c).


Discussion of mobile TV and video

In this section we return to the research questions presented in Table 5 of chapter 3. We use the descriptions of the emergence of mobile television and video services in the US and the UK presented in this chapter to provide a range of answers to these questions and to identify where the theoretical perspectives introduced in chapter 2 provide the most insight.

Standards creation and adoption (RQ1)

Most of the standards for the delivery of broadcast TV to mobile devices have been built upon other standards. The DVB-H standard builds upon the existing DVB series of standards developed in Europe for terrestrial, satellite, and cable television. The

DAB/DMB solution adds a limited video capability to the DAB systems originally

developed for audio broadcasting. The MBMS option is built upon UMTS mobile

communications standards. Only the proprietary MediaFLO system has been developed

from the ground up for mobile TV.

The creation of most of these standards was carried out in committee settings. The proprietary MediaFLO solution was going up against solutions with more established actor-networks. However, the supporters of MediaFLO were not simply entering a standards war with the other options. Rather, the MediaFLO business model entailed offering a complete solution for mobile network operators: the technologies, the broadcast infrastructure, the spectrum, and the content. Thus a more complete technological and operational actor-network was offered, one that reduced the technological and financial risk for operators. This proved to be attractive in the US, where the two largest mobile network operators adopted MediaFLO for their broadcast mode mobile TV services.

One UK operator adopted DAB/DMB as a pragmatic measure to get into mobile TV, as it was the only option that was both sufficiently developed and had spectrum available. However, this particular offering, whether because of poor device support or its limited content capacity, did not attract customers.

One might argue that augmenting existing standards could be explained as an attempt to extend the network externalities of existing standards by building upon well understood technologies. Many engineers would be familiar with the existing standards, and there would be semiconductor designs that could provide a starting point for developing devices for the extended standards. However, the adoption of MediaFLO in the US and of DMB/DAB in the UK shows that there are more elements to the relevant actor-network than such indirect externalities. In the MediaFLO instance the building of a near complete actor-network for mobile TV was more successful at enrolling network operators. Similarly, the availability of spectrum and infrastructure were key actors without which no broadcast based offering was going to be possible.

The institutional perspective might have some explanatory power when we consider future adoption of DVB-H in Europe if, as seems likely, the European Commission continues its support for this particular standard. It would be easy to put such a pattern of adoption down to the effect of regulative institutional pressures. However, the institutional perspective has relatively little to say about the particular outcomes seen in the US and UK at the time of writing. The actor-network perspective allows for more strategic actions based on multidimensional interests and for the role of constraints such as spectrum availability.

The unicast mechanisms for the delivery of video to handsets have proved easier to coordinate than the broadcast mechanisms – at least in the short term. The MobiTV unicast based offering was available on about 175 handsets while Verizon's MediaFLO based broadcast mode mobile TV was only available on two models (Kharif, 2007). MediaFLO's broadcast option required a separate radio receiver to be incorporated into devices since it used a different frequency band and modulation scheme than those used for wireless phone and data services. Unicasting services could rely on the data communication mechanisms that had been improving since the 2.5G wireless systems were launched. While operators and their suppliers had to allow for different screen resolutions and other handset capabilities, this proved to be more manageable than getting the industry to adopt a single format this high in the stack.

Inter-organizational coordination and relationship building (RQ2)

Despite an unpromising historical background for mobile TV, wireless network operators, broadcasters, and technology developers were investing heavily in bringing TV and other video content to the handset. The numerous trials of mobile TV broadcasting technologies and the experience with unicast services appear to have given network operators and broadcasters confidence that there were viable business models for mobile TV, or at least that it was a valuable addition to existing business models. Mobile operators saw mobile TV as a way of adding a service that could contribute to revenue growth in the face of mature markets and margin pressure on traditional services. In the short run mobile video and TV services could be a differentiator for network operators, but as with many of the innovations in the industry such a competitive advantage was likely to be short-lived.

Mobile TV also provided broadcasters with a way of addressing declines in TV viewing, particularly by allowing them to reach out to young people, among whom the decline was most pronounced (BBC News, 2006). For advertisers there was the potential to reach people with rich content during the day as well as in the evening. The network operator 3UK started to offer some video content free of charge with revenue coming from advertisers paying for slots before and after the clips – e.g. ITN was supplying a range of content using this model (O'Brien, 2007).

The overall structures of the mobile wireless and television industries were not radically changed by mobile TV. However, it did bring about new connections between these industries, whose only prior connections concerned contention for UHF spectrum. In the US intermediaries provided services to each industry. For example, MobiTV and MediaFLO both negotiated content rights and provided mobile network operators with technical solutions and content bundles. In the UK MobiTV used this model to provide the technology and content for 3UK and Orange. Thus a new type of actor was performing the translation of the interests of key actors in two industries to deepen the links between these two previously sparsely connected actor-networks.

The dominant commercial pay TV broadcaster in the UK, BSkyB, was striving to have its content available via all wireless carriers: "We're network agnostic, we're talking to all operators," said a BSkyB spokesman (New Media Age, 2007). BSkyB extended its role as an intermediary for other content providers into mobile TV (New Media Age, 2005c). Its winning of the mobile broadcast rights for Premiership football [soccer] matches in a three-year deal, after beating the competing offer by the mobile phone companies, showed an on-going commitment to mobile TV (Wray, 2006a).

Most of the large scale trials and commercial launches of mobile video and TV services in the UK involved traditional mobile network operators and traditional broadcasters. However, there are other possible arrangements. For example, a broadcaster could conceivably go to market with a mobile TV offer of its own and make it available to any network operator, or directly to customers using handsets incorporating an appropriate receiver. BSkyB trialed MediaFLO in Cambridge and in Manchester (PR Newswire, 2007). This could indicate that BSkyB would consider a mobile broadcasting service that bypasses mobile operators. This industry configuration is common enough in broadcasting, where large companies engage in content creation, aggregation, and distribution without precluding distribution of their content on competitors' platforms.

Players from the traditional TV industry were also setting up mobile portals for their own content. For example, ITV in the UK launched a mobile services portal that worked with WAP phones. Viewers initially gained access to the portal by sending an SMS to a 'short code.' Thus broadcast networks could bypass wireless network operators. ITV had an advantage over other organizations in that it was easier for it to market the service on its traditional TV channels.


Relationship between standards creation/adoption and inter-organizational coordination/relationship building (RQ3)

Analog TV standards provided a platform for the first attempts at mobile video. These were purely consumer electronics devices and there was no need for coordination with broadcasters or content providers. The actor-network building was simply the attempted enrollment of viewers. However, the technical limitations, and the clumsiness associated with using them on the move, kept pocket TVs a niche product – manufacturers only enrolled niches like sports fans and people looking for a backup for emergencies. With increasingly capable mobile devices the processing of digital video became feasible. However, technical limitations still made receiving terrestrial digital broadcast TV on handsets impractical. Computer video formats proved more suitable and were the basis for video clip and streaming services to handsets. Nevertheless the capabilities of handsets in terms of screen resolution, processing power, and storage remained dwarfed by those available on personal computers.

Actor-network building got significantly more complex when video services were targeted specifically at mobile devices, and differed depending on whether unicasting or broadcasting modes were deployed.

On the technical side of unicast services a considerable layer of complexity was introduced by having to understand the capabilities of a diverse population of handsets and transcode video material to match. In addition, mechanisms for the aggregation of content and for charging had to be deployed in the network infrastructure. On the content side the rights to use the video content had to be procured. This gave intermediaries (e.g. MobiTV) an opportunity to build networks of content providers and video delivery solutions to enroll network operators and end users. Simultaneously it built a network of operators to provide an attractive market for content providers. Since these were closed systems, proprietary solutions sufficed. Making the necessary technical connections, say to support many types of phones, was achieved using the malleability of software. This sort of flexibility is only possible with digitized video. So standardizing on the binary digit (or bit), rather than continuously varying levels, provided a platform that made these mobile TV unicast systems feasible. Data communications mechanisms in 2.5G, 3G and 3.5G mobile systems provided the flexibility that allowed mobile TV and other services to be launched without the need for extensive new physical infrastructures.
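To make the handset-diversity problem concrete, the Python sketch below shows the general shape of a device-aware transcoding step. The device profiles, model names, and parameter values are hypothetical placeholders standing in for the capability databases that intermediaries such as MobiTV would have maintained; they are not drawn from any actual implementation.

    # Minimal sketch of device-aware transcoding parameter selection
    # (all profiles and values are hypothetical).
    from dataclasses import dataclass

    @dataclass
    class DeviceProfile:
        screen_w: int    # display width in pixels
        screen_h: int    # display height in pixels
        codec: str       # a video codec the handset can decode
        max_kbps: int    # sustainable stream bit rate

    # Hypothetical capability database keyed by handset model.
    PROFILES = {
        "nokia_6680":        DeviceProfile(176, 208, "h263",  80),
        "sonyericsson_v800": DeviceProfile(176, 220, "mpeg4", 110),
        "default":           DeviceProfile(128, 96,  "h263",  64),
    }

    def transcode_params(model: str) -> dict:
        """Choose output parameters for one source video, matched to the handset."""
        p = PROFILES.get(model, PROFILES["default"])
        return {
            "resolution": (p.screen_w, p.screen_h),
            "codec": p.codec,
            # Cap near the ~110 kbit/s rate typical of the unicast services above.
            "bitrate_kbps": min(p.max_kbps, 110),
        }

    print(transcode_params("nokia_6680"))

Because this matching happens in software, supporting a new handset is a database update rather than a change to any broadcast standard – precisely the flexibility that the broadcast mode services discussed next did not have.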

Actor-network building around a broadcast mode service was similar on the content side but radically different on the technical side. Broadcast signals could only be in one format and handsets had to incorporate the necessary receiver and signal processing. A different radio frequency and a signal format requiring processing in dedicated hardware meant that the malleability of software was not available to facilitate technical coordination (or actor-network building). Additional infrastructure and spectrum were also required to support a broadcast mode. A mass market was required to economically justify investment in broadcast mobile TV (Kharif, 2007), and a mass market could only be supported by such a broadcast mechanism. Given these constraints on the possible configuration of the industry it was likely that only a small number of mobile TV standards could ever be deployed. In Europe it looked like the selection of a single standard was being performed by regulatory intervention at the European level. In the US it was unlikely that such intervention would take place, and the first-to-market strategy, heavy investment in infrastructure, and the enrollment of the two largest network operators gave MediaFLO what may well be an unassailable head start in a standards war with DVB-H in the US.

Role of initial conditions (RQ4)

Differences in current spectrum availability for deploying broadcast mechanisms for mobile TV have been behind the different market offerings in the two countries, i.e. just a few channels using DMB in the UK while 20 or so channels became available in the US using MediaFLO. Some of the differences were inadvertent, e.g. one of the companies that planned to offer DVB-H based mobile TV in the US originally purchased the UHF spectrum with the intention of offering data services. The fact that the UHF spectrum currently used for analog TV is likely to be made available for mobile TV (and other services) in the US sooner than in Europe may drive further divergence between the ways that broadcast mobile TV plays out. The historical reasons why some UK operators have unpaired 3G spectrum while US operators do not may be similarly influential.

Regulatory regimes played a large part in shaping, both intentionally and inadvertently, the TV and wireless communications industries through the allocation of spectrum and the licensing of broadcasters and wireless communications network operators. The differences between the regulatory regimes in the US and the UK may well have a large influence on how mobile TV develops on each side of the Atlantic. Despite the UK regulator's disinclination to be involved in specifying particular frequencies or standards for these or other services (Wynn, 2006), its treaty obligations with Europe may well dominate.

The European Commission's interests in European harmonization could be considered a normative or cognitive-cultural institutional force that underlies its regulative expression in its more active role in the selection of standards in the telecommunications and broadcasting industries. However, it is equally well captured by the actor-network perspective's conceptualization of harmonization as one of the Commission's on-going interests that shape its strategic actions.


IX. Discussion and Conclusions

The case studies presented in the preceding four chapters have covered a good deal of

ground. Chapters 5 and 6 start at the dawn of radio in the nineteenth century and take us through the development of first, second, and third generation cellular systems in the US,

the UK, and much of Europe. Chapter 8 looks at a specific set of new services, mobile

TV and video, that became feasible with the advent of third generation wireless systems.

These new services brought the television and mobile wireless industries into contact with one another. To provide the necessary context the history of the television industry is itself examined in chapter 7 along with its convergence with the telecommunications industry.

Revisiting the research questions

At the end of each chapter the findings from the event based case studies, and in the case of chapter 6 interview data, were used to address the research questions posed in the introduction (chapter 1) and elaborated in chapter 3 (see Table 5). In this final chapter we revisit these research questions for the last time, examine what can be learned from looking at the findings across the cases, and contrast the usefulness of the economic, institutional, and actor-network based perspectives in conceptualizing what has happened across these industries over several decades.


Research Question 1: How does technical standards creation and adoption play out in the construction of large scale information systems?

The case studies in earlier chapters focused on the creation and adoption of air-

interfaces for three generations of mobile wireless systems and several generations of TV

broadcast formats. They provided examples of market standards (cdmaOne), committee

based de jure standards developed within SDOs (e.g. GSM in the latter phases of its

specification within ETSI), and those developed by industry (e.g. NTSC) that became mandatory. There were examples of under-standardization (e.g. AMPS's lack of an inter-

network interface to facilitate roaming). So there is some evidence from the cases that the

categorizations presented by economists are relevant. However, categorization of

standards can be difficult and change over time. For example, the creation of the GSM

standard started out in the CEPT at the instigation of European PTTs. As the standard

creation setting transitioned to ETSI (an SDO) the involvement of industry increased.

Finally, it competed in the global marketplace for adoption by network operators around

the world.

Similarly, the worldwide adoption of NMT and later AMPS in the 1G era, and

GSM/UMTS in later eras supports the general contention that network externalities

influence standards adoption decisions. Indirect network externalities such as economies

of scale in equipment production, and to a lesser extent the direct network externalities of

worldwide roaming were certainly key considerations in technology adoption decisions.

However, the cases also highlighted other dimensions to these decisions that are not

accounted for by network externalities. The desire to support domestic suppliers

dominated PTT/government decisions in most of the large European countries in the 1G


era and in Korea for 2G. The UK's market driven adoption of a modified version of AMPS to benefit from equipment availability and economies of scale was very much an outlier in Europe at the time. Also the adoption of a range of 2G standards in the US (D-AMPS, cdmaOne, GSM, and iDEN) provided operators the opportunity to base their choices on criteria most aligned with their own interests, envisaged futures, and strategies. Each option provided some advantage relative to the others (such as ease-of-upgrade, higher capacity, economies of scale in equipment, or service differentiation). Self-evidently no single standard was best for all operators.

The wireless and television industry cases also covered the creation of technologies, some of which became de facto or de jure standards. Formal economic models do not

offer direct insight into this phase of innovation, although the prospect of having a proprietary de facto standard widely adopted certainly provides innovators with strong incentives (Shapiro & Varian, 1998).

The cases offer a wide range of settings for standards creation. In 1G cellular the

innovators in Scandinavia and at AT&T created embodiments of an idea that had been

conceived of decades before its implementation was possible. In Scandinavia the

relatively high adoption of non-cellular radio telephone services, and a tradition of

cooperation in telecommunications, led to inter-PTT cooperation in the creation of the

NMT standard despite the marginal profitability foreseen (Mölleryd, 1999, pp. 85-87).

The interests of the government owned PTTs in large European countries were tied to

industry policies aimed at supporting domestic manufacturers.

The key driver behind the instigation of GSM standardization in 1982 was a widespread interest among national telecom administrations in coordinating use of the 900 MHz band. This was before most of Europe even had operational first generation cellular systems (the Nordic countries and Spain were exceptions). The early start was spurred by fears of a Franco-British collaboration thwarting the realization of a future pan-European system.

The UK later played a focal role in stabilizing the actor-network around GSM by enrolling regulators and operators in semi-binding commitments (the GSM MoU) that reduced the risks of developing or adopting the standard. These actions also provided the basis for a large expected market size that drove adoption decisions elsewhere around the world (Funk, 2002). The role of the European level institutions cannot be ignored.

Through a range of measures the European Commission supported the creation of GSM and formed a standardization forum more suited to its development. Its actor-network building efforts (e.g. issuing directives on the allocation of frequencies and the issuing of mobile licenses) helped ensure that no alternative 2G standards were deployed in Europe.

European level institutions also played a central role in the creation of the UMTS standard.

The strong central role of the regulatory regime in Europe is in stark contrast to the hands-off approach practiced in the US. The FCC avoided the selection of 2G and 3G standards and was not involved in their creation. The creation of a single 1G standard (AMPS) was simply a result of there being only one major US telecom innovator that had a vision of such a system (AT&T). It should also be noted that the FCC not getting involved in standards creation or selection did not mean that the government more generally had not played a role. Qualcomm's competitive advantage in the understanding of spread-spectrum communications, and the CDMA multiple access schemes built on it, came from project work for the US military.

It can be argued that the US hands-off approach worked to the detriment of the US in the development of 2G and 3G standards. However, it also gave space for broader innovation in wireless, such as the development of CDMA technologies and the

launch of the iDEN proprietary system that addressed some users’ requirements more fully – perhaps indicating that GSM represented a sub-optimal over-standardization (see

Table 1). The adoption of CDMA as the base technology of the main 3G air-interfaces

(cdma2000 and UMTS) provided a sort of victory for the US in 3G, and for Qualcomm

with its large portfolio of IPR pertaining to the technology. The diversity of approaches

to standards creation in the US wireless industry and the role of the regulatory regimes in

Europe argue against perspectives from economics that reduce standardization to

selection among a few alternatives. In the US the alternatives were shaped specifically by

the differing network operator interests, which included backward compatibility, network

capacity, handset availability, and time-to-market considerations. In the case of CDMA

for example, Qualcomm had a stronger network of connections with the technology than

other innovators and had strong interests in commercializing it. In turn it translated the

interests of various operators (e.g. to increase network capacity, or offer higher quality

speech) to align them with its own interests in commercializing CDMA. It used this

growing actor-network of US operators to enroll manufacturers' support for the

technology. Thus, in the US the creation and adoption of wireless standards was shown

to be a dynamic and interactive process among different sorts of actors: technological,

market, regulatory, innovators, and existing standards.


Adopting the actor-network perspective allows the researcher to include the role of network economics as used by actors in formulating their strategies and actions. It also allows the researcher more flexibility in assigning the wider range of interests and beliefs that actually informed actors' strategy formulation and resulted in the dynamic inter-relationships among technological and organizational actors.

The strong role of the European Commission in 2G and 3G standards creation and adoption could lend credence to an institutional perspective. The Commission and other European level institutions certainly used coercive mechanisms (legislation and directives) to bring about the outcomes they sought. One could also appeal to deeper normative or mimetic mechanisms being deployed – such as concepts like regional harmonization and being a good European being taken for granted or being part of some regional orthodoxy. However, one would have to be fairly careful in the choice of 'institutions,' as there were major upheavals in the way that the European states coordinated with one another over the period examined, e.g. expectations of the appropriate level of competition in the telecom industry, and just what parts of national sovereignty were ceded to European bodies. Many of these 'institutions' had only existed since the Second World War and would have to be considered somewhat too young to have been fully bedded into Europe, as various referenda on European level issues attest. In addition, the institutional view would require the researcher to select different 'institutions' for the US, perhaps a deep-rooted belief in the wisdom of the free market. While there is undoubtedly some explanatory power in these concepts, the researcher is left with more flexibility if these underlying interests and beliefs can be attributed to various actors developing their own strategies. Thus the Commission's interests in harmonizing European markets and


standards, and its goals to promote competition in the telecommunications industry, can be allowed to change, as they appear to have done, rather than be left as unchanging

institutional logics. Even allowing for the evolution of institutions over time does not

provide more insight than the ANT perspective of the enrollment with, and defection

from, temporarily stable or black-boxed actor-network configurations. The institutional

perspective is also unable to account for actors’ strategy formulation and action based on

short-term interests or the role of technological actors.

The concept of path dependency of technology choices appeared again and again throughout the cases, from the 50/60 Hz television frame rates dictated by prior choices on electricity generation to the migration paths from AMPS to D-AMPS, cdmaOne to cdma2000, and so on. Again, this conceptual adjustment to classic economics to allow for the influence of past decisions can be readily incorporated into an actor-

network based analysis. The socio-technical actor-networks include the configurations of technological actors as well as human actors in the conceptions of existing and potential future network configurations. Considering the formulation of actors’ strategies as starting from their perception of current heterogeneous actor-network configurations thus subsumes the concept of path dependency.

Fixed network operators coordinated globally in the specification of DSL and cable modem standards. The configuration of the telecom industry in most geographies meant that there was usually just one telecom company with copper pairs entering domestic residences, and only one cable company with a coax connection. This in turn was linked to the economics and technical properties of these infrastructures. Given this configuration the telecom operators had no interest in competing with one another to offer differentiated versions of DSL. Similarly, cable companies had no interest in competing with one another in cable modem technology. Telecom and cable companies competed with one another but were able to leave technical innovation to suppliers. Operators could insist that new developments were incorporated into standards to ensure that economies of scale could be realized. While network economics adequately explains this standardization outcome, the actor-network perspective allows the researcher to explicitly include the configuration of competition not only between individual companies, but also between access technologies. The actor-network perspective also supports explanations of how the history of regulatory and innovative actions combined with the economic and other characteristics of technologies to produce the industry configurations that could be explained by network economics.

In the case of television the regulatory regimes and other actors saw the creation and adoption of single standards for terrestrial broadcast TV as essential. While details varied, the regulatory regimes on both sides of the Atlantic used industry committee mechanisms to arrive at national or regional standards that were mandatory for terrestrial broadcast television. These standards were also widely used by cable and satellite broadcasters. However, even in this arena standards selection was not always straightforward. Sky in the UK chose an old standard (PAL) in place of a new one (D-MAC) with undeniably superior picture quality that was supported by the UK and European regulatory regimes. Doing so gave it time-to-market and network capacity advantages, albeit at the risk of incurring the wrath of regulators. This adoption of an older technology contrasts with some US mobile wireless operators risking the adoption of the innovative cdmaOne technology – but both sets of marketplace actors were using their standards selection decisions to differentiate their service offerings.

A single unicast mobile TV standard has yet to emerge. This outcome can be attributed to the greater malleability of modern electronics under the control of software.

While the fragmentation of video formats for the handset, and on the PC for that matter, results in some inconvenience, it is not sufficient to overcome the interests of companies

backing their own standards and visions of the future.

On the other hand, broadcast mobile TV requires more standardization, as a dedicated

receiver with appropriate signal processing capabilities has to be incorporated into mobile

devices (this constraint may be overcome by software defined radio technologies at some

point in the future). The standards for broadcast mobile TV were derived from other broadcasting and radio technologies (e.g. DMB, DVB-H, MBMS). The European

Commission again decided to back a European developed option, DVB-H. The only mobile TV standard developed from the ground up, the proprietary MediaFLO, has been deployed in the US on a former UHF television frequency that was auctioned and made available ahead of the full digital TV switch over planned for 2009. The way that standards creation and adoption has played out parallels the 2G mobile wireless standards battles (MediaFLO, the company, is even owned by Qualcomm). The only surprise is that the US regulatory regime has not formally objected to the arguably anti-competitive behavior of the European Commission picking the winning technology for mobile TV.

As with 2G and 3G wireless standards, broadcast mobile TV standards have been influenced by pre-existing technologies and the more interventionist-minded regulatory regime in Europe.


The creation of standards across all the cases is summarized in Table 15. The largest

contrast between the UK/Europe and the US was the greater tendency of the European

regulatory regime to shape some standardization outcomes through funding of R&D and

regulatory action. Over time there has also been a greater tendency for mobile wireless and other radio related standards to become global. However, the arrival of standards from the computing industries at higher levels of the stack on handsets and other devices has somewhat reduced the influence of the regulatory regime in standards making – the

need for the coordinated use of scarce radio spectrum gave it the potential for a more

focal role in the standardization of interfaces lower in the stack closer to antennas.

From a theoretical standpoint the actor-network perspective discussed in chapter 3

provides a richer way of conceptualizing the creation and adoption of standards in the

construction of large scale systems than network economics or institutional pressures

alone.


1G and 2G Mobile Wireless (Chapter 5)

US
• US adopted single 1G standard – AMPS
  - AMPS developed by AT&T – the dominant telecom operator and main innovator in the industry (Bell Labs)
  - AMPS selected by FCC as it was the only proposal submitted
  - AMPS adopted by all US operators
  - Under-standardization, i.e. lack of inter-operator networking, rectified by US industry creating IS-41
• US adopted multiple 2G standards
  - FCC stays out of standards selection
  - Several options developed by US industry (cdmaOne, D-AMPS, iDEN)
  - Operators made own standard selection

UK / Europe
• Europe adopted many 1G standards
  - UK selects variant of AMPS (TACS) to realize economies of scale
  - Large European countries develop national standards (to support domestic manufacturers)
  - Scandinavian countries collaborated in creation of NMT standard
  - Small European countries adopted NMT or TACS to benefit from scale economies
• Europe adopted single 2G standard
  - GSM initiated in response to fear of losing opportunity to coordinate use of 900 MHz band
  - Commission supported GSM and used it as a tool to spur telecom competition
  - Adopted by all European operators

3G Mobile Wireless (Chapter 6)

US
• US adopted multiple 3G standards
  - FCC stays out of standards selection
  - Upgrade paths for US operators exhibit path dependence on 2G choices
  - Multiple industry forums used for standardization
  - Development path for D-AMPS merges with that of GSM
  - Transition to 3G starts ‘in-band’
  - cdmaOne operators faster to market with 3G options
  - US regulatory regime defends the market based approach at the diplomatic level
  - Standardization globalized (ITU, 3GPP, and 3GPP2)
  - Standards battles move up the stack

UK / Europe
• Europe adopted single 3G standard
  - European funded R&D into 3G technologies
  - Single SDO forum (ETSI) used for standardization – with most work undertaken by manufacturers
  - Single air-interface selected after intra-European and Euro/US standardization battles played out inside and outside ETSI
  - Transition to 3G starts in new frequency allocations
  - UK part of overall European standardization efforts

Traditional TV and telecom (Chapter 7)

US
• US adopted mandatory standards for terrestrial broadcast TV
  - Committees established by FCC to make recommendations (e.g. NTSC)
  - Industry submitted proposals for monochrome, color, and digital TV standards
  - FCC made standards mandatory
  - Satellite, cable, IPTV, and fiber based TV system operators adopted various industry and proprietary standards
  - Digital TV builds on international standards (e.g. MPEG2)
  - Key cable modem / DSL standards international

UK / Europe
• UK adopts mandatory standards for terrestrial broadcast TV
  - Regulatory regime selected between options provided by industry for monochrome TV
  - Adopted color TV standard common in Europe (PAL)
  - Pan-European digital TV standard adopted (DVB-T)

Mobile TV (Chapter 8)

US
• US free to adopt multiple mobile TV standards
  - Multiple unicast and multicast technologies trialed and launched in US

UK / Europe
• Europe looks likely to select DVB-H for broadcast mode mobile TV
  - Multiple unicast and multicast technologies trialed and launched in UK
  - European Commission puts its weight behind the DVB-H standard

Table 15. Summary of findings across cases for Research Question 1

Research Question 2: How do organizations build their relationships and coordinate with one another and with technology during the construction of large scale information systems?

The third research question, which we address presently, examines the interrelationship between standards creation/adoption (RQ1) and the relationships among actors (RQ2). First, we examine the nature of the relationships among heterogeneous actors during the construction of large scale systems. The case studies focused on the structure of relationships in the wireless and TV industries including their technical,

regulatory, and commercial aspects. The cases provided a range of industry structure

configurations that changed radically over the decades.

First generation cellular

Cellular services in the US were competitive right from the start. Rather than

extending AT&T’s near monopoly on telecom services and equipment the FCC ensured

competition by allocating licenses and spectrum to new operators in addition to existing

wired network operators. The launch of cellular services (1983) roughly coincided with the breakup of AT&T (1984) and the introduction of more competition in the telecom sector. As the cellular business ended up in the Regional Bell Operating Companies (RBOCs) the US did not have a national cellular network. From their inception cellular network operators had commercial objectives, and innovation in the marketing of cellular services, such as offering a range of tariffs, allowed operators to address more of the market and enroll more customers. Up to the breakup of AT&T most technological innovation in the US telecom industry came from Bell Labs (Fransman, 2002).

The initial industry structures in most of Europe were comprised of government

owned monopoly PTTs that provided all telecom services. Regulation, say of spectrum

usage, was also part of the PTT or government. So, telecom industry coordination was

essentially an internal planning exercise. During this era most of the technological

innovation in the telecom sector in large European countries came from PTT laboratories

(Fransman, 2002). Telecom equipment manufacturers in these countries were national champions supported by the government owned PTTs. Thus, innovation, regulation, and network operation were tightly connected and centralized (Fransman, 2002). The introduction of 1G cellular in large European countries was slow and was not driven by strong commercial interests.

The development of the NMT standard was led by the PTTs of the Nordic countries.

The multinational nature of the effort restrained the PTTs from working solely with

national champions. This laid the foundation for a more scalable and cost-effective 1G

specification than those produced in other European countries.

The UK telecom industry’s structure was essentially the same as its European peers

until the early 1980s when the government instigated a program to privatize the telecom arm of the PTT and introduce competition into the industry. The introduction of cellular in the UK was part of this program and the UK was the first to introduce effective competition in the provision of cellular services in Europe. Suppliers to the commercially

oriented cellular operators were foreign owned companies, Motorola and Ericsson, which


had expertise from AMPS and NMT deployments respectively. By this time the regulatory regime was quite separate from the wireless and fixed telecom operators.

In summary, the actor-network configuration of the wireless industry at its inception was a reflection of national government policy for the wider telecom industry. In the US and UK cases the cellular industry was at the forefront of the introduction of competition in the telecom industry. Operators’ relationships with suppliers and customers were commercially driven. On mainland Europe cellular was incorporated into an actor-network centered on the government owned PTT monopoly and its non-commercial relationships with suppliers and customers.

Second generation cellular and PCN/PCS services

The UK cellular operators’ upgrades to second generation GSM technology took place without major changes to the wireless industry’s structure. The regulatory regime in the UK did introduce more competition by releasing new spectrum at 1,800 MHz and licensing so called PCN operators. Because the PCN operators were relatively late to market, their coverage took years to approach that of the established operators, a gap compounded by their use of 1,800 MHz spectrum, which required more infrastructure than 900 MHz spectrum to cover the same area. This led the PCN operators, at least initially, to focus more on consumers than corporate customers, since wide coverage was not thought to be as critical for consumers. PCN operators drew upon the innovative marketing of mobile wireless services pioneered in the US to reach the consumer market.

Second generation cellular technologies brought little change to the structure of the

US wireless industry either. The switch to 2G was primarily an in-band upgrade exercise


(using either D-AMPS or cdmaOne). The government introduced additional competition when it released PCS spectrum in the 1,900 MHz region and licensed more operators, a move inspired by the UK’s licensing of PCN operators. The process of licensing these new PCS operators was a long, drawn-out procedure in the US, reflecting how incumbents could use the consultative process and litigation to impede the introduction of competition. As in the UK the PCS operators’ situation led them to focus on the consumer market segment. Nextel used the proprietary iDEN technology and non-cellular/PCS spectrum to address the needs of a particular segment of the mobile communications market that was not particularly well addressed by cellular/PCS offerings.

The biggest structural changes associated with the introduction of second generation technology occurred on mainland Europe. The European Commission used its support of the GSM standard to promote the introduction of competition in mobile communications, including the licensing of new operators in the 1,800 MHz band pioneered by the UK.

Another notable change on both sides of the Atlantic was the reconfiguration of the wireless and wider telecom innovation systems. In the US the innovations associated with the various 2G air interface offerings were developed solely by equipment and semiconductor manufacturers. In Europe manufacturers took an increasingly central role in the creation of the GSM specification. This paralleled the opening of the innovation system in telecom more generally. Before liberalization telecom industry innovation was tied to the central development laboratories of incumbent operators (e.g. AT&T’s famous

Bell Labs and BT’s Research Labs at Martlesham) and their suppliers. New competitive operators relied on the R&D carried out by suppliers and by the end of the 1990s the


incumbents were performing little R&D. The bulk of the R&D in the industry shifted to global suppliers like Cisco, Ericsson, Fujitsu, Lucent, NEC, Nokia, and Nortel

(Fransman, 2002).

In summary, the actor-network configuration was again dominated by the imagined futures of the regulatory regimes. For the US and the UK these were largely consistent with those that prevailed at the introduction of first generation services. On mainland

Europe the relevant regulatory regime had shifted to the European level. This regime used mobile services to support the introduction of competition much as the US and UK had done with first generation services. The trend of increased competition in the wireless industry has continued on both sides of the Atlantic with the launch of several mobile virtual network operators (MVNOs) that offer services that leverage their own brands while relying on another operator’s radio network (e.g. Virgin Wireless operates in the

US, the UK, and other countries where it has a strong brand presence).

The service offerings of 2G based systems were still modeled largely on their fixed telecommunications counterparts (i.e. telephony, fax, and low speed circuit switched data). The main exception was the provision of the SMS text messaging capability that was incorporated first into GSM and later into the other 2G standards. Although fixed and mobile telecom networks became digital their architectures still predominantly followed the circuit switched paradigm of the preceding analog era.

Third generation and data services

The switch to third generation mobile wireless technologies brought more significant changes to the structure of the wireless telecom industries. More accurately, it was the switch to packet based data services that loosened the connections inscribed in previous

circuit switched architectures. The new flexibility offered by the ‘always on’ packet

based data services, along with the increasing malleability of the software driven mobile

devices, allowed new types of services to be offered on mobile devices. These included gaming, location based services, music, images, video, m-commerce, and web access

in various forms. This paralleled the changes in the provision of interactive services on fixed networks. Internet access, which uses the TCP/IP packet based protocols, was offered to homes and business using a range of access technologies built on familiar physical infrastructures (i.e. copper pairs, coax cable, and increasingly optical fiber).

This new flexibility and the wider range of services brought the wireless industry into contact with the content and computing industries. Connections were made between actors and groups of actors that had little or no prior interrelationships. Since this increase in the complexity of actor-network building around wireless services, the configuration of industry structures has been in flux. Network operators’ initial approach was often to use their networks and walled gardens as obligatory passage points for both content providers and consumers. However, the uncertainty around the mix of services that would enroll customers spurred experimentation in service offerings and commercial relationships.

One trend in the mid-2000s was for wireless operators to loosen their control of actor-network building. For example, open access to the content of the wider Internet became more common. This trend of loosening restrictions has continued; at the time of writing AT&T Wireless in the US and O2 in the UK have allowed Apple to launch the iPhone and a range of Apple branded services on their networks.


In parallel, regulatory regimes allowed innovation on radio interfaces using

unlicensed spectrum. The most prominent are the 802.11x series of interfaces (also

known as Wi-Fi) developed by the computing industry within the IEEE. Wi-Fi initially

supported Internet access from hotspots, typically using laptop computers. Wi-Fi

capability has been incorporated into traditional mobile wireless handsets and other

mobile devices. Its deployment has gone beyond hotspots to include whole city districts

in both the US and UK. The traditional wireless network operators also adopted Wi-Fi in

a couple of new ways. First, using hotspots as an interim alternative to upgrading to

broadband 3G technologies until technological and market uncertainty was resolved; second, using Wi-Fi (or Bluetooth) as an alternative access technology for voice traffic at consumers’ homes. This latter use fulfils visions of fixed-mobile convergence that had existed for over twenty years. More generally Wi-Fi allowed some level of data

communications mobility without requiring a commercial relationship with one of the

traditional wireless network operators.

The innovation system in wireless was further widened as the handset became a

mobile computing device. Companies from the computing industries’ innovation system

offered operating systems, application platforms, browsers, and various other software

technologies. Manufacturers and operators coordinated with these new innovators to

incorporate these technologies into handsets and wireless infrastructure.

The complexity of, and the size of investments in, wireless infrastructure have meant

that operators and infrastructure manufacturers’ relationships with one another are long-

term and multifaceted. The relationships typically last decades, can span generations of

technology, and the operators and manufacturers often coordinate their standardization


activities. In contrast the operators’ relationships with device manufacturers are shorter term and focused on current and upcoming devices (Funk, 2002). However, handset manufacturers can develop connections with customers (e.g. through particularly attractive handset models) that are impossible for the manufacturers of ‘invisible’ infrastructure.

In chapter 2, transaction cost economics (TCE) and Porter’s five forces model (P5F) were introduced as possible tools for understanding the relationships among organizations. Both perspectives were criticized as being static. Porter’s approach to competitive strategy focuses on identifying and defending a profitable market position and TCE focuses on firm boundaries in otherwise fixed value systems. By treating technology and standards as exogenous these perspectives only consider their impact on limited aspects of industry structure. They are unable to consider the influence of the industry or wider society on technological development and standards creation.

The summary of the case findings presented above provides ample evidence that any theoretical perspective that looks to explain the structure of the wireless or television industries must incorporate the actions of the regulatory regime and the scarcity of radio spectrum. Regulatory regimes defined the terms of the competitive playing field for these industries, and the extent to which this was possible was constrained by the spectrum made available for licensed and unlicensed uses. Of course the various participants in the wireless industry, the TV broadcasting industry, the military, the emergency services, and other users of UHF spectrum also shaped spectrum allocation and the actions of the regulatory regime through lobbying, participation in consultation exercises, and litigation. The central role of the regulatory regime was paralleled in the fixed telecom / cable business where the services that could be offered were closely regulated (e.g. for many years telcos could not offer video services and US cable companies could not offer telephony).

The cases also provide examples where actors changed industry structures directly.

For example, Sky in the UK built an actor-network around the configuration of international coordination procedures for geostationary satellites, the transmission properties of these satellites, and the properties of an older standard, to deliver pay TV services to the UK market. In this case Sky contested the role of focal actor and bypassed the UK regulatory regime’s obligatory passage points of licensing and spectrum allocations for satellite TV. In the US Nextel became a significant participant in the wireless industry through innovative actor-network building using a proprietary

technology and the deployment of spectrum envisaged for other uses.

The actor-network based perspective allows for the varied interests of commercial

and non-commercial organizations and the characteristics of technology, spectrum, and

standards to be incorporated into models that explain the dynamic interrelationship

among them. With the switch-off of analog television extremely attractive UHF spectrum

(~700 MHz) will become available for other applications. This will allow existing and new types of actors to enroll wireless spectrum in potentially new ways, again changing the structure of the industry. Press reports at the time of writing, for example, indicate that search and advertising giant Google was bidding for this spectrum in the US.

The different ways in which organizations and technologies built their relationships with one another in each of the cases studied are summarized in Table 16. The most significant differences in the ways that this was achieved in Europe and the US included the greater role of the market in selecting wireless standards in the US, although both countries created mandatory standards for terrestrial broadcast television. The US tended to issue television and mobile wireless licenses for small geographical areas rather than national licenses. The timing of the release of spectrum for wireless services differed, resulting in 2G and 3G upgrades occurring in-band in the US. The increasing distribution of telecom R&D was common to both, as was the widening of the types of actors involved with wireless (see Figure 29). The US and UK share the distinction of being at the forefront of introducing competition in fixed and mobile telecom services. Some technological changes affected both countries in similar ways (e.g. digital switching and VCRs), while others (e.g. satellite broadcasting) acted at different times with contrasting impacts.


1G and 2G Mobile Wireless (Chapter 5)

US – First and second generation
• Regulation by FCC
  - Duopolies in geographical markets
  - Licenses reserved for wireline operators allocated to RBOCs
  - No national cellular network
• Innovation system
  - Initially centralized (AT&T Bell Labs)
  - Later more open (e.g. IS-41)
• Operators consolidate
• PCS operators bring more competition
• Operators develop long term relationships with infrastructure manufacturers
• Relationships with terminal manufacturers not as deep
• UHF transistors, ICs, and digital switches make cellular telephony feasible

UK / Europe – United Kingdom
• Cellnet and Vodafone duopoly
• National networks
• Adopts US technologies
• Higher adoption than non-Scandinavian Europe
• GSM from pan-European collaboration
• New PCN operators
Rest of Europe
• National 1G cellular networks offered by PTTs – only widely adopted in Scandinavia and Switzerland
• GSM collaboration used by EC to introduce telecom competition

3G Mobile Wireless (Chapter 6)

US – New digital capabilities
• Operators upgrade in-band
• Multiple standards selected
• Continued consolidation of network operators leads to fewer, larger operators
• Exploration of new services to exploit new ‘always on’ digital capabilities
• Digital upgrades left to operators
• R&D accomplished by equipment suppliers
• Mobile telephones take on characteristics of mobile computers
• Deeper connections with the computing and content industries
• Standards creation becomes more global

UK / Europe – United Kingdom
• New spectrum for 3G
• Single standard selected (UMTS)
• Auctions prove costly for UK operators
• One new operator (3UK)

Traditional TV (Chapter 7)

US – Growth of terrestrial broadcast TV
• Government allocates VHF and UHF spectrum for TV
• FCC licenses TV stations with local coverage
• ‘TV Networks’ grow from existing radio networks
• TV stations affiliate with ‘networks’
Other TV sources
• Cable TV pushes for wider role
• Satellite supports distribution of ‘Cable Networks’
• VHS tapes and later DVDs provide alternative for movie distribution
• Competition from computer based services – OPPs for content providers
• Switch to digital TV to release spectrum for other purposes

UK / Europe – TV in the UK
• Publicly owned BBC provides national broadcast networks
• Regional and national commercial networks licensed over the decades
• Satellite (BSkyB) becomes dominant mechanism for distribution of pay TV

Mobile TV (Chapter 8)

US – Mobile TV and video on handsets
• Unicast models offered by many operators
• FCC auctions some UHF spectrum suitable for mobile TV broadcast
• FCC does not get involved in specifying broadcast standards
• Both open and proprietary broadcast mobile TV standards are developed, trialed, and deployed
• Broadcast mobile TV requires standards and spectrum
Mobile video predominantly via handsets
• Consumers already carry mobile phone handsets – a platform for new solutions
• Unicast offerings from aggregators

UK / Europe – Mobile TV and video on handsets
• Unicast models offered by many operators
• Delays in making new spectrum available
• Both open and proprietary standards are trialed and deployed
• European Commission backing DVB-H as broadcast mobile TV standard

Table 16. Summary of findings across cases for Research Question 2

Research Question 3: How does standards creation and adoption interact with the

ways that organizations build relationships and coordinate with one another and

technology? In other words, how do standards interact with industry structure and

technical infrastructure?

The effects of standards captured in the economic literature were categorized under compatibility, minimum quality, variety reduction, and informational (discussed in chapter 2 and summarized in Table 2). If we go back to the AMPS first generation cellular standard we observe that it provided the compatibility essential for the operation

of cellular systems, and allowed firms to specialize in part of the overall system (e.g.

mobile terminals). Its variety reduction effects included economies of scale in equipment

production and the provision of a platform for further development. The minimum

quality and informational effects were perhaps less relevant.

Much of the research that led to this categorization of technical standards and the

identification of their effects was derived from studies of the telecommunications

industry (e.g. David & Steinmueller, 1994). So, it is not surprising that one can map the

impacts of standards to these categories. And with a little more consideration one could,

at least some of the time, make out the links between the actions taken by various actors

and the effects that they were striving to achieve.

It would not be particularly interesting to exhaustively map observed standardization

effects to those already categorized. Suffice it to say that the overall thrust of this

previous work is supported. Instead this section first focuses on a few particularly

important effects that shaped or changed the wireless or television industries, go beyond those previously cataloged, and illustrate the tight coupling between the effects of


standards and industry structure. These examples provide a basis for the synthesis of the

actor-network based loop model of actors’ strategy formulation (Figure 2) and Lyytinen

& King’s (2002) analytical domains (the triangle framework illustrated in Figure 3). This is followed by a discussion of a more fundamental observation on the effect of

digitization (standardization on 0s and 1s) on the structure of the telecom, wireless, and

television industries.

Interrelationships between standards and the wireless and television industries

Compatibility is provided by just about all technical interfaces, although what is made

compatible varies considerably. For example, air interface standards provided the

technical compatibility for mobile terminals to interoperate with wireless infrastructure.

Additionally, roaming was supported if compatible inter-network interfaces were also

deployed. Higher level interface standards, text messaging and telephony for instance,

allowed interoperable services to be offered irrespective of the particular underlying radio

and network interfaces. The precise location of a technical interface standard within the

existing configuration of technical and organizational relationships changes its effect

considerably. As we already covered in the discussion of the preceding research

questions, different 2G wireless interfaces had idiosyncratic characteristics, such as

variance in characteristics on several dimensions meant that one option was more

attractive to some US network operators than others given their current and envisaged

positions in the wireless industry actor-network. However, these options were not at all

exogenous. They were created to fulfill the perceived needs of operators with differing


requirements. The creation and selection of technical standards cannot be readily

separated. The heterogeneous engineering, or actor-network building, of network

operators and manufacturers was tied together.

One effect of the single AMPS standard developed by the centralized innovation

system in US telecom was that it facilitated the consolidation of the wireless industry that

started out so geographically fragmented. By the time the US wireless industry addressed

the creation of an inter-network standard to facilitate roaming, the US had a more distributed

innovation system. Operators and manufacturers coordinated with one another in the creation of IS-41 through industry associations (CTIA and TIA). These core standards facilitated the consolidation of the industry and the coordination of its participants for mutual benefit. These sorts of effects were not as relevant in the early days of the European industry since national licenses were generally issued. Idiosyncratic frequency assignments and standards selections technologically constrained cross-border compatibility (except for Scandinavia).

Second generation systems reversed this pattern of standards fragmentation as Europe settled on one standard, while the US fragmented across cdmaOne, D-AMPS, iDEN, and

GSM based standards. The uncertainty about the success of any particular standard in the

US was a contributing factor to GSM’s wide adoption around the world.

The GSM standard was used as a tool by the European Commission to introduce competition into the mobile wireless industry across Europe. Actions, like the separation of the regulatory function from the operation of telecom networks, were necessary to support this. These changes in the relationships between incumbent network operators and the national regulatory regimes helped set the stage for the liberalization of the wider


European telecom industry. This can be contrasted with the French, German, and Italian

national 1G standards that reinforced existing PTT centric industry configurations.

These episodes from the 1G and 2G cases illustrate that the outcomes of standardization, technical standards, can bolster existing actor-network configurations, building upon them with minimal impact upon what was already there. On the other hand they can be tools for actors to bring about change, such as industry consolidation or market liberalization. The creation of the standards themselves occurred within evolving actor-network configurations incorporating industry structures, existing standards, and technologies. Expectations concerning the diffusion of standards can become self-fulfilling by shaping actors’ envisaged futures.

The success of Nextel’s deployment of the proprietary iDEN system provides evidence that the GSM and other cellular interfaces with similar functionality reduced variety to an extent that did indeed lower the utility for some potential wireless users. An alternative vision of how a voice service could be used by customers allowed a small operator to become a significant competitor in what was already an extremely competitive industry. Similarly, the characteristics of the tried and tested PAL standard were a key part of the heterogeneous actor-network building that allowed Sky to reconfigure the UK’s television industry. So, on occasion focal actors can identify novel approaches to heterogeneous engineering, which allows them to bring about more radical change in the configuration of the actor-networks, rather than simply replicating existing configurations.

The market position of attacker wireless operators (i.e. not affiliated with incumbent telecom providers) spurred them to include SMS text messaging traffic within the GSM


MAP signaling channels – otherwise they would have been forced to invest in additional

data networks. This inadvertent easing of inter-operator connectivity and SMS roaming accelerated the network effects behind the take-off of SMS, which was, at least initially, not connected to other types of communication networks. Certain elements of the actor-network configuration (communications network characteristics and operators’ commercial arrangements) shaped this standardization outcome, an outcome that had wider implications than originally envisaged.

The ways that differences in approaches to standardization affect industry relationships can be subtle, and can extend beyond the actors the standards were originally intended to coordinate.

For example, both the cdmaOne and GSM air-interfaces provided the requisite compatibility between handsets and infrastructure. However, unlike cdmaOne handsets, GSM handsets are type approved for use on all GSM networks, assuming that the necessary frequency bands are supported. With GSM mobile terminals a subscriber identity module (SIM) provides the mapping of the customer to the device and the corresponding account with an operator. These differences in the specifications tied to air interfaces give GSM handset manufacturers the ability to market devices directly to customers and give them more power in their relationships with GSM network operators. So, while an air-interface is primarily required to coordinate technology, its secondary characteristics affect the configuration of non-technological industry relationships.
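The decoupling the SIM creates can be pictured in a few lines of Python (my own illustration, not drawn from any specification in the study; the class names, the IMSI value, and the operator name are hypothetical): the subscriber’s identity and operator account travel with the card rather than with the handset.

    class Sim:
        """The card holds the subscriber identity and the operator account."""
        def __init__(self, imsi: str, operator: str):
            self.imsi = imsi
            self.operator = operator

    class GsmHandset:
        """Any type-approved handset works on any GSM network."""
        def __init__(self):
            self.sim = None
        def insert(self, sim: Sim) -> None:
            self.sim = sim   # the customer/account mapping is swappable

    card = Sim(imsi="234150000000001", operator="ExampleTel")
    old_handset, new_handset = GsmHandset(), GsmHandset()
    old_handset.insert(card)
    new_handset.insert(card)   # same subscription moves to a new device

Because the identity is an object the customer can move between devices, the handset itself becomes something the manufacturer can market directly, which is precisely the shift in relational power described above.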

It is perhaps obvious that the organizational actors changed radically over the timescales involved in the case studies. A basic tenet of the actor-network perspective is that actors are defined by their relationship with others. So, by definition organizational


actors must change as the actor-networks around them are reconfigured – such as in the ways described in the case studies. The AT&T of 2007 was certainly not the same as the

AT&T of the 1960s, although some remnant of the older version remained in the relationships of customers with its brand. Commercial and other organizations also changed internally. The names of companies and other aspects may have remained black-boxed to some external actors, but the internal changes reflected the external ones. For example, the telecom companies’ physical communications networks and the expertise required to operate them were transformed along with external changes in markets, technologies, regulations, and standards. Similarly the cable companies morphed from simple providers of TV on one-way networks to multi-service providers built upon two-way interactive networks increasingly relying on fiber optics.

Such morphing of actors extended beyond organizations and technologies to include intangible actors like shared visions, technical standards, and architectural conventions.

For example, the meaning of PCN and PCS changed during their incubation, and the meaning of the Internet has changed beyond all recognition since it was first conceived.

The actor-network conceptualization of actors’ strategy formulation was summarized in Figure 2. This model, shown as a loop, captures the dynamics of actors’ perceptions of existing and future actor-network configurations. The interactions among heterogeneous actors bring about changes in their perceptions of existing and imagined future network configurations which influence their strategies. These interactions and changes can be considered to be an on-going continuous process. In the analysis of the case studies we captured some of these dynamic interactions among actors in the wireless and television industries. Lyytinen & King’s (2002) analytical domains (the triangle


framework illustrated in Figure 3) were used to help manage the complexity of the real world data and the findings were summarized in time sequence diagrams (e.g. Figure 17,

Figure 18, Figure 19, and Figure 20). These diagrams show a variety of interactions among the innovation system, marketplace, regulatory regime, and technical standards during the development and deployment of first and second generation wireless communications systems. The organizational and other human actors within these analytic domains engage in the on-going strategy development loops as depicted in

Figure 2. A synthesis of the case findings building on Lyytinen & King’s (2002) triangle framework and the actor-network based perspective is presented in Figure 53, with the loops depicted in the domains representing the continuous interactions among actors as they update their understandings of current and imagined future configurations.

In Figure 53 the particulars of specific episodes in the wireless and television industries are abstracted away. The numbered steps in each domain exemplify how the individual elements of the actor-network based loop model apply broadly in each domain.

The arrows among the domains provide a rough guide to the sorts of influences that the domains have upon one another. The innovation space and the technologies tied to it open up new possibilities for marketplace actors, while the actions of marketplace actors provide guidance for potentially fruitful strategies in the innovation space. Thus the interrelationship between the innovation space and the marketplace incorporated the ideas of “market pull” and “technology push” while allowing for a more interactive process than suggested by the terms push and pull. The regulatory regime shapes many elements of the actor-network configuration perceived by actors in the innovation space and the marketplace. This is achieved in many ways, but industrial policy and regulatory stances,


spectrum allocation decisions, and network operator / broadcaster licensing are among the types of action that have special impact on the mobile wireless and broadcast

industries. Of course the effects of the actions of the regulatory regime are themselves

shaped by the lobbying, consultation, litigation, and other actions of the innovation space

and marketplace actors.

Technical standards are shown as central in Lyytinen & King’s (2002) triangle

framework and in Figure 53. The (A) arrows from the domains to the oval depicting

standards in the figure are used to illustrate the various actions taken by actors acting out

standardization strategies intended to bring about the benefits of attractive imagined

futures, or to avoid the disadvantages of unattractive imagined futures. The (B) arrows in

the other direction depict the effects or actions of technical standards. As we have noted, these effects can be categorized, and the compatibility and variety reduction

categories have been the most prominent in the wireless and television industries.

However, the actual effects of standards are as much determined by the configuration of

the rest of the actor-network as the characteristics of the standard itself. Thus the timing

of standards creation and adoption actions is critical, and it is not surprising that David

and Steinmueller (1994) found that generalizations about the effects of standards in the

telecommunications industry were “virtually certain to be untenable.”

The model of actor-network building presented in Figure 53 is abstract enough to

apply to both the wireless and television industries as both rely on frequency allocation

and licensing by the regulatory regime. Most recently these two industries have built

more interconnections with one another as they work together to deliver video content

services to mobile handsets. Decades earlier they influenced one another through their


interactions with the regulatory regime as they both lobbied to obtain or retain access to

the UHF spectrum, whose characteristics were attractive to both industries. The

model also broadly applies to the wider telecommunications and the computing industries

– albeit with less emphasis on frequency allocation for the former and an altogether

smaller set of interconnections with the regulatory regime for the latter.

Throughout the wireless and television industry cases technological actors and

natural phenomena have played central roles in shaping the configuration of the

industries and changes in those configurations. Without the properties of electromagnetic

radiation, and the electronic circuits capable of modulating information onto and

retrieving it from such radiation, neither of these industries would exist, at least not in

anything like their current or historical configurations. The properties of radio spectrum

allocated to various services have acted to shape the coverage, capacity, and economics

of communication and broadcast systems, as well as the businesses built upon them. The

photoelectric properties of selenium and phosphorus, along with the properties of electrons in a vacuum, were key to the development of television. The physiology of the

human visual and auditory senses has shaped the ways in which video and audio content is encoded into electrical, mechanical, and magnetic media and transmitted in various

ways. The miniaturization of electronics through the semiconducting properties of silicon

and other substances, and the advancement of the technologies for incorporating literally

millions of components on a tiny chip of these materials acted on all the industries discussed. Digital transmission, processing, and storage rely on these and other technological advances. The innovation system maintains the deepest interconnections

with these non-human actors, through the development of technologies and research into


the characteristics of various natural phenomena. The innovation system’s understanding

of various natural phenomena has improved as technologies advanced, and many technologies advanced as those understandings were harnessed, or enrolled, to use actor-network vocabulary. These technological and natural actors exert their influence throughout the actor-network illustrated in Figure 53, both directly and through the actions of the innovation space. While the conceptualization of these non-human actors incorporates an ability to act on other actors, it does not extend the symmetry to the strategy

formulation attributed to human and organizational actors.

By acting upon one another and upon other actors the characteristics of nature and

technologies ultimately shape the configuration of actor-networks. Next we examine how

the characteristics of these non-human actors can also shape the way in which actor-

network building can be performed. The impact of digital technologies in particular is

examined.


[Figure 53 shows the three analytical domains, each running its own actor-network building loop, arranged around a central oval representing technical standards, with technology and nature connected to the Innovation Space.

Innovation Space loop: 1. Perceptions of existing technological and wider actor-networks; 2. Visions of new technological actor-networks; 3. Ideas for bringing about visions; 4. Technological / conceptual development; 5. New artifacts and concepts absorbed into, and modifying, the technological actor-network (over a range of timescales).

Marketplace loop: 1. Perceptions of existing market and other actor-networks (a. customers; b. partners/competitors; c. internal organization and resources/capabilities); 2. Visions of new market actor-networks; 3. Ideas for bringing about visions; 4. Market / partnering / internal development; 5. New products and configurations absorbed into, and modifying, market actor-networks (changing OPPs).

Regulatory Regime loop: 1. Build model of market, technology, and wider actor-networks; 2. Visions of new actor-network configurations; 3. Policy making, regulating, and intervening to bring about visions; 4. New policies and regulations absorbed into, and modifying, actor-networks (changing OPPs).

The Innovation Space (and the technology and nature actors tied to it) acts on the Marketplace by providing the basis for new market visions; enabling or constraining products, services, and internal / external coordination possibilities; altering the economics of products and services; enabling or constraining the convergence or divergence of products and services; and changing OPPs. The Marketplace acts on the Innovation Space by providing the basis for new technological visions, identifying new requirements, problems, and opportunities, and supplying (potential) customers and users. Both domains act on the Regulatory Regime through lobbying, consultation, and litigation; the Regulatory Regime acts on both through industrial policy (R&D investment, military spending), the regulatory framework, spectrum allocation, licensing, and competition / industry policy.

Key: the (A) arrows from the domains to technical standards represent actors seeking to realize the benefits, or avoid the disadvantages, of standards by influencing the way in which they are created; the (B) arrows from technical standards to the domains represent the effects of standards, which vary widely depending on time and the configuration of the rest of the actor-network.]

Figure 53. Model of actor-network and actor-network building in the mobile wireless and television industries
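Rendered schematically in Python (my own paraphrase of the five numbered steps shared by the domains in Figure 53; the actor and network objects and their method names are hypothetical), the loop running in each domain looks like this:

    def actor_network_building_loop(actor, network):
        """One domain's on-going strategy formulation loop from Figure 53."""
        while True:   # the loop is continuous; there is no final state
            perception = actor.perceive(network)   # 1. existing configurations
            visions = actor.envision(perception)   # 2. imagined future configurations
            strategy = actor.plan(visions)         # 3. ideas for bringing about visions
            outcomes = actor.act(strategy)         # 4. development / intervention
            network = network.absorb(outcomes)     # 5. outcomes absorbed; network modified

This is a sketch rather than a simulation: its point is only that perception, vision, strategy, action, and absorption form a cycle in which the network an actor perceives is itself the product of earlier turns of the same loop.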

The impact of digitization on actor-network building

The case studies in chapters 5 to 8 provide a platform for the discussion of the effect that digitization has had upon the configuration and interconnection of the fixed telecom, mobile telecom, television, media, and other industries. In addition, they also highlight a fundamental and far reaching effect that digitization has had upon the process of actor-network building in these industries.

Digitization here means the series of technological innovations that allowed an extremely wide range of data types to be represented as numbers and ultimately as binary digits (i.e. by 1’s and 0’s) that can be easily stored, transmitted, and manipulated. The types of data that can be encoded in bits (the more common name for binary digits) include text, audio (e.g. voice and music), video of a range of resolutions and frame rates, and images of increasing resolution and quality (a transition that all but destroyed the chemical photography business).
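To make the core move concrete, the following minimal Python sketch (my own illustration; the function name is hypothetical) samples and quantizes an analog waveform into 8-bit codes, pulse code modulation in its simplest form. The 8 kHz sampling rate and 8-bit resolution are the parameters long used for digital telephone voice channels.

    import math

    def digitize(signal, sample_rate_hz=8000, duration_s=0.001, bits=8):
        """Sample an analog waveform and quantize each sample to an
        integer code -- the essence of representing a signal as bits."""
        levels = 2 ** bits
        codes = []
        for i in range(int(sample_rate_hz * duration_s)):
            t = i / sample_rate_hz
            amplitude = signal(t)   # analog value in [-1.0, 1.0]
            codes.append(int((amplitude + 1.0) / 2.0 * (levels - 1)))
        return codes

    # A 1 kHz tone of the sort carried in a telephone voice channel. Once
    # the codes exist, the resulting bits can be stored, transmitted, and
    # manipulated like any other data, regardless of what they represent.
    tone = lambda t: math.sin(2 * math.pi * 1000.0 * t)
    bitstream = ''.join(format(code, '08b') for code in digitize(tone))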

The reduction of all these types of data to ones and zeros is, in one way of considering standards, the final stage of variety reduction (see Table 2), while simultaneously providing tremendous potential flexibility. Conceptually, the circuitry for handling binary signals is much simpler than that for analog signals. However, the standards for the representation of the wide range of information in digital form can result in extremely complex interface and algorithm specifications. While digitization simplifies standardization at the lowest layers, its very flexibility can make it much more complex at higher layers. In actor-network terms digital representations are extremely malleable and enable a wider range of potential actor-network configurations.


However, the first wave of digitization did not unleash the full potential of this

flexibility. Before discussing this further it is worth revisiting the nature of the analog systems that existed before the digital transition. In analog transmission systems information was encoded in the variation of amplitudes, frequencies or phases of electrical signals. Similarly, information was stored using physical or magnetic variations

(e.g. ink on paper, the variations in the groove of a vinyl record, or the intensity of the

fields on magnetic tape). The majority of the analog transmission and storage systems were dedicated to one type of information or another: AM/FM radio, telephones, LP records, and audio cassettes for audio; TV and video cassettes for video; and books and magazines for text and pictures. The ways in which information was encoded in electrical signals was closely coupled to the capabilities of the analog electronics in transmitters and receivers. Similarly the way analog information was stored was closely coupled with the electrical, magnetic, chemical, or physical characteristics of the materials used to store the information. Limitations in analog electronics meant that the values of the electrical components in the circuitry (e.g. the values of the resistors, capacitors, and inductors that determine the characteristics of filters) had to be selected during design to match the particular signal or storage format. The extremely tight coupling between the design of the transmission and storage formats and the design of the devices that processed them left little room for flexibility. The industries built around these analog technologies (e.g. broadcasting, telephony, and music distribution) had their own standards and often remained in stable configurations for decades. The separateness of the devices and formats used for different applications


constrained the actor-network building possible and helped keep the industries that deployed them separate.

There were a few examples of convergence in the analog domain. For example the earliest facsimile machines were analog and made use of analog telephone and radio channels. Computers used gateway devices (voice-band modems) to communicate using the telephone network. Similarly, audio cassettes were used by early home computers for data and program storage. The gateway devices adapted to the pre-existing standards and infrastructure characteristics, in these examples by representing images and computer data using audible tones. These early examples of using analog technologies for purposes other than the ones originally intended were among the first instances of the telecommunication and computer industries starting to encroach upon and make connections with one another. The telephone network and the audio storage media thus provided a certain amount of flexibility for actor-network building. They provided what I refer to as upward flexibility, by which I mean that a range of applications could make use of the capabilities afforded them. In actor-network terms an actor that provides upward flexibility is defined as one that provides a set of capabilities, or exhibits a set of characteristics, that can be used in many ways for further actor-network building (see Figure 54a). For example, the telephone jack provides a set of capabilities, an analog audio circuit-switched connection, and conforms to well defined electrical and physical characteristics. The telephone jack is an obligatory passage point for a heterogeneous engineer wanting to make use of the black-boxed analog telephone network (see Figure 54b).


[Figure: (a) An interface A on the edge of an actor-network provides capabilities or characteristics for on-going actor-network building by other actors. (b) The phone jack, and the network behind it, provides circuit-switched 300-3,300 Hz analog audio connections used by a phone, a fax machine, and a modem.]

Figure 54. Upward flexibility
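Upward flexibility has a direct analogue in software interfaces. The following minimal Python sketch (my own illustration; the class and method names are hypothetical) expresses the telephone jack as a single black-boxed capability that heterogeneous actors enroll in quite different ways, as in Figure 54.

    import abc

    class AnalogLine(abc.ABC):
        """The capability at the jack: a circuit-switched 300-3,300 Hz
        analog audio connection, regardless of what lies behind it."""
        @abc.abstractmethod
        def send_audio(self, waveform: bytes) -> None: ...
        @abc.abstractmethod
        def receive_audio(self) -> bytes: ...

    # Different actors build further actor-networks on the same capability.
    class Telephone:
        def __init__(self, line: AnalogLine): self.line = line  # carries voice

    class FaxMachine:
        def __init__(self, line: AnalogLine): self.line = line  # images as audible tones

    class VoiceBandModem:
        def __init__(self, line: AnalogLine): self.line = line  # computer data as audible tones

The interface does not anticipate the fax machine or the modem; it merely offers a stable set of capabilities that later heterogeneous engineers can enroll.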

The telephone jack also offered downward flexibility. By this I mean that

successive generations of telephone switching technologies (electromechanical,

electronic, stored-program control, and fully digital), transmission technologies (twisted pair, coax, microwave, satellite, optical fiber), and voice coding formats (e.g. baseband analog, FM, PCM, ADPCM, DCME, VoIP) provided essentially the same capability at the telephone jack (see Figure 55b). In actor-network terms an actor that possesses downward flexibility is one that can offer the same capabilities, or exhibit the same set of characteristics, by deploying one of a range of socio-technical actor-network configurations (see Figure 55a). So the telephone jack was also an obligatory passage point for network engineers wanting to build or reconfigure networks to offer the same capabilities to network users.


[Figure: (a) An interface A on the edge of an actor-network provides the same capabilities or characteristics using a variety of actor-network configurations. (b) The phone jack offers circuit-switched 300-3,300 Hz analog audio connections using a variety of switching and transmission technologies: analog, early digital, or advanced digital switching and transmission.]

Figure 55. Downward flexibility
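Downward flexibility is the mirror image: many configurations can stand behind the one interface. Continuing the sketch given after Figure 54 (again with hypothetical names; the classes below are illustrations, not models of real switching systems), each implementation offers the same AnalogLine capability through a different switching and transmission configuration, as in Figure 55.

    class AnalogSwitchedLine(AnalogLine):
        """Electromechanical switching and baseband analog transmission."""
        def send_audio(self, waveform: bytes) -> None: pass
        def receive_audio(self) -> bytes: return b""

    class DigitalSwitchedLine(AnalogLine):
        """PCM coding and digital switching; the jack behaves identically."""
        def send_audio(self, waveform: bytes) -> None: pass
        def receive_audio(self) -> bytes: return b""

    class PacketVoiceLine(AnalogLine):
        """Voice packetized (VoIP) in the core; same capability at the edge."""
        def send_audio(self, waveform: bytes) -> None: pass
        def receive_audio(self) -> bytes: return b""

    # Any configuration can be swapped in without the telephone noticing.
    phone = Telephone(DigitalSwitchedLine())

The telephone built on the interface is indifferent to which configuration is deployed behind it, which is exactly the property that let operators modernize their networks over decades without replacing the devices at the edge.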

The flexibility of digitization is that, in principle at least, the same digital storage technologies and transmission schemes can store or transmit just about any type of information. In principle digitization has the potential to facilitate more interactions among a range of industries that were hitherto kept apart by the limitations of the analog technologies underpinning them. Of course there are practical limitations. Different types of information require diverse transmission bit rates and storage capacities. The earliest digital electrical transmission systems in the nineteenth century relied on telegraphic transmitters and receivers with throughput in the range of 15-20 words per minute – in the order of 10 bits/s. Such bit rates are only useful for the transmission of text. Even if images, audio, or video could have been encoded digitally such low bit rates would not have supported their transmission. It was only with the orders of magnitude increases in the bit rates associated with modern digital transmission technologies that the sending of these media over the same networks became technologically and economically viable.

Similarly the first digital electronic storage mechanisms provided limited storage. As


with transmission technologies, digital storage capacities have increased by many orders

of magnitude and allowed flexible multimedia storage at modest cost.

Both the technical and human portions of the actor-network constructed around

analog telephony proved resilient. The initial digitization of the telecom networks built

upon the architectures of the analog system. This is not surprising since the digital

network elements had to interoperate with analog elements during the decades long

transition to digital. In addition the engineers and managers working for operators and

manufacturers had their expertise and work processes anchored in analog telephony and

its associated technologies, interfaces, and architectures.

Having standard telephone interfaces to homes and businesses provided both 'upward' and 'downward' flexibility for actor-network building – but the extent of that flexibility was constrained by the overarching central architectural assumption that the purpose of the network was to provide circuit-switched voice channels between analog devices. Not making the digital transmission and switching capabilities at the core of modernized telephony networks widely available limited the extension of the upward flexibility to non-voice applications – particularly for residential customers. Businesses started to get access to somewhat more flexible network capacity in the 1980s. The interfaces made available were typically those used by telecommunications operators within their own networks (e.g. T1 (1.544 Mbit/s) in the US and E1 (2.048 Mbit/s) in Europe) and were usually restricted to point-to-point applications. The telecommunications operators envisaged opening up more of their digital networks to their customers by offering circuit-switched data services (usually referred to as the Integrated Services Digital Network or ISDN). However, the rollout of ISDN and the adoption of this vision were curtailed by the relatively sudden take-off of the Internet.

The upgrade to 2G mobile standards was also largely a replication of the preceding analog wireless circuit-switched architecture using digital transmission. Narrowband circuit-switched data services were included but these did not provide a flexible basis for reconfiguring actor-networks in any substantial way. Somewhat later, the digitization of terrestrial, cable, and satellite television simply reproduced the same broadcast architectures developed to deliver analog television decades earlier. In all of these cases digitization retained the tight coupling between the transmission networks and the limited range of services offered. Thus the early digitization of the industries studied in this dissertation was confined to offering more of essentially the same services, albeit with improved economics and often better quality.

The full impact of digitization's potential flexibility was not really felt until packet-switching architectures and the Internet protocol suite (often referred to by two of its most important protocols – TCP/IP) became a widely available platform. The Internet offered a platform for the development of applications that were not tightly coupled to the transmission networks underpinning them. TCP/IP provided both upward and downward flexibility. It provided upward flexibility for the creation of any application or service that could use the simple communications capabilities it provided. It also provided downward flexibility in that an extremely wide range of physical networks with extremely diverse characteristics could be used to provide fully compatible and flexible cross-network connectivity. Differences between the physical and data link layer characteristics were largely abstracted away at the IP layer. Thus TCP/IP broke the tight coupling between a service or application and the underlying network used to deliver it (Cerf & Kahn, 1974).
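A minimal sketch in Python makes this decoupling concrete (the host name and port are illustrative placeholders, not drawn from the cases): the application names its counterpart only by an address, and the same code runs unchanged whether the bytes travel over copper pair, coax, fiber, or radio.

    import socket

    # The application sees only an address and a byte stream; the physical
    # networks the packets traverse are invisible at this layer.
    with socket.create_connection(("example.com", 80), timeout=5) as conn:
        conn.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
        print(conn.recv(200).decode("ascii", errors="replace"))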

The upward and downward flexibility provided by TCP/IP was not in itself sufficient to unlock the flexibility of digital representation and transmission. The digital computer also loosened, if not broke, the tight coupling between a device and its single purpose that characterized analog electronics dedicated to processing just one signal format (e.g. the analog telephone and the television). The programmability of the computer in all its forms is ideally suited to manipulating the 0s and 1s used to represent digitized information. The digital computer, and more importantly the software that controls it, provides nearly unlimited flexibility in the ways that this information can be manipulated.
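A trivial Python sketch illustrates the point (the string chosen is arbitrary): once information is represented as bits, software can reinterpret or transform the same bits at will.

    # The same digitized bits, handled three ways by software alone.
    data = "convergence".encode("utf-8")   # text captured as bytes (bits)
    print(list(data))                      # the bits read as numbers
    print(data.hex())                      # the bits rendered as hexadecimal
    print(data[::-1].decode("utf-8"))      # the bits transformed, read back as text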

Layers of de facto and open standards have provided some constraints that limit the unnecessary variability that might otherwise leave the underlying flexibility totally unmanageable. While there are numerous document, image, and video coding formats, their number is bounded and most computer systems can be configured to work with any of them. There are a few platforms or sets of Application Programming Interfaces (APIs), such as those of Windows, MacOS, and Unix-related operating systems. These APIs exhibit considerable upward flexibility by providing capabilities that support a wide range of applications. Various protocols that use TCP/IP for connectivity exhibit downward flexibility by providing web browsing (e.g. xhtml, http), file transfer (e.g. ftp), and email (e.g. pop, smtp, imap) services across operating systems. More recently XML, SOAP, REST, and many other web services protocols have been developed to support the seamless coordination of all sorts of users, services, and computers – thereby exhibiting both upward and downward flexibility.
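A short Python sketch illustrates this layering (the host names are illustrative placeholders): quite different application services ride on the same TCP/IP connectivity, indifferent to the operating system or the physical network beneath them.

    import http.client

    # Web browsing: HTTP layered over TCP/IP.
    web = http.client.HTTPConnection("example.com", 80, timeout=5)
    web.request("HEAD", "/")
    print(web.getresponse().status)
    web.close()

    # Email retrieval would use POP3 over the very same TCP/IP machinery,
    # e.g. (with a real mail host in place of the placeholder):
    #   import poplib
    #   mail = poplib.POP3("pop.example.com", timeout=5)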


Thus in the 1990s two major developments started to loosen the constraints of the analog era and to drive digital convergence. TCP/IP broke the tight coupling between services and underlying networks, and software opened up the possibility of flexible multipurpose electronic equipment. All manner of interactive information, entertainment, gaming, communication, and other services could be deployed using combinations of text, audio, images, and video on copper pair, coax, fiber, or radio based networks. Linked to ERP systems and databases – technologies that predated the widespread use of the Internet – radically new business models were developed (e.g. Amazon, eBay, and Netflix).

Modern digital devices and data networks do not necessarily have behaviors and relationships as deeply inscribed into them as their analog and early digital predecessors. They are much more flexible and dynamically reconfigurable. In actor-network terms, the loosening of the connections between the applications or services and the technological networks that support them unleashed an enormous amount of flexibility in actor-network building. This, along with the malleability of software for digital processors, has weakened, or even shattered, the strong links among services, devices and content distribution networks.

The use of the Internet protocol suite in particular opens up a practically unlimited range of possible connections with any device connected to the Internet. So while APIs provide actor-network builders easy access to a wide range of computer peripherals, the capabilities of TCP/IP extend this access to applications and peripherals on hundreds of millions of computers – and of course to the businesses, organizations, or people behind them. This extends even more broadly as it includes any stationary or mobile artifact that can possess an IP address and respond to digital stimulation.

IP 'black boxes' the capabilities of networks and the services reachable via those networks. Networks and services are no longer tightly coupled to one another. IP offers universal addressability and an abstraction of many other actor-networks (i.e. those exposed via an IP address and appropriate protocols). It no longer matters (at least not as much) how far away from one another they are (in space or in actor-network terms) since IP provides at least the potential for them to be interconnected. So there are many more options for heterogeneous engineers to create, and to try out, new services or applications, as they no longer have to build a physical network (a difficult undertaking) before their ability to enroll other actors can be tested. With TCP/IP there are so many other potential actors that one can enroll that, combined with the malleability of software, it provides an unbounded toolkit for heterogeneous engineering.
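The universal addressability IP offers can be seen in a few lines of Python (the host name is again a placeholder): any endpoint exposed via an IP address resolves, and can be connected to, with the same primitives, whatever device, organization, or actor-network lies behind it.

    import socket

    # Resolve a name to the IP addresses behind it; each address is a
    # potential actor reachable with the same connection primitives.
    for family, _, _, _, sockaddr in socket.getaddrinfo(
            "example.com", 80, proto=socket.IPPROTO_TCP):
        print(family.name, sockaddr)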

In the old analog and early digital worlds stability came from inscriptions in the networks and devices that were interconnected in ways that were extremely difficult to break down. Their characteristics were constrained – inscribed – by the fixed values of electrical components and later by an installed base that was extremely difficult to change. The actor-networks built around telephony, broadcasting, and even non-networked computing were not able to encroach on one another. The inscriptions in the technology were so immutable that interconnection was all but impossible. Once an architecture was established actor-network building was a case of creating "more of the same" sorts of connections, e.g. an increasing penetration of telephones, TVs, or PCs.


Digitization, malleable software, IP connectivity, and supporting standards brought new visions for expanding the services that could be provided on the same physical networks. Thus the actor-networks of the telecom, broadcast, and computing industries started to become more intertwined.

The issue now becomes how some of the inestimable possibilities made available by software and IP networks become stable actor-networks. We do not know where the architectural control points are in networks with such flexibility and huge interconnection potential, nor the conditions that will bring stability. The flexibility implies the creation of more types of connection with other actors as opposed to creating "more of the same" sorts of connections. This could also bring with it the potential for turmoil in existing industries: customers can disappear quickly and the economics of old ways of doing things can be more easily reconfigured (e.g. the music industry in the last decade).

In summary, consideration of the changes in the mobile telecommunications, fixed telecommunications, and broadcasting industries on both sides of the Atlantic allowed us to conceive of digitization, software, and IP in particular, as means of abstracting away many concrete details of electronic devices, physical networks, and business models – a radical qualitative change in the heterogeneous engineer’s toolkit.


Research Question 4: How do existing technical and inter-organizational coordination mechanisms affect the design and implementation of large scale information systems?

While this question was included with the others for completeness it became a little redundant. The case studies of the wireless and television industries go back as far as the nineteenth century, and the analysis has gone back to the beginnings of electronics, before there were wireless communications or broadcasting industries. Nevertheless we can point to a number of instances where the configuration of the wider actor-networks impinged on the way that these industries developed.

The great geopolitical conflicts of the twentieth century drove changes that undoubtedly touched many technological and social domains. The development of advanced rocketry in the Second World War and the space race in the Cold War directly led to satellite communications and satellite broadcasting. The availability of this satellite technology transformed the cable industry in the US, but later stymied its development in the UK. The more 'hands-on' regulation style of, and even the very existence of, the European-level institutions can be directly tied back to the aftermath of WWII.

Many of the technologies that became key actors in the industries we have looked at were also pioneered for military applications or had their development funded by the military. These include the computer, the Internet protocol suite, hand-portable radios, and the technology behind CDMA.

So it is certainly fair to point out that critical events in the development of the industries we have examined in the case studies also come from outside the triangle framework used to organize the analysis of the case study data.


More generally, the idea of the "existing technical and inter-organizational coordination mechanisms" is captured as actors' perceptions of existing actor-network configurations in the model of actors' strategy formulation. These perceptions provide the foundation upon which actors develop their visions of future configurations as well as their standardization and other strategies. This model is illustrated in Figure 2 and combined with Lyytinen and King's (2002) analytical domains for the wireless and television industries in Figure 53.

Contributions, limitations and further research

This research has covered a much wider domain than is common for most dissertations in the information systems field. This, along with the multiple case study methodology, means that articulating its contribution is not quite as straightforward as for more traditional hypothesis-testing studies incorporating previously validated constructs. On the other hand, that very breadth provides a view of the 'big picture' often missing from very tightly focused research. This last section summarizes the empirical, theoretical and methodological contributions from this research, briefly discusses its limitations, and points toward further work that could build on this foundation.

Empirically, the results provide detailed descriptions of the dynamics involved in the development of technical standards and the coordination of the multiple organizations and technological devices required to deliver complex systems, services, and applications. This is likely to be an increasingly important dimension of information systems research as systems designers struggle with and refine the ways of harnessing the flexibility of web services, and networked computing more generally, through the use of standards and other mechanisms. Describing the development of such services for mobile computing platforms is of particular interest for the information systems field as we can reasonably expect the number and importance of such services to increase in the coming decade – particularly as more of the world's population experiences its first interactive computing services on mobile phone handsets. The rich two-country, two-industry case studies presented also provide an advanced starting point for follow-on research with differing objectives or theoretical perspectives.

Throughout the history of the wireless and television industries technological actors and the characteristics of natural phenomena played important roles in the shaping of technical standards and industry structures. The inscriptions of particular arrangements in analog equipment and communications or broadcast networks contributed to stable industry structures enduring for decades and limited the interconnections among the actor-networks of separate industries. The high degree of upward and downward flexibility of modern communications networks and protocols, the global access they provide, and the dynamic reconfigurability of software-controlled devices represent a major change in the ways that actor-network building (or heterogeneous engineering) can be accomplished in many spheres of human activity. This flexibility has facilitated the interconnection of previously distant actor-networks (e.g. those around the wireless communications, computing, and media industries), and has vastly increased the speed with which reconfiguration can occur. These changes must rank alongside the industrial revolution and the first waves of electronic communications and broadcasting in terms of the changes they have wrought in the way that people work, live, and interact.


The assumption of an essentially fixed context in which strategy formulation and action take place becomes untenable in industries where these kinds of technological actors are present – and radical changes in configuration become more likely as new ways of actor-network building are made possible. The actor-network based model of strategy formulation (illustrated by the loop model in Figure 2) conceives of actors' perceptions of the actor-network configuration as constantly changing as various actors enact their strategies and interact with one another. The increasing flexibility of modern digital systems implies that the rate at which this occurs can now be much quicker than at any other time in human history. Conceptualizing digitization in this way allows greater insight into the research questions posed in the introduction.

The answers to the first research question on standards creation that emerged from the case studies all relied upon the relationships among the regulatory regime, the marketplace, and the innovation space, as well as the status of technologies – in other words upon the configuration of the relevant socio-technical actor-networks. The interests of various actors were generally evident and it was possible to retrospectively build descriptive explanations of how these interests were pursued in the creation of various standards. The overarching trend of increasing flexibility of digital communications technologies and software helps explain some of the changes in the world of standardization observed in the cases. The flexibility of digital technologies, the complexity of the possible reconfigurations, and the speed with which reconfigurations were possible, made the creation of standards to restrict some of the possibilities essential.

However, this increase in the range of interests, and the sheer level of complexity to be translated, led standardization to be increasingly performed outside the traditional telecommunications forums. The greater number of possible interconnections that need standardization also helps explain why there has been such an explosion in the number of forums for wireless-related standardization (Tilson & Lyytinen, 2004). The global reach of the new technologies also contributed to the increased globalization of standardization efforts.

Theorizing in classic economics on the relationship between industry structure and the creation of technical standards has exhibited technological determinism. There was little consideration of standards creation other than as a game for players selecting between existing options. The research presented here provides description-based explanations of the relationships between standards creation, industry structure and the configuration of the wider actor-networks, including technology and natural phenomena, as well as the role of national, regional, and international regulatory regimes. The descriptions allow the dynamics of the interactions to be examined and communicated in much richer ways than is possible with mathematical models. This goes some way to moving our understanding of the effects of standards beyond the technological determinism found in the empirical economics literature. Using the actor-network perspective, and paying more than lip service to the roles of technology and natural phenomena, showed the importance of the configuration of the socio-technical actor-network, and thus timing, upon the effects of standards creation, adoption and other actions. This helps explain why previous research found it difficult to generalize the effects of standards in the telecom industry (David & Steinmueller, 1994). One additional effect observed in the US case study in particular was that standards can facilitate or constrain the consolidation of an industry (e.g. US operators using AMPS were less constrained before they selected a variety of 2G air interfaces).

As mobile wireless devices become a more important platform for a range of computing services, application developers may have to take the regulatory regime and the properties of wireless systems into account in their heterogeneous engineering. The scarcity of the UHF wireless spectrum most suitable for television broadcasting and mobile wireless communications has historically given the regulatory regime a larger role in these industries than in computing and many other industries. The resulting scarcity of operating or broadcasting licenses has, at least at various times, given their owners strong obligatory passage points (OPPs) when dealing with other actors. However, the brands and higher-quality content of the media industry, and certain service platforms (e.g. Apple's iPhone), have proven to be equally strong OPPs. The imminent release of more UHF frequencies as TV transitions away from analog broadcasts could reduce the power of existing operator and broadcaster OPPs as more actors gain access to spectrum. The differences in the timing of this transition, and thus in the availability of spectrum for new uses, may result in distinctly different industry configurations in the US and the UK.

Mobile devices already support multiple frequency bands and standards (e.g. in 2007 a Nokia N95 phone supported Wi-Fi and Bluetooth air interfaces at 2.4 GHz in addition to five GSM and UMTS bands used around the world). As radio technologies become more flexible the limitations imposed by devices' restricted access to spectrum will continue to lessen. There are even visions of completely open access to wide swathes of spectrum, with devices dynamically selecting unused frequencies to achieve higher overall spectrum utilization and alleviate its scarcity (Nekovee, 2006). This, along with the concept of software-defined radios capable of working with just about any signal format, brings with it the prospect of radio systems exhibiting more upward and downward flexibility than has been historically possible, mirroring the increasing flexibility of communication networks, protocols and software.
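The dynamic selection idea can be sketched in a few lines of Python (a toy illustration of the concept only, not any real radio API; the channel frequencies and the -90 dBm occupancy threshold are invented for the example):

    # Pick the lowest candidate channel whose sensed power falls below a
    # threshold, i.e. a channel no incumbent transmitter appears to occupy.
    def pick_free_channel(sensed_power_dbm, threshold=-90.0):
        free = [ch for ch, p in sorted(sensed_power_dbm.items()) if p < threshold]
        return free[0] if free else None

    # Hypothetical UHF scan: centre frequency (MHz) -> sensed power (dBm).
    print(pick_free_channel({474: -95.0, 482: -60.0, 490: -97.0}))  # -> 474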

The increasing upward and downward flexibility associated with digitization has had an increasing impact far beyond just the telecom and television industries. It provides the means for disparate organizational actors to work with one another in ways that simply were not possible in earlier decades. As the potential of this flexibility is exploited in various ways the organizational actors involved have to develop standards (e.g. web services) to ensure compatibility and, to some extent, to restrict the unbounded potential to a manageable subset. The role of standards becomes more important with the ever increasing integration of information systems within and among organizations. Thus this research fits into what could be a growing realization that industry-level research is of interest to information systems research (Iacono & Wigand, 2005; King & Lyytinen, 2005). Overall, the analyses of the multiple industry-level cases provide ample evidence of the utility of the actor-network based perspective in conceptualizing the complex stories behind standards creation, adoption, and their effects on the evolution of industries. It is envisaged that the conceptualization of digitization and the malleability of software in actor-network terms may also provide a starting point for future research into how such flexibility can be stabilized.

The explanations used to address the creation of standards (RQ1) relied heavily upon the configuration of the rest of the relevant actor-network. Likewise the coordination of technical, organizational, and other actors in the telecom and broadcasting industries (RQ2) relied at least partially upon the characteristics of technology, and of technical standards in particular. It may have been possible to treat industry structure as exogenous in studies of the creation of standards in older industries, but this is not the case in the recent history of the industries studied here, and it becomes increasingly problematic as the flexibility of the technologies involved facilitates ever faster reconfiguration. Similarly, the increasingly prominent role of inter-organizational information systems makes taking the configuration of technical standards as a given in studies of the relationships among organizations ever more difficult. Consequently, the use of factor models in studying what is going on in industries that rely on flexible digital systems and technical standards is even more limited than it was in the past and can only hope to capture snapshots that will rapidly become obsolete. A further implication is that the first two research questions used in this dissertation are effectively subsumed into the third, which explicitly allows for the dynamic interactions between technical standards creation and the wider actor-network. The final research question exploring the impact of initial conditions (RQ4) also becomes redundant as the dynamic process model developed to address RQ3 allows for an ever changing actor-network configuration.

Finally, this research provides some guidance to the IS field in the use of the actor-network perspective beyond the scope of a single standard (e.g. Markus et al., 2006) or a fairly restricted domain (e.g. Nickerson & zur Muehlen, 2006). Specifically, we have used time sequence diagrams (e.g. Figure 18) as ways of summarizing and presenting the key events and actions in the cases. The specific versions of the time sequence diagrams used here allow the sequence of events and actions to be traced across the domains of the triangle framework (Lyytinen & King, 2002) presented in Figure 3. However, the approach could be adapted for different domains or combinations of actors. For most of the events in the case studies we used descriptions to elaborate the actor-network conceptualization of strategy formulation and the interactions among actors (see the loop model illustrated in Figure 2). In addition, the 'loop model' was explicitly populated for one instance (see Figure 31) to illustrate its potential wider application. The loop model was also combined with Lyytinen and King's (2002) triangle model to summarize the actor-network configuration and actor-network building in the mobile wireless and television industries (see Figure 53). This allows the more general features of the relationships among the analytical domains to be illustrated (also see Figure 32), thereby abstracting their representation somewhat from the details captured in the time sequence diagrams.

The four research questions posed in the introduction and the diversity of the case studies used to investigate them cover a very wide domain. There is no doubt that several dissertations could have been, and have been, written about one case or by focusing on a single more restricted question. Taking such a wide view has permitted the development of comprehensive explanations of the creation and the effects of real-world technical standards, and the conceptualization of digitization in actor-network terms. While it is unlikely that these insights could have been gleaned without taking such a wide view, the approach inevitably had its drawbacks. Certainly there are many theoretical perspectives that could have been applied to parts of such a wide domain. One such perspective is the extensive literature on innovation systems, which at least in some of its incarnations allows for heterogeneous actors involved in the dynamic creation, selection, and application of new technologies for economic ends. The capabilities of commercial actors on several dimensions (e.g. strategic, organizational, functional, and learning) have been examined (Carlsson et al., 2002), as has the effect of the configuration of the wider institutional arrangements on successful innovation (Carlsson & Eliasson, 2003). Additionally, there have been conceptualizations of the dimensions of innovation systems, for example: cognitive, institutional, and economic (Carlsson & Eliasson, 2003). While much of this literature treats technology in abstract terms there are instances where specific technological developments have been treated in detail, for example Holmén's (1998) examination of the use of digital signal processing to improve antenna performance. Although these dimensions did not emerge from the case studies included here, their use as sensitizing devices could certainly inform a deeper analysis of the relationship between the innovation system and the marketplace in particular, albeit at the cost of moving further away from an actor-network perspective.

This research could be extended in other ways as well. The case studies provide advanced starting points for the use of alternative theoretical perspectives to examine the events in these industries. Similar in-depth case studies of industries built around standards and involving an extensive role for the regulatory regime could be used to explore whether the findings apply more widely. Alternatively, one could examine other industries, or countries, with contrasting features. For example, the computing industry relies just as much on standards as the telecom and television industries but with historically less involvement of the regulatory regime in standards creation and selection. Contrasting such a study with this research would allow the role of the regulatory regime to be more readily isolated.


Appendix 1: Interview Guide

Basic individual questions

1. Basic demographic information questions (age, gender, company, rank, educational background)

2. How did you get involved in the broadband wireless project at your current company? Please tell us a brief history of your own career.

3. What is your current role in the project?

Company questions

4. Please give a brief history of your firm (or organization). What are its main product (or mission), main market, number of employees, annual budget and sales volume, and market position?

5. How did your company get involved in the broadband wireless project? Please tell us a brief history of your company's involvement in the broadband market.

6. What are the main roles that your company is playing in the broadband space?

7. What is your firm's perspective on the broadband wireless market (on competition, market, technology, standards, and applications)?

8. What standards is your firm pursuing?

9. What role is your firm playing in the development of the standard, if any?

10. What effect has your firm had on the development of the standard?

Identifying the Actor-Network

11. What actors do you interact with? Who are they? What role do they play? Who are the key individuals in those organizations? Whom do you think we need to talk to?

12. What is your relationship with those that you just mentioned?

13. What is the role of the regulatory regime and where is it moving?

Strategy

14. What is your firm's strategy in the broadband wireless market in terms of product, standards, and markets?

15. What is your firm's strategy in terms of R&D, IPR, and standards?

16. What is your firm’s strategy in terms of standards and market?

Technology

17. What are other key technologies that affected (either facilitating or impeding) the diffusion of broadband wireless in your country?

National diffusion

18. Please tell us how you feel about broadband wireless diffusion in your country.

19. Can you compare the current 2.5G and 3G to the previous wireless technology diffusion? What are the primary differences, if any?


References

3GPP2. (2007). About 3GPP2. Retrieved November 22, 2007, from http://www.3gpp2.org/Public_html/Misc/AboutHome.cfm

Akerlof, G. (1970). The Market for Lemons. Quarterly Journal of Economics, 84(3), 488-500.

Albright, P. (2000a). Branded: TDMA-EDGE. Wireless Week (March 03).

Albright, P. (2000b). Adding A Big Dash Of Acronyms. Wireless Week (January 03).

Albright, P. (2001a). CDMA, GSM Vie To Convert TDMA Markets. Wireless Week (April 2).

Albright, P. (2001b). Sprint, Verizon split over 1xEV. Wireless Week, 7(32), 12.

Albright, P. (2002). Reborn again: 3G Americas replaces UWCC. Wireless Week (March 18).

Adler, P. S. (2005). The evolving object of software development. Organization, 12(3), 401.

Alleven, M. (1998). TDMA Looks Back, Forward. Wireless Week (April 20).

Alvarez, R. (2001). "It was a great system": Face-work and the discursive construction of technology during information systems development. Information Technology & People, 14(4), 385.

Amobi, T. N., & Donald, W. (2006). Industry Surveys: Broadcasting, Cable, & Satellite (Dec. 14). Standard & Poor's.

Amobi, T. N., & Donald, W. (2007). Industry Surveys: Movies & Home Entertainment (March 29). Standard & Poor's.

Andersen, H. P. S. (2001). UMTS in 3GPP (December 1998-May 2001). In F. Hillebrand (Ed.), GSM & UMTS: The Creation of Global Mobile Communications (pp. 247). Chichester: Wiley.

Anonymous. (2004). Ex-Sky duo to offer limited pay-TV deal for Freeview boxes. Campaign, 6.

Arthur, C. (2007, Aug 2). Technology: Television is a turnoff for mobile users: TV on mobile phones has got the thumbs down from UK users, despite operators spending millions trying to get them to tune in. Guardian.

Arthur, W. B. (1989). Competing technologies, increasing returns, and lock-in by historical events. Economic Journal, 99(394), 116.

Ashley, N. (2005). Life: Online: TV on your phone. The Guardian, p. 22.

ATSC. (2007). Development of the ATSC Digital Television Standard. Retrieved May 24, 2007, from http://www.atsc.org/history.html

Austerberry, D. (2007). EU lines up for mobile TV standards war. Broadcast Engineering (World Edition), 49(8), 8.

Avgerou, C. (2000). IT and organizational change: An institutionalist perspective. Information Technology & People, 13(4), 234.

Axelrod, R., & Mitchell, W. (1995). Coalition formation in standard-setting alliances. Management Science, 41, 1493-1508.

Baig, E. C. (2005, Jun 30). Orb Networks can set you free; Start-up lets you access media on your home PC through wireless devices. USA Today, p. 10.

Bannister, N. (1993, Oct 19). Cable firms' hopes of stalling BT video plans dashed. Guardian, p. 14.

Barber, J. M. (1987). The Role of Government in the Provision of Measurement Standards. Department of Trade and Industry (UK).

Barley, S. R. (1986). Technology as an occasion for structuring: Evidence from observations of CT scanners and the social order of radiology departments. Administrative Science Quarterly, 31, 71-108.

Barley, S. R., & Tolbert, P. (1997). Institutionalization and structuration: Studying the links between action and institution. Organization Studies, 18(1), 93-117.

Barnes, B. (2007, May 14). Can CBS put the Net into Network? Broadcaster launches plan syndicating shows on web, admits old strategy failed. Wall Street Journal, p. B1.

Barrie, C. (1998, Nov 13). BT reaches for Sky deal. The Guardian, p. 25.

Barrie, C. (1999, Apr 22). Sky and BT cut decoder price in interactive boost. The Guardian, p. 23.

Bassuener, K. (2001). AT&T Wireless Gets Rolling With GPRS. Wireless Week (July 17).

Bassuener, K. (2002). AT&T Wireless And Nokia To Test EDGE. Wireless Week (June 2000).

BBC News. (1998). First shots in Digital TV war. Retrieved March 20, 2007, from http://news.bbc.co.uk/1/hi/business/the_company_file/214809.stm

BBC News. (2001, July 31). Terrestrial TV viewing falls. Retrieved April 3, 2007, from http://news.bbc.co.uk/1/hi/entertainment/tv_and_radio/1466864.stm

BBC News. (2004, May 10). Football fans get phone action. Retrieved April 24, 2007, from http://news.bbc.co.uk/1/hi/technology/3687747.stm

BBC News. (2006, Feb 28). 'Fewer young people' watching TV. Retrieved April 3, 2007, from http://news.bbc.co.uk/1/hi/entertainment/4758932.stm

Beijer, T. (2001). The UMTS Forum. In F. Hillebrand (Ed.), GSM & UMTS: The Creation of Global Mobile Communications (pp. 156). Chichester: Wiley.

Bekkers, R. (2001). The development of European mobile telecommunications standards: An assessment of the success of GSM, TETRA, ERMES and UMTS. Unpublished doctoral dissertation, Universiteit Eindhoven, Eindhoven.

Besen, S. M. (1999). Innovation, Competition, and the Theory of Network Externalities. Retrieved October 24, 2003, from http://www.econ.yale.edu/alumni/reunion99/besen.htm

Bijker, W. (1995). Sociohistorical technology studies. In T. P. Pinch (Ed.), Handbook of Science and Technological Studies (pp. 229-256). Thousand Oaks: Sage.

Bijker, W. (1997). Of Bicycles, Bakelites, and Bulbs: Toward a Theory of Sociotechnical Change. Cambridge, MA: MIT Press.

Binmore, K., & Klemperer, P. (2002). The biggest auction ever: The sale of the British 3G telecom licences. The Economic Journal, 112(478), C74.

Birkmaier, C. (2006, July 1). Mobile Madness. Broadcast Engineering.

Blind, K. (2004). The Economics of Standards: Theory, Evidence, Policy. Cheltenham: Edward Elgar Publishing Ltd.

Bloomfield, B. P., Coombs, R., Cooper, D. J., & Rea, D. (1992). Machines and manoeuvres: Responsibility accounting and the construction of hospital information systems. Accounting, Management and Information Technologies, 2(4).

Boland, R. J., & Schultze, U. (1996). From work to activity: Technology and the narrative of progress. In W. J. Orlikowski, G. Walsham, M. R. Jones & J. I. DeGross (Eds.), Information Technology and Changes in Organizational Work. London: Chapman & Hall.

Bowker, G., Timmermans, S., & Star, S. L. (1996). Infrastructure and organizational transformations: Classifying nurses' work. In W. J. Orlikowski, G. Walsham, M. R. Jones & J. I. DeGross (Eds.), Information Technology and Changes in Organizational Work (pp. 344-370). London: Chapman & Hall.

Brandenburger, A. J., & Nalebuff, B. J. (1997). Co-opetition. New York: Doubleday.

Braun, Wernher von. (2007). Encyclopædia Britannica. Retrieved June 17, 2007, from http://www.britannica.com/eb/article-817

Bray, H. (2005). Two more entrants in video revolution; new iPod plays video, too. Knight Ridder Tribune Business News, 1.

BSI. (2007). Introducing Standards. Retrieved October 14, 2007, from http://www.bsi-global.com/upload/Standards%20&%20Publications/NSB/BSIintroducingstandards.pdf

BT. (2007). The historical development of BT. Retrieved October 14, 2007, from http://www.btplc.com/Thegroup/BTsHistory/History.htm

BT Movio. (2007). BT Movio - Mobile entertainment for a converging world (marketing brochure). Retrieved April 4, 2007, from http://www.movio.bt.com/bt_movio_more.pdf

Buckley, N. (1999, Jan 19). Brussels sidesteps US mobile phones dispute. Financial Times, p. 1.

BusinessWire. (2005). T-Mobile In Hungary Launches Mobile TV and Mobile Commerce Services in Late-2005 - Telecoms, Mobile and Broadband in Central Europe. Business Wire, 1.

Cable&Sat. (2005). Czech DVB-H trial. Cable & Satellite Europe (Nov 1).

CableLabs. (1998). A Decade of Innovation - The History of CableLabs 1988-1998. Retrieved Oct. 25, 2007, from http://www.cablelabs.com/downloads/pubs/history.pdf

Calhoun, G. (1988). Digital Cellular Radio. Norwood, MA: Artech House.

Callon, M. (1986). Some elements of a sociology of translation: Domestication of the scallops and the fishermen of St. Brieuc Bay. In J. Law (Ed.), Power, Action and Belief: A New Sociology of Knowledge? (Sociological Review Monograph 32, pp. 196-233). London: Routledge.

Callon, M. (1991). Techno-economic networks and irreversibility. In J. Law (Ed.), A Sociology of Monsters: Essays on Power, Technology and Domination (pp. 132-161). London: Routledge.

Carlsson, B. (2007). Innovation Systems: A Survey of the Literature from a Schumpeterian Perspective. In H. Hanusch & A. Pyka (Eds.), Elgar Companion to Neo-Schumpeterian Economics. Cheltenham: Elgar.

Carlsson, B., & Eliasson, G. (2003). Industrial Dynamics and Endogenous Growth. Industry and Innovation, 10(4), 435-455.

Carlsson, B., Jacobsson, S., Holmén, M., & Rickne, A. (2002). Innovation Systems: Analytical and Methodological Issues. Research Policy, 31(2), 233-245.

Carroll, K. (2002). CDMA launches catalyze race to wireless data. Telephony, 242(5), 21.

Cassy, J. (2001, May 7). BT prepares to bite bullet: Life-saving plans include pounds 5bn rights issue, sale of Wireless, and scrapping dividend. The Guardian, p. 17.

Catto, C. (2000). Cable Modems in the UK. Retrieved Oct 25, 2007, from www.nicc.org.uk/nicc-public/Public/open_forums/nov00/ccatto.ppt

CEC. (1987a). Council Directive (87/372/EEC) of 25 June 1987 on the frequency bands to be reserved for the coordinated introduction of public pan-European cellular digital land-based mobile communications in the European Community.

CEC. (1987b). Council Recommendation (87/371/EEC) of 25 June 1987 on the coordinated introduction of public pan-European cellular digital land-based mobile communications in the Community.

CEC. (1988). Commission Directive (88/301/EEC) of 16 May 1988 on competition in the markets in telecommunications terminal equipment.

CEC. (1990). Commission Directive (90/388/EEC) of 28 June 1990 on competition in the markets for telecommunications services.

CEC. (1996a). Commission Directive (96/2/EC) of 16 January 1996 regarding mobile and personal communications.

CEC. (1996b). Commission Directive (96/19/EC) of 13 March 1996 regarding the implementation of full competition in telecommunications services.

CEC. (1996c). Communication from the Commission to the Council and the European Parliament of 24th July 1996 on "Standardization and the Global Information Society: The European Approach", COM(96) 359. Brussels: Commission of the European Communities.

Cerf, V. G., & Kahn, R. E. (1974). A Protocol for Packet Network Intercommunication. IEEE Transactions on Communications, 22(5).

Chandler, A. D. (1977). The Visible Hand: The Managerial Revolution in American Business. Cambridge: Belknap Press.

Chandler, A. D. (2005). Inventing the Electronic Century: The Epic Story of the Consumer Electronics and Computer Science Industries. Cambridge, MA: Harvard University Press.

Charny, B. (2006, April 10). Vodafone Rolls Out Possible VOIP-Blocking Feature. Retrieved April 21, 2006, from http://www.eweek.com/article2/0,1895,1948111,00.asp

Clemons, E. K., Gu, B., & Lang, K. R. (2003). Newly vulnerable markets in an age of pure information systems: An analysis of online music and online news. Journal of Management Information Systems, 19(3), 17-41.

Clemons, E. K., & Row, M. C. (1988). McKesson Drug Company: A case study of Economost - A strategic information system. Journal of Management Information Systems, 5(1), 36-50.

Clift, W. E. (2002). The Sensible Road To 3G. Wireless Week (January 21).

Colker, D. (2004, Apr. 11). Sony Aims to Amaze With Its Wireless TV; Location Free, a portable multi-use device to be rolled out in the U.S. in the fall, aims to change how people watch the tube and surf the Net. Is it the hit the company needs? Los Angeles Times, p. C1.

Cowan, R. A. (1990). Nuclear power reactors: a study in technological lock-in. Journal of Economic History, 50, 541-567.

Crossed lines. (1999, Jan 20). Financial Times, p. 19.

Crowe, D. (2000). Split ends. Wireless Review, 17(19), 57.

Curtis, J. (2006). Screen saver. Marketing (Mar 29), 15.

Damsgaard, J., & Lyytinen, K. (2001). The role of intermediating institutions in the diffusion of Electronic Data Interchange (EDI): How industry associations intervened in Denmark, Finland and Hong Kong. The Information Society, 17(3).

Damsgaard, J., & Scheepers, R. (1999). Power, influence and intranet implementation: A safari of South African organizations. Information Technology & People, 12(4).

Milmo, D. (2006, Jun 22). ITV cuts programme budgets as advertising revenues plunge: Ratings agency criticises pounds 200m return to investors: Executive hints at further cuts to 2007 schedule. The Guardian, p. 24.

David, P. A. (1987). Some New Standards for the Economics of Standardization in the Information Age. In P. Dasgupta & P. Stoneman (Eds.), Economic Policy and Technological Performance. Cambridge University Press.

David, P. A. (1995). Standardization policies for network technologies: The flux between freedom and order revisited. In D. Forey & C. Freeman (Eds.), Standards, Innovation and Competitiveness: The Politics and Economics of Standards in Natural and Technical Environments. Cheltenham: Edward Elgar.

David, P. A. (2000). Path dependence, its critics and the quest for 'historical economics'. Mimeo.

David, P. A., & Greenstein, S. (1985). Clio and the Economics of QWERTY. American Economic Review, May 1985.

David, P. A., & Greenstein, S. (1990). The Economics of Compatibility Standards: An Introduction to Recent Research. Economics of Innovation and New Technology, 1(1/2), 3-41.

David, P. A., & Steinmueller, W. E. (1990). The ISDN bandwagon is coming, but who will be there to climb aboard? Economics of Innovation and New Technology, 1(1/2), 43-62.

David, P. A., & Steinmueller, W. E. (1994). Economics of compatibility standards and competition in telecommunications networks. Information Economics and Policy, 6(3-4), 217-241.

De Laet, M., & Mol, A. (2000). The Zimbabwe Bush Pump: Mechanics of a Fluid Technology. Social Studies of Science, 30, 225-263.

DiMaggio, P. (1988). Interest and agency in institutional theory. In L. Zucker (Ed.), Institutional Patterns and Organizations: Culture and Environment (pp. 3-21). Cambridge, MA: Ballinger.

DiMaggio, P., & Powell, W. (1983). The Iron Cage Revisited: Institutional Isomorphism and Collective Rationality in Organizational Fields. American Sociological Review, 48, 147-160.

Dixit, A., & Stiglitz, J. (1977). Monopolistic competition and optimum product diversity. American Economic Review, 67, 297-308.

Douglas, M. (1986). How Institutions Think. Syracuse, NY: Syracuse University Press.

Drucker, E. (2001). TDMA's Curious Road To 3G. Wireless Week (December 03).

Duffy, J., & Pappalardo, D. (2005). Broadband rulings draw user concerns. Network World, 22(32), 1.

Dupuis, P. (2001). The GSM Phase 2+ Work in ETSI SMG from 1993 to 1996. In F. Hillebrand (Ed.), GSM & UMTS: The Creation of Global Mobile Communications (pp. 73). Chichester: Wiley.

Eisenhardt, K. M. (1989). Building Theories from Case Study Research. Academy of Management Review, 14(4), 532-550.

Electromagnetic radiation. (2007). Encyclopædia Britannica. Retrieved July 23, 2007, from http://www.britannica.com/eb/article-9106022

European Commission. (2007). Commission opens Europe's Single Market for Mobile TV services. Retrieved November 2, 2007, from http://www.europa.eu/rapid/pressReleasesAction.do?reference=IP/07/1118&format=HTML&aged=0&language=EN&guiLanguage=en

European Commission DG XIII/B. (1996). UMTS Task Force Report. Technical Report 1. Brussels.

Europemedia. (2002). Confusion over BT streaming TV roll out. Europemedia, N.A.

Farquhar, M. (1996). Private Land Mobile Radio Services: Background. Retrieved January 7, 2008, from http://wireless.fcc.gov/reports/documents/whtepapr.pdf

Farrell, J. S., & Saloner, G. (1985). Standardization, compatibility, and innovation. RAND Journal of Economics, 16(1), 70.

Farrell, J. S., & Saloner, G. (1986). Installed Base and Compatibility: Innovation, Product Preannouncements, and Predation. American Economic Review, 76, 940.

Farrell, J. S., & Saloner, G. (1988). Coordination through committees and markets. RAND Journal of Economics, 19, 235.

Farrell, J. S., & Saloner, G. (1992). Converters, compatibility, and the control of interfaces. Journal of Industrial Economics, 40, 9.

Farrell, J. S., & Shapiro, C. (1988). Dynamic competition with switching costs. RAND Journal of Economics, 19, 123-137.

FCC. (1988). Amendment of Parts 2 and 22 of the Commission's Rules to Permit Liberalization of Technology and Auxiliary Service Offerings in the Domestic Public Cellular Radio Telecommunications Service, Gen. Docket 87-390, Report and Order, 3 FCC Rcd 7033, 7038.

FCC. (1996). Advanced Television Systems and Their Impact Upon the Existing Television Broadcast Service: Fourth Report and Order, MM Docket No. 87-268. Washington, D.C.: FCC.

FCC. (1997). Advanced Television Systems and Their Impact Upon the Existing Television Broadcast Service: Sixth Report and Order, MM Docket No. 87-268. Washington, D.C.: FCC.

Fernandes, B. E. (2001). The UMTS Taskforce. In F. Hillebrand (Ed.), GSM & UMTS: The Creation of Global Mobile Communications (pp. 147). Chichester: Wiley.

FinancialTimes. (2006, Nov 10). Bundled options running to 'quad'. Financial Times, p. 20.

Fine, J. (2007). Not Too Juiced by Joost. Business Week (4038), 24.

Fitchard, K. (2006). Mobile TV network No. 3 emerges. Telephony (Apr 24), 6.

Fluendy, S. (2006). Air wars. Knight Ridder Tribune Business News (Dec 3), 1.

Foley, T. (2000). AT&T Wireless 3G bid opens the US market to GSM. CommunicationsWeek International (Dec 18).

Fomin, V., Gao, P., & Damsgaard, J. (2004). The role of standards and its impact on the diffusion of 3G wireless mobile services. Paper presented at the European Academy for Standardization 9th EURAS Workshop on Standardization.

Fomin, V., & Lyytinen, K. (2000). How to distribute a cake before cutting it into pieces: Alice in Wonderland or radio engineers' gang in the Nordic countries? In Information Technology Standards and Standardization: A Global Perspective (pp. 222-239). Hershey, PA: Idea Group.

Forbis, S. (1988, Jul 3). Portable VCRs May Be New Fad. Chronicle, p. 51.

Fox, J. R. (1990). A brief history of cable television in the UK: cable television network options in the UK for the 1990s. IEEE LCS [see also IEEE LTS], 1(1), 60-65.

Fransman, M. (2002). Telecoms in the Internet Age: From Boom to Bust to...? New York: Oxford University Press.

FT. (2007a, May 15). Investors question strategy at Virgin Media. Financial Times, p. 1.

FT. (2007b, May 2). On-demand TV - who offers what. Financial Times, p. 5.

Funk, J. L. (2001). The Mobile Internet: How Japan Dialed Up and the West Disconnected. ISI Publications.

Funk, J. L. (2002). Global Competition Between and Within Standards: The Case of Mobile Phones. New York: Palgrave.

Funk, J. L., & Methe, D. T. (2001). Market- and committee-based mechanisms in the creation and diffusion of global industry standards: the case of mobile communication. Research Policy, 30, 589.

Garrard, G. A. (1997). Cellular Communications: Worldwide Market Development. Norwood, MA: Artech House.

Geist, M. (2005, December 22). Towards a two-tier internet. Retrieved April 2006, from http://news.bbc.co.uk/1/hi/technology/4552138.stm

George, C. (2004). First video pics, now digital TV? European trials of 3G mobiles adapted to receive digital TV start shortly. But the plan's had a mixed reception. Financial Times, p. 14.

Gibson, O. (2007, May 2). Financial: ITV unveils on-demand play again service. The Guardian, p. 25.

Giddens, A. (1984). The Constitution of Society. Berkeley: University of California Press.

Glen, R. (2006). The fools on the hill. Retrieved May 3, 2007, from http://www.transdiffusion.org/emc/baird/fools.php

Glover, T. (2006a). Alcatel says 3G networks are unsuited to mobile TV. Knight Ridder Tribune Business News, 1.

Glover, T. (2006b). The Business, London, technology column. Knight Ridder Tribune Business News (Aug 5), 1.

Goodway, N. (2006). BT battles to keep broadband share. Knight Ridder Tribune Business News (Nov 10), 1.

Gordon, S. H. (1996). Passage to Union: How the Railroads Transformed American Life, 1829-1929. Chicago: Ivan R. Dee.

Gosain, S. (2004). Enterprise Information Systems as Objects and Carriers of Institutional Forces: The New Iron Cage? Journal of the Association for Information Systems, 5(4), 152-182.

Grove, A. S. (1996). Only the Paranoid Survive: How to Exploit the Crisis Points That Challenge Every Company and Career. New York: Currency Doubleday.

Guardian. (1998, Apr 24). Rival gingers up digital bidding: BT broadcast ban lifted a year early. The Guardian, p. 24.

Hanseth, O., & Monteiro, E. (1997). Inscribing behavior in information infrastructure standards. Accounting, Management and Information Technologies, 7, 183-211.

Hanseth, O., Monteiro, E., & Hatling, M. (1996). Developing Information Infrastructure: The Tension between Standardization and Flexibility. Science, Technology, & Human Values, 21(4), 407-426.

Haug, T. (2002). A commentary on standardization practices: lessons from the NMT and GSM mobile telephone standards histories. Telecommunications Policy, 26(3/4), 101.

Henry, G. (1991, Jan 14). D-Days for the D-MAC. The Guardian, p. 25.

Hillebrand, F. (2001a). The Creation of the UMTS Foundations in ETSI from April 1996 to February 1999. In F. Hillebrand (Ed.), GSM & UMTS: The Creation of Global Mobile Communications (pp. 184). Chichester: Wiley.

Hillebrand, F. (2001b). The GSM Work in ETSI SMG from May 1996 to July 2000. In F. Hillebrand (Ed.), GSM & UMTS: The Creation of Global Mobile Communications (pp. 80). Chichester: Wiley.

Hillebrand, F. (2001c). Short Message and Data Services. In F. Hillebrand (Ed.), GSM & UMTS: The Creation of Global Mobile Communications (pp. 407). Chichester: Wiley.

Hine, C. (1995). Representations of Information Technology in disciplinary development: Disappearing plants and invisible networks. Science, Technology, & Human Values, 20(1), 65-85.

Holmén, M. (1998). Regional Industrial Renewal: The Growth of 'Antenna Technology' in West Sweden. Technology Analysis and Strategic Management, 14(1), 87-106.

Houssos, N., Gazis, V., & Alonistioti, A. (2004). Enabling Delivery of Mobile Services Over Heterogeneous Converged Infrastructures. Information Systems Frontiers, 6(3), 189.

Howcroft, D., Mitev, N., & Wilson, M. (2004). What we may learn from the Social Shaping of Technology Approach. In J. Mingers & L. Willcocks (Eds.), Social Theory and Philosophy for Information Systems (pp. 329-371). Chichester, England: Wiley.

Huber, J. (2001). Spectrum Aspects. In F. Hillebrand (Ed.), GSM & UMTS: The Creation of Global Mobile Communications (pp. 165). Chichester: Wiley.

Hughes, T. P. (1998). Rescuing Prometheus. New York: Pantheon.

Iacono, S., & Wigand, R. T. (2005). Information technology and industry change: view from an industry level of analysis. Journal of Information Technology, 20(4), 211-212.

informitv. (2007, Feb 20). Orange will offer broadband television service. Retrieved May 21, 2007, from http://informitv.com/articles/2007/02/20/orangewilloffer/

ITU. (2005). World Telecommunications Indicators.

ITU. (2007, October 19). ITU defines the future of mobile communications - ITU Radiocommunication Assembly approves new developments for its 3G standards. Geneva. Retrieved Nov 22, 2007, from http://www.itu.int/newsroom/press_releases/2007/30.html

ITVplc. (2007). Our businesses. Retrieved May 11, 2007, from http://www.itvplc.com/itv/about/businesses/

Kallinikos, J. (2005). The order of technology: Complexity and control in a connected world. Information and Organization, 15(3), 185-202.

Katz, J. E. (2003). Machines That Become Us: The Social Context of Personal Communication Technology. New Brunswick, NJ: Transaction Publishers.

Katz, M. L., & Shapiro, C. (1985). Network Externalities, Competition, and Compatibility. American Economic Review, 75, 424.

Katz, M. L., & Shapiro, C. (1994). Systems competition and network effects. Journal of Economic Perspectives, 8(2), 93-115.

Kenedy, K. (2002). Sprint set to launch 2.5G wireless network. Crn (1006), 3.

Kennedy, S. (2007, April 3). LLU in the UK. Paper presented at the Seventh UK Network Operators' Forum, Manchester, UK.

Kharif, O. (2006). Hiwire's High Wire Act. Business Week (Aug 30).

Kharif, O. (2007). The Mobile TV Wars: Qualcomm's MediaFlo and MobiTV are set to duke it out in the battle for this emerging business. Business Week - News Analysis, from http://www.businessweek.com/technology/content/jul2007/tc20070725_294703.htm?campaign_id=rss_tech

King, J. L., Gurbaxani, V., Kraemer, K. L., McFarlan, F. W., Raman, K. S., & Yap, C. S. (1994). Institutional factors in information technology innovation. Information Systems Research, 5(2), 139-169.

King, J. L., & Lyytinen, K. (2005). Automotive Informatics: Information Technology and Enterprise Transformation in the Automotive Industry. In W. H. Dutton, B. Kahin, R. O'Callaghan & A. W. Wyckoff (Eds.), Transforming Enterprise: The Economic and Social Implications of Information Technology (pp. 283-333). Cambridge, MA: MIT Press.

King, J. L., & West, J. (2002). Ma Bell's orphan: US cellular telephony, 1974–1996. Telecommunications Policy, 26(3/4), 189.

Klemperer, P. (2002, Nov 26). The wrong culprit for telecom trouble. Financial Times, p. 21.

Knisely, D. N., Kumar, S., Laha, S., & Nanda, S. (1998). Evolution of wireless data services: IS-95 to cdma2000. IEEE Communications Magazine, 36(10), 140-149.

Korea_Times. (2005). KOREA: Cell phone-based broadcasting starts [Electronic Version]. Asia Media - Media News Daily, from http://www.asiamedia.ucla.edu/article.asp?parentid=23866

Kraut, R., Steinfield, C. W., Chan, A., Butler, B., & Hoag, A. (1999). Coordination and virtualization: The role of electronic markets and personal relationships. Organization Science, 10(6), 722-740.

Latour, B. (1986). The Powers of Association. In J. Law (Ed.), Power, Action and Belief: A New Sociology of Knowledge? Sociological Review Monograph (pp. 264-280). London: Routledge and Kegan Paul.

Latour, B. (1987). Science in Action: How to Follow Scientists and Engineers Through Society. Cambridge, MA: Harvard University Press.

Latour, B. (1991). Technology is society made durable. In J. Law (Ed.), A Sociology of Monsters: Essays on Power, Technology and Domination (pp. 103-131). London: Routledge.

Latour, B. (1995). Social Theory and the Study of Computerized Work Sites. Paper

presented at the IFIP WG8.2, Cambridge, UK.

Latour, B. (1998). On Actor-network theory: A few clarifications. Retrieved October 24,

2003, from http://www.nettime.org/Lists-Archives/nettime-1-9801/msg00019.html

Latour, B. (2004). On using ANT for studying information systems: a (somewhat)

Socratic dialogue. In C. Avgerou, C. Ciborra & F. Land (Eds.), The Social Study of Information and Communication Technology: Innovation, Actors and

Contexts (pp. 62-76). Oxford: Oxford University Press.

Latour, B. (2005). Reassembling the Social: An Introduction to Actor-Network Theory.

Oxford: Oxford University Press.

Law, J. (1992). Notes on the Theory of the Actor-Network: Ordering, Strategy and

Heterogeneity. Systems Practice, 5, 379-393.

Law, J., & Singleton, V. (2005). Object Lessons. Organization, 12(3), 331-355.

Lee, J. (2004). Top Up TV targets ex-ITV Digital customers with ad drive. Campaign, 7.

Lee, R. (2000). AWS In Success Mode. Wireless Week(Dec 12).

Leibenstein, H. (1950). Bandwagon, Snob, and Veblen Effects in the Theory of Consumers' Demand. The Quarterly Journal of Economics, 64(2), 183-207 (reprinted in W. Breit and H. M. Hochman, Readings in Microeconomics, Second Edition, New York: Holt, Rinehart and Winston, 1971, pp. 115-116).

Leiner, B. M., Cerf, V. G., Clark, D. D., Kahn, R. E., Kleinrock, L., Lynch, D. C., et al. (2003, Dec 10). A Brief History of the Internet. Retrieved Jun 18, 2007, from http://www.isoc.org/internet/history/brief.shtml

Leland, H. E. (1979). Quacks, lemons, and licensing: a theory of minimum quality

standards. Journal of Political Economy, 87, 1328-1346.

Lera, E. (2000). Changing relations between manufacturing and service provision in a more competitive telecom environment. Telecommunications Policy, 24, 413-437.

Liebowitz, S. J., & Margolis, S. E. (1990). The fable of the keys. Journal of Law and

Economics, 33, 1-25.

Liebowitz, S. J., & Margolis, S. E. (1995). Path Dependence, Lock-In, and History. Journal of Law, Economics, & Organization, 11(1), 205-226.

Lilley, A. (2007, Jul 23). Media: New Media: Let the market not the politicians judge

mobile TV. The Guardian p. 8.

Liska, V. P. (1987). Astra: Closer to Reality. Satellite Communications, 11(10), 45.

Littlechild, S. C. (2006). Mobile termination charges: Calling Party Pays versus

Receiving Party Pays. Telecommunications Policy, 30(5,6), 242-277.

Luna, L. (2001a). CDMA carriers face standards conundrum. Telephony, 241(11), 72.

Luna, L. (2001b). iDEN directly connected to revenues. Telephony, 241(10).

Luna, L. (2002). Cingular's 2001 marked by a lot of pain, little gain. Telephony Online(Feb 4).

Lyytinen, K., & Fomin, V. V. (2002). Achieving high momentum in the evolution of wireless infrastructures: the battle over the 1G solutions. Telecommunications Policy, 26(3/4), 149-170.

Lyytinen, K., Keil, T., & Fomin, V. (2008). A Framework to Build Process Theories of

Anticipatory Information and Communications Technology (ICT) Standardizing.

Journal of IT Standards and Standardization Research, 6(1).

Lyytinen, K., & King, J. L. (2002). Around the cradle of the wireless revolution: the emergence and evolution of cellular telephony. Telecommunications Policy, 26(3/4), 97-100.

MacQueen, A. (2006, Oct 30). Media: The revolution must be televised: Instead of being

a threat, user-generated content is an exciting opportunity that can revitalise the

art of documentary film-making. The Guardian, p. 8.

Maitland, A. (2007). Virgin Mobile's Lobster fails to take off [Electronic Version].

Pocket-lint. Retrieved April 25, 2007, from http://www.pocket-lint.co.uk/news/news.phtml/6301/7325/virgin-mobile-lobster-doesn.phtml.

Malik, O. (2007). Aggregating the Aggregators. Business 2.0 (March 7).

Malone, T., Yates, J., & Benjamin, R. (1987). Electronic markets and electronic

hierarchies: Effects of information technology on market structure and corporate

strategies. Communications of the ACM, 30(6), 484-497.

Manninen, A. T. (2002). Elaboration of NMT and GSM Standards. University of

Jyvaskyla, Jyvaskyla.

Mansell, R. (1995). Standards, industrial policy and innovation. In R. Hawkins, R.

Mansell & J. Skea (Eds.), Standards, Innovation and Competitiveness: The

politics and economics of standards in natural and technical environments.

Aldershot, UK: Edward Elgar.

Mansell, R., & Steinmueller, W. E. (2002). Mobilizing the Information Society:

Strategies for Growth and Opportunity. Oxford: Oxford University Press.

Marcus, J. S. (2004). Call termination fees: The US in global perspective. Paper

presented at the 4th ZEW conference on the Economics of Information and

Communication Technologies, Mannheim, Germany, July (available at ftp://ftp.zew.de/pub/zew-docs/div/IKT04/Paper_Marcus_Parallel_Session.pdf).

Marek, S. (2002). Cellular South Chooses CDMA Route. Wireless Week(May 13).

Marek, S. (2006). Martha Stewart, Telenovelas and Mobile TV. Wireless Week(Jan 27).

MarketingWeek. (2005). BT hands ex-Sky executive top TV marketing post. Marketing

Week, 17.

MarketingWeek. (2006). BT lures Disney new media exec for interactive TV arm.

Marketing Week, 17.

Markus, M. L., Steinfield, C. W., Wigand, R. T., & Minton, G. (2006). Industry-wide IS

Standardization as Collective Action: The Case of the US Residential Mortgage

Industry. MIS Quarterly, forthcoming.

Mateja, J. (1998, Aug 16). Carmakers to Build More Function and Fun into Their

Workhorses. Chicago Tribune, p. 1.

Maxwell, J. C. (1865). A Dynamical Theory of the Electromagnetic Field. Philosophical

Transactions of the Royal Society of London, 155, 459-512.

McCall, M. (2002). Sprint PCS gets new leadership. Wireless Week, 8(36), 1.

McLean, C., & Hassard, J. (2004). Symmetrical Absence/Symmetrical Absurdity: Critical Notes on the Production of Actor-Network Accounts. Journal of Management Studies, 41(3), 493-519.

MediaFLO. (2007). FLO Technology Overview. San Diego: Qualcomm (retrieved April 23, 2007, from http://www.qualcomm.com/mediaflo/news/pdf/tech_overview.pdf).

Méndez-Wilson, D. (2001a). Fun, Fabulous And Profitable? Wireless Week(February 19).

Méndez-Wilson, D. (2001b). Nextel Chooses Motorola's iDEN. Wireless Week, 7(41), 1.

Méndez-Wilson, D. (2002). Top Five U.S. Carriers Now Offer SMS Interoperability.

Wireless Week(April 8).

Miles, S. (2006, Sept 19). Virgin Lobster 700 TV mobile phone review. Retrieved April

5, 2007, from http://www.pocket-lint.co.uk/reviews/review.phtml/1785/2809/lobster-700-tv-mobile-phone.phtml

Millar, S. (1997, Jan 30). Death of Titanic breathed life into Marconi's wireless telegraph

as world was awakened to its life-saving potential. The Guardian, p. 13.

Min-hee, K. (2005). KOREA: KBS to start terrestrial mobile television services next

month - World's first terrestrial mobile service to launch Dec. 1 [Electronic

Version]. Asia Media - Media News Daily from

http://www.asiamedia.ucla.edu/article.asp?parentid=34172.

MobileCommIntl. (2005). Virgin Mobile TV on trial in London. Mobile Communications

International(126), 1.

MobiTV. (2003). Press Release: Watch Live TV Content On Your Sprint Mobile Phone.

(November 13, 2003; retrieved April 24, 2007, from http://www.mobitv.com/press/press.php?i=press/release_111303).

MobiTV. (2004). Press Release: MobiTV Powers Sprint TV – new Video-on-Demand

Offering for Sprint PCS Vision Multimedia Services. (August 17; retrieved April 24, 2007, from http://www.mobitv.com/press/press.php?i=press/release_081704).

MobiTV. (2005a). Press Release: Cingular and MobiTV Announce First Radio Service

for the Nation’s Largest digital voice and data network. (Nov 14; retrieved April 24, 2007, from http://www.mobitv.com/press/press.php?i=press/release_111405a).

MobiTV. (2005b). Press Release: Cingular Goes Live with MobiTV. (January 27; retrieved April 24, 2007, from http://www.mobitv.com/press/press.php?i=press/release_012505).

MobiTV brings mobile video channels to Europe. (2005). Television Business

International, 17(6), 1.

Mock, D. (2005). The Qualcomm Equation. New York: Amacom.

Moe, T. M. (1984). The new economics of organization. American Journal of Political

Science, 28, 739-777.

Mölleryd, B. G. (1999). Entrepreneurship in technological Systems - The Development of

Mobile Telephony in Sweden. Stockholm School of Economics, EFI, the

Economic Research Institute.

Monteiro, E., & Hanseth, O. (1996). Social shaping of information infrastructure: on being

specific about the technology. In W. J. Orlikowski, G. Walsham, M. R. Jones & J.

I. DeGross (Eds.), Information Technology and Changes in Organizational Work.

London: Chapman & Hall.

Muther, C. (2002, Dec 22). Not Your Father's Jukebox Versatile Little Mp3 Player Also

Takes Pictures and Shoots Videos. Boston Globe, p. 24.

Nekovee, M. (2006). Dynamic spectrum access -- concepts and future architectures. BT

Technology Journal, 24(2).

Netherby, J. (2007, March 5). BitTorrent launches legal studio downloads. Video

Business, 27.

New Media Age. (2005a). Making money out of mobile: Portable telly. New Media Age,

S.4.

New Media Age. (2005b). Strategic Play - Cartoon Network: Animated play. New Media

Age(Oct. 6), 20.

New Media Age. (2005c). Strategic Play - Flextech: Spirited moves. New Media

Age(Dec. 1), 22.

New Media Age. (2006). O2 Strategic Play: O2's mobile TV trial. New Media Age, 23.

New Media Age. (2007). T-Mobile preparing for live-TV service. New Media Age, Mar

22, 10.

Nickerson, J. V., & zur Muehlen, M. (2006). The ecology of standards processes: Insights

from Internet standard making. MIS Quarterly, forthcoming.

Nicolaou, A. I. (1999). Social control in information systems development. Information

Technology & People, 12(2), 130-147.

Niepold, R. (2001). The European Regulation. In F. Hillebrand (Ed.), GSM & UMTS:

The Creation of Global Mobile Communications (pp. 128). Chichester: Wiley.

Noguchi, Y. (2005, Jan 30). Gone in 60 Seconds; Mobile-Phone TV Demands Quick

Shows. The Washington Post, p. A.01.

Nolle, T. (2005). The FCC and DSL: New murk? Network World, 22(33), 69.

Nolle, T. (2006). FCC auction results and the wireless future. Network World, 23(38), 35.

Norris, A. (2004, Jun 10). Online: Don't miss a kick: As Euro 2004 approaches, Ashley

Norris offers a round up of the must have gadgets to keep you on the ball

throughout the tournament. The Guardian, p. 19.

NYT. (1994, Nov 15). Company News; Expansion of a British Interactive TV Trial Is

Planned. New York Times, p. 5.

O'Brien, S. (2007). ITN ad funded vids on 3 UK. Mobile Entertainment(Apr 6; retrieved April 26, 2007, from http://www.mobile-ent.biz/news/26556/ITN-ad-funded-vids-on-3).

O'Halloran, J. (2007, Sep 19). Inside mobile TV. Electronics Weekly.

Ofcom. (2004). The Communications Market 2004: Office of Communications.

Ofcom. (2005a). The Communications Market 2005: Office of Communications.

Ofcom. (2005b, September 22). Final statements on the Strategic Review of

Telecommunications, and undertakings in lieu of a reference under the Enterprise

Act 2002. Retrieved June 18, 2007, from

http://www.ofcom.org.uk/consult/condocs/statement_tsr/statement.pdf

Ofcom. (2006a). The Communications Market 2006: Office of Communications.

Ofcom. (2006b). The International Communications Market 2006: Ofcom.

Ofcom. (2007a). The Communications Market 2007: Office of Communications.

Ofcom. (2007b). The Communications Market: Broadband - Digital Progress Report:

Ofcom (April 2).

Oliver, C. (1991). Strategic responses to institutional processes. Academy of Management

Review, 16, 145-179.

Olla, P., & Atkinson, C. (2004). Developing a wireless reference model for interpreting

complexity in wireless projects. Industrial Management + Data Systems,

104(3/4), 262.

Orlikowski, W. J., & Barley, S. R. (2001). Technology and institutions: What can

research on information technology and research on organizations learn from each

other? MIS Quarterly, 25(2), 145-165.

Pappalardo, D. (2005). Sprint rolls out EV-DO service. Network World, 22(27), 13.

Pappalardo, D. (2006). FCC's wireless auction fetches billions. Network World, 23(37),

22.

Parsons, P. R., & Frieden, R. M. (1998). The Cable and Satellite Television Industries.

Needham Heights, MA: Allyn and Bacon.

Pemberton, A. (2006). World Analogue Television Standards and Waveforms.

Retrieved May 4, 2007, from http://www.pembers.freeserve.co.uk/World-TV-Standards/Transmission-Systems.html#A

PhoneContent.com. (2004). LG Electronics Unveils World's First Terrestrial DMB-

Receiving Mobile Phone [Electronic Version]. PhoneContent.com. Retrieved

March 19, 2007 from http://www.phonecontent.com/bm/news/gnews/580.shtml.

Pinch, T. J., & Bijker, W. E. (1987). The social construction of facts and artifacts: Or how the sociology of science and technology might benefit each other. In W. E. Bijker, T. P. Hughes & T. J. Pinch (Eds.), The Social Construction of Technological Systems: New Directions in The

Sociology and History of Technology (pp. 399-441). Cambridge MA: MIT Press.

Popular Electronics. (1998). DVD hits the road. Popular Electronics, 15(9), 21.

Porter, M. E. (1980). Competitive strategy: Techniques for analyzing industries and

competitors. New York: Free Press.

Porter, M. E., & Millar, V. E. (1985). How information gives you competitive advantage.

Harvard Business Review, 63(4), 149-160.

PR Newswire. (1999a). Lucent Technologies Calls Harmonized Next Generation G3G

CDMA Proposal 'Great News for Network Operators and Consumers'. PR

Newswire(Jun 7), 1.

PR Newswire. (1999b). Wireless Operators Announce Agreement on Globally

Harmonized Third- Generation (G3G) Code Division Multiple Access Standard.

PR Newswire(Jun 8), 1.

PR Newswire. (2007, Feb 12). QUALCOMM and British Sky Broadcasting Complete

Second MediaFLO Trial in Manchester, United Kingdom: Second Successful

Trial Underscores Technical Advantages of MediaFLO Over DVB-H.

PRNewswire. (2005). UK's First Mobile TV Service, Orange TV, to Premiere Ashes

Heroes in Exclusive Live Broadcast of ICC Super Series. PR Newswire Europe

Including UK Disclose, n/a.

Puffert, D. (1999). Path Dependence in Economic History. Retrieved March 15, 2006,

from http://www.vwl.uni-muenchen.de/ls_komlos/puffert.html

Puffert, D. (2000). The Standardization of Track Gauge on North American Railways,

1830-1890. The Journal of Economic History, 60(4), 933-960.

Puuskari, M. (2002). Development of IP Multimedia Services and Architecture Standards

for 3G Networks. Retrieved 21 August, 2007, from http://akseli.tekes.fi/opencms/opencms/OhjelmaPortaali/ohjelmat/NETS/fi/Dokumenttiarkisto/Viestinta_ja_aktivointi/Seminaarit/NETS_1a/Mikko_Puuskari.pdf

Qualcomm. (1998, Oct 13). Qualcomm Provides IPR Position to ITU for Third-

Generation Proposals. Retrieved Oct 28, 2007, from

http://www.qualcomm.com/press/releases/1998/press779.html

Qualcomm. (1999, March 25). ERICSSON and Qualcomm Reach Global CDMA

Resolution. Retrieved Oct 28, 2007, from

http://www.qualcomm.com/press/releases/1999/press457.html

Qualcomm. (2007). Press Release: QUALCOMM and British Sky Broadcasting

Complete Second MediaFLO Trial. (Feb 12; retrieved April 25, 2007, from http://www.qualcomm.com/press/releases/2007/070212_british_sky_broadcasting.html).

Qualcomm, & AT&T. (2007). Press Release: AT&T Selects Qualcomm’s MediaFLO

USA for Mobile Entertainment Services. Retrieved Nov 2, 2007, from

http://www.qualcomm.com/press/releases/2007/070212_att_selects_s_print.html

Quigley, P. (2000). 3G auction hammer falls. Wireless Week, 6(18), 1.

Radio. (2007). Encyclopædia Britannica. Retrieved July 23, 2007, from

http://www.britannica.com/eb/article-9106101

Radousky, K. (1999). Guest Opinion: Going Global With GSM/TDMA. Wireless

Week(August 02).

Ramke, M. (2007). Modeo Update: NAB 2007. (retrieved April 25, 2007, from http://www.modeo.com/NAB_Pres_041707.pdf).

Ray, B. (2007). Virgin pulls the plug on mobile video. The Register. Retrieved November 11, 2007, from http://www.theregister.co.uk/2007/07/26/virgin_pulls_mobile_video/

Reding, V. (2006). Television is going Mobile - and needs a pan European policy

approach. Member of the European Commission responsible for Information

Society and Media - Speech at International CeBIT Summit, Hannover, Germany. (Mar 8; retrieved April 24, 2007, from http://europa.eu/rapid/pressReleasesAction.do?reference=SPEECH/06/157&format=HTML&aged=1&language=EN&guiLanguage=en).

Reports, A. N. (1999, May 27). Moving pictures: Drive-in movies take on a whole new meaning as DVD comes to a screen near the back seat of your car. The Guardian, p. 7.

Rigby, E. (2004). T-Mobile to offer exclusive UEFA Euro 2004 content. Revolution, 10.

Rohlfs, J. (1974). A Theory of Interdependent Demand for a Communications Service.

Bell Journal of Economics and Management Science, 5(1), 16-37.

Rosenbluth, T. (2007). Industry Survey Telecommunications: Wireline: Standard & Poor's

(Feb 8).

Rosenbrock, K. H. (2001). The Creation of 3GPP. In F. Hillebrand (Ed.), GSM & UMTS:

The Creation of Global Mobile Communications (pp. 221). Chichester: Wiley.

Rothman, W. (2006, Sep 27). A Movie Library in Your Living Room. New York Times,

p. 7.

Sallie, H. (1998). HDTV Signals a New Era for Viewers. Los Angeles Times, p. 1.

SatelliteNews. (2006a). BT Unveils Name For IPTV Service. Satellite News, 29(12), 1.

SatelliteNews. (2006b). Launch Of BT's IPTV Offering Could Slow BSkyB Subscriber

Growth. Satellite News, 29(47), 1.

Schmidt, S. K., & Werle, R. (1998). Coordinating Technology: Studies in the

international standardization of telecommunications. Cambridge, MA: The MIT

Press.

Schonfeld, E. (2007). Forget Joost. Meet the real TiVo of the Web. Business 2.0 (June

20).

Schwarz da Silva, J. (2001). The European Research. In F. Hillebrand (Ed.), GSM &

UMTS: The Creation of Global Mobile Communications (pp. 115). Chichester:

Wiley.

Scott, W. R. (2001). Institutions and organizations (2nd ed.). Thousand Oaks, CA: Sage

Publications.

Scott, W. R. (2005). Institutional Theory: Contributing to a Theoretical Research

Program. Oxford: Oxford University Press.

Shapiro, C., & Varian, H. R. (1998). Information Rules: A Strategic Guide to the Network

Economy. Boston, MA: Harvard Business School Press.

Shvets, V., Coe, N., & Kieley, A. (2005). Wireless Industry: The state of play: Deutsche

Bank.

Sidorova, A., & Sarker, S. (2000). Unearthing some causes of BPR failure: An actor-

network theory perspective. Paper presented at AMCIS 2000.

Siklos, R. (2007, Apr 1). Push Comes to Shove for Control of Web Video. New York

Times, p. 8.

Skype. (2006, February 14). Skype and Hutchison 3 Group Join Forces to Offer Skype on

Mobile Devices. Retrieved April 21, 2006, from

http://www.skype.com/company/news/2006/skype_hutchison.html

Slotten, H. R. (2002). Satellite Communications, Globalization, and the Cold War.

Technology and Culture, 43(April), 315-350.

Smith, B. (2000a). Is EDGE Balancing On The Edge? Wireless Week(November 20).

Smith, B. (2000b). Upgrades Pose Challenge. Wireless Week(Dec 4).

Smith, B. (2007). Mobile TV's High Wire Act. Wireless Week(April 15).

Snoddy, R. (1997, Jan 4). US group wins BBC transmitters: Castle Tower Communications named as 'preferred bidder' in £250m deal. Financial Times, p. 4.

Spikes, R. B. a. S. (2007, Apr 4). National Grid sells wireless unit for £2.5bn; Australian group gains transmission monopoly. Financial Times, p. 17.

Standage, T. (1999). The Victorian Internet. New York: Berkley Books.

Steinbock, D. (2001). The Nokia Revolution. New York: Amacom.

Stern, C. (2001, Dec 20). Comcast Deal Gives Armstrong Another Shot; Cable Strategy

Failed at AT&T, So He Moves With the Business. The Washington Post.

Steward, S. (1995). Packet or circuit: Deciding factors. Cellular Business, 12(7), 18.

Swain, R. S. (1995). RACE Vision of UMTS. Paper presented at the Workshop of Third

Generation Mobile Systems, DGXIII-B.

Swann, G. M. P. (1985). Product competition in microprocessors. Journal of Industrial

Economics, 34, 33-54.

Swann, G. M. P. (1999). The Economics of Measurement: Report for NMS Review:

Department of Trade and Industry (UK).

Swann, G. M. P. (2000). The Economics of Standardization: Final Report for Standards

and Technical Regulation Directorate of the Department of Trade and Industry:

Manchester Business School.

Tassey, G. (2000). Standardization in Technology-Based Markets. Research Policy,

29(4/5), 587-602.

Taylor, M., Martens, L., & Jaeger, D. (2004). Full Service Deployment via TV Broadband

Networks - System Evolution Based on MSO Experiences and Requirements.

Paper presented at the ITU-T Workshop All Star Network Access, from www.itu.int/ITU-T/worksem/asna/presentations/Session_4/asna_0604_s4_p5_dj.ppt.

Teather, D. (2001, Jul 11). New economy: BT and broadcasters link to take on cable

industry. The Guardian, p. 24.

Telecom_Policy. (2002). Special Issue on the role of standardization in the rise of cellular

telephony. Telecommunications Policy, 26(3-4), 97-217.

Telecoms Deal Report. (2000). Hutchison Whampoa: Europe's East Asian Pioneer?

Telecoms Deal Report, 2(11), 1.

Telecomworldwire. (2005). Orange brings 3G to UK PAYG customers.

Telecomworldwire, 1.

Telecomworldwire. (2006a). NTL and Telewest announce completion of merger.

Telecomworldwire, 1.

Telecomworldwire. (2006b). NTL to rebrand as Virgin Media in 2007.

Telecomworldwire, 1.

Telephony. (2007, March). IPTV: Watching the next generation of video. Telephony

(special supplement).

Television. (2007). Encyclopædia Britannica. Retrieved May 4, 2007, from Encyclopædia Britannica Online: http://www.britannica.com/eb/article-9106102

Temple, S. (2001). The GSM Memorandum of Understanding: The Engine that Pushed

GSM to the Market. In F. Hillebrand (Ed.), GSM & UMTS: The Creation of

Global Mobile Communications (pp. 36). Chichester: Wiley.

Thelander, M. W. (2005). The 3G Evolution - Taking CDMA2000 into the next decade.

Retrieved August 10, 2007, from

http://www.cdg.org/resources/white_papers/files/3G_Evol_Oct05.pdf

Thomas, D. (2006, Aug 24). BT Group Banks on Internet-TV Launch in U.K. Wall Street

Journal, p. 3.

Thompson, B. (2006). Will 'fourplay' be the next big thing? Retrieved April 12, 2006,

from http://news.bbc.co.uk/1/hi/technology/4896104.stm

Tilson, D., & Lyytinen, K. (2004). The 3G Transition: Changes in the U.S. Wireless

Industry. Sprouts: Working Papers on Information Environments, Systems and

Organizations, 4(Summer), Article 8.

http://weatherhead.cwru.edu/sprouts/2004/040308.pdf.

Tilson, D., & Lyytinen, K. (2005, June 1-3). An actor-network study of 3G standards

making and adoption in the US wireless industry - A case study of wireless

operator standardization strategies. Paper presented at the Hong Kong Mobility

Roundtable, Hong Kong.

Tilson, D., & Lyytinen, K. (2006). The 3G Transition: Changes in the U.S. Wireless

Industry. Telecommunications Policy, 30(10-11), 569-586.

Tilson, D., Lyytinen, K., Sørensen, C., & Liebenau, J. (2006, June 1-2). Coordination of

technology and diverse organizational actors during service innovation – the case

of wireless data services in the United Kingdom Paper presented at the Mobility

Roundtable, Helsinki.

Tolbert, P., & Zucker, L. (1994). Institutional Analyses of Organizations: Legitimate but

not Institutionalized (Working Paper Series No. issr-1004): Institute for Social

Science Research, UCLA.

Trosby, F. (2004). SMS, the strange duckling of GSM. Telektronikk, 3, 187-194.

TV Business Int. (2005). The small screen becomes a big deal. Television Business

International, 17(10), 1.

TVBus. (2006). BT's Vision becomes clearer. Television Business International(8), 1.

UMTS Decision. (1999). Decision No 128/1999/EC of the European Parliament and of

the Council of 14 December 1998 on the coordinated introduction of a third-

generation mobile and wireless communications system (UMTS) in the

Community. Official Journal of the European Communities, 42(L17), 1.

UMTS_Forum. (2003). Press Release: 3 UK video mobile available Nationwide. (Jun 2; retrieved April 24, 2007, from http://www.umts-forum.org/content/view/190/95/).

Verizon_Wireless. (2005). Press Release: On-Demand In The Palm Of Your Hand:

Verizon Wireless Launches “VCAST” – Nation's First And Only Consumer 3G

Multimedia Service. (Jan 7; retrieved April 24, 2007, from http://news.vzw.com/news/2005/01/pr2005-01-07.html).

Victor, K. (2005). Technology: Opinion: How to fit the world in your pocket. The

Guardian, p. 4.

Vidgen, R., & McMaster, T. (1996). Black boxes, non-human stakeholders and the

translation of IT through mediation. In W. J. Orlikowski, G. Walsham, M. Jones

& J. I. DeGross (Eds.), Information technology and changes in organizational

work (pp. 250-271). London: Chapman and Hall.

Vodafone. (2005, October 31). Vodafone UK and Sky team up to launch Sky Mobile TV. Retrieved March 23, 2007.

Walker, J., & Ferguson, D. (1997). The Broadcast Television Industry. Needham Heights,

MA: Allyn & Bacon.

Walsham, G. (1997). Actor-Network Theory and IS research: Current status and future

prospects. In A. S. Lee, J. Liebenau & J. I. DeGross (Eds.), Information systems

and qualitative research (pp. 466-480). London: Chapman and Hall.

Walsham, G., & Sahay, S. (1996). GIS for district-level administration in India:

Problems and opportunities. Lancaster, United Kingdom.

Ward, M. (2007, March 16). Mobile TV warned to standardise. Retrieved April 18,

2007, from http://news.bbc.co.uk/1/hi/technology/6459161.stm

Washington Post. (1952, Nov 18). Portable TV Arrives With Tubeless Set. The

Washington Post, p. 16.

Weitzel, T., Beimborn, D., & König, W. (2006). A unified economic model of standard

diffusion: The impact of standardization cost, network effects and network

topology. MIS Quarterly, forthcoming.

West, J. (2000). Institutional constraints in the initial deployment of cellular telephone

service on three continents. In K. Jakobs (Ed.), Information Technology Standards

and Standardization: A Global Perspective (pp. 198-221). Hershey: Idea Group

Publishing.

West, J. (2002a). Qualcomm 2000: CDMA Technologies (pp. 24). San Jose: San Jose

State University.

West, J. (2002b). Qualcomm 2001: 3G Strategies (pp. 15). San Jose: San Jose State

University.

West, J., & Fomin, V. V. (2001). When government inherently matters: National

innovation systems in the mobile telephone industry, 1946-2000. Paper presented

at the Academy of Management Conference, Washington D.C.

Wheatley, J. J. (1999). World Communications Economics. Stevenage, United Kingdom:

The Institute of Electrical Engineers.

Wickham, R. (2007). V Cast Mobile TV Goes Live. Wireless Week(March 1).

Wigand, R. T., Steinfield, C. W., & Markus, M. L. (2005). Information Technology

Standards Choices and Industry Structure Outcomes: The Case of the U.S. Home

Mortgage Industry. Journal of Management Information Systems, 22(2), 165-191.

Williams, G. (1998). Guest Opinion: 3G's Two-legged Stool. Wireless Week(December

14).

Williamson, O. E. (1975). Markets and Hierarchies: Analysis and Antitrust Implications.

New York: Free Press.

Williamson, O. E. (1985). The economic institutions of capitalism: Firms, markets,

relational contracting. New York, NY: Free Press.

Wireless_News. (2006). IPWireless Previews TDtv. Wireless News(Jan 21), 1.

Wireless_News. (2007). Vodafone, Telefonica, Orange, and 3UK Complete Successful

TDtv Trial with IPWireless and MobiTV. Wireless News(Feb 12), 1.

Wray, R. (2006a). Financial: BSkyB looks at broadcasting straight to mobiles: Pay-TV

group conducts new technology trials; plan would bypass five existing phone

operators. The Guardian, p. 40.

Wray, R. (2006b). Financial: NTL launches first broadcast TV service to mobiles. The

Guardian, p. 26.

Wray, R. (2006c, Jun 1). Financial: Orange plans broadband TV rival to BT Vision. The

Guardian, p. 27.

Wray, R. (2006d). Mobile TV trial finds users want more channels. The Guardian, p. 25.

Wray, R. (2007a, Jul 10). Brussels courts phone company anger with move to fix

mobile TV standard. The Guardian, p. 24.

Wray, R. (2007b, Jul 17). Financial: O2 drops i-mode mobile internet service in Britain.

The Guardian.

Wray, R. (2007c, Jan 17). Mobile TV fails to sell despite ad campaign: Virgin Mobile

sells fewer than 10,000 handsets; consumers shun the Lobster phone. The

Guardian, p. 24.

Wray, R., & Tryhorn, C. (2006, Nov 28). Financial: BT joins battle for digital TV

audience: Content via broadband service is set for launch; telecoms firm

challenges Sky with view-on-demand. The Guardian, p. 27.

Wright, J. (1999, Jun 10). Your Wheels; TVs, VCRs On Board: Are They Worth the

Entertainment Value? Los Angeles Times, p. 10.

Washington Post Staff Writer. (2006, Sep 19). FCC Wireless Auction Could Open Up Airwaves. The Washington Post, p. 1.

WSJ. (2007, December 11). Google Inc.: YouTube to Expand Program To Put Ads in

Online Videos. Wall Street Journal (Eastern Edition).

www.3g.co.uk. (2005). Vodafone UK and SKY Team Up To Launch SKY Mobile TV

[Electronic Version]. www.3g.co.uk. Retrieved March 28, 2006 from

http://www.3g.co.uk/PR/Nov2005/2133.htm.

Wynn, C. (2006, Sep 4). Media: Go figure: Mobile TV: Why is the UK not setting the

pace for mobile broadcasts? The Guardian, p. 9.

Yin, R. K. (2003). Case study research: design and methods (3rd ed.). Thousand Oaks,

CA: Sage Publications.

Yoo, Y., Lyytinen, K., & Yang, H. (2004, Mar 11-12). The role of standards and its

impact on the diffusion of broadband mobile services: A Korean case. Paper

presented at the Austin Mobility Roundtable, Austin TX.

Yoo, Y., Lyytinen, K., & Yang, H. (2005). The role of standards in innovation and

diffusion of broadband mobile services: The case of South Korea. Journal of

Strategic Information Systems, 14(3), 323-353.

Yoshida, J. (2006). Hybrid IC reopens analog mobile-TV debate. EE Times.

Yu, J. I. (1992, October 19-21). Overview of EIA/TIA IS-41 [cellular radio standard].

Paper presented at the Third IEEE International Symposium on Personal, Indoor

and Mobile Radio Communications (PIMRC '92), Boston, MA.

Zucker, L. (1977). The role of institutionalization in cultural persistence. American Sociological Review, 42, 726-743.
