Goal-Based Initiatives Solve Empirically Proven

***GOALS CP 1nc—counterplan

COUNTERPLAN TEXT—

Goal-based initiatives solve—empirically proven Willsie 2/23/11 – senior at the Princeton University Woodrow Wilson School of Public and International Affairs and contributor in AEL’s New Energy Leaders Project (Tucker, “Grounding Our Innovation Policy Debate,” Americans for Energy Leadership, http://leadenergy.org/2011/02/grounding-our-innovation-policy-debate/)

The second major contribution of the Sematech initiative was the role of government in coordinating the member organizations and helping to direct the research. A conservative criticism of government involvement in private research is that the government doesn’t know as well as private firms what research should be undertaken. By ‘picking winners,’ the government does not force technologies to prove themselves in the marketplace and might waste money on research initiatives that finally fail. However, several aspects of the microchip industry made government involvement preferable. First, rather than picking a specific technological solution, the government merely outlined a broad objective – to improve American manufacturing processes. This macro level direction focused the research on issues that the government deemed critical while allowing the freedom for private industry to determine the best technological solutions. Furthermore, Japan was already benefitting from such a research initiative, so the government was assured it would be worthwhile and private industry had signaled their approval of the research objective by donating half of the $200 million in funding for the project.

Picking winners causes cost overruns Niman ’95 – Associate Professor of Economics, Whittemore School of Business and Economics, University of New Hampshire (Neil B, “Picking winners and losers in the global technology race,” July, Contemporary Economic Policy, Vol. 13, Iss. 3, proquest)

In addition, Romano (1991) contends that government subsidies can spawn a moral hazard problem by creating incentives for researchers to "pocket" publicly provided funds. Researchers also may engage in post-contractual opportunistic behavior (Williamson, 1985). Once a contract has been signed, the recipient of the research award may threaten endless delays or inflated costs in order to gain additional government funding. With potential jobs at stake in key congressional districts, the government may accede to the contractor's opportunistic demands.

NASA cost overruns cause budget trade-offs Fox News ’09 (“NASA BY THE NUMBERS: COST OVERRUNS PLAGUE KEY PROJECTS,” April 10, http://www.foxnews.com/story/0,2933,513575,00.html#ixzz1R4TjDxYf)

NASA's next administrator will have his hands full — with federal funds and a set of formidable challenges, including how to return to the moon by 2020 without seeing the agency's budget go into orbit itself. Despite cost overruns totaling nearly $1.1 billion in nine of its flagship projects, NASA will see its 2010 fiscal year budget increase to $18.7 billion. Combined with the $1 billion NASA got from the new stimulus package, that’s $2.4 billion more than the space agency received in 2008. President Obama has vowed to return NASA to its glory days — to “inspire the world,” grow the economy and make America stronger — by funding, among other projects, climate change research and the completion of the International Space Station. But congressional auditors have called for a “more disciplined approach” to projects like the Mars Science Laboratory, which has seen its cost skyrocket to $2.3 billion since October 2007, an increase of more than $657 million — an amount roughly equivalent to the gross domestic product of Grenada. "What is common among these and other programs is that whether they succeed or fail, they cost more to build and take longer to launch than planned," according to a Government Accountability Office report released last month on NASA's large-scale programs. "As a result, NASA is able to accomplish less than it plans with the money it is allocated, and it is forced to make unplanned trade-offs among its projects — shorting one to pay for the mistakes of another."

And mission failure—turns the case GAO 3/3/11 – United States Government Accountability Office (“NASA Assessments of Selected Large-Scale Projects,” http://www.gao.gov/new.items/d11239sp.pdf)

Five projects in implementation and one project in formulation reported experiencing contractor challenges, including not completing work on time, not identifying risks for the project, and inadequate oversight. Contractor management challenges have been reported for a greater number of projects and with a greater impact for projects in past reports. Although the impact of this challenge on projects we reviewed this year has diminished, as contractors spend about 85 percent of NASA’s annual budget, their performance is critical in terms of achieving the success of many NASA missions. As a result, we continue to identify this area as a common project challenge that can contribute to cost and schedule growth.

1nc—politics link

NASA cost overruns empirically trigger Congressional backlash Klamper ’10 – Space News Contributor (Amy, “Congress Criticizes Spending on NASA’s James Webb Space Telescope,” January 11, http://www.spacenews.com/civil/100111-congress-criticizes-spending-webb-telescope.html)

WASHINGTON — NASA and U.S. lawmakers are at odds over whether a 2009 budget increase for the agency’s $5 billion James Webb Space Telescope, the bulk of which was provided as part of last year’s economic stimulus package, constitutes a cost overrun on the program. In the report accompanying an omnibus U.S. federal spending package for 2010 that includes NASA funding, which passed in December, Congress criticized the program for spending some $95 million more than was budgeted for 2009. The language, inserted by Sen. Barbara Mikulski (D-Md.), who chairs the Senate Appropriations commerce, justice, science subcommittee, indicated that additional cost overruns on Webb loom in 2010. Jon Morse, director of astrophysics at NASA’s Goddard Space Flight Center, said the bulk of last year’s excess spending on the flagship-class observatory, $75 million in 2009 American Recovery and Reinvestment Act funds, was needed to keep the program’s workers employed. “There’s no question that if we had not had an injection of those funds that we would have had to lay off a couple hundred at least — we would have had to slow down work,” he said. He added that program officials became aware of the budget shortfall when an independent review board drew attention to it during Webb’s mission confirmation review in 2008. “That adjustment was made in order to achieve the budget profile that the standing review board recommended, and we could not have gotten there without that stimulus money,” he said, adding “there was a good six months between the review and when we were actually able to start thinking about making that adjustment.” Morse said the near-term cash infusion will reduce the chances of future cost growth on the program. “The cheapest [James Webb Space Telescope] is the one that launches the soonest,” Morse said. 
“Because if you slip work you wind up paying a price for that; not just inflation but you have an inefficiency in the way you use your work force and you wind up paying two or three times the amount that you slipped, so we’re really trying to hold schedule.” Injecting resources “here and there” as needed is the best way to do that, he said, adding that the stimulus funding will not add to the overall cost of the program. A Senate aide was not buying that argument, however, and said Webb telescope program officials need to level with lawmakers about cost overruns. “If a program has a certain amount budgeted and it needs more money, that’s an overrun,” the aide said. “We appropriate one year at a time, so when you have an overrun in one year, and you have to cover it, it’s taking from other priorities.”

***POLITICS

2nc—politics link

Congress hates NASA funding—cost overruns trigger the link Block and Matthews 1/19/11 – Orlando Sentinel (Robert and Mark K, “Analysis: NASA flails as forces pull on it from all directions,” http://www.physorg.com/news/2011-01-analysis-nasa-flails.html)

With the space shuttle set to retire this year, and no successor imminent, today's NASA is being pulled apart by burdensome congressional demands, shrinking federal budgets, greedy contractors, a hidebound bureaucracy and an ambitious new commercial space industry that wants to shake up the status quo. "Our civil space agency has decayed from Kennedy's and Reagan's visions of opening a new frontier to the point where it's just a jobs program in a death spiral of addiction and denial, with thousands of honest innovators trapped inside like flies in bureaucratic amber," said space-policy consultant James Muncy. Efforts to get the agency back on track are in trouble. Already, a new plan for NASA signed into law by President Barack Obama in October - to replace the Constellation program, which spent $12 billion without producing a rocket - appears to be unraveling. In a letter to Congress last week, NASA all but threw up its hands - telling lawmakers that it could not build the "heavy-lift" rocket and capsule Congress wants on the budget and schedule it demands. Congress had specified that NASA use solid-rocket motors designed for Constellation's Ares I rocket, as well as parts from the space shuttle, to speed construction of a new rocket. But the agency has told the Orlando Sentinel that the new rocket could cost as much as $20 billion - about $9 billion more than the initial budget Congress has set - and take up to two years longer than the six-year deadline set by lawmakers. Two key NASA backers in the Senate - Florida Democrat Bill Nelson and Texas Republican Kay Bailey Hutchison - responded with a simple message: Try harder. But trying is unlikely to be enough given the agency's history of busting budgets, experts say. In 2004 and again in 2008, the Congressional Budget Office said that after studying 72 past NASA projects, it found that cost overruns of as much as 50 percent are routine for the agency. 
And though Congress wants NASA to cut costs, it also wants the agency to continue using NASA's expensive work force and existing cost-plus contracts - meaning it will be very difficult to slash overhead costs. But extending those contracts, as Nelson and Hutchison are demanding, might not be so easy. In recent weeks, key aerospace companies have demanded that NASA open the new rocket project to competition or face the prospect of lawsuits. One, aerospace giant Aerojet, told NASA in a letter Dec. 1, that "we do intend to compete" for the solid-rocket boosters and engines that Congress wants put on the new rocket. Aerojet makes solid rockets as well as liquid-rocket engines. The company has long been unhappy that NASA awarded a no-bid contract for the first stage of the Ares I rocket to rival solid-rocket manufacturer ATK, in part on the erroneous grounds that it was the country's only producer of large solid-fuel rockets. With a new project in the offing, Aerojet wants the chance to win the business back. "Aerojet believes that the only affordable and sustainable path ... is achieved through competition," the letter said. But a new competition - combined with the inevitable protests and legal challenges that would follow it - runs the risk of further slowing any effort to quickly convert pieces of the shuttle and Constellation into a new spacecraft system by Dec. 31, 2016, as the new law demands. Senate staffers who helped craft the new law say that NASA has no choice but to extend the contracts. However, several contracting lawyers disagree, saying that a new rocket is a significant-enough change in the scope of the project as to require NASA to rebid the contracts or face potentially lengthy legal action. Aerojet officials would not comment on the letter, and NASA would only say that it is still considering its options. Nelson and Hutchison were not amused.
"The law directs NASA to build on past investments in human spaceflight by leveraging existing knowledge from the space shuttle and Constellation programs," they said in a letter to NASA Administrator Charlie Bolden. "We expect NASA to work with Congress to identify ... how existing contracts and technologies will be utilized," reminding Bolden that this was not optional. "It is the law." Bolden, for his part, has been conspicuous by his absence, almost invisible to the public and refraining from any public comments on the agency's new direction. NASA spokesman Michael Braukus said the agency "is committed to finding the necessary efficiencies to drive costs down and develop this system as soon as possible." He promised "cultural changes (that will) drive down development and operational costs through innovation, improved practices, right-sized infrastructure and reducing other fixed costs." Some observers are sympathetic to the agency's plight. "It's a really tough spot to be in. And all indications are that's going to get tougher because of the fiscal situation," said Howard McCurdy, a space expert at American University. He said a major problem is that NASA is still rooted in the big-budget, big-project approach of the Apollo program of the 1960s. "The brute-force method of spaceflight is in control (of NASA), and it's very susceptible to spending cuts," he said, adding that the new law was "another case of Congress trying to repeal the law of physics."

***PICKING WINNERS BAD

2nc—generic

Picking winners empirically fails Adrianson 5/24/11 – editor of InsiderOnline.org (Alex, “When Government Picks ‘Winners,’ They Sometimes Turn Out to Be Losers,” Heritage Foundation, http://blog.heritage.org/2011/05/24/when-government-picks-winners-they-sometimes-turn-out-to-be-losers/)

The government isn’t very good at picking technologies to invest in. Undeterred by this track record, however, some members of Congress now want to create a new subsidy program for cars powered by natural gas. Time to remember two great moments in government technology policy. In the early 1960s, an adviser told President Kennedy that failure to enter the supersonic transport market would cost the United States 50,000 jobs, $4 billion in income, and $3 billion in capital. In 1968, the federal government began subsidizing the development of the Boeing 2707—an aircraft that was expected to compete with the French Concorde. Supersonic travel uses a lot more fuel, and spikes in the price of fuel made the project uneconomical. Concorde flights were also so loud that they provoked a backlash against supersonic flight. After spending nearly $1 billion without completing a prototype, the Boeing project was canceled in 1971. Flash forward a decade. High gas prices induced Congress to create the Synthetic Fuels Corporation to invest in developing alternatives to imported fossil fuels. This public-private collaboration was eventually killed in 1986, but not before spending $4.5 billion of taxpayers’ money without producing any new fuels. This time falling gas prices foiled the government’s plans, as alternative fuels could not compete against cheap gas. (See “Energy Subsidies,” by Chris Edwards, DownsizingGovernment.org, February 2009.) Government can’t predict the price of fuel—or many other factors that determine whether a particular technology will succeed. Maybe compressed natural gas vehicles are the future. But, as Nick Loris points out, if consumers want them, private companies will figure out how to make them without government subsidies. For more on why these subsidies are a bad idea, see Loris’s paper “Natural Gas Vehicle Subsidies Hurt Consumers,” The Heritage Foundation, May 11, 2011.

Government fails at picking winners Slivinski ’01 – fiscal policy analyst at the Cato Institute (Stephen, “The Corporate Welfare Budget Bigger Than Ever,” Cato Institute Policy Analysis No. 415, October 10, http://www.cato.org/pubs/pas/pa415.pdf)

The federal government has a disappointing record of picking winners and losers. The function of private capital markets is to direct investment to industries and firms that offer the highest potential rate of return. The capital markets, in effect, are in the business of selecting corporate winners and losers. Yet the underlying premise of federal business subsidies is that the government can direct the limited pool of capital funds just as effectively as, if not better than, venture capitalists and money managers. The truth is that capital markets rely on more sophisticated knowledge, and in much larger quantities, than a government could ever collect, use effectively, or even fathom. That dooms most capital allocation decisions by government bureaucracy to failure.25 As T. J. Rodgers, president and CEO of Cypress Semiconductors, has noted, when the federal government tries to control investment capital, the “proven moneymakers and job creators lose control over the investment of their funds, and unproven Washington amateurs take over.”26 The evidence supports this contention. For instance, the Small Business Administration, which subsidizes loans to small businesses, has a delinquency rate of up to 15 percent in any given year.27 The average delinquency rate for similar commercial loans by private lenders is around 2 percent.28 The same goes for agricultural loans. The Farm Service Agency’s direct farm loan portfolio has a delinquency rate of just over 28 percent.29 In fact, a Department of Agriculture study stated: “All major institutional lender groups except the Farm Service Agency continue to experience historically low levels of delinquencies, foreclosures, net loan charge-offs, and loan restructuring.”30 Indeed, the corresponding private delinquency rate was no higher than 5.4 percent over the last 10 years, and that rate has been declining.31 Taxpayers lost more than $2 billion in defaulted government agricultural loans between 1995 and 1997.32 Congressional oversight and agency reforms might reduce those default rates only slightly. But the fundamental fact will always remain: government, by its nature, will never be able to participate in a capital market effectively and efficiently.

Governments can’t pick winners and it leads to delays Loris 5/18/11 – focuses on energy, environmental and regulatory issues as a policy analyst in the Roe Institute for Economic Policy Studies at The Heritage Foundation (Nicolas, “Stop Picking Energy Winners and Losers with the Tax Code,” Heritage Foundation, http://www.heritage.org/Research/Commentary/2011/05/Stop-Picking-Energy-Winners-and-Losers-with-the-Tax-Code)

When the government decides to favor a technology with subsidies, it’s a good bet that subsidy “winner” is a loser in the marketplace. Political decisions to provide subsidies distort the marketplace at the expense of economic growth and prosperity. That’s exactly what has happened--and what continues to happen--with America’s energy tax policy. Reversing this practice will benefit American consumers and taxpayers. Special tax treatment can come on the production side or the consumption side. On the consumption side, the Energy Policy Act of 2005 and American Recovery and Reinvestment Act provide tax rebates for hybrid and electric vehicles. Currently, policymakers are pushing to expand and extend tax credits that subsidize the production of vehicles powered by natural gas and other credits that support refueling infrastructure. These are just two examples of energy tax provisions that benefit a specific group of producers. Our federal government has woven a complicated web of energy tax policy over the past few decades. Such preferential tax treatment causes a number of problems. First, special tax credits for cherry-picked technologies artificially reduce the price for consumers. This makes them seem far more competitive than they actually are. Rather than increase competition, the artificial market distortion gives these technologies an unfair price advantage over other technologies. The more concentrated the subsidy or preferential treatment, the worse the policy is because the crowding-out effect for other technologies is larger. If subsidized technologies are market viable, then the tax break merely offsets private-sector costs for investments that would have been made either way. This creates industry complacency and perpetuates economic inefficiency by disconnecting market success from production costs.
Furthermore, when the government becomes involved in the decision-making process, it increases the business incentive to send lobbyists to Capitol Hill to make their pitch why their industry needs those tax credits. Industries will plead that they need five years of tax credits then they’ll be good to go on their own. Five years later, they’re asking for five more years. These specific carve outs reduce the incentive for producers to be cost competitive with technologies that do not rely on help from the government. The reality is, if electric vehicles, natural gas vehicles or any technology is profitable, it won’t need preferential tax treatment. The market is much better than bureaucrats at sorting out the good ideas from the bad. Subsidies only centralize power in Washington and allow lobbyists and politicians to decide which companies will produce. The other argument industries employ when demanding tax breaks is the old “other industries get them, so we should too.” This is not a good reason to apply more industry-specific tax cuts; it’s a good reason to remove those already in place. Lately, high gas prices and high profits for oil companies have tempted some lawmakers to propose removing “subsidies” for oil and gas companies. The problem is, the tax provisions they target are not special interest subsidies at all. Take the domestic manufacturer’s tax credit. It applies to any number of U.S. producers—clothing manufacturers, road builders, electricity generators, water companies and more. Making oil and gas production companies ineligible for this credit neither removes a subsidy nor closes a tax loophole. It merely imposes a targeted tax hike on oil and gas producers. The same is true for foreign tax credits and deferral of foreign income. These are two critical features of a worldwide tax system that prevent the U.S. corporate income tax from double taxing—and further crippling—the international competitiveness of U.S. companies. 
Broadly available tax provisions are not oil subsidies. Peeling back the industry-specific tax credits and evenly applying lower taxation to all energy sources will allow the best technologies to reach the marketplace, which will ultimately benefit the consumer. It sounds easy enough. But given the cycle of politicians picking winners and then those winners donating to the same politicians come election time, it’s a difficult end to achieve.

2nc—space

Government can’t pick winners for space transportation Slivinski ’01 – fiscal policy analyst at the Cato Institute (Stephen, “The Corporate Welfare Budget Bigger Than Ever,” Cato Institute Policy Analysis No. 415, October 10, http://www.cato.org/pubs/pas/pa415.pdf)

Commercial Space Transportation. This program was created to encourage private space launches and expendable launch vehicles with taxpayer money. While it is important that the private sector be allowed to develop commercially viable means of traveling in space, it is not the government’s role to pick winners in that market by giving government money to favored aerospace companies. The venture capital markets are capable of handling that type of development funding.

Private sector should handle space R&D Slivinski ’01 – fiscal policy analyst at the Cato Institute (Stephen, “The Corporate Welfare Budget Bigger Than Ever,” Cato Institute Policy Analysis No. 415, October 10, http://www.cato.org/pubs/pas/pa415.pdf)

National Aeronautics and Space Administration: Aeronautical Technology and Commercialization Activities. This account funds R&D activities (often in direct partnership with private industry) that benefit the commercial airline industry. Current projects include developing new propulsion systems, robotics, and a solar-powered airplane. Such applied R&D benefits primarily specific private companies, such as Boeing, Lockheed Martin, and Airbus. The government has trouble picking winners in other industries and will likely have a difficult time doing so in the aerospace industry as well.

***PERM

2nc—at: perm do cp

1. The counterplan’s competitive National Academy of Science ‘08 (A Constrained Space Exploration Technology Program: A Review of NASA's Exploration Technology Development Program, The National Academies Press, Committee to Review NASA’s Exploration Technology Development Program, Aeronautics and Space Engineering Board, Division on Engineering and Physical Sciences, Copyright 2008 by the National Academy of Sciences)

The committee takes literally the implication of the VSE’s introductory text, which states that “a robust space exploration policy” is the means to “advance the U.S. scientific, security, and economic interests,” and not an end in itself.

2. Referencing a particular technology locks it in as the path that we will pursue, which is distinct from a goals-based initiative—permutation can’t solve, that’s 1nc Willsie

2nc—at: perm do both

1. Links to cost overruns—the permutation still locks in a specific technology. Even if an alternative is found, it still leads to cost overruns in the short term.

***SOLVENCY

2nc—generic

Selecting the best result of the private sector is comparatively better than locking in a product Sutherland and Taylor ‘02 – *energy economist who has worked at Argonne National Laboratory and the American Petroleum Institute and **senior fellow at the Cato Institute (Roger J. and Jerry, “Time to Overhaul Federal Energy R&D,” February 7, Cato Institute, http://www.cato.org/pubs/pas/pa424.pdf)

Therefore, a better model for allocating science dollars is one based not on picking potential technological “winners” but on putting in place the proper investment incentives. If the incentive is to produce the largest public benefit, investments may provide such benefits. However, if the decision to fund a scientific area is based on political, institutional, or other considerations not relating to public benefits, the investment is less likely to be in the public interest. Simple observation of private markets suggests a model based on incentives. Suppose we observe a firm undertaking an investment in a plant and equipment. We ask whether the investment is in the public interest. Is the initial decision to invest in this plant a wise investment choice? If the investment provides benefits to the public, will the investment continue? If the investment fails to benefit the public, will the government cancel the investment? How should a government policymaker, concerned with the public well-being, answer these questions? The policymaker has no empirical evidence on the expected rate of return on the investment. The incentives characterizing investment decisions indicate whether the investment is likely to be in the public interest. The business firm acts in its self-interest, which is to make money. However, to make money, the firm must provide a product or service that customers are willing to pay for. As long as the firm expects that investing in the plant will provide a good or service that customers value, the firm continues to invest.
If the firm learns that the investment is of insufficient value, the firm cancels the investment. Just as explained by Adam Smith, private markets operate as if guided by an invisible hand to produce public benefits. Although the government policymaker has no empirical data, a reasonable assumption is that the investment is in the public interest, because the incentives characterizing the investment reflect the public interest.

Encouraging winners to arise is empirically more effective than picking them Economist ’10 (“Leviathan Inc: Governments seem to have forgotten that picking industrial winners nearly always fails,” August 5, http://www.economist.com/node/16743343)

In the rich world, meanwhile, the record shows, again and again, that industrial policy doesn’t work. The hall of infamy is filled with costly failures like Minitel (a dead-end French national communications network long since overtaken by the internet) and British Leyland (a nationalised car company). However many new justifications are invented for the government to pick winners, and coddle losers, it will remain a bad old idea. Thanks to globalisation and the rise of the information economy, new ideas move to market faster than ever before. No bureaucrat could have predicted the success of Nestlé’s Nespresso coffee-capsule system—just as none foresaw that utility vehicles, vacuum cleaners and tufted carpets (to cite examples noted by Charles Schultze, an American opponent of state planning) would have been some of America’s fastest-growing industries in the 1970s. Officials ignore the potential for innovation in consumer products or services and get seduced by the hype of voguish high-tech sectors. The universal race to create green jobs is the latest example. Led by China and America, support for green tech is rapidly becoming one of the biggest industrial-policy efforts ever. Spain, blinded by visions of a solar future, subsidised the industry so lavishly that in 2008 the country accounted for two-fifths of the world’s new solar-power installations by wattage. This week it slashed its subsidies, but still has a bill of billions. How to keep the beast at bay Not all such money is wasted, of course. The internet and the microwave oven came out of government-led research; the stranger stuff that governments do can prove surprisingly successful. A few governments, such as America’s and Israel’s, have contributed usefully to the early development of venture-capital networks.
Some advocates of industrial policy argue that the government, like a pharmaceutical company or a seed-capital firm, should simply increase the number of its bets in order to raise its hit rate. But that is a cavalier way to behave with taxpayers’ money. And the public funds have an odd habit of flowing towards politically connected projects. Fortunately, there are now some powerful constraints on governments’ ability to meddle. In an age of austerity they can ill afford to lavish money on extravagant industrial projects. And the European Union’s competition rules place some limits on the ability to do special favours for particular firms. That points to the first of three ideas that should guide a more sensible approach to securing the jobs of the future. Straightforward steps to improve the environment for business—less red tape, more flexible labour markets, simpler tax and bankruptcy regimes—will be more effective than handouts to favoured firms or sectors. Europeans ought to be seeking to strengthen the rules of their single market rather than pushing to dilute them; a long-overdue single European patent process would be a good start. Competition will do far more for jobs than coddling. Second, governments should invest in the infrastructure that supports innovation, from modernised electricity grids (a smarter way to help green energy) to basic research and university education. The current fashion for raising barriers to the inflows of talented researchers and entrepreneurs hardly helps. Third, rather than the failed policy of picking winners, governments should encourage winners to emerge by themselves, for example through the sort of incentive prizes that are growing increasingly popular (see article). None of this excites politicians as much as donning hard hats and handing out cash in front of the cameras. But the rich world has a clear choice: learn from the mistakes of the past, or else watch Leviathan Inc grow into a true monster.

2nc—space

Only useful way to incorporate the private sector is to let it explore a wide range of opportunities
Lamassoure et al ’03 – Mission and Systems Architecture Section, Jet Propulsion Laboratory (Elisabeth S, Ramachandra Manvi and Robert W. Easter – Mission and Systems Architecture Section, Jet Propulsion Laboratory, Bradley R. Blair, Javier Diaz, and Michael B. Duke – Center for Commercial Applications of Combustion in Space (CCACS), Mark Oderman and Marc Vaucher – CSP Associates, Inc, “Evaluation of Private Sector Roles in Space Resource Development,” http://trs-new.jpl.nasa.gov/dspace/bitstream/2014/10473/1/02-2470.pdf)

A number of studies have shown the great potential space resource utilization holds for space exploration. For example, Duke (1998) analyzed possible lunar ice extraction techniques. A study by NIAC (Rice, 2000) showed how using this ice to produce H2/O2 propellants would reduce the Earth launch mass (ELM) for a reference lunar outpost mission by up to 68%. Based on similar outpost assumptions, Nelson (2001) calculated how much a private venture must charge to transfer cargo and astronauts to the Moon. Borowski (1997) studied the lunar transportation improvements that nuclear thermal propulsion could provide. Considering low Earth launch costs, Stancati (1999) showed that using lunar-based LOX and LH2, and nuclear thermal propulsion, ELM for space exploration could be improved by up to 51%, but cost improvements would be negligible. These are only a few examples of the wealth of interesting engineering studies that characterize what we might call the “potential for space resources supply”. A few studies also characterized the “potential for space resources demand”. Outstanding examples include the commercial space transportation study (CSTS, 1994), which systematically quantified potential markets for future launch services; but also propellant demand studies such as Smitherman (2001), who quantified the demand for H2/O2 propellants in low Earth orbit (LEO) for LEO-to-GEO (geostationary Earth orbit) transfer. Between these two bodies of research and analysis, there is a clear gap: among all the architectures proposed for space resources development, do any suggest (financially) viable private ventures? An integrated financial and engineering model based on a private investor perspective is the only way to bridge this gap, for three main reasons: First, an engineering-optimized architecture is not necessarily the most interesting to a private investor.
For example, economies of scale could lead the engineer to build upfront the capacity to meet optimistic demand growth, while the private investor might prefer a scalable architecture, building capacity only as demand increases. Second, the metrics that interest private sector investors are not always the same ones that public sector engineers use for economic analyses. A ‘business case analysis’ is required to translate the engineering cost estimates into the metrics of interest to private sector investors. Third, an informed and effective public policy and strategy for space exploration demands that architecture trades, and initiatives regarding the private sector, assess a wide range of scenarios. A single business case yields an outcome that depends on specific assumptions. For NASA to effectively incorporate the private sector into its long-term plans, it should explore a wide range of potential space ventures, the conditions under which they would flourish, the steps that NASA can take to encourage them, and the public benefits/costs of those steps. To make these numerous case studies fast, accurate and comparable, a common analytic framework is needed.

2nc—ssa

Must ensure SSA tech before use to reduce program risks
GAO 2011 (United States Government Accountability Office, May 2011, SPACE ACQUISITIONS: Development and Oversight Challenges in Delivering Improved Space Situational Awareness Capabilities, http://www.gao.gov/new.items/d11545.pdf)

DOD has significantly increased its investment and planned investment in SSA acquisition efforts in recent years to address growing SSA capability shortfalls. Most efforts designed to meet these shortfalls have struggled with cost, schedule, and performance challenges and are rooted in systemic problems that most space acquisition programs have encountered over the past decade. Consequently, in the past 5 fiscal years, DOD has not delivered significant new SSA capabilities as originally expected. To its credit, the Air Force recently launched a space-based sensor that is expected to appreciably enhance SSA. However, two critical acquisition efforts that are scheduled to begin development within the next 2 years—Space Fence and the Joint Space Operations Center Mission System (JMS)—face development challenges and risks, such as the use of immature technologies and planning to deliver all capabilities in a single, large increment, versus smaller and more manageable increments. It is essential that these acquisitions are placed on a solid footing at the start of development to help ensure their capabilities are delivered to the war fighter as and when promised. GAO has consistently recommended that reliable acquisition business cases be established, such as maturing technologies prior to development start, utilizing evolutionary development, and stabilizing requirements in order to reduce program risks. For efforts that move forward with less mature technologies, assessments of the cost, schedule, and performance implications of utilizing backup technologies, if they exist, could provide the knowledge needed to determine whether the efforts are worth pursuing or the investment trade-offs that may need to be made.

Small satellite tech is effective—most likely to be used
Turnbull 08 (Wallace R.
III, Major and aeronautical engineer, “Beyond Awareness: Moving Towards Comprehensive Space Situational Knowledge”)
While the prospects for future nanosat missions are certainly very good, technology advances have enabled even smaller 1 kg class picosatellites. In January, 2000, Stanford University deployed six picosat “daughter ships” from a demonstration satellite.48 Since then, there have been 20 picosat missions with masses of 1 kg or less. Many of these were based on the CubeSat picosat design standard created by Stanford University and California Polytechnic Institute to reduce the cost and complexity of building small satellites.49 These picosat missions, flown by a number of nations, included a wide range of payloads such as cameras, intersatellite communication, precision attitude control, high-bandwidth communication, and scientific instruments. There are at least 24 picosats in development, six of which are to launch in 2008.50 The increasing number of small satellite missions shows that technology is enabling smaller spacecraft at the same time that interest in these systems is growing. Table 2 shows the results of a literature search for the terms microsatellite, nanosatellite, picosatellite, and femtosatellite in the academic literature. Interest in small satellite technology clearly continues to expand with at least a dozen nations either conducting or planning nanosat or picosat missions.51

2nc—colonization

Tech exists but the private sector should develop it
Dollard ’10 (Pat, Former Marine, The War Starts Here, “Scientists Urge Immediate Colonization of Mars,” http://patdollard.com/2010/11/scientists-urge-immediate-colonization-of-mars/, JG)
Mars is a six-month flight away, possesses surface gravity, an atmosphere, abundant water, carbon dioxide and essential minerals. They propose the missions start by sending two two-person teams, in separate ships, to Mars. More colonists and regular supply ships would follow. The technology already exists, or is within easy reach, they wrote. An official for NASA said the space agency envisions manned missions to Mars in the next few decades, but that the planning decidedly involves round trips. President Obama informed NASA last April that he “‘believed by the mid-2030s that we could send humans to orbit Mars and safely return them to Earth. And that a landing would soon follow,’” said agency spokesman Michael Braukus. Nowhere did Obama suggest the astronauts be left behind. “We want our people back,” Braukus said. Retired Apollo 14 astronaut Ed Mitchell, who walked on the Moon, was also critical of the one-way idea. “This is premature,” Mitchell wrote in an e-mail. “We aren’t ready for this yet.” Davies and Schulze-Makuch say it’s important to realize they’re not proposing a “suicide mission.” “The astronauts would go to Mars with the intention of staying for the rest of their lives, as trailblazers of a permanent human Mars colony,” they wrote, while acknowledging the proposal is a tough sell for NASA, with its intense focus on safety. They think the private sector might be a better place to try their plan. “What we would need is an eccentric billionaire,” Schulze-Makuch said. “There are people who have the money to put this into reality.” Indeed, British tycoon Richard Branson, PayPal founder Elon Musk and Amazon.com Inc. CEO Jeff Bezos are among the rich who are involved in private space ventures.
Isolated humans in space have long been a staple of science fiction movies, from “Robinson Crusoe on Mars” to “2001: A Space Odyssey” to a flurry of recent movies such as “Solaris” and “Moon.” In many of the plots, the lonely astronauts fall victim to computers, madness or aliens. Psychological profiling and training of the astronauts, plus constant communication with Earth, will reduce debilitating mental strains, the two scientists said. “They would in fact feel more connected to home than the early Antarctic explorers,” according to the article. But the mental health of humans who spent time in space has been extensively studied. Depression can set in, people become irritated with each other, and sleep can be disrupted, the studies have found. The knowledge that there is no quick return to Earth would likely make that worse. Davies is a physicist whose research focuses on cosmology, quantum field theory, and astrobiology. He was an early proponent of the theory that life on Earth may have come from Mars in rocks ejected by asteroid and comet impacts. Schulze-Makuch works in the Earth Sciences department at WSU and is the author of two books about life on other planets. His focus is eco-hydrogeology, which includes the study of water on planets and moons of our solar system and how those could serve as a potential habitat for microbial life. The peer-reviewed Journal of Cosmology covers astronomy, astrobiology, Earth sciences and life. Schulze-Makuch and Davies contend that Mars has abundant resources to help the colonists become self-sufficient over time. The colony should be next to a large ice cave, to provide shelter from radiation, plus water and oxygen, they wrote. They believe the one-way trips could start in two decades. 
“You would send a little bit older folks, around 60 or something like that,” Schulze-Makuch said, bringing to mind the aging heroes who save the day in “Space Cowboys.” That’s because the mission would undoubtedly reduce a person’s lifespan, from a lack of medical care and exposure to radiation. That radiation would also damage human reproductive organs, so sending people of childbearing age is not a good idea, he said. There have been seniors in space, including John Glenn, who was 77 when he flew on the space shuttle in 1998. Still, Schulze-Makuch believes many people would be willing to make the sacrifice. The Mars base would offer humanity a “lifeboat” in the event Earth becomes uninhabitable, they said. “We are on a vulnerable planet,” Schulze-Makuch said. “Asteroid impact can threaten us, or a supernova explosion. If we want to survive as a species, we have to expand into the solar system and likely beyond.”

Declaring a mandate for NASA sets the overall purpose for the mission—NASA’s development process is adaptive and effective
Correll ’05 – Randall R. Correll, national security consultant with Science Applications International Company, and Nicholas Peter, 2005, “Odyssey: Principles for enduring space exploration,” Space Policy, Vol. 21, pp. 251-258
The most debilitating obstacle would be lack of compelling purpose. The human instinct to explore is, in itself, not sufficient to justify the public treasure that will be required. Neither is scientific gain, in itself, commensurate with the anticipated cost of publicly funded human space flight. NASA has not yet articulated how it will develop the objectives and purpose of lunar and Martian missions, laboratories, observatories and bases. Many of these decisions do not need to be made immediately and, following the metaprinciples of open-systems architectures, should not be forced prematurely. However, the process should begin among NASA, academia, industry, the public and the international community to debate the specific activities that will define the content of the program. Without visible progress in the development of compelling purpose, the exploration vision is not likely to endure, nor should it.

2nc—nuclear propulsion

Private sector solves propulsion systems
Slivinski ’01 – fiscal policy analyst at the Cato Institute (Stephen, “The Corporate Welfare Budget Bigger Than Ever,” Cato Institute Policy Analysis No. 415, October 10, http://www.cato.org/pubs/pas/pa415.pdf)

National Aeronautics and Space Administration: Aeronautical Technology and Commercialization Activities. This account funds R&D activities (often in direct partnership with private industry) that benefit the commercial airline industry. Current projects include developing new propulsion systems, robotics, and a solar-powered airplane. Such applied R&D benefits primarily specific private companies, such as Boeing, Lockheed Martin, and Airbus. The government has trouble picking winners in other industries and will likely have a difficult time doing so in the aerospace industry as well.

Private sector development’s the only way to solve the aff
Kapp ’09 – Activist, publishes Rational Review News Digest (Thomas L, “Anarchy and the Nuclear Option,” July 10, http://c4ss.org/content/756)

The costs of constructing a nuclear weapon are huge. Not only is research and development expensive, but the actual assembly of the weapons requires acquisition of huge amounts of raw material (uranium), processing of that material by large numbers of expensive machines (gas centrifuges), and the attention of skilled technicians who don’t work cheaply. In other words, only two types of organizations could reasonably be expected to create a nuclear weapon: A state, which can take the cost out of its subjects’ hides whether they like it or not; or a large corporation of the kind which generally only exists under the auspices of the state and which has no profit motive to build such a weapon unless it’s doing so for the state. Of course, the nuclear genie is unfortunately already out of the bottle. There are already a lot of weapons out there. It’s reasonable to be concerned that if the state disappeared tomorrow, those weapons might fall into the hands of individuals or groups who could never have built them, and who might be inclined to actually use them rather than merely use them as a “Mutually Assured Destruction” threat to keep cold wars cold. I have two counter-arguments to offer to that reasonable concern. The first is that the danger it alludes to already exists because the weapons already exist. Maintaining the state does not guarantee that these weapons will never be stolen by force, or illicitly sold by those appointed to guard them. Both possibilities became major concerns during the disintegration of the Soviet Union. For all we know, “private nukes” may already be in play. The second is that, to the extent that nuclear weapons may fall into non-state hands and be used, the state is the most likely target for their use. Even if the physical target is a civilian population, the justification for their use would be to put pressure on a state to act or react in a given way.
To this extent, permitting the continued existence of the state — any state, anywhere — represents an increased risk. Not only does the existence of nuclear weapons not constitute an argument against the stateless society, precisely the opposite is true: Only states or state-privileged organizations are likely to command the resources to build nukes, or to have any motive to do so. Only states or those attacking states have any incentive to use nukes as instruments of warfare. “Private nukes” are not, and never have been, a serious threat except to the extent that the existence of the state makes them one. However, I can envision a scenario in which “private nukes” might contribute to the peaceful establishment of stateless societies: If Earth’s states are serious about divesting themselves of their nuclear weapons, they should offer those weapons, gratis, to private organizations which demonstrate the ability to build “Project Orion” style spacecraft — spacecraft propelled by the detonation of nuclear weapons behind a “pusher plate.”

***IMPACT STUFF

2nc—cost overruns

Picking winners prioritizes political considerations over tech effectiveness
Niman ’95 – Associate Professor of Economics, Whittemore School of Business and Economics, University of New Hampshire (Neil B, “Picking winners and losers in the global technology race,” July, Contemporary Economic Policy, Vol. 13, Iss. 3, proquest)

Substituting the government for the market when the market fails does not necessarily eliminate failure. The government is subject to many of the same problems found in any organization (Milgrom and Roberts, 1992). For example, the existence of asymmetric information can impede the government's ability to make sound decisions. When applying for federal grants, managers, entrepreneurs, or researchers who are working actively in the field likely have more information about their ability to create, commercialize, or manage the development of new technologies. Thus, these managers may find themselves in a position where they can withhold important facts in order to increase their chances of being awarded a grant. As a result, the selection process may become biased toward those individuals or companies that are best at working their way through the government bureaucracy rather than recognizing companies that offer the greatest promise for surviving the rigors of the market. Thus, the government may find itself financing projects that have only the smallest chance of being successful. By financing these efforts (because they sound the most appealing), the government could experience a type of Gresham's Law where the good ideas or individuals are driven from the contest. Wasting resources on bad projects could seriously reduce resources for good ones, with the unintended consequence that the government has almost nothing to show for all of the dollars it has spent. In addition to problems generated by asymmetric information, the government also is subject to principal-agent problems. While public officials carry the burden of the public trust, some officials occasionally place their own self-interest above the interest of those they serve. Managers in the government bureaucracy may be more intent on pleasing members of Congress than on allocating resources in a way that reflects market values (Lindsay, 1976).
Given that members of Congress themselves often are motivated by political rather than market considerations, public funding of research projects may not adequately reflect the general public interest. Political considerations may lead not only to a misallocation of funds, where the focus is on pork-barrel rather than sound economic projects, but also may affect the overall level of spending. If the optimal level of investment for innovation cannot be achieved without government subsidy, what ensures that the government will set the subsidy at a level that promotes economic efficiency? Driven by political rather than economic forces, the government may choose a funding level that will not antagonize special interest groups or raise taxes rather than one that will promote the socially optimal level of research and development expenditures.

Kills initial tech development and spin-offs
Niman ’95 – Associate Professor of Economics, Whittemore School of Business and Economics, University of New Hampshire (Neil B, “Picking winners and losers in the global technology race,” July, Contemporary Economic Policy, Vol. 13, Iss. 3, proquest)

With a myriad of potential problems and sources for abuse, government involvement in the research and development process may cost more than is gained from subsidizing private sector initiatives. However, these costs of government failure are not the only costs to society resulting from government involvement in the process of creating new ideas. The government also may serve as an impediment to the diffusion of new technologies. At the most basic level, inefficient use of resources means fewer new technologies created. Because the diffusion of technology is a direct function of how many new technologies are created, less invention means less diffusion. As a result, society loses not only the benefits of the new technologies but also the corresponding "multiplier" effect as fewer technologies become available to spur even greater activity by other firms. In addition to supporting less inventive activity than otherwise might occur, the government also risks adding to allocative inefficiency by misdirecting development efforts toward technologies that have only limited value to the rest of the market. Such technologies will not be widely adopted for use in competitive or complementary products. As a result, society not only loses because it backed the wrong technology but also because it receives no collateral benefits from other firms' use of that technology. Finally, any program that supports one firm's efforts over another also may distort the diffusion of new technologies. Subsidizing one company's innovation may discourage other companies from developing similar or complementary technologies because of difficulty in competing with a company that does not have to shoulder the full cost of innovation. Thus, the very government actions designed to solve the market failure problem of underinvestment in new technologies may create an even greater social loss.

Developmental failure causes cost overruns
GAO 3/3/11 – United States Government Accountability Office (“NASA Assessments of Selected Large-Scale Projects,” http://www.gao.gov/new.items/d11239sp.pdf)

• Finally, by the time of the production decision, the product must be shown to be producible within cost, schedule, and quality targets and have demonstrated its reliability, and the design must demonstrate that it performs as needed through realistic system-level testing. Lack of testing increases the possibility that project managers will not have information that could help avoid costly system failures in late stages of development or during system operations.

2nc—budget tradeoffs

Cost overruns cause NASA to scrap other projects
GAO ’09 – United States Government Accountability Office (“NASA Assessments of Selected Large-Scale Projects,” March, http://www.gao.gov/new.items/d09306sp.pdf)

However, NASA has also had its share of challenges. For example, the X-33 and X-34 programs, which were meant to demonstrate technology for future reusable launch vehicles, were cancelled due to technical difficulties and cost overruns after NASA spent more than $1 billion on them. More recently, the Mars Science Laboratory, which was already over budget, announced a two-year launch delay. Current estimates suggest the price of this delay may be $400 million—which drives the current project lifecycle cost estimate to $2.3 billion, up from its initial confirmation estimate of $1.6 billion. GAO and others have also reported on overruns on many other NASA programs over the past decade. What is common among these and other programs is that whether they succeed or fail, they cost more to build and take longer to launch than planned. As a result, NASA is able to accomplish less than it plans with the money it is allocated, and it is forced to make unplanned trade-offs among its projects—shorting one to pay for the mistakes of another.

Cost overruns have empirically triggered budget tradeoffs
Borenstein ’09 – Associated Press (Seth, “Cost overruns plague U.S. space agency,” March 4, http://www.msnbc.msn.com/id/29513895/ns/technology_and_science-space/t/cost-overruns-plague-us-space-agency/)

Historically, overruns have caused NASA to run low on money, forcing it to shelve or delay other projects. Often, the agency just asks taxpayers for more money. In fact, NASA got $1 billion from the new stimulus package. It's to be spent on climate-watching satellites and exploration among other things. "Getting an extra infusion of money doesn't necessarily mean you have a capability to spend it well," said Cristina Chaplain, GAO's acquisitions chief who wrote the study. A second GAO report used NASA as one of its leading poster children for bad practices in estimating costs. The space agency, which has a budget of about $18 billion, needs "a more disciplined approach" to its projects, the GAO said. NASA spending has been on GAO's "high risk" list since 1990. Its cost overrun problems will be the subject of a House Science Committee hearing Thursday. "A cancer is overtaking our space agency: the routine acquiescence to immense cost increases in projects," NASA's former science chief Alan Stern wrote in an op-ed piece in the New York Times in 2008. He quit last year over the shifting of money to pay for cost overruns. NASA's spending problems are so predictable and big that two years ago Congress put it under the same tough budgeting rules as the Defense Department. That means NASA must notify Congress if a program's cost rises by more than 15 percent. The GAO report issued Monday was the first using NASA's new requirements. In a statement to The Associated Press, NASA said its missions "are one-of-a-kind and complex, which always makes estimating challenging... We do believe NASA is a good investment of federal funds and strive to provide the best value." The agency statement said external forces, such as launch availabilities, also cause delays and cost increases. The agency says it has improved its cost estimating. 
Last December then-NASA administrator Michael Griffin tried to compare cost overruns — like the $400 million extra needed for the Mars Science Laboratory — with do-it-yourself projects that keep requiring extra trips to the hardware store. When a reporter quipped that his do-it-yourself projects use his own money, Griffin drew laughter with his response: "And we are spending your own money for this." Imposing financial discipline, as GAO urges, "is an uphill fight," said Smithsonian Institution space scholar John Logsdon, who is on NASA's advisory council. In the latest report, NASA couldn't provide the GAO with current accurate estimates on two of its hugest projects so the watchdog agency merely cited ballpark guesses: The program to build new spaceships to send astronauts back to the moon would cost somewhere around $37 to $49 billion and already has financial and technical risks, the GAO found. The multibillion-dollar James Webb Space Telescope, whose current cost is unknown, was at least $1 billion over estimates three years ago, before NASA began its new cost accounting methods. NASA has cost overruns for several reasons, said the GAO's Chaplain. Those include poor cost estimating at the beginning, trying to do cutting-edge science, constantly changing designs, and poor contractor performance. Six of the projects had problems with contractors, including lack of experience, that led to delays or higher costs. In his December news conference, Griffin said there isn't a very good way to estimate at the front end of a mission what it's going to take to achieve scientific priorities. Griffin, whose replacement hasn't been named yet by President Barack Obama, said scientists tend to downplay costs early to convince NASA that their project is cheaper than someone else's. Later, once NASA commits and the money is being spent, more bucks are needed. So NASA spends more instead of canceling the project. 
That's a problem everyone knows about and accepts, but shouldn't, Chaplain said. The Mars Science Laboratory, which has ballooned to a $2.3 billion price tag, is a good example of NASA's approach. In 2003, its cost was put at $650 million on the National Academy of Sciences wish list, which NASA used to set priorities. But on Tuesday, Doug McCuistion, who heads NASA's Mars exploration program, said the proper estimate to start with was $1.4 billion, not $650 million because it was not an official NASA projection. By last December, the number was up to $1.9 billion. Then technical problems delayed launch plans from this year to 2011, adding another $400 million. The extra money came from cuts to other science projects. "The costs of badly run NASA projects are paid for with cutbacks or delays in NASA projects that didn't go over budget," Stern wrote in his newspaper piece. "Hence the guilty are rewarded and the innocent are punished."

***AFF ANSWERS

Government can pick winners
Milford 4/26/11 – President and founder of the Clean Energy Group (CEG) and Executive Director of the Clean Energy States Alliance (CESA) (Lewis, “Picking Winners or Losers,” Clean Energy Group, http://www.cleanegroup.org/blog/picking-winners-or-losers/)
Some arguments never die. Recently, some members of Congress criticized Energy Secretary Chu for “picking winners” through his research and development programs like ARPA-E. This is an old canard that often comes from people who really think that the private sector alone, without government help, creates products and services. The evidence is so overwhelming to the contrary that the debate seems almost one-sided by now. Everything from computer chips to cars is a result of long-term government research and development—as well illustrated in a recent Breakthrough Institute report. The argument against picking winners is especially wrong for emerging technologies that require deep and persistent public support. In the late 1990s, two Harvard professors in a book titled “Investing in Innovation: Creating Research and Innovation Policy that Works” demolished the myth that government should not be in the business of “picking winners.” And they came up with some surprising conclusions about the role of government in technology innovation. Branscomb and Keller describe how this bias against a government technology role can lead to two incorrect conclusions: …First, that markets do that most effectively; and second, that pork barrel politics is more likely to support the losers anyway. This neat two-step eliminates from the role of technology policy everything for which government is institutionally well-suited, from infrastructure building and investment incentives to support of skills training. It then notes that what is left is, of course, institutionally more appropriate for the market.
The argument is legitimated simultaneously by our ancient faith in markets and our recent cynicism about politics. They admitted that the “picking winners and losers” argument might apply to some government efforts but not to the development of new technologies. Here’s why: Private markets often under-invest in new technologies; “empirical evidence suggests that as a result of spillovers of all kinds, the social returns to R&D spending on new technologies far exceed the private returns, perhaps by as much as 50 to 100 percent.” Private rates of return may not equal social rates of return—companies often cannot appropriate all the social benefits of an innovation and so fail to invest in what could be socially optimal technology. Because innovation is highly contingent—the actions of developers, governments and users are highly uncertain, making good information hard to come by, leading to great risks for investment—there is an inevitable misallocation of resources. “Some bets will pay off; some not at all. Winners and losers can only be positively identified in the revealing gaze of hindsight.” And finally, “…there is absolutely no evidence, beyond the economist’s leap of faith, that private investment is any more capable than public investment of separating the winners from the losers before the fact. The major difference is that private losers exit the market, while publicly backed losers are held to the higher standard of wasting taxpayers’ money.” Further, they confront another myth about government technology policy—that the federal government has in the past focused, and in the future should focus, only on R&D rather than commercial diffusion and use. Instead, they point out, in those areas where success has occurred, government has in fact played a much more expansive role than simply research and development. The most unlikely proof is in the defense area. Referring to the post-World War Two period in the U.S.
regarding defense industry support as the most obvious time when many government policy tools were used, they note: Public spending supported the enormous development costs of relevant new technologies…In these cases, government underwrote the basic science research at universities and labs; direct R&D contracts accelerated the development of the technology; and defense procurement at premium prices constituted a highly effective initial launch market…A variety of mechanisms, ranging from patent pooling and hardware leasing (such as machine tool pools) to loan guarantees for building production facilities, helped to lower entry costs, diffused technology widely among competitors and set the stage for commercial market penetration. Aspects of this support model were adapted for government investment in other sectors, notably for public health, and produced similarly beneficial results… In the defense area, the U.S. government did not limit its role to only R&D, the typical critic's myth, but "to the successful launch and diffusion of a technology development path—a trajectory—whose characteristics corresponded to the requirements of the commercial marketplace." So to those who say don't pick winners: say it has always been so, and the country is better off for it. The alternative is to let losers win, and who wants that?

Picking winners key to innovation—empirically proven
Atkinson '10 – President of the Information Technology and Innovation Foundation (Robert D, "For Once and for All, Let's Agree the Government Can and Should Pick Winners," April 22, Huffington Post, http://www.huffingtonpost.com/robert-d-atkinson-phd/for-once-and-for-all-lets_b_548145.html)

It's not just conservatives who worry about government being too active; many moderates and liberals who abide by the so-called Washington Consensus hold as an article of faith that while it's okay for the government to do things like fund basic research and improve education, by all means it should not "pick winners." On this matter (as on many), the Washington Consensus is wrong. Let's be clear about what "picking winners" means. It means government identifying industries and technologies where the country needs to be competitive globally (i.e. health IT, nanotechnology, green energy, biotech, robotics, broadband) and then developing and implementing policies to work with the private sector to ensure that we grow and retain high-end jobs at home in these key sectors. Picking winners is not simply another name for an "industrial policy" in which the government selects specific firms or extremely narrow technologies, nationalizes industries, or impedes beneficial market forces. There's a clear reason why we need to put the rhetoric about socialism aside and start picking winners: we are starting to slip in terms of innovation and competitiveness. In a 2009 report examining innovation-based competitiveness among 40 nations, ITIF found that the United States has slipped from first to sixth place in the last decade, behind Singapore, Sweden, South Korea and others. In fact, the U.S. ranked dead last in progress in innovation and competitiveness over the last decade. Other countries are making more progress in developing the capacity to innovate and lead in key sectors. Unless we are willing to live with high unemployment, chronic trade deficits and relatively lower standards of living, we need to act. Creating the right market conditions for our companies and workers (i.e.
sound tax, trade, and fiscal policies) and investing in basic research are necessary but not sufficient conditions to keep pace with the nations around the world competing vigorously for innovation and related jobs. But we are kidding ourselves if we think that will be enough. Instead of the hodgepodge of policies from an array of complex tax laws to wasteful farm subsidies to a dizzying jumble of state incentives, we need a coordinated and comprehensive approach to making sure we not only come up with the next "big thing" but also that we do not have it snatched away from us. (Remember the VCR?) And that means picking key technologies and industries to focus on. But the free market opponents will say: how can Washington outsmart the market? Is this the same market that through its infinite wisdom invested hundreds of billions in subprime mortgages? In fact, the government has a pretty good track record of picking winners. Just look at the technologies that the government had a key role in developing: the Internet, the web browser, the search engine, computer graphics, semiconductors, and a host of others. There are many other examples of success stories made possible not because government anointed a particular young entrepreneur but because the government made a conscious choice to open new pathways into which young innovators could embark. In the 1980s, we responded to Japan's economic ascendance by picking winners with the research and development tax credit, creating programs like the Advanced Technology Program and the Manufacturing Extension Partnership, and aggressively taking on unfair trade policies. We need to do the same today. It's time to break free of neo-classical economic orthodoxy that preaches that markets acting on their own optimize economic well-being and that low taxes, minimal regulation, and free trade alone can guarantee long-term U.S. leadership on the growth engines of the future.
These ingredients work best when the government develops a strategy for correcting systemic "market failures" that limit innovation. We need to come to recognize that our country, and not just our companies, is competing, and begin to think and act more like a country.

Picking winners is innovation policy, not industrial policy—strategic bets on emerging technologies are key
Ezell '10 – Senior Analyst with the Information Technology and Innovation Foundation (ITIF), with a focus on innovation policy, international information technology competitiveness, trade, and manufacturing and services issues (Stephen, "The Economist's Strange Attack on Industrial Policy," August 25, Progressive Fix, http://progressivefix.com/the-economist%E2%80%99s-strange-attack-on-industrial-policy)

It would be more constructive to envision a continuum of government-market engagement, increasing from left to right in four steps from a "laissez faire, leave it to the market" approach to "supporting factor conditions for innovation (such as education)" (which The Economist endorses, as, certainly, does ITIF) to going further by "supporting key technologies/industries" to, at the most extreme, "picking specific national champion companies", that is, "picking winners." And while it is generally inadvisable for governments to intervene in markets to support specific national champion companies, ITIF believes there is an appropriate role for government in placing strategic bets to support potentially breakthrough nascent technologies and industries. Ironically, The Economist asserts that, "Industrial policy may be designed to support or restructure old struggling sectors, such as steel or textiles, or to try to construct new industries, such as robotics or nanotechnology. Neither track has met with much success. Governments rarely evaluate the costs and benefits properly." Yet, seconds later, the authors admit, "America can claim the most important industrial-policy successes, in the early development of the internet and Silicon Valley." In one sentence, the article glosses over the point that the government, in this case the Defense Advanced Research Projects Agency (DARPA), "supported creation of ARPANET, the predecessor of the Internet, despite a lack of interest from the private sector." (Italics mine.) But this point, as economists are wont to say, is "non-trivial." In fact, it is precisely the point. Early on, companies were reticent to invest in the nascent field of computer networking because the sums required were enormous and the technology was so far from potential commercialization that companies were unable to foresee how to monetize potential investments.
Moreover, such basic research often results in knowledge spillovers, meaning the company cannot capture all the benefits of its R&D investment (in economist's terms, the social rate of return from R&D is higher than the private rate of return), and thus companies tend to underinvest in R&D relative to societally optimal levels. Of course, this dynamic pertained not just to the Internet, but applies today to a range of emerging infrastructure technologies such as biotechnology, nanotechnology, robotics, etc. As Greg Tassey, Senior Economist at the National Institute of Standards and Technology (NIST), explains it, "the complex multidisciplinary basis for new technologies demands the availability of technology 'platforms' before efficient applied R&D leading to commercial innovation can occur." In other words, the levels of investment required to research and develop emerging technologies are so great that the private sector cannot support it alone, and thus, "government must increasingly assume the role of partner with industry in managing technology research projects." Such was the case with the initial development of the Internet, as government stepped in and provided initial R&D funding, helped coordinate research between the military, universities, and industry, and thus seeded development of a breakthrough digital infrastructure platform, making the Internet a reality decades before the free market ever would have (if ever) if left to its own devices. And this admittedly-successful industrial policy has indeed been a spectacular success. As ITIF documented in a recent report, The Internet Economy 25 Years After .Com, the commercial Internet now adds $1.5 trillion to the global economy each year—that's the equivalent of adding South Korea's entire economy annually. Moreover, the list of technologies in which government funding or performance of research and development (R&D) has played a fundamental role in bringing the technology to realization is long and compelling.
It includes: the cotton gin, the manufacturing assembly line, the microwave, the calculator, the transistor and semiconductor, the relational database, the laser beam, the graphical user interface, and the global positioning system (GPS), amongst many others. The National Institute of Health (NIH) practically created the biotechnology industry in this country. And yes, even Google, the Web search darling, isn't a pure-bred creature of the free market; the search algorithm it uses was developed as part of the National Science Foundation (NSF)-funded Digital Library Initiative. (But Google hasn't done much to spur economic growth!) The point is that companies like IBM, Google, Oracle, Akamai, Hewlett-Packard, and many others may not have even come into existence—and certainly would not have prospered to the extent they have—if the U.S. government was not either an early funder of R&D for the technologies they were developing or a leading procurer of the products they were producing. And if you don't get Intel developing the semiconductors, or Cisco building out the Internet, or Akamai securing it, or Google making it accessible, then you don't get the downstream companies like the Amazons or eBays, the latter of which 724,000 Americans rely on as their primary or secondary source of income. Thus, while governments shouldn't be creating and running such companies itself—that is for the free market to do—the government has a role to play in thoughtfully, strategically, and intentionally placing strategic bets on nascent and emerging technologies—as the United States did with information and communications technologies in the 1960s and 1970s—that have the potential to turn into the industries, companies, and jobs that drive an economy two to three decades hence. We call this innovation policy, as opposed to industrial policy.
Today, this augurs the need for smart policies and investments in industries such as robotics, nanotechnology, clean energy, biotechnology, synthetic biology, high-performance computing, and digital platforms such as the smart grid, intelligent transportation systems, broadband, and Health IT. Explicit in this approach is a recognition that some technologies and industries are in fact more important than others in driving economic growth—that "$100 of potato chips does not equal $100 of computer chips." Indeed, they are not, because some industries, such as semiconductor microprocessors (computer chips), experience very rapid growth and reductions in cost, spark the development of subsequent industries, and increase the productivity of other sectors of the economy—not to mention support higher wage jobs. Yet The Economist frets that governments aren't very good at identifying and investing in strategic emerging technologies. In impugning governments' ability to pick winning technologies, the article cites failures such as France's Minitel (a case of a country picking a national champion company) and argues that "Even supposed masters of industrial policy [like Japan's MITI, or Ministry of International Trade and Industry] have made embarrassing mistakes." But this would be tantamount to pointing to the spectacular failure of Apple's Newton and arguing that Apple's no good at innovation. The Economist seems to suggest that if governments failed 80-90% of the time in picking technology winners (and ITIF actually thinks their success rates are much higher), then they must be pretty incompetent at the effort and should stop trying altogether. But if private corporations followed that advice, then we would have no innovation whatsoever. Indeed, research by Larry Keeley of Doblin, Inc. finds that, in the corporate world, only 4 percent of innovation initiatives meet their internally defined success criteria. More than ninety percent of products fail in the first two years.
Other research has found that only 8 percent of innovation projects exceed their expected return on investment, and only 12 percent their cost of capital. Yet companies have to continue to try to innovate, even in the face of these long odds, because research finds that firms that don’t replace at least 10 percent of their revenue stream annually are likely to be out of business within five years. The point is that just because innovation is difficult and success rates are low, this does not mean that corporations, or governments, should quit trying—or that their successes, like the Internet, can’t be spectacularly successful and have a profound impact on driving economic growth. But The Economist laments that industrial or innovation policies are subject to capture by industries. What this neglects is that all countries, including the United States, already have de facto industrial policies that favor some industries over others. In the United States, for example, our regulatory and tax system favors agribusiness through farm subsidies, the oil industry through oil subsidies, airlines and highways at the expense of rail, and mortgage and financial industries. In fact, it is precisely because the United States has historically lacked an ability, or willingness, to have a clearly defined innovation strategy and an open dialogue about “making strategic decisions about strategic industries” that we’ve ended up with a de facto industrial policy ill-suited to supporting industries that will drive economic growth in the future. The Economist notes that “there is no accepted framework for “vertical” policy, favoring specific sectors or companies.” True. So let’s make one. 
Finally, while The Economist criticizes President Obama's new Strategy for American Innovation (released in 2009), it fails to come up with compelling evidence that breakthroughs such as mapping the human genome, unlocking nanotechnology's potential, or achieving the technology-enabled transformations that need to occur in sectors from energy to transportation will occur solely because of the market's ability to allocate capital efficiently. In this, it discounts the need for effective, intentional public-private partnerships to invest in and collaborate in the development and diffusion of these industries and technologies. This critique is not meant to pick on The Economist, which is usually chock full of solid reporting and informed commentary. Rather it is to take on the myth of America's purely free market capitalist system and make the case for an informed innovation policy. It is also to note that countries (like the United States) find themselves desperately turning to industrial policy in a last-ditch effort to save stumbling sectors such as automobiles because they have failed to make adequate investments in innovation policies that would support science and technology, R&D, and the development and diffusion of innovative processes and technologies that could have helped keep old sectors like automobiles at the technology frontier while supporting the development of new sectors to drive the economy forward. Finally, it seeks to rebut the ideological and highly politicized assault on the idea that governments cannot make prudent, targeted bets on the industries of tomorrow. As Greg Tassey has noted, competition among governments has become a critical factor in determining global market share among nations. Indeed, the role of government is now a critical factor in determining which economies win and which lose in the increasingly intense process of creative destruction. There are appropriate and inappropriate roles for governments to play in this competition.
Supporting education, removing barriers to competition, supporting free and fair global trade, opening countries to high-skill immigration, and targeting strategic R&D investments towards the technologies and industries of the future are appropriate roles for governments to play in this competition. Other government policies, such as mercantilist ones which deny foreign countries' corporations access to domestic markets, pilfer intellectual property by stealing it outright or making it a condition of market access, create indigenous or proprietary IT standards, fail to adhere to trade agreements, or directly subsidize domestic companies or their exports, are illegitimate forms of global economic competition. The United States—and The Economist—must abandon its fanciful, stylized neoclassical notion of a purely free global economic marketplace unfettered by any form of government intervention whatsoever, and recognize that governments play a legitimate and crucial role in shaping the innovation capabilities of national economies. As between corporations, it's a competition; and, as with companies, the ones that develop the best strategies and skills at fostering, developing, and delivering innovation are the ones most likely to win.

Government picks winners as well as the free market
Phillips 7/28/10 – Senior Fellow of the IC2 Institute of the University of Texas at Austin (Fred, "Picking Winners: Is Government Technology Strategy Good Or Bad? (Or, Say "Thai Baht" Three Times Really Fast)," http://www.science20.com/machines_organizations_and_us_sociotechnical_systems/picking_winners_government_technology_strategy_good_or_bad_or_say_%E2%80%9Cthai_ba_)

But can the private sector really do it better? There has been, I believe, no rigorous study of this question. The US federal government has difficulty dismantling a bureaucracy, once built. So I would concede, on the opinion level, that government may pick beneficial technological directions less efficiently than the private sector, because when the government is wrong, it’s expensive to recover. But I see no reason to believe corporations can pick winners more consistently. That is, out of ten chances to pick winners, I’d bet governments and corporations would be right about the same number of times.
