
A Report by a Study Team of the

NATIONAL ACADEMY OF PUBLIC ADMINISTRATION

For the National Aeronautics and Space Administration

October 2005

PRINCIPAL INVESTIGATOR-LED MISSIONS IN SPACE SCIENCE

Advisory Panel

John G. Stewart, Chair
Herbert N. Jasper

Study Team

William Lilly, Project Director
Carole M. P. Neves, Senior Advisor
Malcolm L. Peterson, Senior Advisor

Officers of the Academy

Valerie A. Lemmie, Chair of the Board
G. Edward DeSeve, Vice Chair
C. Morgan Kinghorn, President
Jonathan D. Breul, Secretary
Howard M. Messner, Treasurer

Project Staff

J. William Gadsby, Vice President, Academy Studies
William Lilly, Project Director
Carole M. P. Neves, Senior Advisor
Malcolm L. Peterson, Senior Advisor

The views expressed in this report are those of the Study Team. They do not necessarily reflect the views of the Academy as an institution.

National Academy of Public Administration
1100 New York Avenue, N.W.
Suite 1090 East
Washington, DC 20005
www.napawash.org

First published October 2005

Printed in the United States of America

ISBN 1-57744-119-2

Academy Project Number: 2035-100

FOREWORD

Since its creation in 1958, the National Aeronautics and Space Administration (NASA) has undertaken many space flight missions to carry out the mandate in its enabling legislation to increase “human knowledge of phenomena in the atmosphere and space.” These space missions have been planned, designed, developed, and operated using a variety of management approaches. More than a decade ago, NASA decided to undertake an experiment that would place the principal scientific investigator in a leadership position for the life cycle of a space science mission, beginning with the initial proposal and concluding with the final scientific data analysis and publication of results.

The space science community was quick to embrace this new approach and responded to NASA’s solicitation of proposals with a large number of mission concepts. The solicitation included fixed schedule durations and limitations on the amount of government funding. Proposals selected for development would have to meet three tests. First and foremost, were they investigating scientifically compelling phenomena? Second, were the proposed technical and management approaches consistent with good engineering practices? Third, were they capable of being executed with a reasonable level of confidence within the cost and schedule caps established in NASA’s Announcement of Opportunity? In the ensuing decade, more than 400 proposals were subjected to those evaluation criteria. Out of this number, twenty missions were approved for development. As of this writing, twelve have been launched, with eleven having returned scientific data.

NASA asked the National Research Council and the National Academy of Public Administration (Academy) to conduct an independent review of the principal investigator-led missions. The Academy’s study analyzes the causes for increased mission costs and makes recommendations to reduce the likelihood of cost growth in future missions. The study team’s fundamental conclusion is that changes in the competitive solicitation process can be made that will provide NASA with a better base of information on the probable costs and schedule. This, in turn, will allow NASA decision officials to assess the risks and provide an appropriate level of contingency in the funding requested for missions approved for development. The National Research Council’s study focused on the selection process and objectives for principal investigator-led missions; the roles and relationships among principal investigator team members; lessons learned from previous missions; and opportunities for knowledge transfer to new principal investigators.

The Academy was pleased to undertake this study. I want to thank the Academy Fellows and the study team for their hard work and diligence in producing this important report.

C. Morgan Kinghorn
President

TABLE OF CONTENTS

FOREWORD...... iii

ACRONYMS...... vii

INTRODUCTION...... 1

SECTION I: THE EVOLUTION OF THE PI-LED MISSIONS...... 3

Embarking on an Experiment ...... 3
The Initial PI-Led Missions ...... 5
The Rationale for PI Leadership ...... 7
The Experience with PI Leadership ...... 8
NASA’s Phased Project Planning and Execution Process ...... 12
Impact of the NASA Integrated Action Team Report ...... 15

SECTION II: FINDINGS, ANALYSIS, AND RECOMMENDATIONS...... 17

The Proposal Process ...... 18
The Mission Development Process ...... 25

TABLES AND FIGURE

Table 1. Number of PI-Led Mission Proposals Received; Selected for Concept Feasibility Studies; Selected for Development; and Launched, 1994-2003 ...... 11
Figure 1. Space Science Flight Program Management Process Flow ...... 13
Table 2. Cost Performance Measures for Discovery Projects, Expressed as a Percentage ...... 14
Table 3. Cost Estimates for a Medium Class Explorer Mission ...... 28

APPENDICES

APPENDIX A: Individuals Interviewed or Contacted ...... 31
APPENDIX B: References ...... 35
APPENDIX C: Advisory Panel and Staff ...... 39

ACRONYMS

Academy  National Academy of Public Administration
AO  Announcement of Opportunity
APL  Applied Physics Laboratory (a unit of the Johns Hopkins University)
ARC  Ames Research Center (NASA Center located at Moffett Field, CA)
CR  Confirmation Review
FY  Fiscal Year
JPL  Jet Propulsion Laboratory (operated for NASA by the California Institute of Technology)
LMA  Lockheed-Martin Astronautics (Denver, CO)
LMMS  Lockheed-Martin Missiles and Space (Sunnyvale, CA)
NASA  National Aeronautics and Space Administration
NIAT  NASA Integrated Action Team
NRC  National Research Council
PI  Principal Investigator
PM  Project Manager
SSB  Space Studies Board (a component of the National Research Council)


INTRODUCTION

The National Aeronautics and Space Administration (NASA) has, over the last decade, used principal scientific investigators in project leadership roles for an increasing number of Earth and Space Science flight missions. The principal investigators (PIs) lead missions throughout their lifecycles—from inception through definition, design, development, test, launch, mission operations, and data analysis. The conventional approach for NASA is to use its internal project managers (PMs) to lead the design and development of space flight projects. Under this approach, the principal scientific investigators have leadership responsibility for the science investigation and, commonly, for the design and development of a specific instrument. For their part, the NASA PMs are responsible for optimizing the mission’s design and development. This optimization takes place within technical constraints that include: the launch vehicle that places payloads into orbit, the electrical power provided by the solar arrays, the telemetry downlink rate, the on-board computational capability, etc. There are also programmatic constraints, such as annual budget limitations, total cost for the mission, and schedule. The optimum arrangement of these highly interactive constraints is an extraordinary challenge. Compromises are often struck, and in the process, engineering, cost, and schedule feasibility often trumps optimum arrangements for the science investigations.

NASA’s space science program managers commissioned this study to identify corrective actions to address the increase in the number and magnitude of cost overruns in PI-led space science flight missions. The associate administrator for space science asked the National Research Council’s (NRC) Space Studies Board (SSB) to take a broad look at PI-led projects, examining the overall successes and difficulties of PI-led space science missions, and to recommend ways to improve their overall performance. The National Academy of Public Administration (Academy) study team worked closely with the SSB’s Committee on PI-Led Missions in the Space Sciences. The interaction proved highly valuable, because the Committee’s membership consisted of experienced space scientists and program managers.

In the course of this study, the NRC committee and the Academy study team interviewed PIs, industry and government program and project managers, and other members of project management teams. The Academy study team collected and reviewed an extensive amount of budgetary, cost estimating, and schedule data to document changes from the initial cost estimates to the estimates at the time NASA approved the projects for development through the final estimates and, where available, final costs. The team also reviewed other published reports on NASA missions in Earth and Space Science, analyzed the data, and produced this report. Two Academy Fellows, John Stewart and Herb Jasper, served on an Advisory Panel to the study team. They reviewed the report and provided valuable comments and suggestions.

The reader is encouraged to review the NRC report to obtain a comprehensive perspective on project management practices and issues, as well as a discussion of the science value of the missions.

SECTION I
THE EVOLUTION OF THE PI-LED MISSIONS

EMBARKING ON AN EXPERIMENT

In 1992, NASA’s leadership decided to experiment with a different approach to selecting and managing certain space science flight projects. The incoming NASA administrator was receptive to the argument of many in the space and earth science communities that small teams of dedicated scientists and engineers could build high quality spacecraft of low-to-moderate complexity in relatively short periods at greatly reduced costs. Although this concept had been used at NASA before,1 the decision to put a scientist—rather than a NASA PM—in control of a mission was a significant change. “Missions would be under the sole and complete control of a principal investigator … to ensure that each mission would be planned and executed in a manner that would optimize the mission and its science output.”2

Another important element of the experiment was that NASA would not dictate the scientific purpose or design approach of the space mission, but would instead ask scientists to propose focused science investigations of their own devising. This change allowed the use of an “Announcement of Opportunity” for an upcoming procurement. A follow-on solicitation would request proposals for a complete space science mission, with the stipulation that selection would be based on the potential value of the scientific discoveries the mission would accomplish, technical and management feasibility, and compliance with strict, comprehensive cost and schedule limitations. In addition, NASA would not provide the prospective PI with the funds to generate the proposal. Therefore, the PI would have to solicit teammates, including a project management institution, spacecraft and instrument providers, and scientific collaborators. The team members would cover the “bid and proposal” costs of generating a sufficient level of information on a proposed mission to convince NASA of its scientific merit, compliance with total cost and schedule caps, and technical merit and feasibility. In another departure from conventional practices, NASA did not require proposers to select a NASA center as the project management institution. The proposer could partner with a university-affiliated institution or other non-NASA institution with competence in project management.

The willingness to undertake the experiment was certainly influenced by the advice provided to the incoming NASA administrator by key officials in the executive branch, the Congress, and the science community. After the highly visible failures/shortcomings of several flagship NASA missions, the Administration and Congress pushed for more affordable missions and did not see NASA’s project management expertise as a critical ingredient for mission success. Space scientists interested in solar system exploration were frustrated by the limited number of space flight opportunities within a space science budget dominated by the costs of supporting the major missions. One solution they suggested was to use the concept employed with the Explorer

1 The earliest space science missions, called Explorer, were small, focused science investigations for space physics and astronomy. In fact, the first three Explorer missions were launched before NASA was legislatively created in 1958.
2 Space Studies Board, National Research Council, Assessment of Recent Changes in the Explorer Program, December 1996.

program of frequent, relatively low-cost missions.3 The scientists found a receptive audience in NASA’s space science leadership, the Office of Management and Budget, and Congressional committees.

The new NASA administrator was also keenly aware that the agency had been slow to embrace the concept of using a number of smaller, technically advanced spacecraft to carry out missions in the space sciences. The lengthy cycle time of space science missions from initial design to publication of the science results meant that NASA had to sustain political interest in supporting its budget requests for a mission over many years. The answer to these problems was to create spacecraft that were less expensive, had shorter cycle times, and had science returns commensurate with the investment made.

In solar system exploration, the acknowledged capabilities of the Applied Physics Laboratory (APL) of Johns Hopkins University in managing lower cost projects for the Department of Defense made NASA willing to accept having an institution other than its Jet Propulsion Laboratory (JPL) manage a project. The interest of APL management in supporting NASA solar system exploration missions was obvious in the late 1980s, when APL developed a low-cost near-earth asteroid rendezvous mission concept.4 At the time, the JPL project management team had the reputation of being conservative, reaching decisions only after careful evaluation of alternatives, and of being slow to adopt new technologies; its process contributed to longer design and development cycles and higher costs. NASA’s management was interested in challenging JPL to adopt a more streamlined approach. In response, JPL formed a small team to design a mission to Mars that would enable a network of low-cost environmental surveying stations to be placed across the planet.

With Congressional support, in 1993 NASA authorized JPL and APL to proceed with the initial two missions of the new Discovery program of low-cost solar system exploration missions—the Mars Pathfinder5 and the Near Earth Asteroid Rendezvous (NEAR). These missions were not led by PIs; instead, each management team consisted of a PM and a PI, supported by a small project management team with capable mission scientists and science teams. Both small project management teams made extensive use of internal laboratory engineering, manufacturing, and integration capabilities. The spacecraft were launched on schedule, using relatively inexpensive

3 A more extensive discussion of the gestation of this approach can be found in Howard E. McCurdy’s recent publication, Low-Cost Innovation in Spaceflight: The Near Earth Asteroid Rendezvous (NEAR) Shoemaker Mission, NASA SP-2005-4536.
4 A Science Working Group met in late 1989 and early 1990 to define the concept for a new solar system exploration program and proposed that it be called Discovery. NASA’s Dr. Wesley Huntress, then the Headquarters Solar System Exploration Division director, asked the group to focus on a specific mission and to evaluate feasibility studies from both JPL and APL. It completed the study in May 1991. The Near Earth Asteroid Rendezvous mission was used to define the approach, and the studies suggested the mission could be done at a cost of approximately $110 million (APL) to more than $150 million (JPL). Dr. Huntress included the Discovery Program as an element in the solar system exploration strategic plan issued in 1991. He also briefed the Congressional appropriations committees, which in the 1992 appropriations bill had directed NASA to prepare a plan for “small planetary or other space science projects, emphasizing those which could be accomplished by the academic or research communities.”
5 Although originally planned as the pathfinder mission for the network of low-cost Mars landers envisioned in the Mars Environmental Survey (MESUR) Network program planning, the MESUR Pathfinder was renamed the Mars Pathfinder when the administration decided not to approve the MESUR program.

launchers. The total mission costs were less than estimated, with NEAR at $234 million, $28 million under the approved budget, and Mars Pathfinder at $262 million, $11 million under. Both missions were scientifically productive.

THE INITIAL PI-LED MISSIONS

In 1994, NASA issued the first Discovery Announcement of Opportunity (AO) to solicit proposals from PI-led teams for future missions. Because the initial two missions in this experiment were still under development, administration support of funding requests for future missions was not assured. NASA issued the solicitation in anticipation of a successful outcome. Twenty-seven teams submitted proposals. In November 1995, NASA selected two proposed missions, Lunar Prospector and Stardust. After Congress provided funding for the very low-cost ($63 million) Lunar Prospector in the appropriations bill for fiscal year (FY) 1996, Congress and the administration supported the inclusion of development funds for Stardust in the FY 1997 appropriation.

A small integrated team of officials from NASA’s Ames Research Center (ARC) and Lockheed-Martin Missiles and Space (LMMS), under the leadership of a PI (Dr. Alan Binder), was responsible for Lunar Prospector’s management. The PI had refined the Lunar Prospector mission design over a number of years (late 1988 through 1994) before ARC and LMMS became involved. The proposed cost estimate for the mission was set at $63 million during that refinement phase. When approved for development in late 1995, the estimate included $31 million for the spacecraft, $25 million for the launch vehicle, and $7 million for mission operations and data analysis.6 During the development phase, the project management team had to cope with schedule slips and cost growth due to problems with the new Lockheed Launch Vehicle (Athena 2). The underruns in the spacecraft’s development enabled the team to apply $4 million to offset the launch vehicle’s cost growth. Although launch slips usually result in a significant increase in spacecraft costs, the PI and the LMMS project manager were able to move staff off the project in a rapid and efficient fashion. The final launch in January 1998 was seven months later than the June 1997 target launch date, and the final costs of $69 million reflected a NASA decision to add $6.5 million to extend the mission operations period and fund a data archiving effort. Four months of the launch delay were attributable to a NASA management decision to hold off until after the launch of the spacecraft on the Athena 1.

The PI was deeply engaged in all aspects of mission design, development, and flight operations; his style of leadership was to be involved in all details. The PI listed the following elements as critical to the success of the Lunar Prospector mission:7

• “A small, dedicated, competent team managed by the PI, with no more than 30-35 people at any time

• Limited participation in management—the PI makes the decisions

6 Remarkably, the initial estimate for the spacecraft included an allowance for reserves of only $274,000.
7 Presentation by Dr. S. Alan Binder to the NRC Committee on PI-led Missions, November 15, 2004.

• Minimum documentation

• Limited, unalterable objectives/payload

• Small, simple spacecraft

• No development—use of heritage hardware

• Short schedule—early freeze of designs

• Design, building, testing, and flying of the spacecraft by the team—no loss in understanding by ensuring that a small group is involved”

Among the Discovery missions, the Lunar Prospector stands as a unique example of a style of project management that is suited to small, in-house projects.

NASA approved the Stardust project for development in October 1996, with a total mission cost estimate of $206.1 million.8 At the launch in February 1999, the spacecraft had cost $1 million less than baselined, although the baseline estimate included 12 percent reserves ($19 million). The project management at both JPL and Lockheed Martin Astronautics (LMA) was stable throughout the development period, and an excellent information and communications system infrastructure enabled the PI to keep abreast of project activities from his location at the University of Washington, without constant traveling to Pasadena and Denver. The PI (Dr. Donald Brownlee) relied upon the JPL project manager to manage the day-to-day activities. A tailored approach to earned-value management and a common work breakdown structure between JPL and LMA facilitated analysis of performance against baseline plans.9 From a hardware standpoint, LMA took advantage of developments on, and shared parts procurements with, the Mars Surveyor ’98 projects. LMA also put company funds into the development of the sample return capsule. The knowledge inheritance from these efforts mitigated the cost-estimating risk. The structural design approach originally proposed had to be redone to reduce the mass by 100 kg; this was discovered early enough (at the Capability Requirements Review during phase B) that it could be fixed in a timely manner. The project management approach employed for Stardust proved to be one of the success stories for the “faster-better-cheaper” mission design and implementation philosophy.
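The earned-value management approach mentioned above reduces to a few simple ratios comparing the budgeted value of work scheduled, the budgeted value of work actually performed, and the actual cost of that work. The following sketch is illustrative only; the dollar figures are hypothetical and are not drawn from the Stardust project or any other NASA mission.

```python
# Illustrative earned-value management (EVM) calculation.
# All figures are hypothetical, in millions of dollars.

def evm_indices(planned_value, earned_value, actual_cost):
    """Return (cost performance index, schedule performance index).

    CPI > 1.0 means work is being accomplished for less than budgeted;
    SPI > 1.0 means work is running ahead of the planned schedule.
    """
    cpi = earned_value / actual_cost    # budgeted cost of work performed vs. actual cost
    spi = earned_value / planned_value  # work performed vs. work scheduled to date
    return cpi, spi

# Example: $50M of work was scheduled to date, $45M worth was actually
# accomplished, and $48M was spent accomplishing it.
cpi, spi = evm_indices(planned_value=50.0, earned_value=45.0, actual_cost=48.0)
print(f"CPI = {cpi:.2f}")  # 0.94: overrunning cost on the work performed
print(f"SPI = {spi:.2f}")  # 0.90: behind the planned schedule
```

Indices below 1.0 on either measure are the kind of early warning signal that the monthly and quarterly status reviews described in this report are meant to surface.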

In 1996, the SSB studied the merits of extending the Discovery program approach to the Explorer program. The SSB supported the extension of the “PI mode” because “[it] is an open process that appears to be intrinsically fair. It has exposed a reservoir of ideas for focused

8 The $206.1 million included $117.8 million for spacecraft development, $37.2 million for mission operations and data analysis (MO&DA), $41.5 million for the Delta launch vehicle, and $9.6 million for the preliminary design phase. The 1994 AO stipulated a cost cap of $150 million for the development phase and $35 million for MO&DA, expressed in constant 1992 dollars. The launch vehicle was not included in the cost caps.
9 Earned value management (EVM) provides management with a means of evaluating planned vs. actual schedule and cost performance. The common work breakdown structure was an innovative approach that allowed for such performance analysis.

science under a cost cap.”10 This finding was based on the results from the Discovery AO solicitation and subsequent evaluation process. Even though the experiment in using PIs to lead project teams through the design and development phases was still underway in the Discovery program, the PI-led mission approach was extended to the Explorer program.11

THE RATIONALE FOR PI LEADERSHIP

There was both a federal acquisition policy and a management philosophy behind NASA’s adoption of the new approach. First, NASA’s plan was to solicit—through the AO process—the best proposals for focused science missions from individual teams of scientists and their supporting casts, with the understanding that each proposal amounted to the intellectual property of the team. NASA would evaluate the proposal on its scientific merits first and foremost, and then for the feasibility of its technical, management, schedule, and cost elements within the constraints established in the AO. The PI clearly has to make a compelling case for the value of the scientific return from the mission. The quality of the science is the PI’s responsibility.

Second, NASA viewed the PI’s placement in a leadership role throughout the design, development, and mission operations phase as proper because the PI had the most at stake if the mission failed to produce scientifically valuable returns on the public’s investment. The NASA science executives recognized there would be many difficult design decisions and issues during the design’s implementation, with necessary compromises that would impact the potential science return. The PI would make the big decisions, clearly with the counsel of the PM and his team of engineers and scientists. Rather than having the PI in an advisory role to the project manager, NASA decided that the PI would have the final say on issues involving the balance between the potential science return and the engineering risk and its impact on costs and schedule. To give this decision credibility, NASA put the PI in charge of the process that allocates the project’s funding, and, in particular, the reserves. NASA also stipulated that the PI would select the PM and would be able to petition the institutional project management center for the PM’s removal.

Putting a scientist in the lead position for the proposal was not revolutionary, but having a scientist decide the allocation of funds during development raised some concerns. The response was several-fold. First, NASA would rigorously evaluate the proposed management arrangements as well as the proposal’s technical, cost, and schedule feasibility. Second, NASA officials would be continuously engaged in the project development cycle, beginning with the conditional selection of a PI’s proposal for NASA-funded definition studies, followed by a second step of preliminary design, leading to final selection for authority to proceed with development. Third, during the development process, the higher level NASA program manager would have “insight” into the status of the individual projects through discussions with both the PI and PM on at least a quarterly basis. Finally, as is customary with all NASA-managed developments, a stipulated requirement was monthly status reviews of the project’s science and

10 Space Studies Board, Commission on Physical Sciences, Mathematics, and Applications, Assessment of Recent Changes in the Explorer Program, National Research Council, December 1996.
11 In the last several years, NASA has added two new categories of PI-led missions: the New Frontiers Program for higher cost solar system exploration missions and the Scout projects in the Mars Exploration Program.

engineering decisions, procurement planning, schedule progress, funding requirements, actual vs. planned cost performance, and projected costs.

The PI carries out the leadership role within a tightly constrained set of documented requirements. The Program Level Requirements Document signed by the NASA Headquarters associate administrator gives a designated project the authority to proceed into development. This document also includes the cost and schedule caps and details the launch vehicle to be used, the minimum science “floor” that the designed and developed mission must meet, and other interfaces such as the reporting relationship to a specific governing Program Management Council. All these boundary constraints are under the direct authority of the associate administrator. The PI must stay within those limits, or the mission will undergo a Termination Review.

THE EXPERIENCE WITH PI LEADERSHIP

The Academy team’s review of PI-led missions found only a few instances where the PI was deeply engaged with the PM in the overall day-to-day operations of the project team and interactions with the contractors. Lunar Prospector was such a case. The Wilkinson Microwave Anisotropy Probe was another. One Discovery mission PI noted that over time he had to work with three PMs and was forced to get involved when the latest PM did not understand why a particular design approach had been selected. In this instance, the predecessor PM had not documented why the design issues got resolved in a particular manner, and the new PM was opening up issues that had already been settled with the contractor. Alerted to this, the PI stepped in and filled the role of knowledge transfer agent between the last PM and his predecessor.

Based on the NRC/Academy interviews with PIs, it is clear that different styles of leadership and management are possible. Some PIs believe the job requires their full-time involvement. Others engage the PM frequently, perhaps having daily conversations and attending meetings several times per week, either physically or remotely via teleconferences. Only a few PIs continued teaching and publishing. To be successful in a part-time capacity, the PI relied even more heavily on having a project scientist on the PM’s staff to represent the science interests in the engineering discussions. Both PIs and PMs agreed that the quality of the “marriage” between the PI and PM is critical to a successful mission. The interviews yielded several instances where PIs recounted that they had asked for the removal of PMs who lacked the skills required to succeed in the fast-paced environment of cost and schedule caps. In other instances, the reason was a lack of trust, which arose when the PMs failed to keep the PIs appropriately advised of problems or made decisions that were not consistent with project priorities.12 The Academy team found cost and schedule growth in projects where the PIs had to deal with multiple PMs.

Some PIs possessed engineering and management as well as scientific skills, and were able to rely less on the PM and associated staff. One PI expressed the view that the PI has to be watchful of design and development decisions by engineers, who lack an appreciation for the

12 In one example, a PM agreed to subordinate project resource needs in favor of institutional priorities without advising the PI as to the decision’s rationale.

goals of the scientific mission.13 The Wilkinson PI took it upon himself to interact constantly with the engineers, and to work to solve design, schedule, and cost problems that emerged as the engineers attempted to meet the requirements specified in the interface control documents. This PI provided the engineers with an understanding of which design requirements were critical to the science return and which were not.14 He viewed this revision process as essential to meeting the tight schedule and cost caps for the mission.

Few PIs interviewed claimed to understand budget forecasting, cost estimating, and accounting for the indirect and direct elements of reported costs. One highly experienced former PI said of himself and his colleagues: “We PIs are generally clueless about costs.” A first-time PI interviewee said he was at a great disadvantage when his PM and staff discussed cost trends and forecasts. The PI came from an academic setting and lacked experience with the cost-charging practices of JPL and the project’s industrial contractor. He said he could not understand why the latest cost estimates provided to him were any better or more valid than the previous ones. Further, he remarked that he was “at the mercy” of what JPL staff told him, and they exhibited little interest in helping him understand issues such as the calculation of the overhead rates.

Several PIs questioned whether the PM and associated staff saw any value in helping the PI become knowledgeable about budget and cost management issues, starting with the proposal and continuing through the formulation and implementation phases. One PM acknowledged that he found no particular advantage in having the PI more educated about project control practices. He feared that a PI with little or no experience in project control would slow decision making by asking questions and wanting clarifications, and that the PI’s lack of in-depth knowledge would prevent him from engaging in a productive dialogue with the PM and his staff. This lack of knowledge did not appear to be a major concern for PIs whose missions did not encounter cost growth or schedule problems that threatened to exceed the cost/schedule caps. These PIs expected PMs to keep them informed and to handle the details using their best judgment. On the other hand, it was an issue for the PI whose mission encountered such problems.

The comments made by the PIs about the difference between the assumptions underlying the proposal cost estimates and the ensuing reality raise concerns. Many PIs said they were chagrined to find that: “off the shelf” hardware was not usable without modifications that were not included in the cost estimates; the cost for hardware with new technology is highly uncertain; there is a strong likelihood that a supplier will provide one price for use in a cost estimate but

13 The ensuing discussion on the merits of this argument led to an understanding of the greater potential for distrust between scientists and engineers working on non-PI-led missions. The scientists indicated that they were motivated to specify requirements that exceeded their real minimum requirements because they believed the engineers would design to the specified minimum; they did not believe the engineers had any compelling interest to design so as to generate the highest scientific return within the dictated cost and schedule parameters. The scientists were motivated to yield ground only grudgingly. This process has had some additional schedule and cost impacts in the case of program-led missions, although the effect was said to vary as a function of the skill of the PM in forcing issues to a timely and equitable resolution. Although the team’s interviews with PIs and PMs did not enable it to conclude that no traces of this attitude persist in PI-led missions, the PI clearly leads and is trusted by the science team to achieve the best possible science return within the constraints of the mission’s Program Level Requirements Document.

14 This approach is akin to a “Six-sigma” design methodology, because the “on the spot” design interactions allow for issues to be resolved before major design reviews, instead of waiting for problems to emerge during the reviews that then have to be resolved through the review item discrepancy process.

will not stand by that price when the actual order is placed one to two years later; and NASA centers, JPL, APL, and aerospace firms will increase their overhead pricing rates when the business base is lower than projected. These “surprises” betray a lack of previous first-hand experience on the part of the first-time PIs, to be sure; however, they also reflect a failure on the part of the aerospace partners, NASA, and other institutions to educate PIs on the “lessons learned” from prior development projects.

Of equal—if not greater—concern, the Academy team’s interviews led to an understanding that the primacy in the evaluation process of the potential for an excellent science return created pressure to pack as much science return as possible into the proposal, to the detriment of the feasibility of the proposed costs. A few PIs were informed beforehand that the engineering cost estimates contained a high degree of optimism, but they justified that optimism by the need to be selected.

To illustrate just how competitive the process is, Table 1 on page 11 provides the number of proposals submitted by PI-led teams; the number selected for concept feasibility studies; the number approved for development; the number still under development; and the number of missions launched.15 The data are for the NASA Discovery, Explorer, Mars Exploration,16 and New Frontiers space science programs.

15 Table 1 does not separately identify projects that are still in preliminary design and thus have not been confirmed for development.

16 In the Explorer program, there are three categories of PI-led missions—Medium-class Explorer, Small-class Explorer, and University-class Explorer.

Table 1. Number of PI-Led Mission Proposals Received; Selected for Concept Feasibility Studies; Selected for Development; and Launched, 1994-2003

Announcements of    Proposals   Studied for   Confirmed for              Under
Opportunity         Received    Feasibility   Development     Launched   Development

DISCOVERY
  1994 AO               27           4              2               2
  1996 AO               34           6              2               2
  1998 AO               26           5              2               2
  2000 AO               23           3              2                          2
  2003 AO               16           0

MIDEX
  1995 AO               43          13              2               2
  1998 AO               31           5              1               1
  2001 AO               43           4              2                          2

SMEX
  1997 AO               46           3              2               2
  1999 AO               46           7              1                          1
  2003 AO               29           5              1                          1

UNEX
  1998 AO               35           2              1               1

MARS SCOUT
  2002 AO               18           4              1                          1

NEW FRONTIERS
  2001 AO                2           2              1                          1
  2003 AO                7           2

TOTALS                 426          65             20              12          8

Sources: NASA Science Selection Support Office; NASA Space Science Research Selection announcements.
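The selection funnel implied by the totals row of Table 1 can be summarized with a short calculation. This is an editor's illustration derived from the report's own numbers, not an additional data source:

```python
# Selection funnel from the Table 1 totals, 1994-2003 AOs.
received, studied, confirmed, launched = 426, 65, 20, 12

feasibility_rate = studied / received    # step 1 proposal -> concept feasibility study
development_rate = confirmed / studied   # feasibility study -> confirmed for development
overall_rate = confirmed / received      # proposal -> confirmed for development

print(f"Selected for feasibility study: {feasibility_rate:.1%}")
print(f"Confirmed for development:      {development_rate:.1%}")
print(f"Overall selection rate:         {overall_rate:.1%}")
```

Roughly 15 percent of proposals survived step 1, about 30 percent of studied concepts were confirmed for development, and fewer than 5 percent of all proposals reached development.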

The key point of Table 1 is to reflect the intensity of the competition. Proposers know that only 10-20 percent of submitted proposals will be selected for the next round, the concept feasibility phase. Of those that make it through the concept feasibility stage, closer to 30 percent are selected for development. Overall, less than 5 percent of the proposals are selected for development. The Academy team believes that the intensity of the competition and NASA’s ground rules for proposal submittal and selection contribute greatly to the probability of lower cost estimates being submitted. The seeds of the problem with cost overruns are sown here. The Academy team’s interviews also suggest there is a conflict between what NASA desires—high

confidence cost estimates—and the amount of resources required to generate the proposals.17 To complete this background section, NASA’s phased project planning approach and the watershed event—the twin failures of the Mars Surveyor 1998 missions—that led to a dramatic change in the “operative” environment for the PI-led missions are discussed briefly below.

NASA’S PHASED PROJECT PLANNING AND EXECUTION PROCESS

NASA employs a disciplined, documented approach to managing space science flight programs that has changed remarkably little over the past decades. The phased planning approach is used to guide first the formulation and then the implementation of space flight projects.

The formulation stage is broken into two phases: A, which is intended to ensure that the mission requirements are precisely defined; and B, which is intended to provide a preliminary technical design that addresses the mission requirements and meets the affordability, timeliness, and risk tolerance criteria. NASA closely reviews a project’s readiness to proceed from phase A to phase B. If it is evident that the proposal does not accommodate the desired science return within the predetermined caps on schedule and cost with an adequate level of confidence, then NASA either terminates or reduces the project’s scope—commonly referred to as descoping.

The process leading to approval of a project’s entrance into the implementation stage entails a thorough evaluation of the science, technical approach, management, cost, and schedule. A Confirmation Assessment Review by an independent team precedes the Confirmation Review (CR) with the NASA associate administrator for the Science Mission Directorate. The associate administrator issues a Program Level Requirements Document that establishes a set of formal understandings about the planned approach for the project. The implementation stage consists of a detailed design and development phase (C); a spacecraft integration and test, launch preparations, and launch support phase (D); and a mission operations phase (E) that lasts for a specified period of time. Many successful missions operate beyond this period and are funded for extended mission operations (phase F). (See Figure 1.)

17 As noted, NASA did not bear the costs of producing the 426 proposals—at least, not directly. Industry, academic institutions, and government laboratories have approved “bid and proposal” budgets as an element of the indirect costs that can be billed as an allowable cost on government contracts. To the extent that NASA contracts with these institutions, it indirectly pays for the bid and proposal costs.

Figure 1. Space Science Flight Program Management Process Flow

Source: Space Science Enterprise Management Handbook, Office of Space Science, NASA Headquarters.

The amount of funds spent during formulation is not great in percentage terms as compared with the implementation stage, but the groundwork laid in the early phases is vitally important to avoid costly redesigns or compromises to mission success during the implementation stage. Previous studies of cost overruns on NASA projects have stressed the importance of reducing the implementation stage risks by investing sufficient funds and time in (1) demonstrating sufficient levels of maturity in mission critical technologies, and (2) reaching at least a state of design definition such that a preliminary design review can be successfully completed. Unfortunately, there is a tendency among NASA managers to argue that a high ratio of funds spent during project definition to the total phase A-D cost estimate is in itself an indicator of design maturity. This point is cogently made by Dr. Mathew Schaffer.

The hypothesis that a higher definition ratio helps to control cost growth has inherently a compelling logic, and one expects the hypothesis is valid. Of course, the fact that a project spent any particular percentage of its overall budget prior to implementation may not, in the end, be particularly correlated with a sound understanding of the project … A high definition ratio could just as well have been caused by requirements changes or re-plans due to budget reductions in the definition phases, which would naturally have led to attendant slips in the start of implementation. Many NASA projects had high up-front costs for just such

reasons, and the projects ultimately embarked upon implementation. In this phase, the projects had their successes and their troubles. But these events were independent of the magnitude of the long forgotten up-front costs. In short, a high definition ratio can be beneficial only if the resources are spent reasonably efficiently.18

Indeed, the Academy team’s review of the cost performance of the Discovery projects does not show a good correlation between the identified NASA funding investments and the projects’ subsequent cost performance—the final development costs vs. the cost estimates at the time of the CR. Table 2 provides (1) the investments made by NASA during the phase A and phase B formulation stage, expressed as a percentage of the development phase estimates at the CR; and (2) the development stage estimates without reserves as a percentage of the actual development costs.

Table 2. Cost Performance Measures for Discovery Projects Expressed as a Percentage

                     Formulation Stage             Implementation Stage
                     Investment                    Internal Cost Growth
                     (A/B ratio to C/D estimate)   (C/D estimate to actual)

NEAR                          2.0                          -5.5
Mars Pathfinder               6.0                          23.6
Lunar Prospector              5.8                          -6.5
Stardust                      8.1                          18.1
Genesis                       9.2                          33.0
Contour                      11.6                          18.5
Messenger                    19.3                          38.4
Deep Impact                  24.5                          58.0

Sources: NASA Headquarters budget documents; Jet Propulsion Laboratory cost analysis documents.
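The absence of the hoped-for relationship can be illustrated with a simple calculation on the Table 2 values. If higher formulation-stage investment controlled cost growth, the two columns would be negatively correlated; instead, the later, more complex missions show both higher investment ratios and higher internal cost growth. The computation below is an editor's illustration, not part of the report:

```python
# Table 2 values transcribed for illustration: formulation-stage investment
# ratio (percent) vs. implementation-stage internal cost growth (percent).
investment = [2.0, 6.0, 5.8, 8.1, 9.2, 11.6, 19.3, 24.5]
growth = [-5.5, 23.6, -6.5, 18.1, 33.0, 18.5, 38.4, 58.0]

# Pearson correlation, computed by hand to keep the sketch dependency-free.
n = len(investment)
mx = sum(investment) / n
my = sum(growth) / n
cov = sum((x - mx) * (y - my) for x, y in zip(investment, growth))
sx = sum((x - mx) ** 2 for x in investment) ** 0.5
sy = sum((y - my) ** 2 for y in growth) ** 0.5
r = cov / (sx * sy)
print(f"Pearson r = {r:.2f}")  # positive: more up-front spending, more growth
```

A positive correlation here mainly reflects the confounding factors the report notes next (mission complexity, changed risk environment), which is precisely why the definition ratio by itself is a poor indicator of design maturity.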

It should be noted that several factors bias the cost data. The formulation stage investment only includes NASA’s funding, not the industry/government/university investments preceding the proposal and during the formulation stage. The value of knowledge from or synergy with other past/current developments is missing. The recorded costs for the implementation stage do not reflect the full effort expended for the development, largely because there is a significant amount of uncompensated labor that does not show up in the recorded costs. The relative complexity of the spacecraft is not the same, e.g., comparing the “simple spacecraft” of Lunar Prospector vs. the complex spacecraft of the later missions. And, notably, the failure of the two Mars Surveyor

18 Dr. Mathew Schaffer, “The Myth of the Gruhl Curve,” unpublished NASA Headquarters Cost Analysis Division document, ca. 2003.

1998 spacecraft dramatically changed the risk tolerance environment of the earlier missions. NASA imposed a level of formality and rigor in the design and development environment for subsequent missions that clearly influenced costs.

IMPACT OF THE NASA INTEGRATED ACTION TEAM REPORT

NASA’s willingness to tolerate the risks of failure for projects over the last 15 years has changed dramatically. In March 2000, NASA released reports that addressed the failures of the Mars Climate Orbiter and Mars Polar Lander in 1999, the Space Shuttle wiring problems, and the “faster-better-cheaper” approach to missions. A NASA Integrated Action Team (NIAT) reviewed the findings of these reports, integrated the lessons learned, and augmented them with findings from its own examination of project management practices across the agency. The NIAT report, published in December 2000, called for sweeping changes in how NASA executed its missions.

It would be difficult to overstate the impact of the NIAT recommendations. For NASA’s science missions, the pendulum swung away from the increased tolerance of the risk of failure present during the 1990s to a greatly reduced level of risk tolerance. In practice, this shift has led to an “overwhelming” increase in the amount of documentation required, additional independent reviews, mandated use of independent verification and validation of software, and an intense focus on risk analysis and management. The resultant impact on management and costs of all science missions has been significant.19 The changes affected the PI-led missions more than they did the larger, “program-led” missions, because the larger missions were closer to the minimal risk tolerance model NASA had adopted as the norm.

To its credit, NASA management adjusted the cost caps for those PI-led missions that were caught up in the backwash of these changes. In the case of a number of missions evaluated in this study, it increased the cost caps to reflect the individual project office’s estimate of the NIAT effects.20 In spite of the adjustments to the caps, the cost overruns for PI-led missions have increased dramatically since the watershed events of 1999-2000. Many interviewees said the underestimation of mission costs reflects in part poor prediction of the true costs of the NIAT changes.

19 Based on experience, moderation of the mandated changes normally occurs after a few years. The failure of the Comet Nucleus Tour, a Discovery mission, in mid-2002, the loss of the Space Shuttle Columbia in 2003, and other mishaps appear to have inhibited this shift.

20 The adjustments generally amounted to $5-10 million. It should be noted that the actual costs of complying with the NIAT changes have not been rigorously tracked or even estimated using post-mission fact-finding evaluation techniques. Thus, the true impact on mission costs is poorly understood.

SECTION II

FINDINGS, ANALYSIS, AND RECOMMENDATIONS

The history of cost overruns in space flight projects shows that the greatest overruns occur when there is a mismatch between the project’s cost estimate and the scope of the engineering challenge required to achieve a project’s scientific and/or engineering objectives. The mismatch is driven by a number of potential factors. Commonly, the cost estimate has been prepared in an environment influenced by externally dictated “affordability” constraints. Often, the engineering solutions are not well understood for spacecraft systems or instruments that are critical to the mission’s success. Key assumptions used for preparing the estimates prove to be inaccurate. For example: frequently, the resources (funds, talent, facilities, launchers) are not available in a timely manner due to another project’s higher priority; occasionally, the project leadership is not up to the management challenge; quite often, the external environment for failure tolerance changes, and new design/test requirements are imposed; and delivery and quality control problems are encountered that consume the contingencies included in the schedule and cost plans. Finally, the cost estimators simply are handicapped by a lack of relevant prior experience; they lack appropriate cost analogies that could provide a basis of comparison.

NASA program managers have several tools at their disposal to reduce the probability that a new project will be affected by one or more of these factors. First, they can ensure the conditions established in the Announcement of Opportunity do not inadvertently contribute to the creation of a mismatch. The cost caps, schedule caps, funding constraints, and review cycles can reflect the desired science return and the imposed management conditions. Second, they can use unbiased external reviewers with relevant science, engineering, and cost estimating experience to provide the critical insights needed to make informed decisions during the project’s life cycle. Third, they can make allowances for unanticipated but mandated changes, the adverse effects of non-anticipatable events, and anticipatable but unrecognized resource constraints that lead to inefficient expenditures.21 Fourth, they can ensure a match between the managerial and engineering talent furnished to the project and the challenges inherent in the project. Finally, they can reduce the probability of deliberate underestimating of the “basic cost estimate”—the estimate for the baseline mission elements before the application of reserves—by insisting on a high confidence in that cost estimate.

As discussed in the prior section, NASA program managers do avail themselves of these tools, and have made adjustments in their management approach in response to changed circumstances. The findings and recommendations provided below should be viewed as building on and strengthening current NASA practices. The Academy team’s examination of the cost growth of projects from the baseline without reserves (as seen in “the internal cost growth” for Discovery projects) indicates how seriously underestimated the starting estimates for many projects have been. The team’s findings, analysis, and proposed corrective actions are presented below.

21 Obvious examples are delays due to technical facilities not being available, lack of access to the most talented personnel, funding limitations, delays in launch vehicle readiness, and quality/availability problems with electronic piece parts.

The analysis is broken down into two principal processes. The first examines how the proposal process for PI-led missions influences the quality of the basic cost estimates. The second focuses on the mission development process.

THE PROPOSAL PROCESS

Finding 1: The proposal and selection process has characteristics and limitations that encourage the submission of optimistic basic mission cost proposals for science missions.

The AO proposal process encourages a proposer to adopt strategies that emphasize the prospects for exciting science returns from innovative missions that explore new environments, using a mixture of proven and new technologies. Proposers are astutely packaging their proposals to address NASA’s expectations—as articulated in the AO and NASA program management directives—in terms of accepted project management processes. And proposers are very aware of the need to meet NASA’s stipulation that a specified percentage level of project cost reserves be shown as available at key milestones before the project will be approved for development start.

However, the specific rules set up by NASA in the AO work against proposers providing NASA with proposal cost estimates for the baselined mission content that have a higher probability of being successfully executed without reliance on project cost reserves.

• First, the proposers must satisfy NASA’s emphasis on the amount of reserves included, where higher levels of reserves in the estimates are evaluated as “a strength.” The AO specifies a percentage of “unencumbered reserves” at the time of confirmation for development of at least 25 percent of all development costs in phases C and D; if the unencumbered reserves are lower than 25 percent, the projects “are likely to be judged as having an unacceptably high cost risk and, therefore, not confirmed for further development.”22 As a consequence, the proposers are encouraged to prepare estimates that ensure the required level of reserves can be displayed. The Academy team’s interviews suggest that proposers are achieving this requirement by using more “optimistic” costing assumptions in their basic cost estimates.

• Second, the rules of the AO process call for a focus on total mission costs, including the amounts spent during phases A and B. This focus on the end-to-end mission costs at the time of the step 1 proposal establishes a presumption of a level of maturity in the cost estimates that is normally achieved only by the end of the formulation phase. Indeed, the proposer is encouraged to include statements that stress the positives—e.g., a stable design and mature technologies—and minimize the negatives. There are no incentives for a proposal team to propose formulation phase “design trades” to reduce technical risks, even if the trades were to be funded using its own resources. It is better to be quiet about the design risks than to signal to evaluators that there is a design maturity issue. The Academy team also noted that there are even greater disincentives for a proposal

22 See Announcement of Opportunity, New Frontiers Program 2003 and Missions of Opportunity, NASA AO 03-OSS

team that proposes NASA funds be used during phases A and B for design trades: the expenditures are counted against the cost cap, thereby reducing the total funds available for development, and, in effect, applying funds that otherwise could be counted as “reserves.”

• Third, proposers are fully aware that a mission plan intended to achieve truly exciting science returns often requires departures from the “experience base.” Employing existing technologies and exploring well-understood spacecraft operating environments are avenues that markedly reduce cost and schedule risks. However, as an example, the exploration of Mercury by the MESSENGER spacecraft clearly necessitated the introduction of advanced materials and technologies and exposure to a poorly understood, stressful spacecraft thermal operating environment; the cost growth experienced during the MESSENGER’s development phase had much to do with the inherent challenges required to achieve the desired science return.

The PIs interviewed clearly stated the conflict between achieving exciting science and achieving lower cost and schedule risks. However, the fierce competition among proposal teams leads to the understanding that step 1 proposals that promise high science return are favored over those with reduced science return but lower cost and schedule risks. This conundrum is an incentive for the proposal teams to take their chances that the step 1 independent reviewers will not fully appreciate the cost and schedule risks inherent in achieving the desired science return.

• Fourth, NASA has been slow to adjust cost caps to reflect the true impacts of the post-NIAT environment and the progression toward more difficult science exploration missions (the simpler and cheaper missions have already been done or are not as exciting).23

• Fifth, the AO restricts the potential cost increase to 20 percent from the time the proposal is received to the concept feasibility report published at the end of phase A, and in no case can it be higher than the cost cap. There is little incentive for proposers to alter the initial cost estimates to reflect their sense of “realism” unless the proposers view implementation of a descoping option as viable.

The Academy study team concluded that the only way a project team can, without penalty, achieve a level of formulation phase expenditures sufficient to permit design trades to refine the design or to develop engineering testbeds is to do so without NASA funding. The Academy team’s interviews revealed this is the practice the proposal teams have adopted; the proposal teams at least match, if not exceed, the limited amounts of funding NASA provides during phase A.24 In the step 1 proposals, the plans for phase B also have to be

23 The PIs and PMs have argued that the cost caps in the AOs should be increased to reflect this. The increase in total mission cost limitations in recent NASA AOs shows that NASA’s Science Mission Directorate has recognized the merits of this argument.

24 The amounts provided by NASA for phase A studies have increased over time. The 1998 Discovery AO provided for $375 thousand; the 2000 Discovery AO provided for $450 thousand; and the 2004 Discovery AO provided for $1 million.

provided in some detail; again, there is little incentive for a team to put forth a plan of technology trades or for testbeds that consumes funds otherwise available for reserves. In addition, NASA specifies an allowable funding profile that it expects proposers to adhere to in their planning; this militates against a proposer planning a more robust formulation phase than NASA has allowed for in the allowable spending profile.

The Academy team also considered the merits of NASA’s strategy of relying on descoping options that could be put into effect should the level of reserves diminish to an unacceptable level. The proposers have to provide a detailed plan for descoping to the “minimum science floor.” The interviewees said this strategy is of limited effectiveness during the formulation phase. Its limitations stem from the incentives for proposers to maintain as much science capability as possible; thus, the proposers are likely to overstate the cost savings achieved by implementing specific descoping options. Because the objective is to be confirmed for development—with NASA’s evaluation of reserve levels a key factor—the proposers have every incentive to overstate the savings and little incentive to recognize the additive costs that result from redesigns. The interviewees noted that NASA had the ability to verify the savings from eliminating capabilities, but lacked the detailed insight required to appreciate the costs of redesigns. During implementation, several project managers noted that pre-planned descopes were considered to be of minimal effectiveness in restoring reserves, because post-design review changes in the technical requirements require redesigns that have broader systems engineering and integration impacts. The need for cost-reducing changes is usually recognized six months to a year after the critical design review—too late to implement descopes that could generate appreciable savings.

The Academy team concluded that changes to the AO provisions that govern the proposal’s cost estimates could present the best approach to reducing the probability of cost overruns. Instead of relying upon stipulated reserve levels to address uncertainties, NASA could request the proposer to address the uncertainties by attributing a confidence level to the baseline estimate, and showing the magnitude of that uncertainty by providing lower and higher values commensurate with increased and decreased levels of cost estimating uncertainties. By requiring a three point estimating approach, NASA could also gain insight into the areas of greatest technical risk, that is, where there are the greatest dispersions in the low-medium-high confidence estimates.
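One way such a three point approach could work in practice is to carry low/most-likely/high estimates for each work breakdown structure element and roll them up with a simple Monte Carlo draw, reading the cost carrying a desired confidence level off the resulting distribution. The sketch below is an editor's illustration under invented numbers, not a NASA-prescribed method; the element names and dollar values are hypothetical:

```python
import random

# Hypothetical WBS elements with (low, most likely, high) estimates in $M;
# wide low-to-high spreads flag the areas of greatest estimating uncertainty.
wbs = {
    "spacecraft bus": (60.0, 70.0, 110.0),   # new technology: wide spread
    "instruments":    (40.0, 45.0, 55.0),
    "mission ops":    (20.0, 22.0, 26.0),
}

def simulated_totals(wbs, trials=100_000, seed=1):
    """Draw each element from a triangular(low, high, mode) distribution
    and sum across the WBS, returning the sorted mission totals."""
    rng = random.Random(seed)
    return sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in wbs.values())
        for _ in range(trials)
    )

totals = simulated_totals(wbs)
p70 = totals[int(0.70 * len(totals))]  # cost carrying ~70% confidence
print(f"70% confidence mission total: ${p70:.0f}M")
```

Because the right tail of each triangular distribution is driven by the high estimate, the roll-up makes visible exactly what the report argues: a baseline built from the most-likely values alone carries well under 50 percent confidence when the technically risky elements have wide dispersions.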

However, the Academy team was concerned about the level of additional investment the proposal teams would have to make to respond to the AO with their step 1 proposals. Although this would be of reduced concern for proposers whose proposals build on the work expended during previous AO submissions, the new requirement could burden a first-time proposer, who must assemble the necessary teammates and persuade them to invest their resources to generate a proposal.

As a consequence, the Academy team concluded the intent of the proposed change could be met by requiring the three point estimate as a product of the Concept Feasibility Study. If NASA evaluated the risks as acceptable and approved the proposal for the phase B definition and design studies, NASA could also work with the PI’s team to ensure the schedule and funding resources expended during the preliminary design studies were tailored to reduce the areas of

greatest cost estimating uncertainty. The NASA decision official could accordingly focus his Confirmation Review on whether the basic cost estimate had a high level of confidence.

Recommendation 1: NASA should adopt a risk-based estimating focus on the basic elements of cost, and require, at the end of the Concept Feasibility Study (phase A), an analysis of the estimated costs—before reserves—that provides high/medium/low-range estimates for key elements of the work breakdown structure, with the high-to-low-range estimates based on a confidence-level distribution.

* * * * *

Finding 2: The independent evaluation teams used for NASA’s proposal evaluation process provide the step 1 decision official with an appropriate level of understanding of the science value, management approach, and inherent technical risks. However, the AO-stipulated reserve percentages operate as a disincentive for the proposer to provide more realistic baseline cost estimates and an assessment of the appropriate level of reserves. NASA could gain useful information from proposers by requiring them to address a specific list of “cost risk subfactors.” The NASA evaluation team could use the responses to these subfactors as a consistent basis of comparison among proposals.

NASA uses the initial phase of the selection process to eliminate proposals that do not combine an exciting science discovery potential with an appropriate technical, management, and cost approach. NASA carries out the evaluation on the basis of the written proposal, without requiring explanations of unclear elements. It is only after the selection of the best proposals for concept feasibility studies and generation of the study reports that the independent evaluators can meet with the proposal team and complement the written study report with fact-finding discussions.

The Academy team’s examination of mission proposal evaluations indicated that the evaluators’ reports were generally effective in identifying and disclosing to the NASA decision official the preponderance of the technical issues associated with the project. However, the cost evaluations of the impact of those technical issues on the pre-reserve basic mission costs were not as effective in providing the decision official with an understanding of the potential costs. The cost evaluations were handicapped by the quality of the information provided in the proposal. Where a proposal’s cost estimates were inconsistent or unclear, the cost evaluator could do no more than note those inconsistencies and ambiguities in the step 1 inputs to the decision official.

The Academy team considered recommending that NASA allow the evaluators to submit requests for clarification to the proposal teams during the step 1 evaluation process to clear up these ambiguities. However, the number of proposals submitted to NASA in response to the AO is high enough to raise concern about the additional time required for a review and clarification cycle.

A better approach appears to be to require each proposal team to respond to a specific list of “cost risk subfactors.” The JPL cost estimating group published a set of such cost risk subfactors in its December 2003 study of contributing factors to underestimates of baseline cost estimates and insufficient levels of reserves. Both primary and secondary risk subfactors were identified. The primary risk subfactors were: mission with multiple flight elements; operation in harsh environments; mission enabling spacecraft technology with a technology readiness level of less than 5; new design with multiple parameters not meeting the margin requirements specified in the JPL design principles; inadequate team and management experience; contractor inexperienced in mission application; level 1 requirements not well defined in formulation phase; excessive reliability requirements; new system architecture; and late selection of science instruments. The JPL cost evaluation assigned numeric values against the primary risk subfactors and secondary risk subfactors, and generated an aggregate cost risk evaluation total for each proposal. The total score is indicative of the total cost risk and the commensurate need for reserves.

One possibility is to use the JPL list of cost risk subfactors as the basis for a set of required information responses to be included in future AOs. Each proposer would provide information on each cost risk element. While the Academy team believes the JPL list may not be suitable as a standard for every AO, the NASA cost estimating community could work with the AO science proposal evaluation team to tailor a list for a given AO. The proposers’ responses and the team’s evaluations of them would provide a consistent basis for comparing the cost risks across the step 1 proposals.
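The aggregate scoring scheme described above can be sketched in a few lines. The subfactor names below come from the report; the per-subfactor weight and the yes/no scoring are illustrative assumptions, not the actual JPL values.

```python
# Sketch of an aggregate cost-risk score in the style of the JPL cost risk
# subfactors. Weights and scoring scheme are illustrative assumptions.

PRIMARY_SUBFACTORS = [
    "multiple flight elements",
    "operation in harsh environments",
    "enabling technology below TRL 5",
    "design margins below JPL design principles",
    "inadequate team and management experience",
    "contractor inexperienced in mission application",
    "level 1 requirements not well defined",
    "excessive reliability requirements",
    "new system architecture",
    "late selection of science instruments",
]

def cost_risk_score(flags, weight=2):
    """Sum a numeric value for each primary subfactor a proposal trips.

    flags  -- dict mapping subfactor name to True/False
    weight -- illustrative points assigned per primary subfactor
    """
    return sum(weight for name in PRIMARY_SUBFACTORS if flags.get(name, False))

# A hypothetical proposal tripping three primary subfactors:
proposal = {
    "multiple flight elements": True,
    "enabling technology below TRL 5": True,
    "new system architecture": True,
}
print(cost_risk_score(proposal))  # higher total -> larger reserve need
```

In the actual JPL evaluation, secondary subfactors would contribute additional (presumably smaller) values to the same aggregate total.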

Recommendation 2: NASA should eliminate the specified reserve levels in the AO and instead specify a list of mission cost risks. Proposers would be required to address each risk in their step 1 proposals. The proposal evaluation teams should participate in the generation of the list included in the AO. The evaluators should consider the proposer’s self-assessment, and present to the NASA decision official their evaluation. The decision official should also receive a comparative analysis of the evaluated cost risks on a consistent basis across the proposals.

* * * * *

Finding 3: PIs are expected to provide leadership and management, but most lack the requisite skills, which must be based on a fundamental understanding of the elements of project control.

Interviews with PIs and PMs revealed that new PIs have a very limited understanding of project control practices—cost estimating, schedule estimating, use of management information systems, and configuration control—and of how these relate to the science and engineering of their missions. The Academy study team also found that it is the exceptional prospective new PI who has more than a rudimentary understanding of the elements of project control. PIs without experience found that NASA did not expect or require them to be knowledgeable partners in project control or engineering issues, but rather expected them to rely on their PMs. Some PIs with experience on previous NASA space missions were more conversant with project control techniques and could interact effectively with their project management team and contractors.

The lack of project control expertise of new PIs has not led to problems with missions in which there has been a stable, close, and trusting PI-PM relationship and a supportive project scientist working on the PM’s staff. It has been problematic in those instances where a PI has had multiple PMs whose different technical and management approaches have contributed to a lack of stability and/or altered the basic requirements established in the proposal. The Academy team’s review of these instances revealed that the PI either did not understand the ramifications of the PM’s decisions, or the PI believed he or she was in a weak position to override the PM’s argument that the decision would not materially impact adherence to the cost and schedule caps.

NASA does not treat a PI’s lack of project control knowledge as a weakness in evaluating the merits of proposals. Nor does it offer—or require—elementary training in this skill to prospective PIs. The experience an unsuccessful prospective new PI gains from the initial proposal process, the verbal debriefing by NASA, and the second round of formulating a proposal gives only limited exposure to project control subjects.

Recommendation 3: NASA should provide training and resources in project control for prospective PIs when they indicate an interest in submitting a proposal in response to an upcoming AO.

* * * * *

Finding 4: The costs incurred by proposers in responding to AOs have been identified by proposal teams as a potential limiting factor in whether they will respond to future AOs. However, NASA does not possess good information on the costs proposers incur in responding to an AO. There is a risk that participating institutions, agencies, and contractors will limit the number of proposals for which they are willing to support the preparation and submission costs. This could have adverse effects on the scope and variety of science investigations proposed in the future.

Two issues are associated with this finding. The first has to do with understanding the level of investment required to submit a proposal that has not only high science merit, but also sufficiently detailed and convincing technical, cost, and schedule information that it will be considered for selection to enter the Concept Feasibility Study stage. The Academy team found that it is unusual for a proposal to be selected the first time it is submitted in response to an AO. By the second—and even third—time a PI submits a mission proposal, the proposal team has amassed much greater detail to buttress its proposal. In addition, the team has benefited from the feedback it received in previous evaluation cycles. After it makes the selections, NASA provides the proposal teams with verbal evaluations of the strengths and weaknesses found in the proposal, and the proposer specifically addresses those weaknesses in the next proposal. However, NASA does not ask the proposer in this post-selection period for even a rough order of magnitude estimate of the costs of preparing the proposal and of investments of independent research and development funds to advance the technologies. The Academy team believes the collection of this data could provide NASA officials with feedback on the level of investments made by proposal teams and on areas where NASA investments in advancing technology could reduce the time required before an attractive mission concept is mature.

Second, the interviews indicated that the number of unsuccessful proposals is sufficiently high that team members—aerospace firms, universities, and government laboratories—find it difficult to justify the cost of the proposal process to their management and consequently are cutting back on the number of proposals they will submit in response to future AOs. If NASA’s objective is to continue a process marked by full and open competition, decisions by these entities to team with PIs on fewer proposals may restrict the competitive process. The Academy team found that NASA has not been interested in the costs of the proposal effort, even though NASA indirectly pays much of the cost. The Academy team’s inquiries revealed that the cost to produce a Discovery-class proposal was about $1.5 million, with the project management center, industry partner, and PI’s institution each accounting for roughly $500,000. This amount did not include precursor investments in technology development or unrecorded expenses. Multiplied by the number of proposals (16) submitted in response to the latest Discovery AO, the cost of responding to the AO can be roughly estimated at $20-30 million.

The Academy team believes that—without a change in NASA’s approach—the aerospace firms, government laboratories, and universities may respond to the low success rate, high costs, and long lead times by supporting only those proposals they consider to have a higher probability of selection. This would reduce the number of proposals submitted in response to the AOs. It could also be coupled with increased efforts by the institutions, agencies, and aerospace firms to obtain good “market intelligence” on NASA priorities by meeting with NASA officials and scientists. The concern is that proposers would then invest in preparing proposals only for projects whose scientific objectives and cost and schedule attributes they assume are favored by NASA’s leadership.

The Academy team believes NASA should be wary of the potential for adverse impacts from the latter, as well as be concerned about the possibility that those parties who invest most in market intelligence efforts will gain a competitive advantage. Instead, the Academy team favors NASA working with the appropriate scientific community to adopt a proactive approach that recognizes that the high costs of proposing could result in negative consequences, and takes actions that focus the investment decisions into areas of greatest scientific interest to NASA for the next series of missions. This will require NASA to be vigilant as to the cost of preparing proposals individually and collectively, and to determine how to maintain full and open competition while continuing to receive exciting science proposals in areas of key interest.

In concert with the NRC Committee, the Academy team considered several approaches NASA could use to forestall the negative effects of the high costs and long lead times of preparing proposals. One approach would be to reduce the investment costs for step 1 proposals by limiting the amount of information proposers would have to generate in response to the AO. A second approach would be for NASA to use the time prior to release of the AO to engage in internal and external discussions regarding scientific priorities, and then advise the program management centers, industry partners, and interested scientists at the earliest opportunity as to the areas of greatest scientific interest. If done sufficiently in advance, this approach could reduce the number of incorrect assumptions by the proposal parties as to which science investigations would be considered high priority.

A third approach would be for NASA to play a more active role in the pre-proposal period by underwriting certain costs and providing technical support. Limited support would be provided to scientists with prospective missions, particularly those with innovative but technologically immature science instruments, by providing collaborative support (advanced technology development funding and access to NASA facilities and personnel). This support could be offered to those proposers who had received high evaluation marks during the previous AO but had not been selected for Phase A and/or Phase B. The support provided would be restricted to a specified period (e.g., 12 months) before release of the next AO. As such, it would complement but not supplant technical advice and support resources from industry and institutions.

Recommendation 4: NASA should work with the appropriate scientific community to develop specific strategies to identify and proactively address the negative effects of the high pre-proposal and post-proposal investment costs for proposal team members.

* * * * *

THE MISSION DEVELOPMENT PROCESS

Finding 5: The processes used by NASA to prepare for the confirmation review provide decision officials with a good appreciation of the key technical risks remaining for the development phase. However, the information provided on cost, schedule, and funding risks is more limited. In addition, the confirmation review process does not sufficiently address the constraints that could limit the availability of resources required for the mission development team to succeed. The assessment of the cost estimates focuses too much on the required level of reserves, with insufficient attention given to whether the cost estimate reflects the level of design maturity and the related constraints.

The Academy team’s examination of the last decade of PI-led missions showed that NASA has increased both the time spent in formulation and the funds provided, in the belief that this will lower the probability of schedule growth and cost overruns. The independent review teams do an excellent job of identifying the technical risks. However, the Academy team believes more emphasis should be placed on the design maturity of the candidate mission and, as emphasized above, on mitigating the negative effects of the competitive environment on the selection process.

The characteristics of the acquisition process necessitate that NASA use independent review teams to evaluate candidate missions. The reviewers attempt to ferret out the uncertainties in both the design approach and the schedule and cost estimates. NASA did not approve several proposed missions for development because of the review teams’ findings. Analyses of the review teams’ reports to the decision official, and of the problems experienced with confirmed missions, show that the teams identified most of the development risks. Nonetheless, the teams recommended that the missions be approved for development; they did not appreciate the magnitude of the identified risks’ impact on the cost and schedule of the missions. The Academy team interviewed key project officials on projects that experienced significant overruns and determined that they had understood at the start of development that the design risks had not been addressed.

In addition, a number of projects that experienced schedule delays and cost overruns faced “environmental constraints,” that is, constraints on getting the needed engineering expertise or access to equipment and facilities, on having timely hardware and software deliveries, and on getting management to provide timely attention to pressing problems. In many instances, the problems resulted from several factors: a lack of priority assigned to the mission because other higher priority missions had access to the best talent; an over-scheduling of facilities; and more profitable ventures taking precedence when determining launch schedules. The interviews revealed that many of these constraints were predictable but were not taken into account during the evaluation process.

Recommendation 5: NASA should focus on design maturity in its evaluation of readiness to proceed into phases C/D, and use that assessment, coupled with an understanding of the inherent environmental constraints, to determine the level of schedule and funding reserves required. This tailored approach would allow decisions to be made on the basis of the specific challenges presented by each mission.

* * * * *

Finding 6: NASA has adopted quality assurance process controls, risk management practices, and documentation, reporting, and review requirements that have cost consequences that need to be carefully considered in establishing probable cost estimates for future mission proposals.

As noted, in the five years following the NIAT report, NASA has emphasized risk reduction practices and de-emphasized lower cost and innovative management approaches for PI-led space science missions. It has increased the cost caps for missions, not only for accounting adjustments and economic factors, but also for the changes in technical management practices. Although NASA made discrete adjustments to the cost caps of missions already in progress during this period, it based the adjustments on inputs from the project teams. The Academy study team found that these inputs were rough order of magnitude estimates based on a limited understanding of the impact of the new requirements.

The Academy study team’s examination of the recently completed missions from this time period indicates that, with minor exceptions, the record of project costs does not identify the true cost impacts of the added work. (The exceptions are found in the information supplied by contractors that submitted claims for equitable adjustments to contract target cost baselines.) However, the Academy team was supplied with an abundance of anecdotal information on the additional hours spent preparing for reviews and the displacement of planned effort caused by these changes. The Academy team concluded that the real costs can only be indirectly inferred from the schedule delays these projects experienced. For future missions, the Academy team believes there is merit in NASA focusing special attention on how these new requirements are factored into the basic cost estimates, starting with the evaluation of the proposal estimates and following through to the confirmation review.

Based on the Academy team’s interviews, NASA project officials and PIs question whether an appropriate balance has been found between the benefits of the additional processes and requirements and their cost. However, the absence of good cost data is a critical problem when discussing the need for changes to the added requirements. A specific study is needed to provide decision officials with solid estimates. The findings from such a study would be useful in assessing the benefit vs. cost relationship of these changes and in potentially modifying their implementation. Of particular value, the findings would enable a determination of the reasonableness of the allowance in project cost estimates for these requirements.

Recommendation 6: NASA should undertake a detailed fact-finding evaluation of the time and cost incurred in complying with the additional processes, documentation requirements, review teams, and related risk-reduction practices. The information obtained should be used to evaluate the adequacy of resource estimates for both proposed and approved PI-led missions. A second objective would be to provide NASA’s decision-makers with information they can use to consider whether risk-reduction practices with marginal benefits and excessive costs should be eliminated.

* * * * *

Finding 7: In cases where proposed missions have experienced development difficulties sufficient to lead to Termination Reviews, the forecasts given to decision makers of the costs required to complete the missions were consistently understated. Also, the decision official is not provided with high/mid/low confidence engineering estimates of the costs-to-go, and parametric cost-estimating tools are not used to provide comparative cost estimates.

The Academy team examined the cost estimates provided to decision makers in the termination review and found the estimates to be poor predictors of the eventual development costs. The project estimates in the annual Program Operating Plan process consistently understated the costs. Table 3 presents a series of estimates to complete a medium class Explorer mission’s development and the dates that the estimates were provided to NASA program management executives.

Table 3. Cost Estimates for a Medium Class Explorer Mission

Date of Estimate      Expenditures to Date   Estimate to Completion   Total Estimate
September 30, 1999    $15.1 million          $30.1 million            $45.2 million
September 30, 2000    $35.0 million          $19.4 million            $54.4 million
September 30, 2001    $50.1 million          $13.7 million            $63.8 million
September 30, 2002    $62.1 million          $ 2.0 million            $64.1 million
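The arithmetic in Table 3 can be cross-checked in a short script: each total estimate is the sum of expenditures to date and the estimate to completion, and each total can be compared against the $68 million final development cost (all figures in $ millions, taken from the table and the surrounding text).

```python
# Rows from Table 3: (date, expenditures to date, estimate to completion,
# total estimate), in $ millions. Final development cost was $68 million.
rows = [
    ("1999-09-30", 15.1, 30.1, 45.2),
    ("2000-09-30", 35.0, 19.4, 54.4),
    ("2001-09-30", 50.1, 13.7, 63.8),
    ("2002-09-30", 62.1,  2.0, 64.1),
]
FINAL_COST = 68.0

for date, spent, to_go, total in rows:
    # Each total should equal expenditures plus estimate to completion.
    assert abs((spent + to_go) - total) < 0.05
    print(f"{date}: total estimate {total:.1f}, "
          f"understated by {FINAL_COST - total:.1f}")
```

The shrinking "understated by" column makes the finding concrete: each successive annual estimate closed only part of the gap to the eventual cost.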

The final development cost was $68 million. Although a number of unexpected and unforeseeable events occurred during this mission’s development, the preponderance of the cost growth could have been anticipated. Interviews revealed there was little incentive early on to abandon the optimistic estimating approach used from the outset. Not until the available project funding showed signs of being exhausted was a better prediction of the completion costs made.

It is common practice to use parametric cost-estimating tools25 and related models to estimate the costs of projects that lack engineering detail, as at the outset of concept feasibility studies. As design work accumulates—usually during phase B—estimators come to rely on engineering cost estimates, also known as “grassroots” estimates. When sufficiently detailed, the engineering designs for hardware can be used to develop a “bill of materials” for manufacturing cost estimators, who can estimate the labor hours required from estimating handbooks or specific experience. Individual hardware elements can be estimated with reasonable accuracy (plus or minus 10 percent) if the cost-estimating experience is sufficiently analogous to the current task. Software cost estimating is considerably less precise, a result of the constant change in the hardware available for computing and signal detection; the high level of error in detailed software cost estimates stands in stark contrast to that for hardware.

Based on the Academy team’s work, it appears that NASA could benefit from continuing to use parametric cost-estimating tools throughout the development phase for mission elements whose estimating analogs provide only a low level of confidence. This is particularly the case when the spacecraft systems operate in poorly understood environments, when the scientific instrumentation represents a change in the state of the art, and when the computational systems are similarly advanced.
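The basic mechanics of a parametric cost-estimating relationship (CER) can be illustrated in a few lines: fit cost = a · mass^b to historical missions by linear regression in log-log space, then apply the fitted curve to a new element. The data points and the mass-only driver below are invented for illustration; real parametric tools use databases of prior flight missions and many technical parameters.

```python
import math

# Hypothetical historical data: (element mass in kg, cost in $M).
history = [(150, 80), (300, 140), (600, 240), (1200, 430)]

# Fit log(cost) = log(a) + b * log(mass) by ordinary least squares.
xs = [math.log(m) for m, _ in history]
ys = [math.log(c) for _, c in history]
n = len(history)
b = ((n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys))
     / (n * sum(x * x for x in xs) - sum(xs) ** 2))
a = math.exp((sum(ys) - b * sum(xs)) / n)

def estimate(mass_kg):
    """Parametric estimate ($M) for a new element of the given mass."""
    return a * mass_kg ** b

print(round(estimate(450), 1))  # estimate for a hypothetical 450 kg element
```

An exponent b below 1, as here, captures the economy of scale typical of such relationships: cost grows more slowly than mass.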

25 Parametric cost-estimating tools are computerized cost-estimating models that use specific technical parameters of space flight systems. The model’s database of costs and technical parameters is collected from prior space flight missions. Using statistical correlations, equations are generated that relate the technical parameters to costs.

Recommendation 7: NASA should focus on providing decision officials with a range of estimates and should augment its reliance on detailed engineering-level estimates with estimates derived from cost modeling and parametric estimates, particularly for mission elements where the accuracy of the cost estimates is uncertain.

* * * * *

Finding 8: The recorded costs for PI-led science missions understate the true amount of the costs required to execute these missions.

Cost estimators rely heavily on the recorded costs of instruments, spacecraft, and other mission elements to estimate future mission costs. Whether the cost estimating is done using parametric tools, detailed engineering estimates, or other approaches, the cost experience of past missions forms the base of knowledge that estimators use as a point of departure for future missions.

Therefore, special recognition needs to be given to the finding that the recorded costs for PI-led missions are significantly understated. Interviewees estimated the understatement from unrecorded-uncompensated and recorded-uncompensated labor alone at 20-30 percent. In addition, there are numerous instances of projects taking advantage of available spare hardware, getting special deals on NASA Center overhead charges, obtaining discounted prices from aerospace contractors, and avoiding procurement overhead charges by using other parties to procure instruments from vendors. In several cases, the agreed-to period for mission operations and data analysis was truncated, with the understanding that an excellent science return would result in restoration of the mission operations period; NASA would authorize the costs for an extended mission (phase F) in a separate decision, outside the limitations of the approved mission scope.

These unrecorded and understated costs reflect the ingenuity of PIs, project managers, and other parties in coping with the constraints imposed by the cost caps, including a perceived threat of cancellation. The Academy team evaluated one mission in which it found major growth in the recorded monthly costs. When NASA decided not to cancel the project and stressed to the project team the need for mission success, the team, which had been struggling to stave off termination, stopped under-recording costs by ending the “donation” of labor. NASA managers should be aware of the amount of total labor involved and should not assume that contributions of labor will continue indefinitely.
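The 20-30 percent labor understatement reported above suggests a simple bracketing adjustment an estimator might apply when using a past PI-led mission as an analog. The recorded cost below is hypothetical, and the multiplicative interpretation of the understatement is an assumption, not the report's prescription.

```python
# Illustrative adjustment: bracket the probable "true" cost of a past
# mission whose recorded cost omits 20-30 percent in donated labor.
def true_cost_range(recorded_cost, low=0.20, high=0.30):
    """Return (low, high) bounds on true cost, assuming the recorded
    figure omits an additional `low`-to-`high` fraction in labor."""
    return recorded_cost * (1 + low), recorded_cost * (1 + high)

lo, hi = true_cost_range(100.0)  # hypothetical recorded cost of $100M
print(f"probable true cost: ${lo:.0f}M to ${hi:.0f}M")
```

Such a bracket only corrects for the labor item; the spare-hardware, overhead, and discount effects noted above would widen it further.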

The other problem relates to the record of costs that cost estimators and managers rely on to determine the reasonableness of cost estimates for future missions. If the estimators have an appropriate understanding of the “true costs” of the past mission being used as an analog, they can make adjustments. However, if participants know the record but do not publish it, the potential for losing that knowledge when the participants retire or are reassigned is high. NASA’s history is replete with examples of technical mishaps or poor judgments that are the consequence of poor knowledge capture.26

26 This finding appears in a number of failure review board reports, including that of the recent Columbia Accident Investigation Board.


Recommendation 8: NASA’s knowledge capture activities during and after mission development should specifically address discounts, unique situations, and the unrecorded-uncompensated and recorded-uncompensated direct labor hours, and provide appropriate footnotes to the recorded costs. NASA should review approaches that ensure appropriate knowledge transfer, such as collecting and publishing a record of the mission cost and schedule estimates, actual costs and schedules, and adjustments to those actuals, accompanied by information on a project’s management, technical and science development history, and environmental constraints.

APPENDIX A

INDIVIDUALS INTERVIEWED OR CONTACTED

PRINCIPAL INVESTIGATORS

Michael A’Hearn, University of Maryland (College Park): Deep Impact
Charles A. Barth, University of Colorado: SNOE
Chuck Bennett, Goddard Space Flight Center: WMAP
Alan B. Binder, Lunar Research Institute: Lunar Prospector
William Borucki, Ames Research Center: Kepler
Donald Brownlee, University of Washington: Stardust
James Burch, Southwest Research Institute: IMAGE
Donald Burnett, California Institute of Technology: Genesis
Supriya Chakrabarti, Boston University: SPIDR (selected, not confirmed)
David Crisp, Jet Propulsion Laboratory: Orbiting Carbon Observatory
Neil Gehrels, Goddard Space Flight Center: Swift
Roderick Heelis, University of Texas (Dallas): CINDI
Robert P. Lin, University of California (Berkeley): RHESSI
Christopher Martin, California Institute of Technology: GALEX
Glenn Mason, University of Maryland: SAMPEX
Gary Melnick, Smithsonian Astrophysical Observatory: SWAS
Chris Russell, University of California (Los Angeles): Dawn
Sean Solomon, Carnegie Institution of Washington: MESSENGER
S. Alan Stern, Southwest Research Institute: New Horizons
Ed Stone, California Institute of Technology: ACE
Joseph Veverka, Cornell University: CONTOUR

APPLIED PHYSICS LABORATORY OF JOHNS HOPKINS UNIVERSITY

Larry Crawford, Associate Head, Space Department (appointed Head, April 2005)
Michael D. Griffin, Head, Space Department (appointed NASA Administrator, April 2005)
Bill Harvey, New Horizons Project Fiscal Administrator
Ted Mueller, Programs Manager, Space Department

NASA HEADQUARTERS

Walter Cantrell, Office of the Chief Engineer
Al Diaz, Associate Administrator, Science Mission Directorate
Claude Freaner, Science Mission Directorate
Jane Green, Explorer Program Budget Analyst, Science Mission Directorate
Joseph Hamaker, Office of Program Analysis and Evaluation
David Jarrett, Discovery Program Manager
Paul Hertz, Explorer Program Scientist
Roy Maizel, Director, Resources Management Division, Science Mission Directorate
Voleak Roeum, Discovery Program Analyst


GODDARD SPACE FLIGHT CENTER

Nancy Abell, Chief Financial Officer
Laurie Adkinson, Swift project office
Liz Citrin, Project Manager, Solar Dynamics Observatory
Anthony Comberiate, Explorer Program Manager
Joe Dezio, Swift Project Manager and Deputy Explorer Program Manager
Cynthia Fryer, Resources Analysis Office, Office of Systems Safety and Mission Assurance
James Greaves, Associate Director, Flight Programs and Projects Directorate
Jule Johnson, Deputy Business Manager, Explorer Program
Herb Hittelman, Business Manager, Explorer Program
Heather Keller, Swift Business Manager
Margaret Luce, Associate Director, Flight Programs and Projects Directorate
Nancy Newman-Pape, Program Analysis Office, Office of the CFO
Robert Stephens, General Counsel
Ed Weiler, Center Director

JET PROPULSION LABORATORY

Fred Doumani, Manager, Costing and Pricing Offices
Charles Elachi, Laboratory Director
James Fanson, Project Manager: GALEX
Charlayne Fliege, Project Support Office
Tom Fraschetti, Project Manager: Dawn
Thomas Gavin, Associate Director, Flight Projects and Mission Success
Jeff Leising, Manager, Project Planning Office
Chet Sasaki, Project Manager: Genesis, Kepler
Greg Vane, Manager, Office of Competed Missions, Science Instruments Office

LANGLEY RESEARCH CENTER

Cynthia Daniels, Science Selection Support Office
John Drummond, Science Selection Support Office
Carlos Liceaga, Explorer Acquisition Manager
Robert Wayne Richie, Science Selection Support Office

OTHER

Robert Arentz, Ball Aerospace Technology Corporation
Robert Bitten, Aerospace Corporation
Sherry Carlson, Ball Aerospace Technology Corporation


Ben Clark, Lockheed Martin Astronautics
Bill Gibson, Southwest Research Institute
Mark K. Jacobs, Science Applications International Corporation
Steve Jordan, Ball Aerospace Technology Corporation
John Marriott, Ball Aerospace Technology Corporation
Todd May, Discovery Program Manager, Marshall Space Flight Center
Dave Murrow, Ball Aerospace Technology Corporation
Harold Reitsema, Ball Aerospace Technology Corporation
Nick Schneider, University of Colorado, SMEX proposed mission proposer
Gary Zarlengo, Ball Aerospace Technology Corporation
Joe Vellinga, Lockheed Martin Astronautics


APPENDIX B

REFERENCES

Briefings to NRC Committee on PI-Led Missions

A’Hearn, Michael: PI Interview, Deep Impact, September 1, 2004
Angelopoulos, Vassilis: PI Interview, THEMIS, February 1, 2005
Barth, Charles: PI Interview, SNOE, November 18, 2004
Bennett, Charles: PI Interview, WMAP, September 1, 2004
Binder, S. Alan: PI Interview, Lunar Prospector
Borucki, William: PI Interview, Kepler, February 1, 2005
Brownlee, Donald: PI Interview, Stardust, November 17, 2004
Burch, James: PI Interview, IMAGE, November 18, 2004
Cantrell, Walt: Independent Technical Authority, February 1, 2005
Chakrabarti, Supriya: PI Interview, Lessons Learned from TERRIERS and SPIDR, September 1, 2004
Chiu, Mary: PM Interview, CONTOUR, November 18, 2004
Clark, Ben: Industry Perspectives on PI-led Missions, November 17, 2004
Comberiate, Anthony: PI-Led Missions and the Goddard Explorers Office, February 2, 2005
Crisp, David: ESSP PI Interview, Orbiting Carbon Observatory, February 3, 2005
Diaz, Al: NASA Science Office Perspectives on PI-Led Missions, September 1, 2004
Elachi, Charles: JPL Perspectives on PI-Led Missions
Fanson, Jim: PM Interview, GALEX, February 2, 2005
Fraschetti, Tom: PM Interview, DAWN, February 3, 2005
Gail, William B: SSB Study on PI-Led Missions in the Earth Sciences, September 1, 2004
Gehrels, Neil: PI Interview, Swift, September 2, 2004
Gibson, Bill: PM Interview, IMAGE, November 18, 2004
Heelis, Roderick: PI Interview, CINDI, September 2, 2004
Hertz, Paul; Morgan, Tom; McBride, Karen; Jarrett, Dave: Explorer, Discovery, Mars Scout, New Frontiers Program Managers and Project Scientists: Panel Discussion, September 1, 2004
Lin, Robert P: PI Interview, RHESSI, February 1, 2005
Malin, Michael: Perspectives on PI-Led Missions, February 2, 2005
Martin, Chris: GALEX Lessons Learned, PI Perspective, February 1, 2005
Mason, Glenn: SAMPEX, September 1, 2004
Reitsema, Harold: Lessons Learned from Principal Investigator Missions in the Space Sciences, November 17, 2004
Richie, Wayne: Proposal Process, September 1, 2004
Rottman, Gary: PI Interview, SORCE, November 17, 2004
Russell, Chris: PI Interview, DAWN, February 1, 2005
Sasaki, Chet: Kepler, February 1, 2005
Schneider, Nick: PI Interview, JMEX, November 17, 2004
Solomon, Sean: PI Interview, MESSENGER, September 2, 2004
Smith, Peter: PI Interview, Phoenix, November 18, 2004
Stern, Alan: PI Interview, New Horizons, November 19, 2004
Stone, Ed: Perspectives on ACE and PI-Led Missions
Vellinga, Joe: Industry Perspectives on PI-led Missions, November 18, 2004


NASA Documents

NASA Budget Justifications, Fiscal Years 1993-2006
NASA Program Financial Plans, Fiscal Years 1991-2006*
Office of Space Science, Budget Backup documents, Fiscal Years 1993-2006*
Project Monthly Reviews, Explorer Program*
PI-Led Mission Confirmation Review Reports*
Program Level Requirements Documents
JPL Quarterly Status Reviews*
JPL Cost Analysis documents*†
Announcements of Opportunity for Discovery, Explorer, New Horizons, Mars Scout†
AO Selection Announcements, Office of Space Science†
Discovery Dispatch, quarterly newsletters issued by the Discovery Program

NASA Publications

CONTOUR Mishap Investigation Board, CONTOUR Mishap Investigation Report, 2003

Laufer, Alexander. Shared Voyage: Learning and Unlearning from Remarkable Projects, 2005, NASA History Series, NASA SP-2005-4111

McCurdy, Howard E. Low-Cost Innovation in Spaceflight, the Near Earth Asteroid Rendezvous (NEAR) Shoemaker Mission, 2005, NASA History Series, Monographs in Aerospace History, Number 36

Mars Climate Orbiter Mishap Investigation Board, Report on Project Management in NASA, 2000

NASA Integrated Action Team, Enhancing Mission Success, A Framework for the Future, 2000

NASA Cost Estimating Handbook, 2002

Reports

Gibson, Bill. PI Lead Mission Implementation, 2001, Draft report to the Committee on Earth Sciences, National Research Council, Space Studies Board

National Research Council, Space Studies Board, Setting Priorities for Space Research, Opportunities and Imperatives, 1992, National Academy Press, Washington DC

National Research Council, Space Studies Board, Assessment of Recent Changes in the Explorer Program, 1996, National Academy Press, Washington DC

* Access to administratively restricted information was made available through a Non-Disclosure Agreement.
† Source documents available through: http://research.hq.nasa.gov/code_s/archive.cfm


National Research Council, Aeronautics and Space Engineering Board and Space Studies Board, Reducing the Costs of Space Science Research Missions, 1997, National Academy Press, Washington DC

National Research Council, Space Studies Board, Steps to Facilitate Principal-Investigator-Led Earth Science Missions, 2001, National Academy Press, Washington DC

Other Publications

Atkins, Kenneth L., et al. Stardust: Implementing a New Manage-to-Budget Paradigm, 2002, International Academy of Astronautics, IAA-L-0202

Bearden, David A. A Complexity-Based Assessment of Low-Cost Planetary Missions: When is a Mission Too Fast and Too Cheap? 2000, Fourth IAA Conference on Low-Cost Planetary Missions


ADVISORY PANEL AND STUDY TEAM

ADVISORY PANEL

John G. Stewart*—Partner, Stewart, Wright & Associates, LLC. Former Executive Director, Consortium of Research Institutions; Vice President, Resource Development, Manager of Corporate Administration, and Manager of Planning and Budget, Tennessee Valley Authority; Member, Aerospace Safety Advisory Panel, National Aeronautics and Space Administration; Staff Director, Subcommittee on Science, Technology, and Space, U.S. Senate; Staff Director, Subcommittee on Energy, Joint Economic Committee, U.S. Congress; Executive Assistant to the Vice President of the United States.

Herbert N. Jasper*—Senior Consultant, McManis-Monsalve Associates, Inc. Former Executive Vice President, American Council for Competitive Telecommunications; Specialist in American Government, Congressional Research Service; Legislative Counsel, Research Director and Chief Counsel, U.S. Senate Committee on Labor and Public Welfare; Assistant Director, Office of Legislative Reference; Assistant Chief, Government Organization Branch, U.S. Bureau of the Budget.

STUDY TEAM

J. William Gadsby*—Responsible Academy Officer. Vice President for Academy Studies, National Academy of Public Administration. Former Senior Executive Service; Director, Government Business Operations Issues, Federal Management Issues and Intergovernmental Issues, U.S. General Accounting Office; Assistant Director, Financial Management Branch, U.S. Office of Management and Budget.

William E. Lilly—Project Director. Director, National Aeronautics and Space Administration Programs, National Academy of Public Administration. Former Associate Administrator/Comptroller, National Aeronautics and Space Administration.

Carole M. P. Neves—Senior Advisor. Director, Office of Policy and Analysis, Smithsonian Institution. Former Project Director and Director of International Programs, National Academy of Public Administration.

Malcolm L. Peterson—Senior Advisor. Financial Management Consultant. Former Comptroller, National Aeronautics and Space Administration; Chief, Resources Analysis Division, NASA; Chief, Space Shuttle Cost and Schedule Team; Procurement Reform Specialist, National Space Council, Executive Office of the President.

* Academy Fellow
