
NASA Instrument Cost Model: Mission Class’s Impact, Imputation, Multiple Builds and More: New Features in the Latest Version of NICM
Joseph Mrozinski, NICM Task Lead, NICM@jpl.nasa.gov
April 15, 2021
Copyright 2021 California Institute of Technology. Government sponsorship acknowledged.
Disclaimer: The cost information contained in this document is of a budgetary and planning nature and is intended for informational purposes only. It does not constitute a commitment on the part of JPL and/or Caltech.
Agenda

• Acknowledgements • NICM Introduction • NICM 9: Latest Tool Improvements • Mission Class-Based Subsystem CERs • Imputation-Based System CERs • Multiple Build Estimations • Search Engine Improvements • Screenshot Generator • Space Scientific Instrument Taxonomy (SSIT)


jpl.nasa.gov NICM Stakeholders

• Sponsor: NASA HQ OCFO/SID • Special thank you to James Johnson • Legacy Co-Sponsor: JPL Cost Estimation & Pricing Section • Development Team • JPL Systems Modeling, Analysis & Architectures Group • JPL Engineering Cost Estimation Group • JPL Technical Division Experts • Science • Communications, Tracking, and Radar • Instruments and Science Data Systems • Mechanical Systems • Last but not least, all of the NASA Centers, Contractors, Universities and others who have built instruments and contributed data to NICM jpl.nasa.gov NICM Team NICM 9 Team • Gary Ball • Luther Beegle • Justin Boland • Kyle Brown • Robert Cesarone • Mike DiNicola • Michael Fong • Melissa Hooke • Joe Mrozinski • Al Nash • Michael Saing • Sherry Stukes • Marc Walch

jpl.nasa.gov NICM Team

NICM Alumni/Advisors and NICM Consultants: Daniel Belter, Ed Caro, Grace Chen, Jerry Clark, George Fox, George Fraschetti, Tom Fraschetti, Hamid Habib-Agahi, John Jack, Dean Johnson, Leora Juster, Ken Klaasen, Eric Kwan, Richard Levin, Chris Paine, Kamrooz Parchamazad, John Pearson, Matthew Ramirez, Carrie Selski, Cesar Sepulveda, Brian Sutin, David Swenson, Louise Veilleux, Keith Warfield, Wayne Zimmerman, Xiaoyan Zhou

jpl.nasa.gov NICM Introduction

NICM Introduction

• NICM is the NASA Instrument Cost Model: • An instrument cost and schedule (Phase B/C/D) estimating tool suite based on previously flown space flight instruments from across all of NASA • Includes objective-input-based parametric cost and schedule models, cost and schedule analogy tools, and JCL capabilities • Models exist at both the total instrument and instrument subsystem levels • The Civil Servant copy includes all normalized data as well as a data Search Engine • Users: • All NASA Centers, Contractors, Universities, etc. • Used by Proposal Teams as well as by Proposal Evaluators • Over 600 individuals have attended NICM Training sessions • For Training contact: [email protected] • Download the NICM Excel file from: • www.oncedata.com (Civil Servant and Contractor versions) • www.software.nasa.gov (Contractor version only)

jpl.nasa.gov A Brief NICM History

• NICM began collecting data in 2004 • Foreshadowing: you’ll notice our CERs are all in FY04 dollars. This is why. • NICM has collected and normalized data non-stop from 2004 to the present. • Newly collected data feeds the updates to the NICM CERs. • Trying to cram the highlights of 16 years of NICM history into 8 bullets: • NICM I: released in 2006, with each tool in an individual workbook. • NICM II-III: a flood of new data pours in after the NICM I release. • NICM IV: in-situ instruments added; all tools combined into a single workbook. • NICM V: schedule estimating and JCL added. • NICM VI: NICM-E capability and Cluster Tool added. • NICM VII: telescope estimating capability added. • NICM VIII: Mission Class added as a cost driver. • NICM 9: data imputation utilized; Multiple Build estimates introduced

jpl.nasa.gov NICM Data

• Interviewing, analyzing, normalizing and reviewing technical and cost data is the heart and main strength of NICM. Good models require good data. • We are stringent about the quality, applicability and completeness of the data: before data is used for modeling, all records and normalization approaches are reviewed and blessed both by individuals who built the hardware and by the multi-disciplinary NICM Team.

• NICM 9 includes 299 Data Records

jpl.nasa.gov Modeling Methodology

• Cluster Analysis • Identifies instrument groupings from attribute values • Assesses consistency of groups with instrument types • Principal Components Analysis: a standard data mining technique that • Finds potential cost drivers from instrument attributes • Identifies NICM data outliers – revisit data with technical experts • Finds separation in the data (i.e. clustering) • Addresses multi-collinearity in data for regression analysis • Bootstrap Cross Validation • Bootstrap: process for generating meaningful statistics without assuming asymptotic normality. • Cross Validation: partitioning of the data set into training and testing sets; out-of-sample validation. • Bootstrap technique also used to perform statistical tests for regression analysis. • Imputation: allows for use of incomplete records. jpl.nasa.gov Model Types

• NICM contains the following modeling types: • Cost Estimating Relationships (CERs). • Schedule Estimating Relationships (SERs). • The CERs exist at two different levels: • The instrument System or Total Level. • The instrument Subsystem and Wrap Level. • The SERs only produce a total instrument schedule (they do not provide subsystem schedules).
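The bootstrap cross validation step from the modeling methodology can be sketched in a few lines. The sketch below is hypothetical, not NICM's actual routine: it fits a one-driver log-linear CER (ln cost = b0 + b1 ln mass) on each bootstrap resample and scores it on the records left out of that resample.

```python
import math
import random

def fit_loglinear(train):
    """OLS on log-transformed data: ln(cost) = b0 + b1 * ln(mass)."""
    xs = [math.log(m) for m, _ in train]
    ys = [math.log(c) for _, c in train]
    n = len(train)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    b1 = sxy / sxx
    return ybar - b1 * xbar, b1

def bootstrap_cv(data, trials=500, seed=1):
    """Each trial: resample with replacement (bootstrap), fit on the
    resample, and test on the records that were not drawn (out-of-sample
    validation). Returns mean absolute percentage error on held-out records."""
    rng = random.Random(seed)
    errors = []
    for _ in range(trials):
        train = [rng.choice(data) for _ in data]
        test = [d for d in data if d not in train]
        if not test or len({m for m, _ in train}) < 2:
            continue  # degenerate resample: nothing held out, or no spread
        b0, b1 = fit_loglinear(train)
        for mass, cost in test:
            pred = math.exp(b0) * mass ** b1
            errors.append(abs(pred - cost) / cost)
    return sum(errors) / len(errors)
```

On data that follows an exact power law the held-out error is essentially zero; on real instrument data the spread of these out-of-sample errors is what yields statistics like the PE values quoted later in this deck.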

jpl.nasa.gov Example Model: Optical Planetary CER

Instruments used in this CER (complete and imputed records):

ALICE_Rosetta, CFI, CIRS, ALICE, CRISM, CRISP, CTX, IUVS, DLRE, IRAC, IRS, MDI, ISS, ITS, JunoCam, LOLA, LORRI, LROC, SECCHI, M3, MARCI, MASCS, UVCS, MCS, MDIS, MICAS, MIPS, MIR, MLA, MOC-MO, MOLA-MO, MRI, MSI, NavCam, NIR, NIS, NLR, NSP, ONC, PMIRR, TES_MO, THEMIS, TLP, UVIS, UVS, VIMS, VIS_LCROSS, VSP

Alternative form of equation: Cost = 392 * [driver terms]^[exponents] * [mission class factor]^MissionClassBin, where MissionClassBin = 0 if Class A or B, and MissionClassBin = 1 if Class C or D
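The mission-class mechanism in this CER can be illustrated with placeholder numbers. The constant 392 and the MissionClassBin definition come from the slide; the exponents b, c and class factor f below are hypothetical stand-ins, not NICM values.

```python
def mission_class_cer(mass_kg, peak_power_w, mission_class,
                      a=392.0, b=0.5, c=0.3, f=0.8):
    """CER of the form Cost = a * Mass^b * Power^c * f^MissionClassBin.
    a = 392 is from the slide; b, c and f are HYPOTHETICAL placeholders.
    MissionClassBin = 0 for Class A/B and 1 for Class C/D, so the factor f
    applies only to Class C/D instruments."""
    mission_class_bin = 0 if mission_class.upper() in ("A", "B") else 1
    return a * mass_kg ** b * peak_power_w ** c * f ** mission_class_bin
```

With these placeholders, a Class C build of the same instrument estimates at f = 80% of the Class A cost; the real NICM factor comes from the regression.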

jpl.nasa.gov NICM 9: Latest Tool Improvements


jpl.nasa.gov Mission Class-Based Subsystem CERs

• NICM VIII introduced the use of Mission Class (A/B vs. C/D) in the NICM System Tool. However, Mission Class was not examined at the instrument subsystem level. • NICM 9 expands on this by finding Mission Class as a statistically meaningful driver for the following instrument subsystems: • Electronics Orbiting • Electronics Planetary • Mechanical/Structures • Thermal • Mission Class was not found to be a driver for the following subsystems: • Optics • Antenna • Detectors • Software

jpl.nasa.gov Imputation

Data imputation is a statistical method used to handle missing data in a dataset by probabilistically filling a data observation’s missing value(s) based on 1) the partial data that is available for that data observation and 2) the completed observations (those not missing any values) in the dataset. Advantages: 1. Instruments for which the mass and peak power are known, but the cost is missing still contain knowledge about NASA instruments. 2. The addition of these data observations increases sample size, which, in general, leads to CERs with greater statistical power (e.g. a better understanding of the effect size of variables used in CERs) and increases predictive strength of the CERs. 3. Bringing these incomplete instrument records into the NICM tool increases the number of analogy instruments available to the user through the Search Engine and the other analogy-based subtools in NICM.
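The borrowing-strength idea can be sketched with a toy single-driver imputation. NICM's actual method is multivariate t-distribution imputation (Hooke et al., IEEE Aerospace 2021, cited later in this deck); the version below is a deliberate simplification for illustration only.

```python
import math

def impute_costs(records):
    """Toy imputation sketch: fill a missing cost from the mean
    log(cost/mass) ratio of the complete records. A single-driver stand-in
    for NICM's multivariate t imputation; illustrative only.
    records: list of dicts with 'mass' (known) and 'cost' (number or None)."""
    complete = [r for r in records if r["cost"] is not None]
    mean_log_ratio = sum(math.log(r["cost"] / r["mass"])
                         for r in complete) / len(complete)
    filled = []
    for r in records:
        if r["cost"] is None:
            # Flag the record so downstream statistics can report
            # N_complete and N_imputed separately, as NICM 9 does.
            r = dict(r, cost=r["mass"] * math.exp(mean_log_ratio), imputed=True)
        filled.append(r)
    return filled
```

The imputed record then contributes its known mass and power to the regression, which is exactly the sample-size gain described in advantage 2 above.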

jpl.nasa.gov Imputation-Based System CERs

• Using data imputation, the NICM team was able to improve the following System Tool CERs:

CER | Complete Records | Imputed Records | V8.5 R^2 | V9 R^2 | V8.5 PE | V9 PE
Optical Planetary | 43 | 6 | 89% | 89% | 48% | 45%
Optical Earth Orbiting | 35 | 10 | 85% | 85% | 48% | 42%
Passive | 12 | 2 | 86% | 86% | 37% | 34%
Fields | 12 | 2 | 74% | 74% | 44% | 41%
Particles Planetary | 30 | 7 | 60% | 62% | 52% | 47%
Particles Earth Orbiting | 24 | 3 | 77% | 77% | 53% | 53%
Body Mounted | 13 | 1 | 72% | 72% | 64% | 62%
Arm/Mast Mounted | 12 | 1 | 59% | 60% | 52% | 50%

jpl.nasa.gov Imputation-Based System CER Example Passive Microwave

NICM 9 (Passive Microwave): Cost = 1,686 * TotalMass^0.39 * TotalMaxPwr^0.38 (R^2 = 86%, PE = 34%, Ncomplete = 12, Nimputed = 2)

NICM 8.5 (Passive Microwave): Cost = 1,741 * TotalMass^0.38 * TotalMaxPwr^0.39 (R^2 = 86%, PE = 37%, N = 12)
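The two CERs can be evaluated directly; the coefficients below are as printed on the slide (output in FY04 $K), so point estimates carry the slide's rounding.

```python
def passive_microwave_cost(total_mass_kg, total_max_pwr_w, version="9"):
    """Point estimate (FY04 $K) from the Passive Microwave CERs above.
    NICM 9:   Cost = 1,686 * TotalMass^0.39 * TotalMaxPwr^0.38
    NICM 8.5: Cost = 1,741 * TotalMass^0.38 * TotalMaxPwr^0.39"""
    if version == "9":
        return 1686.0 * total_mass_kg ** 0.39 * total_max_pwr_w ** 0.38
    return 1741.0 * total_mass_kg ** 0.38 * total_max_pwr_w ** 0.39
```

Note how little the coefficients move between versions: per the slide, the gain from the two imputed records is the lower prediction error (PE 34% vs. 37%), not a large shift in the curve itself.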


jpl.nasa.gov Multiple Build Estimation

• The NICM Team performed a deep dive looking for actual data on cost splits for multiple units (first unit cost vs. subsequent unit costs). • The observation: most instruments built in multiples do not have cost recorded this way. Rather, instrument development leadership would simply apply commonly used industry models to estimate these splits. • The conclusion: the NICM team was unable to build a multiple build estimator based on actual cost splits for previously flown instruments. • Instead, NICM integrated, and tested for applicability, one of these common industry models: the Crawford Learning Curve Model.

jpl.nasa.gov Multiple Build Estimation- Key Terms

• Recurring Cost: Repetitive elements of development and investment costs that may vary with the quantity being produced • Parts, procurement activities, fabrication, testing for Flight Unit • ATLO operations for flight units, acceptance testing • Sustaining level-of-effort (PM, PSE, MA) • Non-Recurring Cost: Non-repetitive elements of development and investment costs that generally do not vary with the quantity being produced • All design work for system, EM/Simulator development, testing and analysis • Tooling • Modifications • ATLO planning, development of GSE, articles for testing and analysis, training • Software development and maintenance • Corresponding level-of-effort (PM, PSE, MA) • Learning Slope: a percentage used to describe the reduction in RECURRING cost as the lot size doubles • Learning Slope = Recurring Cost of Unit 2n / Recurring Cost of Unit n • Example: Learning Slope = 85%: RE cost for Unit 2 is 85% that of Unit 1

*Definitions drawn from the DoD Manual

jpl.nasa.gov Multiple Build Estimation

• RE % = RE $ / (RE $ + NRE $) • NRE % = NRE $ / (RE $ + NRE $) • These percentages apply to the first unit development cost • For purposes of this assessment, assume “development” = Phases B/C/D • For multiple build cost applications, learning curve equation should be applied to the RE $ • LS % = RE Cost of Unit 2n / RE Cost of unit n

Reduction in cost due to NRE

Reduction in cost due to Learning Slope

jpl.nasa.gov Multiple Build Estimation Crawford Learning Curve Model

• Crawford Model applies a power curve equation: C(n) = a * n^b • C(n) = Cost of unit n • a = T1 Cost (more on defining this later; it is not the Total Instrument Cost from NICM) • n = Unit index; n = 1, …, N, where N = total # of units • b = Learning Parameter = log2(Learning Slope) • Total Cost of all Units = Σn C(n) = a * Σn n^b • Impact of yield/rework issues not directly considered

For NICM: Need to understand how to calculate: RE cost, T1 Cost and Learning Slope
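Putting the pieces together, here is a minimal sketch of the Crawford model as described above (T1 = RE% × first-unit cost; NRE paid once on the first unit); function names are mine, not NICM's.

```python
import math

def crawford_unit_cost(t1_cost, learning_slope, n):
    """Crawford model: C(n) = a * n^b, with a = T1 (first-unit recurring)
    cost and b = log2(learning_slope). learning_slope is a fraction
    (0.85 for an 85% slope), so b is negative and unit cost falls with n."""
    b = math.log2(learning_slope)
    return t1_cost * n ** b

def multiple_build_total(first_unit_cost, re_fraction, learning_slope, n_units):
    """Total cost of n_units builds: NRE is incurred once; the recurring
    portion follows the learning curve. Mirrors the slide's convention
    T1 Cost = RE% * Total Instrument Cost."""
    t1 = re_fraction * first_unit_cost
    nre = first_unit_cost - t1
    recurring = sum(crawford_unit_cost(t1, learning_slope, n)
                    for n in range(1, n_units + 1))
    return nre + recurring
```

With the example slide's inputs ($10,000 K first-unit cost, RE% = 60%, 85% slope, N = 41) this reproduces the quoted unit costs ($5,100 K for unit 2, $4,637 K for unit 3) and a total of about $135,063 K.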

jpl.nasa.gov Multiple Build Estimation- Example

Value | Units | Source
Total Instrument Ph B/C/D Cost (First Unit): $10,000 | FY04 $K | Provided by NICM Tool
Learning Slope: 85% | unitless | User Input
RE%: 60% | unitless | User Input
T1 Cost: $6,000 | FY04 $K | User Input (T1 Cost = RE% * Total Instrument Cost; default values also provided by NICM)
N: 41 | unitless | User Input
Total Cost of All Units: $135,063 | FY04 $K | Calculated

Unit costs (FY04 $K; no yield/rework included explicitly):
Unit | RE | NRE | Total
1 | $6,000 | $4,000 | $10,000
2 | $5,100 | $0 | $5,100
3 | $4,637 | $0 | $4,637
4 | $4,335 | $0 | $4,335
5 | $4,114 | $0 | $4,114
6 | $3,942 | $0 | $3,942
7 | $3,802 | $0 | $3,802
8 | $3,685 | $0 | $3,685
9 | $3,584 | $0 | $3,584
10 | $3,497 | $0 | $3,497

NICM applies a threshold (based on THEMIS) to account for the batch production that may occur for larger lot sizes.

jpl.nasa.gov Search Engine Improvements

• The Search engine now supports the following operators (syntax in parenthesis): • Greater than (>) • Less than (<) • Greater than or equal to (>=) • Less than or equal to (<=) • Ranges from x to y (x-y) • Find parameters that are not blank (Not Blank).
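The operator syntax above can be read as a small grammar. The sketch below shows one way such a filter string could be parsed into a predicate; it mirrors the slide's syntax but is my illustration, not NICM's Excel implementation.

```python
def parse_filter(query):
    """Turn a search string using the slide's operators into a predicate.
    Supported: >x, <x, >=x, <=x, x-y (range), Not Blank; a bare number is
    an exact match. A sketch only: e.g. negative numbers in ranges are
    not handled."""
    q = query.strip()
    if q.lower() == "not blank":
        return lambda v: v is not None and v != ""
    # Two-character operators must be tried before one-character ones.
    for op, make in ((">=", lambda t: lambda v: v >= t),
                     ("<=", lambda t: lambda v: v <= t),
                     (">", lambda t: lambda v: v > t),
                     ("<", lambda t: lambda v: v < t)):
        if q.startswith(op):
            return make(float(q[len(op):]))
    if "-" in q:  # range "x-y"
        lo, hi = (float(p) for p in q.split("-", 1))
        return lambda v: lo <= v <= hi
    return lambda v: v == float(q)
```

For example, `parse_filter(">=10")` yields a predicate that keeps any record whose field value is at least 10.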


jpl.nasa.gov Screenshot Generator

• The System and Subsystem Tools both contain a “Generate Screenshots” button. Activating it opens a blank PowerPoint presentation file, with slides containing: • Screenshots of the input and output sections for both the System and Subsystem Tool • A screenshot of the Search Engine outputs (if not blank) • Screenshots of any JCL outputs (if the JCL has been computed)

jpl.nasa.gov Screenshot Generator Example


jpl.nasa.gov SSIT

• The NICM Team mapped each normalized instrument to the Aerospace Corporation’s recently published “Space Scientific Instrument Taxonomy (SSIT) Version 2.0.” This taxonomy contains 3 hierarchical levels.

• SSIT can be downloaded via ONCE or contact [email protected] directly.

jpl.nasa.gov SSIT

• This inclusion in NICM sets the stage for future work to explore modeling instruments when grouped according to the SSIT hierarchy at the 2nd level.

• Today, the SSIT taxonomy can be seen in the NICM Instrument datasheets, and can also be used as a search field in the NICM Search Engine.

jpl.nasa.gov SSIT SSIT In Instrument Datasheets

jpl.nasa.gov SSIT SSIT In Search Engine


jpl.nasa.gov NICM 9: And more!

• The database now includes 299 datapoints, up from 255. • Design Life added as a cost driver for instruments on Class C/D missions for the Thermal and Planetary Electronics subsystem CERs. • Peak Power now used in the Mechanical/Structures subsystem CER. • Double-click and right-click secondary functionality was put in place wherever hyperlinks are used to navigate the workbook, supporting users whose IT systems have blocked the use of hyperlinks in Microsoft Office. • Batch Mode retired, giving the NICM Team more time to focus on moving the tool forward. • Development on NICM 10 has already begun!

jpl.nasa.gov NICM Publications & Presentations

1. IEEE Aerospace
– “Salvaging Data Records with Missing Data: Data Imputation using the Multivariate t Distribution,” 2021 Aerospace Conference, Virtual, March 2021, M. Hooke, J. Mrozinski, M. DiNicola.
– “NASA Instrument Cost Model for Explorer-like Mission Instruments,” 2014 Aerospace Conference, Big Sky, MT, March 2014, H. Habib-Agahi, J. Mrozinski, G. Fox.
– “NASA Instrument Cost and Schedule Model,” 2011 Aerospace Conference, Big Sky, MT, March 2011, H. Habib-Agahi, G. Fox, J. Mrozinski.
2. AIAA Space
– “NASA Space Flight Instruments: Cost Time Trends,” 2016 Space Conference, Long Beach, CA, September 2016, J. Mrozinski, M. DiNicola, H. Habib-Agahi.
– “Latest NASA Instrument Cost Model (NICM): Version VI,” 2014 Space Conference, San Diego, CA, August 2014, J. Mrozinski, H. Habib-Agahi, G. Fox, G. Balls.
– “NICM Schedule & Cost Rules of Thumb,” 2009 Space Conference, Pasadena, CA, September 2009, H. Habib-Agahi, G. Fox, G. Ball.
3. International Cost Estimation and Analysis Association (ICEAA)
– “NASA Instrument Cost Model (NICM),” 2014 ICEAA Professional Development & Training Workshop, Denver, CO, June 2014, H. Habib-Agahi, J. Mrozinski, G. Fox.

jpl.nasa.gov NICM Publications & Presentations NASA Cost and Schedule Symposium Presentations

– 2020: NICM 9 announced, 2020 NASA Cost & Schedule Virtual Gathering.
– 2019: “NICM 8.5,” Johnson Space Center, J. Mrozinski, M. Ramirez.
– 2018: “NASA Instrument Cost Model: Version VIII Major Improvements,” Goddard Space Flight Center, J. Mrozinski, J. Johnson.
– 2017: “NICM – Cryocooler,” NASA Headquarters, J. Mrozinski, M. DiNicola.
– 2017: “The Silent “S” in NICM – NICM Schedule Capabilities,” NASA Headquarters, J. Mrozinski, M. DiNicola.
– 2016: “NASA Instrument Cost Model: Impact of Mission Class on Cost,” Glenn Research Center, August 2016, J. Mrozinski, M. DiNicola, H. Habib-Agahi.
– 2015: “NICM Version VII,” Ames Research Center, H. Habib-Agahi, J. Mrozinski, M. DiNicola.
– 2014: “Telescope Cost Estimating,” Langley Research Center, H. Habib-Agahi, J. Mrozinski.
– 2013: “NASA Instrument Cost Model for Explorer-like Mission Instruments,” Jet Propulsion Laboratory, H. Habib-Agahi, J. Mrozinski, G. Fox, G. Ball.
– 2012: “NASA Instrument Cost Model,” Applied Physics Laboratory, H. Habib-Agahi, J. Mrozinski.
– 2011: “NICM,” Johnson Space Center, J. Mrozinski.

jpl.nasa.gov Questions? For Training or questions: [email protected] For Download: • Have a NASA login? Go to: oncedata.com • All others: software.nasa.gov

jpl.nasa.gov Backup Slides

System vs. Subsystem Tool Total Cost Estimates

• Users should NOT expect the NICM System Tool and Subsystem Tool to always produce the same cost estimate. There are multiple reasons for this, and knowing them can help the user choose which tool is more applicable for a given instrument.

jpl.nasa.gov System vs. Subsystem Tool Total Cost Estimates Different Data: Example, Fields Instrument

• The System Tool only uses Fields Instrument actual data to create the CER for estimating Fields Instruments. • However, at the subsystem level, instruments did not segregate themselves by instrument type. For example, mechanical/structures $/kg is not distinguishable between Fields, Particles, Microwave and Optical Instruments. • Hence, when creating a cost estimate for a Fields Instrument in the Subsystem Tool, the individual subsystem cost estimates often include non-Fields instrument data – which is not the case in the System Tool. • Bottom line: the System and Subsystem Tools create estimates using different parts of the NICM database.

jpl.nasa.gov System vs. Subsystem Tool Total Cost Estimates Different Cost Drivers: Example: Mass

• The System Tool utilizes Total Mass as a cost driver. This represents a typical mass distribution for a particular instrument type. • The Subsystem Tool utilizes subsystem masses, allowing for the user to put in off-typical mass distributions. • For example, the user could put an above-average proportion of mass into the higher $/kg subsystems, leading to a higher cost estimate. • Or, the user could do the opposite, and put an above-average proportion of mass into the lower $/kg subsystems, leading to a lower cost estimate. • Bottom line: The System and Subsystem Tool both use different cost drivers.
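A toy calculation makes the point concrete. The $/kg rates below are made up for illustration (real NICM subsystem CERs are nonlinear regressions), but the effect of shifting mass between high- and low-$/kg subsystems is the same.

```python
def subsystem_total(masses_kg, rates_k_per_kg):
    """Sum of per-subsystem estimates. Linear $/kg stand-ins for the real
    subsystem CERs; both the rates and the subsystem names are hypothetical
    (FY04 $K per kg)."""
    return sum(masses_kg[s] * rates_k_per_kg[s] for s in masses_kg)

RATES = {"optics": 900.0, "electronics": 600.0, "structure": 150.0}

# Same 50 kg total instrument, two different mass distributions:
typical = {"optics": 10.0, "electronics": 15.0, "structure": 25.0}
optics_heavy = {"optics": 25.0, "electronics": 15.0, "structure": 10.0}
```

With these numbers the typical split estimates at $21,750 K and the optics-heavy split at $33,000 K, even though total mass is identical; a System Tool driven by total mass alone could not see the difference.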

jpl.nasa.gov System vs. Subsystem Tool Total Cost Estimates

1. Do you have subsystem inputs? No: use the System Tool. Yes: run both the System and Subsystem Tools.
2. Do both tools yield the same estimate? Yes: done. No: continue.
3. Can you rule out a tool based on the instruments used in its CER? Yes: done. No: continue.
4. Can you rule out a tool based on its lack of use of a key driver? Yes: done. No: call the NICM team for an opinion.

jpl.nasa.gov NICM Data Families

• NICM Data can be binned into different families. The following slides elaborate on these bins: • Used for Modeling vs. NOT Used for Modeling • Complete Data vs. Incomplete Data • Instruments vs. Cryocoolers vs. Telescopes • Remote Sensing Instruments vs. In-Situ Instruments • Remote Sensing types • In-Situ types • Lead Organization


jpl.nasa.gov NICM Data Used for Modeling

• In addition to the top-level NICM Data characteristics described above, NICM Data Used for Modeling have one additional characteristic: • None are 100% foreign built, although some have foreign contributions of known cost.

jpl.nasa.gov NICM Data NOT Used for Modeling

• The NICM Database includes some data that is explicitly not used in the models: • Some data with launch dates between 1985 and 1990 have been shown to no longer be applicable for certain models. • NICM includes the technical parameters (but no costs) of some 100% foreign-built instruments. These are used in plots but do not feed the cost and schedule models. • NICM includes some “analogy only” datapoints, which would be extreme outliers if included in the models. For example, the massive instruments flown for short periods out of the Space Shuttle cargo bay are outliers in the NICM models.

jpl.nasa.gov Used for Modeling vs. NOT Used for Modeling

• Of the 299 NICM 9 Data records: • 278 are Used for Modeling • 21 are NOT used for Modeling


jpl.nasa.gov Complete Data vs. Incomplete Data

• “Complete Data” refers to data records that have all of the necessary parameters to support at least one NICM model. For example, a data record with a known mass, peak power and total cost can support one of the cost estimating relationships, and would thus be marked as a “Complete” record for modeling purposes. • “Incomplete Data” refers to data records that are missing one necessary parameter which would otherwise make them complete. For these records, the NICM modeling routines can impute the missing parameter during the modeling process, in which case the record is marked as “Imputed.”

jpl.nasa.gov Complete Data vs. Incomplete Data

• Of the 278 NICM 9 Data records Used for Modeling: • 246 are Completed Records • 32 are Imputed Records


jpl.nasa.gov Instruments vs. Cryocoolers vs. Telescopes

• As the name “NASA Instrument Cost Model” implies, NICM originally only contained data records for instruments. • The NICM database and models now support Cryocoolers and Telescopes as well. • Of the 299 NICM 9 Data records: • 282 are Instruments • 9 are Cryocoolers • 8 are Telescopes


jpl.nasa.gov Remote Sensing Instruments vs. In-Situ Instruments

• Of the 282 NICM 9 Instrument Data: • 235 are Remote Sensing • 47 are In-Situ • Of the 235 Remote Sensing: • 113 are Optical • 71 are Particles • 18 are Fields • 16 are Active Microwave • 17 are Passive Microwave

jpl.nasa.gov Remote Sensing Instrument Types

• Optical (48%): Camera, Spectrometer, Infrared Sounder, Laser Altimeter, Photometer, etc.
• Particles (30%): Particle Detector, Gamma Ray Spectrometer, X-Ray Imager, Magnetospheric Imaging Instrument, Plasma Spectrometer, etc.
• Fields (8%): Magnetic Field Instrument, Electric Field Instrument, Plasma Wave Instrument, etc.
• Active Microwave (7%): Radar, Altimeter, Scatterometer, etc.
• Passive Microwave (7%): Microwave Radiometer, Microwave Imager, Microwave Limb Sounder, etc.

jpl.nasa.gov Remote Sensing Instruments vs. In-Situ Instruments

• Of the 282 NICM 9 Instrument Data: • 235 are Remote Sensing • 47 are In-Situ • Of the 47 In-situ: • 15 are Arm/Mast • 16 are Body • 16 are Probe

jpl.nasa.gov In-Situ Instrument Mounting Types

• Arm/Mast (32%): instruments located on the arm or mast of a rover or lander
• Body (34%): instruments located on a rover body or lander body
• Probe (34%): instruments located on atmospheric probes

jpl.nasa.gov Data by Instrument Lead Organizations

• JPL: 27% • University: 27% • GSFC: 14% • Other: 13% • APL: 10% • System Contractor: 9%

jpl.nasa.gov Sample of Normalized Database (1 of 2)

Columns: Instrument Name, Launch Date, Destination, Instrument Type, Management Cost ($K FY04), System Engineering ($K FY04), Product Assurance ($K FY04), Integration & Test ($K FY04), Sensor Cost ($K FY04), B/C/D Cost ($K FY04), B/C/D Schedule (months), FTEs. Illustrative data only; may not match NICM 9 data. Example rows:

ACIS | Jul-99 | Earth Orbiting | Optical | 3,782 | 20,193 | 3,203 | 3,499 | 59,347 | 94,261 | 72 | 153
ACS | Jan-97 | Earth Orbiting | Optical | 6,306 | 6,462 | 3,314 | 5,686 | 70,000 | 92,129 | 60 | 100
… (roughly 40 further rows covering Optical, Active and Passive instruments at Earth Orbiting and Planetary destinations)

jpl.nasa.gov Sample of Normalized Database (2 of 2)

Columns: Instrument Name, Mass (kg), Max Power (W), TRL, Data Rate (kbps), # of Bands, # of Channels, Optics/Antenna Mass (kg), Electronics Mass (kg), Electronics Max Power (W), Wavelength (um). Illustrative data only; may not match NICM 9 data. Example rows:

GLAS | 307 | 349 | 5 | 550 | 2 | 2 | 119 | 61 | 215 | 1.064 & 0.532
STIS | 374 | 250 | 8 | 1000 | 4 | 28 | 181 | 91 | 250 | 0.115-1
… (same instrument set as the previous table)