
Computer-Aided Design 45 (2013) 4–25


Key computational modeling issues in Integrated Computational Materials Engineering

Jitesh H. Panchal a, Surya R. Kalidindi b, David L. McDowell c,∗

a School of Mechanical and Materials Engineering, Washington State University, Pullman, WA 99164-2920, USA
b Department of Materials Science and Engineering, Department of Mechanical Engineering and Mechanics, Drexel University, Philadelphia, PA 19104, USA
c School of Materials Science and Engineering, Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332-0405, USA

∗ Corresponding author. Tel.: +1 404 894 5128; fax: +1 404 894 0186. E-mail address: [email protected] (D.L. McDowell).

Keywords: Materials design; Multiscale modeling; ICME; Uncertainty

Abstract: Designing materials for targeted performance requirements as required in Integrated Computational Materials Engineering (ICME) demands a combined strategy of bottom–up and top–down modeling and simulation which treats various levels of hierarchical material structure as a mathematical representation, with infusion of systems engineering and informatics to deal with differing model degrees of freedom and uncertainty. Moreover, with time, the classical materials selection approach is becoming generalized to address concurrent design of microstructure or mesostructure to satisfy product-level performance requirements. Computational materials science and multiscale mechanics models play key roles in evaluating performance metrics necessary to support materials design. The interplay of systems-based design of materials with multiscale modeling methodologies is at the core of materials design. In high performance alloys and composite materials, maximum performance is often achieved within a relatively narrow window of process path and resulting microstructures. Much of the attention to ICME in the materials community has focused on the role of generating and representing data, including methods for characterization and digital representation of microstructure, as well as databases and model integration. On the other hand, the computational mechanics of materials and multidisciplinary design optimization communities are grappling with many fundamental issues related to stochasticity of processes and uncertainty of data, models, and multiscale modeling chains in decision-based design. This paper explores computational and information aspects of design of materials with hierarchical microstructures and identifies key underdeveloped elements essential to supporting ICME. One of the key messages of this overview paper is that ICME is not simply an assemblage of existing tools, for such tools do not have natural interfaces to material structure nor are they framed in a way that quantifies sources of uncertainty and manages uncertainty in representing physical phenomena to support decision-based design.

© 2012 Elsevier Ltd. All rights reserved. doi:10.1016/j.cad.2012.06.006

1. The path to Integrated Computational Materials Engineering (ICME)

Within the past decade, several prominent streams of research and development have emerged regarding integrated design of materials and products. One involves selection of materials, emphasizing population of databases and efficient search for properties or responses that best suit a set of specified performance indices [1], often using combinatorial search methods [2], pattern recognition, and so on. In such materials informatics approaches, attention is focused on data mining, visualization, and providing convenient and powerful interfaces for the designer to support materials selection. Another class of approaches advocates simulation-based design to exploit computational materials science and chemistry in accelerating the discovery of new materials, computing structure and properties using a bottom–up approach. This can be combined with materials informatics. For example, the vision for Materials Cyber-Models for Engineering is a computational materials physics and chemistry perspective [3] on using quantum and molecular modeling tools to explore for new materials and compounds, making the link to properties. Examples include the computational estimate of stable structure and properties of multicomponent phases using first principles approaches (e.g., Refs. [4–6]). These approaches have been developed and popularized largely within the context of the materials chemistry, physics and science communities.
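The materials selection stream can be made concrete with a small, self-contained sketch. The screening below ranks candidate materials by the classical performance index for a stiffness-limited, mass-minimized beam, M = E^(1/2)/ρ, in the spirit of the selection tools of Ref. [1]. The candidate list and property values are nominal, handbook-style numbers chosen here only for illustration; this is not a tool described in the paper.

```python
# Minimal sketch of database-driven materials selection (illustrative data).
# Candidates are ranked by the performance index for a stiffness-limited,
# mass-minimized beam: M = sqrt(E) / rho, larger is better (cf. Ref. [1]).

candidates = {
    # name: (Young's modulus E [GPa], density rho [Mg/m^3]) -- nominal values
    "Al 6061-T6":             (69.0, 2.70),
    "Ti-6Al-4V":              (114.0, 4.43),
    "CFRP (quasi-isotropic)": (70.0, 1.60),
    "4340 steel":             (205.0, 7.85),
}

def beam_stiffness_index(E: float, rho: float) -> float:
    """Performance index M = sqrt(E)/rho for a light, stiff beam."""
    return E ** 0.5 / rho

for name, (E, rho) in sorted(candidates.items(),
                             key=lambda kv: -beam_stiffness_index(*kv[1])):
    print(f"{name:24s} M = {beam_stiffness_index(E, rho):5.2f}")
```

Such property-based screening illustrates why materials selection is tractable with existing databases: it requires only tabulated properties, not microstructure, which is precisely the limitation the rest of this section develops.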


Fig. 1. Hierarchy of levels from atomic scale to system level in concurrent materials and product design (multi-level design). Existing systems design methods focus on levels to the right of the vertical bar, treating 'materials design' as a materials selection problem. Levels to the left require computational materials science and mechanics models of microstructure–property relations. Olson's linear mapping of process–structure–property–performance relations in Materials by Design™ is shown on the upper left [10].

A third approach, involving more the integration of databases with tools for modeling and simulation, as well as design optimization and concurrent design for manufacture, is that of Integrated Computational Materials Engineering (ICME). It is concerned with multiple levels of structure hierarchy typical of materials with microstructure. ICME aims to reduce the time to market of innovative products by exploiting concurrent design and development of material, product, and process/manufacturing path, as described in the National Research Council report by a National Materials Advisory Board (NMAB) committee [7]. It is ''an approach to design products, the materials that comprise them, and their associated materials processing methods by linking materials models at multiple length scales''.

ICME was preceded by another major Defense Advanced Research Projects Agency (DARPA)-funded initiative, Accelerated Insertion of Materials (AIM), from 2001–2003, which sought to build systems approaches to accelerate the insertion of new and/or improved materials into products. An NMAB study [8] discussed the DARPA-AIM program, highlighting the role of small technology startup companies in enabling this technology, and summarizing the commercial computational tools and supporting databases currently available. The impact of AIM on the framing of ICME is discussed further in Ref. [9].

The significant distinction of ICME from the first two approaches (materials informatics and combinatorial searches for atomic/molecular structures) is its embrace of the engineering perspective of a top–down, goal–means strategy elucidated by Olson [10]. This perspective was embraced and refined for the academic and industry/government research communities at a 1997 National Science Foundation (NSF)-sponsored workshop hosted by the Georgia Institute of Technology and Morehouse College [11] on Materials Design Science and Engineering (MDS&E). Effectively, ICME incorporates informatics and combinatorics, as well as multiscale modeling and experiments. As mentioned, ICME is fully cognizant of the important role that microstructure plays in tailoring material properties/responses in most engineering applications, well beyond the atomic or molecular scale. This is a key concept in materials science, as typically distinct from materials physics or chemistry. For example, mechanical properties depend not only on atomic bonding and atomic/molecular structure, but are strongly influenced by morphology of multiple phases, phase and interface strengthening, etc. Fig. 1 shows a hierarchy of material length scales of atomic structure and microstructure that can be considered in design of advanced metallic alloys for gas turbine engine components in commercial aircraft. The diagram inset in the upper-left of Fig. 1 shows the Venn diagram from Olson [10] of overlapping process–structure–property–performance relations that are foundational to design of materials to the left of the vertical bar; conventional engineering design methods and multidisciplinary design optimization (MDO) are rather well established. Extending systems design methods down to address the concurrent design of material structure at various levels of hierarchy is a substantial challenge [12–15]. The process of tailoring the material to deliver application-dependent performance requirements is quite distinct from that of traditional materials selection (relating properties to performance requirements) or from combinatorial searching for atomic/molecular structure–property relations at the level of individual phases; such combinatorial searches typically do not consider the role of microstructure, which is typically central to meeting multifunctional performance requirements. The conventional approach in science-based modeling of hierarchical systems is a bottom–up, deductive methodology of modeling the material's process path, microstructure, and resulting properties, with properties then related to performance requirements, as shown in Fig. 1. Design is a top–down process in which systems level requirements drive properties, structure and process routes.

The pervasive difficulty in inverting process–structure and structure–property relations in modeling and simulation greatly complicates this top–down enterprise, and owes to certain unavoidable realities of material systems:

• Nonlinear, nonequilibrium path dependent behavior that requires intensive simulation effort, limiting parametric study and imparting dependence upon initial conditions.
• Transitions from dynamic (lower length and time scales) to thermodynamic (higher length and time scales) models for cooperative phenomena in multiscale modeling, with non-uniqueness due to reduction of model degrees of freedom during scale-up.
• Potential for a wide range of suboptimal solutions that complicate design search and optimization.

Fig. 2. Typical separation of materials development and materials selection activities in the process of integrated design of materials and products.

• Approximations made in digital microstructure representation of material structure.
• Dependence of certain properties such as true fracture ductility, fracture toughness, fatigue strength, etc., on extreme value distributions of microstructure attributes that affect damage/degradation.
• Microstructure metastability and long term evolution in service.
• Lack and variability of experimental data for validation purposes or to support methods based on pattern recognition.
• Uncertainty of microstructure, models, and model parameters.

Since uncertainty is endemic in most facets of ICME, managing uncertainty and its propagation through model/experiment chains is of paramount importance. Materials design is complicated by the complexity of nonlinear phenomena and pervasive uncertainty in materials processing, related models and model parameters, for example. This motivates the notion of robust design and other means to address the role of uncertainty. The materials design challenge is to develop methods that employ bottom–up modeling and simulation, calibrated and validated by characterization and measurement to the extent possible, combined with top–down, requirements-driven exploration of the hierarchy of material length scales shown in Fig. 1. It is a multi-level engineering systems design problem, characterized by top–down design of systems with successive embedded sub-systems, such as the aircraft shown in Fig. 1. Multiscale modeling pertains more to the bottom–up consideration of scales of materials hierarchy.

Another critical aspect of materials design identified in the NMAB ICME report [7] is the key role played by quantitative representation of microstructure. The notion of microstructure-mediated design is predicated on the use of mathematical/digital representation of microstructure, rather than properties, as the taxonomic basis for describing material structure hierarchy in materials design. This is not merely semantics, but has quite practical implications in terms of actual pathways of material and product development. Specifically, process–structure relations tend to be quite dissociated from structure–property relations, with different model developers, organizations involved, degree of empiricism, etc. The timeframes for process–structure relations in materials development often do not match those of structure–property relations. Process–structure relations tend to be more highly empirical, complex, and less developed than structure–property relations. The light dashed line on the left in Fig. 2 effectively represents a ''fence'' between organizations/groups pursuing process–structure and structure–property relations, reflecting the common decomposition of activities between material suppliers and OEMs. One goal of an integrated systems strategy for materials design and development is to more closely couple these activities. Clearly, microstructure is solidly at this interface, the result of process–structure relations. Hence, framing ICME in terms of microstructure-mediated design facilitates enhanced concurrency of material suppliers and OEM activities. Another important goal of ICME should be to effectively remove the bold dashed line to the right in Fig. 2 that serves as a ''fence'' between materials development and materials selection. Conventionally, design engineers have worked with databases of existing materials to select, rather than design, materials in the process of designing components (systems, sub-systems). Tools for materials selection [1] are reasonably well-developed, supporting the materials selection problem. However, as discussed by McDowell et al. [12–15], materials selection supports the property–performance linkage to the right in Fig. 2 and is just one component of the overall integration of materials and product design. Integration of design engineering into the overall process requires a coupling back to material suppliers, for example. This is one of the great challenges to realizing ICME in terms of concurrent computational materials design and manufacturing. As stated in the report of the 1998 MDS&E workshop [11], ''The field of materials design is entrepreneurial in nature, similar to such areas as microelectronic devices or software. MDS&E may very well spawn a 'cottage industry' specializing in tailoring materials for function, depending on how responsive large supplier industries can be to this demand. In fact, this is already underway''.

The ICME report [7] focused mainly on advances in modeling and simulation, the role of experiments in validation, databases and microstructure representation, and design integration tools. In addition, cultural barriers to ICME in industry, government and academia were discussed. Elements of ICME broadly encompass:

• Modeling and simulation across multiple levels of hierarchy of structure, including both process–structure and structure–property relations
• Experiments and prototyping
• Distributed collaborative networks
• Materials databases
• Microstructure representation as the focus of the materials design problem
• Knowledge framework that integrates databases with experiments and modeling and simulation to pursue inverse design
• Uncertainty quantification and management
• Integration with OEMs and materials suppliers.

The Materials Genome Initiative aspires to build on the preceding AIM and ICME initiatives to develop an infrastructure to accelerate advanced materials discovery and deployment, including all phases of development, property optimization, systems design and integration, certification and manufacturing [16]. As outlined in Refs. [7,15], this goal and the premise of ICME are more comprehensive in addressing the connection with product design and manufacturing than combinatorial searching algorithms applied using first principles calculations to discover new materials or multicomponent phases, for example. The notion of genomics as predictive exploration of crystal structures, single phases, or molecules for desirable phase properties is a valuable part of the overall concept of ICME to more broadly scope during design exploration, but the ''connective tissue'' of multiscale modeling and multi-level decision-based design in ICME are essential aspects of its integrative character that links microstructure to performance and considers integrated systems design, certification, manufacturing, and product scale-up. Moreover, the analogy of ICME to bioinformatics, DNA sequencing and so forth is therefore limited and not fully descriptive of the scope and challenges of ICME.
In our view, the notion of integrating manufacturing processes with concurrent design of materials and products is at the core of ICME, considering the inescapable role of the hierarchy of material structure (i.e., microstructure) on properties and performance. Of course, one can conceive of important application domains in which discovery of new material phases perhaps addresses the majority of the accelerated deployment problem; it may be entirely sufficient for purposes of searching for specific molecular or atomic structures for nanoscale applications such as quantum dots, semiconductors, etc. But even in many applications where certain key performance indicators are dictated by atomic structure (e.g., piezoelectrics, battery and fuel cell electrodes, etc.), material durability and utility in service is often dictated by mesoscopic heterogeneity of structure and its lack of reversibility under cycling or in extreme environments. Therefore, materials informatics is a key technology for nanoscale design problems but only one toolset in the solution strategy for multi-level materials design that must consider scale-up and long term product performance.

This paper will focus on several critical path issues in realizing ICME, namely materials databases, the role of multiscale modeling, materials knowledge systems and strategies that support inverse design (top–down in Fig. 1), and uncertainty quantification and management considerations that undergird plausible approaches.

2. Materials databases and the need for a structure-based taxonomy

The NMAB report [7] has emphasized the key role of materials databases in capturing, curating, and archiving the critical information needed to facilitate ICME. While the report does not elaborate on how these databases should be structured or what specifically they should include, it discusses how these new databases will produce substantial savings by accelerating the development of advanced high performance materials, while eliminating the unintended repetition of effort from multiple groups of researchers possibly over multiple generations.

Before we contemplate the ideal structure and content of such novel databases, it would be beneficial to get a historic perspective on the successes and failures of the prior efforts on building materials databases. In a recent summary, Freiman et al. [17] discuss the lessons learned from building and maintaining the National Materials Property Data Network that was operated commercially until 1995. Although this network contained numerous databases on many different materials, it proved too costly to maintain because of (i) a lack of a single common standard for information on the very broad range of materials included in the network, (ii) enhancements/modifications to testing procedures used for evaluating properties of interest that sometimes invalidated previous measurements, and (iii) the placement of onus on a single entity as opposed to being jointly developed by multiple entities that possess the broad range of requisite sophisticated skill sets. Notwithstanding these shortcomings, a major deficiency in these, as well as other similar, databases is that they did not aim to capture the details of the microstructure. It is well-known in the materials science and engineering discipline that microstructure–property connections are not unique, and that composition alone is inadequate to correlate to the many physical properties of the material. Indeed, the spatial arrangement of local features or ''states'' in the internal structure (e.g., morphological attributes) at various constituent length scales plays an influential role in determining the overall properties of the material [18–20].

The key materials databases needed to facilitate ICME have to be built around microstructure information. There is therefore a critical need to establish a comprehensive framework for a rigorous quantification of microstructure in a large number of disparate material systems. Furthermore, the framework has to be sufficiently versatile to allow compact representation and visualization of the salient microstructure–property–processing correlations, which populate the core knowledge bases used by materials designers. It is important to recognize here that only the microstructure space can serve as the common higher dimensional space in which the much needed knowledge of the process–microstructure–property correlations can be stored and communicated reliably without appreciable loss of information. Any attempt to represent this core knowledge by circumventing microstructure using lower dimensional spaces (e.g., processing–property correlations or composition–property correlations) should be expected to result in substantial loss of fidelity and utility in the generated knowledge bases.

In arriving at a rigorous framework for microstructure quantification, it is extremely important to recognize and understand the difference between a micrograph (or a microstructure dataset) and the microstructure function [21]. The microstructure function is best interpreted as a stochastic process, in essence a response to applied stimuli, whose experimental outcome is a micrograph (a two or three dimensional microstructure dataset). As such, the microstructure function cannot be observed directly and is expected to capture the ensemble-averaged structure information from a multitude of micrographs extracted from multiple samples processed by nominally the same manufacturing history. Since individual micrographs have no natural origin, the only meaningful information one can extract from each micrograph has to concern the relative spatial arrangement or spatial correlations of local material states (e.g., phases, orientations) in the structure. A number of different measures of the spatial correlations in the microstructure have been proposed and utilized in the literature [19,20]. Of these, only the n-point spatial correlations (or n-point statistics) provide the most complete set of measures that are naturally organized according to increasing degree of microstructure information. For example, the most basic of the n-point statistics are the 1-point statistics, and they reflect the probability density associated with finding a specific local state of interest at any randomly selected single point (or voxel) in the microstructure. In other words, they essentially capture the information on volume fractions of the various distinct local states present in the material system. The next higher level of microstructure information is contained in the 2-point statistics, denoted $f_{\mathbf{r}}^{hh'}$, which capture the probability density associated with finding local states $h$ and $h'$ at the tail and head, respectively, of a prescribed vector $\mathbf{r}$ randomly placed into the microstructure. There is a tremendous increase in the amount of microstructure information contained in the 2-point statistics compared to the 1-point statistics. Higher-order correlations (3-point and higher) are defined in a completely analogous manner. Another benefit of treating the microstructure function as a stochastic process is the associated rigorous quantification of the variance associated with the microstructure [21]. The variance in structure can then be combined appropriately with the other uncertainties in the process (for example, those associated with the measurements and those associated with the models used to predict overall properties or performance characteristics of interest in an engineering application) to arrive at the overall variance in the performance characteristics or responses of interest.
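These definitions translate directly into computable quantities. The sketch below evaluates the full set of 2-point statistics $f_{\mathbf{r}}^{hh'}$ for a two-phase microstructure on a periodic grid using FFTs. It is a minimal illustration of the formalism under an assumed periodic boundary condition, not the informatics toolkit of Ref. [22].

```python
import numpy as np

def two_point_statistics(ms: np.ndarray) -> np.ndarray:
    """2-point statistics f_r^{hh'} of a two-phase microstructure.

    ms: integer array of local states (0 or 1) on a periodic grid; this is
        a discretized microstructure function with H = 2 local states.
    Returns f of shape (2, 2) + ms.shape, where f[h, hp] holds, for every
    discrete vector r, the probability of finding state h at the tail and
    state hp at the head of r (periodicity assumed).
    """
    S = ms.size
    # Indicator signals m_s^h for each local state h.
    m = np.stack([(ms == h).astype(float) for h in (0, 1)])
    M = np.fft.fftn(m, axes=tuple(range(1, ms.ndim + 1)))
    f = np.empty((2, 2) + ms.shape)
    for h in range(2):
        for hp in range(2):
            # Cross-correlation theorem: f^{hh'} = IFFT[conj(M^h) * M^{h'}] / S
            f[h, hp] = np.real(np.fft.ifftn(np.conj(M[h]) * M[hp])) / S
    return f

# The r = 0 entry of the autocorrelation recovers the 1-point statistic
# (volume fraction) of the corresponding state:
rng = np.random.default_rng(0)
ms = (rng.random((64, 64)) < 0.3).astype(int)
f = two_point_statistics(ms)
print(f[1, 1][0, 0], ms.mean())  # both equal the phase-1 volume fraction
```

The same construction extends to more local states by enlarging the indicator stack, which is one reason the n-point statistics organize naturally by increasing information content.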
Materials Informatics has recently emerged as a new field of study that focuses on the development of new toolkits for efficient synthesis of the core knowledge in large materials datasets (generated by both experiments and models). The NMAB report [7] has specifically highlighted the important role to be played by Materials Informatics in the realization of the potential of ICME. The standardization and adoption of a rigorous framework, such as the n-point statistics, in enabling the quantification of the microstructure is a critical and essential step in the emergence of Materials Informatics. In recent work, it was demonstrated that the use of concepts borrowed from informatics can produce a new toolkit [22] with the following unprecedented capabilities: (i) archival, real-time searches, and direct quantitative comparisons of different microstructure datasets within a large microstructure database, (ii) automatic identification and extraction of microstructure features or other metrics of interest from very large datasets, (iii) establishment of reliable microstructure–property correlations using objective measures of the microstructure, and (iv) reliable extraction of precise quantitative insights on how the local neighborhood influences the localization of macroscale loading and/or the local evolution of microstructure, leading to development of robust, scale-bridging microstructure–property–processing linkages. The new field of Materials Informatics is still very much in its embryonic stage of development. The customization and adoption of modern information technology tools such as image-based search engines, data-mining, machine learning and crowd-sourcing can potentially identify completely new research directions for addressing successfully some of the most difficult computational challenges in the implementation of the ICME paradigm.

3. Multiscale modeling and materials design

Multiscale modeling involves seeking solutions to physical problems that have important features at multiple spatial and/or temporal scales. Multiscale models are necessary to address the relative contributions of various levels of material structure hierarchy shown in Fig. 1 to responses at higher scales [12–15,23–25]. However, multiscale models are subservient to the overall design integration strategy in materials design. In other words, they offer utility in various specific ways to materials design, including, but not limited to:

• relating polycrystalline and/or polyphase microstructures to properties involving cooperative behavior at higher scales (e.g., yield strength, ductility, ductile fracture, etc.),
• quantifying sensitivity of higher scale responses to different mechanisms and levels of structure/microstructure hierarchy,
• supporting microstructure-sensitive prediction of material failure or degradation,
• quantifying the influence of environment or complicating contributions of impurities, manufacturing induced variability or defects,
• considering competition of distinct mechanical, chemical and transport phenomena without relying too heavily on intuition to guide solutions in the case of highly nonlinear interactions, and
• bridging domains of quantum physics and chemistry with engineering for problems that span vast ranges of length and time scales.

Olson's Venn diagram structure in Fig. 1 should not be confused with multiscale modeling strategies; the former is much more comprehensive in scope, embedding multiscale modeling as part of the suite of modeling and simulation tools that provide decision support in design. For example, structure–property relations can involve the full gamut of length and time scales, as can process–structure relations. In other words, the levels in Olson's linear design strategy of Fig. 1 do not map uniquely to levels of material hierarchy. Even concurrent multiscale models which attempt to simultaneously execute models at different levels of resolution or fidelity do not serve the full purpose of top–down materials design. Materials design and multiscale modeling are not equivalent pursuits. This distinction is important because it means that notions of cyberdiscovery of new or improved materials must emphasize not only modeling and simulation tools but also systems design strategies for using simulations to support decision-based concurrent design of materials and products.

Materials with long range order, such as crystalline materials or oriented fibrous composites or thermoplastic chains, offer a particular challenge and motivate sophisticated multiscale modeling methods. Characteristic scale transitions achieved by multiscale modeling do not follow a single mathematical or computational structure, owing both to the consideration of long range order and to the fact that as models are coarse grained, the models shift from those concerning nonequilibrium, dynamic particle systems to continuously distributed mass systems that often invoke statistical equilibrium arguments and radically reduce degrees of freedom. A distinction is made between so-called hierarchical and concurrent multiscale modeling schemes. The former apply scale-appropriate models to different levels of hierarchy to generate responses or properties at that scale. In contrast, the latter attempt to cast the problem over common spatial and temporal domains by virtue of boundary conditions and introduction of consistent variable order descriptions of kinematics to achieve a variable resolution/fidelity description. Clearly, concurrent schemes are more costly but are argued to provide high value in cases in which localization of responses within a microstructure cannot be located or anticipated a priori, as well as the propagation of such localization events across a broad range of scales shown in Fig. 1. Information is lost in either case: in hierarchical multiscale models, information is lost (i.e., increase of information entropy) by disposing of model degrees of freedom, while in concurrent multiscale modeling information is lost by virtue of the idealization necessary to frame the models with consistent structure to pass information back and forth across length and time scales.

It is a daunting task to briefly summarize the range of multiscale modeling approaches for all relevant process–structure or structure–property relations for all material classes, beyond the scope of any single work. We accordingly limit discussion here to multiscale dislocation plasticity of metals to illustrate the range of considerations involved. Dislocation plasticity is a multiscale, multi-mechanism process [24–27] that focuses on evolving irreversible microstructure rearrangement [28] associated with line defects in crystals. Existing approaches address mechanisms over a wide range of length and time scales, as summarized below.
ure or degradation, • quantifying the influence of environment or complicating con- (a) Discrete to continuous modeling approaches based on contigu- tributions of impurities, manufacturing induced variability or ous (domain decomposition) or overlapping (coarse graining) defects, domains: • considering competition of distinct mechanical, chemical and • Domain decomposition, with an atomistic domain abutted to transport phenomena without relying too heavily on intuition an overlapping continuum domain to atomistically resolve to guide solutions in the case of highly nonlinear interactions, crack tips and interfaces [29–35] to model fracture and and defect interactions. • bridging domains of quantum physics and chemistry with en- • Quasi-continuum approaches with the continuum elastic gineering for problems that span vast ranges of length and time energy function informed by nonlocal atomistic interatomic scales. potentials, suitable for long range fields without defects [36–39], with adaptivity to selectively model regions with Olson’s Venn diagram structure in Fig. 1 should not be confused full atomic resolution [40]. with multiscale modeling strategies; the former is much more • Dynamically equivalent continuum or quasi-continuum, comprehensive in scope, embedding multiscale modeling as part consistent with fully dynamic atomistic simulations over of the suite of modeling and simulation tools that provide decision- same overlapping domain [41–46]. support in design. For example, structure–property relations • Coarse-grained molecular dynamics [47–51] for somewhat can involve the full gamut of length and time scales, as can longer ranged spatial interactions of microstructure and process–structure relations. In other words, the levels in Olson’s defects such as dislocation pile-ups. linear design strategy of Fig. 1 do not map uniquely to levels • Continuum phase field modeling (PFM) [52,53] to simu- of material hierarchy. Even concurrent multiscale models which late microstructure morphology and evolution, including attempt to simultaneously execute models at different levels of microscopic PFM [54] informed by ab initio or atomistic resolution or fidelity do not serve the full purpose of top–down simulations in terms of energy functions, with kinetics materials design. Materials design and multiscale modeling are (mobilities) established by matching overlapping MD or not equivalent pursuits. This distinction is important because experiments [55]. J.H. Panchal et al. / Computer-Aided Design 45 (2013) 4–25 9

Table 1
Examples of dominant material length scales (in each row) for design with regard to bulk mechanical properties of metallic systems. Columns: strength (bulk polycrystals); inelastic flow and hardening, uniform ductility; brittle fracture resistance; ductile fracture resistance; high cycle fatigue; low cycle fatigue. Entries for each length scale are listed in column order.

2 nm: unit processes of nucleation (low defect density single crystals, nanoscale multilayers); bond breaking (cleavage)
20 nm: source activation (nanocrystalline metals with prior cold work); dislocation reactions with grain, phase or twin boundaries; cooperative effects of impurities, segregants, oxides, etc. (intergranular); valve effect of dislocation nucleation
200 nm: source controlled (NEMS, MEMS); dislocation multiplication, junctions; source activation
2 µm: free surfaces and starvation (micropillars, unconstrained layers); statistical field of junctions, forests; fracture process zone; damage process zone; localized slip structures
20 µm: intergranular interactions; boundary interactions, activated grains or phases, barrier spacing; individual grains or phases, barrier spacing; damage process zone, roughness and crack path bifurcation
200 µm: effect of neighbors

(b) Concurrent multiscale continuum modeling schemes for plastic deformation and failure processes based on imposition of consistent multiscale boundary conditions and/or higher order continua accounting for heterogeneous deformation at fine scales over the same physical domain [56–61].
(c) Multiscale modeling schemes of hierarchical nature for inelastic deformation and damage in the presence of microstructure:
• Multilevel models for plasticity and fracture based on handshaking [62–65].
• Self-consistent micromechanics schemes [66–68].
• Internal state variable models informed by atomistic/continuum simulations and coupled with micromechanics or FE methods [69,70].
• Weak form variational principle of virtual velocities (PVV) with addition of enhanced kinematics based on microstructure degrees of freedom [71–73].
• Extended weak form variational approaches of generalized continua or second gradient theory with conjugate microstresses to represent gradient effects [74–79].
(d) Statistical continuum theories that directly incorporate discrete dislocation dynamics [80–89] and dislocation substructure evolution [24,25].
(e) Field dislocation mechanics based on incompatibility with dislocation flux and Nye transport [90–93].
(f) Dislocation substructure formation and evolution models based on fragmentation and blocky single slip or laminated flow characteristics, including non-convex energy functions [94–98].
(g) Transition state theory, based on use of atomistic unit process models [99–101] to inform event probabilities in kinetic Monte Carlo models of mesoscopic many body behavior [102] and other coarse graining approaches [103].

Clearly, even in the focused application area of multiscale dislocation plasticity, no single multiscale modeling strategy exists or suffices to wrap together the various scale transitions. Each material class and response of interest is influenced by a finite, characteristic set of length and time scales associated with key controlling microstructure attributes. For this reason, materials design can be (and often is) pursued using sets of models that are scale appropriate to the length scales and microstructure attributes deemed most influential on the target response(s) at the scale of the application. So if the scale of the application is on the order of tens of microns, as in MEMS devices, both the atomic scale structure and mesostructure of defect distributions (dislocations, grain boundaries, etc.) can be important. For nanostructures, however, atomic scale arrangement and its evolution is dominant. In many systems, the hierarchy includes strong heterogeneity or phase contrast (e.g., weldments in structures, fiber-reinforced polymer composites) and these scales may dominate responses of interest. Even interfacial-dominated phenomena, such as fracture of heterogeneous materials (cf. [104–107]), are complicated according to whether the fracture occurs via brittle or ductile separation, with the latter involving interface structure only as a gating mechanism for inelastic deformation modes [108].

For illustrative purposes, at the risk of oversimplifying the wide range of phenomena and scales over a wide range of metallic systems, Table 1 attempts to identify dominant material length scales to address in design for various target mechanical properties of metallic systems. Although phenomena occur over a full spectrum of scales, the role of defects and microstructure is largely manifested at various discrete levels of structure. One important point is that the dominant mechanisms and scales change with the scale and nature of the application. Another point is that certain properties are sensitive to material structure at only one or two levels of hierarchy. These observations suggest that distinct models or scale transition methods that bridge several scales can supply first order information for materials design. Clearly, no single multiscale modeling strategy exists or suffices to link the various scale transitions. Table 2 offers a partial glimpse into the complexities of models at various length and time scales for metallic systems, distinguishing scale transition methods from models appropriate to each scale.

It is emphasized that there is considerable uncertainty in any kind of multiscale modeling scheme, including the selection of specific length scales to model, approximations made in separating length and time scales in models used, model structure and parameter uncertainty at each scale, approximations made in various scale transition methods as well as boundary conditions, and lack of complete characterization of initial conditions. Furthermore, microstructure itself has random character, such as sizes and distributions of grains, phases, and so forth. These sources of uncertainty give rise to the need to consider sensitivity of properties and responses of interest to variation of microstructure at various scales, as well as propagation of model uncertainty through multiscale model chains. Uncertainty will be addressed in a later section of this paper.

Table 2
Models and scale bridging methods for multiscale dislocation plasticity in metallic crystals and polycrystals. Rows labeled ''bridge'' list scale bridging methodologies connecting adjacent scales. Columns: length scale | time scale | models | examples of scale bridging approaches | primary sources of uncertainty.

2 nm | NA/ground state | First principles, e.g., Density Functional Theory (DFT), quantum MD | | Assumptions in DFT method, placement of atoms
200 nm | 10 ns | Molecular dynamics (MD) | | Interatomic potential, cutoff, thermostat and ensemble
bridge | | | Domain decomposition, coupled atomistics discrete dislocation (CADD), coarse grained MD, kinetic Monte Carlo | Attenuation due to abrupt interfaces of models, passing defects, coarse graining defects
2 µm | s | Discrete dislocation dynamics | | Discretization of dislocation lines, cores, reactions and junctions, grain boundaries
bridge | | | Multiscale crystal plasticity | Averaging methods for defect kinetics and lattice rotation
20 µm | 1000 s | Crystal plasticity, including generalized continuum models (gradient, micropolar, micromorphic) | | Kinetics, slip system hardening (self and latent) relations, cross slip, obstacle interactions, increasing number of adjustable parameters
bridge | | | RVE simulations, polycrystal/composite homogenization | RVE size, initial and boundary conditions, eigenstrain fields, self-consistency
200 µm | Days | Heterogeneous FE with simplified constitutive relations | | Mesh refinement, convergence, model reduction
bridge | | | Substructuring, variable fidelity, adaptive remeshing, multigrid | Loss of information, remeshing error, boundary conditions
>2 mm | Years | Structural FE | | Low order models, meshing, geometric modeling

4. Methods for requirements-driven, top–down design

As mentioned earlier, many of the currently available materials databases are better suited for materials selection [109] instead of rigorous materials design. This is because materials design requires knowledge of a validated set of process–microstructure–property linkages that are amenable to information flow in both forward and reverse directions, as essential to facilitate solutions to top–down, requirements-driven (inverse) design problems.

For example, sophisticated analytical theories and computational models already exist for predicting several macroscale (effective or homogenized) properties associated with a given microstructure. However, most of these theories are not well suited for identifying the complete set of microstructures that are theoretically predicted to meet or exceed a set of designer-specified macroscale properties or performance criteria. Topology optimization is one of the early approaches developed for materials design. In this method, the region to be occupied by the material is tessellated into spatial bins and numerical approaches such as finite element methods are used to solve the governing field equations. The design problem is set up as an optimization problem, where the objective is the maximization of a selected macroscale property (or a performance criterion) and the design space enumerates all of the possible assignments of one of the available choices of material phases (e.g., a solid or a pore phase) to each of the spatial bins in the numerical model [110,111]. Although the method has been successfully applied to a limited class of problems [111–116], its main drawback is that it is likely to explore only a very small region of the vast design space. As an example, consider a relatively small problem with about 100 elements covering the space and assume that there are two choices for the selection of a material phase in each element. For this simple problem, the design space constitutes $2^{100}$ discrete choices of microstructures. It is therefore very unlikely that an optimization process can explore the complete design space unless constraints (perhaps ad hoc or limited in nature) are imposed. Moreover, explicit simulation of a large parametric range of microstructures via finite element or finite difference methods is extremely time-consuming, particularly if nonlinear and time-dependent processes are considered.
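The combinatorics can be made explicit with a two-line calculation; the evaluation rate assumed below is deliberately optimistic.

```python
# Size of the binary topology-optimization design space discussed above:
# 100 spatial bins, two candidate phases per bin.
design_space = 2 ** 100
print(f"{design_space:.3e} candidate microstructures")   # ~1.268e+30

# Even at an (unrealistically fast) one billion full-field property
# evaluations per second, exhaustive enumeration is hopeless:
years = design_space / 1e9 / (3600 * 24 * 365)
print(f"~{years:.1e} years to enumerate")                 # ~4e+13 years
```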
A number of modern approaches to materials design start by dramatically reducing the design space by focusing on the most important measures of the microstructure deemed relevant to the problem. However, this reduction of the design space is attained by rational consideration of the governing physics of the problem and/or by taking into account the constraints on the availability of microstructures (i.e., recognizing that only a very small fraction of all theoretically possible microstructures is accessible by currently employed manufacturing options). In other words, in these approaches, the design space in the simple problem illustrated above will first be drastically reduced by consideration of a suitably selected subset of microstructure measures deemed relevant to the selected physical phenomenon of interest, and then optimal solutions will be explored in this reduced space. It is anticipated that this approach will produce better solutions compared to conventional approaches that rely exclusively on optimization algorithms. We next discuss, as an illustration, examples of approaches that attempt to reduce the complexity of top–down design.

Microstructure Sensitive Design (MSD) was formulated using spectral representations across the process, microstructure, and property design spaces in order to drastically reduce the computational requirements for bi-directional linkages between the spaces that can enable inverse design [20,117–125]. Much of the focus in MSD was placed on anisotropic, polycrystalline materials, where the local material state is defined using a proper orthogonal tensor that captures the local orientation of the crystal lattice in each spatial bin in the microstructure with respect to a fixed sample (or global) reference frame. The corresponding local state space is identified as the fundamental zone of SO(3) (the set of all distinct proper orthogonal tensors in 3-D) that takes into account the inherent symmetries of the crystal lattice. The distinguishing feature of MSD is its use of spectral representations (e.g., generalized spherical harmonics [126]) of functions defined over the orientation space. This leads to compact representations of process–structure–property relationships that can be executed at minimal computational cost. Most of the MSD success stories have focused on 1-point statistics of microstructure (also called the Orientation Distribution Function [126]). At its core, the MSD effort involved the development of very large and highly efficient databases organized in the form of microstructure hulls and anisotropic property closures. Microstructure hulls delineate the complete set of theoretically feasible statistical distributions that quantitatively capture the important details of the microstructure. Property closures identify the complete set of theoretically feasible combinations of macroscale effective properties for a selected homogenization (composite) theory. The primary advantages of the MSD approach are its (a) consideration of anisotropy of the properties at the local length scales, (b) exploration of the complete set of relevant microstructures (captured in the microstructure hull and property closures) leading to global optima, and (c) invertibility of the process–microstructure–property relations (due to the use of spectral methods in representing these linkages).

Extension of the MSD framework to include 2-point statistics of orientation has proved to be computationally expensive because of the very large number of statistical variables and terms involved. It is important to recognize that 1-point statistics are defined exclusively over the local state space, whereas the higher-order statistics are defined over the product spaces of the orientation space (for the local state variables) and the real three-dimensional physical space (for the vectors defining the spatial correlations).
While the use of generalized spherical harmonics provides substantial compaction for functions over the orientation space, the analogous compact representations for functions over the real three-dimensional physical space are yet to be developed.

In an effort to address the need to establish high fidelity microstructure–property–process relations that include higher-order descriptors of the microstructure, a new mathematically rigorous approach for scale-bridging called Microstructure Knowledge Systems (MKS) was recently developed. MKS facilitates bi-directional exchange of information between multiple length (and possibly time) scales. In the MKS approach [127–131], the focus is on localization (the opposite of homogenization), which describes the spatial distribution of the response field of interest (e.g., stress or strain fields) at the microscale for an imposed loading condition at the macroscale. Localization is critically important in correlating various failure-related properties of the material with appropriate microstructure conformations that produce the hot spots associated with damage initiation in the material. Moreover, if one can successfully establish accurate expressions for localization (referred to as localization linkages), this automatically leads to highly accurate homogenization and computationally efficient multiscale modeling and simulation.

The MKS approach is built on the statistical continuum theories developed by Kröner [132,133]. However, the MKS approach dramatically improves the accuracy of these expressions by calibrating the convolution kernels in these expressions to results from previously validated physics-based models. In recent work [131], the MKS approach was demonstrated to successfully capture the tails of the microscale stress and strain distributions in composite material systems with relatively high phase contrast, using the higher-order terms in the localization relationships. The approach has its analogue in the use of Volterra kernels [134,135] for modeling the dynamic response of nonlinear systems.

In the MKS framework, the localization relationships of interest are expressed as calibrated meta-models that take the form of a simple algebraic series whose terms capture the individual contributions from a hierarchy of microstructure descriptors. Each term in this series expansion is expressed as a convolution of the appropriate local microstructure descriptor and its corresponding local influence at the microscale. The microstructure is defined as a digital signal using the discrete approximation of the microstructure function, $m_s^h$, reflecting the volume fraction of local state $h$ in the spatial bin (or voxel) $s$. Let $p_s$ be the averaged local response in spatial cell $s$, defined on the same spatial grid as the microstructure function; it reflects a thoughtfully selected local physical response central to the phenomenon being studied, and may include stress, strain or microstructure evolution rate variables [127–130]. In the MKS framework, the localization relationship captures the local response field using a set of kernels and their convolution with higher-order descriptions of the local microstructure [132,133,136–142]. The localization relationship can be expressed as

$$
p_s = \sum_{h_1=1}^{H} \sum_{t_1=0}^{S-1} a_{t_1}^{h_1}\, m_{s+t_1}^{h_1}
+ \sum_{h_1=1}^{H} \sum_{h_2=1}^{H} \sum_{t_1=0}^{S-1} \sum_{t_2=0}^{S-1} a_{t_1 t_2}^{h_1 h_2}\, m_{s+t_1}^{h_1} m_{s+t_1+t_2}^{h_2}
+ \sum_{h_1=1}^{H} \sum_{h_2=1}^{H} \sum_{h_3=1}^{H} \sum_{t_1=0}^{S-1} \sum_{t_2=0}^{S-1} \sum_{t_3=0}^{S-1} a_{t_1 t_2 t_3}^{h_1 h_2 h_3}\, m_{s+t_1}^{h_1} m_{s+t_1+t_2}^{h_2} m_{s+t_1+t_2+t_3}^{h_3} + \cdots \qquad (1)
$$

where $H$ and $S$ denote the number of distinct local states and the number of spatial bins used to define the microstructure signal ($m_s^h$), and the $t_i$ enumerate the different vectors used to define the spatial correlations. The kernels, or influence coefficients, are defined by the set $\{a_{t_1}^{h_1}, a_{t_1 t_2}^{h_1 h_2}, a_{t_1 t_2 t_3}^{h_1 h_2 h_3}, \ldots\}$; these are the first, second, and third-order influence coefficients, respectively. The higher-order descriptions of the local microstructure are expressed by the products $m_{s+t_1}^{h_1} m_{s+t_1+t_2}^{h_2} \cdots m_{s+t_1+t_2+\cdots+t_N}^{h_N}$, where $N$ denotes the order.

The higher-order localization relationship shown in Eq. (1) is a particular form of Volterra series [134,135,143–146] and implicitly assumes that the system is spatially-invariant, causal, and has finite memory. The influence coefficients in Eq. (1) are analogous to Volterra kernels. The higher-order terms in the series are designed to capture systematically the nonlinearity in the system. The influence coefficients are expected to be independent of the microstructure, since the system is assumed to be spatially invariant. Application of the concepts of Volterra series directly to materials phenomena poses major challenges.
Materials phenomena require an extremely large number of input channels (because of the complexities associated with microstructure representation) and an extremely large number of output channels (because of the broad range of response fields of interest). However, the microstructure representation described above allows a reformulation of the localization relationship into a form that is amenable to calibration to results from previously validated numerical models (e.g., finite element models, phase-field models) by linear regression in the Discrete Fourier transform (DFT) space [127–131], which is one of the distinctive features of the MKS framework.

A major challenge in the MKS framework is the calibration of the selected higher-order influence coefficients via linear regression. Although other approaches have been used in the literature to approximate the higher-order Volterra kernels (e.g., cross-correlation methods [145,146], artificial neural networks [144], and Bayesian methods [147,148]), the solutions obtained using an ordinary least squares fit [131] generally provide the best solutions for the MKS case studies conducted to date. Fig. 3 demonstrates the accuracy of the MKS approach for predicting the local elastic response in an example microstructure with a contrast of 10 in the values of the Young's moduli of the local states. The MKS methodology has thus far been successfully applied to capturing thermo-elastic stress (or strain) distributions in composite materials, rigid-viscoplastic deformation fields in composite materials, and the evolution of the composition fields in spinodal decomposition of binary alloys [127–131].
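The structure of this workflow can be sketched for the first-order truncation of Eq. (1), where the localization linkage reduces to a discrete convolution of influence coefficients with the microstructure signal; calibration then decouples into an independent small linear regression at each spatial frequency. The sketch below is ours, under simplifying assumptions (periodic grid, first order only), and the synthetic "responses" merely stand in for fields produced by a previously validated FE or phase-field model as in Refs. [127–131].

```python
import numpy as np

def calibrate_mks(ms_train, p_train):
    """Calibrate first-order MKS influence coefficients in DFT space.

    ms_train: list of microstructure signals, each of shape (H,) + grid,
              where entry [h] is the indicator field m_s^h.
    p_train:  list of matching local response fields on the same grid
              (in practice produced by a validated physics-based model).
    At each spatial frequency k, the first-order Eq. (1) reads
    P_k = sum_h A_k^h M_k^h, a small least-squares problem per frequency.
    """
    axes = tuple(range(1, ms_train[0].ndim))
    Ms = np.stack([np.fft.fftn(m, axes=axes) for m in ms_train])
    Ps = np.stack([np.fft.fftn(p) for p in p_train])
    H, grid = ms_train[0].shape[0], ms_train[0].shape[1:]
    A = np.zeros((H,) + grid, dtype=complex)
    for k in np.ndindex(*grid):
        X = Ms[(slice(None), slice(None)) + k]   # (n_train, H) design matrix
        y = Ps[(slice(None),) + k]               # (n_train,) responses
        A[(slice(None),) + k] = np.linalg.lstsq(X, y, rcond=None)[0]
    return A

def mks_predict(m, A):
    """Evaluate the calibrated localization linkage (products in DFT space)."""
    M = np.fft.fftn(m, axes=tuple(range(1, m.ndim)))
    return np.real(np.fft.ifftn((A * M).sum(axis=0)))

# Usage on a 2-D grid with a synthetic linear "response" (illustrative only):
rng = np.random.default_rng(1)
grid = (21, 21)
def random_micro():
    phase1 = (rng.random(grid) < 0.5).astype(float)
    return np.stack([1.0 - phase1, phase1])      # m_s^0, m_s^1
kernel = np.zeros(grid); kernel[0, 0] = 0.6; kernel[0, 1] = kernel[1, 0] = 0.2
make_p = lambda m: np.real(np.fft.ifftn(np.fft.fftn(kernel) * np.fft.fftn(m[1])))
train = [random_micro() for _ in range(10)]
A = calibrate_mks(train, [make_p(m) for m in train])
test = random_micro()
print(np.max(np.abs(mks_predict(test, A) - make_p(test))))  # ~0 for this linear case
```

Capturing nonlinear behavior, such as the high-contrast elastic case of Fig. 3, requires retaining the higher-order terms of Eq. (1); the calibration then involves products of shifted microstructure signals but follows the same regression-in-DFT-space pattern.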


Fig. 3. Demonstration of the accuracy of the MKS approach in predicting the local elastic fields in a two-phase composite system with a contrast of 10 in the elastic moduli of the constituent phases. The strain contours on a mid-section through the 3-D composite microstructure are shown along with the strain distributions in each phase in the entire 3-D volume element. It is seen that the MKS approach produced excellent predictions at a fraction of the computational cost associated with the finite element method (FEM). spinodal decomposition) involve highly nonlinear governing 5. Characterization for model validation and process–structure field equations. However, MKS framework has not yet been models applied to highly complex microstructures such as those seen in polycrystalline material systems. For the compact inclusion of Verification and validation (V&V) is a core component of lattice orientation in the higher-order localization relationships, it ICME [7]. Model verification and validation is an underdeveloped would be necessary to introduce the spectral representations over aspect of simulation-supported design of materials and products the orientation space (used in the MSD framework) into the MKS within the ICME vision. Within systems engineering, verification refers to the process of ensuring that the ‘‘system is built right’’ framework described above. (i.e., the system meets its requirements) whereas validation refers MSD and MKS approaches described in the preceding ad- to building the ‘‘right system’’ (i.e., the system satisfies the dress the compact representation of high fidelity microstruc- users’ needs) [149]. These are general definitions that apply to ture–property–processing linkages at a selected length scale. In verification and validation of individual and multiscale models, order to obtain viable materials design solutions, one needs to design processes and design outcomes. Within the context of link relevant models from different length and time scales, quan- simulation model development [150], verification and validation tify uncertainty in each model, and communicate the uncertainty have specific definitions, which are consistent with the systems throughout the model chain in a robust multi-level design frame- engineering definitions above: work. • Verification: The process of determining that a model imple- In a later section, the Inductive Design Exploration Method mentation accurately represents the developer’s conceptual de- (IDEM), will be introduced as a means of mitigating or managing scription of the model and the solution to the model. uncertainty while facilitating top–down searches for candidate • Validation: The process of determining the degree to which a microstructures based on mappings compiled from bottom–up model is an accurate representation of the real world from the simulations. In general, the utility of IDEM might be greatest perspective of the intended uses of the model. for applications involving highly heterogeneous datasets and/or One of the most difficult tasks in validating models is the material models for which the inverse design strategy afforded documentation of the material microstructure and its evolution by MSD or MKS approaches might be less clearly related to in any selected process. As noted earlier, it is essential to microstructure representation. Such cases may be typical of multi- address stochasticity of microstructure. 
Consequently, there is a critical need to establish rigorous protocols for documenting, in a statistically meaningful way, the rich three-dimensional (3-D) topological details of the microstructure that might span multiple length scales. These protocols need to establish reliable statistical measures to assess whether a sufficient number of datasets have been acquired, and whether the spatial extent and spatial resolution of the acquired datasets are adequate. The protocols should also address reconstructions of representative volume elements (RVEs) from the measured datasets. Since the microstructure has a stochastic character, it is unlikely that a single sample or statistical volume element of microstructure [24,25] can accurately represent the microstructure. Therefore, one aim is to establish protocols for extracting a set of such samples that accurately reflect the natural variance in the microstructure, such that the statistics of the ensemble can be statistically representative (i.e., statistically homogeneous).

Obtaining 3-D microstructure datasets from samples continues to be a major challenge for most advanced material systems. Impressive advances have been made in generating 3-D microstructure datasets [151–162]. Despite these advances, the acquisition of such datasets remains effort intensive, highly expensive, very slow, and inaccessible to the broader scientific community. Moreover, the volumes of materials studied using these approaches are often limited, which raises critical questions regarding their adequacy for reliably extracting desired microstructure measures and their variances.

If we restrict attention to 2-point and 3-point statistics of the microstructure (these constitute the most important spatial correlations), it should be possible to acquire and assemble 3-D descriptions of these statistics from large 2-D microstructure scans on multiple oblique sections into the sample [138,163]. This is because all of the vectors used to define the 2-point and 3-point statistics can be populated on carefully selected oblique sections into the sample. Since it is significantly less expensive to obtain large 2-D microstructure scans in most materials, this approach is much more practical for statistically meaningful characterization of the microstructure in a given sample set.

It is important to emphasize that the complete set of 2-point statistics is an extremely large set. Compact and meaningful grouping of the dominant 2-point statistics in a given material system is an area of active current research. It has long been known that the complete set of 2-point statistics exhibits many interdependencies.
Gokhale et al. [164] showed that at most N(N − 1)/2 2-point correlations are independent for a material system comprised of N distinct local states. Niezgoda et al. [165] recently showed that only (N − 1) of the 2-point correlations are actually independent, by exploiting known properties of the discrete Fourier transform representation of the correlations. However, for a material system comprising a very large number of local states (e.g., a polycrystalline material system), even tracking (N − 1) spatial correlations is an arduous task. It was recently demonstrated that techniques such as principal component analysis (PCA), commonly used in face recognition and pattern recognition, can be used to obtain objective low-dimensional representations of the 2-point statistics [21,22], as shown in Fig. 4. Since the n-point statistics are smooth analytic functions with a well-defined origin, PCA decomposition is quite effective in establishing an uncorrelated orthogonal basis for any ensemble of microstructure datasets. The PCA decomposition allows one to represent each microstructure as a linear combination of the eigenvectors of a correlation matrix, with the weights interpreted in the same manner as Fourier coefficients. The weights then form an objective low-dimensional representation of the much larger set of independent 2-point statistics. The PCA representations allow simple visualization of an ensemble of microstructures, and a rigorous quantification of the microstructural variance within the ensemble [120] (see Fig. 4). The same concepts can also be used effectively to define weighted sets of statistical volume elements (WSVEs) as an alternative to identifying RVEs for a given ensemble of microstructure datasets [22,166,167].

[Fig. 4 graphic: three convex hulls of PCA weights (α1, α2, α3), one per ensemble.]
Fig. 4. Visualization of ensembles of microstructure datasets and their variance using PCA representations of the 2-point statistics. Three ensembles of microstructure datasets are shown with High-, Mid-, and Low-Variance. Each point in the figure corresponds to a complete set of 2-point statistics, projected into the first three dimensions of the PCA space [21]. The αi values in the figure are the corresponding PCA weights. The points corresponding to each ensemble are enclosed in convex hulls to help visualize the space spanned by each ensemble. Note that the range of the PCA weights decreases substantially with each new PCA dimension.

Successful pursuit of the ICME vision requires integration of all of the different components described earlier. The NMAB report on ICME [7] specifically identifies three main tasks for integration tools: (i) linking information from different sources and different knowledge domains, (ii) networking and collaborative development, and (iii) optimization. Certainly, collaborative multidisciplinary design optimization presents challenges due to the geographically distributed nature of the materials supply and product design enterprises, often undertaken in different corporations and in different divisions, as well as in consulting and university laboratories [15]. Major emphasis to date on design integration has been placed on the use of commercial software [13,14] that includes capabilities to manage databases and pursue various optimization strategies. With regard to optimization, materials design differs substantially from conventional problems in design optimization owing to the complexity, nonlinearity, and uncertainty [168,169] of material process–structure and structure–property models. It is not merely the case that existing tools can be identified and deployed; rather, basic research issues at the intersection of computational materials science and engineering, experimental mechanics and processing, and multidisciplinary design optimization must be identified and addressed.

As mentioned earlier, one of the most difficult challenges in the integration is the visualization of microstructure–property–processing linkages in a common space. This is exacerbated by the fact that a comprehensive description of the microstructure requires a very large number of statistics or spatial correlations. Therefore, reduced-order representations of the microstructure are essential. The PCA decompositions of the n-point statistics discussed earlier offer exciting opportunities to build new integration tools. As an example, consider the process design problem where the objective is to identify hybrid processing routes comprising an arbitrary sequence of available manufacturing routes aimed at transforming a given initial microstructure into a desired one. Fig. 5 shows schematically how one might address the process design problem by representing the microstructure–property–processing linkages in a common (microstructure) space. The green volume in Fig. 5 depicts a microstructure hull corresponding to an ensemble of digitally created porous solids [22]. Each point in this space represents a microstructure (an example is depicted in the top right in Fig. 5) through the PCA weights of its 2-point correlations.
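Although no implementation is given in the cited works, the pipeline just described (2-point statistics via the discrete Fourier transform, followed by PCA) is compact enough to sketch. The following is a minimal illustration, assuming periodic binary microstructures; the synthetic ensemble and all parameter values are invented stand-ins, not data from the cited studies.

```python
import numpy as np

rng = np.random.default_rng(0)

def two_point_autocorrelation(m):
    """Autocorrelation f(r) of an indicator array m (1 = phase of interest).

    For periodic microstructures the DFT gives f(r) directly:
    f = IFFT(FFT(m) * conj(FFT(m))) / N, where N is the number of voxels.
    """
    M = np.fft.fftn(m)
    return np.real(np.fft.ifftn(M * np.conj(M))) / m.size

# Ensemble of synthetic 2-D binary microstructures (stand-ins for real scans).
ensemble = [(rng.random((31, 31)) < 0.4).astype(float) for _ in range(50)]
stats = np.stack([two_point_autocorrelation(m).ravel() for m in ensemble])

# PCA via SVD of the centered statistics; rows of vt are the principal
# directions of the ensemble, and alpha holds the PCA weights per structure.
mean_stats = stats.mean(axis=0)
u, s, vt = np.linalg.svd(stats - mean_stats, full_matrices=False)
alpha = u * s  # shape: (n_structures, n_components)
print("First three PCA weights of microstructure 0:", alpha[0, :3])
```

In this sketch, the PCA weights αi of each microstructure correspond to one row of alpha, mirroring the low-dimensional coordinates plotted in Fig. 4.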
[Fig. 5 graphic: microstructure hull in the space of the first three PCA weights with processing pathlines; insets show a sample microstructure, predicted versus FEM moduli (MPa), and the relative-error distribution against the 2nd principal component.]
Fig. 5. Illustration of the compact representation of microstructure–property–processing linkages using the reduced-order representations of the microstructure obtained from principal component analyses of higher-order statistics (2-point statistics). The αi values in the figure represent the PCA weights in the reduced-order representation of the n-point spatial correlations. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

The elastic moduli for all microstructure members in the selected ensemble were evaluated using finite element models (FEM), and were correlated to the first two PCA weights using artificial neural networks. The correlation and its validation are depicted in the contour plot on the bottom right in Fig. 5. It is seen that the microstructure–modulus correlation provides excellent predictions for various members of the ensemble (with less than 5% error) [22]. It is noted that the PCA weights of the microstructure can also be related to properties of interest using various regression methods.

Fig. 5 illustrates schematically the superposition of processing information on the microstructure hull in the form of process pathlines. In any selected processing option, the salient features of the microstructure are expected to change, which can be visualized as a pathline in the multi-dimensional microstructure space. Different processing operations are expected to change the microstructure in different ways, and therefore there are likely to be different processing pathlines intersecting at each microstructure. Identification (either experimentally or by modeling) of all the processing pathlines in the microstructure hull, corresponding to all available processing options that may be exercised, results in a large database (or a process network). Once such a database is established, it should be possible to identify the ideal hybrid process route (based on cost or another criterion such as sustainability) that will transform any selected initial microstructure into one that has been predicted to exhibit superior properties or performance characteristics.
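Once the process network is assembled, route identification reduces to a graph search. The following is a minimal sketch of that idea, assuming each edge of the network carries a scalar process cost; the network topology, costs, and process names below are invented for illustration rather than taken from the cited case studies.

```python
import heapq

def cheapest_route(edges, start, goal):
    """Dijkstra search over a process network.

    edges: {node: [(neighbor, cost, process_name), ...]}
    Nodes stand for microstructures (e.g., coarse-grained PCA weights).
    Returns (total_cost, [process steps]) or None if goal is unreachable.
    """
    frontier = [(0.0, start, [])]
    visited = set()
    while frontier:
        cost, node, route = heapq.heappop(frontier)
        if node == goal:
            return cost, route
        if node in visited:
            continue
        visited.add(node)
        for nxt, c, proc in edges.get(node, []):
            if nxt not in visited:
                heapq.heappush(frontier, (cost + c, nxt, route + [proc]))
    return None

# Illustrative network: microstructure states A..D linked by processing steps.
edges = {
    "A": [("B", 2.0, "anneal"), ("C", 5.5, "roll")],
    "B": [("D", 4.0, "roll")],
    "C": [("D", 1.0, "age")],
}
print(cheapest_route(edges, "A", "D"))  # -> (6.0, ['anneal', 'roll'])
```

The same search applies unchanged if edge costs encode sustainability or any other criterion mentioned above.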
The basic concept was evaluated recently in a simple case study aimed at transforming a given initial crystallographic texture in a sample into a desired one [123]. In all examples considered, the process network identified solutions that could not be arrived at by pure intuition or by brute-force methods (including constant performance maximization or repeated trials). There is a critical need to further develop this framework to include higher-order descriptions of microstructure (possibly based on n-point statistics) and higher-order homogenization theories (e.g., the MKS framework). This linkage of models that explicitly consider microstructure to process path design is a key underdeveloped technology within the ICME paradigm, as discussed earlier in reference to Fig. 2.

6. Uncertainty in ICME

One of the key enablers of the ICME paradigm is the ability to account for different aspects of uncertainty within models, experiments, and the design process. Uncertainty is a first-order concern in the process of simulation-supported, decision-based materials design. It is often overlooked in discussions of ICME and materials design, possibly because it is assumed that suitable approaches already exist within systems design and uncertainty management methods. This is generally not the case in addressing the nuances and complexity of multi-level materials design. Only two paragraphs were devoted to uncertainty in the NMAB ICME report [7], and little discussion was devoted to MDO aspects and the richness of required verification and validation approaches; however, these are perhaps among the most critical scientific obstacles to realizing ICME, owing to the nascent stage of their formulation. The ICME report focused more on model integration, i.e., assembling and networking modeling and simulation tools along with accompanying supportive experimental methods.

There are various sources of uncertainty in addressing ICME, inexhaustively listed without classification as [15]:
• Variability induced by processing, operating conditions, etc., and due to randomness of microstructure.
• Incomplete knowledge of model parameters due to insufficient or inaccurate data, including initial and boundary conditions.
• Uncertain structure of a model due to insufficient knowledge (necessitating approximations and simplifications) about a physical process such as process–structure or structure–property relations.
• Uncertainty in meshing and numerical solution methods.
• Propagation of uncertainty through a chain of models (amplification or attenuation).
• Experimental idealizations and measurement error.
• Lack of direct connection of measured quantities to model degrees of freedom.

Accounting for uncertainty involves four aspects: uncertainty quantification, uncertainty propagation, uncertainty mitigation, and uncertainty management. Uncertainty quantification is the process of identifying different sources of uncertainty and developing corresponding mathematical representations. There are perhaps several unique aspects of materials with regard to uncertainty quantification [168]:
• The long-range character of defect/structure interactions in materials necessitates complex nonlocal formulations of continuum field equations or corresponding coarse-grain internal variables. An economics analogy might be consumer reaction to market instabilities abroad.
• Use of 2-D images to reconstruct 3-D microstructures to estimate properties/responses is often necessitated by the lack of 3-D quantitative microstructure information, which is expensive and time-consuming to acquire; moreover, measured microstructures of new materials of interest to materials design do not yet exist, and predicted microstructures may not be thermodynamically feasible or may be metastable.
• Extreme value (rare event) problems are common in failure of materials, such as localization of damage within the microstructure in fracture or high cycle fatigue. An analogy might be insurance company estimation of probabilities of rare events such as 1000-year floods, tsunamis, and so forth.
• The large range of equations used in solving field equations of nonlinear, time-dependent, multi-physics behavior of materials, including constraints, side conditions, jump conditions, thresholds, etc., which may vary from group to group (i.e., little standardization).

Uncertainty propagation involves determining the effect of input uncertainties on the output uncertainties, both for individual models and chains of models. The combination of uncertainty representation and propagation is sometimes referred to as uncertainty analysis. Uncertainty mitigation utilizes techniques for reducing the impact of input uncertainties on the performance of the material system. Finally, uncertainty management involves deciding the levels of uncertainty that are appropriate for making design decisions, and prioritizing uncertainty reduction alternatives based on the tradeoff between efforts and benefits. While uncertainty quantification and propagation activities are carried out by analysts or model developers, uncertainty mitigation and management are activities performed by system designers during the design process. These four aspects and corresponding techniques are discussed in a fairly general way in Sections 6.1 through 6.4, followed by a brief discussion of some of the unique aspects associated with multiscale modeling and design of materials.

6.1. Uncertainty quantification

During the past few decades, a number of different classifications of uncertainty have been proposed. Different classifications have emerged within diverse fields such as economics and finance, systems engineering, management science, product development, and computational modeling and simulation. Uncertainty classification has also evolved over time; within economics, for example, the classification has evolved from (a) risk and uncertainty in the early 20th century, to (b) environmental and behavioral uncertainty in the mid 20th century, to the recent classification of fundamental uncertainty and ambiguity [169]. Uncertainty is also related to the concept of risk, which can be defined as a triplet of scenarios, likelihood of scenarios, and possible consequences [170]. Uncertainty gives rise to risk. While risk is always associated with undesirable consequences, uncertainty may have desirable consequences [171]. An understanding of these classifications is necessary to properly position the methods for design under uncertainty.

Within computational modeling and simulation, uncertainty has been classified from different standpoints. Chen et al. [172] classify uncertainty into three types: Type 1 uncertainty associated with inherent variation in the physical system, Type 2 uncertainty associated with lack of knowledge, and Type 3 uncertainty associated with known limitations of simulation models. With the specific goal of materials design, Choi et al. [173] classify uncertainty associated with different aspects of simulation models into model parameter uncertainty, model structural uncertainty, and propagated uncertainty.

The most widely adopted classification splits uncertainty into aleatory uncertainty and epistemic uncertainty. Aleatory uncertainty is due to the inherent randomness in material systems, such as nearest-neighbor distributions of grain size, orientation, and shape in metallic polycrystals. This uncertainty is also referred to as irreducible uncertainty, statistical variability, inherent uncertainty, or stochastic uncertainty. Aleatory uncertainty associated with parameters is commonly represented in terms of probability distributions. Probability theory is very well developed and hence has served as a language and mathematics for uncertainty quantification. This is one of the reasons that aleatory uncertainty has received significant attention both in engineering design and in computational materials science.

Epistemic uncertainty, on the other hand, refers to the lack of knowledge which may be due to the use of idealizations (simplified relationships between inputs and outputs), approximations, unanticipated interactions, or numerical errors. Examples include the size of the representative volume element (RVE) of microstructure used to predict properties or responses; simplified material constitutive models, including the form and/or structure of such models; and approximate model parameters. Epistemic uncertainty is also referred to as reducible uncertainty, subjective uncertainty, and/or model-form uncertainty. Attempts have been made to use a Bayesian approach to quantify epistemic uncertainty using probability functions. Under the Bayesian framework for epistemic uncertainty, the subjective interpretation of probability functions is used [174]. Subjective interpretation of probability differs from the frequentist interpretation in the sense that subjective probabilities refer to degree of belief rather than frequency of occurrence. The subjective interpretation allows for modeling all types of uncertainty (epistemic and aleatory) within a common framework, which is convenient for uncertainty analysis. The Bayesian approach involves determining prior probability distributions of uncertain parameters that represent the modeler's initial belief about uncertainty, and using Bayes' theorem to update the probability functions based on new pieces of information (e.g., from experiments).

However, by the very nature of epistemic uncertainty, it is inherently difficult or impossible to characterize probability functions associated with limited knowledge such as model idealization. An incorrect representation of uncertainty may result in wrong decisions. Hence, different types of mathematical techniques are needed for representing epistemic uncertainty. Alternative approaches for representing epistemic uncertainty include fuzzy set theory, evidence theory, possibility theory, and interval analysis [175]. Different representations are suitable under different situations.
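To make the Bayesian updating step above concrete, the following is a minimal sketch of a conjugate normal–normal update of a single uncertain model parameter; the prior, the assumed measurement variance, and the data are invented for illustration and do not come from the cited references.

```python
import numpy as np

# Conjugate normal-normal update of one uncertain model parameter
# (e.g., an interface energy) from noisy measurements.
prior_mean, prior_var = 1.0, 0.25      # modeler's initial belief
noise_var = 0.04                       # assumed measurement variance
data = np.array([1.32, 1.18, 1.25])    # hypothetical experiments

# Posterior for a normal prior and normal likelihood with known variance:
# posterior precision = prior precision + n * measurement precision.
post_prec = 1.0 / prior_var + len(data) / noise_var
post_var = 1.0 / post_prec
post_mean = post_var * (prior_mean / prior_var + data.sum() / noise_var)
print(f"posterior: mean={post_mean:.3f}, std={post_var**0.5:.3f}")
```

Each new measurement tightens the posterior, which is the sense in which the Bayesian framework reduces (epistemic) parameter uncertainty as information accumulates.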

While there has been significant progress in accounting for aleatory uncertainty using probability theory, little progress has been made on (a) holistic quantification of epistemic uncertainty within multiscale material models, and (b) simultaneous consideration of different mathematical constructs for representing both aleatory and epistemic uncertainty in an integrated manner. We regard this as one of the primary challenges facing ICME.

6.2. Uncertainty propagation

Within statistics, uncertainty propagation refers to the effect of uncertainties in independent variables on the dependent variables. In the context of simulation-supported design (decisions based at least in part on modeling and simulation), this translates to the effect of input uncertainty on the output uncertainty. Of course, the same general logic applies to input–output relations in experiments, which assert certain models in their own right and have error and approximation. In addition to the uncertainty at individual scales, materials design is associated with propagation of uncertainties in networks of interconnected models, which may cause either amplification or reduction of uncertainties in serial application of models. Hence, for the realization of ICME, techniques are required for (a) propagating uncertainty from the inputs of one model to its outputs, (b) propagating uncertainty from lower-level models to upper-level models within a model chain, and (c) propagating uncertainty from material composition to structure via process route, from material microstructure to material properties or responses, and from material properties to product performance.
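As a concrete illustration of items (a) and (b), the following minimal Monte Carlo sketch pushes an uncertain process input through a two-model chain. Both models are invented placeholders (loosely Hall–Petch-like in form), not models from this paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def structure_from_process(temp):
    """Lower-level (process-structure) placeholder: grain size from temperature."""
    return 10.0 + 0.05 * temp + rng.normal(0.0, 0.2, temp.shape)

def property_from_structure(grain_size):
    """Upper-level (structure-property) placeholder: strength from grain size."""
    return 300.0 + 150.0 / np.sqrt(grain_size)

temp = rng.normal(800.0, 20.0, 100_000)     # uncertain process input
grain = structure_from_process(temp)        # propagate through model 1
strength = property_from_structure(grain)   # propagate through model 2
print(f"strength: mean={strength.mean():.1f}, std={strength.std():.2f}")
```

The output scatter reflects both the input variability and the noise injected at the intermediate level, which is exactly the serial amplification or attenuation referred to above.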
A number of techniques, such as Monte Carlo techniques and global sensitivity analysis, are available for propagating uncertainty in the inputs (represented as random variables) to the outputs. Probability-based techniques have also been developed for propagating uncertainty across models. Lee and Chen [176] discuss five categories of methods for (typically aleatory) uncertainty propagation: Monte Carlo methods, local expansion based methods (e.g., Taylor series expansion), most probable point (MPP) based methods (e.g., the first-order and second-order reliability methods), functional expansion based methods (e.g., polynomial chaos expansion), and numerical integration based methods. A comparison of full factorial numerical integration, the univariate dimensional reduction method, and polynomial chaos expansion is presented in Ref. [176].

The primary challenge with Monte Carlo techniques is that they are computationally intensive in nature. Repeated execution of simulation models may be computationally expensive and generally infeasible when using multiscale models with a number of sources of uncertainty. Hence, efficient techniques for uncertainty propagation are necessary. Chen and co-authors [172] have developed techniques for efficient uncertainty propagation using metamodels, but the use of metamodels in multiscale modeling, where a sound physical basis must exist for models at fine scales, has not been fully acknowledged or pursued. There is strong suspicion that metamodels are too limited for this task, in general. Yin et al. [177] use random fields to model uncertainty within material microstructure and use dimension reduction techniques for efficient uncertainty propagation. Methods for propagation of uncertainties represented as intervals are also available [178]. However, approaches for propagating different types of uncertainty in a single framework are in their early stages.

A related issue is whether to engage description of stochasticity at each level of simulation in terms of parameter distributions (e.g., variation of microstructure, model parameters), or to use many deterministic realizations of microstructure and corresponding models/parameters extracted from assigned probabilistic distributions at some "primal" scale to propagate uncertainty to higher levels of predicted response(s). The latter approach is straightforward and tractable, but requires identification of some scale as a starting point at which finer-scale response can be treated as deterministic, which of course does not fully accord with physics. From a practical perspective, Table 1 suggests that each property or response may have a dominant minimal scale which can serve as a starting point for applying deterministic models with probability distributions of inputs to propagate through a multiscale modeling framework. On the other hand, the temptation with a fully stochastic approach is to avoid the hard but fruitful work of assessing the viability of such a scale, perhaps leading to an early concession to Bayesian approaches to provide answers where the physics is complex or microstructure-sensitive models are limited or avoided. In this regard, there are some recent works that merge the two approaches in multiscale modeling in a manner that may provide promising avenues for future applications [179].
6.3. Uncertainty mitigation

Uncertainty mitigation refers to reducing the effect of uncertainty on systems design. Techniques such as robust design and reliability-based design optimization (RBDO) have been developed in the systems design community to achieve desired system performance in the presence of uncertainty. Robust design techniques are based on the principle of minimizing the effects of uncertainty on system performance [180], whereas reliability-based design optimization techniques are based on finding optimized designs characterized by low probability of failure [181]. Note that reducing the effect of uncertainty on design using these approaches is different from reducing the (epistemic) uncertainty itself. In robust design and RBDO approaches, the levels of uncertainty are generally assumed to be fixed (or given). The goal is to determine the best design that is insensitive to the given uncertainty or has low probability of failure.

Robust design techniques are rooted in Taguchi's quality engineering methodology [182]. One of the earliest robust design approaches for systems design is the Robust Concept Exploration Method (RCEM) [183], which combines statistical techniques of surrogate modeling and compromise decision support problems [184] to perform robust design. RCEM has been extended to account for other sources of uncertainty (e.g., uncertainty in input variables) [185], and to develop product platforms [186]. The multi-disciplinary design optimization (MDO) community has developed techniques for optimizing systems in the presence of uncertainty [187–190].

Existing techniques for multi-level deterministic optimization have also been extended to account for uncertainty. An example is the extension of the Analytical Target Cascading (ATC) approach to Probabilistic Analytical Target Cascading (PATC) [191]. ATC is an approach for cascading overall system design targets to sub-system specifications based on a hierarchical multi-level decomposition [192], similar to the concurrent material and product design problem shown in Fig. 1, when layered with the understanding of Olson's Venn diagram. The subsystem design problems are formulated as minimum deviation optimization problems, and the outputs of the subsystem problems are used as inputs to the system-level problem. Within PATC, uncertainty is modeled in terms of random variables, and uncertainty propagation techniques are used to determine the impact of uncertainties at the bottom level on parameters at the top level.

While the analytical target cascading approach is based on choosing a design point during each iteration and repeating the optimization loops until convergence is achieved, approaches such as the Inductive Design Exploration Method (IDEM) use ranges of parameters and design variables [15,193,194]. The primary advantage of working with ranged sets of design points is the possibility of parallelization of the bottom–up simulations that project composition onto structure via process route, and structure onto properties.
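Before turning to IDEM in detail, a minimal sketch of the robust-design principle introduced at the start of this subsection may be useful: among candidate designs, prefer the one that minimizes a weighted sum of mean performance loss and its variability, rather than the nominal loss alone. The toy model and all numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def performance_loss(x, noise):
    """Toy loss: nominal optimum at x = 1, but sensitivity to the noisy
    input grows with x, penalizing the nominal optimum under scatter."""
    return (x - 1.0) ** 2 + x * noise

candidates = [0.0, 0.5, 1.0, 1.5]
noise = rng.normal(0.0, 0.3, 20_000)   # assumed input scatter
k = 3.0                                # weight on variability

scores = {}
for x in candidates:
    loss = performance_loss(x, noise)
    scores[x] = loss.mean() + k * loss.std()

best = min(scores, key=scores.get)
print({x: round(s, 2) for x, s in scores.items()}, "-> robust choice:", best)
```

Here the robust choice (x = 0.5) differs from the deterministic optimum (x = 1.0), illustrating that insensitivity to scatter, not nominal optimality, drives the selection.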

IDEM facilitates design while accounting for uncertainty in models and its propagation through chains of models. IDEM [193,194] has two major objectives: (i) to explore top–down, requirements-driven design, guiding bottom–up modeling and simulation, and (ii) to manage uncertainty in model chains. Fig. 6 illustrates the concept of IDEM.

Fig. 6. Inductive Design Exploration Method [15,193–195]. Step 1 involves bottom–up simulations or experiments, typically conducted in parallel fashion, to map composition into structure and then into properties, with regions in blue showing the feasible ranged set of points from these mappings. Step 2 involves top–down evaluation of points from the ranged set of specified performance requirements that overlap feasible regions (red) established as subsets of bottom–up simulations in Step 1. (For interpretation of the references to colour in this figure legend, the reader is referred to the web version of this article.)

We consider multiple spaces of different dimension (degrees of freedom), between which models, simulations, or experiments serve as mappings that project points in a given space (e.g., composition) to the next (e.g., microstructure) and to the final one (e.g., properties/responses). First we must configure the design system, including information flow between specific databases and modeling and simulation tools, and specify a ranged set of performance requirements. Then, the steps in IDEM, shown in Fig. 6, are as follows:
1. Perform parallel discrete-point evaluations at each level of the design process (bottom–up simulations and experiments).
2. Perform inductive (top–down) feasible space exploration based on metamodels from Step 1.
3. Obtain a robust solution with respect to model uncertainty.

Projected points need not comprise simply connected regions in each space, but we plot them as such for simplicity. There is also provision for reconfiguring the design system in case there is no overlap between performance requirements and feasible ranges of the bottom–up projections, and/or to mitigate uncertainty.

Mitigation of uncertainty is a second primary feature of IDEM. If two candidate solutions exist, say Designs 1 and 2, for which final performance is identical, then which design should we select? In general, we select solutions that lie furthest from the constraint boundaries in each space, including the boundaries of feasible regions established in Step 1 of IDEM (blue regions in Fig. 6). Current robust design methods cannot address this issue, since these methods only focus on the final performance range. The fundamental difference between IDEM and other multi-level design methods is that IDEM involves finding ranged sets of design specifications by passing feasible solution spaces from final performance requirements, by way of an interdependent response space, to the design space, while preserving the feasible solution space to the greatest extent possible.
Choi and co-authors [195] use the IDEM framework to design a multi-functional energetic structural material (MESM) using a multiscale model consisting of a microscale discrete particle mixture (DPM) model and a continuum non-equilibrium thermodynamic mixture (NTM) model. The inputs of the DPM model are the volume fractions and mean radii of the MESM constituents, and the output is the weighted average temperature of local hotspots. The inputs of the NTM model include volume fractions and the critical temperature for reaction initiation, and the output is the accumulated reaction product. For each input–output mapping, the input space is discretized using experimental design techniques. The model is executed at the discrete points considering various types of uncertainty (discussed in Section 6.1). Due to the uncertainties, each point maps to a subspace in the output space. Specific techniques have been developed to account for inputs shared across multiple levels. After generating the mappings between different levels, measures such as the Hyperdimensional Error-Margin Index (HD-EMI) are used to determine the robustness of the design to various forms of uncertainty while satisfying the problem constraints. Using the IDEM approach, Choi and co-authors [195] determine robust combinations of design variables (volume fractions and radii of constituents) to achieve the desired accumulation of reaction products. Other examples of IDEM implementation are discussed in [15,193,194].

The forward mappings in Step 1 of IDEM pertain either to multiscale models, in which information from a model at one scale is projected onto the inputs of higher-scale models, or to multi-level models, in which composition is mapped into structure and structure into properties. By remaining as far as possible from constraints, IDEM attempts to pursue solutions that are less sensitive to uncertainty, particularly in terms of solutions that avoid infeasible designs. The IDEM process accounts for the effect of aleatory uncertainty in the input parameters (such as randomness in particle sizes and randomness in simulated microstructure) on the outputs of the multiscale model, along with the propagation of aleatory uncertainty along a design chain. Instead of using a concurrent multiscale modeling approach, an information-passing approach is used in IDEM. The primary distinction is that instead of passing information about single design points throughout the model chain, ranged sets of solutions are passed across the different levels. This allows for independent execution of models at different scales, which simplifies the uncertainty propagation within models. Since the process of uncertainty analysis is decoupled from design exploration, IDEM is flexible enough to be extended to other forms of uncertainty using existing uncertainty propagation techniques. Recently, IDEM has been extended to account for model parameter uncertainty [196]. Further work needs to be carried out to account for other representations of epistemic uncertainty such as intervals and fuzzy sets.
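The ranged-set logic of Steps 1–2 can be sketched on a single toy mapping: inputs are retained only if their entire output range, under assumed input scatter, lies within the ranged performance requirement. The one-dimensional model below is an invented stand-in (not the DPM/NTM models), and the interval-bounding by sampling is a simplification of the HD-EMI machinery.

```python
import numpy as np

def model(x):
    return 4.0 * x * (1.0 - x)            # toy structure-property map

x_grid = np.linspace(0.0, 1.0, 21)        # Step 1: discrete evaluations
dx = 0.05                                  # assumed input uncertainty
requirement = (0.6, 1.0)                   # ranged performance target

feasible = []
for x in x_grid:
    # Bound the output over the uncertain input interval [x - dx, x + dx].
    samples = model(np.linspace(x - dx, x + dx, 11))
    if requirement[0] <= samples.min() and samples.max() <= requirement[1]:
        feasible.append(round(float(x), 2))  # Step 2: top-down retention

print("feasible ranged set of inputs:", feasible)
```

The retained set is a range of inputs rather than a single optimum, which is precisely what is passed down the chain in IDEM.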
6.4. Uncertainty management

Uncertainty management is the process of deciding the levels of uncertainty that are appropriate for design decisions, and prioritizing uncertainty reduction alternatives based on the tradeoff between efforts and benefits. The focus in uncertainty management is on deciding when to reduce epistemic uncertainty by gaining additional knowledge, and when to reduce the complexity of the design process by using simpler models and design processes. The notion of deliberately introducing uncertainty by using simplified models during the early conceptual design phases has received little attention from the engineering design and materials science communities. When designing a material system, the focus is on achieving properties such as strength and ductility that address performance requirements. In contrast, when managing uncertainty in the design process, focus is placed on design-process goals such as reducing the complexity of the design process and achieving the best design in the shortest possible time. The notion of designing the design process is also important because of (a) evolving simulation models, resulting in multiple fidelities of models at different stages of a design process, and (b) significant model development and execution costs, necessitating judicious use of computational resources. Moreover, the need for accurate information depends on whether the goal is to narrow down the alternatives to a specific class of materials (i.e., during the early design phase) or to optimize the composition and structure of a specific material (i.e., during the later stages of design). In the early phases of design, models and accurate representations of uncertainty may not be available; they may also not be necessary. Uncertainty management can be applied to problems such as refinement of simulation models [197], selection of models from a set of available models [198], and reducing the complexity of the design process [199].

A potential approach to uncertainty management is to utilize concepts and mathematical constructs from information economics [200,201], such as the value of information, to quantify the cost/benefit tradeoff and make the best design decisions in the presence of uncertainty. Information economics is the study of the costs and benefits of information in the context of decision-making. Metrics from information economics, specifically the expected value of information, are extended to account for both statistical variability and imprecision in models. The value of this added information is referred to as the improvement in a designer's decision-making ability. Mitigation of uncertainty in simulation models is effectively analogous to the acquisition of a new source of information, and the set of results from the execution of a refined model is analogous to additional information for making a decision.

Different measures have been developed for quantifying the benefit of uncertainty mitigation in individual models [197,202]. These include (a) the improvement potential, and (b) the process performance indicator (PPI). The improvement potential quantifies the maximum possible improvement in a designer's decision that can be achieved by refining a simulation model [197] (e.g., through increasing fidelity, improving the physical basis of the model, etc.). If the improvement potential is high, the design decision can possibly be improved by refining the simulation model. However, if the improvement potential is low, then the possibility of improving the design solution is low; hence the current level of refinement of the model is appropriate for decision-making. The main assumption is that the bounds on the outputs of models, which translate to bounds on payoff, are available throughout the design space. The improvement potential measure has been applied to problems involving simulation model refinement [197] and the coupling between simulation models at different scales [15,199].

The PPI is developed for scenarios where information about upper and lower bounds on the outputs of models is not available, but it is possible to generate accurate information at a few points in the design space [198,203]. This is generally the case when physical experiments are expensive and can only be executed at a few points in the design space. Another application of the PPI is when there are multiple models (or refinements of the same model) that a designer can choose from, and one of the models is known to be the most truthful. The PPI is calculated as the difference in payoff between the outcomes of decisions made using the most truthful model and the outcomes achievable using less complex models.

The improvement potential and the process performance indicator account for the benefits of gaining additional information by reducing epistemic uncertainty, but do not directly capture the effort involved in reducing uncertainty. This is due to a lack of holistic models that predict the effort involved in developing and executing simulation models. The effort models must be predictive in nature because they need to be employed before the simulation is developed or refined. An effort model for ICME would be based on cost models from software development and systems engineering [204]. The effort model accounts for three dimensions of effort: person hours, monetary cost, and computational time. Using this effort model, measures that account for both value and effort are under development.
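To ground the expected-value-of-information idea, here is a minimal sketch of an expected value of perfect information (EVPI) estimate; the toy payoff and distribution are invented, and this is a textbook construction rather than the specific measures of [197,202]. It compares the payoff of deciding now, under an uncertain model output, with the expected payoff of deciding after the uncertainty is resolved.

```python
import numpy as np

rng = np.random.default_rng(3)

def payoff(design, theta):
    return -(design - theta) ** 2          # toy payoff, maximized at theta

theta_samples = rng.normal(1.0, 0.5, 50_000)   # current model uncertainty
designs = np.linspace(-1.0, 3.0, 81)

# Decide now: pick the single design with the best expected payoff.
expected = [payoff(d, theta_samples).mean() for d in designs]
value_now = max(expected)

# Perfect information: for each realization the best achievable payoff is 0.
value_perfect = payoff(theta_samples, theta_samples).mean()

print(f"expected value of perfect information ~ {value_perfect - value_now:.3f}")
```

If the EVPI is small relative to the cost of refining the model or running more experiments, the uncertainty is not worth reducing, which is the essence of the effort/benefit tradeoff described above.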
Constructs based on information economics can also be used to choose between a concurrent multiscale modeling approach and a hierarchical multiscale modeling approach. In the case of concurrent multiscale modeling, models at various scales are tightly coupled with each other. In contrast, hierarchical multiscale modeling approaches typically involve passing information between models in the form of ranges of values or surrogate models. Due to the tight coupling in concurrent multiscale models, it is more challenging to identify the impact of specific sources of uncertainty and of specific models on the overall output uncertainty. Additionally, the impact of uncertainty propagation through individual models is difficult to evaluate. Hence, it is more challenging to determine which models within a multiscale framework should be refined to reduce the impact of uncertainty, or which models can be simplified to reduce the complexity of the process for initial design exploration purposes. Information economics can be used to strike a balance between the complexity of concurrent multiscale models and the loss of information in hierarchical multiscale models. These are very important practical considerations for tools addressing uncertainty in materials design.

Efficient and effective means for uncertainty management must be integrated with the materials design strategy, and can be leveraged tremendously by incorporation of parallelized computational evaluations of process–structure and structure–property relations. Clearly, the parametric nature of the underlying modeling and simulation necessary to quantify sensitivity to variation of material structure must rely on increasing computational power to enhance feasibility and utility.

7. Verification and validation

Verification and validation were defined at the beginning of Section 5. In the context of ICME, the pursuit of V&V consists of the following activities:
1. Individual model V&V: a single model focusing on single length and/or time scales.
2. Multiscale model V&V: a single model or coupled set of models comprising a simulation operating over multiple length/time scales in a concurrent or hierarchical manner.
3. Multi-physics model V&V: ensuring that the implementation of a modeling framework spanning multiple phenomena is consistent mathematically and physically.
4. Design process validation: ensuring that the configured design process will yield a solution that satisfies design requirements.
5. Design (outcome) validation: comparing design outcomes to system-level requirements.

[Fig. 7 graphic: cycle linking the reality of interest, the mathematical model (via modeling), the computer model (via computer implementation), and simulation outcomes (via simulation), with confirmation, verification, and validation as the assessment activities.]

Fig. 7. Model verification and validation process [206].

It is noted that this decomposition is much broader than typical notions of V&V applied either to models or to design processes. Models are used to guide design, and the design process guides the validation experiments.

7.1. Individual model V&V

Development of a computational model involves various steps, such as conceptual modeling, mathematical modeling, discretization and algorithm selection, computer programming, numerical solution, and solution representation [205]. Uncertainties and errors may be introduced during each of these steps. The process of verification and validation of simulation models is illustrated in Fig. 7. It consists of the following tasks [206]:
• Conceptual model validation (also called theoretical validation): the process of substantiating that the theories and assumptions underlying a model are correct and that the representation of the system is reasonable for the intended simulation study.
• Model verification: the process of assuring that a computerized model is a sufficiently accurate representation of the corresponding conceptual model.
• Operational validation: the process of determining whether the computerized model is sufficiently accurate for the needs of the simulation study.
• Data validation: the process of ensuring that all numerical data used to support the models are accurate and consistent.

Verification and validation of models has recently received significant attention in the simulation-based design community. The Bayesian framework is one of the commonly adopted frameworks for model validation [207]. Bayesian techniques for model validation have also been extended to include epistemic uncertainty regarding statistical quantities [208].
Another widely adopted technique involves concurrently validating models and exploring the design space [209]. The traditional view of validation of simulation models is that if the outputs of the models are close enough to the experimental data, the models are valid. However, this view is much too limiting in ICME, which is characterized by a hierarchy of material length scales and which focuses on designing novel materials that may not have been previously realized or characterized. Furthermore, it is essential to recognize that experimental data also have uncertainty. This uncertainty typically increases at lower scales, owing to an interplay between the spatial resolution of the measurement device/method and the duration of measurement signal accumulation. In fact, at the molecular level, the uncertainty in experiments due to signal resolution and accumulation may exceed that of models. It may be infeasible or too expensive to run the experiments needed for complete model validation before using the models for design. Hence, validation activities and design activities need to be carried out in an iterative manner.

There is another validation-related challenge in ICME due to its focus on design. When models are developed and employed to design materials that do not exist, the models cannot be experimentally validated before the system is designed. Experiments are also characterized by epistemic uncertainty, such as a lack of knowledge regarding the system configuration. Hence, validation using experiments must account for both the aleatory and epistemic uncertainty associated with the experiments. In that sense, the distinction between simulation models and experiments is blurred.

Due to the specialized nature of models, their reuse is currently severely limited in ICME. It is generally assumed that someone who is familiar with the complete implementation details of a model is always available to customize or adapt it for different scenarios. The notion of using models as black boxes is generally discouraged because their validity outside the ranges for which they were developed is unknown and generally undocumented. Currently, significant attention is not given to the reuse of models within ICME. While the models capture a significant amount of domain-specific knowledge, they are fragile in nature. Their fragility is not due to a non-robust implementation but due to a lack of documentation of their capabilities, scope of application, and limitations. Hence, the models cannot be used by system designers for integration into the design process as reusable, plug-and-play, "black box" units, as might be desired for a robust methodology. As ICME develops, we maintain that significant efforts are needed to embed knowledge and methods regarding the validity of models [210], so that they can be used by designers who may be generally familiar with the underlying physics but not familiar with all the internal details of the model and its implementation.

Finally, the validity of a model depends on its purpose. Model validation is effectively the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model. As discussed in the uncertainty management section above, designers may prefer to use simpler models during the early design phases to narrow down the design space. Hence, there is a need for different levels of scope, maturity, and fidelity of models within the design process. There is a need for a classification scheme to classify models based on their scope and levels of fidelity [211], perhaps relating to a combination of model scope, readiness, and material length scale level of input/output structure. Systems designers can use this classification scheme in association with information economics constructs to choose appropriate models that are valid within a given application context.
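One simple way (not prescribed by the cited works) to make operational validation respect uncertainty on both sides is to compare model predictions and experimental replicates through a normalized discrepancy rather than a point match. The data and the acceptance threshold below are invented; in practice the threshold is itself a design decision tied to the intended use of the model.

```python
import numpy as np

model_pred = np.array([102.0, 98.5, 101.2])      # replicate simulations
experiment = np.array([97.0, 99.5, 96.0, 98.0])  # replicate measurements

diff = model_pred.mean() - experiment.mean()
# Pool the sampling variances of the two means.
pooled_std = np.sqrt(model_pred.var(ddof=1) / model_pred.size
                     + experiment.var(ddof=1) / experiment.size)
z = abs(diff) / pooled_std
print(f"normalized discrepancy z = {z:.2f}",
      "-> acceptable" if z < 2.0 else "-> model inadequate for this use")
```

A model can pass such a check for one intended use and fail it for another, which is consistent with the purpose-dependent notion of validity discussed above.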

7.2. Multiscale model V&V

Valid models for given characteristic length and time scales may not result in valid multiscale models across scales. Hence, multiscale model V&V is an important aspect of the overall V&V strategy, and very little attention has been devoted to it. Multiscale model V&V consists of two aspects: (a) ensuring compatibility between the domains of validity for individual models, and (b) estimating the propagation of uncertainty through the chain of models.

Compatibility assessment is the process of determining whether the domain of inputs of an upper-level model is consistent with the domain of outputs of the lower-level model. Individual models are generally valid within a domain of inputs. The input domain can be defined as a hypercube in terms of the lower and upper bounds on the input variables, [x_lb, x_ub]. The set of inputs within this hypercube generates an output domain, which can be non-convex with disjoint sub-domains. This output domain serves as the input domain to a higher-level model. Multiscale model V&V involves ensuring that the output domain of the lower-level model is a subset of the valid input domain of the upper-level model. Otherwise, the multilevel model would result in erroneous system-level predictions. Techniques such as support vector machines have been used for mathematically quantifying the validity domains of individual models [212].

The second aspect of multiscale modeling V&V is related to the effect of uncertainty propagation across a chain of models. Uncertainty in models at lower material length scales and/or in the process–structure–property mappings may be amplified as information flows from lower-level to upper-level models. Both types of chains are relevant to materials design, the first in terms of multiscale modeling and the latter in terms of multi-level design. The goal of V&V is to ensure that the effects of uncertainty at the lower level(s) do not amplify beyond the desired uncertainty levels at which design decisions must be made. Uncertainty propagation can be viewed from both bottom–up and top–down perspectives. From the bottom–up perspective, the effect of lower-level uncertainties on system-level predictions is determined. From the top–down view, the allowable uncertainty in system-level predictions is used to determine the allowable uncertainty in lower-scale/level models.

7.3. Design process V&V

Design processes involve decisions regarding how to search the space of alternatives so that the design targets can be achieved at minimum cost. This involves decisions such as how to decompose the design problem, how to reduce the design space, how to sequence the fixation of design parameters in iterations, which models to develop, and how to invest resources in achieving the design objectives. These decisions affect not only the final design outcome but also the amount of effort required.

Analogous to model validation, where the goal is to ensure that models accurately predict the desired response, the goal of validating design processes is to ensure that a given design process will yield a solution that satisfies the design requirements. Design process validation is important to guide the selection of appropriate design methods and to support the development and evaluation of new methods. In simulation-based multilevel design, the design processes represent the manner in which the networks of decisions and simulation models are configured. One of the approaches for validating design methods is the validation square [213,214]. The validation square, shown in Fig. 8, consists of four phases:
1. Theoretical structural validation: internal consistency of the design method, i.e., the logical soundness of its constructs both individually and integrated.
2. Empirical structural validation: the appropriateness of the chosen example problem(s) intended to test the design method.
3. Empirical performance validation: the ability to produce useful results for the chosen example problem(s).
4. Theoretical performance validation: the ability to produce useful results beyond the chosen example problem(s). This requires a 'leap of faith' which is eased by the process (1)–(3) of building confidence in the general usefulness of the design method.

[Fig. 8 graphic: the validation square, with quadrants for theoretical structural validity, empirical structural validity, empirical performance validity, and theoretical performance validity.]
Fig. 8. Four phases in the validation square [213,214].

7.4. Design (outcome) validation

Li et al. [215] argue that validation should be performed on the designs rather than on the computational models used to obtain them. The goal of design validation is to gain confidence in the resulting design (of the material) rather than in the simulation model(s) used for design. Validation of the design involves determining whether the system as designed will perform as expected. The design outcomes are evaluated against the system-level design requirements. This is generally carried out using experiments.
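Before closing the discussion of V&V, the compatibility assessment of Section 7.2 admits a simple numerical sketch: verify that the outputs a lower-level model produces over its valid input hypercube stay inside the validated input domain of the upper-level model. The models and bounds below are invented placeholders, and the sampling-based bound is a crude substitute for the SVM-based domain descriptions of [212].

```python
import numpy as np

rng = np.random.default_rng(4)

def lower_model(x):
    """Placeholder lower-level model: maps 2 inputs to 1 output."""
    return x[:, 0] ** 2 + 0.5 * x[:, 1]

x_lb, x_ub = np.array([0.0, 0.0]), np.array([1.0, 2.0])  # lower-model domain
upper_valid = (0.2, 1.8)            # validated input range of the upper model

# Sample the lower-level hypercube and bound its output domain.
x = x_lb + (x_ub - x_lb) * rng.random((100_000, 2))
y = lower_model(x)
compatible = upper_valid[0] <= y.min() and y.max() <= upper_valid[1]
print(f"output range [{y.min():.2f}, {y.max():.2f}] -> compatible: {compatible}")
```

A result of "compatible: False", as here, flags exactly the situation described above, in which the chained multilevel model would be exercised outside the validated domain of the upper-level model.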
8. Closure: advancing computational modeling and simulation to pursue the ICME vision

7.3. Design process V&V The vision of Integrated Computational Materials Engineering is compelling in terms of value-added technology that reduces Design processes involve decisions regarding how to search the time to market for new products that exploit advanced, tailored space of alternatives so that the design targets can be achieved at materials. This article addresses underdeveloped concepts/tools a minimum cost. It involves decisions such as how to decompose within the portfolio, as well as some important points regarding the design problem, how to reduce the design space, sequencing the contribution of multiscale modeling to the overall materials fixation of design parameters in iterations, which models to design process. Effectively, the role of computational modeling and develop, how to invest the resources in achieving the design simulation is so pervasive that the probability of success of ICME objectives, etc. These decisions affect not only the final design is inextricably intertwined with addressing certain scientific and outcome but also affect the amount of effort required. technological ‘‘pressure points’’ as follows: Analogous to model validation, where the goal is to ensure that models accurately predict the desired response, the goal • ICME extends beyond informatics and combinatorics pursued of validating design processes is to ensure that a given design at the level of individual phases, addressing hierarchical mi- process will yield a solution that satisfies the design requirements. crostructures/structures of materials and embracing the ‘‘sys- Design process validation is important to guide the selection tems’’ character of integrating materials design with product of appropriate design methods and to support the development design and development. An engineering systems approach is and evaluation of new methods. In simulation-based multilevel essential. design, the design processes represent the manner in which the • Recognition of material structure hierarchy is of first order im- networks of decisions and simulation models are configured. One portance in multiscale modeling and multi-level materials de- of the approaches for validating design methods is the validation sign. Multiscale modeling and multi-level materials design are square [213,214]. The validation square, shown in Fig. 8, consists complementary but distinct enterprises, with the former serv- of four phases: ing the needs of the latter in the context of ICME. J.H. Panchal et al. / Computer-Aided Design 45 (2013) 4–25 21


• Novel methods for inverse design including knowledge systems References and process path design are underdeveloped and require atten- tion. [1] Ashby MF. Materials selection in mechanical design. 2nd ed. Oxford, UK: • Uncertainty is a pervasive and complex issue to address, with Butterworth-Heinemann; 1999. [2] Broderick S, Suh C, Nowers J, Vogel B, Mallapragada S, Narasimhan B, et al. uncertainty quantification, propagation, mitigation and man- Informatics for combinatorial materials science. JOM Journal of the Minerals, agement carefully distinguished. Improvement of physically- Metals and Materials Society 2008;60:56–9. based models provides a promising foundational technology. [3] Billinge SJE, Rajan K, Sinnot SB. From cyberinfrastructure to cyberdis- covery in materials science: enhancing outcomes in materials research, Approaches to uncertainty that recognize and embrace the full Report of NSF-Sponsored workshop held in Arlington, VA, August 3–5. complexity of ICME are in their infancy. http://www.mcc.uiuc.edu/nsf/ciw_2006/, accessed October 19, 2011. See • Integration of multiscale modeling and multi-level design of also http://www.nsf.gov/crssprgm/cdi/, accessed October 19, 2011, 2006. materials with manufacturing and systems optimization is [4] Liu ZK. First principles calculations and Calphad modeling of thermodynam- ics. Journal of Phase Equilibria and Diffusion 2009;30:517–34. essential to pursuing the vision of ICME. [5] Hautier G, Fischer C, Ehrlacher V, Jain A, Ceder G. Data mined ionic In advancing methodologies and tools to address ICME, it is substitutions for the discovery of new compounds. Inorganic Chemistry 2011;50:656–63. essential to maintain a consistent long term vision among industry, [6] Dudiy SV, Zunger A. Searching for alloy configurations with target physical academia, and government. A critical assessment of prospects for properties: impurity design via a genetic algorithm inverse band structure progress towards the ICME vision in this regard was recently approach. Physical Review Letters 2006;97:046401. (4 pages). [7] Pollock TM, Allison JE, Backman DG, Boyce MC, Gersh M, Holm EA, offered by McDowell [216]. ICME is an engineering-directed et al. Integrated computational materials engineering: A transformational activity, requiring distributed, collaborative systems based design. discipline for improved competitiveness and national security. National Clearly, one consistent thread throughout the past two decades Materials Advisory Board, NAE, National Academies Press; 2008. Report of developing this field of integrated design of materials and Number: ISBN-10: 0-309-11999-5. [8] Apelian D, Alleyne A, Handwerker CA, Hopkins D, Isaacs JA, Olson GB, et al. products is the trend towards increasing emphasis on the role Accelerating technology transition: Bridging the valley of death for materials of computational materials science and mechanics across a broad and processes in defense systems. National Materials Advisory Board, NAE, range of length and time scales, depending on application. Much is National Academies Press; 2004. Report Number: ISBN-10: 0-309-09317-1. [9] Allison J, Backman D, Christodoulou L. Integrated computational materials to be gained by extending beyond the conventional computational engineering: a new paradigm for the global materials profession. JOM Journal materials science community to incorporate important advances in of the Minerals, Metals and Materials Society 2006;58:25–7. 
In advancing methodologies and tools to address ICME, it is essential to maintain a consistent long-term vision among industry, academia, and government. A critical assessment of prospects for progress towards the ICME vision in this regard was recently offered by McDowell [216]. ICME is an engineering-directed activity, requiring distributed, collaborative, systems-based design. Clearly, one consistent thread throughout the past two decades of development of this field of integrated design of materials and products is the trend towards increasing emphasis on the role of computational materials science and mechanics across a broad range of length and time scales, depending on the application. Much is to be gained by extending beyond the conventional computational materials science community to incorporate important advances in multiscale modeling and design optimization made in the computational mechanics of materials and structures communities, as well as the MDO and manufacturing communities. The tools and methods that are foundational to ICME effectively constitute the elements of an expanded 'materials innovation infrastructure' in the Materials Genome Initiative [16], as shown in Fig. 9.

Acknowledgments

JHP acknowledges financial support from National Science Foundation awards CMMI-0954447 and CMMI-1042350. SRK acknowledges support from Office of Naval Research (ONR) award N00014-11-1-0759. DLM acknowledges the support of the Carter N. Paden, Jr. Distinguished Chair in Metals Processing, as well as the NSF Industry/University Cooperative Research Center for Computational Materials Design (CCMD), supported by CCMD members through grants NSF IIP-0541674 (Penn State) and IIP-541678 (Georgia Tech). Any opinions, findings and conclusions or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the NSF or ONR.

References

[1] Ashby MF. Materials selection in mechanical design. 2nd ed. Oxford, UK: Butterworth-Heinemann; 1999.
[2] Broderick S, Suh C, Nowers J, Vogel B, Mallapragada S, Narasimhan B, et al. Informatics for combinatorial materials science. JOM Journal of the Minerals, Metals and Materials Society 2008;60:56–9.
[3] Billinge SJE, Rajan K, Sinnott SB. From cyberinfrastructure to cyberdiscovery in materials science: enhancing outcomes in materials research. Report of NSF-sponsored workshop held in Arlington, VA, August 3–5, 2006. http://www.mcc.uiuc.edu/nsf/ciw_2006/, accessed October 19, 2011. See also http://www.nsf.gov/crssprgm/cdi/, accessed October 19, 2011.
[4] Liu ZK. First principles calculations and Calphad modeling of thermodynamics. Journal of Phase Equilibria and Diffusion 2009;30:517–34.
[5] Hautier G, Fischer C, Ehrlacher V, Jain A, Ceder G. Data mined ionic substitutions for the discovery of new compounds. Inorganic Chemistry 2011;50:656–63.
[6] Dudiy SV, Zunger A. Searching for alloy configurations with target physical properties: impurity design via a genetic algorithm inverse band structure approach. Physical Review Letters 2006;97:046401 (4 pages).
[7] Pollock TM, Allison JE, Backman DG, Boyce MC, Gersh M, Holm EA, et al. Integrated computational materials engineering: a transformational discipline for improved competitiveness and national security. National Materials Advisory Board, NAE, National Academies Press; 2008. Report Number: ISBN-10: 0-309-11999-5.
[8] Apelian D, Alleyne A, Handwerker CA, Hopkins D, Isaacs JA, Olson GB, et al. Accelerating technology transition: bridging the valley of death for materials and processes in defense systems. National Materials Advisory Board, NAE, National Academies Press; 2004. Report Number: ISBN-10: 0-309-09317-1.
[9] Allison J, Backman D, Christodoulou L. Integrated computational materials engineering: a new paradigm for the global materials profession. JOM Journal of the Minerals, Metals and Materials Society 2006;58:25–7.
[10] Olson GB. Computational design of hierarchically structured materials. Science 1997;277:1237–42.
[11] McDowell DL, Story TL. New directions in materials design science and engineering (MDS&E). Report of a NSF DMR-sponsored workshop, October 19–21, 1998. http://www.me.gatech.edu/paden/material-design/md_se.pdf, accessed May 5, 2012.
[12] McDowell DL. Simulation-assisted materials design for the concurrent design of materials and products. JOM Journal of the Minerals, Metals and Materials Society 2007;59:21–5.
[13] McDowell DL, Olson GB. Concurrent design of hierarchical materials and structures. Scientific Modeling and Simulation 2008;15:207–40.
[14] McDowell DL, Backman D. Simulation-assisted design and accelerated insertion of materials. In: Ghosh S, Dimiduk D, editors. Computational methods for microstructure-property relationships. Springer; 2010.
[15] McDowell DL, Panchal JH, Choi H-J, Seepersad CC, Allen JK, Mistree F. Integrated design of multiscale, multifunctional materials and products. Elsevier; 2009.
[16] Materials genome initiative for global competitiveness. National Science and Technology Council, Committee on Technology, June 2011. http://www.whitehouse.gov/sites/default/files/microsites/ostp/materials_genome_initiative-final.pdf, accessed May 5, 2012.
[17] Freiman S, Madsen LD, Rumble J. A perspective on materials databases. American Ceramic Society Bulletin 2011;90:28–32.
[18] Milton GW. The theory of composites. Cambridge monographs on applied and computational mathematics, 2001.
[19] Torquato S. Random heterogeneous materials. New York: Springer-Verlag; 2002.

[20] Fullwood DT, Niezgoda SR, Adams BL, Kalidindi SR. Microstructure sensitive design for performance optimization. Progress in Materials Science 2010;55:477–562.
[21] Niezgoda SR, Yabansu YC, Kalidindi SR. Understanding and visualizing microstructure and microstructure variance as a stochastic process. Acta Materialia 2011;59:6387–400.
[22] Kalidindi SR, Niezgoda SR, Salem AA. Microstructure informatics using higher-order statistics and efficient data-mining protocols. JOM Journal of the Minerals, Metals and Materials Society 2011;63:34–41.
[23] McDowell DL. Materials design: a useful research focus for inelastic behavior of structural metals. Theoretical and Applied Fracture Mechanics 2001;37:245–59. Special issue on the prospects of mesomechanics in the 21st century: current thinking on multiscale mechanics problems.
[24] McDowell DL. Viscoplasticity of heterogeneous metallic materials. Materials Science and Engineering R 2008;62:67–123.
[25] McDowell DL. A perspective on trends in multiscale plasticity. International Journal of Plasticity 2010;26:1280–309.
[26] Ziegler H. In: Becker E, Budiansky B, Koiter WT, Lauwerier HA, editors. An introduction to thermomechanics. 2nd ed. North Holland series in applied mathematics and mechanics, vol. 21. Amsterdam–New York: North Holland; 1983.
[27] Drucker DC. Material response and continuum relations: or from microscales to macroscales. ASME Journal of Engineering Materials and Technology 1984;106:286–9.
[28] McDowell DL. Internal state variable theory. In: Yip S, Horstemeyer MF, editors. Handbook of materials modeling, Part A: methods. The Netherlands: Springer; 2005. p. 1151–70.
[29] Gumbsch P. An atomistic study of brittle fracture: toward explicit failure criteria from atomistic modeling. Journal of Materials Research 1995;10:2897–907.
[30] Weinan E, Huang Z. Matching conditions in atomistic-continuum modeling of materials. Physical Review Letters 2001;87:135501.
[31] Shilkrot LE, Curtin WA, Miller RE. A coupled atomistic/continuum model of defects in solids. Journal of the Mechanics and Physics of Solids 2002;50:2085–106.
[32] Shilkrot LE, Miller RE, Curtin WA. Multiscale plasticity modeling: coupled atomistics and discrete dislocation mechanics. Journal of the Mechanics and Physics of Solids 2004;52:755–87.
[33] Shiari B, Miller RE, Curtin WA. Coupled atomistic/discrete dislocation simulations of nanoindentation at finite temperature. ASME Journal of Engineering Materials and Technology 2005;127:358–68.
[34] Qu S, Shastry V, Curtin WA, Miller RE. A finite-temperature dynamic coupled atomistic/discrete dislocation method. Modelling and Simulation in Materials Science and Engineering 2005;13:1101–18.
[35] Nair AK, Warner DH, Hennig RG, Curtin WA. Coupling quantum and continuum scales to predict crack tip dislocation nucleation. Scripta Materialia 2010;63:1212–5.
[36] Tadmor EB, Ortiz M, Phillips R. Quasicontinuum analysis of defects in solids. Philosophical Magazine A 1996;73:1529–63.
[37] Tadmor EB, Phillips R, Ortiz M. Mixed atomistic and continuum models of deformation in solids. Langmuir 1996;12:4529–34.
[38] Shenoy VB, Miller R, Tadmor EB, Phillips R, Ortiz M. Quasicontinuum models of interfacial structure and deformation. Physical Review Letters 1998;80:742–5.
[39] Miller R, Tadmor EB, Phillips R, Ortiz M. Quasicontinuum simulation of fracture at the atomic scale. Modelling and Simulation in Materials Science and Engineering 1998;6:607–38.
[40] Shenoy VB, Miller R, Tadmor EB, Rodney D, Phillips R, Ortiz M. An adaptive finite element approach to atomic-scale mechanics — the quasicontinuum method. Journal of the Mechanics and Physics of Solids 1999;47:611–42.
[41] Curtarolo S, Ceder G. Dynamics of an inhomogeneously coarse grained multiscale system. Physical Review Letters 2002;88:255504.
[42] Zhou M, McDowell DL. Equivalent continuum for dynamically deforming atomistic particle systems. Philosophical Magazine A 2002;82:2547–74.
[43] Chung PW, Namburu RR. On a formulation for a multiscale atomistic-continuum homogenization method. International Journal of Solids and Structures 2003;40:2563–88.
[44] Zhou M. Thermomechanical continuum representation of atomistic deformation at arbitrary size scales. Proceedings of the Royal Society A 2005;461:3437–72.
[45] Dupuy LM, Tadmor EB, Miller RE, Phillips R. Finite-temperature quasicontinuum: molecular dynamics without all the atoms. Physical Review Letters 2005;95:060202.
[46] Ramasubramaniam A, Carter EA. Coupled quantum-atomistic and quantum-continuum mechanics methods in materials research. MRS Bulletin 2007;32:913–8.
[47] Rudd RE, Broughton JQ. Coarse-grained molecular dynamics and the atomic limit of finite elements. Physical Review B 1998;58:R5893–6.
[48] Rudd RE, Broughton JQ. Concurrent coupling of length scales in solid state systems. Physica Status Solidi B-Basic Research 2000;217:251–91.
[49] Rudd RE, Broughton JQ. Coarse-grained molecular dynamics: nonlinear finite elements and finite temperature. Physical Review B 2005;72:144104.
[50] Kulkarni Y, Knap J, Ortiz M. A variational approach to coarse-graining of equilibrium and non-equilibrium atomistic description at finite temperature. Journal of the Mechanics and Physics of Solids 2008;56:1417–49.
[51] Chen Y. Reformulation of microscopic balance equations for multiscale materials modeling. Journal of Chemical Physics 2009;130:134706.
[52] Wang YU, Jin YM, Cuitino AM, Khachaturyan AG. Nanoscale phase field microelasticity theory of dislocations: model and 3-D simulations. Acta Materialia 2001;49:1847–57.
[53] Chen L-Q. Phase-field models for microstructure evolution. Annual Review of Materials Research 2002;32:113–40.
[54] Shen C, Wang Y. Modeling dislocation network and dislocation–precipitate interaction at mesoscopic scale using phase field method. International Journal of Multiscale Computational Engineering 2003;1:91–104.
[55] Belak J, Turchi PEA, Dorr MR, Richards DF, Fattebert J-L, Wickett ME, et al. Coupling of atomistic and meso-scale phase-field modeling of rapid solidification, in: 2009 APS March Meeting, March 16–20, Pittsburgh, PA, 2009.
[56] Ghosh S, Lee K, Raghavan P. A multi-level computational model for multi-scale damage analysis in composite and porous materials. International Journal of Solids and Structures 2001;38:2335–85.
[57] Liu WK, Karpov EG, Zhang S, Park HS. An introduction to computational nanomechanics and materials. Computer Methods in Applied Mechanics and Engineering 2004;193:1529–78.
[58] Liu WK, Park HS, Qian D, Karpov EG, Kadowaki H, Wagner GJ. Bridging scale methods for nanomechanics and materials. Computer Methods in Applied Mechanics and Engineering 2006;195:1407–21.
[59] Oden JT, Prudhomme S, Romkes A, Bauman PT. Multiscale modeling of physical phenomena: adaptive control of models. SIAM Journal on Scientific Computing 2006;29:2359–89.
[60] Ghosh S, Bai J, Raghavan P. Concurrent multi-level model for damage evolution in microstructurally debonding composites. Mechanics of Materials 2007;39:241–66.
[61] Nuggehally MA, Shephard MS, Picu CR, Fish J. Adaptive model selection procedure for concurrent multiscale problems. Journal of Multiscale Computational Engineering 2007;5:369–86.
[62] Hao S, Moran B, Liu WK, Olson GB. A hierarchical multi-physics model for design of high toughness steels. Journal of Computer-Aided Materials Design 2003;10:99–142.
[63] Hao S, Liu WK, Moran B, Vernerey F, Olson GB. Multi-scale constitutive model and computational framework for the design of ultra-high strength, high toughness steels. Computer Methods in Applied Mechanics and Engineering 2004;193:1865–908.
[64] Shenoy M, Tjiptowidjojo Y, McDowell DL. Microstructure-sensitive modeling of polycrystalline IN 100. International Journal of Plasticity 2008;24:1694–730.
[65] Groh S, Marin EB, Horstemeyer MF, Zbib HM. Multiscale modeling of the plasticity in an aluminum single crystal. International Journal of Plasticity 2009;25:1456–73.
[66] Suquet PM. Homogenization techniques for composite media. Lecture notes in physics. Berlin: Springer; 1987.
[67] Mura T. Micromechanics of defects in solids. 2nd ed. The Netherlands: Kluwer Academic Publishers; 1987.
[68] Lebensohn RA, Liu Y, Ponte Castañeda P. Macroscopic properties and field fluctuations in model power-law polycrystals: full-field solutions versus self-consistent estimates. Proceedings of the Royal Society of London A 2004;460:1381–405.
[69] Warner DH, Sansoz F, Molinari JF. Atomistic based continuum investigation of plastic deformation in nanocrystalline copper. International Journal of Plasticity 2006;22:754–74.
[70] Capolungo L, Spearot DE, Cherkaoui M, McDowell DL, Qu J, Jacob K. Dislocation nucleation from bicrystal interfaces and grain boundary ledges: relationship to nanocrystalline deformation. Journal of the Mechanics and Physics of Solids 2007;55:2300–27.
[71] Needleman A, Rice JR. Plastic creep flow effects in the diffusive cavitation of grain-boundaries. Acta Metallurgica 1980;28:1315–32.
[72] Moldovan D, Wolf D, Phillpot SR, Haslam AJ. Role of grain rotation during grain growth in a columnar microstructure by mesoscale simulation. Acta Materialia 2002;50:3397–414.
[73] Cleri F, D'Agostino G, Satta A, Colombo L. Microstructure evolution from the atomic scale up. Computational Materials Science 2002;24:21–7.
[74] Kouznetsova V, Geers MGD, Brekelmans WAM. Multi-scale constitutive modelling of heterogeneous materials with a gradient-enhanced computational homogenization scheme. International Journal for Numerical Methods in Engineering 2002;54:1235–60.
[75] Kouznetsova VG, Geers MGD, Brekelmans WAM. Multi-scale second-order computational homogenization of multi-phase materials: a nested finite element solution strategy. Computer Methods in Applied Mechanics and Engineering 2004;193:5525–50.
[76] Larsson R, Diebels S. A second-order homogenization procedure for multi-scale analysis based on micropolar kinematics. International Journal of Numerical Methods in Engineering 2007;69:2485–512.
[77] Vernerey F, Liu WK, Moran B. Multi-scale micromorphic theory for hierarchical materials. Journal of the Mechanics and Physics of Solids 2007;55:2603–51.
[78] Luscher DJ, McDowell DL, Bronkhorst CA. A second gradient theoretical framework for hierarchical multiscale modeling of materials. International Journal of Plasticity 2010;26:1248–75.

[79] Luscher DJ, McDowell DL, Bronkhorst CA. Essential features of fine scale boundary conditions for second gradient multiscale homogenization of statistical volume elements. Journal of Multiscale Computational Engineering (in press).
[80] Amodeo RJ, Ghoniem NM. A review of experimental observations and theoretical models of dislocation cells and subgrains. Res Mechanica 1988;23:137–60.
[81] Kubin LP, Canova G. The modeling of dislocation patterns. Scripta Metallurgica 1992;27:957–62.
[82] Groma I. Link between the microscopic and mesoscopic length-scale description of the collective behavior of dislocations. Physical Review B 1997;56:5807–13.
[83] Zaiser M. Statistical modeling of dislocation systems. Materials Science and Engineering A 2001;309–310:304–15.
[84] Arsenlis A, Parks DM. Crystallographic aspects of geometrically-necessary and statistically-stored dislocation density. Acta Materialia 1999;47:1597–611.
[85] Zbib HM, de la Rubia TD, Bulatov V. A multiscale model of plasticity based on discrete dislocation dynamics. ASME Journal of Engineering Materials and Technology 2002;124:78–87.
[86] Zbib HM, de la Rubia TD. A multiscale model of plasticity. International Journal of Plasticity 2002;18:1133–63.
[87] Zbib HM, Khaleel MA. Recent advances in multiscale modeling of plasticity. International Journal of Plasticity 2004;20(6):979–1182.
[88] Rickman JM, LeSar R. Issues in the coarse-graining of dislocation energetics and dynamics. Scripta Materialia 2006;54:735–9.
[89] Ohashi T, Kawamukai M, Zbib H. A multiscale approach for modeling scale-dependent yield stress in polycrystalline metals. International Journal of Plasticity 2007;23:897–914.
[90] Acharya A. A model of crystal plasticity based on the theory of continuously distributed dislocations. Journal of the Mechanics and Physics of Solids 2001;49:761–84.
[91] Acharya A. Driving forces and boundary conditions in continuum dislocation mechanics. Proceedings of the Royal Society of London A 2003;459:1343–63.
[92] Acharya A, Beaudoin AJ, Miller R. New perspectives in plasticity theory: dislocation nucleation, waves and partial continuity of the plastic strain rate. Mathematics and Mechanics of Solids 2008;13:292–315.
[93] Acharya A, Roy A, Sawant A. Continuum theory and methods for coarse-grained, mesoscopic plasticity. Scripta Materialia 2006;54:705–10.
[94] Kuhlmann-Wilsdorf D. Theory of plastic deformation: properties of low energy dislocation structures. Materials Science and Engineering A 1989;113:1–41.
[95] Leffers T. Lattice rotations during plastic deformation with grain subdivision. Materials Science Forum 1994;157–162:1815–20.
[96] Hughes DA, Liu Q, Chrzan DC, Hansen N. Scaling of microstructural parameters: misorientations of deformation induced boundaries. Acta Materialia 1997;45:105–12.
[97] Ortiz M, Repetto EA, Stainier L. A theory of subgrain dislocation structures. Journal of the Mechanics and Physics of Solids 2000;48:2077–114.
[98] Carstensen C, Hackl K, Mielke A. Non-convex potentials and microstructures in finite-strain plasticity. Proceedings of the Royal Society of London A 2002;458:299–317.
[99] Li J. The mechanics and physics of defect nucleation. MRS Bulletin 2007;32:151–9.
[100] Zhu T, Li L, Samanta A, Kim HG, Suresh S. Interfacial plasticity governs strain rate sensitivity and ductility in nanostructured metals. Proceedings of the National Academy of Sciences of the United States of America 2007;104:3031–6.
[101] Zhu T, Li L, Samanta A, Leach A, Gall K. Temperature and strain-rate dependence of surface dislocation nucleation. Physical Review Letters 2008;100:025502.
[102] Deo CS, Srolovitz DJ. First passage time Markov chain analysis of rare events for kinetic Monte Carlo: double kink nucleation during dislocation glide. Modelling and Simulation in Materials Science and Engineering 2002;10:581–96.
[103] Li J, Kevrekidis PG, Gear CW, Kevrekidis IG. Deciding the nature of the coarse equation through microscopic simulations: the baby-bathwater scheme. SIAM Review 2007;49:469–87.
[104] Needleman A. Micromechanical modelling of interfacial decohesion. Ultramicroscopy 1992;40(3):203–14.
[105] Camacho G, Ortiz M. Computational modeling of impact damage in brittle materials. International Journal of Solids and Structures 1996;33:2899–938.
[106] Zhai J, Tomar V, Zhou M. Micromechanical simulation of dynamic fracture using the cohesive finite element method. Journal of Engineering Materials and Technology, ASME 2004;126:179–91.
[107] Tomar V, Zhou M. Deterministic and stochastic analyses of fracture processes in a brittle microstructure system. Engineering Fracture Mechanics 2005;72:1920–41.
[108] Hao S, Moran B, Liu W-K, Olson GB. A hierarchical multi-physics model for design of high toughness steels. Journal of Computer-Aided Materials Design 2003;10:99–142.
[109] CES Selector, Granta Material Intelligence, 2008.
[110] Bendsoe MP, Sigmund O. Topology optimization: theory, methods and applications. New York: Springer-Verlag; 2003.
[111] Sigmund O, Torquato S. Design of materials with extreme thermal expansion using a three-phase topology optimization method. Journal of the Mechanics and Physics of Solids 1997;45:1037–67.
[112] Sigmund O. Materials with prescribed constitutive parameters: an inverse homogenization problem. International Journal of Solids and Structures 1994;31:2313–29.
[113] Sigmund O. Tailoring materials with prescribed elastic properties. Mechanics of Materials 1995;20:351–68.
[114] Gibiansky LV, Sigmund O. Multiphase composites with extremal bulk modulus. Journal of the Mechanics and Physics of Solids 2000;48:461–98.
[115] Neves MM, Rodrigues H, Guedes JM. Optimal design of periodic linear elastic microstructures. Computers & Structures 2000;76:421–9.
[116] Gu S, Lu TJ, Evans AG. On the design of two-dimensional cellular metals for combined heat dissipation and structural load capacity. International Journal of Heat and Mass Transfer 2001;44:2163–75.
[117] Knezevic M, Kalidindi SR. Fast computation of first-order elastic-plastic closures for polycrystalline cubic-orthorhombic microstructures. Computational Materials Science 2007;39:643–8.
[118] Adams BL, Henrie A, Henrie B, Lyon M, Kalidindi SR, Garmestani H. Microstructure-sensitive design of a compliant beam. Journal of the Mechanics and Physics of Solids 2001;49:1639–63.
[119] Proust G, Kalidindi SR. Procedures for construction of anisotropic elastic-plastic property closures for face-centered cubic polycrystals using first-order bounding relations. Journal of the Mechanics and Physics of Solids 2006;54:1744–62.
[120] Houskamp JR, Proust G, Kalidindi SR. Integration of microstructure-sensitive design with finite element methods: elastic-plastic case studies in FCC polycrystals. International Journal for Multiscale Computational Engineering 2007;5:261–72.
[121] Knezevic M, Kalidindi SR, Mishra RK. Delineation of first-order closures for plastic properties requiring explicit consideration of strain hardening and crystallographic texture evolution. International Journal of Plasticity 2008;24:327–42.
[122] Wu X, Proust G, Knezevic M, Kalidindi SR. Elastic-plastic property closures for hexagonal close-packed polycrystalline metals using first-order bounding theories. Acta Materialia 2007;55:2729–37.
[123] Shaffer JB, Knezevic M, Kalidindi SR. Building texture evolution networks for deformation processing of polycrystalline FCC metals using spectral approaches: applications to process design for targeted performance. International Journal of Plasticity 2010;26:1183–94.
[124] Fullwood DT, Adams BL, Kalidindi SR. Generalized Pareto front methods applied to second-order material property closures. Computational Materials Science 2007;38:788–99.
[125] Adams BL, Xiang G, Kalidindi SR. Finite approximations to the second-order properties closure in single phase polycrystals. Acta Materialia 2005;53:3563–77.
[126] Bunge H-J. Texture analysis in materials science. Mathematical methods. Göttingen: Cuvillier Verlag; 1993.
[127] Fast T, Niezgoda SR, Kalidindi SR. A new framework for computationally efficient structure-structure evolution linkages to facilitate high-fidelity scale bridging in multi-scale materials models. Acta Materialia 2011;59:699–707.
[128] Kalidindi SR, Niezgoda SR, Landi G, Vachhani S, Fast A. A novel framework for building materials knowledge systems. Computers, Materials & Continua 2009;17:103–26.
[129] Landi G, Kalidindi SR. Thermo-elastic localization relationships for multi-phase composites. Computers, Materials & Continua 2010;16:273–93.
[130] Landi G, Niezgoda SR, Kalidindi SR. Multi-scale modeling of elastic response of three-dimensional voxel-based microstructure datasets using novel DFT-based knowledge systems. Acta Materialia 2009;58:2716–25.
[131] Fast T, Kalidindi SR. Formulation and calibration of higher-order elastic localization relationships using the MKS approach. Acta Materialia 2011;59:4595–605.
[132] Kröner E. Statistical modelling. In: Gittus J, Zarka J, editors. Modelling small deformations of polycrystals. London: Elsevier Science Publishers; 1986. p. 229–91.
[133] Kröner E. Bounds for effective elastic moduli of disordered materials. Journal of the Mechanics and Physics of Solids 1977;25:137–55.
[134] Volterra V. Theory of functionals and integral and integro-differential equations. New York: Dover; 1959.
[135] Wiener N. Nonlinear problems in random theory. Cambridge, MA: MIT Press; 1958.
[136] Garmestani H, Lin S, Adams BL, Ahzi S. Statistical continuum theory for large plastic deformation of polycrystalline materials. Journal of the Mechanics and Physics of Solids 2001;49:589–607.
[137] Lin S, Garmestani H. Statistical continuum mechanics analysis of an elastic two-isotropic-phase composite material. Composites Part B: Engineering 2000;31:39–46.
[138] Mason TA, Adams BL. Use of microstructural statistics in predicting polycrystalline material properties. Metallurgical and Materials Transactions A: Physical Metallurgy and Materials Science 1999;30:969–79.
[139] Garmestani H, Lin S, Adams BL. Statistical continuum theory for inelastic behavior of a two-phase medium. International Journal of Plasticity 1998;14:719–31.
[140] Beran MJ, Mason TA, Adams BL, Olsen T. Bounding elastic constants of an orthotropic polycrystal using measurements of the microstructure. Journal of the Mechanics and Physics of Solids 1996;44:1543–63.
[141] Beran MJ. Statistical continuum theories. New York: Interscience Publishers; 1968.

[142] Milton GW. In: Ciarlet PG, Iserles A, Kohn RV, Wright MH, editors. The theory of composites. Cambridge monographs on applied and computational mathematics. Cambridge: Cambridge University Press; 2002.
[143] Cherry JA. Distortion analysis of weakly nonlinear filters using Volterra series. Ottawa: National Library of Canada; 1995.
[144] Wray J, Green G. Calculation of the Volterra kernels of non-linear dynamic systems using an artificial neural network. Biological Cybernetics 1994;71:187–95.
[145] Korenberg M, Hunter I. The identification of nonlinear biological systems: Volterra kernel approaches. Annals of Biomedical Engineering 1996;24:250–68.
[146] Lee YW, Schetzen M. Measurement of the Wiener kernels of a non-linear system by cross-correlation. International Journal of Control 1965;2:237–54.
[147] Cockayne E, van de Walle A. Building effective models from sparse but precise data: application to an alloy cluster expansion model. Physical Review B 2009;81:012104.
[148] Kerschen G, Golinval J-C, Hemez FM. Bayesian model screening for the identification of nonlinear mechanical structures. Journal of Vibration and Acoustics 2003;125:389–97.
[149] Buede DM. The engineering design of systems: models and methods. New York, NY: John Wiley & Sons; 2000.
[150] AIAA. Guidelines for the verification and validation of computational fluid dynamics simulations, 1998. Report Number: AIAA-G-077-1998.
[151] Sivel VG, Van den Brand J, Wang WR, Mohdadi H, Tichelaar FD, Alkemade P, et al. Application of the dual-beam FIB/SEM to metals research. Journal of Microscopy 2004;214:237–45.
[152] Giannuzzi LA, Stevie FA. Introduction to focused ion beams: instrumentation, theory, techniques, and practice. New York: Springer; 2005.
[153] Spowart J, Mullens H, Puchala B. Collecting and analyzing microstructures in three dimensions: a fully automated approach. JOM Journal of the Minerals, Metals and Materials Society 2003;55:35–7.
[154] Spowart JE. Automated serial sectioning for 3-D analysis of microstructures. Scripta Materialia 2006;55:5–10.
[155] Salvo L, Cloetens P, Maire E, Zabler S, Blandin JJ, Buffière JY, et al. X-ray micro-tomography an attractive characterisation technique in materials science. Nuclear Instruments and Methods in Physics Research Section B: Beam Interactions with Materials and Atoms 2003;200:273–86.
[156] Midgley PA, Weyland M. 3-D electron microscopy in the physical sciences: the development of Z-contrast and EFTEM tomography. Ultramicroscopy 2003;96:413–31.
[157] Stiénon A, Fazekas A, Buffière JY, Vincent A, Daguier P, Merchi F. A new methodology based on X-ray micro-tomography to estimate stress concentrations around inclusions in high strength steels. Materials Science and Engineering: A 2009;513–514:376–83.
[158] Proudhon H, Buffière JY, Fouvry S. Three-dimensional study of a fretting crack using synchrotron X-ray micro-tomography. Engineering Fracture Mechanics 2007;74(5):782–93.
[159] Tiseanu I, Craciunescu T, Petrisor T, Corte AD. 3-D X-ray micro-tomography for modeling of Nb3Sn multifilamentary superconducting wires. Fusion Engineering and Design 2007;82(5–14):1447–53.
[160] Miller MK, Forbes RG. Atom probe tomography. Materials Characterization 2009;60(6):461–9.
[161] Arslan I, Marquis EA, Homer M, Hekmaty MA, Bartelt NC. Towards better 3-D reconstructions by combining electron tomography and atom-probe tomography. Ultramicroscopy 2008;108(12):1579–85.
[162] Rowenhorst DJ, Lewis AC, Spanos G. Three-dimensional analysis of grain topology and interface curvature in a β-titanium alloy. Acta Materialia 2010;58(16):5511–9.
[163] Gao X, Przybyla CP, Adams BL. Methodology for recovering and analyzing two-point pair correlation functions in polycrystalline materials. Metallurgical and Materials Transactions A 2006;37:2379–87.
[164] Gokhale AM, Tewari A, Garmestani H. Constraints on microstructural two-point correlation functions. Scripta Materialia 2005;53:989–93.
[165] Niezgoda SR, Fullwood DT, Kalidindi SR. Delineation of the space of 2-point correlations in a composite material system. Acta Materialia 2008;56:5285–92.
[166] Niezgoda SR, Turner DM, Fullwood DT, Kalidindi SR. Optimized structure based representative volume element sets reflecting the ensemble averaged 2-point statistics. Acta Materialia 2010;58:4432–45.
[167] McDowell DL, Ghosh S, Kalidindi SR. Representation and computational structure-property relations of random media. JOM Journal of the Minerals, Metals and Materials Society 2011;63:45–51.
[168] SAMSI workshop on uncertainty quantification in engineering and renewable energy. http://www.samsi.info/programs/2011-12-program-uncertainty-quantification-engineering-and-renewable-energy, accessed May 5, 2012.
[169] Thunnissen DP. Propagating and mitigating uncertainty in the design of complex multidisciplinary systems. Pasadena, CA: California Institute of Technology; 2005.
[170] Kaplan S, Garrick BJ. On the quantitative definition of risk. Risk Analysis 1981;1:11–26.
[171] McManus H, Hastings D. A framework for understanding uncertainty and its mitigation and exploitation in complex systems. IEEE Engineering Management Review 2006;34(3):81–94.
[172] Chen W, Baghdasaryan L, Buranathiti T, Cao J. Model validation via uncertainty propagation and data transformations. AIAA Journal 2004;42:1406–15.
[173] Choi H-J, McDowell DL, Rosen D, Allen JK, Mistree F. An inductive design exploration method for robust multiscale materials design. ASME Journal of Mechanical Design 2008;130:031402(1–13).
[174] Kennedy MC, O'Hagan A. Bayesian calibration of computer models. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 2001;63:425–64.
[175] Helton JC, Oberkampf WL. Alternative representations of epistemic uncertainty. Reliability Engineering & System Safety 2004;85:1–10.
[176] Lee SH, Chen W. A comparative study of uncertainty propagation methods for black-box-type problems. Structural and Multidisciplinary Optimization 2009;37:239–53.
[177] Yin X, Lee S, Chen W, Liu WK, Horstemeyer MF. Efficient random field uncertainty propagation in design using multiscale analysis. Journal of Mechanical Design 2009;131:021006(1–10).
[178] Ferson S, Kreinovich V, Hajagos J, Oberkampf W, Ginzburg L. Experimental uncertainty estimation and statistics for data having interval uncertainty. Sandia National Laboratories, 2007. Report Number: SAND2007-0939.
[179] Zabaras N. Advanced computational techniques for materials-by-design, in: Joint Contractors Meeting of the AFOSR Applied Analysis and Computational Mathematics Programs, Long Beach, CA, August 7–11, 2006.
[180] Allen JK, Seepersad CC, Choi H-J, Mistree F. Robust design for multidisciplinary and multiscale applications. ASME Journal of Mechanical Design 2006;128:832–43.
[181] Gano SE, Agarwal H, Renaud JE, Tovar A. Reliability-based design using variable fidelity optimization. Structure and Infrastructure Engineering: Maintenance, Management, Lifecycle 2006;2:247–60.
[182] Taguchi G. Introduction to quality engineering. New York: UNIPUB; 1986.
[183] Chen W, Allen JK, Mistree F. A robust concept exploration method for enhancing productivity in concurrent systems design. Concurrent Engineering: Research and Applications 1997;5:203–17.
[184] Mistree F, Hughes O, Bras B. The compromise decision support problem and the adaptive linear programming algorithm. In: Kamat M, editor. Structural optimization: status and promise. Washington, DC: AIAA; 1992. p. 251–90.
[185] Chen W, Allen JK, Tsui K-L, Mistree F. A procedure for robust design: minimizing variations caused by noise factors and control factors. ASME Journal of Mechanical Design 1996;118:478–85.
[186] Simpson T, Maier J, Mistree F. Product platform design: method and application. Research in Engineering Design 2001;13:2–22.
[187] Sues RH, Oakley DR, Rhodes GS. Multidisciplinary stochastic optimization, in: 10th Conference on Engineering Mechanics, Boulder, CO, 1995, p. 934–7.
[188] Oakley DR, Sues RH, Rhodes GS. Performance optimization of multidisciplinary mechanical systems subject to uncertainties. Probabilistic Engineering Mechanics 1998;13:15–26.
[189] Du X, Chen W. Efficient uncertainty analysis methods for multidisciplinary robust design. AIAA Journal 2002;40:545–52.
[190] Gu X, Renaud JE, Batill SM, Brach RM, Budhiraja AS. Worst case propagated uncertainty of multidisciplinary systems in robust design optimization. Structural and Multidisciplinary Optimization 2000;20:190–213.
[191] Liu H, Chen W, Kokkolaras M, Papalambros PY, Kim HM. Probabilistic analytical target cascading: a moment matching formulation for multilevel optimization under uncertainty. Journal of Mechanical Design 2006;128:991–1000.
[192] Kim HM, Michelena NF, Papalambros PY, Jiang T. Target cascading in optimal system design. Journal of Mechanical Design 2003;125:481–9.
[193] Choi H-J, Allen JK, Rosen D, McDowell DL, Mistree F. An inductive design exploration method for the integrated design of multi-scale materials and products, in: Design Engineering Technical Conferences — Design Automation Conference, Long Beach, CA, 2005. Paper Number: DETC2005-85335.
[194] Choi H-J. A robust design method for model and propagated uncertainty. Ph.D. Dissertation, The GW Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA, 2005.
[195] Choi H-J, McDowell DL, Allen JK, Mistree F. An inductive design exploration method for hierarchical systems design under uncertainty. Engineering Optimization 2008;40:287–307.
[196] Sinha A, Panchal JH, Allen JK, Mistree F. Managing uncertainty in multiscale systems via simulation model refinement, in: ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, August 28–31, Washington, DC, 2011. Paper Number: DETC2011/47655.
[197] Panchal JH, Paredis CJJ, Allen JK, Mistree F. A value-of-information based approach to simulation model refinement. Engineering Optimization 2008;40:223–51.
[198] Messer M, Panchal JH, Krishnamurthy V, Klein B, Yoder PD, Allen JK, et al. Model selection under limited information using a value-of-information-based indicator. Journal of Mechanical Design 2010;132:121008(1–13).
[199] Panchal JH, Choi H-J, Allen JK, McDowell DL, Mistree F. A systems based approach for integrated design of materials, products, and design process chains. Journal of Computer-Aided Materials Design 2007;14:265–93.
[200] Howard R. Information value theory. IEEE Transactions on Systems Science and Cybernetics 1966;SSC-2:779–83.
[201] Lawrence DB. The economic value of information. New York: Springer; 1999.
[202] Panchal JH, Allen JK, Paredis CJJ, Mistree F. Simulation model refinement for decision making via a value-of-information based metric, in: Design Automation Conference, Philadelphia, PA, 2006. Paper Number: DETC2006-99433.

[203] Messer M, Panchal JH, Allen JK, Mistree F. Model refinement decisions using the process performance indicator. Engineering Optimization 2011;43(7):741–62.
[204] Valerdi R. The constructive systems engineering cost model (COSYSMO). Ph.D. Dissertation, Industrial and Systems Engineering, University of Southern California, Los Angeles, 2005.
[205] Oberkampf WL, DeLand SM, Rutherford BM, Diegert KV, Alvin KF. Estimation of total uncertainty in modeling and simulation. Sandia National Laboratories, 2000. Report Number: SAND2000-0824.
[206] Sargent R. Verification and validation of simulation models, in: Winter Simulation Conference, New Orleans, LA, 2003, p. 37–48.
[207] Bayarri MJ, Berger JO, Paulo R, Sacks J, Cafeo JA, Cavendish J, et al. A framework for validation of computer models. Technometrics 2007;49:138–54.
[208] Sankararaman S, Mahadevan S. Model validation under epistemic uncertainty. Reliability Engineering and System Safety 2011;96:1232–41.
[209] Chen W, Xiong Y, Tsui K-L, Wang S. A design-driven validation approach using Bayesian prediction models. ASME Journal of Mechanical Design 2008;130:021101(1–12).
[210] Malak RJ, Paredis CJJ. Validating behavioral models for reuse. Research in Engineering Design 2007;18:111–28.
[211] Mocko GM, Panchal JH, Fernández MG, Peak R, Mistree F. Towards reusable knowledge-based idealizations for rapid design and analysis, in: 45th AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics & Materials Conference, Palm Springs, CA, 2004. Paper Number: AIAA-2004-2009.
[212] Malak RJ, Paredis CJJ. Using support vector machines to formalize the valid input domain in predictive models in systems design problems. Journal of Mechanical Design 2010;132:101001(1–14).
[213] Pedersen K, Emblemsvag J, Bailey R, Allen J, Mistree F. The 'validation square' — validating design methods, in: ASME Design Theory and Methodology Conference, New York, 2000. Paper Number: ASME DETC2000/DTM-14579.
[214] Seepersad CC, Pedersen K, Emblemsvag J, Bailey RR, Allen JK, Mistree F. The validation square: how does one verify and validate a design method? In: Chen W, Lewis K, Schmidt L, editors. Decision-based design: making effective decisions in product and systems design. New York, NY: ASME Press; 2005.
[215] Li J, Mourelatos ZP, Kokkolaras M, Papalambros PY. Validating designs through sequential simulation-based optimization, in: ASME 2010 Design Engineering Technical Conferences & Computers in Engineering Conference, Montreal, Quebec, 2010. Paper Number: DETC2010-28431.
[216] McDowell DL. Critical path issues in ICME. In: Arnold SM, Wong TT, editors. Models, databases, and simulation tools needed for the realization of integrated computational materials engineering, Proc. symposium held at Materials Science and Technology 2010. Houston, TX: ASM International; 2011. p. 31–7.