
Theory and Modeling in Nanoscience

Cover illustrations: TOP LEFT: Ordered lubricants confined to a nanoscale gap (Peter Cummings). BOTTOM LEFT: Hypothetical spintronic quantum computer (Sankar Das Sarma and Bruce Kane). TOP RIGHT: Folded spectrum method for a free-standing quantum dot (Alex Zunger). MIDDLE RIGHT: Equilibrium structures of bare and chemically modified gold nanowires (Uzi Landman). BOTTOM RIGHT: Organic oligomers attracted to the surface of a quantum dot (F. W. Starr and S. C. Glotzer).

Report of the May 10–11, 2002, Workshop Conducted by the Basic Energy Sciences and Advanced Scientific Computing Advisory Committees to the Office of Science, Department of Energy

Organizing Committee C. William McCurdy Co-Chair and BESAC Representative Lawrence Berkeley National Laboratory Berkeley, CA 94720

Ellen Stechel Co-Chair and ASCAC Representative Ford Motor Company Dearborn, MI 48121

Peter Cummings The University of Tennessee Knoxville, TN 37996

Bruce Hendrickson Sandia National Laboratories Albuquerque, NM 87185

David Keyes Old Dominion University Norfolk, VA 23529

This work was supported by the Director, Office of Science, Office of Basic Energy Sciences and Office of Advanced Scientific Computing Research, of the U.S. Department of Energy.

Table of Contents

Executive Summary ...... 1

I. Introduction ...... 3
   A. The Purpose of the Workshop ...... 3
   B. Parallel Dramatic Advances in Experiment and Theory ...... 3
   C. The Central Challenge ...... 5
   D. Key Barriers to Progress in Theory and Modeling in Nanoscience ...... 6
   E. Consensus Observations ...... 7
   F. Opportunity for an Expanded Role for the Department of Energy ...... 7
   G. Need for Computational Resources and Readiness to Use Them ...... 8
   H. Summary of Specific Challenges and Opportunities ...... 9
   Giant Magnetoresistance in Magnetic Storage ...... 11

II. Theory, Modeling, and Simulation in Nanoscience ...... 12
   A. Nano Building Blocks ...... 14
      Electron Transport in Nanostructures: Electronic Devices ...... 14
      Optical Properties on the Nanoscale: Optoelectronic Devices ...... 15
      Coherence/Decoherence and Tunneling: Quantum Computing ...... 15
      Soft/Hard Matter Interfaces: Biosensors ...... 15
      Spintronics: Information ...... 16
      Implications for Theory and Modeling ...... 16
   B. Complex Nanostructures and Interfaces ...... 17
   C. Dynamics, Assembly, and Growth of Nanostructures ...... 21

III. The Role of Applied Mathematics and Computer Science in Nanoscience ...... 24
   A. Bridging Time and Length Scales ...... 26
   B. Fast Algorithms ...... 31
      Linear Algebra in Electronic Structure Calculations ...... 32
      Monte Carlo Techniques ...... 33
      Data Exploration and ...... 34
      Computational Geometry ...... 35
   C. Optimization and Predictability ...... 35
      Optimization ...... 35
      Predictability ...... 37

Appendix A: Workshop Agenda ...... 39

Appendix B: Workshop Participants ...... 41

Executive Summary

On May 10 and 11, 2002, a workshop entitled “Theory and Modeling in Nanoscience” was held in San Francisco to identify challenges and opportunities for theory, modeling, and simulation in nanoscience and nanotechnology and to investigate the growing and promising role of applied mathematics and computer science in meeting those challenges. A broad selection of university and national laboratory scientists contributed to the workshop, which included scientific presentations, a panel discussion, breakout sessions, and short white papers.

Revolutionary New Capabilities in Theory, Modeling, and Simulation

During the past 15 years, the fundamental techniques of theory, modeling, and simulation have undergone a revolution that parallels the extraordinary experimental advances on which the new field of nanoscience is based. This period has seen the development of density functional algorithms, quantum Monte Carlo techniques, ab initio molecular dynamics, advances in classical Monte Carlo methods and mesoscale methods for soft matter, and fast-multipole and multigrid algorithms. Dramatic new insights have come from the application of these and other new theoretical capabilities. Simultaneously, advances in computing hardware increased computing power by four orders of magnitude. The combination of new theoretical methods together with increased computing power has made it possible to simulate systems with millions of degrees of freedom.

Unmistakable Promise of Theory, Modeling, and Simulation

The application of new and extraordinary experimental tools to nanosystems has created an urgent need for a quantitative understanding of matter at the nanoscale. The absence of quantitative models that describe newly observed phenomena increasingly limits progress in the field. A clear consensus emerged at the workshop that without new, robust tools and models for the quantitative description of structure and dynamics at the nanoscale, the research community would miss important scientific opportunities in nanoscience. The absence of such tools would also seriously inhibit widespread applications in fields of nanotechnology ranging from molecular electronics to biomolecular materials. Realizing the unmistakable promise of theory, modeling, and simulation in overcoming fundamental challenges in nanoscience requires new human and computer resources.

Fundamental Challenges and Opportunities

With each fundamental intellectual and computational challenge that must be met in nanoscience come opportunities for research and discovery utilizing the approaches of theory, modeling, and simulation. In the broad topical areas of (1) nano building blocks (nanotubes, quantum dots, clusters, and nanoparticles), (2) complex nanostructures and nano-interfaces, and (3) the assembly and growth of nanostructures, the workshop identified a large number of theory, modeling, and simulation challenges and opportunities. Among them are:

• to bridge electronic through macroscopic length and time scales

• to determine the essential science of transport mechanisms at the nanoscale

• to devise theoretical and simulation approaches to study nano-interfaces, which dominate nanoscale systems and are necessarily highly complex and heterogeneous

• to simulate with reasonable accuracy the optical properties of nanoscale structures and to model nanoscale opto-electronic devices

• to simulate complex nanostructures involving “soft” biologically or organically based structures and “hard” inorganic ones, as well as nano-interfaces between hard and soft matter

• to simulate self-assembly and directed self-assembly

• to devise theoretical and simulation approaches to quantum coherence, decoherence, and spintronics

• to develop self-validating and benchmarking methods

The Role of Applied Mathematics

Since mathematics is the language in which theory is expressed and advanced, developments in applied mathematics are central to the success of theory, modeling, and simulation for nanoscience, and the workshop identified important roles for new applied mathematics in the above-mentioned challenges. Novel applied mathematics is required to formulate new theory and to develop new computational algorithms applicable to complex systems at the nanoscale.

The discussion of applied mathematics at the workshop focused on three areas that are directly relevant to the central challenges of theory, modeling, and simulation in nanoscience: (1) bridging time and length scales, (2) fast algorithms, and (3) optimization and predictability. Each of these broad areas has a recent track record of developments from the applied mathematics community. Recent advances range from fundamental approaches, like mathematical homogenization (whereby reliable coarse-scale results are made possible without detailed knowledge of finer scales), to new numerical algorithms, like the fast-multipole methods that make very large scale calculations possible. Some of the mathematics of likely interest (perhaps the most important mathematics of interest) is not fully knowable at present, but it is clear that collaborative efforts between scientists in nanoscience and applied mathematicians can yield significant advances central to a successful national nanoscience initiative.

The Opportunity for a New Investment

The consensus of the workshop is that the country’s investment in the national nanoscience initiative will pay greater scientific dividends if it is accelerated by a new investment in theory, modeling, and simulation in nanoscience. Such an investment can stimulate the formation of alliances and teams of experimentalists, theorists, applied mathematicians, and computer and computational scientists to meet the challenge of developing a broad quantitative understanding of structure and dynamics at the nanoscale.

The Department of Energy is uniquely situated to build a successful program in theory, modeling, and simulation in nanoscience. Much of the nation’s experimental work in nanoscience is already supported by the Department, and new facilities are being built at the DOE national laboratories. The Department also has an internationally regarded program in applied mathematics, and much of the foundational work on mathematical modeling and computation has emerged from DOE activities. Finally, the Department has unique resources and experience in high performance computing and algorithms. The combination of these areas of expertise makes the Department of Energy a natural home for nanoscience theory, modeling, and simulation.

I. Introduction

A. The Purpose of the Workshop

On May 10 and 11, 2002, a workshop entitled “Theory and Modeling in Nanoscience” was held in San Francisco, California, supported by the offices of Basic Energy Sciences and Advanced Scientific Computing Research of the Department of Energy. The Basic Energy Sciences Advisory Committee and the Advanced Scientific Computing Advisory Committee convened the workshop to identify challenges and opportunities for theory, modeling, and simulation in nanoscience and nanotechnology, and additionally to investigate the growing and promising role of applied mathematics and computer science in meeting those challenges. The workshop agenda is reproduced in Appendix A.

A broad selection of university and national laboratory scientists were invited, and about fifty were able to contribute to the workshop. The participants are listed in Appendix B. There were scientific presentations, a panel discussion, and breakout sessions, together with written contributions in the form of short white papers from those participants from the DOE labs. This report is the result of those contributions and the discussions at the workshop.

This workshop report should be read in the context of other documents that define and support the National Nanotechnology Initiative. Those documents describe a broad range of applications that will benefit the principal missions of the Department of Energy, ranging from new materials and the energy efficiencies they make possible, to improved chemical and biological sensing. Key among those reports is the one from the Office of Basic Energy Sciences entitled “Nanoscale Science, Engineering, and Technology Research Directions” (http://www.sc.doe.gov/production/bes/nanoscale.html), which points out the great need for theory and modeling. Other nanoscience documents from the Department of Energy can be found at http://www.er.doe.gov/production/bes/NNI.htm, and a number of reports from other agencies and groups are linked to the Web site of the National Nanotechnology Initiative, http://www.nano.gov/.

B. Parallel Dramatic Advances in Experiment and Theory

The context of the workshop was apparent at the outset. The rapid rise of the field of nanoscience is due to the appearance over the past 15 years of a collection of new experimental techniques that have made manipulation and construction of objects at the nanoscale possible. Indeed, the field has emerged from those new experimental techniques.

Some of those experimental methods, such as scanning tunneling microscopy, created new capabilities for characterizing nanostructures. The invention of atomic force microscopy produced not only a tool for characterizing objects at the nanoscale but one for manipulating them as well. The array of experimental techniques for controlled fabrication of nanotubes and nanocrystals, together with methods to fabricate quantum dots and wells, produced an entirely new set of elementary nanostructures. Combinatorial and genetic techniques have opened the door to the synthesis of new biomolecular materials and the creation of nano-interfaces and nano-interconnects between hard and soft matter. Nanoscience arose from the entire ensemble of these and other new experimental techniques, which have created the building blocks of nanotechnology.

Over the same 15-year period, the fundamental techniques of theory, modeling, and simulation that are relevant to matter at the nanoscale have undergone a revolution that has been no less stunning. The advances in computing hardware over this period are universally familiar, with computing power increasing by four orders of magnitude, as can be seen by comparing the Gordon Bell Prizes in 1988 (1 Gflop/s) and 2001 (11 Tflop/s). But as impressive as the increase in computing power has been, it is only part of the overall advance in theory, modeling, and simulation that has occurred over the same period. This has been the period in which:

• Density functional theory (DFT) transformed theoretical chemistry, surface science, and materials science, and created a new ability to describe the electronic structure and interatomic forces in molecules with hundreds and sometimes thousands of atoms (Figure 1).

• Molecular dynamics with fast multipole methods for computing long-range interatomic forces has made accurate calculations possible on the dynamics of millions and sometimes billions of atoms.

• Monte Carlo methods for classical simulations have undergone a revolution, with the development of a range of techniques (e.g., parallel tempering, continuum configurational bias, and extended ensembles) that permit extraordinarily fast equilibration of systems with long relaxation times.

• New mesoscale methods (including dissipative particle dynamics and field-theoretic polymer simulation) have been developed for describing systems with long relaxation times and large spatial scales, and are proving useful for the rapid prototyping of nanostructures in multicomponent polymer blends.

• Quantum Monte Carlo methods now promise to provide nearly exact descriptions of the electronic structures of molecules.

• The Car-Parrinello method for ab initio molecular dynamics, with simultaneous computation of electronic wavefunctions and interatomic forces, has opened the way for exploring the dynamics of molecules in condensed media as well as complex interfaces.

The tools of theory have advanced as much as the experimental tools in nanoscience over the past 15 years. It has been a true revolution.

The rise of fast workstations, cluster computing, and new generations of massively parallel computers completes the picture of the transformation in theory, modeling, and simulation over the last decade and a half. Moreover, these hardware (and basic software) tools are continuing on the Moore’s Law exponential trajectory of improvement, doubling the computing power available on a single chip every 18 months. Computational Grids are emerging as the next logical extension of cluster and parallel computing.

The first and most basic consensus of the workshop is clear: Many opportunities for discovery will be missed if the new tools of theory, modeling, and simulation are not fully exploited to confront the challenges of nanoscience. Moreover, new investments by the DOE and other funding agencies will be required to exploit and develop these tools for effective application in nanoscience.

This consensus is not merely speculation, but is based on recent experience of the role of theory in the development of nanotechnology. Perhaps no example is more celebrated than the role of calculations based on density functional theory in the development of giant magnetoresistance in magnetic storage systems. The unprecedented speed with which this discovery was exploited in small hard disk drives for computers depended on a detailed picture from theoretical simulations of the electronic structure and electron (spin) transport in these systems. Some details of this remarkable story are given in a sidebar to this report (page 11).

Figure 1. Nanotubes break by first forming a bond rotation 5-7-7-5 defect. An application of density functional theory and multigrid methods by Buongiorno Nardelli, Yakobson, and Bernholc, Phys. Rev. B and Phys. Rev. Letters (1998).
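Parallel tempering, one of the Monte Carlo advances listed above, is compact enough to sketch. The toy example below is our own illustration, not taken from the workshop: the double-well energy, the temperature ladder, and the move sizes are invented. Three replicas run ordinary Metropolis moves at different temperatures and periodically attempt to exchange configurations; the exchanges let the coldest replica escape a local minimum it could essentially never leave on its own.

```python
# Minimal parallel-tempering (replica-exchange) sketch on a double well.
import math
import random

random.seed(0)

def energy(x):
    # A double well with minima at x = -1 and x = +1 and a barrier of height 1.
    return (x * x - 1.0) ** 2

temps = [0.05, 0.2, 0.8]          # temperature ladder (coldest first)
betas = [1.0 / t for t in temps]
xs = [-1.0] * len(temps)          # all replicas start in the left well

def metropolis_step(i, step=0.3):
    """One local Metropolis move for replica i."""
    x_new = xs[i] + random.uniform(-step, step)
    dE = energy(x_new) - energy(xs[i])
    if dE <= 0 or random.random() < math.exp(-betas[i] * dE):
        xs[i] = x_new

def swap_attempt(i, j):
    """Exchange configurations i and j with the standard acceptance rule."""
    delta = (betas[i] - betas[j]) * (energy(xs[i]) - energy(xs[j]))
    if delta >= 0 or random.random() < math.exp(delta):
        xs[i], xs[j] = xs[j], xs[i]

right_well_visits = 0
for sweep in range(20000):
    for i in range(len(xs)):
        metropolis_step(i)
    if sweep % 10 == 0:           # periodic swap between adjacent temperatures
        i = random.randrange(len(xs) - 1)
        swap_attempt(i, i + 1)
    if xs[0] > 0.5:               # coldest replica found the right-hand well
        right_well_visits += 1
```

At the coldest temperature the Boltzmann factor for crossing the barrier is about exp(-20), so an isolated cold walker would be trapped; the hot replica crosses freely and accepted swaps carry right-well configurations down the ladder.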

C. The Central Challenge

The discussions and presentations of the workshop identified many specific fundamental challenges for theory, modeling, and simulation in nanoscience. However, a central and basic challenge became clear. Because of the rapid advance of experimental investigations in this area, the need for quantitative understanding of matter at the nanoscale is becoming more urgent, and its absence is increasingly a barrier to progress in the field quite generally.

The central broad challenge that emerged from discussions at the workshop can be stated simply: Within five to ten years, there must be robust tools for quantitative understanding of structure and dynamics at the nanoscale, without which the scientific community will have missed many scientific opportunities as well as a broad range of nanotechnology applications.

The workshop audience was reminded in the opening presentation that the electronics industry will not begin deploying billions of devices based on molecular electronics, even when they can be built, unless they are thoroughly understood (Figure 2) and manufacturing processes are made predictable and controllable. The electronics industry must have new simulations and models for nanotechnology that are at least as powerful and predictive as the ones in use today for conventional integrated circuits before it can chance marketing molecular electronics devices for a myriad of applications. It can be argued that biological applications of nanotechnology will require the same level of quantitative understanding before they are widely applied.

The fundamental theory and modeling on which industry will build those tools does not exist. While it is the province of industry to provide its own design tools, it is the role of basic science to provide the fundamental underpinnings on which they are based. Those fundamental tools for quantitative understanding are also necessary to the progress of the science itself.

Figure 2. Calculated current-voltage curve for a novel memory-switchable resistor with 5µ × 5µ junctions. (Stan Williams, Hewlett-Packard)

D. Key Barriers to Progress in Theory and Modeling in Nanoscience

Much of the current mode of theoretical study in nanoscience follows the traditional separation of the practice of experiment from the practice of theory and simulation, both separate from the underpinning applied mathematics and computer science. This is not a new phenomenon, nor does it apply only to theory, modeling, and simulation in nanoscience. Nonetheless, it is a particularly problematic issue for this field.

By its very nature, nanoscience involves multiple length and time scales as well as the combination of types of materials and molecules that have been traditionally studied in separate subdisciplines. For theory, modeling, and simulation, this means that fundamental methods that were developed in separate contexts will have to be combined and new ones invented. This is the key reason why an alliance of investigators in nanoscience with those in applied mathematics and computer science will be necessary to the success of theory, modeling, and simulation in nanoscience. A new investment in theory, modeling, and simulation in nanoscience should facilitate the formation of such alliances and teams of theorists, computational scientists, applied mathematicians, and computer scientists.

A second impediment to progress in theory, modeling, and simulation in nanoscience arises because theoretical efforts in separate disciplines are converging on this intrinsically multidisciplinary field. The specific barrier is the difficulty, given the present funding mechanisms and policies, of undertaking high-risk but potentially high-payoff research, especially if it involves expensive human or computational resources. At this early stage in the evolution of the field, it is frequently not at all clear what techniques from the disparate subdisciplines of surface science, materials science and engineering, theoretical chemistry, and computational science will be successful in the new context being created every week by novel experiments. The traditional degree of separation of the practice of experiment from the practice of theory further intensifies the challenge.

For these reasons, opportunities will be missed if new funding programs in theory, modeling, and simulation in nanoscience do not aggressively encourage highly speculative and risky research. At least one experimentalist at this workshop complained that high-risk and speculative theoretical and computational efforts are too rare in this field, and that sentiment was echoed by a number of theorists.

6 E. Consensus Observations

A broad consensus emerged at the workshop on several key observations.

• The role of theory, modeling, and simulation in nanoscience is central to the success of the National Nanotechnology Initiative.

• The time is right to increase federal investment in theory, modeling, and simulation in nanoscience to accelerate scientific and technological discovery.

• While there are many successful theoretical and computational efforts yielding new results today in nanoscience, there remain many fundamental intellectual and computational challenges that must be addressed to achieve the full potential of theory, modeling, and simulation.

• New efforts in applied mathematics, particularly in collaboration with theorists in nanoscience, are likely to play a key role in meeting those fundamental challenges as well as in developing computational algorithms that will become mainstays of computational nanoscience in the future.

F. Opportunity for an Expanded Role for the Department of Energy

The time is ripe for an initiative in theory and modeling of nanoscale phenomena not only because of the magnitude of the potential payoff, but also because the odds of achieving breakthroughs are rapidly improving.

Computational simulation is riding a hardware and software wave that has led to revolutionary advances in many fields represented by both continuous and discrete models. The ASCI and SciDAC initiatives of the DOE have fostered capabilities that make simulations with millions of degrees of freedom on thousands of processors for tens of days possible. National security decisions and other matters of policy affecting the environment and federal investment in unique facilities, such as lasers, accelerators, and tokamaks, are increasingly being reliably informed by simulation, as are corporate decisions of similar magnitude, such as where to drill for petroleum, what to look for in new pharmaceuticals, and how to design billion-dollar manufacturing lines. Universities have begun producing a new generation of computational scientists who understand scientific issues that bridge disciplines from physical principles to computation. Advances in extensibility and portability make major investments in reusable software attractive. Attention to validation and verification has repaired credibility gaps for simulation in many areas.

Nanotechnologists reaching toward simulation to provide missing understanding will find simulation technology meeting new challenges with substantial power. However, theorists and modelers working on the interface of nanotechnology and simulation technology are required to make the connections.

The workshop focused considerable attention on the role of applied mathematics in nanoscience. In his presentation, one of the scientists working on molecular dynamics studies of objects and phenomena at the nanoscale expressed the sentiment of the workshop effectively by saying that the role of applied mathematics should be to “make tractable the problems that are currently impossible.” As mathematics is the language in which models are phrased, developments in applied mathematics are central to the success of an initiative in theory and modeling for nanoscience. A key challenge in nanoscience is the range of length and time scales that need to be bridged. It seems likely that fundamentally new mathematics will be needed to meet this challenge.

The diverse phenomena within nanoscience will lead to a plethora of models with widely varying characteristics. For this reason, it is difficult to anticipate all the areas of mathematics that are likely to contribute, and serendipity will undoubtedly play a role. As models and their supporting mathematics mature, new algorithms will be needed to allow for efficient utilization and application of the models. These issues and some significant areas of activity are discussed in Section III of this report. But this discussion is not (and cannot be) fully comprehensive, owing to the breadth of nanoscience and the unknowable paths that modeling will follow.

The Department of Energy is uniquely situated to build a successful program in theory, modeling, and simulation in nanoscience. Much of the experimental work in nanoscience is already supported by the Department, and new facilities are being built. The Department also has an internationally regarded program in applied mathematics, and much of the foundational work on mathematical modeling and computation has emerged from DOE activities. Finally, the Department has unique resources and experience in high performance computing and algorithms. The conjunction of these areas of expertise makes the Department of Energy a natural home for nanoscience theory and modeling.

G. Need for Computational Resources and Readiness to Use Them

The collection of algorithms and computational approaches developed over the last 15 years that form the basis of the revolution in modeling and simulation in areas relevant to nanoscience has made this community intense users of computing at all levels, including the teraflop/s level. This field has produced several Gordon Bell Prize winners for “fastest application code,” most recently in the area of magnetic properties of materials. That work, by a team led by Malcolm Stocks at Oak Ridge National Laboratory, is only one example of computing at the terascale in nanoscience.

The workshop did not focus much discussion specifically on the need for computational resources, since that need is ubiquitous and ongoing. The presentations showed computationally intensive research using the full range of resources, from massively parallel supercomputers to workstation and cluster computing. The sentiment that not enough resources are available to the community, at all levels of computing power, was nearly universally acknowledged. This community is ready for terascale computing, and it needs considerably more resources for workstation and cluster computing.

A significant amount of discussion in the breakout sessions focused on scalable algorithms, meaning algorithms that scale well with particle number or dimension as well as with increasing size of parallel computing hardware. The simulation and modeling community in nanoscience is one of the most computationally sophisticated in all of the natural sciences. That fact was demonstrated in the talks and breakout sessions of the workshop, and is displayed particularly in the sections of this report devoted to fast algorithms and well characterized nano building blocks.
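The notion of an algorithm that scales well with particle number can be illustrated with a cell list, a standard device in molecular simulation. The sketch below is our own example (the box size, cutoff, and particle count are invented for the demonstration): binning particles into cells no smaller than the interaction cutoff reduces short-range neighbor finding from a quadratic all-pairs search to a cost roughly linear in N, while returning exactly the same pairs.

```python
# Cell-list neighbor search vs. naive all-pairs search (2D, no periodicity).
import random

random.seed(2)

L = 10.0   # box edge
rc = 1.0   # interaction cutoff
pts = [(random.uniform(0, L), random.uniform(0, L)) for _ in range(500)]

def pairs_direct(points, cutoff):
    """O(N^2): test every pair against the cutoff."""
    out = set()
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            if dx * dx + dy * dy < cutoff * cutoff:
                out.add((i, j))
    return out

def pairs_cell_list(points, cutoff, box):
    """O(N) on average: bin particles into cells of size >= cutoff and
    compare only particles in the same or adjacent cells."""
    n_cells = int(box / cutoff)
    size = box / n_cells
    cells = {}
    for idx, (x, y) in enumerate(points):
        key = (min(int(x / size), n_cells - 1), min(int(y / size), n_cells - 1))
        cells.setdefault(key, []).append(idx)
    out = set()
    for (cx, cy), members in cells.items():
        for dcx in (-1, 0, 1):
            for dcy in (-1, 0, 1):
                for j in cells.get((cx + dcx, cy + dcy), []):
                    for i in members:
                        if i < j:
                            dx = points[i][0] - points[j][0]
                            dy = points[i][1] - points[j][1]
                            if dx * dx + dy * dy < cutoff * cutoff:
                                out.add((i, j))
    return out

# Both methods find exactly the same neighbor pairs.
assert pairs_direct(pts, rc) == pairs_cell_list(pts, rc, L)
```

The same binning idea, combined with domain decomposition across processors, is what lets short-range molecular dynamics scale to the million- and billion-atom calculations mentioned earlier in this report.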

H. Summary of Specific Challenges and Opportunities

The sections that follow identify a large number of challenges in theory, modeling, and simulation in nanoscience, together with opportunities for overcoming those challenges. Because an assembly of 50 scientists and applied mathematicians is too small a number to represent all the active areas of nanoscience and related mathematics, the list compiled at the workshop is necessarily incomplete. However, a summary of even the partial catalog accumulated during the workshop and described briefly in the following sections should make a compelling case for investment in theory, modeling, and simulation in nanoscience. The challenges and opportunities include:

• To determine the essential science of transport mechanisms, including electron transport (fundamental to the functionality of molecular electronics, nanotubes, and nanowires), spin transport (fundamental to the functionality of spintronics-based devices), and molecule transport (fundamental to the functionality of chemical and biological sensors, molecular separations/membranes, and nanofluidics).

• To simulate with reasonable accuracy the optical properties of nanoscale structures and to model nanoscale opto-electronic devices, recognizing that in confined dimensions, optical properties of matter are often dramatically altered from properties in bulk.

• To devise theoretical and simulation approaches to study nano-interfaces, which are necessarily highly complex and heterogeneous in shape and substance, and are often composed of dissimilar classes of materials.

• To simulate complex nanostructures involving many molecular and atomic species as well as the combination of “soft” biological and/or organic structures and “hard” inorganic ones.

• To devise theoretical and simulation approaches to nano-interfaces between hard and soft matter that will play central roles in biomolecular materials and their applications.

• To address the central challenge of bridging a wide range of length and time scales so that phenomena captured in atomistic simulations can be modeled at the nanoscale and beyond.

• To simulate self-assembly, the key to large-scale production of novel structures, which typically involves many temporal and spatial scales and many more species than the final product.

• To devise theoretical and simulation approaches to quantum coherence and decoherence, including tunneling phenomena, all of which are central issues for using nanotechnology to implement quantum computing.

• To devise theoretical and simulation approaches to spintronics, capable of accurately describing the key phenomena in “spin valves” and “spin transistors.”

• To develop self-validating and benchmarking methods for modeling and simulation, in which a coarser-grained description (whether it is atomistic molecular dynamics or mesoscale modeling) is always validated against more detailed calculations, since appropriate validating experiments will often be difficult to perform.

For each entry in this list, and the many that can be added to it, there is an established state of the art together with an array of specific technical issues that must be faced by researchers. For all of the challenges listed, there is also an array of fundamental questions to be addressed by researchers in applied mathematics.

This report is a brief synthesis of the effort of approximately 50 experts in the nanosciences, mathematics, and computer science, drawn from universities, industry, and the national laboratories. Following the scientific presentations and a panel discussion on the roles of applied mathematics and computer science, the principal recommendations of the workshop were developed in breakout sessions. Three of them focused on nanoscience directly:

• Well Characterized Nano Building Blocks
• Complex Nanostructures and Interfaces
• Dynamics, Assembly, and Growth of Nanostructures

Three others focused on the role of applied mathematics and computer science:

• Crossing Time and Length Scales
• Fast Algorithms
• Optimization and Predictability

There were, of course, other possible ways to organize the discussions, but the outcome would likely have been the same no matter what the organization of the topics. Nevertheless, the structure of this report reflects this particular selection of categories for the breakout sessions.

Giant Magnetoresistance in Magnetic Storage

The giant magnetoresistance (GMR) effect was discovered in 1988 and within a decade was in wide commercial use in computer hard disks (Figure 3) and magnetic sensors. The technology significantly boosted the amount of information that could be recorded on a magnetic surface (Figure 4). The unprecedented speed of application (less than 10 years from discovery to deployment) resulted largely from advances in theory and modeling that explained the microscopic quantum-mechanical processes responsible for the GMR effect.

Figure 3. GMR and MR head structures. (IBM)

Figure 4. Magnetic head evolution. (IBM)

Figure 5. Schematic of GMR indicating change in resistance accompanying magnetization reversal upon sensing an opposing bit. (IBM)

The effect is observed in a pair of ferromagnetic layers separated by a (generally) nonmagnetic spacer. It occurs when the magnetic moment in one ferromagnetic layer is switched from being parallel to the magnetic moment of the other layer to being antiparallel (Figure 5). When a read head senses a magnetic "bit" on the storage medium, it switches the magnetic moment on one layer and measures the resistance, thus sensing the information on the disk.

Soon after the discovery of GMR, the phenomenon was seen to be related to the different "resistances" of electrons having different spins (up or down). In fact, the magnetic moment itself results from an imbalance in the total number of electrons of each spin type. Modern quantum-mechanical computational methods based on density functional theory (for which Walter Kohn was awarded the 1998 Nobel Prize) then provided a detailed picture of the electronic structure and electron (spin) transport in these systems. Indeed, some unexpected effects, such as spin-dependent channeling, were first predicted on the basis of first-principles calculations and were only later observed experimentally.

In fact, GMR is only one aspect of the rich physics associated with the magnetic multilayers now used in read heads. Equally important are oscillatory exchange coupling (which is used to engineer the size of the magnetic field required to switch the device) and exchange bias (which is used to offset the zero of the switching field in order to reduce noise). For exchange coupling, a detailed understanding of the mechanisms responsible has been obtained on the basis of first-principles theory. Less is known about exchange bias, but significant progress has been made on the basis of simplified models and with first-principles calculations of the magnetic structure at the interface between a ferromagnet and an antiferromagnet.

Impressive as these advances in theory and modeling have been, their application to nanomagnetism has only just begun. New synthesis techniques have been discovered for magnetic nanowires, nanoparticles, and molecular magnets; magnetic semiconductors with high Curie temperatures have been fabricated; and spin-polarized currents have been found to drive magnetic-domain walls. When understood through theory and modeling, these findings are also likely to lead to technological advances and commercial applications.
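The spin-dependent "resistances" described above are often captured, as a first approximation, by Mott's two-current picture, in which spin-up and spin-down electrons conduct through independent parallel channels. A minimal sketch (an illustration, not from the report), assuming each ferromagnetic layer presents a resistance r_low to majority-spin electrons and r_high to minority-spin electrons:

```python
def gmr_ratio(r_low, r_high):
    """Two-current (Mott) model of a GMR trilayer.

    Parallel alignment: one spin channel sees r_low in both layers and the
    other sees r_high in both; the two channels conduct in parallel.
    Antiparallel alignment: each spin channel sees r_low + r_high.
    Returns (R_antiparallel - R_parallel) / R_parallel.
    """
    r_parallel = 1.0 / (1.0 / (2 * r_low) + 1.0 / (2 * r_high))
    r_antiparallel = (r_low + r_high) / 2.0  # two identical channels in parallel
    return (r_antiparallel - r_parallel) / r_parallel

# The larger the spin asymmetry of the scattering, the larger the effect:
print(gmr_ratio(1.0, 5.0))   # strong asymmetry -> large resistance change
print(gmr_ratio(2.0, 2.0))   # no spin asymmetry -> no GMR
```

The chosen resistance values are arbitrary; the closed form of this model, (r_high − r_low)² / (4 r_low r_high), shows the effect vanishing when the two spin channels scatter identically.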

II. Theory, Modeling, and Simulation in Nanoscience

The challenges presented by nanoscience and nanotechnology are not simply restricted to the description of nanoscale systems and objects themselves, but extend to their design, synthesis, interaction with the macroscopic world, and ultimately large-scale production. Production is particularly important if the technology is to become useful in society. Furthermore, the experience of the participants in the workshop from the engineering community strongly suggests that a commercial enterprise will not commit to large-scale manufacture of a product unless it can understand the material to be manufactured and can control the process to make products within well-defined tolerance limits.

For macroscopic systems—such as the products made daily by the chemical, materials, and pharmaceutical industries—that knowledge is often largely or exclusively empirical, founded on an experimental characterization over the ranges of state conditions encountered in a manufacturing process, since macroscopic systems are intrinsically reproducible. Increasingly, however, this characterization is based on molecular modeling, including all of the tools relevant to modeling nanoscale systems, such as electronic structure, molecular simulation, and mesoscale modeling methods. Indeed, the report Technology Vision 2020: The U.S. Chemical Industry1 has identified molecular modeling as one of the key technologies that the chemical industry needs to revolutionize its ability to design and optimize chemicals and the processes to manufacture them.

Molecular modeling already plays a major role in the chemical, materials, and pharmaceutical industries in the design of new products, the design and optimization of manufacturing processes, and the troubleshooting of existing processes. However, the exquisite dependence on details of molecular composition and structures at the nanoscale means that attempting to understand nanoscale systems and control the processes to produce them based solely on experimental characterization is out of the question.

The field of nanoscience is quite broad and encompasses a wide range of yet-to-be-understood phenomena and structures. It is difficult to define nanoscience precisely, but a definition consistent with the National Nanotechnology Initiative is:

The study of structures, dynamics, and properties of systems in which one or more of the spatial dimensions is nanoscopic (1–100 nm), thus resulting in dynamics and properties that are distinctly different (often in extraordinary and unexpected ways that can be favorably exploited) from both small-molecule systems and systems macroscopic in all dimensions.

Rational fabrication and integration of nanoscale materials and devices offers the promise of revolutionizing science and technology, provided that principles underlying their unique dynamics and properties can be discovered, understood, and fully exploited. However, functional nanoscale structures often involve quite dissimilar materials (for example, organic or biological in contact with inorganic), are frequently difficult to characterize experimentally, and must ultimately be assembled, controlled, and utilized by manipulating quantities (e.g., temperature, pressure, stress) at the macroscale.

1 American Chemical Society et al., 1996, http://membership.acs.org/i/iec/docs/chemvision2020.pdf.

This combination of features puts unprecedented demands on theory, modeling, and simulation: for example, due to nanoscale dimensionality, quantum effects are often important, and the usual theories valid either for bulk systems or for small molecules break down. Qualitatively new theories are needed for this state of matter intermediate between the atomic scale and a large-enough scale for collective behavior to take over, as well as methods for connecting the nanoscale with finer and coarser scales.

Within the context of this definition of nanoscience and the need for significant advances in our understanding of the nanoscale, three breakout sessions on theory, modeling, and simulation in nanoscale science considered three broad classes of nanosystems:

• The first class consisted of nano building blocks (such as nanotubes, quantum dots, clusters, and nanoparticles), some of which can be synthesized quite reproducibly and well characterized experimentally.

• The second class consisted of complex nanostructures and nano-interfaces,2 reflecting the importance of nano-interfaces in a wide variety of nanoscale systems. This session was specifically asked to consider steady-state properties of complex nanostructures and nano-interfaces.

• The focus of the third session was dynamics, assembly, and growth of nanostructures, and so was concerned with the dynamical aspects of complex nanostructures. Hence, transport properties (such as electron and spin transport and molecular diffusion) of complex nanostructures, as well as the dynamic processes leading to their creation, particularly self-assembly, were central to this session.

The reports presented below are summaries drafted by the session chairs. Given the ostensibly different charges presented to each group, there is remarkable unanimity in their conclusions with respect to the outstanding theory, modeling, and simulation needs and opportunities. Among the common themes are:

• the need for fundamental theory of dynamical electron structure and transport
• algorithmic advances and conceptual understanding leading to efficient electronic structure calculations of large numbers of atoms
• new theoretical treatments to describe nano-interfaces and assembly of nanostructures
• fundamentally sound methods for bridging length and time scales and their integration into seamless modeling environments
• commitment to an infrastructure for community-based open-source codes.

These common themes have been incorporated into the recommendations and challenges outlined in Section I of this report.

2 We can distinguish between interfaces in general and nano-interfaces as follows: Typically, an interface separates two bulk phases; we define, consistent with the above definition, a nano-interface as one in which the extent of one or more of the phases being separated by the interfaces is nanoscopic. In nano-interfaces we include nano-interconnects, which join two or more structures at the nanoscale. For nano-interfaces, traditional surface science is generally not applicable.

A. Nano Building Blocks

Well characterized building blocks for nanoscience need to be created and quantitatively understood for a number of crucial reasons. A key aspect of construction, whether at the macroscopic or microscopic scale, is the notion of a "building block." Office buildings are often made from bricks and girders, molecular components from atoms. Just as knowledge of the atom allows us to make and manipulate molecular species, knowledge of bricks and mortar allows us to construct high-rise buildings.

Well-characterized nano building blocks will be the centerpiece of new functional nanomechanical, nanoelectronic, and nanomagnetic devices. A quantitative understanding of the transport, dynamics, electronic, magnetic, thermodynamic, and mechanical properties is crucial to building at the nanoscale. As an example, current theoretical approaches to nanoelectronics often cannot accurately predict I-V curves for the simplest molecular systems. Without a coherent and accurate theoretical description of this most fundamental aspect of devices, progress in this area will necessarily be limited.

Theoretical studies for these systems will necessarily have to be based on approximate methods; the systems are simply too large and too complex to be handled purely by ab initio methods. Even if the problems are tractable, we will make much greater progress with well-validated approximate models. Thus a variety of benchmarking methodologies and validation testing are needed to establish the accuracy and effectiveness of new models and approximate theory.

At the nanoscale, what are the nano building blocks that would play an analogous role to macro building blocks? Several units come to mind, which range from clusters less than 1 nm in size to nanoparticles, such as aerosols and aerogels much larger than 100 nm. However, we believe the best-characterized and most appropriate building blocks are:

• clusters and molecular nanostructures
• nanotubes and related systems
• quantum wells, wires, and dots.

These blocks are well defined by experiment and frequently tractable using contemporary theory. Moreover, they have been demonstrated to hold promise in existing technology. We believe that building blocks would have an immediate impact in the five areas described below.

Transport in Nanostructures: Electronic Devices

Any electronic device needs to control and manipulate current. At the nanoscale, new phenomena come into play. Classical descriptions become inoperable compared to quantum phenomena such as tunneling, fluctuations, confined dimensionality, and the discreteness of the electric charge. At these dimensions, the system will be dominated entirely by surfaces. This presents further complications for electronic materials, which often undergo strong and complex reconstructions. Defining an accurate structure at the interface will be difficult. However, without an accurate surface structure, it will be even more difficult to compute any transport properties. Moreover, current-induced changes within the structure may occur. This situation will require the difficult, but not intractable, approach of a careful self-consistent solution in a nonequilibrium environment.

Optical Properties on the Nanoscale: Optoelectronic Devices

At confined dimensions, optical properties of matter are often dramatically altered. For example, silicon is the dominant electronic material, but it has poor optical properties for optoelectronic devices such as solar cells or lasers. However, at small dimensions the properties of silicon can be dramatically altered; the band gap in silicon can be blue-shifted from the infrared to the optical region. One of the first manifestations of this effect occurs in porous silicon, which exhibits remarkable room-temperature luminescence. To properly capitalize on such phenomena, a deeper quantitative understanding of the optical properties of matter will be required.

Optical excitations are especially challenging because most commonly used methods for structural energies, such as density functional theory, are not well suited for excited-state properties. The problem is exacerbated for nanoscale systems, where the many-body effects are enhanced by the physical confinement of the excitation.

Coherence/Decoherence Tunneling: Quantum Computing

While silicon dominates electronic devices, there is a fundamental limit to this technology. As the size of a transistor is reduced each year, eventually one will approach the new size regimes where traditional electronics must be viewed in different terms. The approach to this regime will happen in the not too distant future. Current lithographic techniques can create semiconductor chips with characteristic lengths of only a few hundred nanometers; quantum effects are likely to dominate at tens of nanometers. At this latter length scale, transistor technology will be solidly in the regime of quantum phenomena.

Components will eventually function solely through quantum effects, which we will need the capability to simulate or predict accurately in order to control. It has been suggested that new computers can be devised at nanoscales that will function by using the creation of coherent quantum states. By creating and maintaining such coherent states, it should be possible to store enormous amounts of information with unprecedented security. In order to construct such new quantum computers, new fabrication methods will need to be found and guided by theory. Current technology is far removed from the routine generation of quantum logic gates and quantum networks. Without a better quantitative description of the science at the nanoscale, the emergence of quantum computing will not happen.

Soft/Hard Matter Interfaces: Biosensors

Nanotechnology has the potential to make great advances in medical technology and biosecurity. The technologies for controlling and processing materials for semiconductor devices can be used to make new types of nano-interfaces. Nanostructure synthesis and the joining of biological or soft matter with traditional semiconductors like silicon would offer new pathways to the construction of novel devices. For example, one could gain the ability to develop new types of sensors directly integrated into current device technology. A number of clear applications, such as biosensors for national security, drug delivery and monitoring, and disease detection, are plausible.

However, numerous complex and difficult issues must be addressed. These include quantitative descriptions of nanofluids, reaction kinetics, and selectivity to particular biological agents. Perhaps no systems are as complex at the nanoscale as these, because they require an understanding of the details of interactions between quite diverse systems, i.e., soft, biological materials inter-

15 faced with semiconductors or other in especially useful in providing benchmarks organic solids. for these building blocks, where experi- mental data is lacking or is unreliable due to Spintronics: Information Technology unknown variability or uncontrolled critical Spintronics centers on controlling the spin of function characteristics. Embedding meth- electrons in addition to controlling other ods whereby one can take the best features transport properties. For example, dilute of a particular approach will be crucial. magnetic semiconductors have attracted Algorithms for particular aspects of considerable attention, because they hold the nanoscale systems need to be developed. For promise of using electron spin, in addition to example, ab initio methods for optical prop- charge, for creating a new class of spintronic erties such as GW and Bethe-Salpeter meth- semiconductor devices with unprecedented ods are known to be accurate and robust, yet functionality. Suggested applications include these systems are limited in their applicabil- spin field effect transistors, which could al- ity to rather small systems, e.g., to systems low for software reprogramming of the mi- of less than 50 atoms. The essential physical croprocessor hardware during run time; features of these methods need to be adapted semiconductor-based spin valves, which to address much larger systems. would result in high-density, non-volatile Nonequilibrium and quantum dynamics semiconductor memory chips; and even spin present special problems. These systems of- qubits, to be used as the basic building block ten lack clear prescriptions for obtaining re- for quantum computing. liable results. Some recent attempts to match At the nanoscale, spintronic devices of- experiments on electron transport through fer special challenges owing to the enhanced molecular systems often get the correct exchange terms arising from quantum con- transport trends, but not quantitative num- finement. 
Few quantitative studies exist on bers. A number of entanglements exist: Is magnetic properties of quantum dots and the the experimental data reliable owing to evolution of ferromagnetic properties with problems with contacts? Or is there some size. fundamental component missing from our understanding? New experiments will need Implications for Theory and Modeling to be designed to ensure reproducibility and These five areas of research offer great the validity of the measurements. New theo- promise but will require new theories (ap- ries will need to be constructed and cross- proximate models) and computationally in- checked by resorting to the fundamental tensive studies. Current algorithms must be building blocks. made more efficient and sometimes more A chronic and continuing challenge accurate, or new algorithms must emerge in arising in most nanosystems concerns a lack order to address the challenges highlighted of understanding of structural relationships. above. For example, ab initio methods can Often nanosystems have numerous degrees address systems with hundreds of atoms, of freedom that are not conducive to simple and empirical methods can handle tens of structural minimizations. Simulated anneal- thousands of atoms. To fully exploit these ing, genetic algorithms, and related tech- approaches, algorithms need to be made niques can be used for some systems, but in scalable and take full advantage of current general it will require new optimization computer technology, i.e., highly parallel techniques in order to obtain a quantitative environments. Fully ab initio methods are

description of the structure of nanoscale systems.

While great strides have been made in simulation methods, a number of fundamental issues remain. The diversity of time and length scales remains a great challenge at the nanoscale. Rare-event simulation methods will be essential in describing a number of phenomena such as crystal growth, surface morphologies, and diffusion. Of course, the transcription of quantum force fields to classical interatomic potentials is a difficult task. Intrinsic quantum attributes like hybridization and charge transfer remain a challenge to incorporate into classical descriptions. At best, current classical-like descriptions can be used for targeted applications.

In order to facilitate dissemination of new algorithms and numerical methods, open-source software should be encouraged and made widely available. Toolkits and efficient codes should be offered without restrictions for distribution to as wide an audience as possible.
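Simulated annealing, named above as one route to structural minimization, can be sketched in a few lines. This toy version (an illustration, not from the report) uses Metropolis acceptance with geometric cooling to find the global minimum of a tilted double-well "energy landscape," E(x) = (x² − 1)² + 0.3x, whose deeper well lies near x = −1; all parameter values are hypothetical:

```python
import math
import random

def energy(x):
    """Tilted double well; the global minimum is near x = -1."""
    return (x * x - 1.0) ** 2 + 0.3 * x

def anneal(steps=20000, t_start=2.0, t_end=1e-3, seed=1):
    """Metropolis simulated annealing with geometric cooling."""
    rng = random.Random(seed)
    x = 1.0                       # start in the shallower (wrong) well
    e = energy(x)
    cool = (t_end / t_start) ** (1.0 / steps)
    t = t_start
    for _ in range(steps):
        x_trial = x + rng.gauss(0.0, 0.2)    # random trial move
        e_trial = energy(x_trial)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-dE/T) so barriers can be crossed.
        if e_trial < e or rng.random() < math.exp(-(e_trial - e) / t):
            x, e = x_trial, e_trial
        t *= cool
    return x, e

x_min, e_min = anneal()
print(x_min, e_min)
```

At high temperature the walker hops freely between wells; as the temperature drops it settles into a well bottom, which is why the schedule, not just the move set, governs the quality of the final structure.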

B. Complex Nanostructures and Interfaces

Central to nanoscience is the assembly and manipulation of fundamental building blocks on the nanoscale to create functional structures, materials, and devices (Figure 6). Unlike traditional materials, nanostructured materials that will enable the nanotechnology revolution will be highly complex and heterogeneous in shape and substance, composed of dissimilar materials, and dominated by nano-interfaces (Figure 7). These characteristics require that new theoretical approaches and computational tools be developed to provide a fundamental understanding of complexity at the nanoscale, the physics and chemistry of nano-interfaces, and how interfaces and complexity at the nanoscale control properties and behavior at larger scales.

Nano-interfaces and complexity at the nanoscale play a central role in nearly all aspects of nanoscience and nanotechnology. For example, in molecular electronic devices, DNA and other biomolecules such as proteins are being explored both as assemblers of molecular components like carbon nanotubes and semiconductor quantum dots (Figure 8), and as active components themselves (Figures 9 and 10). In such devices, soft biomolecular matter and hard matter will come together at complex nano-interfaces across which transport of charge may occur, or which may provide structural stability. In new spintronic devices, the nano-interfaces between ferromagnetic and antiferromagnetic domains may control the speed of switching needed for faster computers. In polymer nanocomposites, where nanoscopic particles are added to a polymer, interactions at the polymer/particle nano-interface can have profound effects on bulk properties, leading to stronger yet lighter-weight structural and other materials. The manipulation of interfaces in soft matter systems at the nanoscale provides a wealth of opportunities to design functional, nanostructured materials for templating, scaffolds for catalysis or tissue growth, photonic devices, and structural materials (Figure 11).

The high surface-to-volume ratio resulting from the prevalence of nano-interfaces, combined with the complexity of these nano-interfaces and of nanostructures, provides much of the challenge in developing predictive theories in nanoscience. In this respect, materials designed at the nanoscale are distinctly different from traditional bulk materials. One of the largest driving forces behind the need for new theory and simula-

Figure 6. Three-dimensional view of Charles Lieber's concept for a suspended crossbar array shows four nanotube junctions (device elements), with two of them in the "on" (contacting) state and two in the "off" (separated) state. The bottom nanotubes lie on a thin dielectric layer (for example, SiO2) on top of a conducting layer (for example, highly doped silicon). The top nanotubes are suspended on inorganic or organic supports (gray blocks). Each nanotube is contacted by a metal electrode (yellow blocks). (IBM)

Figure 7. Schematic of an assembled structure of nanoscopic building blocks tethered by organic "linkers." The structure is dominated by interfacial connections. (S. C. Glotzer, 2002)

Figure 8. Top: Schematic of nanoscale building blocks tethered by DNA. The "soft/hard" interface in this complex nanostructure controls the transport of information through the structure. Bottom: Schematic of nanoscopic gold nanoparticles assembled onto a surface with DNA. Such guided assembly can be used, e.g., to position the nanoparticles for fabrication into a nanowire.

Figure 9. Atomistic simulation of a nanostructure composed of polyhedral oligomeric silsesquioxane cages. (J. Kieffer, 2002)

Figure 10. Simulation of organic oligomers attracted to the surface of a nanoscopic quantum dot. (F. W. Starr and S. C. Glotzer, 2001)

Figure 11. Simulation of nanostructured domains in a polymer blend where the interfaces are controlled at the nanoscale. (S. C. Glotzer, 1995)

tion in nanoscience is the fact that there are few experimental techniques capable of directly imaging or probing nano-interfaces. Typically, those techniques having atomic-scale resolution require extensive theory to produce more than qualitative information. Indeed, it is likely that the fundamental interfacial and spatial information required to predict the behavior and performance of nanostructures will in many cases come from simulation and theory.

Focused theory and simulation at a wide range of scales is needed to develop a quantitative understanding of nano-interfaces and their consequent influence on the properties and emergent behavior of nanoscale materials and devices. Breakthroughs in theory and simulation will also be necessary in order to describe and predict the complexity of nanostructures and the relationship between structural complexity and emergent properties and behavior.

A major challenge for nanoscience in which theory, modeling, and simulation will play a pivotal role is the definition of the nano-interface, which is nontrivial because the definition may depend upon the properties or behavior of interest. For example, the relevant nano-interface leading to a particular mechanical behavior may be different from that for transport properties, such as electrical transport. Strategies for addressing nano-interfaces will also be needed. With an appropriate definition of nano-interfaces, new approaches can be developed that characterize relevant nano-interfaces in order to construct quantitatively predictive theories that relate the nano-interfaces to interesting nanoscale properties and behavior.

To develop theories and a quantitative understanding of nano-interfaces, it will be necessary to obtain information on the organization of matter at nano-interfaces, that is, the arrangement, composition, conformation, and orientation of matter at hard-hard, hard-soft, and soft-soft nano-interfaces. Much of this information may only be accessible by simulation, at least for the foreseeable future. The geometry and topology of nano-interfaces, bonding at nano-interfaces, structure and dynamics at or near a nano-interface, transport across nano-interfaces, confinement at nano-interfaces, deformation of nano-interfaces, and activity and reactivity at nano-interfaces must all be investigated by simulation. Such information is needed before very accurate theories relating all of these to material and device operations and behavior can be fully formulated and validated.

For example, to properly design, optimize, and fabricate molecular electronic devices and components, we must quantitatively understand transport across soft/hard contacts, which in turn will likely depend on the arrangement of components on the surface and the complexity of structures that can be assembled in one, two, and three dimensions. Comprehending transport will likely require a quantitative understanding of the nature of nano-interfaces between biological matter and hard matter, between synthetic organic matter and hard matter, and between biological matter and synthetic organic matter. It will also require optimization of structures and tailoring of nano-interfaces, which in turn will require a quantitative understanding of the stability of nano-interfaces.

As we move towards the goal of first-principles-based, fully theoretical prediction of the properties of nanoscale systems, there is an urgent need to critically examine the application of bulk thermodynamic and statistical mechanical theories to nanostructures and nano-interfaces, where violations of the second law of thermodynamics have been predicted and observed, and where mechanical behavior is not described by continuum theories.

Theories are required that abstract from behavior at a specific nano-interface to predict the behavior of extended structures or material that is comprised primarily of nano-interfaces. Additionally, when describing nanostructures that are collections of a countable number of fundamental building blocks, one must question whether the concept of temperature has the same meaning as in a bulk system. An important task for theoreticians and simulationists will be to identify what can be brought to nanoscience from traditional methods of investigation and description, and then determine what new theories or approaches must be invoked at the nanoscale.

When materials or devices are dominated by nano-interfaces, new physics is likely to result. Examples of nano-interface-dominated physics include the electric field at surfaces, luminescence of rare-earth orthosilicates, surface-enhanced Raman scattering, and deformation of nanocrystalline materials. Nanocomposites and nanoassemblies are just two examples of nanostructures where nano-interfaces are prevalent and may dominate behavior at larger scales. The high surface-to-volume ratio of complex nanostructures may also lead to novel collective or emergent phenomena, which would require new theory and simulation efforts to address and understand.

Because of the present lack of theories to fully describe nano-interfaces at the nanoscale and the ramifications of nano-interfaces on larger scales, and the lack of experimental data on nano-interfaces, simulations will play a pivotal role in advancing knowledge in this area. Ab initio calculations in particular will be required, but the present state of the art limits the number of atoms to numbers too small to obtain meaningful data in many instances. Consequently, breakthroughs in methodologies will be required to push ab initio simulations to larger scales. Ab initio simulations containing on the order of 10,000 atoms would enable, for example, the study of spin structure at antiferromagnetic-ferromagnetic nano-interfaces.

New or improved computational methods are also needed to simulate the dynamics of nano-interfaces from first principles. Such simulations will be needed to predict, for example, electronic transport across nano-interfaces, as well as to provide validation or benchmarks for coarser-grained theoretical descriptions of nano-interfaces in which electronic structure is not explicitly considered. For molecular dynamics and other classical, particle-based methods, new (for many purposes reactive) force fields capable of describing nano-interfaces between dissimilar materials are urgently needed. For some problems, further coarse-graining will be required to capture complexity on all relevant length and time scales, and thus new mesoscale theories and computational methods will be needed. To connect the properties and behavior of complex nanostructures and/or nano-interfaces to properties and behavior at the macroscale, it may be necessary to develop new constitutive laws (and to ask whether it is even appropriate to talk about constitutive laws in systems where properties and behavior are dominated by nano-interfaces) and then develop and integrate theories at different levels.
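The classical, particle-based methods referred to above advance atomic positions under an empirical force field. A minimal sketch (an illustration, not from the report) of velocity-Verlet integration for two unit-mass atoms on a line interacting via a Lennard-Jones potential, in reduced units (epsilon = sigma = 1):

```python
def lj_force(r):
    """Lennard-Jones force on the farther atom, reduced units.

    U(r) = 4 (r^-12 - r^-6);  F(r) = -dU/dr = 24 (2 r^-13 - r^-7).
    Positive f pushes the atoms apart, negative f pulls them together.
    """
    return 24.0 * (2.0 * r**-13 - r**-7)

def verlet_step(x1, x2, v1, v2, dt=0.001):
    """One velocity-Verlet step for two unit-mass atoms."""
    f = lj_force(x2 - x1)
    a1, a2 = -f, +f                       # equal and opposite forces
    x1 += v1 * dt + 0.5 * a1 * dt * dt
    x2 += v2 * dt + 0.5 * a2 * dt * dt
    f_new = lj_force(x2 - x1)             # forces at the updated positions
    a1_new, a2_new = -f_new, +f_new
    v1 += 0.5 * (a1 + a1_new) * dt        # average old and new accelerations
    v2 += 0.5 * (a2 + a2_new) * dt
    return x1, x2, v1, v2

def total_energy(x1, x2, v1, v2):
    r = x2 - x1
    return 0.5 * (v1 * v1 + v2 * v2) + 4.0 * (r**-12 - r**-6)

# Start slightly stretched and at rest; energy should be well conserved:
state = (0.0, 1.5, 0.0, 0.0)
e0 = total_energy(*state)
for _ in range(5000):
    state = verlet_step(*state)
print(e0, total_energy(*state))
```

Velocity Verlet is the standard workhorse here because it is time-reversible and symplectic, so the total energy stays bounded over long runs; the dissimilar-materials force fields the report calls for would replace `lj_force` with far more elaborate (possibly reactive) interactions.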

C. Dynamics, Assembly, and Growth of Nanostructures

The focus of this section is the time-dependent properties of nanostructures (such as transport properties) and the time-dependent processes used to produce nanostructures and nanostructured materials (such as directed self-assembly, nucleation and growth, and vapor deposition methods). The primary route to manufacturable functional nanoscale systems is universally conceded to be self-assembly, which may be directed or not. Self-assembly of nanostructured materials is scalable to industrial levels of production because it does not require control at the nanoscale (such as, for example, manipulation via AFM tip). There is a wide variety of transport mechanisms relevant to nanoscience, including electron transport (relevant to molecular electronics, nanotubes, and nanowires), spin transport (relevant to spintronics-based devices, see Figure 12), and molecule transport (relevant to chemical and biological sensors, molecular separations/membranes, and nanofluidics).

Figure 12. Spintronics devices utilize an electron's spin rather than its charge. Shown here is an artist's depiction of a proposal by Bruce Kane of the University of Maryland for a quantum computer based on the nuclear spin of phosphorus atoms (green), which interact by means of spin-polarized electrons (red). The quantum properties of superposition and entanglement may someday permit quantum computers to perform certain types of computations much more quickly using less power than is possible with conventional charge-based devices. (From S. Das Sarma, "Spintronics," American Scientist 89, 516 (2001))

Examples of nanostructures and nanostructured systems formed by (or capable of formation by) self-assembly include quantum dot arrays, nano- and microelectromechanical systems (NEMS and MEMS), nanoporous adsorbents and catalytic materials (produced by templated self-assembly, see Figure 13), nanocrystals, and biomimetic materials. This list is by no means exhaustive.
Figure 13. Templated self-assembly of nanoporous material—in this case, MCM-41. (From X. S. Zhao, "Synthesis, modification, characterization and application of MCM-41 for VOC Control," Ph.D. thesis, University of Queensland, 1998)

The role that mathematics and computer science researchers can play is to apply mathematical and computational tools developed in other contexts to the understanding, control, design, and optimization of nanostructured materials and functional systems based on nanoscale structures. Each of these elements—understanding, control, design, and optimization—is essential to the manufacturability of systems composed of nanoscale objects, as noted by Stan Williams of Hewlett-Packard in the opening talk of the workshop.

Among the major challenges facing theory, modeling, and simulation (TMS) in the dynamics, assembly, and growth of nanostructures are the simulation of self-assembly (which typically involves many temporal and spatial scales, and many more species than the final product), methods for calculating electron and spin transport properties in nanoscale systems, and the processing and control (particularly to achieve more uniform size distribution) of self-assembly of heterostructures composed of hard materials (such as group IV semiconductors, obtained by combined ion and molecular beam epitaxial growth).

The TMS methods needed to faithfully model the dynamics, assembly, and growth of nanostructures consist of the usual methods we associate with TMS (electronic structure methods, atomistic simulation, mesoscale and macroscale methods) but with additional variations specific to these problems. These variations include:

• Hybrid electronic structure/molecular dynamics methods scalable to very large systems are needed to model combined reaction and structure evolution.
• Coarse-graining/time-spanning methods are crucial, owing to the long times associated with self-assembly and the large microscopic and mesoscopic domains that exist in many systems.
• Since many self-assembly processes take place in solution, electronic structure methods incorporating solvation are necessary.
• Molecular and nanoscale methods that incorporate phase equilibria and nonequilibrium multiphase systems (including interfaces) are required.

Participants also saw the necessity of traditional finite element elastic/plastic codes for stress/strain description at the macroscale, since ultimately nanostructured materials must connect with the macroscopic world. It was noted that upscaling (going to coarser length and time scales) typically results in the introduction of stochastic terms to reflect high-frequency motion at smaller scales; thus, more rigorous methods for including stochasticity and more effective methods for characterizing and solving stochastic differential equations are required.

New developments in nonequilibrium statistical mechanics, of both classical and quantum systems, are needed. For example, the recent discovery by nonequilibrium molecular dynamics, confirmed by nonlinear response theory, of violations of the second law of thermodynamics for nanoscale systems indicates the importance of new theoretical developments focused on nanoscale systems.

Quantifying errors in calculated properties was identified as a significant problem that consists of two parts—characterization of uncertainty due to inaccuracies inherent in the calculation (such as limitations of a force field in molecular dynamics or of a basis set in electronic structure calculations) and performing useful sensitivity analyses. The important aspect of nanoscale systems that makes them such a TMS challenge is that the characterization of uncertainty will often have to be done in the absence of experimental data, since many of the property measurement experiments one would like to perform on nanoscale systems are impossible in many cases or unlikely to be done in cases where they are possible. Hence, the concept of self-validating TMS methods arises as a significant challenge in nanoscience.
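The stochastic terms introduced by upscaling, noted above, can be illustrated with a minimal overdamped Langevin (Brownian dynamics) integrator, in which eliminated high-frequency motion appears as friction plus Gaussian noise linked by the fluctuation-dissipation theorem. The Python sketch below is illustrative only; the harmonic potential and all parameter values are assumptions, not taken from the report.

```python
import math
import random

def brownian_trajectory(x0, k, gamma, kT, dt, nsteps, seed=0):
    """Overdamped Langevin dynamics in a harmonic well U(x) = k*x**2/2.

    Coarse-graining away fast (e.g., solvent) degrees of freedom leaves
    a deterministic drift -U'(x)/gamma plus Gaussian noise whose
    variance, 2*kT*dt/gamma, is fixed by fluctuation-dissipation.
    """
    rng = random.Random(seed)
    x = x0
    traj = []
    noise = math.sqrt(2.0 * kT * dt / gamma)
    for _ in range(nsteps):
        x += -k * x * dt / gamma + noise * rng.gauss(0.0, 1.0)
        traj.append(x)
    return traj

# Equipartition check: the stationary variance should approach kT/k.
traj = brownian_trajectory(x0=0.0, k=1.0, gamma=1.0, kT=0.5, dt=0.01,
                           nsteps=200000)
var = sum(x * x for x in traj) / len(traj)
```

With kT = 0.5 and k = 1, the sampled variance converges to roughly 0.5; the same construction, with a real force field in place of the toy potential, underlies Brownian dynamics and related coarse-grained particle methods.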

By self-validating TMS methods we mean that a coarser-grained description (whether it be atomistic molecular dynamics or mesoscale modeling, for example) is always validated against more detailed calculations (e.g., electronic structure calculations in the case of atomistic molecular dynamics, and atomistic molecular dynamics in the case of mesoscale modeling).

Various computational issues were identified as important for computational nanoscience. In particular, an open source software development framework for multiscale modeling was identified as a real need. The OCTA project in Japan (http://octa.jp) may point the way towards development of such an open source software development framework, or may indeed be the appropriate framework on which to expand simulation capabilities. Visualization of nanoscale systems is challenging, since the spatiotemporal scales cover such large orders of magnitude. Truly effective parallelization of Monte Carlo was identified as an important issue. Specifically, efficient domain decomposition parallelization of Monte Carlo on very large systems with long-range forces was seen as an unsolved problem with ramifications for computational nanoscience. Finally, parallel codes for any TMS method that scale efficiently to tens of thousands of processors were targeted. This is because the highest-performance computers are headed towards this number of processors, yet it is not clear that any scalable codes exist at this point for such large machines.

We identified three grand-challenge-scale problems within self-assembly that would require theoretical, algorithmic, and computational advances, and which might be specifically targeted in any future program in computational nanoscience. These are the simulation of self-assembly of templated nanoporous materials, self-assembly of nanostructures directed by DNA and DNA-like synthetic organic macromolecules, and directed self-assembly of quantum dot arrays and other nanoscale building blocks using physical synthesis and assembly methods. Each of these problems has clear relevance to the DOE mission, creating possible paradigm shifts in low-energy separations, catalysis, sensors, electronic devices, structural and functional hybrid materials, and computing. Each involves significantly different chemical and physical systems, so that the range of TMS methods needed varies with each problem: the first involves self-assembly from solution; the second, the use of biologically or bio-inspired directed self-assembly, again in solution; while the third generally does not involve solutions.

In a related example, Figure 14 shows a new computational method for producing random nanoporous materials by mimicking the near-critical spinodal decomposition process used to produce controlled-pore glasses (CPGs). While it produces model CPGs with pore distributions and sizes similar to those of actual CPGs, the method is not a fully detailed simulation of CPG synthesis. Refinement of such techniques to the point where new nanoporous materials can be discovered and synthesized computationally would be a paradigm-altering step forward in nanostructured materials synthesis.

Figure 14. Schematic of controlled-pore glass model preparation. The phase separation process used to prepare real materials is imitated via molecular dynamics simulation, resulting in molecular models with realistic pore networks and structures. (After Gelb et al., Reports on Progress in Physics, Vol. 62, 1999, pp. 1573–1659)
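The Metropolis Monte Carlo method at the center of the parallelization challenge identified above can be sketched in a few lines. The Python example below samples a 1-D Ising chain, for which the exact mean energy per bond is -tanh(beta*J); the model and parameters are illustrative assumptions, not taken from the report. Note that the energy change of a single spin flip is local here, and it is precisely this locality that long-range forces destroy, which is why domain decomposition of Monte Carlo is hard in that setting.

```python
import math
import random

def ising_energy_per_bond(n, beta, sweeps, seed=1):
    """Metropolis Monte Carlo for a 1-D Ising chain with free ends,
    E = -sum_i s_i * s_{i+1} (J = 1). One sweep attempts one flip per
    spin; the flip cost dE depends only on nearest neighbors."""
    rng = random.Random(seed)
    spins = [1] * n
    e_sum, samples = 0.0, 0
    for sweep in range(sweeps):
        for i in range(n):
            h = (spins[i - 1] if i > 0 else 0) + (spins[i + 1] if i < n - 1 else 0)
            dE = 2.0 * spins[i] * h
            if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
                spins[i] = -spins[i]
        if sweep >= sweeps // 5:  # discard burn-in before sampling
            e = -sum(spins[i] * spins[i + 1] for i in range(n - 1))
            e_sum += e / (n - 1)
            samples += 1
    return e_sum / samples

# Exact open-chain result for comparison: <e> per bond = -tanh(beta).
e_mc = ising_energy_per_bond(n=50, beta=1.0, sweeps=10000)
```

Agreement with the exact answer validates the sampler; with long-range couplings, every flip would require communication across all subdomains, illustrating the unsolved decomposition problem noted above.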

III. The Role of Applied Mathematics and Computer Science in Nanoscience

"I am never content until I have constructed a mechanical model of what I am studying. If I succeed in making one, I understand; otherwise I do not. . . . When you measure what you are speaking about and express it in numbers, you know something about it, but when you cannot express it in numbers your knowledge about it is of a meagre and unsatisfactory kind."
—William Thomson (Lord Kelvin), 1824–1907

Kelvin's mathematical triumphalism is from a pre-quantum-mechanical era and must be discounted as chauvinistic in view of the immense understanding of complex phenomena that can be accumulated prior to satisfactory mathematization. Such hard-won understanding of phenomena, in fact, leads eventually to their mathematization. Nevertheless, Kelvin's agenda of quantitative understanding remains a prerequisite for simulation and predictive extrapolation of a system, if not for substantial understanding.

Given the enormous power of computational simulation to transform any science or technology supported by a mathematical model, the mathematization of any new field is a compelling goal. The very complexity and breadth of possibilities in nanoscience and technology cry out for an increasing role for computational simulation today. This section, organized around three themes, indicates some of the directions this increasing role might take: in bridging time and length scales, in fast algorithms, and in optimization and predictability. In each of these representative areas, there are instances of "low-hanging fruit," of problems that are reasonably well defined but whose solutions present significant technical challenges, and of phenomena that are not yet at all well defined by Kelvinistic standards.

In a complementary way, mathematics and simulation would be enormously stimulated by the challenges of nanoscale modeling. While some rapidly developing areas of mathematics, such as fast multipole and multigrid algorithms, are ready to apply (in fact, are already being used) in nanoscale modeling, there is much work to do just to get to the frontier of the science in other areas.

It is important to keep in mind that no one can predict where the next mathematical breakthrough of consequence to nanoscience will come from: it could be from computational geometry or from graph theory; it might come from changing the mathematical structure of existing models. Changing the mathematical structure creates a new language, which often can expose an emergent simplicity in a seemingly complex landscape. For instance, no one could have predicted the significance of space-filling curves (a topological fancy) to the optimal layout of data in computer memories.

However, it is predictable that the forward progress of mathematical modeling will carry first one, then another area of nanoscience forward with it—provided that the minds of nanoscientists and mathematicians alike are prepared to experiment, to look for the points of connection, and to recognize opportunities as they arise. The many obvious connections described below should not overshadow the potential for the emergence of perhaps more important connections between nanoscience and other areas of mathematics in the near future.

Hence, it will be important at this early stage not to define too narrowly what forms of applied mathematics will be important to nanoscience applications. Furthermore, the collaborative infrastructure and cross-fertilization between projects that can be facilitated in a DOE initiative fosters creativity while making recognition of otherwise non-obvious connections and opportunities more likely.

One of the simplest initial objectives of a collaborative dialog between scientists of the nanoscale regime and mathematical modelers is to be sure that the low-hanging fruit is regularly plucked. Mathematical results and software libraries to implement them—many sponsored by the Department of Energy—arrive in a steady stream and readily find application in fields that traditionally engage applied mathematicians, such as engineering mechanics, semiconductor device physics, signal processing, and geophysical modeling. The differential and integral operator equations underlying mathematical physics have been placed on a firm foundation. Issues of existence and uniqueness, convergence and stability of numerical approximation, accuracy and algorithmic complexity, and efficiency and scalability of implementation on the agency's most powerful computers are well understood in many scientific areas.

The variety of simulation tools available is enormous and includes Eulerian-type and Lagrangian-type adaptive methods for the evolution of fields, level sets, and particles. Complex multiscale algorithms have been developed for optimal solution of equilibrium and potential problems. Wonderful tools exist for creating low-dimensional representations of apparently high-information-content phenomena. Stochastic modeling techniques pull averages and higher moments from intrinsically nondeterministic systems. Moreover, the limits of applicability of these tools and techniques are well understood in many cases, permitting their use in a cost-effective way, given a requirement for accuracy. Sensitivity and stability analyses can accompany solutions.

The interaction of scientists and engineers with applied mathematicians leads to continual improvement in scientific and mathematical understanding, computational tools, and the interdisciplinary training of the next generation of each. Due to the novelty of nanoscale phenomena, such interaction with applied mathematicians is often missing. Interactions must be systematically improved in existing and new areas.

At the next level of objectives for a nanoscience-mathematics collaboration lie the problems that are apparently well defined by nanoscientists but for which there are no routinely practical mathematical solutions today. The number of particles that interact, the number of dimensions in which the problem is set, the range of scales to be resolved, the density of local minima in which the solution may be trapped, the size of the ensemble that needs to be examined for reliable statistics, or some other feature puts the existing model beyond the range of today's analysts and the capacities of today's computers. In these areas, some fundamentally new mathematics may be required.

Finally, there exists a set of problems for which mathematicians must come alongside the nanoscientists to help define the models themselves. A critical issue for revolutionary nanotechnologies is that the components and systems are in a size regime about whose fundamental behavior we have little understanding. The systems are too small for easily interpreted direct measurements, but too large to be described by current first-principles approaches.

They exhibit too many fluctuations to be treated monolithically in time and space, but are too few to be described by a statistical ensemble. In many cases, they fall between the regimes of applicability of trusted models. Perhaps existing theories can be stretched. More likely, new theories and hence new mathematical models will be required before robust fundamental understandings will emerge.

Once a theoretical model becomes a trusted companion to experiment, it can multiply the effectiveness of experiments that are difficult to instrument or interpret, take too long to set up, or are simply too expensive to perform. Powerful computational engines of simulation can be called upon to produce, to twist the words of R. W. Hamming, not just insight, but numbers as well. Those numbers are the essence of control and predictability.

A. Bridging Time and Length Scales

Most phenomena of interest in nanoscience and nanotechnology, like many phenomena throughout science and engineering, demand that multiple scales be represented for meaningful modeling. However, phenomena of interest at the nanoscale may be extreme in this regard because of the lower end of the scale range. Many phenomena of importance interact with their environment at macro space and time scales, but must be modeled in any first-principles sense at atomistic scales.

The time scale required to resolve quantum mechanical oscillations may be as small as a femtosecond (10^-15 s), whereas protein folding requires microseconds (a range of 10^9), and engineered nanosystems may require still longer simulation periods; e.g., creep of materials relevant to reliability considerations may be measured in years. Counting trials in probabilistic analyses contributes another time-like factor to the computational complexity, although this latter factor comes with welcome, nearly complete algorithmic concurrency, unlike evolutionary time. Meanwhile, spatial scales may span from angstroms to centimeters (a range of 10^8). The cube of this would be characteristic of a three-dimensional continuum-based analysis. Counting atoms or molecules in a single physical ensemble easily leads to the same range; Avogadro's number is about 10^24. Exploiting the concurrency available with multiple independent ensembles multiplies storage requirements. Space-time resource requirements are staggering for many simulation scenarios. Therefore, nanoscience poses unprecedented challenges to the mathematics of multiscale representation and analysis and to the mathematics of synthesizing models that operate at different scales.

Existing techniques may certainly yield some fruit, but we are undoubtedly in need of fundamental breakthroughs in the form of new techniques. Faced with similar types of challenges in macroscopic continuum problems, applied mathematicians have devised a wide variety of adaptive schemes (adaptation in mesh, adaptation in discretization order, adaptation in model fidelity, etc.). Bringing these schemes to a "science" has required decades, and is not complete in the sense of predictable error control for a given investment, e.g., for some hyperbolic problems. Probably the greatest range of scales resolved purely by mesh adaptivity to date is about 10 decades, in cosmological gravitation problems. Of course, this range is practical only when the portion of the domain demanding refinement is small compared to the domain size required to include all of the objects of interest or, what is often the more demanding requirement, to graft on to reliably posed boundary conditions in the far field.
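The range-of-scales arithmetic quoted earlier in this section is easy to make concrete. A back-of-the-envelope Python sketch follows; the target times are those quoted in the text, and the seconds-per-year figure is rounded.

```python
# Number of femtosecond-resolution timesteps needed to reach the
# physical times quoted in the text (dt = 1e-15 s).
dt = 1.0e-15

targets = {
    "protein folding (1 microsecond)": 1.0e-6,
    "one second": 1.0,
    "one year (materials creep)": 3.15e7,  # ~seconds per year, rounded
}
steps = {name: t / dt for name, t in targets.items()}
# Even the microsecond case already requires ~1e9 explicit timesteps,
# the 10^9 range cited above; a year of creep needs more than 10^22.
```

The hopelessness of brute-force timestepping at the upper end of this table is what motivates the scale-bridging methods surveyed in the remainder of this section.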

Migrating such adaptive schemes into production software (typically through community models, sometimes in commercial analysis packages) lags their scientific development by at least an academic generation, and usually more. The challenge for nanoscience is therefore not only great, but also urgent. Applied mathematicians being trained today can acquire a doctoral degree without ever having been exposed to nanoscale phenomena.

The mathematical models and algorithms of nanoscience cover the gamut of continuous and discrete, deterministic and random. Ultimately, all simulations are executed by deterministic programs (at least, to within the limits of race conditions on parallel processors) on discrete data. The most capable computer platforms of 2002 (costing in excess of $100M) execute at most tens of teraflop/s (10^13 floating point operations per second) and store in fast memory at most tens of terabytes and in disk farms at most hundreds of terabytes. Moreover, these machines do not perform efficiently for models at these capability extremes. A typical extreme problem today encompasses only billions of discrete degrees of freedom (filling storage larger than this with auxiliary data, including geometry, physical constitutive data, and linear algebra workspaces) and runs for days at tens to hundreds of gigaflop/s. Routine problems encompass only millions of degrees of freedom and run for hours at around 1 gigaflop/s.

The wildest extrapolations of Moore's Law for silicon cannot sate the appetite for cycles and memory in nanoscience modeling. It is clear that a brute force, first-principles approach to nanoscale phenomena will not succeed in bridging scales—or perhaps, given the promise of nanotechnology, we should say that it can only succeed through some "spiral" process, whereby nanotechnology yields superior computers, which yield superior nanoscale models, leading to improved nanotechnology and better computers, and so forth.

We begin with a brief inventory of successful scale-bridging models in applied mathematics today.

Mathematical homogenization (or "upscaling")—whereby reliable coarse-scale results are derived from systems with unresolvable scales, without the need to generate the (out of reach) fine-scale results as intermediates—is a success story with extensive theory (dating at least to the 1930s) and applications. The trade-off between accuracy and efficiency is well understood. It is particularly fruitful, for instance, in simulating flows through inhomogeneous porous media or wave propagation through inhomogeneous transmissive media with multiple scatterers.

One begins by considering just two scales. Given a 2 × 2 matrix of interactions—"coarse-coarse," "coarse-fine," "fine-coarse," and "fine-fine," where the first member denotes the scale of an output behavior and the second member the scale of the input that drives the behavior—one can formally solve to eliminate the "fine-fine" block. (If the interactions are all linear, for instance, this can be done with block Gaussian elimination.) This leaves a Schur complement system for the "coarse-coarse" block, where the interaction matrix is the original "coarse-coarse" block minus a triple-product term with the "fine-fine" inverse in the middle. The effect of the triple-product term can often be reliably approximated, yielding a model for coarse scales that incorporates the dynamics of both scales. In principle, this process can be repeated recursively. The standard theory succeeds when there is good separation of scales between the inhomogeneous structure to be "homogenized" and the scales of interest in the solution.
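The block elimination just described can be verified numerically in a few lines. In this illustrative NumPy sketch (the matrix is random and made well conditioned by construction; nothing here is taken from the report), the Schur complement solve for the coarse block reproduces the coarse part of the full solve to machine precision.

```python
import numpy as np

rng = np.random.default_rng(0)
nc, nf = 3, 5                     # coarse and fine degrees of freedom

# Blocked interaction matrix  [A_cc A_cf; A_fc A_ff] x = b,
# shifted by a multiple of the identity to keep it well conditioned.
A = rng.standard_normal((nc + nf, nc + nf)) + (nc + nf) * np.eye(nc + nf)
b = rng.standard_normal(nc + nf)
Acc, Acf = A[:nc, :nc], A[:nc, nc:]
Afc, Aff = A[nc:, :nc], A[nc:, nc:]
bc, bf = b[:nc], b[nc:]

# Schur complement: the coarse-coarse block minus the triple product
# with the fine-fine inverse in the middle.
S = Acc - Acf @ np.linalg.solve(Aff, Afc)
rhs = bc - Acf @ np.linalg.solve(Aff, bf)
xc_schur = np.linalg.solve(S, rhs)

# Reference: coarse components of the monolithic fine-scale solve.
xc_full = np.linalg.solve(A, b)[:nc]
```

In practical homogenization one never forms the fine-fine inverse exactly; the point of the theory is that the triple-product term can be approximated cheaply when the scales are well separated.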

When there is significant interaction across length scales (when the "coarse-fine" and "fine-coarse" interactions are large), the computation of effective properties at the coarse scales becomes more difficult. Work continues on such concepts as "multilevel renormalization," and homogenization should certainly be studied for fruitful approaches to nanoscale science.

Another successful and now quotidian means of bridging scales in modeling is the concept of "space sharing" of different models, with different resolution requirements. For instance, crack propagation in a solid can be effectively modeled with adaptive finite elements for linear elasticity in the mid and far fields, grafted onto molecular dynamics models at and immediately surrounding the crack tip and the newly exposed surface. As another example, the Richtmyer-Meshkov instability in fluid mechanics has been studied with a conventional computational fluid dynamics analysis for Navier-Stokes in the mid and far fields and direct simulation Monte Carlo on "lumped" fluid particles at the mixing interface. Atmospheric reentry problems have similarly been successfully modeled with sparse continuum models away from the body and Boltzmann models in the boundary layer at the body surface.

Very significant scientific and engineering progress can be made on today's high-end computers with such two-physics scale bridging. Less dramatically, one can also cite a multitude of successful dual-continuum model approaches with different asymptotic forms of the governing equations in different regimes, e.g., classical boundary layer theory, in which viscosity terms are computed only in regions of steep gradients. In all of these examples, it is important that the dual models have a regime of overlap, where their solutions can be simultaneously represented and joined. In this sense, these approaches are somewhat complementary to that of homogenization. An even less dramatic example, still, is the scenario in which the physical model remains the same throughout the domain, but the mathematical representation of it changes for computational efficiency. The classic example of this is a combination of finite elements to resolve the geometry around a complex object (e.g., a scatterer) in a differential equation approach and boundary elements to resolve the far field in an integral equation approach. These space-sharing approaches sometimes (depending upon the difficulty of the application) lack the mathematical rigor of adaptive approaches within a single physical and mathematical formulation, but with care to monitor and enforce conservation properties at the interface, they are practical and possibly to be emulated in nanoscale phenomena.

A somewhat more trivial instance of bridging scales is what we might call "time sharing" of models, in analogy to "space sharing." In such cases, different dynamically relevant physics "turns on" at different points in the simulation, so that it is acceptable to step over fast scales during one period, but necessary to resolve them in another. Modeling ignition in a combustion system provides one industrially important example. Cosmology enters again here, with vastly different physics relevant in the very early universe (e.g., the first three minutes) than in the long-term evolution.

It is conceivable, with good software engineering, to marry the concepts of space sharing and time sharing of multiple physics models in a large and complex system, so that each region of space-time adaptively chooses the most cost-effective model and advancement algorithm. One concept in the continuum modeling of multirate systems whose applicability to nanoscale multirate systems should be explored is that of implicit integration methods to "step over" fast time scales that are not dynamically relevant to the result, even though they are timestep limiting, due to stability considerations, in explicit integration.

Examples abound, including acoustic waves in aerodynamics, gravity waves in climate modeling, Alfvén waves in magnetohydrodynamics, and fast reactions between intermediate species in the detailed kinetics of combustion.

In these and many other cases, acceptably accurate results on slower (dynamically relevant) time scales can be achieved by making some equilibrium assumption about the fine dynamics in order to filter out the fast scales, enabling the integration to take place in coarse timesteps. The price for this transformation of the equations to be integrated is usually an implicit solve, in which a mathematical balance is enforced on each timestep. The algorithmic cost of an implicit integrator is much higher per step than that of an explicit integrator, but one can take many fewer steps to get to a desired point in simulation time. Whether this trade-off is a winning one for the overall simulation depends upon the cost of the implicit solve. As a rule of thumb in large-scale partial differential equations (PDEs), if the fast scales are two orders of magnitude or more faster than the dynamical scales of interest, implicit methods usually pay off. If the ratio of scales is 10 or less, they likely do not. It may be that many ranges of time scales required by nanoscientists (e.g., integration intervals lasting billions of quantum mechanical oscillation periods) can be condensed by implicit algorithmics without loss of any relevant physics, following some statistical or equilibrium assumptions.

Retreating further into simple, but often adequate, scale-bridging scenarios, there are many examples of one-way cascades of models, whereby a fine-scale problem is solved off-line to develop constitutive laws or material properties for a coarser-scale problem. It is increasingly practical, for instance, to simulate properties of novel alloys or explosives—materials with simple structure compared to fiber composites or nanotube networks—and then to employ these properties as proxies for experimental data in continuum simulations. Sometimes the properties are quite ad hoc: eddy models in turbulence, for instance, account for mixing at larger scales than molecular by an enhancement of the molecular viscosity in turbulent regimes. Of course, the turbulence simulation community, reflecting a time-honored specialty with a modeling history far too complex to summarize in the scope of this section, copes with multiple scales through as many different means as there are modeling groups. Large-eddy simulation, a fairly rigorous approximation, is emerging as very popular, and its practitioners may have fresh ideas when faced with nanoscale phenomena. Source terms in turbulent reacting flows attempt to account for additional reaction intensity along fractally wrinkled flame fronts by parameterizing a smoother, resolvable flame front area. Both deterministic and probabilistic tools have proved fruitful in tuning to experiments and linking up with fundamental conservation principles. This is a very application-specific technology, as will be, undoubtedly, many of those appropriate for bridging nanoscales.

Although it is not a modeling technique but a solution technique, one should also mention multigrid as a paradigm. Multigrid has already found success in molecular dynamics and in electronic structure algorithms, and is likely to be embraced across a wider range of nanoscale applications. Multigrid has the beautiful property of conquering numerical ill conditioning (essentially caused by the presence of too many scales in view of the numerics all at once) by treating each scale on its own terms.
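Returning to the implicit-versus-explicit trade-off discussed above: the scalar test equation y' = -lambda*y already shows why implicit methods can step over fast scales. Forward Euler is stable only for lambda*dt < 2, while backward Euler is stable for any dt. The Python sketch below uses illustrative parameters, not values from the report.

```python
def forward_euler(lam, dt, nsteps, y0=1.0):
    """Explicit Euler for y' = -lam*y; amplification factor 1 - lam*dt."""
    y = y0
    for _ in range(nsteps):
        y *= 1.0 - lam * dt
    return y

def backward_euler(lam, dt, nsteps, y0=1.0):
    """Implicit Euler: each step solves y_new + lam*dt*y_new = y. The
    'implicit solve' is trivial for a scalar, but becomes a large
    linear or nonlinear system for PDEs."""
    y = y0
    for _ in range(nsteps):
        y /= 1.0 + lam * dt
    return y

# A fast scale (lam = 1000) stepped over with dt = 0.01, so lam*dt = 10:
y_explicit = forward_euler(1000.0, 0.01, 100)   # diverges catastrophically
y_implicit = backward_euler(1000.0, 0.01, 100)  # decays, as the ODE does
```

The explicit result grows by a factor of 9 per step while the implicit one decays, illustrating why the per-step cost of an implicit solve can be worth paying when the fast scales are far from the dynamics of interest.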

by treating each scale on its own terms. This improves linear conditioning and nonlinear robustness of the solution method, and often leads to physical insight as well. For instance, problems in geometric design are often best approached by resolving large-wavelength geometric parameters on coarse grids, and small-wavelength geometric parameters on finer ones. Multigrid depends on a (usually purely mathematical) decomposition into ranges of scales, and on the ability to construct intergrid transfer operators and coarsened versions of the fundamental fine-scale operator on coarser scales.

It is axiomatic to a mathematician that there are better ways than brute-force propagation of fine scales to attack a multiscale problem. A mathematician would argue that anyone requiring a billion degrees of freedom for a week of wall clock execution time is running the wrong algorithm. Indeed, the nanoscale community must learn to compute “smarter” and not just “harder.” Part of this will come from optimal algorithmics (e.g., linear scaling in DFT), but a larger and yet unknown portion of it will come from intrinsically multiscale models, created collaboratively with mathematical scientists.

Another mathematical paradigm that may prove to have a resonance with the nanoscale community is the method known variously as proper orthogonal decomposition or principal component analysis. In both linear and nonlinear problems, it is often possible to represent the underlying dynamics of interest with a relatively small number of dynamical coefficients representing the weight of complex, system-dependent modes. The challenge is to discover the modes of interest, which are almost never as numerous as the number of characteristic basis functions in the finest grid. The finely resolved setting may be required to determine the principal modes, but once found, production computing can be done in a reduced basis.

Some degree of stochasticity is intrinsic at atomistic scales; hence, a Monte Carlo simulation will yield a fluctuation with statistics that must be carried to larger length scales. The numerical analysis of stochastic partial differential equations could enter here, the goal being a coarse solution with the right statistics. The theory of large deviations for stochastic PDEs allows the design of sampling methods for so-called “rare events.” The long time integrations that are burdensome to the accuracy of deterministic models may actually be a benefit to the accuracy of statistical models.

It is easy to construct scenarios in which such modeling is attractive, and probably necessary. We mention here an example that appears to demand vast bridging of scales for unified scientific understanding: the operation of a biosensor to detect trace quantities of a biologically active molecule.

• At the smallest scale, a molecule can be sensed by the luminescence characteristic of an optical energy gap. Quantum Monte Carlo could be used for a high-accuracy comparison with experiment.

• At the next scale, where one needs to gain a quantitative understanding of surface structure and interface bonding, DFT and first-principles molecular dynamics could be employed.

• At the next higher scale, one would need to gain a quantitative understanding of the atomistic coupling of the molecule to be detected with the solvent that transports it to the sensor. This coupling could be modeled with classical molecular dynamics at finite temperature and classical Monte Carlo.

• Finally, at the macroscale, the solvent itself could be characterized with a continuous model, based on finite elements.

Each phenomenon depends upon the phenomenon at the scale above and the scale below, which means that there is two-way coupling between mathematically different models at different scales.
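The proper orthogonal decomposition (principal component analysis) idea described above can be made concrete in a few lines. The two-degree-of-freedom "system" and its snapshot data below are invented purely for illustration; real POD applies the same recipe to much larger state vectors:

```python
import math, random

random.seed(0)

# Snapshots of a hypothetical two-degree-of-freedom system whose dynamics
# are essentially one-dimensional: state ~ c * (3, 1) plus small noise.
snapshots = []
for _ in range(2000):
    c = random.gauss(0.0, 1.0)
    snapshots.append((3.0 * c + random.gauss(0.0, 0.05),
                      1.0 * c + random.gauss(0.0, 0.05)))

# 2x2 covariance matrix of the snapshot ensemble
n = len(snapshots)
cxx = sum(x * x for x, _ in snapshots) / n
cxy = sum(x * y for x, y in snapshots) / n
cyy = sum(y * y for _, y in snapshots) / n

# Power iteration: the dominant eigenvector of the covariance matrix is
# the leading POD mode; its span captures almost all of the variance.
vx, vy = 1.0, 0.0
for _ in range(100):
    wx = cxx * vx + cxy * vy
    wy = cxy * vx + cyy * vy
    norm = math.hypot(wx, wy)
    vx, vy = wx / norm, wy / norm

print(vx, vy)  # close to (3, 1)/sqrt(10) ~ (0.949, 0.316)
```

Once the mode (vx, vy) is known, each snapshot is well represented by a single coefficient, its projection onto the mode, which is the "reduced basis" the text refers to.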

The Holy Grail of nanoscale modeling is a seamless bridging of scales with tight bidirectional coupling between every scale-adjacent pair of a sequence of models, from nano through micro and meso to macro. This bridging necessarily must involve scale-appropriate algorithmics at each scale, with well-understood error dependence on the computational resolution in each model, and with error control for information transfer between scales. For optimality, algorithms should be prudent in their use of fine-scale information—using the coarsest scale suitable for obtaining the requisite error in each result.
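The rule of thumb about implicit time integration quoted earlier in this section can be seen in the simplest possible setting. For the model equation y' = -λy, with λ playing the role of a fast scale, forward Euler is unstable whenever hλ > 2, while backward Euler damps the fast scale at any step size chosen for the slow dynamics. This is a generic numerical illustration, not an example from the workshop:

```python
# Model problem y' = -lam * y; lam is a fast scale, h is sized for the
# slow dynamics of interest (so h * lam >> 1).

def explicit_euler(lam, y0, h, n):
    # forward Euler: y_{k+1} = (1 - h*lam) * y_k; unstable when h*lam > 2
    y = y0
    for _ in range(n):
        y = (1.0 - h * lam) * y
    return y

def implicit_euler(lam, y0, h, n):
    # backward Euler: y_{k+1} = y_k / (1 + h*lam); stable for any h > 0
    y = y0
    for _ in range(n):
        y = y / (1.0 + h * lam)
    return y

lam, h = 1.0e4, 1.0e-2      # scales separated by many orders of magnitude
print(abs(explicit_euler(lam, 1.0, h, 50)))   # explodes: |1 - h*lam| = 99
print(abs(implicit_euler(lam, 1.0, h, 50)))   # decays toward zero
```

The implicit step costs a solve (here trivial, in PDEs a linear system), which is exactly the trade-off the rule of thumb weighs.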

B. Fast Algorithms

Due to the inherent complexity of nanostructures, theoretical models will generally need to be instantiated as computational simulations. Many such simulations will involve intensive computation, requiring sophisticated algorithms and software running on state-of-the-art parallel computers. Algorithmic advances can be a critical enabling component for progress in nanoscience modeling. New and more efficient algorithms advance scientific frontiers in several ways:

• Larger simulations can be performed. For example, more atoms or basis functions can be included in a model, increasing fidelity and accuracy. Or dynamical models of the same size can be run for longer physical times. Enabling larger simulations is particularly important in the nanosciences to allow methods at one time or length scale to extend into the range of applicability of other, more coarse-grained methodologies in order to validate or pass information to the coarser-grained methodologies.

• Physical models with more detailed science can be simulated, increasing accuracy and reliability and hence confidence that simulation results represent reality.

• With faster kernel computations, new types of analysis become tractable. For instance, fast linear solvers enable the solution of nonlinear systems. Efficient modeling of forward problems allows for the solution of inverse problems. Hence, faster kernel computations enable a sufficient number of simulations to be performed in order to improve the statistical sampling.

The potential benefits from new capabilities in modeling in the nanosciences require continuous advances in the key algorithms that underlie nanoscale computations. This progress may come about in any number of ways. Many of the key algorithms (fast transforms, eigensolvers, etc.) have applications in other settings, but their impact in the nanosciences alone could justify continued development. Entirely new approaches for some of these problems are difficult to anticipate but could have tremendous impact on not only the feasibility of the computations, but also what will be learned from the computations. For example, wavelets or other new bases might be able to reduce the complexity of some nanoscience calculations and may expose features of the science in new ways. An alternative approach for making progress is to specialize a general algorithm to the precise needs of an application. This approach is the most common way to advance the application of algorithms and often allows for dramatic improvements in performance. Recent progress in linear scaling methods for electronic structure provides one such example.
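One of the bullets above observes that fast linear solvers enable nonlinear solves. Newton's method is the canonical mechanism: each nonlinear iteration reduces to one linear solve with the Jacobian. The 2x2 system below is a made-up example of the pattern, not a problem from the report:

```python
# Solve F(x, y) = 0 by Newton's method; each step solves J * d = -F.

def F(x, y):
    return (x * x + y * y - 4.0,   # circle of radius 2
            x - y)                 # line y = x

def J(x, y):                       # Jacobian of F
    return ((2.0 * x, 2.0 * y),
            (1.0, -1.0))

x, y = 1.0, 0.5
for _ in range(20):
    f1, f2 = F(x, y)
    (a, b), (c, d) = J(x, y)
    det = a * d - b * c
    dx = (-f1 * d + f2 * b) / det   # solve the 2x2 system by Cramer's rule
    dy = (-a * f2 + c * f1) / det
    x, y = x + dx, y + dy

print(x, y)  # converges to (sqrt(2), sqrt(2))
```

In large-scale settings the Cramer's-rule solve is replaced by an iterative linear solver, which is where the "faster kernel" gains compound.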

Developments in electronic structure calculations provide a particularly telling illustration of the enabling power of algorithmic advances. First-principles DFT methods for this problem traditionally scale as the cube of the system size, N. This rapid growth with system size limits applications of these powerful techniques to systems containing a few hundred atoms on traditional computers. More efficient cubic algorithms and parallel implementations have allowed larger systems to be simulated, but substantial progress is still limited by the N³ complexity. Recent algorithmic progress is removing this barrier. For system sizes of up to a thousand electrons, ab initio plane wave electronic structure codes using fast Fourier transforms now effectively scale as N² log N. Developments using local representations are close to achieving linear scaling for very large systems, at least in some application domains.

A second important example comes from molecular dynamics simulations, where advances in fast multipole methods have reduced the time for computing Coulombic or other long-range forces from N² to N log N or N. Recent innovations in time integration for molecular dynamics are enabling simulations for much longer physical times. Continued developments and advances will be needed to enable the necessary simulations for complex nanoscale dynamics.

These types of advances generate new challenges. The asymptotically more efficient algorithms are generally more complex than their antecedents, and in order to have high fidelity and enough accuracy, some may only be more efficient for very large problems. High-quality implementations are a significant undertaking, and parallel versions will likely require additional algorithmic advances. Modern software practices and open-source parallel libraries can have an enormous impact in easing the development of new functionality and then delivering that new capability to scientists in a timely and efficient manner. More generally, the asymptotic complexity of an algorithm is only one of several aspects relevant to an algorithm’s potential impact in nanoscience. Algorithmic researchers must also consider ease of implementation, parallelizability, memory locality, and robustness.

In an area as diverse as nanoscience, an enormous range of computational techniques and algorithms have a role to play, and the workshop was able to touch on only a subset of them. The following areas were identified as particularly important. Many of them are generally applicable for a wide range of applications; however, they will be particularly important to enabling discoveries and quantitative understanding in the nanosciences.

Linear Algebra in Electronic Structure Calculations

Simulation methods based on ab initio electronic structure will unquestionably be critical in future investigations of the properties of matter at the nanoscale. The capability of such methods to predict structural and electronic properties without prior empirical knowledge or experimental input makes them very attractive. It is therefore an important priority to accelerate the development and the deployment of fundamental algorithms used in large-scale electronic structure calculations.

Electronic structure calculations based on DFT and its extension to first-principles molecular dynamics (FPMD) have traditionally benefited greatly from advances in fast algorithms. An example is the fast Fourier transform, which essentially lies at the root of the efficiency of the plane-wave electronic structure method, and thus of the feasibility of plane-wave-based FPMD.
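The FFT's role here is purely algorithmic: it evaluates exactly the sums that define the discrete Fourier transform, but in O(N log N) rather than O(N²) operations. A generic textbook radix-2 sketch (not code from any electronic structure package) makes the equivalence checkable:

```python
import cmath

def dft(x):                      # direct O(N^2) evaluation of the DFT sums
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N))
            for k in range(N)]

def fft(x):                      # radix-2 Cooley-Tukey, O(N log N), N = 2^m
    N = len(x)
    if N == 1:
        return x[:]
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * N
    for k in range(N // 2):
        t = cmath.exp(-2j * cmath.pi * k / N) * odd[k]
        out[k] = even[k] + t
        out[k + N // 2] = even[k] - t
    return out

x = [complex(n % 5, 0) for n in range(64)]
err = max(abs(a - b) for a, b in zip(dft(x), fft(x)))
print(err < 1e-8)  # identical results at asymptotically lower cost
```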

Depending on the numerical approach taken, the solution of the Kohn-Sham equations is either cast as an eigenvalue problem—in which case the non-linearity due to the Coulomb and exchange-correlation interactions is included as an outer loop—or as a constrained optimization problem—in which case the non-linear Kohn-Sham energy functional is directly minimized with respect to all degrees of freedom describing the electronic wavefunctions, with the constraint that one-particle wavefunctions must remain orthogonal.

Thus the development of efficient eigenvalue solvers on one hand, and of constrained optimization algorithms on the other, directly impacts the efficiency of electronic structure codes. Optimization is discussed elsewhere in this report, but the need for fast eigensolvers is pervasive. Some applications require the k lowest eigenpairs where k is large—perhaps in the thousands. But there is also a need for codes that compute “interior” eigenvalues (eigenvalues inside the spectrum), typically near the Fermi level. Also, in time-dependent DFT (TDDFT), a large dense matrix is sometimes diagonalized (in which case all eigenvalues and eigenvectors are computed).

Electronic structure calculations will also likely benefit from all advances that enhance our capability to solve PDEs numerically. In particular, the development of grid-based methods, adaptive mesh refinement approaches, grid partitioning methods, non-linear multigrid solvers, domain decomposition methods, etc. may play an important role in future scalable Kohn-Sham solvers. Specifically, the version of TDDFT that uses ordinary differential equation (ODE) schemes requires the solution of hundreds or thousands (one for each ground state) of systems of ODEs (actually the time-dependent Schrödinger equation) which are coupled by the potential. Potentially enormous gains are possible if specialized methods are developed for solving such systems.

A problem shared by all electronic structure methods is that of efficiently solving the Poisson equation. Algorithms for solving this problem are widespread and well developed, although further progress in the complex geometries relevant to nanoscience applications would be welcome. The community of theorists/modelers involved in nanoscience would benefit from better (and publicly available) software for this problem.

Linear scaling methods and related techniques avoid the eigenvalue problem altogether. Although this allows for a great reduction in asymptotic complexity, current methods can lack accuracy and/or be narrowly applicable. A wealth of algorithms has been proposed and explored, and the opportunities for fruitful collaborative interactions are abundant.

Monte Carlo Techniques

Monte Carlo methods are a class of stochastic optimization techniques that are particularly faithful to the transition probabilities inherent in statistical mechanics or quantum physics. This faithfulness to the underlying physics allows them to be applied to many problems, not just those commonly considered as optimization problems. Improvements in the applicability and (especially) the efficiency of Monte Carlo techniques could have a big impact on several aspects of nanoscience modeling.

Quantum Monte Carlo is currently the most accurate method for electronic structure that can be extended to systems in the nanoscience range. It serves as a benchmark and a source of insight. It is capable of highly accurate numerical solution of the Schrödinger equation. This progress has required resolution of the “fermion sign problem,” a challenge to computational mathematics. That resolution has recently been made in principle, but substantially more development is needed to make it practical.
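The Metropolis-Hastings dynamics underlying the Monte Carlo techniques discussed in this section can be reduced to a few lines. The harmonic "energy" below is a stand-in chosen so the answer is known: with E(x) = x²/2 at kT = 1, the sampler should reproduce the exact Boltzmann average ⟨x²⟩ = 1. This is a minimal sketch, not a production sampler:

```python
import random, math
random.seed(2)

# Metropolis sampling of the Boltzmann distribution exp(-E(x)/kT), kT = 1.
def energy(x):
    return 0.5 * x * x

x, samples = 0.0, []
for step in range(200000):
    trial = x + random.uniform(-1.0, 1.0)     # symmetric trial move
    dE = energy(trial) - energy(x)
    if dE <= 0 or random.random() < math.exp(-dE):
        x = trial                              # accept the move
    if step > 10000:                           # discard burn-in
        samples.append(x)

mean_x2 = sum(s * s for s in samples) / len(samples)
print(mean_x2)  # close to the exact value 1.0
```

The "theory of optimal moves" question raised below amounts to choosing the trial-move distribution so that such chains decorrelate as fast as possible.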

Kinetic Monte Carlo (KMC) methods are an important class of techniques that treat a range of atomistic and aggregate particle phenomena above that of electronic structure. Essentially they use Metropolis-Hastings stochastic dynamics, ascribing physical time intervals to the moves. A fundamental question that needs mathematical attention is the relationship of the time scale to the form of the moves used. If this can be solved, then KMC will be both more reliable and much more widely applicable.

Additional important technical issues connected with KMC include the need for efficient, scalably parallel approaches, multiscale methods, and efficient quantitative methods for estimating the probabilities and rates of rare events. In many of the applications, the a priori probability of a rate-controlling event is very small; appropriate importance sampling or other techniques need to be developed to make such studies practical.

Monte Carlo methods are also widely used in statistical mechanical studies of nanosystems. All of these variants of Monte Carlo can be made more powerful and useful by incorporating developments in computational geometry and by a theory of optimal moves—that is, by mathematical investigation of the dependence of the spectrum of the Markov transition matrix upon the form of the trial moves in Metropolis-Hastings dynamics. In addition, since many nanosystems are driven more by entropy than by energy, methods that are entropy or free-energy friendly are badly needed: current methodologies for free energy are usually rather awkward.

Equally important is the promise from a recent development in an O(N) Monte Carlo method, which merits substantial additional development. Other deep technical problems include the need for developments in perturbation methods and techniques for treating relativistic effects.

Finally, continued advances and development in embedding, that is, in using a hierarchy of methods of varying accuracy and computational complexity to treat larger systems, are urgently needed.

Data Exploration and Visualization

Large simulations can generate copious quantities of data. A physical scientist is often interested in exploring these large datasets for interesting features. This process is difficult to automate, since often the most important features are unexpected and they tend to be very application specific. Such explorations are valuable because they generally reveal deficiencies in the simulation itself or unanticipated and interesting aspects of the physics. This kind of data-driven exploration requires rich and easy-to-use environments for visualization and data interactivity. Good tools will hide the complexity of the data management from the user and will respond in real time to user interactivity, even though the data analysis may be occurring on a distant parallel machine.

Although there is considerable research activity in this general area, there will be significant benefit from customization to effectively impact the research of specific nanoscientists. At the other extreme, strategies for deriving maximal information from a limited data environment are needed, since experiments may be difficult or expensive to conduct. This will be critical, for example, in getting a handle on management of uncertainty. In both these contexts, strategies that include experiments for the purpose of supporting the theoretical objective are needed, in contrast, say, to modeling for the purpose of explaining experiments.

Computational Geometry

Considerable geometric complexity is an inherent aspect of many objects from the atomistic to the nanoscale. Examples include the potential energy isosurfaces of a polymer and the structure of a self-assembled nanocomposite. Efficient methodologies for representing and manipulating such geometries can have a significant impact on the overall performance of some computations. Computational geometry is a mature area of algorithmic computer science, but the specific geometries arising in nanoscience would benefit from specialized algorithms.
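As one concrete computational-geometry primitive of the kind such specialized algorithms would build upon, here is the standard monotone-chain convex hull. It is a generic textbook routine, not code from the report, and the test points are invented:

```python
# Monotone-chain convex hull: O(n log n), returns hull vertices in
# counterclockwise order.
def convex_hull(points):
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):  # z-component of (a - o) x (b - o)
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower = []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    upper = []
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# A square with two interior points: only the four corners survive.
square = [(0, 0), (2, 0), (2, 2), (0, 2), (1, 1), (1, 0.5)]
print(sorted(convex_hull(square)))
```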

C. Optimization and Predictability

In virtually every branch of science, there is a need for calculations to estimate the parameters of a system, to find the extremum of an objective function such as the minimum energy or maximum entropy states of a system, and to design and control systems. Since mathematical models are often used to guide experimentation and to predict or adapt to behaviors, there is an equally strong need to understand the errors in these models and to be able to make mathematically sound and precise statements regarding the accuracy and precision of the results.

Since all of these needs pervade science, it is natural to ask in what way nanoscience is special. Why, for example, cannot existing techniques be used for the models in nanoscience? The short answer to this question is implicit in the earlier sections of this report: the degree of detail and dynamic complexity, including wide-ranging, non-separable scales in both space and time, make the formulation and solution of problems in this area fundamentally more challenging than most, and arguably all, problems successfully tackled to date. The payoff to developing a new set of computational models and tools would be better understanding, prediction, design, optimization, and control of complex nanosystems, with confidence in the reliability of the results. Successes in this field would almost certainly have impact on other fields that share the complexity of a large range of complex behavior on non-separable temporal and spatial scales.

Optimization

Many of the problems mentioned above rely on formulating and solving optimization problems. For example, in order to gain a quantitative understanding of nanosystems at a fundamental level, it is essential to have the capability to calculate the ground state of the system. This can be formulated as a minimum energy problem where the solution gives the configuration of the particles of the system at the lowest energy, given an energy functional. Complicated nanosystems of interest can have millions to billions of particles, resulting in huge optimization problems characterized by an equally huge number of local minima with energy levels close to the ground state. There is no hope of solving this problem with brute-force methods. To make any headway on such problems requires careful formulation of the objective function and the constraints. For example, progress in the protein-folding problem has been achieved by understanding that certain sequences of amino acids in the unfolded molecule almost always end up in a standard form, e.g., an alpha helix. By inducing the optimization path towards this conformation, better results have been attained. Constraints, based on additional knowledge, can also be formulated to further reduce the search space.
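The difficulty described above, many local minima close in energy to the ground state, is why stochastic searches that can accept uphill moves are used. A deliberately tiny illustration is simulated annealing on an invented one-dimensional landscape (not a real energy functional): the tilted double well E(x) = (x² - 1)² + 0.3x has a local minimum near x = +1 that traps naive descent, and a global minimum near x = -1:

```python
import random, math
random.seed(3)

def E(x):
    return (x * x - 1.0) ** 2 + 0.3 * x   # global minimum near x = -1

x = 1.0                      # start in the basin of the *local* minimum
best_x, best_E = x, E(x)
T = 2.0                      # initial "temperature"
while T > 1e-3:
    trial = x + random.gauss(0.0, 0.3)
    dE = E(trial) - E(x)
    if dE <= 0 or random.random() < math.exp(-dE / T):
        x = trial            # accept, including occasional uphill moves
    if E(x) < best_E:
        best_x, best_E = x, E(x)
    T *= 0.999               # slow cooling
print(best_x)  # lands in the global basin near x = -1
```

Exploiting physical insight, as the next paragraphs discuss, amounts to biasing the trial moves and the starting configuration so that far less of this blind search is needed.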

Optimization methods that exploit physical insight could have a dramatic impact on efficiency. Monte Carlo methods are discussed elsewhere in this report, but other stochastic or deterministic optimization methods can also be accelerated by domain knowledge. For example, when trying to simulate a novel material, knowledge about the structural properties and energetics of related materials could be used to preferentially steer the computation towards promising regions of the state space. Such methods are likely to be much more efficient than general optimization strategies, but quite limited in their applicability.

Self-assembly is a central feature of nanoscience. Understanding, predicting, and controlling this process is crucial for the design and manufacture of viable nanosystems. Clearly the subsystems involved in this process are assembling themselves according to some minimum energy principle. Once an understanding of the underlying physics is attained, optimization problems can be formulated to predict the final configurations. Since these systems are also huge and likely to have many local minima, a careful development of the models, the constraints, and the algorithms will also be required here.

Given the necessity to create models that incorporate many scales, it has been suggested that a hierarchy of models be created to span this range, i.e., a connected set of models that capture the quantum effects through the micromorphic effects. In this case, the parameters at a given level should be attainable from the next lowest level using parameter-determining techniques. Optimization formulations for such parameter estimation problems are ubiquitous, but constructing the correct error models and the associated optimization problem is not obvious. Simply using least squares may not be appropriate; more general maximum-likelihood estimators will likely be required.

As alluded to above, the design and control of nanosystems is vital to the eventual manufacture of products. The problem here is to optimize the design to achieve a desired characteristic or to control the process so that a given state is reached. In these cases, the constraints of the optimization problem contain the hierarchy of models describing these systems. In particular, the constraints contain coupled systems of partial differential equations (PDEs). There is considerable interest in such problems in other fields, e.g., aerodynamics, and some efficient algorithms have been created, but the number of levels involved in nanoscience is far higher than in other fields.

Progress has been made by understanding that these coupled systems of PDEs do not have to be satisfied at each step of the optimization process, but only need to be satisfied in the limit. This has resulted in an optimized solution in a modest multiple of the time required for one nonlinear solve. Such approaches, however, require that the formulation of the optimization problem incorporate some or all of the state variables in addition to the control variables. Effective optimization methods in these cases must have more control over the PDE software than is typically the case; e.g., gradients and/or adjoints must be computed along with other quantities. Thus, the formulation of the models, the algorithms, and the software with the ultimate goal of design and optimization is crucial.

One can easily envision optimization problems arising in many other aspects of nanoscience. For example, methods for shape optimization for determining the optimal geometry of both tools and products, non-smooth optimization due to derivative discontinuities between the models at adjoining levels, and discrete optimization for particles restricted to discrete locations will be needed. All of the developments in these areas, as with those mentioned above,

should be done collaboratively between optimizers and nanoscientists.

Predictability

Although part of the predictability problem can be addressed using optimization techniques (e.g., maximum likelihood), other statistical methodologies will be necessary to develop a final statement of confidence in the answers. First, however, it is important to recognize all of the possible sources of errors in the process. This starts with the recognition that none of our models is exact. And, since most of the models are based on PDEs, these equations cannot be solved exactly. There will be a combination of discretization errors in the formation of the algebraic systems that the computer sees and roundoff errors in the actual computation. There are errors associated with the physical models and errors related to parameters and initial conditions that define these models. Some of these errors are observational, related to field or laboratory data, and some are numerical or modeling errors, related to simulation data. These two types of errors are actually of a similar form, as laboratory errors are often the result of a simulation needed to interpret the experiment. Furthermore, errors at one scale will be propagated to other scales in ways that will be difficult to track and manage.

It is important to develop methods for determining comprehensive and inclusive bounds on the final errors. Such methods include sampling technologies for developing sensitivities to parameters, analytic techniques for propagation of errors, and probability models for numerical or modeling errors in simulations. The predictions of science are not a simple forward map of an input to an output in a simulation. Parameters and input values needed in the forward simulation are determined as an inverse problem, based on measurements of outputs. Thus propagation of errors and uncertainty is mapped both ways: from input to output in direct simulation, and from output to input in parameter determination. Statistical inference is the tool that connects the two. This inference can be conveniently formulated in a Bayesian framework, so that the inverse problem acquires a probabilistic expression, and disparate data collected from different sources can then be used to reduce uncertainty and to aid in the predictions. The ultimate goal of such an approach is the determination of error bars or confidence intervals associated with a prediction, based on all sources (data and simulation) on which the prediction is based. A necessary intermediate goal is the association of error bars or other measures of quantified uncertainty with simulations. In analogy to the error bars that indicate the precision of an experimental measurement, error bars should be used to quantify the precision and accuracy of a simulation.

Software

Elsewhere in this report, the issue of software has been raised. The need for well-designed, object-oriented, open-source codes that can be used on a variety of platforms is essential to the overall success of the nanoscale initiative. There are simply more modules necessary for simulation and analysis than can be developed in one place. Modular optimization and statistical algorithms and software need to be developed in such a framework to provide tools for scientific understanding, leading to advanced engineering and analysis tools with guaranteed accuracy for design and manufacture.
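The Bayesian program described above can be miniaturized: a single hypothetical parameter a is inferred from noisy output measurements, and the width of the posterior supplies the error bar on the prediction. Everything here (the model y = a + noise and all the numbers) is invented purely for illustration:

```python
import random, math
random.seed(4)

# Synthetic "measurements of outputs": y_i = a_true + Gaussian noise.
a_true, sigma = 2.5, 0.5
data = [a_true + random.gauss(0.0, sigma) for _ in range(40)]

# Flat prior on a grid; posterior density ~ exp(-sum (y - a)^2 / (2 sigma^2)).
grid = [0.001 * k for k in range(5000)]            # a in [0, 5)
logpost = [-sum((y - a) ** 2 for y in data) / (2 * sigma ** 2) for a in grid]
m = max(logpost)
w = [math.exp(lp - m) for lp in logpost]           # unnormalized weights
Z = sum(w)
mean = sum(a * wi for a, wi in zip(grid, w)) / Z
var = sum((a - mean) ** 2 * wi for a, wi in zip(grid, w)) / Z
print(mean, math.sqrt(var))  # posterior std ~ sigma / sqrt(40) ~ 0.079
```

The posterior standard deviation is exactly the "error bar associated with a prediction" the text calls for; with more error sources, the same framework propagates them jointly.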


Appendix A: Workshop Agenda

Friday, May 10, 2002

8:00 a.m.  Walt Stevens, DOE, Basic Energy Sciences, and Walt Polansky, DOE, Advanced Scientific Computing Research: “The DOE Perspective on Theory and Modeling in the Nanoscience Initiative”
8:30  C. William McCurdy, LBNL: Goals and Purpose of This Workshop

Session Chair: Bill McCurdy

9:00  Stanley Williams, Hewlett-Packard Laboratories: “Big Theory as the Engine of Invention for Nanotechnology: Losing the Born-Oppenheimer Approximation”
9:45  Uzi Landman, Georgia Institute of Technology: “Small Is Different: Computational Microscopy of Nanosystems”
10:30  Break

Session Chair: Ellen Stechel

11:00  Steven Louie, University of California, Berkeley: “Theory and Computation of Electronic, Transport and Optical Properties on the Nanoscale”
11:45  Dion Vlachos, University of Delaware: “Bridging Length and Time Scales in Materials Modeling”
12:30 p.m.  Lunch – Working, Catered in Conference Room

Session Chair: David Keyes

1:45  Phil Colella, LBNL: “Computational Mathematics and Computational Science: Challenges, Successes, and Opportunities”
2:30  Jerry Bernholc, North Carolina State University: “Quantum Mechanics on the Nanoscale: from Electronic Structure to Virtual Materials”
3:15  Break

Session Chair: Peter Cummings

3:45  Sharon Glotzer, University of Michigan: “Computational Nanoscience and Soft Matter”
4:30  Alex Zunger, National Renewable Energy Laboratory: “Progress and Challenges in Theoretical Understanding of Semiconductor Quantum Dots”
5:15  Adjourn
7:00  Conference Dinner – Dinner Speaker: Bernd Hamann, University of California, Davis: “Massive Scientific Data Sets: Issues and Approaches Concerning Their Representation and Exploration”

Saturday, May 11, 2002

8:30 a.m.  Panel: The Role of Applied Mathematics and Computer Science in the Nanoscience Initiative
Panelists: Paul Boggs, SNL/Livermore; Jim Glimm, SUNY Stony Brook; Malvin Kalos, LLNL; George Papanicolaou, Stanford University; Amos Ron, University of Wisconsin-Madison; Yousef Saad, University of Minnesota
Panel Moderator: Paul Messina, ANL

Breakout Sessions and Chairs

10:30  Breakout Sessions:
• Well Characterized Nano Building Blocks (Chair: James Chelikowsky, University of Minnesota)
• Complex Nanostructures and Interfaces (Chair: Sharon Glotzer, University of Michigan)
• Dynamics, Assembly and Growth of Nanostructures (Chair: Peter Cummings, University of Tennessee)
12:00 p.m.  Lunch – Working, Catered in Conference Room
1:00  Reports from Breakout Sessions

Breakout Sessions and Chairs

1:30  Breakout Sessions:
• Crossing Time and Length Scales (Chair: George Papanicolaou, Stanford University)
• Fast Algorithms (Chair: Malvin Kalos, LLNL)
• Optimization and Predictability (Chair: Paul Boggs, SNL/Livermore)
3:00  Reports from Breakout Sessions
3:15  Closing Discussion of the Report of the Workshop
4:30  Workshop Adjourns

Appendix B: Workshop Participants

Jerry Bernholc
Department of Physics
406A Cox Hall
North Carolina State University
Raleigh, NC 27695-8202
919-515-3126
[email protected]

Daryl Chrzan
University of California, Berkeley
Computational Materials Science
465 Evans Hall
Berkeley, CA 94720
510-643-1624
[email protected]

John Board
Electrical and Computer Engineering
178 Engineering Building
Box 90291
Durham, North Carolina 27706
919-660-5272
[email protected]

Paul Boggs
Computational Sciences and Mathematics Research Department
Sandia National Laboratories, MS 9217
Livermore, CA 94551
925-294-4630
[email protected]

Russel Caflisch
University of California, Los Angeles
Math, Box 7619D MSD, 155505
Los Angeles, CA 90095-1438
310-206-0200
[email protected]

James Chelikowsky
University of Minnesota
Chemical Engineering and Material Science
Amundson Hall, Room 283
421 Washington Avenue SE
Minneapolis, MN 55455
612-625-4837
[email protected]

Phil Colella
Lawrence Berkeley National Laboratory
One Cyclotron Road, MS-50A-1148
Berkeley, CA 94720
510-486-5412
[email protected]

Larry Curtiss
Argonne National Laboratory
9700 S. Cass Avenue
Argonne, IL 60439
630-252-7380
[email protected]

James Davenport
Brookhaven National Laboratory
P.O. Box 5000, Building 463B
Upton, NY 11973-5000
631-344-3789
[email protected]

Michel Dupuis
Battelle, Pacific Northwest National Laboratory
P.O. Box 999/K8-91
Richland, WA 99352
[email protected]

Herbert Edelsbrunner
Duke University
Computer Sciences Department
Box 90129
Durham, NC 27708
919-660-6545
[email protected]

Anter El-azab
Battelle, Pacific Northwest National Laboratory
P.O. Box 999/K8-91
Richland, WA 99352
[email protected]

Stephen Foiles
Sandia National Laboratories
P.O. Box 5800, Mail Stop 1411
Albuquerque, NM 87185-0885
505-844-7064
[email protected]

Giulia Galli
Lawrence Livermore National Laboratory
7000 East Ave.
P.O. Box 808, MS-L415
Livermore, CA 94550
925-423-4223
[email protected]

Jim Glimm
Dept. of Applied Mathematics and Statistics
P-138A Math Tower
State University of New York at Stony Brook
Stony Brook, NY 11794-3600
631-632-8355
[email protected]

Sharon Glotzer
Departments of Chemical Engineering, Materials Science and Engineering, Macromolecular Science and Engineering
University of Michigan
Room 3014, H. H. Dow Building
2300 Hayward Street
Ann Arbor, MI 48109-2136
734-615-6296
[email protected]

Francois Gygi
Lawrence Livermore National Laboratory
7000 East Avenue
P.O. Box 808
Livermore, CA 94550
925-422-6332
[email protected]

Bernd Hamann
Department of Computer Science
3035 Engineering II Building
University of California, Davis
1 Shields Avenue
Davis, CA 95616-8562
530-754-9157
[email protected]

Steve Hammond
National Renewable Energy Laboratory
1617 Cole Blvd., SERF/C5300
Golden, CO 80401-3393
303-275-4121
[email protected]

Bruce Harmon
Ames Laboratory, USDOE
Iowa State University
311 TASF
Ames, Iowa 50011
515-294-5772
[email protected]

Grant Heffelfinger
Sandia National Laboratories
1515 Eubank SE
Albuquerque, NM 87123
505-845-7801
[email protected]

John Hules
Lawrence Berkeley National Laboratory
One Cyclotron Road, MS-50C
Berkeley, CA 94720
510-486-6008
[email protected]

Malvin Kalos
Lawrence Livermore National Laboratory
7000 East Ave.
P.O. Box 808
Livermore, CA 94550
925-424-3079
[email protected]

John Kieffer
Materials Science and Engineering
University of Michigan
Room 2122, H. H. Dow Building
2300 Hayward Street
Ann Arbor, MI 48109-2136
734-763-2595
[email protected]

Dale Koelling
U.S. Department of Energy
Materials Science and Engineering
19901 Germantown Road, SC-13
Germantown, MD 20874-1290
301-903-2187
[email protected]

Joel Kress
Los Alamos National Laboratory
P.O. Box 1663, MS B268
Los Alamos, New Mexico 87545
505-667-8906
[email protected]

Uzi Landman
Physics 0430
Georgia Institute of Technology
Atlanta, GA 30332
404-894-3368
[email protected]

William A. Lester, Jr.
Department of Chemistry
University of California, Berkeley
Berkeley, CA 94720-1460
510-643-9590
[email protected]

Juan Meza
Lawrence Berkeley National Laboratory
One Cyclotron Road, MS 50B2239
Berkeley, CA 94720
510-486-7684
[email protected]

Raul Miranda
U.S. Department of Energy
Basic Energy Sciences
19901 Germantown Road, SC-142
Germantown, MD 20874-1290
301-903-8014
[email protected]

Fred O’Hara
P.O. Box 4273
Oak Ridge, TN 37831
865-482-1447
[email protected]

George Papanicolaou
Mathematics Department
Building 380, 383V
Stanford University
Stanford, CA 94305
650-723-2081
[email protected]

Walt Polansky
U.S. Department of Energy
Mathematical, Information, and Computational Sciences
19901 Germantown Road, SC-32
Germantown, MD 20874-1290
301-903-5800
[email protected]

Steven Louie
University of California, Berkeley
Physics, 545 Birge
Berkeley, CA 94720
510-642-1709
[email protected]

Art Ramirez
Los Alamos National Laboratory
P.O. Box 1663, MS K764
Los Alamos, New Mexico 87545
505-665-5990
[email protected]

Paul Messina
Argonne National Laboratory
9700 S. Cass Avenue
Argonne, IL 60439
630-252-6257
[email protected]

Chuck Romine
U.S. Department of Energy
Mathematical, Information, and Computational Sciences
19901 Germantown Road, SC-31
Germantown, MD 20874-1290
301-903-5152
[email protected]

Amos Ron
Department of Computer Sciences
University of Wisconsin-Madison
1210 West Dayton Street
Madison, Wisconsin 53706-1685
608-262-6621
[email protected]

Yousef Saad
Dept. of Computer Science and Engineering
University of Minnesota
4-192 EE/CSci Building
200 Union Street S.E.
Minneapolis, MN 55455
612-624-7804
[email protected]

Tamar Schlick
Courant Institute
New York University
509 Warren Weaver Hall
New York, NY 10012-1185
212-998-3116
[email protected]

Thomas Schulthess
Oak Ridge National Laboratory
P.O. Box 2008, MS 6114
Oak Ridge, TN 37831-6114
865-574-4344
[email protected]

Walt Stevens
U.S. Department of Energy
Basic Energy Sciences
19901 Germantown Road, SC-14
Germantown, MD 20874-1290
301-903-2046
[email protected]

Malcolm Stocks
Oak Ridge National Laboratory
P.O. Box 2008, MS 6114
Oak Ridge, TN 37831-6114
865-574-5163
[email protected]

Anna Tsao
President, AlgoTek, Inc.
3601 Wilson Boulevard, Suite 500
Arlington, VA 22201
301-249-3383
[email protected]

Dion Vlachos
Department of Chemical Engineering and Center for Catalytic Science and Technology
University of Delaware
Newark, DE 19716-3110
302-831-2830
[email protected]

Lin-Wang Wang
Lawrence Berkeley National Laboratory
One Cyclotron Road, MS-50F
Berkeley, CA 94720
510-486-5571
[email protected]

Stanley Williams
HP Fellow and Director, Quantum Science Research
Hewlett-Packard Laboratories
1501 Page Mill Road, MS 1123
Palo Alto, CA 94304
650-857-6586
[email protected]

Andrew Williamson
Lawrence Livermore National Laboratory
7000 East Avenue
P.O. Box 808
Livermore, CA 94550
925-422-8285
[email protected]

Hua-Gen Yu
Brookhaven National Laboratory
P.O. Box 5000, Building 463B
Upton, N.Y. 11973-5000
631-344-4367
[email protected]

Alex Zunger
National Renewable Energy Laboratory
1617 Cole Blvd., SERF/C109
Golden, CO 80401-3393
303-384-6672
[email protected]

Organizers

Peter Cummings
Department of Chemical Engineering
University of Tennessee
419 Dougherty Hall
Knoxville, TN 37996-2200
865-974-0227
[email protected]

Bruce Hendrickson
Parallel Computing Sciences Department
Sandia National Laboratories
Albuquerque, NM 87185-1111
505-845-7599
[email protected]

David Keyes
Department of Mathematics and Statistics
Old Dominion University
500 Batten Arts & Letters
Norfolk, VA 23529-0077
757-683-3882
[email protected]

C. William McCurdy
Lawrence Berkeley National Laboratory
One Cyclotron Road, MS-50B-4230
Berkeley, CA 94720
510-486-4283
[email protected]

Ellen Stechel
Ford Motor Co.
Powertrain Operations and Engine Engineering
21500 Oakwood Boulevard, MD#3
Dearborn, MI 48121-2053
313-248-5635
[email protected]

DISCLAIMER

This document was prepared as an account of work sponsored by the United States Government. While this document is believed to contain correct information, neither the United States Government nor any agency thereof, nor The Regents of the University of California, nor any of their employees, makes any warranty, express or implied, or assumes any legal responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by its trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or any agency thereof, or The Regents of the University of California. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States Government or any agency thereof or The Regents of the University of California.
