Turbulence simulations: multiscale modeling and data-intensive computing methodologies

by Jason Graham

A dissertation submitted to The Johns Hopkins University in conformity with the requirements for the degree of Doctor of Philosophy.

Baltimore, Maryland
January, 2014

© Jason Graham 2014. All rights reserved.

Abstract

In this two-part work, methodologies for the multiscale modeling of complex turbulent flows and data-intensive computing strategies for large-scale turbulent simulations are developed and presented.

The first part of this thesis is devoted to the simulation of turbulent flows over objects characterized by hierarchies of length scales. Flows of this type present special challenges associated with the cost of resolving small-scale geometric elements. During large eddy simulation (LES), their effects on the resolved scales must be captured realistically through subgrid-scale models. Prior work by Chester et al. [21] proposed a technique called renormalized numerical simulation (RNS), which is applicable to objects that display scale-invariant geometric (fractal) properties. The idea of RNS is similar to that of the dynamic model used in LES to determine model parameters for the subgrid-stress tensor model in the bulk of the flow. In RNS, drag forces obtained from the resolved elements during the simulation are rescaled appropriately, by determining drag coefficients, and then applied to specify the drag forces associated with the subgrid-scale elements. In the current work we introduce a generalized framework for describing and implementing the RNS methodology, thereby extending the methodology first presented by Chester et al. [21]. Furthermore, we present several other practical implementations of RNS that differ in important technical aspects related to 1) time averaging, 2) spatial localization, and 3) numerical representation of the drag forces. The new RNS framework is then applied to fractal tree canopies consisting of fractal-like trees with both planar cross-sections and three-dimensional orientations. The results indicate that the proposed time-averaged, local, and explicit formulation of RNS is superior to the predecessor formulation, as it enables the modeling of spatially non-homogeneous geometries without using a low-level, branch-based description and preserves the assumed dynamic similarity through temporal filtering. In addition, the overall predicted drag force of the non-planar fractal trees is shown to agree well with experimental data.

In addition to RNS, a methodology for generating accurate inflow conditions in multiscale turbulence simulations is presented. This technique, called concurrent precursor simulation (CPS), allows the synchronous generation of inflow data from an upstream precursor simulation. The approach is conceptually the same as the standard precursor simulations used in the past (Lund et al. [72]; Ferrante and Elghobashi [35]); however, it eliminates the I/O bottleneck of disk reads and writes by transferring sampled data directly between domains using MPI. Furthermore, issues with the recycling time scales of the sampled inflow library are removed, since the upstream precursor simulation is performed concurrently with the target simulation. This methodology is applied to a single fractal tree (modeled using RNS) in turbulent duct flow and to a finite-length, developing wind farm.
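To make the rescaling idea concrete, the following is a minimal schematic of the RNS closure summarized above, written with generic placeholder symbols (c_d, F_R, A_R, A_sgs, and the filtered velocity are illustrative and not the notation used in the thesis): a drag coefficient is measured dynamically from the resolved branches and reused to prescribe the force on the unresolved ones,

\[
  c_d \;=\; \frac{F_R}{\tfrac{1}{2}\,\rho\,A_R\,\tilde{u}^2},
  \qquad
  F_{sgs} \;\approx\; c_d\,\tfrac{1}{2}\,\rho\,A_{sgs}\,\tilde{u}^2 ,
\]

where F_R is the drag force computed on a resolved branch, A_R and A_sgs are the frontal areas of the resolved and subgrid-scale elements, and the velocity is a locally sampled (and, in the time-averaged formulations, temporally filtered) resolved velocity. The assumed scale invariance of the fractal geometry is what justifies reusing the same drag coefficient across branch generations; the formulations studied in this work differ in how this coefficient is averaged in time, localized in space, and applied numerically as a force.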
In the second part of this work, data-intensive computing strategies addressing the large-scale data problem in direct numerical simulation (DNS) of turbulent flows are presented. DNS provides the highest fidelity of predicted turbulence data. As a result, these data have served a vital role in turbulence research, and access to such data is key to the continued development of the field. Classical approaches to the management and dissemination of these large-scale datasets, however, have proven to be cumbersome and prohibitively expensive in some instances, thus limiting the usefulness of these data to a broad community. Therefore, the Johns Hopkins Turbulence Databases (JHTDB) (Perlman et al. [89]; Li et al. [68]) have been created, which expose large-scale turbulence datasets to the research community worldwide using Web services. The JHTDB project provides Web service libraries for C, Fortran, and Matlab which allow interaction with the DNS data. The design and implementation of the Matlab interface, along with several examples, are presented. Also, the first Web-service-based, publicly available channel flow DNS database is produced in this work. The implementation of the channel flow DNS and the construction of the subsequent database are presented. These data are then used to study the structure and organization of channel flow turbulence. In this study, the Q criterion [50] is employed to measure vortex sizes and organization. Results appear to indicate good qualitative agreement with theoretical predictions with respect to the presence of large-scale near-wall structures and the preponderance of buffer-layer vortices.

Primary Reader: Professor Charles Meneveau
Secondary Readers: Professors Gregory L. Eyink and Randal Burns

Acknowledgments

I first and foremost thank my Lord and Savior Jesus Christ, from whom all blessings proceed, for the opportunity to pursue the endeavors in this work. Without His guidance and care it would not have been possible to complete this journey. An immense thanks goes to my advisor, Professor Charles Meneveau, for his patience, kindness, and encouragement throughout my doctoral studies. His direction and thoughtful conversations are also greatly acknowledged. I thank Professors G. L. Eyink and R. Burns for their many kind and insightful suggestions made throughout this work, for their stimulating lectures, and for fruitful collaborations. To Professors A. Prosperetti, O. Knio, J. Katz, and R. Mittal, I am very grateful for the exceptional courses which they taught and for their endless pursuit of academic excellence. A great thanks goes to Dr. Edward Givelberg, Kalin Kanov, and the entire JHTDB team for exciting and rewarding collaborations. For providing access and support for the PoongBack code, along with fruitful collaborations, a cordial thanks belongs to Professor Robert Moser, Myoungkyu Lee, and Nicholas Malaya of the University of Texas. To my colleagues Claire VerHulst, Adrien Thormann, and Dr. Kunlun Bai, I am very thankful for fun and thought-provoking conversations which have certainly added to the richness of this work. For financial support, I am indebted to the JHU IGERT program on “Modeling Complex Systems” (NSF grant #0801471) and to NSF grant #CMMI-094153. And finally, to my wife Cindy, for her unwavering support and endless patience, without which this work would not have been possible, I am deeply grateful.
Dedication

To my mom, though departed too soon, your inspiration and love live on in this work. And to Noah, who reminds me each day what it means to live.

Contents

Abstract
Acknowledgments
List of Tables
List of Figures

1 Introduction
  1.1 Overview
  1.2 Background and Motivation
    1.2.1 Multiscale Modeling
    1.2.2 Data-intensive Computing
  1.3 Thesis Outline

I Multiscale Modeling

2 Renormalized Numerical Simulation
  2.1 Introduction
  2.2 RNS Framework
  2.3 LES Implementation
  2.4 Planar tree canopy test case
  2.5 RNS formulations
    2.5.1 Model M1
    2.5.2 Model M2
    2.5.3 Model M3
    2.5.4 Model M4
  2.6 Test results
    2.6.1 RNS quantities
    2.6.2 Selected flow statistics
    2.6.3 Temporal averaging time-scale
    2.6.4 Discussion
    2.6.5 Grid and RNS Modeling Sensitivity
  2.7 Applications to canopy consisting of fractal trees with three non-coplanar branches
    2.7.1 Flow Field and RNS Results
    2.7.2 Comparison with Experimentally Determined Drag Coefficient
  2.8 Conclusions

3 Concurrent Precursor Simulation
  3.1 Introduction
  3.2 Implementation
  3.3 Application Cases
    3.3.1 Single fractal tree in turbulent duct flow
    3.3.2 Finite length wind farm
  3.4 Conclusion

II Data-Intensive Computing

4 Johns Hopkins Turbulence Databases
  4.1 Introduction
  4.2 Design and Construction
    4.2.1 Database Cluster
    4.2.2 Web Services
  4.3 Channel Flow Database Interpolation and Differentiation Methods
    4.3.1 Spatial Interpolation
    4.3.2 Spatial Differentiation
  4.4 Matlab Client Interface
    4.4.1 Design and Implementation
    4.4.2 Code Examples
  4.5 Conclusions

5 Channel Flow DNS
  5.1 Introduction
  5.2 Governing Equations
  5.3 Production Simulation
  5.4 Vortex Analysis
    5.4.1 Implementation
    5.4.2 Results
  5.5 Conclusions

6 Concluding Remarks

Appendix A LESGO Validation: Flow over wall mounted cubes

Appendix B MPI-DB
  B.1 Introduction
  B.2 Related Work
  B.3 Channel Flow Simulation
  B.4 The MPI-DB software library
  B.5 MPI-DB Architecture
  B.6 Fortran Interface Design
  B.7 Fortran Interface Example
  B.8 Results
  B.9 Conclusion

Appendix C B-Spline Collocation Method

Appendix D Channel Flow DNS: Pressure Solver
  D.1 Introduction
  D.2 Pressure Solution
    D.2.1 Non-Zero Wavemode Solution
    D.2.2 Zero Wavemode Solution
  D.3 Validation
  D.4 Conclusion

Vita

List of Tables

2.1 Definitions of the tested RNS models.
2.2 Time averaged drag coefficient, RNS error, and forces for each of the RNS models when applied to simulation of the “V-tree” canopy. Results shown pertain to the sample tree in the middle of the domain.