
ADVANCES IN COMPUTATIONAL OCEANOGRAPHY

A New Hybrid Model for Coastal Simulations

BY OLIVER B. FRINGER, JAMES C. MCWILLIAMS, AND ROBERT L. STREET

Published in Oceanography, Volume 19, Number 1 (March 2006), a quarterly journal of The Oceanography Society. Copyright 2006 by The Oceanography Society.


Physical processes in the world's oceans span an enormous range of spatio-temporal scales, from circulation at the global scale down to dissipation and mixing at the centimeter scale. It is impossible to simulate all of these processes using one model due to the sheer computing power that would be required. Thus, different ocean models focus on specific processes and hence different spatio-temporal scales. At the largest scales are the global ocean circulation models, such as the Parallel Ocean Program (POP1), which is used primarily to simulate seasonal or climatic processes (e.g., the El Niño-Southern Oscillation [ENSO]). At the intermediate scales are the regional ocean models, such as the Princeton Ocean Model (POM2), the Regional Ocean Modeling System (ROMS3,4), or the MIT General Circulation Model (MITgcm5), which are well suited to modeling regional processes such as the California Coastal Current. Indeed, these models are not necessarily limited to regional circulation studies; they have been applied extensively to simulate global circulation and, at the smaller scale, coastal processes, such as tides and internal waves. Ocean models that are specifically designed to simulate this end of the spatio-temporal spectrum are referred to as "coastal ocean models," which, like their regional modeling counterparts, are not necessarily limited to, but are best suited for, the coastal region, which includes estuaries. Models that are specifically designed for coastal and estuarine processes include the Stanford Unstructured Nonhydrostatic Terrain-Following Adaptive Navier-Stokes Simulator (SUNTANS6), DELFT3D7, or the semi-implicit TRIM model of Casulli (1999). Coastal and estuarine processes distinguish themselves most significantly from regional and global processes in that they result from the interaction of ocean currents with complex boundaries and steep bathymetry. Models that simulate even smaller-scale processes, such as wind-waves, do not compute the three-dimensional circulation, but instead focus on computing time-averaged effects that can be included in larger-scale coastal circulation models, such as Simulating WAves Nearshore (SWAN8), a two-dimensional wind-wave model.

A hybrid model is generated when two or more models that focus on distinct spatio-temporal scales are coupled to form a simulation tool that can capture a larger range of spatio-temporal scales. With the advent of high-performance computing, the hybrid concept has gained popularity in recent years, combining the expertise behind different model development programs. In this article, a new hybrid simulation tool that is under development is presented. It attempts to tackle the problem of simulating regional circulation using ROMS while correctly incorporating the effects of smaller-scale coastal processes with SUNTANS.

1 http://climate.lanl.gov/Models/POP; 2 http://www.aos.princeton.edu/WWWPUBLIC/htdocs.pom; 3 http://www.atmos.ucla.edu/cesr/ROMS_page.html; 4 http://marine.rutgers.edu/po/index.php?model=roms; 5 http://mitgcm.org; 6 http://suntans.stanford.edu; 7 http://www.wldelft.nl/soft/d3d/intro; 8 http://www.wldelft.nl/soft/swan

THE IMPORTANCE OF THE GRID

All ocean models represent discrete solutions of an approximate form of the Navier-Stokes equations. These equations govern the motion of fluid within the oceans at all scales, ranging from global circulation scales down to mixing and dissipative scales for processes like double-diffusion and turbulence in breaking internal waves. In some regions of the ocean, these processes can be as small as a few centimeters. Because fluid motions do not vary much at such small scales (they are smeared out by viscosity), we could represent, in principle, the full range of scales of motion in the oceans if we were to discretize them with cells that were one cubic centimeter in volume. A discrete solution of the flow field represented by these finite volumes using the Navier-Stokes equations would be a so-called direct numerical simulation (DNS) of all of the world's oceans, and it would eliminate the need for a turbulence model because all scales of motion would be computed. This type of simulation would require a total of 10²² grid cells, which would in turn require time-step sizes of 0.1 sec if we expect to resolve the temporal behavior of motions at such small scales. A year-long simulation of the ocean, then, would require 315 million time steps. Because average simulation codes require roughly 100 mathematical operations per grid cell per time step, a DNS of the ocean would require a total of 3 × 10³⁰ operations. If we had access to the world's fastest supercomputer, Blue Gene, which is located at Lawrence Livermore National Laboratory and is capable of performing roughly 300 trillion operations per second, then our simulation would take 333 million years! This, of course, is at the present rate of computing power, which, according to Moore's law, is doubling roughly every 18 months. Although Moore's prediction has stood the test of time over the past two decades, it is likely that computing power may begin to taper off soon. Assuming that it does not, however, and assuming that the simulation started today and was continuously transferred to improved computer systems as they emerged, the estimate of 333 million years surprisingly reduces down to a much more manageable 40 years. A computation time of 40 years might suggest that a direct numerical simulation of the oceans is not too far off in the distant future. However, even if a system were devised that made such a tremendous simulation feasible, it would be impossible to provide initial conditions such as salinity, temperature, and velocity and wave-height fields with such high resolution. In addition, it would not be possible to provide high-resolution boundary conditions such as bathymetry (we would need to know every detail about every sand ripple and every coral reef!) or heat-flux and wind-stress fields. Furthermore, the sheer volume of data that would result would be unmanageable. The alternative, of course, is what members of the ocean and atmosphere modeling communities have been doing for over half a century.
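As a back-of-the-envelope check, the short script below reproduces this arithmetic. It is only a sketch: it starts from the rounded totals quoted above (roughly 3 × 10³⁰ operations and a machine performing 300 trillion operations per second) and assumes an idealized Moore's-law doubling every 18 months.

```python
import math

TOTAL_OPS = 3e30              # total operations for a year-long global DNS (quoted above)
MACHINE_OPS_PER_SEC = 300e12  # ~300 trillion operations per second (Blue Gene, 2006)
SECONDS_PER_YEAR = 3.15e7
DOUBLING_TIME = 1.5           # Moore's law: machine speed doubles roughly every 18 months

# Wall-clock time at a fixed machine speed: hundreds of millions of years.
years_fixed = TOTAL_OPS / MACHINE_OPS_PER_SEC / SECONDS_PER_YEAR
print(f"Fixed machine speed: ~{years_fixed / 1e6:.0f} million years")

# Continuously migrating to ever-faster machines: the work completed by time T
# is the integral of MACHINE_OPS_PER_SEC * 2**(t / DOUBLING_TIME); solve for T.
ops_per_year = MACHINE_OPS_PER_SEC * SECONDS_PER_YEAR
years_moore = (DOUBLING_TIME / math.log(2)
               * math.log(1.0 + TOTAL_OPS * math.log(2) / (ops_per_year * DOUBLING_TIME)))
print(f"Riding Moore's law: ~{years_moore:.0f} years")   # roughly 40 years
```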

Figure 1. Different grid arrangements designed to capture multi-scale physics in the vicinity of a boundary (red line). (a) Grid stretching to resolve a straight boundary. (b) Grid stretching with a curvilinear grid near a relatively smooth boundary. (c) A curvilinear grid near sharply varying topography, which requires masking (gray cells) since the topography varies too quickly. (d) The masking procedure can also be performed with Cartesian grids, but the boundary is not followed as closely as in (c). (e) An unstructured grid allows for high resolution near irregular topography without the use of masking.

Instead of computing all the scales of motion, simplifications of the Navier-Stokes equations are devised such that their discrete solutions are more tractable. The basic premise behind all numerical simulations is that because computing power dictates the number of points that will be used to discretize a given problem of interest, it sets the range of scales that will be simulated by a discretization of the equations themselves, while the effects of the remaining set of motions (often referred to as subgrid-scale motions) are parameterized, or modeled, based on values on the simulation grid that are computed.

To be precise, computing power sets the number of grid cells that can be used to simulate a problem of interest. However, it does not set the size and distribution of these grid cells. As a result, numerous methods have been devised to develop advanced simulation tools that enable the analysis of a broad range of scales by employing complex gridding techniques and advanced methods of discretizing the governing continuous equations most accurately, such as high-order advection schemes or high-order time-stepping techniques. Given an advanced discretization technique, the premise behind the accurate simulation of problems involving a broad range of length scales is that most multi-scale flows of interest are well represented by larger-scale motions, while a smaller percentage of the flows contain small-scale physics that require computationally expensive high resolution. The simplest example of a multi-scale flow is a parallel flow near a no-slip wall where a boundary layer exists. In this flow, the most logical way to handle the disparate length scales is to employ grid stretching, whereby the grid is refined near the wall where turbulence leads to small-scale eddies, and it is coarsened far from the wall where the flow is parallel and undisturbed by the motions of interest (Figure 1a). Of course, because most boundary layers occur near complex topography, a bulk of ocean models, in fact all of those mentioned in the introduction except for SUNTANS and TRIM, employ curvilinear grids that can follow topographic features while employing grid stretching to resolve boundary layers in the vicinity of coastal features (Figure 1b).

Oliver B. Fringer ([email protected]) is Assistant Professor, Department of Civil and Environmental Engineering, Stanford University, Stanford, CA, USA. James C. McWilliams is Louis B. Slichter Professor of Earth Sciences, Institute of Geophysics and Planetary Physics and Department of Atmospheric and Oceanic Sciences, University of California-Los Angeles, Los Angeles, CA, USA. Robert L. Street is William Alden and Martha Campbell Professor, Emeritus, Department of Civil and Environmental Engineering, Stanford University, Stanford, CA, USA.
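To make the grid-stretching approach described above (Figure 1a) concrete, the snippet below builds a one-dimensional wall-normal grid using a hyperbolic-tangent clustering function. This is an illustrative sketch of the general technique only; it is not the stretching function used by any of the models named in this article, and the cell count, depth, and stretching factor are arbitrary choices.

```python
import numpy as np

def stretched_grid(n_cells: int, depth: float, beta: float = 2.0) -> np.ndarray:
    """Return n_cells+1 cell faces between z=0 (the wall) and z=depth,
    clustered near the wall with a tanh stretching parameter beta."""
    s = np.linspace(0.0, 1.0, n_cells + 1)            # uniform computational coordinate
    return depth * (1.0 - np.tanh(beta * (1.0 - s)) / np.tanh(beta))

faces = stretched_grid(n_cells=20, depth=100.0, beta=2.5)
dz = np.diff(faces)
print(f"smallest cell (at wall): {dz[0]:.2f} m, largest cell: {dz[-1]:.2f} m")
```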


This practice of employing curvilinear coordinates for bottom-following grids to resolve bottom-boundary layers away from coastlines is referred to as sigma- or s-coordinates. Curvilinear coordinates have the disadvantage that the topography or coastlines that they follow must be relatively smooth. For sharply varying topography, it is possible to employ a curvilinear grid that follows the coastline as closely as possible, but only computing the flow field in the "wet" regions (the unshaded regions in Figure 1c). This so-called "masking" strategy can also be employed for Cartesian, or rectangular, grids, but the smoothly varying features of the coastline are not captured as well as they are with curvilinear grids (Figure 1d). The alternative is to employ unstructured grids, such as in SUNTANS and unTRIM (the unstructured version of TRIM; see Casulli and Zanolli, 2002), which have the advantage that they allow for high resolution in localized regions of flows in the vicinity of complex coastlines or embayments (Figure 1e). As with all aspects of computational fluid dynamics, a benefit associated with employing one seemingly advantageous methodology always comes with an associated cost, and it is probably not surprising that the more complex the grid, the more costly it is to generate. The structured grid in Figure 1a is trivial to generate, while the unstructured grid in Figure 1e requires the use of advanced grid-generation algorithms that are not even guaranteed to generate "good" grids. That is, depending on the geometry and other grid qualities, some cells in the grid may be so skewed that they render the entire grid useless unless specific measures are taken to deal with each particular grid cell.

CAPTURING MULTISCALE PHYSICS

Although a particular numerical technique to discretize the continuous equations of motion determines the accuracy of a simulation to a great extent, the choice of the grid is of crucial importance, because it determines which scales of motion can actually be simulated versus those that must be parameterized. Although simulating a large range of scales of motion comes at the obvious expense of additional computing power, the increased scales of motion also require an increase in complexity of the actual physics that must be computed. Consider the problem of the transport of effluent from a sewage outfall in the coastal ocean that is some distance from the surf zone. Employing a grid resolution of one cubic centimeter would enable the direct computation of turbulence that would allow for an accurate estimate of the rate of mixing of the pollutant field. However, such small grid resolution also requires an actual simulation of the effect of wind-waves on the surface of the ocean because these waves form, propagate, and break at scales that are much larger than 1 cm. In this case, the added expense associated with developing a simulation code that is able to compute surface-wave breaking would far outweigh the improvements in the prediction of the mixing and transport of the effluent plume of interest. It is therefore sometimes advantageous (although rare) to limit the size of the smallest grid spacing in order to prevent the appearance of additional physics that would add an expensive simulation component that could otherwise be modeled quite effectively, rather than directly computed.

Although the wind-wave example is an extreme case, a more common component of ocean models that is left out of the equations of motion because its inclusion adds a significant computational expense relative to its actual physical effect is the nonhydrostatic pressure. Of the models mentioned in the introduction, at present SUNTANS, TRIM, DELFT3D, and MITgcm are nonhydrostatic. In these models, the pressure field is split into a sum of its hydrostatic and nonhydrostatic components. Hydrostatic pressure is that which is associated with the weight of fluid above a specific point in the water column, and generally increases with depth. It also changes as a result of oscillations in the free surface as well as the density field, because a higher free surface implies a larger hydrostatic pressure while a denser fluid implies the same.
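In symbols, this decomposition can be written in a standard form such as the one below. The notation is assumed here for illustration (a constant reference density, a density perturbation, the free-surface elevation, and a nonhydrostatic remainder); it is not a formula reproduced from the ROMS or SUNTANS papers.

```latex
% Pressure split into hydrostatic and nonhydrostatic parts (standard form):
%   rho_0 : constant reference density     rho' : density perturbation
%   eta   : free-surface elevation         q    : nonhydrostatic pressure
p(x,y,z,t) \;=\;
  \underbrace{\rho_0\, g\,\bigl[\eta(x,y,t) - z\bigr]
              \;+\; g\!\int_z^{\eta} \rho'(x,y,z',t)\,dz'}_{\text{hydrostatic}}
  \;+\;
  \underbrace{q(x,y,z,t)}_{\text{nonhydrostatic}}
```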

The remaining component of the pressure field is the nonhydrostatic, or hydrodynamic, pressure field, or that which is associated with fluid motion. Generally speaking, the effects of the nonhydrostatic pressure field become large in relation to the hydrostatic pressure field when the horizontal length scales of motion are less than or equal to the depth. Because a vast majority of length scales in the ocean are much greater than its depth, ocean models, such as POP and POM, do not compute the effect of the nonhydrostatic pressure on the flow field. This hydrostatic approximation is justified because the effect of the nonhydrostatic pressure is usually only felt in small regions of a flow, such as near steep bathymetry or where there may be short length-scale motions; this approximation has been a blessing to the ocean modeling community because computation of the effects of the nonhydrostatic pressure field typically takes three to five times longer than the effects of the hydrostatic pressure field.

A consequence of the hydrostatic approximation is that it imposes a restriction on the minimum horizontal grid resolution of the flow, because the hydrostatic approximation assumes that the horizontal length scales of motion are much larger than the vertical length scales. As a result, increasing the grid resolution in a hydrostatic model may result in an erroneous solution unless the nonhydrostatic pressure is incorporated into the model. An excellent example of what happens when a hydrostatic model is employed to compute a flow that is predominantly nonhydrostatic is depicted in Figure 2. The figure depicts the evolution of the internal lock-exchange problem, whereby a hypothetical lock separates a dense, salty fluid from a light, fresh fluid, which represents the initial condition in Figure 2. Upon removal of the lock that separates the two fluids, the heavy fluid flows underneath the light fluid as a consequence of the hydrostatic pressure field, which is greater in the heavy fluid and hence forces that fluid underneath the light fluid.

Figure 2. Effect of computing the lock-exchange problem with a hydrostatic and a nonhydrostatic simulation code. The figure shows how the hydrostatic code captures the large-scale dynamics by correctly computing the position of the front while it fails to capture the formation of the billows because their horizontal length scales are less than the depth, and therefore highly nonhydrostatic. The red fluid represents the salty or heavy fluid, while the blue fluid represents the freshwater or light fluid. Gravitational circulation and the hydrostatic pressure field enforce a recirculating flow field in which the salty, heavy fluid (red) flows beneath the fresh, light fluid (blue). For details on these simulations please refer to Fringer et al. (submitted).
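The kind of initial condition described above, a hypothetical lock separating dense, salty water from light, fresh water, can be set up in a few lines. The snippet below is illustrative only: the domain size, resolution, and densities are arbitrary values, not the parameters of the simulations shown in Figure 2.

```python
import numpy as np

# Rectangular x-z domain for a two-dimensional lock-exchange problem.
nx, nz = 400, 100          # grid cells in x and z (illustrative resolution)
Lx = 0.8                   # domain length in meters (arbitrary)
x = (np.arange(nx) + 0.5) * Lx / nx

rho_heavy, rho_light = 1025.0, 1000.0   # salty vs. fresh water density (kg/m^3)

# Initial condition: heavy fluid to the left of the lock, light fluid to the right.
rho0 = np.where(x < Lx / 2, rho_heavy, rho_light)   # 1-D profile in x
rho = np.tile(rho0, (nz, 1))                        # uniform in the vertical
print(rho.shape, rho[0, 0], rho[0, -1])             # (100, 400) 1025.0 1000.0
```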

Although the acceleration of the two masses of water is driven predominantly by the hydrostatic pressure, the differences between the hydrostatic and the nonhydrostatic results are striking in that the hydrostatic result does not capture the billows that form at the interface between the two fluids. These billows are highly nonhydrostatic because the horizontal length scale of the motion is less than the depth. This example shows that the large-scale motion of the flow is captured quite well by the hydrostatic model, in that the position of the front is similar to that in the nonhydrostatic approximation. However, the hydrostatic model does not capture the finer-scale features of the flow, and hence it is not justifiable to include such high horizontal resolution in the hydrostatic simulation because it cannot compute the fine-scale, nonhydrostatic physics of the flow.

BASICS OF NESTING IN NUMERICAL SIMULATIONS

Although the implementation of different grids allows for a simulation to capture a wide range of spatial scales, especially when using unstructured grids, the minimum computation time required for the simulation is still limited by the smallest time scale involved in the simulation. That is, in a simulation that employs a grid with disparate grid sizes, the time-step size with which the flow over the entire grid advances forward in time is limited by the shortest time scale of motion on the grid. Therefore, very fine grid resolution comes at the penalty of requiring a large number of time steps to advance the solution over a specified time interval. Although a small time step is necessary to accurately compute flow in the regions where the grid is highly resolved, small time steps are not necessary in other regions of the flow where the grid is coarse. Ideally, then, a simulation that employs a multi-scale grid would also incorporate a spatially variable time-step size that would advance the solution in the fine grid with smaller time steps than in the coarse grid. In this manner, the unnecessary cost associated with computing large-scale flows with a small time step would be eliminated.

Unfortunately, the use of spatially variable time-step sizes in ocean models is not possible because, while the actual equations could, in principle, be developed to incorporate spatially variable time-step sizes, the equations would be too complex to solve on one single grid. However, it is feasible and computationally effective to employ several grids in a simulation and have the simulation on each grid advance forward at a time-step size that is set by the smallest grid spacing, and hence the smallest physical time scale, on that particular grid. The use of multiple grids requires that the solution on each grid is coupled with the solution on neighboring grids to maintain a continuous solution over the entire domain of interest. This concept is known as grid nesting, whereby different grids are employed in a simulation in a way that allows the grids that cover the fine-scale motions to advance their solutions forward in time with much smaller time-step sizes than the larger grids, thereby eliminating the need to advance long-time-scale motions forward in time with small time steps.

The decision on where to place the nested grids with different resolutions can be made initially, based on a knowledge of where the flow is likely to encompass smaller spatio-temporal scales of motion, or it can be made dynamically. The former is known as static mesh refinement, whereas the latter is known as adaptive mesh refinement (AMR), whereby refined grids are nested dynamically when more grid resolution is required because of the development of fine-scale motions in particular regions of the flow. Typically, a refinement ratio is specified that determines the ratio between the grid resolution on a parent grid and that on its child grid; one parent grid can have multiple children, each of which can have their own child grids (Figure 3).
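The parent/child structure and refinement ratio just described map naturally onto a small tree data structure. The sketch below is a generic illustration of that bookkeeping; the class name, the refinement ratio of 2, and the four levels are assumptions chosen to match the arrangement described in Figure 3, not code from ROMS or SUNTANS.

```python
from dataclasses import dataclass, field

@dataclass
class NestedGrid:
    """A grid patch in a static or adaptive mesh-refinement hierarchy."""
    level: int                      # 0 = coarsest (root) grid
    dx: float                       # grid spacing on this patch
    children: list = field(default_factory=list)

    def refine(self, ratio: int = 2) -> "NestedGrid":
        """Create and attach a child patch whose spacing is dx/ratio."""
        child = NestedGrid(level=self.level + 1, dx=self.dx / ratio)
        self.children.append(child)
        return child

# Four levels of refinement with ratio 2: the finest patch is 16 times finer
# than the root grid, as in the arrangement of Figure 3.
root = NestedGrid(level=0, dx=1000.0)   # e.g., 1000-m coarse spacing (arbitrary)
patch = root
for _ in range(4):
    patch = patch.refine(ratio=2)
print(f"finest spacing: {patch.dx:.1f} m "
      f"({root.dx / patch.dx:.0f}x finer than the root grid)")
```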

This grid could result from a calculation with adaptive mesh refinement, or it could be created statically given the knowledge that small, turbulent length scales are likely to develop in the vicinity of the boundary. Either way, this nested grid arrangement provides a great deal of savings because, although the smallest grid spacing is 1/16 the size of the largest grid spacing, high resolution exists only where it is needed. If this same domain were discretized with such high resolution over the entire domain, then 102,400 cells would be required, as opposed to 21,200 with the nested grid arrangement. Furthermore, because a small percentage of the flow contains fine grid resolution, then only a small percentage of the flow is advanced forward in time using small time steps, while the rest of the flow containing the coarse grids is advanced using larger time steps.

Figure 3. Example of a nested set of grids that allows for high resolution in the vicinity of the topography (red line) while employing coarse resolution further away from it. This grid employs four levels of refinement in which each refined level has twice the resolution of its parent, so that the finest grids are 16 times as resolved as the main coarse grid. The largest parent grid has seven child grids (outlined in blue) and 52 total offspring to make for a total of 21,200 grid cells. A grid the size of the largest grid with the same resolution as the finest grid over the entire domain would require 102,400 cells.

Transfer of information between grids in a nested simulation can either be one-way, whereby child grids receive information from their parents, but the parent grids are not in turn affected by information on child grids, or it can be two-way, whereby parent grids are directly affected by the information on the child grids. One-way nesting is probably the most common nesting strategy in ocean modeling and relies on the assumption that the solution on the refined grid likely does not affect the large-scale motions on the coarse grid. In two-way nesting, this assumption does not hold and information from the fine grid is transferred back onto the coarse grid as the simulation progresses. Figure 4 depicts a schematic of a one-dimensional simulation using one- and two-way nesting and two levels of refinement. The primary difference between one-way and two-way nesting is that after the simulation on the fine grid is updated to yield its solution at time t + ∆t (Step 2a in Figure 4a), the solution progresses independently on the coarse grid for one-way nesting, while in two-way nesting, the solution on the fine grid is transferred back onto the coarse grid at the beginning of each coarse-grid time step (Steps 4b and 8b in Figure 4b). In addition to ease of implementation, the advantage of one-way nesting is that data at all time steps for the coarse-grid solution can be computed first, and the fine-grid solution can be obtained later, because its solution does not affect the coarse-grid solution. For example, predictions from global-scale tidal models are often used to drive the boundaries of smaller-scale coastal models. The idea is that the tides drive the motions in the smaller-scale simulation, but the smaller-scale simulation does not alter the tides enough to justify feeding the small-scale solution back into the tidal model.


Figure 4. Depiction of a one-dimensional simulation involving one-way nesting (a) and two-way nesting (b). For one-way nesting, the procedure to obtain the solution at time t+2∆t proceeds as follows: (1a) Information from the simulation at time t on the coarse grid is transferred to the boundary of the fine grid. (2a) The solution then progresses on the fine grid over two time steps with time-step size ∆t/2. (3a) The solution on the coarse grid is advanced forward in time one time step. (4a) This solution is then transferred again to the boundary of the fine grid. The procedure then repeats itself in steps 5a and 6a. For two-way nesting, the procedure is altered by transferring the information from the fine grid onto the coarse grid in steps 4b and 8b.
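The step sequence in Figure 4 can be written as a short driver loop. The sketch below is schematic only: `advance`, `boundary_from_coarse`, and `restrict_to_coarse` are hypothetical placeholder functions standing in for a model's time stepper, boundary-condition transfer, and fine-to-coarse transfer, and the refinement ratio of 2 matches the figure.

```python
def advance(grid, dt):
    """Placeholder: advance the solution on one grid by a time step of size dt."""

def boundary_from_coarse(fine, coarse):
    """Placeholder: interpolate the coarse-grid solution to the fine-grid boundary."""

def restrict_to_coarse(coarse, fine):
    """Placeholder: transfer (restrict) the fine-grid solution back onto the coarse grid."""

def nested_step(coarse, fine, dt, two_way=False, ratio=2):
    """One coarse-grid step of the nesting procedure sketched in Figure 4."""
    boundary_from_coarse(fine, coarse)      # steps 1a / 1b
    for _ in range(ratio):                  # steps 2a / 2b: subcycle the fine grid
        advance(fine, dt / ratio)
    advance(coarse, dt)                     # steps 3a / 3b
    if two_way:
        restrict_to_coarse(coarse, fine)    # steps 4b and 8b (two-way nesting only)

# For one-way nesting (two_way=False) the coarse-grid solution never sees the
# fine grid, so, as noted in the text, it could even be computed entirely in advance.
```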

The schematic in Figure 4 implies that the information is transferred back onto the coarse grid at the beginning of each coarse-grid time step in a one-to-one direct transfer. In practice, the fine-grid solution is filtered before it is transferred onto the coarse grid, or the transfer may include an average of the fine- and coarse-grid solutions, or it may be in the form of nudging, in which the solution on the coarse grid progresses as if it were being forced to have a tendency to follow the fine-grid solution. These represent a small subset of strategies designed to account for differences in the simulation between the fine- and coarse-grid solutions that would otherwise result in abrupt changes in the solution on the coarse grid if one-to-one transfers were employed. The alternative is to employ an iterative procedure at each time step in which the transfer is adjusted until the solution at the boundary of the fine grid matches the solution at the boundary of the coarse grid. This is not generally required unless the solutions on the two grids are tightly coupled, which is the case when detailed flow physics crosses the boundary between nested grids. For this reason, it is often necessary to place child-grid boundaries in regions of the flow where it is unlikely that complex flow physics may cross over an inter-grid boundary.

SUNTANS+ROMS: A NEW HYBRID NESTED MODELING TOOL

The primary advantage of model nesting is that it allows a simulation to capture a broad range of length scales while enabling the simulation of the small-scale physics in the flow to advance forward in time with smaller time steps than the simulation of the larger-scale physics on the coarser grids. The implication behind different grids capturing different spatio-temporal length scales is that they also capture different flow physics. Therefore, an ideal simulation would not only employ different time-step sizes on the different grids but also different approximations based on the particular physics being resolved by the scales on each grid. This is the idea behind nested hybrid simulations, which employ different simulation codes on the different grids for an optimal multi-scale, multi-physics calculation.

An example of a hybrid coupling strategy is the technique of including the effect of wind-waves on coastal circulation models. Because wind-wave spatio-temporal scales are much too short to include in reasonably sized coastal circulation simulations, the strategy is to compute the effects of wind-waves using two-dimensional wave models that return the wave-induced current, which can then be included in the circulation model. In a one-way nested approach, the wave models drive the currents, which in turn do not affect the surface waves. In the two-way nested approach, the currents from the circulation model are fed back into the wave model so that it calculates the wave field that has been altered by the currents. Because the time scale of variation of the wave-induced currents is typically much slower than the smallest temporal scale in the circulation model, the models are coupled over a longer time scale, which involves on the order of tens to hundreds of circulation-model time steps. Among the most popular implementations of a coupled wave-current hybrid model is the incorporation of SWAN into DELFT3D. Nesting of wave models with circulation models is an example of a hybrid tool that couples a two-dimensional model with a three-dimensional circulation model. Recent work in coastal ocean simulations has focused on the development of coupled three-dimensional multiphysics codes, where the larger-scale codes compute the regional-scale, predominantly hydrostatic circulation, and the smaller-scale codes compute the littoral, coastal-scale, nonhydrostatic circulation. Among these hybrid codes is the nested simulation tool that combines ROMS (Shchepetkin and McWilliams, 2005) and SUNTANS (Fringer et al., submitted). ROMS simulates the regional-scale flow on a curvilinear grid. It employs state-of-the-art numerical methods to discretize the equations of ocean circulation and builds upon decades of research in the large-scale ocean modeling community to develop a simulation tool to simulate virtually dissipation-free flows in the open ocean. SUNTANS, on the other hand, simulates small-scale nonhydrostatic physics based on a discretization of the Navier-Stokes equations in the absence of physical approximations. Like the nonhydrostatic MITgcm, SUNTANS is written in parallel using MPI (message-passing interface) and is designed for high-performance computers; however, unlike MITgcm, which incorporates curvilinear grids, SUNTANS incorporates unstructured grids in a manner similar to unTRIM, thereby enabling it to resolve a disparate range of length scales within the same grid.

The idea behind the ROMS-SUNTANS coupling tool is that the large-scale flow field is well represented on curvilinear grids with relatively smooth topography, while the small-scale flow field is dominated by complex topography near the coast and hence it requires an unstructured grid. Additionally, energy and momentum conservation are much more important for the larger-scale flow; the discretization in ROMS is able to capture this conservation on the curvilinear-coordinate grid. In particular, ROMS captures the fine balance between the Coriolis acceleration and pressure gradients, which dominates the energetics for large-scale flows. These characteristics are not as important for fine-scale flow calculations, particularly near coastal features, because physically there is more dissipation and mixing for the small-scale physics captured by SUNTANS.

In both the coastal and open ocean, and hence for both ROMS and SUNTANS, fast free-surface gravity waves represent the shortest time-scale physics in each simulation tool. However, for the coastal ocean, only the tidal and regional time scales associated with the free surface are of interest. Therefore, the algorithms in ROMS and SUNTANS are designed to allow for larger time-step sizes without incurring numerical instabilities that might arise when such large time steps are used to advance short-time-scale physics. In ROMS, this problem is handled with a technique called mode-splitting, whereby the free-surface waves, which represent the fast, or external-mode, physics, are advanced forward in time with time steps that are typically 100 times smaller than those for the other, longer-time-scale, or slow, internal-mode physics. In SUNTANS, the fast free-surface waves are computed by employing an implicit technique as used in TRIM by Casulli (1999) to advance the equations of motion forward in time, thereby removing the time-step limit on the free-surface motions.

Among the unique challenges in developing the ROMS-SUNTANS hybrid tool is that, although the nature of the one- and two-way coupling strategy is similar to that depicted in Figure 4, the information transfer between the two simulation codes is complicated by the fact that ROMS employs curvilinear-coordinate grids, while SUNTANS employs unstructured grids. Complex interpolation procedures are thus required to transfer information back and forth between the grids both in the horizontal as well as the vertical, because ROMS employs curvilinear coordinates in the vertical while SUNTANS employs z-level grids (Figure 5). In this coupled schematic, the coupling of ROMS and SUNTANS grids occurs in the horizontal, with both grids extending throughout the water column and exchanging information only across vertical faces. Although this type of exchange is the initial focus of the coupling tool, the long-term goal will allow for SUNTANS grids to be nested in the vertical; fine-scale SUNTANS grids will resolve processes in the mixed layer while allowing ROMS to compute the larger-scale features beneath it. In this case, information will be exchanged across both vertical and horizontal grid faces.

Further complicating the nesting of SUNTANS into ROMS is the fact that both simulation codes are written for parallel, high-performance computers. Of critical importance to optimal parallel computations is the notion of load balancing, which implies that a group of computers involved in a parallel calculation are all performing the same amount of work so that one computer is not performing calculations while the others stand idle. Both ROMS and SUNTANS are optimally load-balanced on their own, and therefore load balancing is straightforward to achieve using static mesh refinement. However, the creation of new adaptive SUNTANS (or ROMS) grids over the course of the simulation requires a rebalancing of the workload as the simulation progresses. This rebalancing is accomplished by redistributing all of the grids involved in the simulation over the available computers each time a new SUNTANS or ROMS grid is adaptively added to the simulation.

It is clear that the coupling of two vastly different multi-physics simulation tools like ROMS and SUNTANS is quite a complex task. Fortunately, tools exist that are specifically designed to couple several multi-physics tools under one common framework, thus allowing the implementer to focus on the physics rather than the computer science of model coupling. Among the most advanced of these tools is the Earth System Modeling Framework (ESMF9), which has been employed to couple over a dozen multi-physics codes to form the NASA GEOS-5 Atmospheric General Circulation Model, and is similar to the PRISM10 project.

9 Earth System Modeling Framework (ESMF, http://www.esmf.ucar.edu) 10 Programme for Integrated Earth System Modeling (PRISM, http://www.prism.enes.org)


For oceanic applications, the Model Coupling Environment Library (MCEL11) has been employed to couple wave models with three-dimensional circulation models. The ROMS-SUNTANS coupling is handled by a combination of ROMS-SUNTANS-specific tools and those available in the Model Coupling Toolkit (MCT12), which handles a host of the parallel intercommunication and load-balancing features essential to the ROMS-SUNTANS coupled tool.
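The load-balancing idea mentioned above, keeping every processor roughly equally busy as grids are added or removed, can be illustrated with a simple greedy heuristic. The sketch below is generic: it is not the algorithm used by MCT, ROMS, or SUNTANS, and the patch sizes are made-up values.

```python
import heapq

def balance(patch_cells, n_procs):
    """Assign grid patches (given by their cell counts) to processors so that
    the total work per processor is roughly even: each patch, largest first,
    goes to the currently least-loaded processor."""
    heap = [(0, p) for p in range(n_procs)]      # (cells assigned, processor id)
    heapq.heapify(heap)
    assignment = {}
    for patch_id, cells in sorted(enumerate(patch_cells),
                                  key=lambda kv: kv[1], reverse=True):
        load, proc = heapq.heappop(heap)
        assignment[patch_id] = proc
        heapq.heappush(heap, (load + cells, proc))
    return assignment

# Example: rebalance after an adaptive refinement step adds a large new patch.
patches = [4096, 4096, 1024, 1024, 1024, 256, 8192]   # cells per patch (illustrative)
print(balance(patches, n_procs=4))
```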

Figure 5. Planview (a) and vertical slice (b) of a ROMS-SUNTANS intergrid boundary, showing the four points on the ROMS grid (represented by the dark circles) that are used to interpolate to obtain the velocity (black arrow) at a cell face on the SUNTANS grid. The red lines represent the unstructured, z-level SUNTANS grid, while the blue lines represent the curvilinear-coordinate, bottom-following ROMS grid.

PLANNED APPLICATIONS OF THE ROMS-SUNTANS HYBRID TOOL

Both SUNTANS and ROMS have been applied extensively to the west coast of the United States. ROMS has been employed to study the regional circulation, with a particular emphasis on the California Coastal Current and the dynamics of upwelling fronts along the California coast, as depicted in Figure 6a. SUNTANS, on the other hand, has been applied to the much smaller domain of Monterey Bay to study the interaction of the tides with the bottom topography and the subsequent generation of internal waves of tidal frequency, or internal tides, as depicted in Figure 6b. These fine-scale simulations of Monterey Bay using SUNTANS are being coupled with the larger-scale simulations of ROMS using the ROMS-SUNTANS hybrid tool. In the two-way nested simulations, the primary effect of the coupling is that the ROMS mesoscale currents interact with the internal tides and other tidal time-scale motions in SUNTANS and have a strong effect on the internal tidal wave field, which in turn affects the mixing and dissipation within coastal regions such as Monterey Bay. This information is fed back into the ROMS simulations to account for the effect of the tidal-scale motions on the mesoscale dynamics, in particular how they affect the dynamics of the upwelling fronts and their interaction with the California Current System.

11 Model Coupling Environment Library (MCEL, http://www.navo.hpc.mil/Navigator/fall02_Feature2.html) 12 Model Coupling Toolkit (MCT, http://www-unix.mcs.anl.gov/mct)

Figure 6. Example ROMS (a) and SUNTANS (b) simulations. The ROMS simulation represents the sea-surface temperature (in °C) during summer on the west coast of the United States, showing how the cold, upwelled water (in blue) interacts with the California Coastal Current to eject cold fluid out into the deep ocean in the form of so-called jets and squirts. The SUNTANS result depicts two cross sections through the complex Monterey Bay bathymetry (in red) that show the east-west velocity field induced by internal waves generated as a result of the interaction of the tidal currents with the bathymetry. These internal waves are known to be significant sources of dissipation and mixing in the coastal ocean. The white line on the red topography represents the Monterey Bay coastline. The time step in the SUNTANS simulation is 28 seconds and the average grid spacing is 250 m, while the time step in the ROMS simulation is 1 hour (30 seconds for the surface waves) and the average grid spacing is 3.5 km. Monterey Bay is located where the 122°W longitude intersects the coastline in the ROMS simulation.

The feedback of the SUNTANS simulations into the ROMS simulations effectively acts as an advanced subgrid-scale model for the ROMS simulations. That is, because the grids in ROMS do not resolve subtidal-scale motions, the feedback of SUNTANS information into ROMS accounts for information not resolved by the ROMS grids. ROMS computes mesoscale dynamics, and SUNTANS computes the effect of the submesoscale dynamics on the ROMS simulations.

The overarching goals of these simulations and future simulations using the hybrid tool are two-fold. The first and most obvious goal is to enable the simulation of multi-scale problems in the coastal ocean in a way that very-high-resolution physics can be computed while obtaining an accurate representation of the large-scale flow features and their effects on the smaller, fine-scale flow features of interest in SUNTANS. The second goal is to understand the effect of submesoscale dynamics on mesoscale features to ultimately devise submesoscale parameterizations in ROMS that can accurately predict the effects of the unresolved features of the flow on the large-scale, resolved features of the flow without actually having to compute them. This submesoscale parameterization enables accurate predictions of mesoscale currents over much longer time scales that would otherwise not be feasible with a hybrid tool because of the computational expense associated with computing a much wider range of scales of motion.

DISCUSSION

We have presented an overview of a hybrid approach in which two models that focus on distinct spatio-temporal scales are coupled to form a simulation tool that captures a larger range of scales: the coupling of ROMS (Shchepetkin and McWilliams, 2005) and SUNTANS (Fringer et al., submitted). Accurate simulation of problems involving a broad range of length scales is possible because most multi-scale flows of interest are well represented by the larger-scale motions, while a smaller percentage of the flow contains small-scale physics that require computationally expensive high resolution. ROMS simulates the regional-scale flow on a curvilinear grid, while SUNTANS simulates the small-scale physics on an unstructured grid.

Although this hybrid approach provides a tool that captures a broad range of scales, from the mesoscale down to tens of meters, it still captures a limited portion of the overall spectrum of fluid motions in the ocean. That is, mesoscale simulations still rely on boundary conditions taken from larger-scale global ocean circulation simulations, and the coastal simulations still require turbulence models and forcing with wind-wave models to account for unresolved motions not captured by the SUNTANS grids. And, as with all well-resolved simulations of the ocean, the fidelity of the tool is still highly dependent on the initial and boundary conditions that are obtained from oceanographic field measurements. Nevertheless, the hybrid tool brings us one step closer to coupling scales of motion that would otherwise not be feasible with one single code.

Development of this hybrid tool is an ongoing project (e.g., Kang et al., 2006). Applications of this system and completion of the two-way coupling are currently underway. For more information the reader is encouraged to visit http://suntans.stanford.edu/roms-suntans.

ACKNOWLEDGMENTS

Ongoing development of the SUNTANS-ROMS nested simulation tool is supported by the Office of Naval Research under grants N00014-05-1-0293 and N00014-05-1-0294 (Scientific officers: Dr. C. Linwood Vincent, Dr. Terri Paluszkiewicz, and Dr. Scott Harper). The authors have written this article on behalf of the project team at the University of California-Los Angeles (UCLA) and Stanford University, whose other members are Alexander Shchepetkin and Yuliya Kanarska (UCLA), and Margot Gerritsen, Ed Gross, Dujuan Kang, and Zhonghua Zhang (Stanford University). The comments of three anonymous reviewers who provided invaluable feedback for this article are greatly appreciated.

REFERENCES

Casulli, V. 1999. A semi-implicit finite difference method for non-hydrostatic, free-surface flows. International Journal for Numerical Methods in Fluids 30(4):425–440.

Casulli, V., and P. Zanolli. 2002. Semi-implicit numerical modeling of non-hydrostatic free-surface flows for environmental problems. Mathematical and Computer Modeling 36:1,131–1,149.

Fringer, O.B., M. Gerritsen, and R.L. Street. Submitted. An unstructured-grid, finite-volume, nonhydrostatic, parallel coastal ocean simulator. Ocean Modelling.

Kang, D., O.B. Fringer, M. Gerritsen, Y. Kanarska, J.C. McWilliams, A.F. Shchepetkin, and R.L. Street. 2006. High-resolution simulations of upwelling and internal waves in Monterey Bay using a coupled ROMS-SUNTANS multi-scale simulation tool. Eos, Transactions of the American Geophysical Union 87(36, Suppl.):Abstract OS36G-07.

Shchepetkin, A.F., and J.C. McWilliams. 2005. The Regional Oceanic Modeling System: A split-explicit, free-surface, topography-following-coordinate ocean model. Ocean Modelling 9:347–404.
