Transcript for:
WRF Modeling System Overview

This is part of the series of online presentation materials for the WRF modeling system. These presentations refer to version 4.2 of the modeling system. This particular talk is about program real: the description and general functions of that particular package. It is identified here in the middle, with the ".exe" afterwards; the real program accepts input data from WPS, and it generates data that will be provided to the WRF model itself, wrf.exe.

Overall these presentations are grouped into a number of categories. There is an initial suggested grouping that should be taken in order: an overview, the two talks about general descriptions, and then the two talks about setup and run. This is the presentation of the description and general functions for the real program. There is an assumption that the viewer has already seen the WPS description and general functions talk, and following on from this talk there will be setup-and-run presentations for both WPS and the WRF/real system.

This talk goes over what is actually done inside of the real program. When we start talking about the function of the real program, one of the things we have to do is include a lot of definitions so that we are all starting from the same place. From there we move on to what is required to actually run the real program: what are the standard input variables, which variables are optional, which are mandatory, and where do they come from. There are certain questions about restrictions on the vertical coordinate of the data that comes from the WPS metgrid program. Inside of the real program and inside the WRF model there is a base state that is computed and used; how that base state is defined, how it impacts the vertical coordinate, and the definition of the potential temperature variable will be discussed. When we run the real program, in addition to data coming into the program we have data that comes out of the program; we need to know what those files are, how we can look at them, and how to determine whether or not it was a successful real run. Of course, the whole purpose of the real program is to handle vertical interpolation: just as the WPS metgrid program handled all of the interpolation horizontally, to the domain that the modeling system would like to have, the real program does the vertical interpolation to the computational coordinate that the WRF model is going to be using. There are some suggestions that we have, and those recommendations are basically to be very careful. Finally, a small piece of vertical interpolation is soil level interpolation; it is really an initialization for physics, and it is a good placeholder to tell us why and when we have to rerun the real program if we are going to modify options inside the namelist for the WRF model.

There are two preprocessors for the WRF model: one for an idealized initialization and one for real-data initialization. The one for idealized initializations is referred to as ideal.exe. When we are going to do real-data simulations with the WRF model, we call the preprocessor that takes data from WPS, modifies it, and sends it into the WRF model; that is referred to as the real program, or real.exe. The real.exe program is available as either a serial or a distributed-memory parallel job. You do not get very much performance benefit from running distributed memory unless you have a fairly large domain, but what you do get is an aggregate of a fairly large amount of memory if you run on multiple processors; the real program is not computationally bound.
The real program is automatically generated whenever you build the WRF model, so it is an executable that you get for free. The real program and the WRF model are entirely separate from WPS; the real program and the WRF model share the same registry and live in the same directory structure, outside of WPS. The real program takes input data from WPS, vertically interpolates it, does a little bit of diagnostic and transformation work, and provides data that is suitable for input into the WRF model.

We call it real.exe, or the real program, because as opposed to an idealized simulation we are considering a real-data simulation that is going to be happening in the WRF model. For us that means we are going to be doing a 3D forecast; we are not going to be doing a single column or 2D flow over a hill. There is going to be real input that goes into the preprocessor and into the WRF model, meteorological data as opposed to an idealized initialization of the input meteorological fields. There is an expectation that a full suite of physics will be utilized inside of the WRF model, and a number of those fields will be initialized correctly inside of the real program: microphysics, surface conditions, and maybe even some boundary conditions and fields for nudging. A real-data case is also a case that is on a sphere, so this is non-Cartesian; this is a projected domain, and the input would be a Lambert conformal, Mercator, polar stereographic, or possibly a rotated lat-lon grid. You can do global domains, but nowadays most of that is going to be absorbed by the MPAS model. There is an expectation that the static fields are realistic, so we do not have smooth topography, idealized land use, or no vegetation; we have information that comes from static data sets that are input to geogrid, and those get forwarded all the way into the real program and then into the model. The lateral boundary conditions are time dependent; we do not assume that they are constant or fixed, and we also do not assume that they are periodic or idealized in other ways.

The real program does have to generate a number of diagnostics. For example, what traditionally comes out of the WPS metgrid program is temperature, relative humidity, and geopotential height; none of those fields are what the WRF model would like to see. There is a requirement that the initial fields are adjusted to be consistent. For example, if you bring in data from multiple sources they may not all agree that a particular grid cell is a water point or a land point; that is something that is fairly easy to figure out inside of the real program, and the WRF model does a poor job at adjusting to inconsistent values — it probably just segfaults. There is a computation of a base state field, a separation into a reference and a perturbation, and we will talk about that in just a bit. Each of the model initialization times requires a full 3D volume of data, so there is an initial state that is provided for the WRF model. Because of the requirement for a time-dependent lateral boundary, there is a lateral boundary file that is generated, for the coarse grid only. And finally there is some vertical interpolation that goes on for the 3D meteorological fields and also for the soil data.

You will hear a number of presenters talk about the difference between runtime options and compile-time options; it is a set of terms that you have to become familiar with. When we refer to runtime options inside of the WRF modeling system, including WPS, we are referring to the options that can be used to change the simulation without requiring the code to be recompiled, and these traditionally come in via input files. What we choose is to use Fortran namelist files: inside of the real program and WRF it is called namelist.input, and for WPS it is namelist.wps. These runtime options are different from what we refer to as compile-time options. As the name implies, compile-time options are changes that require the code to be recompiled before they can be activated: modifications to the source code, changes to the compiler flags (for example changing from fully optimized to unoptimized), adding in CPP ifdefs, or modifications to the registry file. This last piece will also have a full talk on it; it is how the WRF model allows users to make changes inside the source code without having to know too much about the whole modeling system.
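To make the idea of a runtime option concrete, here is a minimal sketch of what a couple of records in a namelist.input file look like. The values shown are purely illustrative, not recommendations, and only a handful of the available variables are listed:

    &time_control
     run_days  = 0,
     run_hours = 24,
     restart   = .false.,
    /

    &domains
     max_dom   = 2,     ! number of domains (coarse grid plus nests)
     time_step = 60,    ! coarse-grid time step, in seconds
    /

Editing values such as these and rerunning the executables is all that a runtime change requires; no recompilation is involved.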
The standard input variables that come into the real program are entirely provided by the WPS program metgrid. The metgrid program handles the horizontal interpolation of the meteorological fields and places that data into a file that the real program reads; all of the data transfer in the WRF modeling system is handled via files. The real program is able to accept any of the traditional vertical coordinates that come out of the metgrid program: the data can be isobaric, on a terrain-following sigma coordinate, or on a hybrid of both an isobaric and a sigma coordinate. The only requirement — and this is a strict requirement — is that the vertical coordinate is monotonic in pressure; it can be either increasing or decreasing.

The fields that go into the real program from WPS are broken into mandatory and optional fields. This data comes either from the geogrid program (static data such as topography, land categories, etc.) or it is meteorological information from the decoded GRIB data produced by ungrib. The mandatory data includes the fields you would expect: the 3D and surface information for horizontal winds, temperature, some sort of moisture, and the height field. While 3D soil temperature is mandatory, 3D soil moisture is formally only optional; however, most land surface models require soil moisture, so for most modern land surface models soil moisture is effectively mandatory as well. Traditional 2D fields that you would also expect as input for the real program, provided by WPS, are fields such as surface pressure, sea level pressure, and the land mask. A number of additional 2D fields are optional, but most of the national centers provide these in their GRIB data sets: topography, sea surface temperature, sea ice, various flags, skin temperature, etc.

Most users do not have to worry about the specifics of the vertical coordinate of the input data that is being provided by WPS, nor do they have to worry about the specific fields that are being provided from WPS. There are a number of known data sources that the ungrib program has Vtables to interpret. These Vtables are associated with specific national center data sets that the WRF modeling system understands and knows about; the Vtable is able to pull the necessary information out of the GRIB files and provide that information to the metgrid program, and it is moved along into the real program for the vertical interpolation and diagnostics.
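As a rough illustration of how that file-based hand-off is wired together on the WPS side, the &ungrib and &metgrid records of namelist.wps name the intermediate files that ungrib writes and that metgrid then reads. This is only a sketch using the usual default names; the details are covered in the WPS talks:

    &ungrib
     out_format = 'WPS',    ! intermediate file format written by ungrib
     prefix     = 'FILE',   ! prefix of the intermediate files, e.g. FILE:2021-06-01_00
    /

    &metgrid
     fg_name         = 'FILE',  ! first-guess files that metgrid reads (the ungrib prefix)
     io_form_metgrid = 2,       ! netCDF met_em files that real.exe will then read
    /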
As was mentioned, there is a separation into a perturbation and a reference state — a base state — for a number of time-independent, dynamically important meteorological fields inside of the WRF model. These base state fields are functions only of the topography and a few user-selectable constants, such as the model top, lapse rates, and the temperature at sea level in the middle of the domain. So any time the topography changes, the base state is going to be modified also. This is entirely internal to the real program and the WRF model; the user, other than specifying this handful of constants, does not have to be bothered with the construction of the base state or the perturbation state. When feedback is turned on for nesting, there is a mandatory consistency between the topography on the coarse grid and the coincident points on the nest; therefore feedback requires the base state to be modified, by the real program and the WRF model, for every one of the nests. Because the base state is only required for the computational vertical coordinate that is utilized in the WRF model, only the real program and the WRF model have to deal with the base state; no programs prior to the real program — no programs from WPS — are concerned at all with the base state.

The WRF model, and therefore the real program, both support a hybrid vertical coordinate. The hybrid vertical coordinate is terrain following near the surface, so where there is topography the coordinate surfaces move up and down with the topography, and it then relaxes toward an isobaric surface aloft. The user is allowed to specify where that surface is; traditionally around 200 millibars is sufficient. This vertical coordinate is optional, and the user may choose to have only a terrain-following vertical coordinate; however, since version 4 the default has been to fully exercise the hybrid vertical coordinate. With the terrain-following vertical coordinate, if we look at the eta levels — where eta equal to zero is the model lid and eta equal to one is the surface — you can see the artificial impact of the coordinate surfaces even very high in the atmosphere. The solution, of course, is to have isobaric surfaces; the problem is that you run into the difficulty of the coordinate surfaces intersecting the topography. The standard solution is a hybrid coordinate system: near the surface it is a terrain-following vertical coordinate, but it quickly relaxes to an isobaric surface aloft, and from that isobaric surface all the way up to the model lid the coordinate surfaces are flat, isobaric surfaces.

The potential temperature field inside of the WRF model is not strictly a dry potential temperature; since version 4 the default has been to use a moist potential temperature. Users have to be a little bit careful when they are interpreting the data that comes out of the WRF model or the real program. The variable T is the dry potential temperature with a 300 Kelvin offset removed. So, for example, if the field were 300 Kelvin, the T field would report a value of zero; it is easy to be confused and think that the field coming out is a temperature in degrees C or some other unusual diagnostic. The T field coming out of the real program and out of the WRF model is the dry potential temperature minus 300 K.
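These choices about the vertical coordinate and the potential temperature variable are themselves runtime options. A minimal sketch of the relevant piece of the &dynamics record is shown below; the variable names are given as a reminder of where to look rather than as a definitive list, and the values reflect the version 4 style defaults described above — confirm them against the registry and README.namelist:

    &dynamics
     hybrid_opt  = 2,     ! 2 = hybrid sigma-pressure coordinate, 0 = terrain-following only
     etac        = 0.2,   ! eta value where the coordinate becomes purely isobaric (~200 hPa)
     use_theta_m = 1,     ! 1 = moist potential temperature, 0 = dry potential temperature
    /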
Inside of the model, the moist potential temperature is used for all of the thermodynamic computations. It has been found in a number of high-resolution cases that the moist potential temperature gives a better solution. On the left of the slide there are some artifacts in a moisture field that are missing on the right, and those artifacts are caused by a strong vertical moisture gradient: lots of moisture in the boundary layer, for example, dry air above the boundary layer, and strong shear right at the top of the boundary layer.

The entire WRF modeling system sends information along to the next standalone program via data files. The real.exe program accepts data from WPS, from the metgrid program; that information is modified, vertically interpolated, some diagnostics are computed, and then files are generated for use in the WRF model. For regional forecasts the real program generates two files: the wrfinput file and the wrfbdy (WRF lateral boundary) file. The wrfinput file is an initial condition; it is a single time period of a 3D volume of the entire state of the atmosphere for when the WRF model is starting — wind, temperature, moisture, pressure, all the surface-layer fields, all the soil fields, the projection information, latitude, longitude, etc. The wrfbdy file is constructed only for the most coarse grid; the time-dependent lateral boundary conditions for the nests, the inner domains, are provided during the run of the WRF model by the parent domain. The initial condition file is the same format as is used as input to the WRF data assimilation (WRFDA) code; this is referred to as a cold start, where you bring in information and update it with observations.

For the lateral boundary condition: if we process n time periods of ungrib data and n time periods of metgrid data, then when we process those n time periods in the real program, the lateral boundary file contains only n-1 time slices. This is because of the way that we construct our lateral boundary conditions. For example, if we are interested in a 36-hour simulation and the data is every six hours, we have seven time slices but six boundary-condition tendency steps. We know the value at hour 0 (time slice number one) and we know the value at hour 6 (time slice number two); the difference of those two allows us to compute a tendency that is valid between hour 0 and hour 6. The difference between the values at hours 6 and 12 allows us to compute a tendency that is valid between hours 6 and 12, and likewise all the way up to the end: the data at hour 30 and the data at hour 36 allow us to compute a tendency valid between hours 30 and 36. So we only end up with n-1 time periods of boundary data compared to the time periods that were processed inside of ungrib, metgrid, and the real program.
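For that 36-hour example, the pieces of namelist.input that control the count of time periods would look roughly like the sketch below (the dates are placeholders, not part of the original example); with seven analysis times available at a six-hour interval, wrfbdy_d01 ends up holding six tendency periods:

    &time_control
     run_days         = 1,
     run_hours        = 12,                ! 1 day + 12 hours = 36 hours total
     start_year = 2021, start_month = 06, start_day = 01, start_hour = 00,
     end_year   = 2021, end_month   = 06, end_day   = 02, end_hour   = 12,
     interval_seconds = 21600,             ! 6-hourly analyses: 7 times, 6 tendency periods
    /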
As for what the boundary conditions look like: it is very similar to what happens inside of a nested run within the WRF model. In the region depicted as yellow there is a single specified row and column that comes entirely from the large-scale data. The information in this row and column comes from horizontally interpolated data from metgrid that was vertically interpolated inside the real program and then differenced inside the real program to generate an initial value and a tendency term. The next few rows and columns, depicted here as blue, are relaxation zones. The relaxation rows and columns allow the large-scale values from the ungrib/metgrid data set to be averaged with, and to nudge, the data that is coming entirely from the dynamics and physics inside of the WRF model. So a pure forecast region in this domain would be the green area; a relaxation zone between the simulation that is going on and the large-scale forcing from the national center model is depicted in blue; and the specified zone, which comes strictly from the large-scale data, is depicted in yellow. The user can specify how wide the relaxation zone is; the default is about five rows and columns in total. Larger simulations, or longer-term simulations, use values that are two to three times larger than that. Linear or exponential decay of the weighting across the relaxation zone is also possible.
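The widths of these zones are runtime options in the &bdy_control record of namelist.input. A minimal sketch for a coarse grid, using the default-style values just described, might look like the following (check the registry or README.namelist for the authoritative defaults):

    &bdy_control
     specified      = .true.,   ! use specified lateral boundaries on the coarse grid
     spec_bdy_width = 5,        ! total boundary width: specified zone plus relaxation zone
     spec_zone      = 1,        ! rows/columns taken directly from the large-scale data
     relax_zone     = 4,        ! rows/columns blending large-scale and WRF-computed values
     spec_exp       = 0.33,     ! > 0 gives exponential rather than linear decay across the relaxation zone
    /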
Excluding diagnostics, almost the entirety of the computational resources used inside the real program goes into performing vertical interpolation. There are a number of vertical interpolation options available to the user, and the important thing is that these options can have a significant impact on what the initial conditions and boundary conditions that are passed into the model look like. There is a subsequent lecture about how the registry is used; that registry talk describes how options that are unavailable in the standard namelist can be added into the namelist.input file to modify the default way that the vertical interpolations are handled.

On the left-hand side is a picture of most of Colorado and a little bit of Kansas and Nebraska. The field depicted is topography. The area on the left, where it is kind of chopped up, is where the Rocky Mountains are located, and the area from about a third of the way in from the left all the way across Kansas is relatively flat — even though it is elevated topography, it is fairly smooth. The blue star identifies where Boulder is located. What we are going to do is show, with a few different options, the impact of activating an option versus not activating it over this domain. For the next few slides, the left-hand side will always refer to the potential temperature field and the right-hand side will always refer to the u component of the horizontal wind.

The first option is force_sfc_in_vinterp = 0. We can optionally choose whether to force the surface to be utilized as part of the vertical interpolation; if the data came in as isobaric, there are locations where the nearest sigma surface would be trapped by the pressure surfaces rather than by the input surface that came in with the isobaric data. If we set force_sfc_in_vinterp = 0, which is not the default, the difference in the potential temperature field and in the u field at the surface shows up mostly over areas of strong topography — and that is a feature, an artifact, that we are going to see with all of these options. Where we have fairly smooth isobaric surfaces, such as over the eastern two thirds of the domain, the impact is very small.

With most of these vertical interpolation options we can also say that not only do we require the surface to be used in the vertical interpolation, but that the surface available from metgrid is used as the bottom boundary condition for the first n levels — in this case six. So at level number four you can see, on both the potential temperature and the u field, a very smooth transition over the eastern two thirds of the domain; once you get into the topography, however, very small scale noise is introduced because of the change in the way that this particular surface field is utilized.

We can adjust the order of the vertical interpolation, making it linear or second order. Where you would see the strongest difference from that is up toward the model top, at the tropopause, or maybe two thirds of the way up to the model top, and that is what this particular field is showing you: at the tropopause, even in areas of topography, you can get significant small scale changes caused by selecting a higher-order vertical interpolation.

You can choose to have the WRF model's sigma-equals-one surface defined entirely by the lowest level of input data that comes in from the large-scale model. Typically the WRF model is run at much higher resolution, so this tends to have no real impact over relatively flat topography, which you see over the eastern two thirds of the domain, but in areas of topography this causes a substantial difference in the fields.

There are other options. You can optionally choose to smooth the outermost few rows and columns (the smooth_cg_topo option); that provides an easier transition through the boundary region, where the large-scale fields are trying to adjust to the WRF model. If those large-scale fields are much coarser, it is reasonable to smooth them as we transition into the middle of the WRF domain. Perhaps only for much older data, where there is some reason to suspect the validity of the surface data, you can turn off the surface fields that come in from the metgrid package entirely with use_surface = .false. Again, over the eastern part of the domain the fields are fairly smooth, but there is a strong discontinuity as we transit from one isobaric level to the next when we do our linear interpolations, and inside the region dominated by high-resolution, very noisy topography we see very small scale features that would generate a lot of noise.

The purpose of showing these few slides on vertical interpolation is to let users know that there are a number of options available to allow you to fine-tune the initial conditions and the boundary conditions going into the WRF model. We have modified the namelist and the associated registry to utilize a number of defaults that we think are reasonable, but since this is a research model those options are entirely available for the user to go in and tinker with, to adjust to get their fields just right.
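These interpolation controls live in the &domains record of namelist.input. The sketch below lists the options that, to the best of my reading, correspond to the choices discussed above; the values are only examples of turning the various behaviors on or off, not recommended settings, and the one-line descriptions should be verified against README.namelist:

    &domains
     force_sfc_in_vinterp = 6,        ! use the surface as the lower bound for the first 6 eta levels (0 disables)
     use_surface          = .true.,   ! .false. ignores the surface data from metgrid entirely
     lowest_lev_from_sfc  = .false.,  ! .true. assigns the lowest model level directly from the surface data
     lagrange_order       = 1,        ! 1 = linear vertical interpolation, 2 = quadratic
     smooth_cg_topo       = .false.,  ! .true. smooths the coarse-grid topography near the lateral boundaries
    /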
The internals of the vertical interpolation that goes on inside of the real program are largely outside the purview of an individual user. The functions are basically things such as: the data is input and then it is vertically ordered, where vertically ordered means there is a check to make sure that the data is monotonically increasing or decreasing in pressure, and when we do the vertical interpolation a column is either flipped or not, to make sure that it is in decreasing order when we go into the interpolation routine. Because the eta surfaces inside of the WRF model are a function of normalized pressure, all of the vertical interpolations inside of the real program are done in pressure space. To compute the interpolation in pressure we need to be able to compute what the surface pressure is going to be in the WRF model, so we take the input temperature, topography, height, and moisture and use those to compute a surface pressure field for the WRF model. We are given a vertical coordinate — the normalized column pressure, which we have seen before, where eta goes from one to zero — and that normalized vertical coordinate then defines the locations at each (i,j) grid point to which we are going to interpolate from our metgrid data.

The eta surfaces can be computed by the real program, or they can be entirely provided by the user, and that needs a bit of a caveat: users have to be careful about that. It is probably best for new users to allow the real program to compute a reasonable set of eta surfaces. Traditionally what we recommend is that users provide the number of vertical levels they are interested in and then let the real program compute a reasonable distribution of those eta surfaces given that total number. However, if a user would like to provide their own surfaces, there are some guidelines that should be adhered to. The layers should probably be less than about a kilometer thick in the free atmosphere; it is certainly okay to have fairly thin layers near the surface. If you are choosing eta surfaces and you are someone who is interested in wind energy, for example, it is not sufficient to simply go up a kilometer or so and say that that is the entire depth of the atmosphere; it is really best to go up to about 50 hectopascals as a minimum model top, and 20 hectopascals or 10 hectopascals are absolutely fine. So really, it is safest not to choose your own eta values. If you have a set of provided values that comes from a previous publication, that is entirely acceptable, but trying to compute your own eta levels is not recommended because it tends to cause trouble.

There are a few namelist options inside of the namelist.input file — these are the runtime options that we mentioned earlier — in the &domains namelist record. There is e_vert, which is the ending number of the vertical levels; these are the full levels inside of the model, so 50 vertical levels, in this case possibly for up to three domains. p_top_requested is the pressure at the top of the model; this is an isobaric surface, and the value is in Pascals, so a value of 1000 gives a 10 hectopascal (10 millibar) model top. There are a few constants that a user can provide: the temperature at sea level for the model (actually at a thousand hectopascals), the isothermal temperature that the model will not get colder than above the tropopause, and the stratospheric base state pressure. Reasonable defaults for these are built into the WRF model.
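As a sketch, those level-related options sit in the &domains record; eta_levels only needs to be set when a user insists on supplying the levels themselves, and the base state constants mentioned above live in the &dynamics record and are simply left at their defaults here. The values are illustrative:

    &domains
     e_vert          = 50,  50,  50,   ! number of full vertical levels, per domain
     p_top_requested = 1000,           ! model top in Pa (1000 Pa = 10 hPa)
     ! eta_levels    = 1.000, 0.995, 0.990, ..., 0.000,
     !                 only supply this list if you really want to fix the levels yourself
    /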
A user can provide their own set of eta levels. The concern is that there may not be uniformity in the layer thickness. On the left-hand side is what we would like to see: a cross-section of the thickness of the eta layers as we go from the surface up to the model lid, and they are all about the same color because they are all about the same thickness, between 700 and 800 meters — entirely reasonable above the boundary layer. On the right-hand side are levels that were linearly chosen by a user, and what you tend to see is a set of levels that are finer in the mid troposphere and then get an exaggerated stretch as you move from the tropopause up into the stratosphere. The problem with very thick layers is that the absorbing layer at the top of the model requires a certain depth, and if the levels are several kilometers thick you only have one or two levels inside that absorbing layer, which makes the absorbing layer very ineffective. The take-home message is: unless you really know what you are doing, let the real program choose the eta surfaces for you.

The real program and the WRF model utilize the exact same registry, which is the way that variables and fields are defined; they utilize the exact same build structure, and in many cases the exact same source code is compiled into both executables. The real program and the WRF model are very tightly coupled together. There are a number of physical parameterization settings available in the namelist options that have initialization capabilities provided by the real program, so it is safest, if you change physics options, to re-run the real program.

An easy example is soil level interpolation. At the bottom of the slide, on the left-hand side and on the right-hand side, we have the two traditional sets of input soil data that are available from the U.S. weather service: data on the Noah scheme's layers and data on the RUC scheme's levels. The Noah scheme has data valid between 0 and 10 centimeters, so the midpoint is 5 centimeters; it has a second layer between 10 and 40 centimeters with a midpoint of 25 centimeters; the third layer is between 40 and 100 centimeters with a midpoint at 70 centimeters; and the deepest layer is between 100 and 200 centimeters with a midpoint at 150 centimeters. The RUC scheme is defined at levels instead of layers, and the levels are 0, 5, 20, 40, 160, and 300 centimeters. So inside of the WRF model, when we choose a land surface model option, variables are interpolated from either the Noah-style or the RUC-style input depending on the a priori known, defined locations and thicknesses for that particular scheme. For example, for surface scheme number one, the thermal diffusion scheme, the layer thicknesses are defined as 1, 2, 4, 8, and 16 centimeters, so we would end up with data interpolated from the metgrid input to those five layers. If the choice the user made was the Noah land surface scheme, the layers defined there are 0 to 10, 10 to 40, 40 to 100, and 100 to 200 centimeters — so four layers, and the locations are going to be different. If we opted to use the six-layer RUC scheme, or a two-layer PX scheme, what we end up with is not only differing levels in the input data — whether Noah-style or RUC-style — to interpolate from, but also different numbers of levels that we interpolate to. The WRF model allocates space based on the number of soil levels that are defined, or suggested, by the land surface model option.
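This is why the land surface choice made in the real program has to match the one used in WRF. As a sketch — the layer counts are as quoted in the discussion above, and the scheme numbers are given as I recall them, so confirm them against the registry — the relevant &physics entries are:

    &physics
     sf_surface_physics = 2,   ! 1 = thermal diffusion (5 layers), 2 = Noah (4 layers),
                               ! 3 = RUC (6 layers), 7 = Pleim-Xiu (2 layers)
     num_soil_layers    = 4,   ! must be consistent with the chosen land surface scheme
    /

If sf_surface_physics or num_soil_layers is changed in namelist.input, re-run real.exe before running wrf.exe so that the soil arrays are interpolated and dimensioned for the new scheme.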
So, in recap: the real program and the WRF model are tightly coupled together, because for the data that gets passed from one to the other there is an assumed mandate that the data is the exact same size in both programs. Physical parameterization schemes — and there are quite a number of them — are initialized by the real program, whether via the values that are provided or the actual dimensionality of the fields. So the safe thing is: if you change physics options inside the WRF model, it is always safest to re-run the real program. The real program takes hardly any time at all to run through a few time periods, so it is never going to cause you trouble to make sure that the physics options in the real program are identical to what you are running in the WRF model. If you are changing dynamics options, it is hardly ever required to rerun the real program.

The entire purpose of the real program is to take meteorological data that is on the native coordinate of the first-guess model — data that came in through ungrib and went into metgrid — vertically interpolate that data, compute some diagnostics, and pass data files on into the WRF model. The entire function of the real program is to move data from WPS to the WRF model. The data that comes into the real program are the traditional meteorological variables one would expect: wind, temperature, moisture, pressure, and geopotential height, at the surface and at various levels up through the top of the atmosphere as defined by the model, are all required. Additional variables are required too, such as the topography, the sea level pressure, and the surface pressure, and there are optional variables; but the line between optional and required is really not all that important. Most of the fields that the WRF model would like to see from the large, established national center models are available inside their GRIB data sets, and the ungrib program, with the Vtables that are already set up, is able to pull the variables that the WRF model would like to see from those files and put those fields into the data stream that moves through metgrid and then through real into the WRF model. The data comes ultimately from a national center, but it gets horizontally interpolated inside metgrid and vertically interpolated and diagnosed inside the real program. The only restriction on the vertical coordinate that comes from metgrid is that the data is strictly monotonically increasing or strictly monotonically decreasing in pressure.

There is a base state that is defined inside of the WRF model. What the base state does is provide a numerical contrivance, an engineering solution, to allow the model to have a free extra digit or two when it is doing horizontal differencing: we remove a large portion of the value that we are differencing, and when we remove a nearly constant value from neighboring grid cells we end up with a perturbation, and that perturbation is able to retain a bit more resolution. The real program and the WRF model use the exact same base state, and the constants that are used in the base state are also used to define the vertical coordinate. The other important thing to remember when you are working with the base state is that, for the potential temperature, there is not a separate reference value removed on each level; instead there is a constant offset for the entire 3D volume of data. When you see the
variable T in the model output, it is the dry potential temperature minus 300 Kelvin.

Just as there are input variables, there are output fields that are required. There are two output files. The first is the wrfinput file; that is the 3D volume of data for an initial condition for the WRF model, and there could be one of these initial condition files for each of the domains that the WRF model is running. The second mandatory file is the wrfbdy file, the WRF lateral boundary file, which is only required for domain number one, the most coarse grid.

There is a large number of vertical interpolation options inside of the real program; they are defined inside the registry, and the user may put them inside the namelist.input file. The recommendation, however, is simple: initially, just use the defaults. If you see difficulties or problems in your initial data, then it is worthwhile to go back, take a look at some of the vertical interpolation options, and see if you can tinker with those to get your initial conditions exactly how you want them.

While this was called soil level interpolation, it is really a good example of why we need to make sure that the physics options between the real program and the WRF model are consistent. There are a number of initializations that go on inside of the real program that provide data to the WRF model, and the data that was used as a precursor for those fields never gets into the WRF model; the real program is the last time that we see the isobaric data, or the original input soil level information. So if we change physics options in the WRF model from what was defined in the real program, it is always safest to go back, re-run the real program with the physics options that are going to be used inside the WRF model, and then use those exact same physics options inside of the WRF model.

If there are any questions about this presentation, or any of the other presentations in the rest of the WRF modeling system tutorial, please go ahead and send those to the WRF forum. There is a general statement about the WRF user support that is provided, and there are a number of resources that a user can take advantage of; there is a home page that details not only information and documentation but also schedules and availability of tutorials and workshops. Finally, the WRF and WPS source codes are maintained with an open development capability on GitHub at the following locations. That is the end of this presentation.